Social Media Addiction: Facebook Whistleblower Says Big Tech Has Known & Ignored Problem for Years


This is a rush transcript. Copy may not be in its final form.

AMY GOODMAN: This is Democracy Now!, democracynow.org, the War and Peace Report. I’m Amy Goodman with Nermeen Shaikh.

NERMEEN SHAIKH: We are continuing to look at the landmark trial about the harms of youth social media addiction taking place in Los Angeles. Meta CEO Mark Zuckerberg testified on Wednesday. Meta is the parent company of Facebook and Instagram.

AMY GOODMAN: We are also joined by Facebook whistleblower Frances Haugen. In 2021, she turned over tens of thousands of pages of internal Facebook documents to U.S. regulators and The Wall Street Journal, which became the basis of a damning series of reports called the “Facebook Papers.” Her memoir is titled The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook. Frances, yesterday Meta CEO, the billionaire Facebook founder, Mark Zuckerberg was grilled about Instagram’s effect on the mental health of young users. What struck you most about what he had to say? What was true? And do you believe he told the truth?

FRANCES HAUGEN: I don’t know if you’ve noticed over the last couple of years that he has shown up on a number of podcasts where he often gets tossed pretty soft questions. What was remarkable about reading the coverage of Mark’s testimony yesterday was how different it is when the person sitting across from him gets to ask hard follow-up questions. He has bragged on other podcasts about never fearing being fired, because his stock ownership in the company gives him the majority of the voting shares, the majority of the votes; he could just put in a new board if they tried to fire him. He has never had to sit there and squirm through a meaningful conversation about the implications of that level of extreme control.

NERMEEN SHAIKH: Frances, tell us, as a former Facebook employee, now a whistleblower and advocate for social media transparency, what did you learn inside Facebook? What did the companies know about the addictive potential of their algorithms and content?

FRANCES HAUGEN: Within the child safety space, you could say there are two major areas. One is the well-being of kids; that is where work on addiction falls. The other is child safety proper: dealing with predators, dealing with people who distribute child abuse material. I worked within threat intelligence, the nitty-gritty, hands-on investigators who go in and hunt the individual people running scams or terrorist organizations. My team worked on counterespionage.

The closest I came to seeing firsthand how Facebook underinvested in the safety of children was the team responsible for finding people who distribute child abuse material, for looking for adults who preyed on children, for finding the marketplaces where adults solicit nude photos of children from the children directly. That team was so strapped for resources that if you had given them a single additional engineer, they probably would have accomplished 10 times as much. That is the attitude I saw repeated at Facebook firsthand.

The documents I brought out, though, painted an even broader picture. While I saw things that made me uncomfortable when it came to kids, what Facebook’s documents showed was that Facebook viewed kids very instrumentally. They cared about the role of teenagers in bringing their younger siblings and their parents onto the platform. They worried about public perception, not the actual health of the kids but the perception that these products might be addictive. And what we know, now that these documents have been brought forward because of the court cases, is that lots and lots of experiments were being run to make these products safer. They knew kids said these changes, things like not sending an alert in the middle of the night, made them less stressed and let them sleep better. And yet Facebook didn’t launch them, because they also made kids use Instagram 1 percent less.

AMY GOODMAN: What did these companies know, and when did they know it? You have Zuckerberg testifying under oath, after Instagram head Adam Mosseri spoke on the witness stand last week. Mosseri pushed back on the science behind social media addiction by denying users could be clinically addicted. Can you talk about that and about the case at the heart of the trial, the 20-year-old woman known as K.G.M., who says her addiction to YouTube and Instagram worsened her depression and suicidal thoughts?

FRANCES HAUGEN: This is one of those little turns of phrase that I think is a wonderful illustration of how Meta, Facebook, Instagram have gotten very, very good at speaking in a very precise style where they can factually defend what they say. Addiction is a medical term. So, if I get addicted to painkillers, if I get addicted to cigarettes, then going off cigarettes, because your nervous system has become dependent on nicotine, you go through very intense physiological symptoms. Or you go off painkillers; you’re going to literally be vomiting, right? When kids stop using social media, if you isolate that child, you do see things indicating that their brain chemistry is changing. They are used to tons and tons of dopamine, and now they don’t have that stimulation. I think any parent who gives their kids a lot of screen time and then tries to have them sit still for dinner, or makes them go on a family vacation and leave the phone behind, sees that behavior change.

From a medical standpoint, that behavioral dependence is not considered addiction. But when you come in there and downplay what happens with compulsive use, which is the scientific term of art around these things, you really downplay what it means to have a generation of children who get hooked at seven, eight, nine. Because remember, 30 percent of seven- to nine-year-olds were on social media as recently as 2022. Imagine what it is today.

When you have kids whose brains are being basted in dopamine from scrolling all day at such a young age, it changes their ability to sit still in class, to interact meaningfully face-to-face with their family or friends. You see this in a growing number of reports from teachers who say, “I have taught for 20 years and I don’t understand what is going on. The kids who come into my seventh-grade class have never behaved like this before.” Those are the consequences we are living with. As the court documents show, these companies have been warned about this for 10-plus years, and yet they continue to optimize for more and more time spent on these platforms.

AMY GOODMAN: As we said, we are also still joined by Lori Schott, who lost her daughter Annalee and is attending the trial, as well as Lennon Torres and Laura Marquez-Garrett. Nermeen?

NERMEEN SHAIKH: Lori, let’s go back to you, because the concern is not just what these children and young people are putting out online but also what they are receiving in response. You said about your daughter, “I was so worried about what my child was putting out online. I didn’t realize what she was receiving.” So could you tell us what precisely she was receiving and how you came to see it?

LORI SCHOTT: After Annalee passed away, we were able to gain access to her platforms. What I saw on first opening them, as a parent who lost a child and was trying to understand this world of social media, brought me to my knees. This child was pushed content about anxiety and depression. And it wasn’t just one or two screenshot images; it was pushed to her constantly. It was time and time again the same theme that pulled her down that rabbit hole. It hooked her into a world that destroyed her mental health. It told her she wasn’t good enough. It told her she was broken. My daughter was not broken. These platforms are broken. Mark Zuckerberg is broken. And we are here to fight for change, because no child should be exposed to what my daughter saw. We tried tirelessly as parents to guide her. She was a beautiful person, and they took that person away from us.

AMY GOODMAN: Laura Marquez-Garrett, you were just with both Lori and with Lennon. You were attending the trial. You are here in New York. As you were getting ready for the show, I saw your tattoo on your forearm. Can you pull back your jacket and show us the significance of this tattoo?

LAURA MARQUEZ-GARRETT: Sure. I have two. These are my children’s names, and each ray of the sun is a child who has been lost to a social media and/or AI product. And these are patterns.

AMY GOODMAN: How many?

LAURA MARQUEZ-GARRETT: 296. That’s not all of them. That’s just the ones where we have either drafted and filed their complaints or met their parents and come to love these kids.

AMY GOODMAN: What does it mean when you file 1,200 complaints? Complaints to who? What happens?

LAURA MARQUEZ-GARRETT: There are many. We file them in state and federal courts. This is one of many cases. We have the federal MDL, where you have the attorneys general, you have school districts. We have one-off state court cases in Delaware, in New York. We had Vermont, Connecticut. Essentially, these are the best-resourced companies in the world. So it is a fight, it is a battle, and we are trying to find those points where we can break through and hold them accountable.

NERMEEN SHAIKH: Laura, as you mentioned, of course, you’re an attorney for the Social Media Victims Law Center. If you could give us an example of what it would mean to place restrictions on social media use that would protect children? We just heard earlier in our introduction about a child who died while participating in a blackout or choking challenge. So just explain, what kinds of restrictions are we hoping for?

LAURA MARQUEZ-GARRETT: Sure. These are design defects. Think of the Ford Pinto. It’s not a restriction so much as fixing a defective part, a part that is exploding. Someone asked me the other day, “Meta’s saying they’re fixing it. They’re doing this.” My answer was that that’s kind of like sticking a fire extinguisher in a Ford Pinto and saying, “There you go. We fixed it.” It’s still a defective part. And so ultimately, fixing it—and they know this; it’s in their documents that are becoming public—means they could remove the addictive mechanisms. It is as simple as this: think of your television set at home. We have a remote control. We get to turn the volume down or up, change the channels. They’ve kept those controls on the back end. They could give you the option to slow down the algorithm.

AMY GOODMAN: Explain the algorithm. For people who aren’t familiar with social media, what does it mean when Lori Schott says that her daughter Annalee kept being fed the same thing, going down a rabbit hole?

LAURA MARQUEZ-GARRETT: Sure. These are social media algorithms specifically, because there are many types of algorithms, and these companies have programmed them for engagement first. So we have instances, and we have the data, where children will look for uplifting speeches and inspirational quotes, and they will get breakup and suicide content. We have this over and over. We literally have images we can provide where a child looked for one thing and got the opposite. Not because they wanted it, but because TikTok determined this is what will keep this kid hooked.

Now, if a child is going through a breakup—and this is Mason Edens specifically—he looks for, I believe it was, inspirational quotes. If TikTok had shown him what he sought out, what would he have done? Potentially he would have put the phone down and talked to someone. They don’t want that. So they are programming for engagement above all else. That is a programming choice. It is frankly a form of profiling. They are taking thousands of data points on our children. We as consumers are not consenting to that. We’re not saying yes. I mean, they can tell you the kind of car you drive, your education, all of these pieces that they can then use to profile and target. And in the case of vulnerable children, it’s deadly.

NERMEEN SHAIKH: So you mean that, literally, in this case, a child is looking for inspiration and instead gets in fact quite the opposite? The way we as adults are familiar with algorithms working is that, for example, if you read certain stories on a news site, you keep getting fed similar stories. It is extraordinary that a child receives exactly the opposite of what they searched for. And you are saying that is because it is more addictive?

LAURA MARQUEZ-GARRETT: No, that’s the defect. You’re thinking of search engine algorithms, right? If I go into a typical search engine, it’s programmed differently. It’s designed differently. If I search for a Chinese food restaurant, I will get: here are some Chinese food restaurants within a five-mile radius. If I go onto Instagram, I may end up with a beheading video in China. Actually, I’ll use an LGBTQ instance, because I had a young person say to me once, “When I would look up gay pride on Instagram, I would get half gay pride and half Westboro Baptist Church, you’re-going-to-hell.” That is what these algorithms are doing. They are not showing our children what our children are asking to see. They are programmed to show them what they think the children cannot look away from: car accidents, extremes, outrage, all of these things. And that is what is causing harm.

AMY GOODMAN: Facebook whistleblower Frances Haugen, as we wrap up, you see at the Trump inauguration the billionaire brotherhood that includes Mark Zuckerberg. How does that influence the lack of regulation that we are seeing today?

FRANCES HAUGEN: Before I do that, I want to add one tiny little thing, a cherry on top of what Laura said. There are a lot of really basic things you can do to make even the current system safer. For example, they have asked users before, “Does this content make you feel bad?” If they saw people beginning to look at more and more content that users say makes them feel bad, they could just give people a choice: “Hey, we notice you’re looking at more and more depressing content. Do you want to keep doing this?” You can do very simple things like that.

But when we look at the oligarchs that run these companies, we’ve set a norm that we are supposed to “just trust them.” That they’ve given us these wonderful “free” gifts, even though the price of these gifts is ourselves, our kids, our data. We’ve been told, you’ve been given such great gifts, we know so much, just trust us. That age of “just trust us” needs to end. We need real accountability, real transparency, so that people who build better things get rewarded for bringing us social media that actually is good for us.

AMY GOODMAN: Lennon, we just have 30 seconds. You’re with the Heat Initiative, which focuses on applying strategic pressure to big tech companies. But I want to ask you, as a young person, your advice to young people on social media now.

LENNON TORRES: My advice is to demand better and to force Mark Zuckerberg and the other lazy, innovation-lacking CEOs to step to the side and let true innovators show you what digital community and connection can actually look like.

AMY GOODMAN: Lennon Torres, senior manager of programs and campaigns at the Heat Initiative. Facebook whistleblower Frances Haugen. Lori Schott, whose 18-year-old daughter Annalee died by suicide in 2020. And Laura Marquez-Garrett, attorney at the Social Media Victims Law Center, based in Seattle. Laura was just named to the TIME100 list of the most influential people in health and is featured in the documentary Can’t Look Away. She will be honored tonight by TIME.

Coming up, we look at the breaking news, the brother of the king, Andrew Mountbatten-Windsor, has been arrested because of the Epstein files. And we will look at the number 4,400. That’s the number of times judges around the country have ruled the Trump administration is detaining immigrants unlawfully. Stay with us.

[MUSIC BREAK]
