Discussing Vermont's lawsuit against Meta with Attorney General Charity Clark
Social media can be a place for kids to connect with one another, watch funny videos or learn to dance. But as parents and teens probably know, it can also be a place of bullying or constant self-comparison. Just last week, the state of Vermont announced it's suing Meta, the tech company that owns Instagram, Facebook and WhatsApp. It claims Meta purposely got children and teenagers addicted to its platforms. And it says the company did this to maximize advertising revenue.
Note: Our show is made for the ear. We highly recommend pressing play on the audio posted above. For accessibility, we also provide a transcript of part of the show. Transcripts are generated using a combination of robots and human transcribers. They may contain errors, so please check the corresponding audio before quoting in print.
Mikaela Lefrak: Attorney General, I'd like to start with some logistics around this lawsuit. Vermont, along with six other states and D.C., filed lawsuits against Meta in their own state courts. And then separately, 33 states have filed a joint federal lawsuit, and Florida filed another federal lawsuit against Meta. Can you talk us through why these are all happening at the same time? Was there coordination? And why did Vermont choose the path that it did?
Charity Clark: Sure. So there has been an investigation taking place among all these states for the past two years, and ultimately different states decided to take different approaches. We took the approach we took, which is to sue in state court here in Vermont, because we wanted to be in charge of our own suit. We were a leader on the investigation. We feel really strongly about this issue and about protecting Vermonters, especially young Vermonters, and we know the case law in Vermont and believe that it's favorable to our position in the case.
Mikaela Lefrak: OK, so there are laws at both the federal and the state level that protect consumers broadly and protect children and their privacy. And in the federal suit, the attorneys general assert that Meta violates the Children's Online Privacy Protection Act. I know that that is separate from the Vermont lawsuit, but I'm wondering if you could give us a brief overview of what that is.
Charity Clark: So we didn't sue on that statute; we sued on the Vermont Consumer Protection Act. So why don't I just provide an overview of that. There are two prongs to the Vermont Consumer Protection Act. The first is deception, which is familiar to all of us, because we're all consumers, so we kind of know when someone is not being truthful with us in the marketplace. And the allegation here is that Meta knew and knows that its platform is harmful, but it is continuing this narrative that it isn't harmful, and that it's safe for young people. The second prong of the Consumer Protection Act is the unfairness prong, and the unfairness prong has case law that unpacks what that means. It's things like: offends public policy, is oppressive or immoral or unscrupulous, or causes significant harm to consumers. So we are arguing, on that front, that Meta designed its platform to elicit excessive and compulsive use by users, especially young users. And it knew about it; it knew that it was doing that; it did it on purpose.
Mikaela Lefrak: How do you prove something like that?
Charity Clark: Well, interestingly, Meta itself conducted a study, and it was leaked to the Wall Street Journal. So we have information about that study, which was in 2021. Yeah, and it had a number of upsetting statistics, one of which is that Instagram makes body image issues worse for one in three teen girls. So you know, that one's compelling. A few others that I found really compelling, relating to true harm, were that among teens who wanted to kill themselves, 6% said the feeling started on Instagram, and of teens who wanted to hurt themselves, 9% said the feeling started on Instagram. And these are from...
Mikaela Lefrak: I guess my question is, that sounds like very, very compelling evidence. But I've also been reading articles from folks who study these types of cases very closely, who are concerned that it's an uphill battle for the states who are part of the federal lawsuit, and for the individual states like Vermont who have also filed in their own courts, to prove this direct causation. Are there any concerns about that? I mean, this is a taxpayer-funded lawsuit, it's going to take a lot of time, and you're going up against one of the biggest tech giants in the world.
Charity Clark: I mean, when you list all of those states that have taken action as a result of this investigation, you have a lot of information about how confident we are in our approach. The other thing to keep in mind is, recall how long it took for states to act against the tobacco companies. There was a time when we didn't realize that smoking tobacco was really bad for you, and doctors were smoking themselves in hospitals and things like that. I mean, these things take time. If some people are saying, you know, this doesn't really seem like a compelling case, maybe they are misinformed, unlike those of us who have been a part of this investigation. Or it's just that Instagram is so new, and they haven't really realized: no, this is really happening. This harm is real. It's quantifiable. And what I have learned since we brought this lawsuit, from the feedback that I'm getting, is that a lot of people already sensed that this harm was occurring, that the addictive nature of Instagram was real, because it felt that way to them as users.
Mikaela Lefrak: Yeah, that's the interesting part of this. When people see the headlines about these types of lawsuits, a lot of us, I know I do, go, "Oh, yeah. It doesn't always feel good to be on social media." But feeling that feeling is different than having to go into a court of law and prove it. I know this is an active lawsuit, but is there data that you can share that kind of helps prove that point? Like, for example, do we know how many teens in Vermont are using Instagram?
Charity Clark: Well, we have, unfortunately, had to redact a lot of the complaint. We have this very long 117-page complaint, and a lot of it is redacted because, under the terms of the investigation, we had to redact it. So I'm reluctant to delve too deeply into the information that we have. But normally, when that occurs, eventually it's unredacted. And I hope that in this case that does take place, hopefully earlier rather than later.
This harm is real. It's quantifiable. ... A lot of people already sensed that this harm was occurring, that the addictive nature of Instagram was real, because it felt that way to them as users.
Attorney General Charity Clark
Mikaela Lefrak: I was going to ask you about the redactions. In preparing for this conversation and looking through the lawsuit, there are very large portions that are completely blacked out. For example, a section about how Meta directed its business model at Vermont specifically is almost entirely redacted, including details on the scope of young Vermonters' Instagram use. Are those redactions a typical part of a lawsuit?
Charity Clark: I mean, I don't know, typical might be a strong word. But that's definitely not uncommon. We've done this before, and in those cases, eventually we are able to unredact the complaint and refile it. So I hope that in this case we'll be able to do that as well.
Mikaela Lefrak: Attorney General Clark, what is the ultimate goal of this lawsuit, the best case scenario?
Charity Clark: What we want is for the harm to stop. The best case scenario is injunctive relief that allows the harm to stop. We've also asked for monetary relief; we've asked for our attorneys' fees, our investigation fees, things like that. And we've also asked for what the Consumer Protection Act allows, which is a $10,000 penalty for each violation. For each consumer using Instagram, it would be up to $10,000; that's what the law provides, and that's what we've asked for here.
Mikaela Lefrak: Each consumer using... OK, so how would that work? Would people have to apply, or tell you they were a part of it?
Charity Clark: We would get the information from Instagram about how many users are in Vermont and, you know, make a calculation. And that's what we would ask the court for.
Mikaela Lefrak: And Attorney General, what are the next steps in this suit? And what are you keeping an eye on with the other suits that have been filed against Meta?
Charity Clark: Something that has been really heartwarming, almost, is to see the collaboration among the states, regardless of political affiliation. The states have been almost entirely united about this, and even about the details. I mean, all the states are really focused on the injunctive relief, making the harm stop. And that has been a really positive outcome, just as an aside. But just to walk through the next steps of any lawsuit: you file a complaint. That's what happens when you file a lawsuit, you file that long document you referenced. Then the other side, the defendants, get to file an answer. Often at that time, they will also file a motion to dismiss to say, “Nope, the law is not on your side, it's on my side.” We don't know if that's what's going to happen, but often that is what happens. And then after that, you begin the process called discovery, where information and documents are exchanged and depositions take place, which are basically sworn interviews. All of that information gathering and document gathering will help you build your case or your defense, depending on what side you're on. So we're proceeding with the understanding that that's what this is going to look like going forward. Of course, we are just taking care of this one suit in Vermont, and Meta is juggling all of these suits across the country. So we'll stay in coordination with other attorneys general across the nation and really see how it goes from here.
Mikaela Lefrak: In response to the lawsuits, Meta recently issued a statement saying the company is, quote, disappointed. And quoting in full here: "Instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps that teens use, the attorneys general have chosen this path," the company said. What are your thoughts?
Charity Clark: Well, I'm disappointed too, you know. It would have been great if we could have resolved this before we got to the point of filing a lawsuit. But here we are. And to home in on one thing you said there: in our lawsuit, we even point out that they are not taking age verification seriously. So, you know, let's start there. There are many, many things that they could be doing without our having to file a lawsuit, and they've chosen not to.
Mikaela Lefrak: Not taking age verification seriously, what does that mean?
Charity Clark: Well, you're not supposed to be on Instagram if you're under 13. And, you know, we include in our lawsuit that they're really not taking that seriously.
And the algorithm doesn't have a moral judgment. It just keeps showing us the thing that we keep wanting to look at. And that is very harmful, but especially to young people.
Attorney General Charity Clark
Mikaela Lefrak: Are there ways — not to make you solve their problems for them — but can you explain that a bit more to folks? I mean, I know you have to put in your name and your date of birth when you sign up for an Instagram account. What are ways that they could keep you from lying?
Charity Clark: You know, they have done incredible things. I'm sure they can fix this too. And Vermont children and parents and all of us deserve that. We deserve to have a platform that is safe and isn't creating these incredible harms. I mean, there is a mental health crisis, especially among teens, all across this country. And there is a correlation between their mental health, especially girls', and Instagram. Can I talk about one element of that in particular? Probably the most harmful and common is negative social comparison, which we all experience, but among teens, it's very acute. And the thing that is so compelling is that Instagram, while we experience it as a social media platform, is really an advertising platform. They collect our data and our interests, and they use that to sell us ads, or they sell others' ads; we're the product, you know. And with negative social comparison, they show us more of the type of content that we are interested in. So if we're going down a dark rabbit hole, that's what they're going to keep showing us. So if someone has an eating disorder, and we're intrigued by this content that is harmful to us, we're going to keep looking at it. And the algorithm doesn't have a moral judgment. It just keeps showing us the thing that we keep wanting to look at. And that is very harmful, especially to young people, and especially teen girls.
Mikaela Lefrak: I still wonder how you go about proving that X type of content in, you know, this rabbit hole that you've been pushed down is harmful, while this other type of content is simply a company legally showing you content that you might be interested in.
Charity Clark: Yeah, I mean, I can't say a lot about what we already know. But we certainly are going to be taking the time in this lawsuit to further build our case. Based on a two-year investigation, we already have a lot of the information that we need, and we'll be continuing. One thing that I've really appreciated is that since the lawsuits were announced, the conversation has really been elevated. People are talking about this. I, you know, had a family brunch over the weekend, and we were talking a lot about it with my teenage niece and nephew, hearing their perspective and their use. And I think that's always a good thing. We want to make sure that we're putting tools in place to protect ourselves. As Attorney General, you know, one of my tools is bringing a lawsuit against a bad actor, and that's what we've done here. But I really have appreciated the conversations that are taking place; I think they're so important. And also, as I mentioned, the sense that there is a problem, that people already sense this, that this isn't a surprise to people. And you know, now's a great time to be thinking about your own social media use, or how your children are using Instagram, and to take measures to protect yourself. It is designed to be addictive. And if you are experiencing that, you know, reach out and put a plan in place to help protect yourself.
Mikaela Lefrak: And has working on this lawsuit and preparing for it changed either your own behaviors or the ways that you talk to your kid, your nieces and nephews, people in your family?
Charity Clark: I think the thing that has changed for me is realizing that the experiences we're having were designed to be that way. And now I feel like I can almost intellectualize it when I'm, you know, doomscrolling. And I'll give you a couple of examples. One of the design features is what's called infinite scroll, and that is when you see the post you're interested in. It's your sister's barbecue, yay. But just beneath it, you see the beginning of the next post: you see who posted it, and it looks like, is it a sunny sunset? I don't know what it is, but next thing you know, it's 1 a.m. Exactly, exactly. So knowing that those are tricks that are put in place to keep my eyeballs scrolling, to keep selling those ads, I think has helped me reflect on my own interaction with the platform. Infinite scroll really got me. Ephemeral content is another trick that they use, and that is content that disappears after a certain amount of time. So on Instagram, there are the stories that disappear after 24 hours. And that keeps you wanting to return, so you don't miss anything: this fear of missing out, or FOMO. I never thought I would file a complaint in court that actually used the term FOMO, but it's real, and it's a tool that's used to keep people, especially young people, you know, plugged in to their Instagram accounts.
Mikaela Lefrak: Well, lastly, Attorney General, do you think Vermont will file or be part of other lawsuits aimed at tech companies like Meta? They're certainly not the only one operating in this space.
Charity Clark: Yeah, I don't want to get ahead of ourselves, and we certainly have a policy where we don't acknowledge or talk about ongoing investigations. However, one such investigation is already public, and that is against TikTok, and we don't know where that will take us. But I do, of course, have concerns about the way that other social media platforms are using the same tricks to keep people scrolling and create this excessive and compulsive use.
Note: This is a partial transcript of the conversation. To hear the rest of the show, including input from a child psychologist and a college student on teen social media usage and impacts, listen to the full audio provided above.
Broadcast at noon Monday, Oct. 30, 2023; rebroadcast at 7 p.m.
Have questions, comments or tips? Send us a message.