States sue Meta for addictive features harming kids


More than 30 states filed a federal lawsuit against Meta, the owner of Facebook and Instagram, alleging the platforms’ apps are designed to be addictive and harm children’s mental health.

The joint lawsuit from a slate of 33 state attorneys general was filed Tuesday in a San Francisco federal court. The suit claims Meta violated a federal children’s online privacy law and state consumer protection laws by making its products addictive and then lying about how they harm children’s mental health. Additionally, multiple state attorneys general are filing separate lawsuits against Meta in their own state courts alleging Meta’s practices violate state consumer protection laws.

If successful, the states’ lawsuits could force Meta to change the way it designs and markets its platforms to the public, and lead to hefty fines. The legal strategy has drawn comparisons to the various lawsuits filed against the tobacco industry in the 1990s, which led to hundreds of billions of dollars in damages, and changed how the industry markets its products.

Harmful to kids: The lawsuit alleges that “Meta deceptively represented that the features were not manipulative; that its Social Media Platforms were not designed to promote young users’ prolonged and unhealthy engagement with social media; and that Meta had designed and maintained its Social Media Platforms to ensure safe experiences for young users. These representations, both express and implied, were false and misleading.”

The lawsuit is the largest state-led challenge alleging a social media company violated the federal Children’s Online Privacy Protection Act and consumer protection laws, and follows a similar strategy used by Indiana, Arkansas and Utah, which have each filed state consumer protection lawsuits against TikTok in the past year.

The lawsuits are designed to circumvent Section 230 of the Communications Decency Act, a decades-old law that protects platforms from being held liable for most content users post. The consumer protection lawsuits don’t target specific content and instead assert that Meta or TikTok deceived the public about the safety of children on their apps.

Meta pushes back: “We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” a Meta spokesperson said in a statement.

The company says it has introduced more than 30 tools over the past several years to support teens and parents, including automatically setting Instagram accounts of teens under age 16 to private and providing parental supervision tools.

States act as Congress falters: While the states are acting, Congress has failed to pass any federal child online safety laws, including the Kids Online Safety Act. The bipartisan bill, which would require platforms to audit their risks to minors, advanced out of committee this summer but has not received a Senate floor vote. It has faced vocal pushback from civil rights and advocacy groups over concerns it could violate teens’ privacy online and have detrimental impacts, particularly on LGBTQ youth.

Several states began investigating Meta after Facebook whistleblower Frances Haugen testified in 2021 that the company knew Instagram’s algorithms pushed unhealthy eating content to teen girls. Her testimony inspired KOSA and other legislation.

What’s next: Meta is likely to seek dismissal of the lawsuits by invoking its Section 230 legal protections. The use of state consumer protection laws against social media companies is still a relatively novel legal approach that will be tested in the federal and state courts. The lawsuits could take years to resolve, and in the interim could spur federal and state lawmakers to pass more legislation protecting children’s online safety.
