Social Media Giant Under Fire in Child Safety Lawsuit

Children's online safety has been a key topic of debate for the past six months, with giant social media companies like ByteDance, Meta, Snap, and Alphabet now facing heat. These companies have been taken to court numerous times over concerns that their platforms harm mental health and are allegedly addictive and unsafe for children.

Forty-two U.S. states and various school districts have filed lawsuits alleging that social media platforms cause mental and psychological harm to minors. The litigation addresses the serious impact that Instagram, Facebook, and other sites allegedly have on the social and psychological well-being of America's youth, consolidating more than 140 individual cases filed against the platforms.

Recently, U.S. District Judge Yvonne Gonzalez Rogers denied a request to dismiss the lawsuits, which allege that the platforms' addictive design harms children. Most of these suits were filed by school districts and states across the country.

Together, the litigation spans more than 100 lawsuits filed since early 2022, after Facebook whistleblower Frances Haugen revealed internal research on Instagram's negative effects on teen mental health.

Arturo Béjar, another Meta whistleblower, pointed to the company's policies and said the platform was fully aware of the harm it was doing to children but failed to act. According to Béjar, Meta offers users "placebo" tools that do not address the issues affecting teenagers, and he claimed the company misrepresents how frequently its users, especially children, experience harm.


The lawsuits focus primarily on applying product liability law to online platforms, demanding better warnings and design changes. The ruling found that Instagram, Facebook, YouTube, Snapchat, and TikTok could be held liable despite Section 230 of the Communications Decency Act and the First Amendment.

Section 230 states that online platforms should not be treated as the publishers of content posted by third parties. This means social media companies generally cannot be held liable when users post illegal or disturbing material. The major tech companies sought immunity under this very clause.

However, Judge Rogers rejected the companies' bid for blanket immunity under Section 230. The court held that the platforms are responsible for their own design choices, including the failure to provide adequate parental controls that parents could use to limit their children's screen time.

In her ruling, Judge Rogers added that the plaintiffs' claims do not concern free speech or expression. Rather, they relate to issues such as the lack of robust age verification, inadequate parental controls, and the difficulty of deleting accounts.

Plaintiffs added that the mental health problems stem not from content but from design features. Judge Rogers wrote, "Addressing these deficiencies need not change how or what speech Defendants disseminate."

It is rare for this many states to cooperate in suing tech giants over consumer harm. Such coordinated efforts show that states are taking threats to children seriously and pooling their resources to fight social media platforms, just as they once fought big pharma and big tobacco.

Lawmakers around the world are also pushing to regulate children's use of Instagram, Facebook, and other platforms.

In the last few years, Utah, California, and the United Kingdom have passed laws to strengthen privacy and safety protections for young people. Utah, for example, passed a law that automatically turns off social media notifications for children overnight to avoid disrupting their sleep. Online child safety lawsuits in the U.S., however, are proceeding slowly as tech giants work hard to get them dismissed.

Recently filed court documents allege that Meta CEO Mark Zuckerberg rejected various internal efforts to make the platform safer for children and teens. Google spokesman Jose Castañeda countered that the claims against his company were false, stating that Google offers age-appropriate content for families and children on YouTube and provides robust parental controls.

Other platforms have yet to respond. In recent years, many lawsuits have been filed against social media platforms for harming children, but many of them, including one over harassment on Grindr, gained little traction in court and were dismissed.

Numerous recent studies have shown that online platforms can damage mental health, and lawmakers are under pressure to pass child-protection measures such as age verification requirements. While it is not yet clear whether online platforms are legally liable for such harm, this lawsuit may open the door to stronger safety claims in the future.
