Waukegan Public Schools is going public with plans to join a group of other school districts in class-action lawsuits against some of the biggest social media companies in the world, alleging that the policies and content on those sites are harming students.
According to officials in District 60, the plan is to join suits against Meta, Snapchat, TikTok and several other companies, asserting that the platforms do not do enough to protect their youngest users from harmful content.
“Whether it be threats, bullying, harassment, (or) an increase in mental health issues with students,” attorney William Shinoff of the Frantz Law Group said. “We have given enough time at this point for DC and legislation.”
Waukegan’s school district is one of hundreds filing suits in both state and federal court against social media companies. Multi-district litigation is not a new tactic for Waukegan school officials, as the district also settled similar cases against JUUL Labs over the marketing of their vaping products.
“A big part of why these districts are doing this is they want to go and step up on behalf of children and get the change that is necessary,” Shinoff said.
The suits will also seek compensatory damages, which would be used to fund mental health programs within the school district.
A spokesperson for Meta, the company that operates Facebook, Instagram, and other platforms, says it recognizes the need for mental health treatment for young residents.
“Teen health is a complex issue,” a Meta spokesperson said. “To help teens who may be struggling, we all need a greater appreciation for the many issues they face in their daily lives. Reports from the CDC and others point to growing academic pressure, concerns of safety in schools, the lingering impact of the pandemic and limited access to mental health care as key factors. We want to work with schools and academic experts to better understand these issues, and how social media can provide teens with support when they need it, in a way that acknowledges the full picture.”
Google expressed similar sentiments, and also argued that any allegations that it has insufficient controls in place to protect children from inappropriate content are “simply not true.”
“Protecting kids across our platforms has always been core to our work,” a spokesperson for Google said. “In collaboration with child development specialists, we have built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls. The allegations in these complaints are simply not true.”
Snapchat officials say the company “vets all content before it can reach a large audience,” and that such practices help to keep children away from “potentially harmful material.”
“Snapchat was designed differently from other social media platforms because nothing is more important to us than the well-being of our community,” a spokesperson said. “Our app opens directly to a camera rather than a feed of content that encourages passive scrolling, and is primarily used to help real friends communicate.”