California judge rules Snap must face lawsuit over children's fentanyl purchases
A California state judge has ruled that Snap must face a lawsuit from families whose children allegedly used the Snapchat app to purchase fentanyl.
Tuesday's ruling by Los Angeles County Superior Court Judge Lawrence Riff permits a dozen claims to proceed against Snap, brought by relatives of young people who allegedly suffered fentanyl overdoses, many of which proved fatal.
The decision could ultimately weaken the tech industry's legal immunity shield, known as Section 230, and expose social media platforms to even more lawsuits.
Riff's decision allowing the case to move forward means Snap will have to fight allegations ranging from product defects to negligence and wrongful death.
The case is the latest lawsuit to effectively circumvent Section 230, a broad legal shield that tech companies have used for decades to nip content moderation suits in the bud. The law has also come under frequent attack from politicians of both parties in recent years.
Tech platforms and internet speech experts have credited Section 230 for the rise of the modern internet, arguing that it facilitated the creation of email, web forums, review sites and e-commerce by raising the bar for lawsuits against tech startups and individual internet users.
In siding with the family plaintiffs, however, Riff said that Section 230 of the Communications Decency Act does not apply, and the lawsuit cannot be thrown out under the law, because the case does not seek to hold Snap accountable for the content made by third-party drug dealers.
Instead, Riff wrote in the decision, Snap can be sued because the lawsuit goes after product and business decisions that are "independent … of the drug sellers' posted content."
In a statement, Snap said it works closely with law enforcement to investigate violations of its anti-drug policies, and deploys technology to proactively detect drug dealers' activity.
It said it would keep fighting the plaintiffs' claims, arguing that they are "both legally and factually flawed."
In November, a federal judge dealt another potential blow to Section 230 by allowing other product liability claims to move ahead against Google, Meta, Snap and TikTok.
That lawsuit alleges, among other things, that the tech companies have contributed to a youth mental health crisis by failing to implement effective parental controls in their apps and by promoting the use of image filters that alter a user's appearance in allegedly harmful ways.
If successful, the lawsuits could also lift the prospects for a bevy of similar claims filed against Meta by dozens of state attorneys general. The states allege Meta harmed the mental health of teens through features such as persistent mobile notifications that keep users hooked on its apps.