Warning: This article includes descriptions of self-harm.
The families of two teenage boys who died by suicide filed a lawsuit Wednesday against Meta, alleging that the tech company has ignored the rising danger of sexual blackmail schemes targeting teens on Instagram.
The families, who are from Pennsylvania and Scotland, said in the lawsuit that their boys fell victim to the same type of “sextortion” scam thousands of miles apart: A stranger contacts a teen on Instagram pretending to be a romantic interest, solicits nude or intimate photos and then threatens to share the images with friends and family unless the teen shares more or pays an extortion fee.
The lawsuit is the latest attempt to hold Instagram accountable for what some users say are lethal scams taking place on its platform. Meta, which owns Instagram, is facing at least four other sextortion-related lawsuits claiming that Instagram is a defective product and that the company was negligent in failing to address the sextortion problem for years.
The lawyers behind the latest suit said they plan to draw on a newly available trove of internal corporate records from Meta to bolster their case.
“This was known,” Matthew Bergman, the families’ lead lawyer, said in an interview. “This was not an accident. This was not a coincidence. This was a foreseeable consequence of the deliberate design decisions that Meta made. Their own documents show that they were very aware of this extortion phenomenon, and they simply chose to put their profits over the safety of young people.”
A representative for Meta said the company was working on a response to the lawsuit.
In a statement last year on the issue of sextortion, Instagram said: “Sextortion is a horrific crime. We work aggressively to fight this abuse and support law enforcement in investigating and prosecuting the criminals behind it.” The company also said then that it was working to help protect people from sextortion.

While Instagram has made some design changes for minors in recent years in an attempt to address extortion, the lawsuit argues that the changes came too late and that Instagram should be held responsible for the two teens’ deaths.
Teenagers’ treatment at the hands of social media companies — and of Instagram in particular — has become a major flashpoint in society. A movement has sprung up of parents who say Instagram and other apps contributed to their children’s suicides or exploitation, and members of Congress are investigating Meta’s AI chatbots over child safety concerns. Last year, Meta CEO Mark Zuckerberg told parents who were in the audience at a Senate hearing on child online safety that he was “sorry for everything that you have all gone through” and that Meta would “continue doing industry-leading efforts to make sure that no one has to go through the types of things that your families have had to suffer.”
This month, Australia became the first country to ban people under 16 from social media.
The lawsuit stems from two deaths: Levi Maciejewski, from Shippensburg, Pennsylvania, who died last year at age 13; and Murray Dowey, from Dunblane, Scotland, who died in 2023 at 16.
Tricia Maciejewski, Levi’s mother, said in an interview that, until his death, she did not think it was possible for a total stranger to message a teenager on Instagram.
