Colorado family sues Facebook parent company over daughter’s social media use
The mother of a 14-year-old Castle Rock girl who became addicted to social media sued the parent company of Facebook and Instagram earlier this month on the grounds the company deliberately designed addictive, dangerous products and failed to warn users of the potential pitfalls.
The federal lawsuit against Meta, filed Monday in U.S. District Court for the District of Colorado, is one of at least eight lawsuits with similar claims across the country brought this month by an Alabama law firm. Attorney Clinton Richardson alleges Meta is liable for product liability, including design defects, manufacturing defects and a failure to warn users of social media’s dangers.
“Overall, this is really about accountability,” Richardson said. “We want them to be held accountable for what they are doing and what is perpetuating a mental health crisis in the United States. Facebook has put its business model of profit-at-all-costs above the well-being of young people.”
The lawsuit relies on a largely untested legal argument that’s “way out on the frontier,” said Denver attorney Randy Barnhart.
“This is a very unusual and interesting case,” Barnhart said. “Typically when we think of product liability, we think of an object, a thing — a car, a tire, a room heater. Here, it appears Facebook is selling a service. And therefore I think the issue of whether or not it is a proper product liability claim is an open question… I don’t know of a case that has dealt with the issue of whether or not a service can be a product for the purposes of product liability litigation.”
Richardson argues in the complaints that Meta knew teenagers, in particular, were vulnerable to excessive social media use, and yet intentionally designed its platforms to “exploit” young users by encouraging them to spend more and more time on the social media sites, using mechanisms such as “likes,” displaying three dots when another user is typing a message, and curating feeds to keep users logged in.
“All told, Meta’s algorithm optimizes for angry, divisive and polarizing content because it’ll increase its number of users and the time users stay on the platform per viewing session, which thereby increases its appeal to advertisers, thereby increasing its overall value and profitability,” reads the complaint in the Colorado case.
For teenage social media users, platforms like Instagram worsen self-esteem, body image and bullying, the complaint contends. Soon after the 14-year-old Castle Rock girl opened her social media accounts, her “interest in any activity other than viewing and posting on the Meta platforms progressively declined,” the lawsuit alleges.
She slept little as the addiction worsened, and eventually engaged in self-harm, developed an eating disorder and attempted suicide, according to the lawsuit. The Denver Post is not identifying the girl or her mother because the girl is a minor. The family declined to comment through Richardson.
A spokeswoman for Instagram declined to comment on the case Thursday, but Meta has previously denied that the company put profits over safety, saying last year that it expected to spend $5 billion on safety and security in 2021 and that it employs about 40,000 people focused on user safety.
Fort Collins attorney Tom Metier said the lawsuit raises “viable” arguments.
“There’s a pattern, according to the complaint… (of the company recognizing) what will make Meta more popular and therefore drive more profits in advertising dollars, and at some point, and apparently many points, it’s alleged the choice was made to create harm in exchange for profit,” he said. “And so there’s an intentionality that could be devastating for Meta.”
He added that in most product liability cases, manufacturers of physical products are required to identify the strengths and weaknesses of their products and consider what harm the products could cause. Companies have a duty to make reasonably safe products, and when a product can’t be made physically safe, companies must warn consumers about the “truth of the dangers,” he said.
Similarly, parents who did not grow up using Instagram and Facebook need to be told about the actual psychological danger of the platforms, he said.
“Saying, ‘You should monitor your children’s use of their computers and cellphones and social media use’ is completely inadequate,” he said. “Because that doesn’t tell you the kind of information you need to know about suicide rates, self-abuse, many many things that occur as a result.”
The lawsuits grew out of Facebook whistleblower Frances Haugen’s testimony before Congress last year, Richardson said.
Haugen claimed that the company’s internal research showed Instagram, a photo-sharing platform, worsened mental health particularly for girls on the site, leading to body-image problems and in some cases eating disorders or suicidal thoughts. She backed up her reports with tens of thousands of pages of documents she copied before leaving her job at Facebook, where she worked in the company’s civic integrity unit.
Richardson said he expects to file “dozens” more such lawsuits against Meta.