  • FINN LATICI-MCAVOY, , ET AL. VS META PLATFORMS, INC., ET AL. Product Liability (not asbestos or toxic/environmental) (General Jurisdiction) document preview
Joseph G. VanZandt (pro hac vice)
Sydney Everett (pro hac vice)*
BEASLEY ALLEN CROW METHVIN PORTIS & MILES, LLC
234 Commerce Street
Montgomery, AL 36103
Tel: 334-269-2343
Email: Joseph.VanZandt@BeasleyAllen.com
Email: Sydney.Everett@BeasleyAllen.com

C. Brooks Cutter, SBN 121407
Jennifer S. Domer, SBN 305822
Margot P. Cutter, SBN 306789
CUTTER LAW P.C.
401 Watt Avenue
Sacramento, CA 95864
Tel: (916) 290-9400
Email: bcutter@cutterlaw.com
Email: jdomer@cutterlaw.com
Email: mcutter@cutterlaw.com

Attorneys for Plaintiffs
*PRO HAC VICE APPLICATION(S) FORTHCOMING

SUPERIOR COURT OF THE STATE OF CALIFORNIA
FOR THE COUNTY OF LOS ANGELES

Finn Latici-McAvoy; Madison Cosgrove; Tanya Allen on behalf of C.S.; Celesta Palmer on behalf of N.P.; and Christie Perez on behalf of A.P.,

Plaintiffs,

v.

META PLATFORMS, INC., a Delaware Corporation; FACEBOOK HOLDINGS, LLC, a Delaware Limited Liability Company; FACEBOOK OPERATIONS, LLC, a Delaware Limited Liability Company; INSTAGRAM, LLC, a Delaware Limited Liability Company; SNAP INC., a Delaware Corporation; TIKTOK, INC., a California Corporation; BYTEDANCE, INC., a Delaware Corporation; GOOGLE LLC, a Delaware Limited Liability Company; YOUTUBE LLC, a Delaware Limited Liability Company; and DOES 1 through 100, inclusive,

Defendants.

CASE NO.

COMPLAINT FOR DAMAGES AND DEMAND FOR JURY TRIAL

Plaintiffs Finn Latici-McAvoy, Madison Cosgrove, C.S., N.P., and A.P. ("Plaintiffs"), the Minors by and through their respective Guardians ad Litem, allege upon personal knowledge and, as to all other matters, upon information and belief based upon, inter alia, the investigation made by and through their attorneys, as follows:

I. INTRODUCTION

1.
This matter arises from an egregious breach of the public trust by Defendants Meta Platforms, Inc. (hereinafter, "Meta Platforms"), Facebook Holdings, LLC (hereinafter, "Facebook 1"), Facebook Operations, LLC (hereinafter, "Facebook 2"), Instagram, LLC (hereinafter, "Instagram," and collectively with Meta Platforms, Facebook 1, and Facebook 2, "Meta"); Snap Inc. (hereinafter, "Snap"); YouTube LLC and Google LLC (hereinafter, collectively, "YouTube"); TikTok, Inc. (hereinafter, "TikTok") and ByteDance, Inc. (hereinafter, "ByteDance," and collectively with TikTok, Snap, and YouTube, "Non-Meta"); and DOES 1-100 (collectively with "Non-Meta" and "Meta," "Defendants").

2. Over the last two decades, more and more of our lives have moved onto social media platforms and other digital public spaces. In this vast, still largely unregulated universe of digital public spaces, which are privately owned and primarily run for profit, there exists a tension between what is best for technology companies' profit margins and what is best for the individual user (especially the adolescent user) and society. Business models are often built around maximizing user engagement without regard to whether users engage with the platform and one another in safe and healthy ways. Technology companies focus on maximizing time spent, not time well spent. In recent years, there has been growing concern about the impact of digital technologies, particularly social media, on the mental health and wellbeing of adolescents. Many researchers argue that Defendants' social media products facilitate cyberbullying, contribute to obesity and eating disorders, instigate sleep deprivation to achieve around-the-clock platform engagement, encourage children to negatively compare themselves to others, and foster a broad discontentment for life.
They have been connected to depression, anxiety, self-harm, and ultimately suicidal ideation, suicide attempts, and completed suicide.

3. Defendants have intentionally designed their products to maximize users' screen time, using complex algorithms designed to exploit human psychology, driven by advanced computer algorithms and artificial intelligence available to two of the largest technology companies in the world. Defendants have progressively modified their products to promote problematic and excessive use that they know threatens the actuation of addictive and self-destructive behavioral patterns.

4. Excessive screen time is harmful to adolescents' mental health, sleep patterns, and emotional well-being. Defendants' products lack any warnings that foreseeable product use can disrupt healthy sleep patterns, or specific warnings to parents when their child's product usage exceeds healthy levels or occurs during sleep hours, rendering the platforms unreasonably dangerous. Reasonable and responsible parents are not able to accurately monitor their child's screen time because most adolescents own or can obtain access to mobile devices and engage in social media use outside their parents' presence.

5. Defendants do not charge their users to use their platforms but, instead, receive money from advertisers who pay a premium to target advertisements to specific categories of people as studied and sorted by Defendants' algorithms. Thus, Defendants generate revenue based upon the total time spent on the application, which directly correlates with the number of advertisements that can be shown to each user.

6.
Rather than making meaningful changes to safeguard the health and safety of their users, Defendants have consistently chosen to prioritize profit over safety by continuing to implement and require their users to submit to product components that increase the frequency and duration of users' engagement, resulting in the pernicious harms described in greater detail below.

7. Plaintiffs bring claims of strict liability based upon Defendants' defective design of their social media products, which renders such products not reasonably safe for ordinary consumers in general and minors in particular. It is technologically feasible to design social media products that substantially decrease the incidence and magnitude of harm to ordinary consumers and minors arising from their foreseeable use of Defendants' products, with a negligible increase in production cost. It is also technologically feasible to design and implement effective age and identity verification protocols to ensure that only children of an appropriate age may have access to these products.

8. Plaintiffs also bring claims for strict liability based on Defendants' failure to provide adequate warnings to minor users and their parents of the danger of the mental, physical, and emotional harms arising from the foreseeable use of their social media products.

9. Plaintiffs also bring claims for common law negligence arising from Defendants' unreasonably dangerous social media products and their failure to warn of such dangers. Defendants knew or, in the exercise of ordinary care, should have known that their social media products were harmful to a significant percentage of their minor users and failed to re-design their products to ameliorate these harms or warn minor users and their parents of dangers arising out of the foreseeable use of their products.
Defendants intentionally created an attractive nuisance to children, but simultaneously failed to provide adequate safeguards from the harmful effects they knew were occurring.

10. As is now generally known, in Fall 2021, Frances Haugen, a former Facebook employee turned whistleblower, came forward with internal documents showing that Meta was aware that its platforms and products cause significant harm to its users, especially our children. Non-Meta Defendants' products—their social media platforms—have similar designs and mechanisms of action resulting in similar addictive qualities and harmful outcomes to minor users. To this day, the addictive qualities of Defendants' products and their harmful algorithms are not fully known or understood by minor users or their parents.

II. JURISDICTION AND VENUE

11. This Court has jurisdiction over this action pursuant to CAL. CODE CIV. PRO. §§ 395 and 410.10.

12. This Court has subject matter jurisdiction over all causes of action alleged in this complaint pursuant to CAL. CONST. art. VI, § 10, and this is a court of competent jurisdiction to grant the relief requested. Plaintiffs' claims arise under the laws of the State of New York, are not preempted by federal law, do not challenge conduct within any federal agency's exclusive domain, and are not statutorily assigned to any other trial court.

13. This Court has general personal jurisdiction over Defendants because each is headquartered and has its principal place of business in the State of California and has continuous and systematic operations within the State of California.

14. This Court also has specific personal jurisdiction over Defendants because they actively do business in Los Angeles County and the State of California.
Defendants have purposely availed themselves of the benefits, protections, and privileges of the laws of the State of California through the design, development, programming, manufacturing, promotion, marketing, and distribution of the products at issue and have purposely directed their activities toward this state. Defendants have sufficient minimum contacts with this state to render the exercise of jurisdiction by this Court permissible.

15. Venue is proper in Los Angeles Superior Court pursuant to CAL. CODE CIV. PRO. §§ 395 and 395.5 because Defendants regularly conduct business in, and certain of Defendants' liability arose in, Los Angeles County.

III. PARTIES

Plaintiffs

16. Plaintiff Finn Latici-McAvoy is a resident of New York, New York. Plaintiff hereby asserts the causes of action enumerated in Section VI, below (the "Causes of Action"), against all named Defendants.

17. Plaintiff Madison Cosgrove is a resident of Selden, New York. Plaintiff hereby asserts the causes of action enumerated in Section VI, below, against all named Defendants.

18. Plaintiff C.S., a Minor, is a resident of Deposti, New York. C.S. is the daughter of Tanya Allen. Plaintiff hereby asserts the causes of action enumerated in Section VI, below, against Defendants Meta, Snap, TikTok and ByteDance.

19. Plaintiff N.P., a Minor, is a resident of Ripley, New York. Plaintiff is the daughter of Celesta Palmer. Plaintiff hereby asserts the causes of action enumerated in Section VI, below, against Defendants Meta, Snap, and YouTube.

20. Plaintiff A.P., a Minor, is a resident of New York, New York. Plaintiff is the daughter of Christie Perez. Plaintiff hereby asserts the causes of action enumerated in Section VI, below, against Defendants Meta and Snap.

Defendant Meta Platforms, Inc.

21.
Meta Platforms is a multinational technology conglomerate, having its principal place of business in Menlo Park, California. Meta develops and maintains social media platforms, communication platforms, and electronic devices.1 Meta Platforms was originally incorporated in Delaware on July 29, 2004, as "TheFacebook, Inc." On September 20, 2005, the company changed its name to "Facebook, Inc." On October 28, 2021, the company assumed its current designation. While Plaintiffs have attempted to identify the specific Meta Platforms subsidiary(ies) that committed each of the acts alleged in this Complaint, Plaintiffs were not always able to do so, in large part due to ambiguities in Meta Platforms' and its subsidiaries' own documents and public representations, and the lack of public information. However, upon information and belief, Meta Platforms oversees the operations of its various platforms and subsidiaries, some of which have been identified and are listed below. For this reason, unless otherwise specified, the shorthand "Meta" contemplates the apparent control that Meta Platforms wields over its subsidiaries' overall operations and, therefore, further refers to its various subsidiaries and predecessors. To the extent this assumption is incorrect, knowledge of which Meta Platforms subsidiary, current or former, is responsible for specific conduct is knowledge solely within Meta's possession, the details of which Plaintiffs should be permitted to elucidate during the discovery phase.

22. Meta Platforms' subsidiaries include but may not be limited to: Facebook 1 (Delaware); Facebook 2 (Delaware); Facebook Payments, Inc. (Florida); Facebook Technologies, LLC (Delaware); FCL Tech Limited (Ireland); Instagram (Delaware); Novi Financial, Inc.
(Delaware); Runways Information Services Limited (Ireland); Scout Development LLC (Delaware); Siculus (Delaware); and a dozen other entities whose identity or relevance is presently unclear.

1 These platforms and products include Facebook (its self-titled app, Messenger, Messenger Kids, Workplace, Marketplace, etc.), Instagram (and its self-titled app), and a line of electronic virtual reality devices called Oculus Quest (soon to be renamed "Meta Quest").

Subsidiary Meta Defendants

23. Facebook 1 was incorporated in Delaware on March 11, 2020, and is a wholly owned subsidiary of Meta Platforms. Facebook 1 is primarily a holding company for entities involved in Meta Platforms' supporting and international endeavors, and its principal place of business is in Menlo Park, California. Meta Platforms is the sole member of this LLC Defendant.

24. Facebook 2 was incorporated in Delaware on January 8, 2012, and is a wholly owned subsidiary of Meta Platforms. Facebook 2 is likely a managing entity for Meta Platforms' other subsidiaries, and its principal place of business is in Menlo Park, California. Meta Platforms is the sole member of this LLC Defendant.

25. Instagram was founded by Kevin Systrom and Mike Krieger in October 2010. In April 2012, Meta Platforms purchased the company for $1 billion (later statements from Meta Platforms have indicated the purchase price was closer to $2 billion). Meta Platforms reincorporated the company on April 7, 2012, in Delaware. Currently, the company's principal place of business is in Menlo Park, CA. Instagram is a social media platform tailored for photo and video sharing. Meta Platforms is the sole member of this LLC Defendant.

26.
By admission, Facebook and Instagram are products (Meta's Vice President of Messaging Products Loredana Crisan, Celebrating 10 Years of Messenger With New Features (August 25, 2021, last visited July 29, 2022, at 1:10 PM CST), https://about.fb.com/news/2021/08/messenger-10th-birthday/), the safety of which was not duly addressed prior to public distribution (Our Progress Addressing Challenges and Innovating Responsibly (September 21, 2021, last visited July 29, 2022, at 1:17 PM CST), https://about.fb.com/news/2021/09/our-progress-addressing-challenges-and-innovating-responsibly/).

27. Meta knowingly exploited its most vulnerable users—children worldwide—to drive corporate profit. Meta operates the world's largest family of social networks, enabling billions of users worldwide to connect, view, and share content through mobile devices, personal computers, and virtual reality headsets. A user does not have to pay to create an account. Instead of charging account holders to access the platform, Meta became one of the world's most valuable companies through the sale of advertisement placements to marketers across its various platforms and applications. For example, upon information and belief, Meta generated $69.7 billion from advertising in 2019, more than 98% of its total revenue for the year. Meta can generate such revenues by marketing its user base to advertisers. Meta collects and analyzes data to assemble virtual dossiers on its users, covering hundreds if not thousands of user-specific data segments. This data collection and analysis allows advertisers to micro-target advertising and advertising dollars to very specific categories of users, who can be segregated into pools or lists using Meta's data segments.
Only a fraction of these data segments come from content that is explicitly designated by users for publication or explicitly provided by users in their account profiles. Many of these data segments are collected by Meta through surveillance of each user's activity on the platform and off the platform, including behavioral surveillance that users are not even aware of, like navigation paths, watch time, and hover time. At bottom, the larger Meta's user database grows, the more time the users spend on the platforms, and the more detailed information that Meta can extract from its users, the more money Meta makes.

28. As of October 2021, Facebook had roughly 2.91 billion monthly active users, thus reaching 59% of the world's social networking population, the only social media platform to reach over half of all social media users. Instagram has become the most popular photo-sharing social media platform amongst teenagers and young adults in the United States, with over 57 million users below the age of eighteen, meaning that 72 percent of America's youth use Instagram. Eleven percent of parents in the U.S. know their child between the ages of 9 and 11 uses Instagram.2 Likewise, 6 percent of parents in the U.S. know their child between the ages of 9 and 11 uses Facebook.3

29. Two Meta products, the www.Facebook.com ("Facebook") and www.Instagram.com ("Instagram") websites and respective interrelated apps (collectively "Meta 2"), rank among the most popular social networking products, with more than two billion combined users worldwide.

2 Katherine Schaeffer, 7 facts about Americans and Instagram, Pew Research Center (Oct. 7, 2021), https://www.pewresearch.org/fact-tank/2021/10/07/7-facts-about-americans-and-instagram/.
3 Id.
It is estimated that nine out of ten teens use social media platforms, with the average teen using the platforms roughly three hours per day. Given the delicate, developing nature of the teenage brain and Meta's creation of social media platforms designed to be addictive, it comes as no surprise that we are now grappling with the ramifications of Meta's growth-at-any-cost approach, to wit, a generation of children physiologically entrapped by products whose effects collectively result in a long-lasting adverse impact on their rapidly evolving and notoriously precarious mental health.

30. Meta, as originally conceived, ostensibly functioned like an enormous virtual bulletin board, where content was published by authors. But Meta has evolved over time with the addition of numerous features and products designed by Meta to engage users. The earliest of these—the search function and the "like" button—were primarily user-controlled features. In more recent years, however, Meta has taken a more active role in shaping the user experience on the platform with more complex features and products. The most visible of these are curated recommendations, which are pushed to each user in a steady stream as the user navigates the website, and in notifications sent to the user's smartphone and email addresses when the user is disengaged from the platform. These proprietary Meta products include News Feed (a newsfeed of stories and posts published on the platform, some of which are posted by your connections, and others that are suggested for you by Meta), People You May Know (introductions to persons with common connections or background), Suggested for You, Groups You Should Join, and Discover (recommendations for Meta groups to join). These curated and bundled recommendations are developed through sophisticated algorithms.
As distinguished from the earliest search functions that were used to navigate websites during the Internet's infancy, Meta's algorithms are not based exclusively on user requests or even user inputs. Meta's algorithms combine the user's profile (e.g., the information posted by the user on the platform) and the user's dossier (the data collected and synthesized by Meta, to which Meta assigns categorical designations), make assumptions about that user's interests and preferences, make predictions about what else might appeal to the user, and then make very specific recommendations of posts and pages to view and groups to visit and join, based on rankings that will optimize Meta's key performance indicators.

31. A user's "feed" on both Facebook and Instagram is comprised of an endless series of photos, videos, text captions, and comments posted by accounts that the user follows, along with advertising and content specifically selected and promoted by Instagram and Facebook.

32. Instagram also features a "discover" page where a user is shown an endless feed of content that is selected by an algorithm designed by Instagram based upon the user's data profile: demographics, prior activity on the platform, and other data points. Meta has added similar features to Facebook in the app's "menu" and "watch" sections.

33. Engineered to meet the evolving demands of the "attention economy,"4 a term used to describe the supply and demand of a person's attention, which is a highly valuable commodity for internet websites, in February 2009, Meta introduced perhaps its most conspicuous effort to addict users—intermittent variable rewards ("IVR"): its "Like" button. Instagram launched that same year and came ready-made with a like function shaped as a heart.
Additional features of Meta's IVR include its delay-burst notification system, comments, posts, shares, and other dopamine-triggering content. Instagram's notification algorithm delays notifications to deliver them in spaced-out, larger bursts. Facebook likely uses a similar feature. These designs take advantage of users' dopamine-driven desire for social validation and optimize the balance of negative and positive feedback signals to addict users.

34. IVR is a method used to addict a user to an activity by spacing out dopamine-triggering stimuli with dopamine gaps—a method that allows anticipation and craving to develop and strengthens the addiction with each payout. The easiest way to understand this term is by imagining a slot machine. You pull the lever (intermittent action) with the hope of winning a prize (variable reward). In the same way, you refresh Defendants' feeds, endure the brief delay, and then learn if anyone has tagged you in a photo, mentioned you in a post, sent you a message, or liked, commented on, or shared either of your posts. As explained below, Meta (and, upon information and belief, all Defendants) space out notifications into multiple bursts (dopamine gaps) rather than notifying users in real time, to maximize the platforms' addictiveness.

4 The business model is simple: The more attention a platform can pull from its users, the more effective its advertising space becomes, allowing it to charge advertisers more.

35. Over the past decade or so, Meta has added features and promoted the use of auto-playing short videos and temporary posts on Facebook and Instagram, with the former being referred to as "Reels," while the latter is referred to as Instagram "Stories."

36.
Facebook and Instagram notify users by text and email of activity that might be of interest, which is designed to and does prompt users to open Facebook and Instagram and be exposed to content selected by the platforms to maximize the length of time and amount of content viewed by the user. Facebook and Instagram include many other harm-causing features, as discussed below.

37. Equipped with ample information about the risks of social media, the ineffectiveness of its age-verification protocols, and the mental processes of teens, Meta has expended significant effort to attract preteens to its products, including substantial investments in designing products that would appeal to children ages 10-to-12. Meta views preteens as a valuable, unharnessed commodity, so valuable that it has contemplated whether there is a way to engage children during play dates.5 Meta's unabashed willingness to target children—in the face of its conscious, long-standing, plainly deficient age-verification protocols—demonstrates the depths to which Meta is willing to reach to maintain and increase its profit margin.

38. Faced with the potential for reduction in value due to its declining number of users, in or around early 2018, Meta (and likely Meta 2) revamped its interface to transition away from chronological ranking, which organized the interface according to when content was posted or sent, to prioritize Meaningful Social Interactions, or "MSI," which emphasizes users' connections' interactions (e.g., likes and comments) and gives greater significance to the interactions of connections that appeared to be the closest to users.
To effectuate this objective, Facebook developed and employed an "amplification algorithm" to execute engagement-based ranking, which considers a post's likes, shares, and comments, as well as a respective user's past interactions with similar content, and exhibits the post in the user's newsfeed if it otherwise meets certain benchmarks. The algorithm covertly operates on the proposition that intense reactions invariably compel attention. As it measures reactions and contemporaneously immerses users in the most reactive content, and negative content routinely elicits passionate reactions, the algorithm effectively works to steer users toward the most negative content.

5 Georgia Wells and Jeff Horwitz, Facebook's Effort to Attract Preteens Goes Beyond Instagram Kids, Documents Show, WSJ (2021), https://www.wsj.com/articles/facebook-instagram-kids-tweens-attract-11632849667.

39. Meta CEO Zuckerberg publicly recognized this in a 2018 post, in which he demonstrated the correlation between engagement and sensational content that is so extreme that it impinges upon Meta's own ethical limits, with the following chart:6

[Chart not reproduced.]

40. The algorithm controls what appears in each user's News Feed and promotes content that is objectionable and harmful to many users.
In one internal report, Meta concluded that "[o]ur approach has had unhealthy side effects on important slices of public content, such as politics and news," with one data scientist noting that "[t]his is an increasing liability." In other internal memos, Meta concluded that because of the new algorithm, "[m]isinformation, toxicity, and violent content are inordinately prevalent." Other documents show that Meta employees also discussed Meta's motive for changing its algorithm—namely, that users began to interact less with the platform, which became a worrisome trend for Meta's bottom line. Meta found that the inflammatory content that the new algorithm was feeding to users fueled their return to the platform and led to more engagement, which, in turn, helped Meta sell more of the digital ads that generate most of its revenue. All told, Meta's algorithm optimizes for angry, divisive, and polarizing content because such content increases its number of users and the time users stay on the platform per viewing session, which increases its appeal to advertisers, thereby increasing its overall value and profitability.

6 Mark Zuckerberg, A Blueprint for Content Governance and Enforcement, FACEBOOK, https://www.facebook.com/notes/751449002072082/ (last visited January 8, 2022).

41. Upon information and belief, at least as far back as 2019, Meta initiated, inter alia, a Proactive Incident Response experiment, which began researching the effect of Meta on the mental health of today's youth.7 Meta's own in-depth analyses show significant mental-health issues stemming from the use of Instagram among teenage girls, many of whom linked suicidal thoughts and eating disorders to their experiences on the app.8 Meta's researchers have repeatedly found that Instagram is harmful for a sizable percentage of teens that use the platform.
In an internal presentation from 2019, Meta researchers concluded that "[w]e make body issues worse for one in three teen girls," and "[t]eens blame Instagram for increases in the rate of anxiety and depression." Similarly, in a March 2020 presentation posted to Meta's internal message board, researchers found that "[t]hirty-two percent of teen girls said that when they feel bad about their bodies, Instagram made them feel worse." Sixty-six percent of teen girls and forty-six percent of teen boys have experienced negative social comparisons on Instagram. Thirteen-and-one-half percent of teen-girl Instagram users say the platform makes thoughts of "suicide and self-injury" worse. Seventeen percent of teen-girl Instagram users say the platform makes "[e]ating issues" worse. Instagram users are twice as likely to develop an eating disorder as those who don't use social media.

42. Meta is aware that teens often lack the ability to self-regulate. Meta is further aware that, despite the platforms' adverse impact on teenage users' well-being, the absence of impulse control often renders teens powerless to oppose the platforms' allure. Meta is conscious of the fact that the platform dramatically exacerbates bullying and other difficulties prevalent within the high school experience, as the reach of the same now extends to users within the ideally otherwise safe confines of the home.

7 See Protecting Kids Online: Testimony from a Facebook Whistleblower, United States Senate Committee on Commerce, Science, & Transportation, Sub-Committee on Consumer Protection, Product Safety, and Data Security, https://www.c-span.org/video/?515042-1/whistleblower-frances-haugen-calls-congress-regulate-facebook.
8 See Wall Street Journal Staff, The Facebook Files, WSJ (2021), https://www.wsj.com/articles/the-facebook-files-11631713039?mod=bigtop-breadcrumb.
The advent of social media largely occurred after today’s parents became adults, the consequence being a large swath of parents who lack the context needed to appreciate the contemporary perils of Meta and Instagram and who are likewise ill-equipped to offer advice sufficient to effectively mitigate those perils.

43. The shift from chronological ranking to the algorithm modified the social networking environment in such a way that it created a new iteration of the Meta experience, one that is profoundly more negative, one that exploits some of the known psychological vulnerabilities of Facebook’s most susceptible patronage, to wit, juveniles, resulting in a markedly enlarged threat to the cohort’s mental health and the related frequency of suicidal ideation.

44. Meta professes to have implemented protective measures to counteract the well-established dangers of its sites’ customized, doggedly harmful content; however, its protocols apply only to content conveyed in English and remove only three-to-five percent of harmful content. Meta knows its quality-control and age-verification protocols are woefully ineffective but is either unwilling or incapable of properly managing its platforms. This is consistent with its established pattern of recognizing and subsequently ignoring the needs of its underage users and its obligation to create a suitable environment accessible only by its age-appropriate users, all in the interest of reaping obscene profit.

45. Instead of providing warnings at sign-up or during use, Meta provides no warning at all. Rather, the most accessible and complete information regarding the mental and physical health risks of Meta’s platforms comes from third parties.
Meta has a “Youth Portal” website that does not appear to be widely promoted by Meta or even recommended to teen users on its platforms.9 Although the website claims to be comprehensive in its coverage of safety information for the platforms, it fails to directly address any of the features or health risks listed above. The website states, “Welcome to our Youth Portal. Consider this your guide to all things Facebook: general tips, insider tricks, privacy and safety information, and everything else you need to have a great experience on Facebook. It’s also a space for you to hear from people your age, in their own voices, about the issues that matter to them online. Take a look around — these resources were made specifically for you, your friends, and your real-life experiences online and off.”10 The website merely provides instructional guides regarding mental health in general—it does not identify, warn, or take responsibility for the impact of the platform and its features on users’ mental health. Instead, it shifts blame to other factors, such as third parties posting “suicide challenges,” the general societal issue of substance abuse, and the COVID-19 pandemic.

9 Safety Center, Facebook, https://www.facebook.com/safety/youth (last visited Sept. 20, 2022).

46. The only content on the website that has a semblance of a warning for the issues listed above is a link to a “Family Digital Wellness Guide” created by the Boston Children’s Hospital Digital Wellness Lab. Buried in this guide is a mention that screens should not be used an hour before bed, because “[u]sing screens before bedtime or naptime can excite kids and keep them from falling asleep. The ‘blue light’ that comes from TVs and other screen devices can disrupt your child’s natural sleep cycle, making it harder for them to fall asleep and wake up naturally. . . .
[Late screen use can] result[ ] in your child getting less sleep and struggling to wake up on time. On average, school-age children need 9-12 hrs of sleep each night.”

47. The “Family Digital Wellness Guide” only alludes to the platforms’ manipulation, addictiveness, behavioral control, and data tracking of users: “Advertisers target children with lots of commercials, everything from sneakers and toys to unhealthy foods and snacks high in fat, sugar, and calories. Your children may also start becoming familiar with online influencers, who are also often paid to advertise different products and services on social media. Helping your child think critically about how advertising tries to change behaviors, helps your child understand the purpose of ads, and empowers them to make informed decisions.” The guide also briefly discusses cyberbullying.

10 Id.

48. The guide mentions the body image harms social media inflicts but solely blames influencers as the cause rather than the platforms’ algorithms and features, and asserts that the burden to remedy the issue is on parents rather than social media companies: “Science says: Tweens are often exposed to a lot of information online and through other media, both true and false, about how bodies ‘should’ look and what they can do to ‘improve’ their appearance. Certain body types are often idolized, when in reality bodies are incredibly diverse. There are many online accounts, websites, and influencers that make youth feel inadequate by encouraging them to lose weight or build up muscle, harming both their mental and physical health. . . . Protip: Actively listen and show that you care about how your child is feeling about puberty and how their body is changing.
Talk with them about images on social and other media as these often set unrealistic ideals, and help them understand that these images are often digitally altered or filtered so that people look more ‘beautiful’ than they really are.” No similar warning is offered to young users in Meta’s advertisements, at signup, or anywhere on the platform. Instead, Meta unreasonably and defectively leaves users to rely on their own research to become informed about the key dangers of its platforms.

49. This informational report is from a third party, not Meta. Meta merely links to this information on a “Youth Portal” website in a location that is difficult and time-consuming to find. The guide does not mention the strong role that Facebook’s and Instagram’s individual or collective algorithm(s) and features play in each of these harms. Furthermore, it is uncertain how long even this limited information has been made available by Meta.

50. On another Meta-created website that proposes to “help young people become empowered in a digital world,” its “Wellness” subpage lists five activities: “mindful breathing,” “finding support,” “building resilience: finding silver linings,” “a moment for me,” and “taking a break.”11 Nowhere does the website mention the mental health risks posed by Facebook and Instagram as a result of the product features listed above.

Defendant Snap Inc.

51. Snap is a Delaware corporation with its principal place of business in Santa Monica, California. Snap owns and operates the Snapchat social media platform, an application that is widely marketed by Snap and available to users throughout the United States. Snapchat is a platform for engaging in text, picture, and video communication. The platform also enables users to edit and disseminate content.
The app contains a discovery page and a TikTok-like short video feed that algorithmically presents endless content to users. The primary objective of the platform is to maximize the frequency and length of each user’s viewing sessions. Indeed, 59 percent of teenagers in the U.S. actively use Snapchat,12 and 22 percent of parents in the U.S. know their children between the ages of 9 and 11 use Snapchat.13

52. Snapchat was founded in 2011 by Reggie Brown, Evan Spiegel, and Bobby Murphy, three Stanford college students. It began as a simple application designed to allow a user to send a picture to a friend that would later disappear. Having gained only 127 users a few months after its launch, Snapchat began to market to high school students. Within the following year, Snapchat grew to more than 100,000 users.

53. Snapchat became well-known for the ephemeral nature of its content, which, in effect, removes all accountability for sent content. Specifically, Snapchat allows users to form groups and share posts or “Snaps” that disappear after being viewed by the recipients. However, Snapchat's social media product quickly evolved from there, as its leadership made design changes and rapidly developed new product features intended to, and successfully did, increase Snapchat’s popularity among minors.

11 Wellness, Meta, https://www.facebook.com/fbgetdigital/youth/wellness (last visited Sept. 20, 2022).
12 Vogels, et al., supra note 13.
13 Schaeffer, supra note 2.

54. In 2012, Snapchat added video capabilities to its product, pushing the number of Snaps to 50 million per day. It then added the “Stories” and “Chat” features in 2013; live video chat capabilities, text conversations, “Our Story,” Geofilters, and Snapcash in 2014; Discovery, QR code incorporation, and facial recognition software in 2015; and Memories and Snapchat Groups in 2016.

55.
Upon knowledge, information, and belief, formed after a reasonable inquiry under the circumstances, by 2015, advertisements were pervasive on Snapchat, and by 2018, 99% of Snapchat’s total revenue came from advertising. Like Meta and Defendants in general, Snapchat decided to monetize its userbase and changed its product in ways that made it more harmful for users yet resulted in increased engagement and profits for Snapchat. By 2015, Snapchat had over 75 million active users and was the most popular social media application amongst American teenagers in terms of the number of users and time spent using the product. To further expand its userbase, Snapchat incorporates several product features that serve no purpose other than to create dependency on Snapchat’s social media product. These features, in turn, result in sleep deprivation, anxiety, depression, shame, interpersonal conflicts, and other serious mental and physical harms. Snapchat knows, or should know, that its product is harmful to adolescents, but, as with Defendants in general, it consistently opts for increased profit at the expense of the well-being of its clientele. Defendants’ products are used by millions of children every day, children who have become addicted to these products because of their design and product features, to the point that parents cannot remove all access to the products without minor users adamantly protesting, often engaging in self-harm, threatening hunger strikes and/or suicide, and suffering other foreseeable consequences of withdrawal from these products, withdrawal severe enough that cessation would require professional intervention.

56. In addition to the types of features discussed above, Snapchat’s defective, addictive, harm-causing features include (1) Snapchat streaks, (2) limited availability content, (3) Trophies, (4) Snapscore, (5) Snapmap, (6) image filters, (7) Spotlight, (8) general user interface, and (9) many other design features.
57. Snapchat streaks provide a reward to users based on how many consecutive days they communicate with another user. In other words, the longer two users are able to maintain a streak by exchanging a communication (a “snap”) at least once a day, the more rewarded the users are. The reward comes in the form of a cartoon emoji appearing next to the conversation within Snapchat’s interface. The longer the streak is maintained, the more exciting the emoji. Eventually, the emoji will change to a flame, and the number of days the streak has lasted will be positioned next to the flame. If the streak is about to end, the emoji changes to an hourglass to add pressure on users to maintain the streak and reengage with the platform:14

14 Lizette Chapman, Inside the Mind of a Snapchat Streaker, Bloomberg (Jan. 30, 2017 at 5:00 AM CST), https://www.bloomberg.com/news/features/2017-01-30/inside-the-mind-of-a-snapchat-streaker?leadSource=uverify%20wall.

58. This feature hijacks teens’ craving for social success and connectedness and causes teen users to feel pressure to use Snapchat daily or suffer social consequences. As some academics and mental health treatment providers have described, streaks “provide a validation for the relationship. . . . Attention to your streaks each day is a way of saying ‘we’re OK.’ . . . The makers built into the app a system so you have to check constantly or risk missing out,” said Nancy Colier, a psychotherapist and author of The Power of Off. “It taps into the primal fear of exclusion, of being out of the tribe and not able to survive.”15 For teens, streaks can become a metric for self-worth and popularity.
By design, the user’s mental wellbeing becomes connected to their performance in Snap’s product.16 Some teenagers even provide their log-in information to others to maintain their streaks for them when they know they will not be able to do so for a time.

15 Id.
16 Yael Klein, How Snapchat Streaks Are Getting Teens Addicted to the App, Evolve Treatment Centers, https://evolvetreatment.com/blog/snapchat-streaks-addicted-teens/ (last visited Sept. 9, 2022) (quoting one teen: “having more streaks makes you feel more popular.”).

59. Time-limited content also creates pressure to use the platform daily. Users can post stories that will only be available for 24 hours. Many teens feel an obligation to view all their contacts’ stories each day before the content disappears.

60. Trophies are awarded to users based on actions performed in the app, such as reaching streaks of certain milestone lengths or using different portions of the app. Each trophy is a unique badge to display on a user’s profile.

61. All users receive a “Snapscore” based on their total number of snaps sent and received. Users can see the scores of friends, causing blows to the self-esteem of many young users and an addictive drive to increase their score.

62. “Snap Map,” a feature of Snapchat that shows the location of other users on a map, also causes self-esteem and mental health damage to teens. The human desire to belong to an “ingroup” is powerfully connected to self-worth, especially among teens. In a recent study, young respondents reported that they checked Snap Map to see where their friends were in order to avoid exclusion, which was followed by increased anxiety. Snap Map allows users to view content constantly with minimal effort and c