Read the lawsuit

A PDF version of this document with embedded text is available at the link below:

Download the original document (pdf)

BRIAN M. BOYNTON
Principal Deputy Assistant Attorney General, Civil Division
ARUN G. RAO
Deputy Assistant Attorney General, Civil Division
AMANDA N. LISKAMM
Director, Consumer Protection Branch
LISA K. HSIAO
Senior Deputy Director, Civil Litigation
RACHAEL L. DOUD
ZACHARY A. DIETERT
Assistant Directors
BENJAMIN A. CORNFELD
MARCUS P. SMITH
Trial Attorneys
Consumer Protection Branch
Civil Division, U.S. Department of Justice
450 5th Street, NW, Suite 6400-South
Washington, DC 20001
Telephone: (202) 305-1537 (Cornfeld)
(202) 353-9712 (Smith)

Attorneys for Plaintiff United States of America

UNITED STATES DISTRICT COURT
CENTRAL DISTRICT OF CALIFORNIA

UNITED STATES OF AMERICA, Plaintiff,

v.

BYTEDANCE LTD., a Cayman Islands company; BYTEDANCE INC., a Delaware corporation; TIKTOK LTD., a Cayman Islands company; TIKTOK INC., a California corporation; TIKTOK PTE. LTD., a Singapore company; and TIKTOK U.S. DATA SECURITY INC., a Delaware corporation, Defendants.

Case No. 2:24-cv-06535

COMPLAINT FOR PERMANENT INJUNCTION, CIVIL PENALTY JUDGMENT, AND OTHER RELIEF

DEMAND FOR JURY TRIAL

Plaintiff, the United States of America ("the United States"), acting upon notification and referral from the Federal Trade Commission ("FTC"), for its Complaint alleges:

NATURE OF THE CASE

1. Defendants operate TikTok, one of the world's largest online social media platforms. TikTok collects, stores, and processes vast amounts of data from its users, who include millions of American children younger than 13.

2. For years, Defendants have knowingly allowed children under 13 to create and use TikTok accounts without their parents' knowledge or consent, have collected extensive data from those children, and have failed to comply with parents' requests to delete their children's accounts and personal information.

3. Defendants' conduct violates the Children's Online Privacy Protection Act of 1998 ("COPPA") and the Children's Online Privacy Protection Rule ("Rule" or "COPPA Rule"), a federal statute and regulations that protect children's privacy and safety online. It also defies an order that this Court entered in 2019 to resolve a lawsuit in which the United States alleged that TikTok Inc.'s and TikTok Ltd.'s predecessor companies similarly violated COPPA and the COPPA Rule by allowing children to create and access accounts without their parents' knowledge or consent, collecting data from those children, and failing to comply with parents' requests to delete their children's accounts and information.

4. To put an end to TikTok's unlawful massive-scale invasions of children's privacy, the United States brings this lawsuit seeking injunctive relief, civil penalties, and other relief.

JURISDICTION AND VENUE

5. This Court has subject matter jurisdiction pursuant to 28 U.S.C. §§ 1331, 1337(a), 1345, and 1355.

6. Venue is proper in this District under 28 U.S.C. §§ 1391(b)(2), (b)(3), (c)(1), (c)(2), (c)(3), and (d), 1395(a), and 15 U.S.C. § 53(b).

PLAINTIFF

7. Plaintiff is the United States of America. Plaintiff brings this action for violations of Section 5(a) of the FTC Act, 15 U.S.C. § 45(a), Section 1303(a) of COPPA, 15 U.S.C. § 6502(a), and the COPPA Rule, 16 C.F.R. pt. 312 (effective July 1, 2013). For these violations, Plaintiff seeks a permanent injunction, civil penalties, and other relief, pursuant to Sections 5(m)(1)(A) and 13(b) of the FTC Act, 15 U.S.C. §§ 45(m)(1)(A) and 53(b), Sections 1303(c) and 1306(d) of COPPA, 15 U.S.C. §§ 6502(c), 6505(d), and the COPPA Rule, 16 C.F.R. § 312.9.

DEFENDANTS

8. Defendant TikTok Inc. is a California corporation with its principal place of business at 5800 Bristol Parkway, Suite 100, Culver City, California 90230. TikTok Inc. transacts or has transacted business in this District and throughout the United States.

9. Defendant TikTok U.S. Data Security Inc. is a Delaware corporation with its principal place of business shared with TikTok Inc. TikTok U.S. Data Security Inc. transacts or has transacted business in this District and throughout the United States.

10. Defendant ByteDance Ltd. is a Cayman Islands company. It has had offices in the United States and in other countries. ByteDance Ltd. transacts or has transacted business in this District and throughout the United States.

11. Defendant ByteDance Inc. is a Delaware corporation with its principal place of business at 250 Bryant Street, Mountain View, California 94041. ByteDance Inc. transacts or has transacted business in this District and throughout the United States.

12. Defendant TikTok Pte. Ltd. is a Singapore company with its principal place of business at 8 Marina View, Level 43, Asia Square Tower 1, Singapore 018960. TikTok Pte. Ltd. transacts or has transacted business in this District and throughout the United States.

13. Defendant TikTok Ltd. is a Cayman Islands company with its principal place of business in Singapore or Beijing, China. TikTok Ltd. transacts or has transacted business in this District and throughout the United States.

COMMON ENTERPRISE

14. Defendants are a series of interconnected companies that operate the TikTok social media platform. Defendant ByteDance Ltd. is the parent and owner of Defendants ByteDance Inc. and TikTok Ltd. TikTok Ltd. owns Defendants TikTok LLC and TikTok Pte. Ltd. TikTok LLC in turn owns Defendant TikTok Inc., which owns Defendant TikTok U.S. Data Security Inc.

15. Upon information and belief, a group of ByteDance Ltd. and TikTok Inc. executives, including Zhang Yiming, Liang Rubo, Zhao Penyuan, and Zhu Wenjia, direct and control TikTok's core features and development. Since 2019, ByteDance Ltd. and TikTok Inc. have promoted TikTok in the United States, spending hundreds of millions of dollars on advertising, employing U.S.-based staff and executives, and developing and distributing TikTok to run on Apple and Android devices.

16. ByteDance Inc. and TikTok Inc. have responsibilities for developing, providing, and supporting TikTok in the United States.

17. TikTok Pte. Ltd. serves as the U.S. distributor of TikTok through the Apple App Store and Google Play Store.

18. TikTok Ltd. identifies itself as the developer of TikTok in the Apple App Store, and TikTok Pte. Ltd. identifies itself as the developer of TikTok in the Google Play Store. The tiktok.com domain is registered to TikTok Ltd.

19. Beginning in 2023, TikTok Inc. transferred personal information of children to TikTok U.S. Data Security Inc., which has maintained that data without notice to those children's parents or parental consent.

20. Defendants share officers and directors. For example, TikTok Inc.'s chief executive officers between 2020 and the present (Kevin Mayer, V Pappas, and Shou Zi Chew) have simultaneously held senior positions at ByteDance Ltd., and ByteDance Ltd.'s chief executive officers (Zhang Yiming and Liang Rubo) have simultaneously served as directors of TikTok Ltd. TikTok Inc.'s Global Chief Security Officer, Roland Cloutier, also served as cyber risk and data security support for ByteDance Ltd. ByteDance Inc. and TikTok Pte. Ltd.'s officers and directors have also overlapped with each other, and with officers and directors of TikTok Inc. Defendants intertwine their finances; for example, ByteDance Ltd. provides compensation and benefits to TikTok Inc.'s CEO, and TikTok Inc. employees participate in ByteDance Ltd.'s stock option plan.

21. Defendants have one centralized bank account for ByteDance Ltd.'s more than a dozen products, including TikTok. Defendants operate on a "shared services" model in which ByteDance Ltd. provides legal, safety, and privacy resources, including personnel. ByteDance's largest shareholder, Zhang Yiming, signed the 2019 consent order with the United States on behalf of Musical.ly, TikTok Ltd.'s predecessor company.

22. Defendants have operated as a common enterprise while engaging in the unlawful acts and practices alleged below.

COMMERCE

23. At all times relevant to this Complaint, Defendants have maintained a substantial course of trade in or affecting commerce, as "commerce" is defined in Section 4 of the FTC Act, 15 U.S.C. § 44.

THE CHILDREN'S ONLINE PRIVACY PROTECTION ACT AND RULE

24. Congress enacted COPPA in 1998 to protect the safety and privacy of children online by prohibiting operators of Internet websites and online services from the unauthorized or unnecessary collection of information of children younger than 13 years old. COPPA directed the FTC to promulgate a rule implementing COPPA. The FTC promulgated the COPPA Rule on November 3, 1999, under Section 1303(b) of COPPA, 15 U.S.C. § 6502(b), and Section 553 of the Administrative Procedure Act, 5 U.S.C. § 553. The Rule went into effect on April 21, 2000. The FTC promulgated revisions to the Rule that went into effect on July 1, 2013. Pursuant to COPPA Section 1303(c), 15 U.S.C. § 6502(c), and Section 18(d)(3) of the FTC Act, 15 U.S.C. § 57a(d)(3), a violation of the Rule constitutes an unfair or deceptive act or practice in or affecting commerce, in violation of Section 5(a) of the FTC Act, 15 U.S.C. § 45(a).

25. The COPPA Rule applies to any operator of a commercial website or online service directed to children. It also applies to any operator of a commercial website or online service that has actual knowledge that it collects, uses, and/or discloses personal information from children. The Rule requires an operator to meet specific requirements prior to collecting, using, or disclosing children's personal information online. These requirements include:

a) Posting a privacy policy on its website or online service providing clear, understandable, and complete notice of its information practices, including what information the operator collects from children online, how it uses such information, its disclosure practices for such information, and other specific disclosures set forth in the Rule;

b) Providing clear, understandable, and complete notice of its information practices, including specific disclosures, directly to parents;

c) Obtaining verifiable parental consent prior to collecting, using, and/or disclosing children's personal information;

d) Providing reasonable means for parents to review personal information collected from children online, at a parent's request; and

e) Deleting personal information collected from children online, at a parent's request.

THE 2019 PERMANENT INJUNCTION

26. Musical.ly was a video-based platform with millions of U.S. child users. In February 2019, the United States filed a complaint against Musical.ly and Musical.ly, Inc. alleging violations of the COPPA Rule, 16 C.F.R. pt. 312, and Section 5 of the FTC Act, 15 U.S.C. § 45. See United States v. Musical.ly, et al., No. 2:19-cv-01439-ODW-RAO (C.D. Cal. Feb. 27, 2019) (Dkt. No. 1).

27. On March 27, 2019, this Court entered a Stipulated Order for Civil Penalties, Permanent Injunction, and Other Relief against Musical.ly and Musical.ly, Inc. United States v. Musical.ly, et al., No. 2:19-cv-01439-ODW-RAO (C.D. Cal. Mar. 27, 2019) (Dkt. No. 10) (the 2019 Permanent Injunction). The order imposed a $5.7 million civil penalty; required Defendants to destroy personal information of users under the age of 13 and, by May 2019, remove accounts of users whose age could not be identified; enjoined Defendants from violating the COPPA Rule; and required Defendants to retain certain records related to compliance with the COPPA Rule and the 2019 Permanent Injunction.

28. In April 2019, Musical.ly was renamed TikTok Ltd., and in May 2019, Musical.ly Inc. was renamed TikTok Inc. The renaming did not alter the companies' compliance obligations under the 2019 Permanent Injunction.

DEFENDANTS' BUSINESS ACTIVITIES

29. Since before 2019, Defendants have operated TikTok, a video-based social media platform that consumers may access via the Internet or through a downloadable software application or "app." In November 2017, ByteDance Ltd. purchased Musical.ly and, in 2018, merged it into TikTok.

30. The TikTok platform allows users to create, upload, and share short-form videos. The TikTok app is free to download. It generates revenue for Defendants through advertising and eCommerce, including through the TikTok for Business platform, as well as in-app purchases of TikTok "coin" through the TikTok Shop.

31. TikTok features a "For You" feed in which an algorithm subject to Defendants' control selects videos for each user based on its determination of their interests, pushes those videos to the user, and plays them.

32. TikTok's algorithms are trained on data collected from users via the TikTok platform and from third-party sources. Such data include videos viewed, "liked," or shared, accounts followed, comments, content created, video captions, sounds, and hashtags, as well as device and account settings such as language preference, country setting, and device type.

33. As of 2024, there are more than 170 million TikTok users in the United States, including many children and teens. In 2022, two-thirds of U.S. teens reported using TikTok, including about 61% of teens aged 13 or 14. By late 2023, nearly half of U.S. teens reported using TikTok multiple times a day.

DEFENDANTS' UNLAWFUL CONDUCT

34. Defendants have known of COPPA, the COPPA Rule, and their requirements since at least 2017, directly or through their predecessors and affiliates, including through Musical.ly's and Musical.ly, Inc.'s agreement to the 2019 Permanent Injunction, which requires compliance with COPPA and the COPPA Rule.

35. TikTok is directed to children (i.e., individuals under age 13, as used herein and in COPPA and the Rule). An online service that does not target children as its primary audience is not deemed directed to children under the COPPA Rule if it satisfies certain criteria. Defendants purport to satisfy these criteria by requiring users creating accounts to report their birthdates. As described in this Complaint, however, Defendants have allowed children to bypass or evade this "age gate" and collected personal information even from individuals who identify themselves as children. Further, as described in this Complaint, Defendants have actual knowledge that they are collecting personal information from children.

36. Defendants have violated COPPA and the COPPA Rule through the conduct described in this Complaint, including by (1) knowingly creating accounts for children and collecting data from those children without first notifying their parents and obtaining verifiable parental consent; (2) failing to honor parents' requests to delete their children's accounts and information; and (3) failing to delete the accounts and information of users they know are children.

37. Each time Defendants have collected a child's personal information without parental notice or verifiable consent, or have failed to delete that information at the request of the child's parents or upon learning it was collected from a child whose parents were not notified or did not provide verifiable consent, Defendants violated COPPA and the COPPA Rule.

38. Defendants' conduct has resulted in millions of children using TikTok, but the precise magnitude of Defendants' violations is difficult to determine due to their failure to comply with the 2019 Permanent Injunction's requirement that they keep records demonstrating their COPPA compliance.

I. Defendants Have Knowingly Created Accounts for Children and Collected Those Children's Data Without Parental Notice or Consent.

39. Since at least March 2019, Defendants have offered in the United States what they refer to as TikTok for Younger Users or "Kids Mode" (hereinafter "Kids Mode") to children who identify themselves as being under 13 when they create an account, and a regular TikTok experience to other users. However, Defendants have knowingly allowed children under 13 to create accounts in the regular TikTok experience and collected extensive personal information from those children without first providing parental notice or obtaining verifiable parental consent, as required by the COPPA Rule. Defendants have also violated the COPPA Rule by collecting, without parental notice and consent, several varieties of personal information from children with Kids Mode accounts, and by using children's information in ways that the COPPA Rule prohibits.

A. Defendants Allowed Children to Evade or Bypass TikTok's Age Gate

40. Since at least March 2019, when consumers in the United States attempt to create a TikTok account, they generally have had to go through the platform's "age gate" by providing a birthday (day, month, and year). If a consumer indicates that they are 13 or older, they are prompted for a username, password, and email address or phone number. Defendants then create a regular account for the user, and the user can view, create, post, and share videos, as well as message other TikTok users.

41. For TikTok users who self-identify as 13 or older at the age gate, Defendants collect a wide variety of personal information, such as first and last name, age, email address, phone number, persistent identifiers for the device(s) used to access TikTok, social media account information, and profile image(s), as well as photographs, videos, and audio files containing the user's image and voice and the metadata associated with such media (such as when, where, and by whom the content was created).

42. Over time, Defendants collect increasingly more information from these users, including usage information, device information, location data, image and audio information, metadata, and data from cookies and similar technologies that track users across different websites and platforms.

43. Since at least March 2019, if a U.S. consumer inputs into the age gate a birthday indicating they are a child under 13 years old, the child generally is prompted to provide a username (that does not include any personal information) and a password. The TikTok platform then creates an account for that child in Kids Mode. Defendants do not notify parents or obtain parental consent for Kids Mode accounts.

44. In Kids Mode, a user can view videos but cannot create or upload videos, post information publicly, or message other users. Defendants still collect and use certain personal information from children in Kids Mode.

45. Defendants' methodologies for screening out child users are deficient in multiple ways. Until at least late 2020, if a child in the U.S. submitted a birthday reflecting that they were under 13 years old, the TikTok platform did not prevent the child from evading the age gate by trying again: i.e., restarting the account creation process and giving the age gate a birthday indicating they were 13 or older, even though by that point Defendants knew from the birthday the user had previously provided that the user was a child.

46. Until at least May 2022, Defendants offered consumers a way to avoid the TikTok age gate altogether when creating a TikTok account, by allowing them to use login credentials from certain third-party online services, including Instagram and Google. Defendants internally identified these TikTok accounts as "age unknown" accounts.

47. For example, Defendants allowed children to create TikTok accounts without age gating them by letting children use login credentials from Instagram, even though Instagram did not itself require users to disclose their age or date of birth to create an Instagram account until at least December 2019.

48. Defendants also allowed children to create TikTok accounts without age gating by letting children use login credentials from Google. Google allowed children under the age of 13 to create Google accounts with parental consent to use Google.

49. Defendants' insufficient policies and practices thus allowed children to create a non-Kids Mode TikTok account, gaining access to adult content and features of the general TikTok platform without providing age information. Without parental notice or consent, Defendants then collected and maintained vast amounts of personal information from the children who created and used these regular TikTok accounts.

50. These policies and practices led to the creation of millions of accounts for which Defendants did not know the age of the user.

51. Defendants did not start requiring all users to go through a TikTok age gate until at least 2022, closing what employees internally described in early 2021 as an age gate "loophole."

B. Defendants Failed to Comply with COPPA and the COPPA Rule Even for Accounts in "Kids Mode"

52. In Kids Mode, Defendants collect and maintain a username, password, and birthday (day, month, and year). They have also collected several types of persistent identifiers from Kids Mode users without notifying parents or obtaining their consent, including IP address and unique device identifiers.

53. The COPPA Rule permits operators to collect a persistent identifier from children under certain circumstances without first obtaining verifiable parental consent, but only if no other personal information is collected and the identifier is used for the sole purpose of providing support for the online service's internal operations. See 16 C.F.R. § 312.4(c)(7). Defendants' collection and use of persistent identifiers from Kids Mode users do not comply with this provision.

54. Defendants additionally collect dozens of other types of information concerning child users with Kids Mode accounts—including app activity data, device information, mobile carrier information, and app information—which they combine with persistent identifiers and use to amass profiles on children.

55. Defendants did not need to collect all of the persistent identifiers they have collected from users in Kids Mode to operate the TikTok platform.

56. Until at least mid-2020, Defendants shared information they collected from children in Kids Mode with third parties for reasons other than support for internal operations. Defendants did not notify parents of that practice.

57. For example, Defendants shared this information with Facebook and AppsFlyer, a marketing analytics firm, in part to encourage existing Kids Mode users whose use had declined or ceased to use Kids Mode more frequently. Defendants called this process "retargeting less active users." This practice used children's personal information for reasons beyond support for the internal operations of Kids Mode and thus was not permitted by the COPPA Rule.

58. Separately, users in Kids Mode can send feedback to TikTok using an in-app "Report a Problem" function. When doing so, Defendants require the child to enter the child's email address.

59. Between February 2019 and July 2022, for example, Defendants collected over 300,000 problem reports from users in Kids Mode that included children's email addresses.

60. Defendants did not delete these children's email addresses after processing the reports, and thus retained these email addresses longer than reasonably necessary to fulfill the purpose for which the information was collected, in violation of the Rule. See 16 C.F.R. § 312.10. Defendants did not notify parents of this ongoing practice.

II. Defendants Have Obstructed and Failed to Honor Parents' Requests to Delete Their Children's Accounts and Data.

61. Since 2019, Defendants have allowed millions of children to create general TikTok accounts—i.e., accounts outside of Kids Mode.

62. Many children create and use a general TikTok account without their parents' knowledge. Frequently, however, a parent becomes aware that their child has a general TikTok account and seeks to have it and its associated data deleted.

63. The COPPA Rule and the 2019 Permanent Injunction require Defendants to delete personal information collected from children at their parents' request. Nevertheless, in many instances Defendants have obstructed parents' ability to make such requests and have failed to comply with these requests.

A. Defendants Maintained an Unreasonable Process for Parents to Request Deletion of their Children's Data

64. Defendants failed to create a simple process for parents to submit a deletion request. For example, the word "delete" does not appear in many of Defendants' online parental guidance materials, such as TikTok's "Guardian's Guide," the "Privacy and Security on TikTok" page, TikTok's "New User Guide," and other materials on tiktok.com such as the "Parental Controls Guide" and "The Parent's Guide to TikTok."

65. Parents must navigate a convoluted process to figure out how to request deletion of their child's account and information. For example, as recently as 2023, a parent visiting tiktok.com to request deletion of their child's TikTok account and information had to scroll through multiple webpages to find and click on a series of links and menu options that gave no clear indication they apply to such a request. Parents then had to explain in a text box that they are a parent who wanted their child's account and data to be deleted.

66. At times, Defendants also directed parents to send their requests to delete their children's accounts and personal information to an email address. As detailed below, in many cases Defendants failed to respond in a timely manner to these requests, or simply failed to respond to them at all.

67. Even if a parent succeeded in submitting a request to delete their child's account and information, Defendants often did not honor that request. In response to each request, Defendants' staff would review the account for "objective indicators" that the account holder was under 13, or "underage," based on the user's handle, biography or "bio," [redacted]. Under Defendants' policy, an account would be identified as an underage account and deleted only if the reviewed elements contained an explicit admission that the user was under 13—for example, "I am in first grade" or "I am 9 years old" [redacted]. To determine whether a child was younger than 13, Defendants instructed reviewers to use [redacted].

68. If the account failed to meet Defendants' rigid criteria, Defendants' policy until recently was to respond to the underage account deletion request by asking the parent to complete and sign a form confirming their relationship to the child and the nature of the request. The parent had to certify under penalty of perjury that they were the parent or guardian of the account user. Defendants required parents to complete the form regardless of whether the parent had already provided Defendants with all of the information the form requested.

69. If a parent or guardian did not submit the secondary form, Defendants would not delete the child's regular TikTok account, which remained active.

70. Defendants' policies and practices subverted parents' efforts to delete their children's accounts and resulted in Defendants retaining children's accounts and personal information—even though their parents identified them as children and asked TikTok to delete their accounts.

71. Defendants were well aware this was occurring. For example, in a 2018 exchange, a high-level employee of Defendants explicitly acknowledged that Defendants had "actual knowledge" of children on TikTok upon receiving the first parental request, and yet did not delete children's accounts upon receiving the request. In the exchange, the former CEO of TikTok Inc. communicated about underage users on TikTok with the executive responsible for child safety issues in the United States. The employee in charge of child safety issues questioned why parents had to fill out a second form after they already provided the necessary information, noting: "Why we reply with this template everytime [sic] when we already have all the info that's needed? [I]n this case, we already have the username, the name of the reporter, and the age, yet we still reply with the template." He added that if the person reporting the account "doesn't reply then we have actual knowledge of underage user and took no action!"

72. Despite this awareness that they were failing to respect parents' deletion requests, Defendants continued using this flawed process through 2023.

B. Defendants Failed to Delete Children's Data upon Parental Request and Cease Collecting Children's Personal Information

73. In addition to using what they knew to be a flawed process to address parents' deletion requests, Defendants in many cases did not respond to parents' requests at all. As of late December 2020, Defendants had a backlog of thousands of emails dating back months requesting that TikTok delete individual children's accounts.

74. Defendants' inadequate policies and inaction led to numerous children continuing to maintain regular TikTok accounts even though their parents had asked Defendants to delete those accounts. In a sample of approximately 1,700 children's TikTok accounts about which Defendants received complaints and deletion requests between March 21, 2019, and December 14, 2020, approximately 500 (30%) remained active as of November 1, 2021. Several hundred of these accounts were still active in March 2023. This sample of children's accounts is likely a small fraction of the thousands of deletion requests Defendants received and failed to act on.

75. Many parents made multiple requests for Defendants to remove their children's account and personal information. On at least some occasions, even when a parent or guardian completed Defendants' secondary form, Defendants still failed to delete their children's accounts and information.

76. Compounding these problems, even when Defendants did delete a child's account and personal information at their parent's request, at least until recently, Defendants did nothing to prevent the same child from re-creating their account with the same device, persistent identifiers, and email address or phone number as before. This means that a child whose account has been removed could simply create a new account.

III. Defendants Have Failed to Delete Children's Accounts and Information Identified by Their Own Systems and Employees.

77. Defendants purport to use technology, user reports, and human moderation to identify children's TikTok accounts so that those accounts and the information collected from them can be deleted. But Defendants know their processes and policies are deficient, and they fail to delete accounts and information that even their own employees and systems identify as belonging to children.

A. Defendants' "Keyword Matching" Process

78. Since approximately 2020, Defendants have used "keyword matching" purportedly to identify children's accounts for deletion. Defendants' keyword matching process searches users' profiles for terms deemed likely to correspond to child accounts—for example, "4th grade" and "9 years old"—and submits accounts that include those terms for review and potential removal. Defendants' keyword matching practices have proven woefully deficient.

79. Defendants' human content moderators review accounts flagged as potentially belonging to children by the keyword matching process or by other methods. Similar to Defendants' restrictive approach to parental deletion requests, the content moderators who review accounts may delete them as belonging to children only if rigid criteria are satisfied. For example, under the policy, an account can be marked as underage and deleted only if either there is an explicit admission of an age under 13 or [redacted].

80. Earlier versions of the policy were even more restrictive. For example, to mark and delete an account as underage, the policy between the spring of 2020 and early 2021 required an explicit admission of age, regardless of what videos the account had posted. The pre-April 2020 version of the policy required both (i) an explicit admission of age and (ii) that [redacted].

81. Defendants' content moderators are not told why an account was flagged as possibly underage and cannot access any videos posted by the user beyond [redacted]—even though the account may have dozens or hundreds of videos revealing that the user is a child. The moderators cannot view other information about the accounts they are reviewing either, including the videos watched by the user or the accounts the user follows. If the policy's rigid criteria are not met, content moderators have no discretion to designate an account as underage; they must allow any such account to remain on the platform even if they know the account holder is in fact a child.

82. Defendants have also failed to allow content moderators sufficient time to conduct even the limited review they permit. At times since entry of the 2019 Permanent Injunction, TikTok has had tens of millions of monthly active users in the United States. Meanwhile, TikTok Inc.'s content moderation team included fewer than two dozen full-time human moderators responsible for identifying and removing material that violated all of its content-related policies, including identifying and deleting accounts of unauthorized users under age 13.

83. During at least some periods since 2019, TikTok Inc.'s human moderators spent an average of only five to seven seconds reviewing each account flagged by a keyword to determine if it belonged to a child.

84. The deficiency of Defendants' policies is shown by the fact that regular TikTok accounts belonging to children can be easily found by searching for the same basic terms and variations used by Defendants' keyword matching algorithm. Some of these accounts have existed for long periods—able to garner hundreds of followers and hundreds or even thousands of "likes," a sign of approval by other TikTok users.

85. By adhering to these deficient policies, Defendants actively avoid deleting the accounts of users they know to be children. Instead, Defendants continue collecting these children's personal information, showing them videos not intended for children, serving them ads and generating revenue from such ads, and allowing adults to directly communicate with them through TikTok.

B. Accounts Referred from Video Moderation Queues

86. Many accounts that belong to children come to Defendants' attention when one user reports another user's video as violating one of Defendants' policies. Those videos are then added to "video queues" and reviewed by human content moderators who review the videos to determine whether they comply with Defendants' policies. If those content moderators encounter a video that depicts a child under 13, they can apply labels to designate suspected child users, such as "Content Depicting Under the Age of Admission" or "Suspected Underaged User." These moderators can remove a specific video from TikTok, but they lack authority to delete or remove the account even if it is clearly the account of a child. Instead, by applying the labels, they refer the video to the separate content moderation team that assesses whether accounts belong to underage users (the "underage queue").

87. Until at least October 2022, however, this process did not work. Accordingly, when Defendants' moderators tagged specific videos as depicting a child under 13, the associated accounts were not actually referred to the team authorized to delete the associated account. Instead, those accounts remained live, and Defendants continued to collect and retain those children's personal information and to show them videos and messages from regular TikTok users. Due to Defendants' recordkeeping deficiencies, detailed below, they cannot identify the number of accounts affected by this issue. The limited records Defendants do have, however, make clear that millions of accounts were involved.

C. Accounts Identified in Quality Assurance Reviews

88. Defendants conduct quality assurance reviews of the content moderation processes described above. The quality assurance reviews require content moderators to re-review a subset of previously reviewed accounts or videos. This process aims to identify instances in which TikTok content moderators incorrectly applied company policies to those accounts or videos.

89. Until at least September 2022, however, when Defendants' quality assurance analysts identified a specific account that a moderator incorrectly failed to flag for deletion as belonging to a child, Defendants did not then go back and delete the account. Instead, the account remained live. Accordingly, Defendants failed to delete numerous children's accounts that their own quality assurance team specifically identified as belonging to children.

D. Accounts That Moderators Have Marked "Ban as Underage"

90. Even where accounts satisfied Defendants' rigid criteria, were identified as belonging to children, and were marked for deletion, Defendants failed to delete many of the accounts.

91. Internal communications reveal that Defendants' employees were aware of this issue. In a September 2021 online chat, for example, employees discussed the fact that accounts were being marked as banned for underage but were not being deleted, and suggested this had been occurring since mid-July 2020. One employee noted that she was seeing this "a lot" and "I run across usually like 3-4 accounts [like that] a day," while another noted "[t]hat shouldn't be happening at all or we can get in trouble . . . because of COPPA."

92. Even though Defendants were aware of this problem, and the 2019 Permanent Injunction required them to maintain records regarding their COPPA compliance or lack thereof, they failed to retain records documenting this issue and the accounts affected. The extremely limited records Defendants have produced to the government reveal that even for small segments of the time period at issue, at least several hundred accounts were affected.

E. Data Collected From Purportedly Deleted Accounts

93. Defendants retain children's personal information long after they identify an account as belonging to a child and determine they should delete information related to the account. For example, Defendants retain app activity log data related to children for 18 months.

94. Moreover, Defendants have retained children's information in numerous database locations long after purportedly deleting their accounts. Defendants have not documented what information collected from users is saved in what locations or why, and they have been unable to explain how or why the information was in those locations, or why it was not deleted.

95. Defendants have also failed to delete information children posted to TikTok that was later incorporated into other users' videos, even when Defendants possessed identifiers linking the information to an account that they deleted because it belonged to a child. For example, until at least 2022, Defendants retained sound recordings of numerous children from accounts Defendants had determined belonged to children, and those sound recordings continued to appear in other users' videos.

96. Similarly, Defendants retained profile photographs of users that Defendants knew to be children. For example, TikTok allows users to include in their videos another user's comment, which is displayed alongside the commenter's photograph and username. When Defendants did "delete" the account of a child, that child's comments remained in other users' posts, along with their photograph and username. These images had unique identifiers that tied each child's photograph, username, and comment to an account that Defendants knew had been deleted because it belonged to a child.

IV. Defendants' Violations Have Occurred on a Massive Scale.

A. Defendants' Policies Result in Millions of Children Using TikTok

97. As discussed above, Defendants adopted and implemented inadequate and ineffective policies to stop children from creating general TikTok accounts and to remove those accounts when they were discovered. As a result, for years millions of American children under 13 have been using TikTok and Defendants have been collecting and retaining children's personal information.

98. Defendants' internal analyses show that millions of TikTok's U.S. users are children under the age of 13. For example, the number of U.S. TikTok users that Defendants classified as age 14 or younger in 2020 was millions higher than the U.S. Census Bureau's estimate of the total number of 13- and 14-year-olds in the United States, suggesting that many of those users were children younger than 13.

99. Third-party studies shared with TikTok Inc. similarly show that in the United States and other countries, child usage of TikTok is common and large numbers of children have regular TikTok accounts. In fact, regulators in other countries, including the Netherlands, Ireland, and the United Kingdom, have fined Defendants for impermissibly collecting data from children.

100. Defendants and their employees have long known that children misrepresent their ages to pass through TikTok's age gate, and that despite other measures purportedly designed to remove children from the platform, children are ubiquitous.

101. In January 2020, for example, a TikTok moderator recognized that Defendants maintain accounts of children despite the "fact that we know the user is U13," i.e., under age 13, so long as the child's profile does not admit that fact explicitly. Another employee admitted that TikTok moderators were required to ignore any "external information" indicating that a user under review is a child.

102. As another example, in a July 2020 chat, one of Defendants' employees circulated the profiles of numerous underage users he had identified "literally through one minute of scanning," noting "[t]his is incredibly concerning and needs to be addressed immediately."

103. Defendants have other methods to identify and remove children's accounts from the general TikTok platform but do not use them for that purpose. For example, TikTok has its own age-determining technology—"grade level," the algorithm for which is based on users' behavior and other metrics—for purposes such as advertising. Unlike TikTok's age gate, this method is based on observable behaviors and not solely users' self-reported age. Defendants have not used it to attempt to identify children on the platform so that their accounts can be removed.

104. In a November 2019 message, a company employee told TikTok Inc.'s then-head of content partnerships, who led its relationships with major brands, that "we have two age level . . . one is age gate and one is grade level." He continued that the age gate is "filled in by users themselves" and "many of them will fill in false information," while "grade level [is] calculated by algorithm . . . through user's behavior or other metrics, which are more accurate." He went on that, for purposes of a search, "I used grade level so we will see many users under 13."

105. Not only do Defendants not use their grade level technology to identify and remove children from the TikTok general platform, but they appear to have programmed grade level to avoid gaining knowledge that users were under 13. In 2020, Defendants' lowest age group band was for ages under 15, meaning that it would not identify users as under 13 specifically. Defendants later revised this age cutoff so that the lowest age segment was under 16.

B. Defendants Failed to Keep Records Required by the 2019 Permanent Injunction

106. The 2019 Permanent Injunction required TikTok Inc. and TikTok Ltd. to create and maintain all records necessary to demonstrate full compliance with the 2019 Permanent Injunction, including records to show full compliance with COPPA and the COPPA Rule. Defendants have failed to create and maintain all such records.

107. First, when Defendants identified issues concerning their COPPA compliance, they frequently failed to maintain records that would be needed to show how many accounts were affected, which accounts were affected, and what, if anything, was done to remedy the issues. For example, as noted above, Defendants did not maintain records regarding accounts that were referred to the underage queue from the video queue but not actually reviewed, or regarding their failure to delete children's accounts that had been designated as underage.

108. Further, Defendants have failed to create or maintain records sufficient to document their moderators' review of regular accounts identified as potentially belonging to children and the actions taken as a result. When asked by the United States for documentation of certain specific accounts of children, Defendants initially produced no records and claimed their account records were "not intended to be reviewed in the ordinary course of business." The records Defendants subsequently produced do not make it possible to systematically determine what action has been taken on specific accounts and why.

109. Additionally, Defendants' employees use Feishu (sometimes referred to as Lark), a ByteDance Ltd. corporate messaging and office collaboration platform, to communicate with each other. Defendants enabled features in Feishu, such as one called "recall," that allow employees to easily erase internal communications, leaving no record of the communication. Employees used the feature to delete messages permanently, including, potentially, messages relevant to compliance with the 2019 Permanent Injunction and COPPA. Defendants did not change this practice until at least May 2023.

110. Defendants enabled another feature in Feishu that allows employees to choose when their communications will be deleted.

111. A late 2021 risk assessment for Defendant ByteDance Ltd. found that the company was incapable of extracting accurate and usable records about and from internal Lark messages. The risk assessment found that because they used Feishu, Defendants lacked a reliable way to memorialize the vast majority of employees' business communications and could not assure preservation in compliance with government investigations and litigation subpoenas.

C. TikTok Inc. Misrepresented its Remedial Conduct to the FTC

112. On June 12, 2020, TikTok Inc. stated to the FTC that "[o]n May 11, 2019, [it] took offline all US accounts that did not go through [its then-recently imposed] age gate. These accounts . . . were not accessible to the Company. TikTok did not use or disclose the information for any purpose." TikTok Inc. also stated that it "completed on May 24, 2020" the deletion of children's data as required by the 2019 Permanent Injunction. V Pappas, as "GM of TikTok," certified on TikTok Inc.'s behalf under penalty of perjury that the prior statement was true and correct.

113. After follow-up inquiry by the FTC, TikTok Inc. acknowledged that its June 12, 2020, claims had been false. In fact, TikTok Inc. had retained and been using data that it previously represented it "did not use," was "not accessible" to it, and was "delet[ed]." That data included personal information and other data of child, teen, and adult users, including IP addresses, device IDs, device models, and advertising IDs.

* * *

114. Based on the facts and violations of law alleged in this Complaint, the United States has reason to believe that Defendants are violating or are about to violate COPPA, the COPPA Rule, and the FTC Act.

VIOLATIONS OF COPPA, THE COPPA RULE AND THE FTC ACT

115. Paragraphs 1 through 114 are incorporated as if set forth herein.

116. Defendants are "operators," under 16 C.F.R. § 312.2, and thus subject to the COPPA Rule.

117. Defendants collect personal information from children through the TikTok app and website, which are both online services or websites directed to children. Defendants have actual knowledge that they are collecting personal information from children.

118. In numerous instances, in connection with the acts and practices described above, Defendants collected, used, and disclosed personal information from children in violation of COPPA and the COPPA Rule, including by:

a) Failing to provide notice on their website or online service of what information they collect from children, how they use such information, their disclosure practices, and other content required by the Rule, in violation of Sections 312.3(a) and 312.4(d) of the Rule, 16 C.F.R. §§ 312.3(a), 312.4(d);

b) Failing to make reasonable efforts to provide direct notice to parents of what information they collect online from children, how they use such information, their disclosure practices for such information, and other content required by the Rule, in violation of Sections 312.4(b) and 312.4(c) of the Rule, 16 C.F.R. §§ 312.4(b)-(c);

c) Failing to obtain consent from parents before any collection, use, or disclosure of personal information from children, in violation of Sections 312.3(b) and 312.5(a)(1) of the Rule, 16 C.F.R. §§ 312.3(b), 312.5(a)(1);

d) Failing to provide a reasonable means for a parent to refuse to permit the further use or maintenance of any personal information collected from a child, in violation of Sections 312.3(c) and 312.6(a)(2)-(3) of the Rule, 16 C.F.R. §§ 312.3(c), 312.6(a)(2)-(3);

e) Failing to provide parents the opportunity at any time to direct Defendants to delete personal information collected from children, in violation of Section 312.6(a)(2) of the Rule, 16 C.F.R. § 312.6(a)(2);

f) Failing to delete, at the request of parents, personal information collected from children, in violation of Section 312.6(a)(2) of the Rule, 16 C.F.R. § 312.6(a)(2);

g) Retaining personal information collected online from children for longer than reasonably necessary to fulfill the purpose for which the information was collected, in violation of Section 312.10 of the Rule, 16 C.F.R. § 312.10;

h) Failing to timely delete personal information collected from children in order to respond on a one-time basis to a specific request, in violation of Section 312.5 of the Rule, 16 C.F.R. § 312.5(c)(3);

i) Failing to limit their collection of children's personal information for which they lacked verifiable parental consent to only the limited information permitted by the Rule's exceptions to prior parental consent requirements, in violation of Section 312.5(c) of the Rule, 16 C.F.R. § 312.5(c);

j) Failing to limit use of children's personal information for which they lacked verifiable parental consent to solely the purposes permitted by the Rule (such as the use of a persistent identifier for the sole purpose of providing support for the internal operations of their website or online service, permitted by Section 312.4(c)(7) of the Rule), in violation of Section 312.5(c) of the Rule, 16 C.F.R. § 312.5(c); and

k) Conditioning children's participation in the online service by requiring the disclosure of more personal information than is reasonably necessary to participate, in violation of Section 312.7 of the Rule, 16 C.F.R. § 312.7.

119. Pursuant to Section 1303(c) of COPPA, 15 U.S.C. § 6502(c), and Section 18(d)(3) of the FTC Act, 15 U.S.C. § 57a(d)(3), a violation of the Rule constitutes an unfair or deceptive act or practice in or affecting commerce, in violation of Section 5(a) of the FTC Act, 15 U.S.C. § 45(a).

120. Defendants violated the Rule as described above with the knowledge required by Section 5(m)(1)(A) of the FTC Act, 15 U.S.C. § 45(m)(1)(A).

121. Each collection, use, or disclosure of a child's personal information in which Defendants violated the Rule in any of the ways described above constitutes a separate violation for which Plaintiff seeks monetary civil penalties. 15 U.S.C. § 45(m)(1)(A).

122. Each day Defendants maintained data collected in violation of the Rule, or otherwise continued to collect such data, is a continuing failure to comply with the Rule and constitutes a separate violation under 15 U.S.C. § 45(m)(1)(C).

123. Section 5(m)(1)(A) of the FTC Act, 15 U.S.C. § 45(m)(1)(A), as modified by Section 4 of the Federal Civil Penalties Inflation Adjustment Act of 1990 and Section 701 of the Federal Civil Penalties Inflation Adjustment Act Improvements Act of 2015, 28 U.S.C. § 2461, and Section 1.98(d) of the FTC's Rules of Practice, 16 C.F.R. § 1.98(d), authorizes this Court to award monetary civil penalties of not more than $51,744 for each violation of the Rule assessed after January 10, 2024.

CONSUMER INJURY

124. Consumers are suffering, have suffered, and will continue to suffer substantial injury as a result of Defendants' violations of the COPPA Rule. Absent injunctive relief by this Court, Defendants are likely to continue to injure consumers and harm the public interest.

PRAYER FOR RELIEF

125. Wherefore, Plaintiff requests that the Court:

A. Enter a permanent injunction to prevent future violations of the COPPA Rule by Defendants;

B. Impose civil penalties on each Defendant for every violation of the COPPA Rule; and

C. Award any additional relief as the Court determines to be just and proper.

* * *

Dated: August 2, 2024

Respectfully submitted,

BRIAN M. BOYNTON
Principal Deputy Assistant Attorney General, Civil Division

ARUN G. RAO
Deputy Assistant Attorney General

AMANDA N. LISKAMM
Director, Consumer Protection Branch

LISA K. HSIAO
Senior Deputy Director, Civil Litigation

RACHAEL L. DOUD
ZACHARY A. DIETERT
Assistant Directors

/s/ Marcus P. Smith
BENJAMIN A. CORNFELD
MARCUS P. SMITH
Trial Attorneys
Consumer Protection Branch
Civil Division, U.S. Department of Justice
450 5th Street, NW, Suite 6400-South
Washington, DC 20001
Tel.: (202) 305-1537 (Cornfeld)
(202) 353-9712 (Smith)
Fax: (202) 514-8742
Email: Benjamin.A.Cornfeld2@usdoj.gov
Marcus.P.Smith@usdoj.gov

Counsel for Plaintiff United States of America

OF COUNSEL, FOR THE FEDERAL TRADE COMMISSION:

JONATHAN W. WARE
IRIS MICKLAVZINA
SARAH CHOI
MICHAEL SHERLING
Attorneys
Federal Trade Commission
600 Pennsylvania Avenue NW, Mailstop CC-6316
Washington, DC 20580
(202) 326-2726 (Ware)
(202) 326-2517 (Micklavzina)
(202) 326-2212 (Choi)
(202) 326-3286 (Sherling)
(202) 326-3197 (fax)
jware1@ftc.gov
imicklavzina@ftc.gov
schoi1@ftc.gov
msherling@ftc.gov