
Online Privacy and Safety: The FTC Weighs in on Surveillance, Privacy, and Safeguards

Client Alert | 4 min read | 09.25.24

After conducting an investigation targeted at nine popular social media and video streaming companies, the Federal Trade Commission (FTC or Commission) released a Staff Report examining their data practices, including those relating to minors.  The FTC based its report on responses to questions it compelled under Section 6(b) (which enables the Commission to require an entity to file reports or answers in writing to specific questions) from Amazon.com, Inc. (which owns the gaming platform Twitch), Facebook, Inc. (now Meta Platforms, Inc.), YouTube LLC, Twitter, Inc. (now X Corp.), Snap Inc., ByteDance Ltd. (which owns the video-sharing platform TikTok), Discord Inc., Reddit, Inc., and WhatsApp Inc.

Key areas of inquiry for the FTC include:

  • Indefinite Data Retention. The amount of data the companies collected and could retain indefinitely, from both users and non-users, and in ways consumers might not expect. This includes information gathered both on and off the companies’ websites, information users entered themselves, and information purchased from data brokers and other companies. The FTC believes the nine companies’ data collection, minimization, and retention practices are inadequate to protect consumer privacy, with some companies lacking any documented minimization, retention, or deletion policy and some failing to actually delete data when users request deletion.
  • Improper sale of ads based on users’ personal information without their consent. According to their responses, the companies sold advertising services to other businesses based on users’ personal information, and these sales took place behind the scenes and out of consumers’ view, posing privacy risks. Consumers likely did not understand, and may not even have been aware of, how the information collected about them was used.
  • Use of algorithms, data analytics, or AI. The companies used algorithms, data analytics, and artificial intelligence for content recommendations, advertising, and inferring personal details about users, leaving users without control over how their information fed those systems. Users and non-users could neither review or correct the personal information amassed about them nor see how decisions were made, resulting in a lack of control and transparency.
  • Inadequate protection of children and teens. The Children’s Online Privacy Protection Rule (COPPA Rule) imposes certain requirements on operators of websites or online services directed to children under 13, or operators who know they are collecting personal information from a child under 13. Though companies must comply with the COPPA Rule, they are not required to extend those requirements to teenagers, and the FTC found that the responding companies did not extend them to users 13 and over. Instead, the operators treated teens like adults when collecting and monetizing their personal information.
  • Anticompetitive incentives jeopardizing individual data privacy. The FTC believes that companies are incentivized to collect and monetize as much data as possible, which can increase data abuses and market dominance that potentially threaten consumer privacy. According to the Staff Report, market dominance can reduce competition, which may leave users with fewer choices between companies and their level of data privacy protections.

The FTC Staff proposed a number of recommendations for social media companies:

  •  Implement stringent data policies, including minimizing data collection to only what is necessary, limiting data sharing with other companies, and adopting clear and transparent policies for consumers.
  •  Implement safeguards around the receipt, use, and disclosure of sensitive personal information, especially information that can be used for targeted advertising.
  •  Provide more consumer control and transparency about the data used for automated decision-making systems like artificial intelligence and algorithms, and implement stringent testing and monitoring of those systems.
  •  Provide greater protection for children and teenagers, not only ensuring compliance with the COPPA Rule, but also providing additional safeguards, including for teenagers.
  •  Focus on competing on the merits and avoid anticompetitive behavior in the form of abusive and dominant data practices.

While Staff made several recommendations, the report offers little to no guidance on how to achieve these goals. What is clear is that it will not be enough for social media companies simply to issue policies regarding their use of data. Rather, they will need to enforce those policies and develop new technologies to ensure limited data collection, limited data sharing, and the other recommended safeguards and controls. This will also likely require a commitment by these companies to increase staff size in trust and safety, content, and other functions, and to improve processes and procedures. Notably, Staff recommended that social media companies provide greater protection for children and teenagers but did not identify how, and the “how” of protecting children and teens online has been the source of great debate, litigation, and legislation across the country, with no clear consensus on what to do or how to do it effectively while balancing social media companies’ own rights and liberties.

Lastly, while the study was precisely that, and does not expand or confer additional enforcement powers on the FTC, we would be hard pressed to imagine that the FTC will not use its existing enforcement power to bring claims against social media companies that do not heed these recommendations. The FTC’s powers under Section 5 and Section 6 are broad, and we expect to see privacy, data protection, and competition-related claims filed in short order against social media companies that do not comply.

We would like to thank Nicholas Pung, Senior Law Clerk*, for his contribution in preparing this alert.

*not admitted to practice law
