
September 19, 2024

FTC report on predatory social media data hoarding hints at future regulations

A new FTC report on how social media and streaming sites collect and monetize their hoards of user data doesn’t really feature a lot of surprises for anyone who’s followed the space. It’s more helpful to consider it part of a paper trail the agency is laying down in order to justify new regulations.

The report has its roots way back in late 2020, when the FTC ordered nine of the tech companies with the biggest data collection operations to disclose numerous aspects of how their surveillance capitalism business models worked. (The companies: Amazon, Facebook, YouTube, Twitter, Snap, ByteDance, Discord, Reddit, and WhatsApp.)

What data do you collect, on whom, and how long is it kept? If asked to delete, do you do so? What do you use it for, who do you sell it to, and what do they use it for? The questions are quite comprehensive, the better to leave little room for prevarication or obfuscation through the withholding of important information.

The responses of the companies were, predictably, evasive, as the FTC’s Bureau of Consumer Protection Director Samuel Levine notes in the preface:

Echoing the way that firms conceal and hide their collection practices, many of the Companies provided the Commission with limited, incomplete, or unhelpful responses that appeared to have been carefully crafted to be self-serving and avoid revealing key pieces of information.

The resulting report details all manner of shenanigans, representing both malice and incompetence. Few of the practices disclosed will surprise anyone at this point, but the executive summary starting on page 9 is a great refresher on all the skulduggery we have come to expect from the likes of these.

Of course, it has been nearly four years since then, and many of the companies have made changes to their practices, or have been fined or otherwise chastised. But despite the elevation of Lina Khan to Chair of the FTC subsequent to this inquiry, there has been no large revision or expansion of rules that lay down bright lines like “thou shalt not sell data on a user’s health challenges to advertisers.”

One exception you might hope for, compliance with the Children’s Online Privacy Protection Act, also seems to be an afterthought. As the FTC writes:

…In an apparent attempt to avoid liability under the COPPA Rule, most [social media and video streaming services] asserted that there are no child users on their platforms because children cannot create accounts. Yet we know that children are using SMVSSs. The SMVSSs should not ignore this reality…Almost all of the Companies allowed teens on their SMVSSs and placed no restrictions on their accounts, and collected personal information from teens just like they do from adults.

Meta allegedly ignored obvious violations for years; Amazon settled for $25 million after “flouting” the law; TikTok owner ByteDance is the target of a similar lawsuit filed just last month.

So what’s the point of the report, if all this is known?

Well, the FTC has to do its due diligence too when considering rules that could restrict a bunch of multi-billion-dollar global tech companies. If the FTC in 2020 had said, “These companies are out of control, we propose a new rule!” the affected industries could quite justifiably have challenged it by saying there was no evidence of the kind of practices the rule would prohibit. This kind of thing happened with net neutrality as well: the broadband companies challenged it on the basis (among other things) that the harms were overstated, and won.

Though Chair Khan’s statement accompanying the report suggests it will help inform state and federal lawmakers’ efforts (which is likely true), it is almost certain that the report will also serve as the factual foundation for a new rulemaking. The very fact that the companies admit to doing these things, and have been caught red-handed doing others in the meantime, would strengthen any argument for new regulations.

Khan also fends off dissent from within, from Commissioners who (despite voting unanimously to issue the report) accuse it of attempting to regulate speech or dictate business models. She dispatches these arguments with the confidence of someone already drafting a proposal.

That proposal (should it exist) would likely be aimed at clipping the wings of those companies that have come to embody entire industries within themselves. As Khan puts it:

…It is the relative dominance of several of these platforms that gives their decisions and data practices an outsized impact on Americans. When a single firm controls a market and is unchecked by competition, its policies can effectively function as private regulation. A consolidated market is also more susceptible to coordination with—or cooptation by—the government. Unchecked private surveillance by these platforms creates heightened risk of improper surveillance by the state. How these markets are structured can result in greater risks to—or greater protections of—people’s core liberties.

In other words, let’s not leave it to them, and the FTC likely doesn’t intend to.

Keep reading the article on TechCrunch


FTC Says Social Media Platforms Engage in ‘Vast Surveillance’ of Users

Social media platforms are engaging in “vast surveillance” of people online and failing to protect children, according to a new report from the U.S. Federal Trade Commission. And if you thought Big Tech was serious about calling for FTC Chair Lina Khan to be fired before, just wait until this report properly trickles through Silicon Valley today.

The FTC issued a warning letter back in late 2020 to nine social media and video streaming services alleging their operations were “dangerously opaque” and said their data collection techniques and algorithms were “shrouded in secrecy.” The companies—Amazon, Facebook, YouTube, X, Snap, ByteDance, Discord, Reddit, and WhatsApp—were told the FTC would be investigating their practices, and Thursday’s report is the result of those efforts.

The report notes that the amount of data collected by large tech companies is enormous, even using the words “simply staggering” to describe how users and non-users alike can be tracked in myriad ways. And the data that’s collected directly by the platforms is then combined with data from third-party brokers to compile an even more detailed picture of any given person, according to the FTC.

“They track what we do on and off their platforms, often combining their own information with enormous data sets purchased through the largely unregulated consumer data market. And large firms are increasingly relying on hidden pixels and similar technologies—embedded on other websites—to track our behavior down to each click,” the FTC report reads.

“In fact, the Companies collected so much data that in response to the Commission’s questions, they often could not even identify all the data points they collected or all of the third parties they shared that data with,” the report continues.
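
For context on what the report means by “hidden pixels,” the snippet below is a minimal, generic sketch of how such a tracker typically works in the browser: a small script (or an invisible 1x1 image) embedded on a third-party site sends a request to the tracking company’s server on every page view or click, carrying details about the visit along with any cookies already set for that domain. This is an illustration only, not code from the report or from any of the named companies; the tracker.example domain, the parameter names, and the firePixel function are all hypothetical.

// Hypothetical TypeScript sketch of a "hidden pixel" embedded on a publisher's site.
function firePixel(eventName: string): void {
  // Bundle details about the visit into query parameters (names are made up).
  const params = new URLSearchParams({
    e: eventName,                 // which interaction happened
    u: window.location.href,      // the page the user is on
    r: document.referrer,         // where they came from
    t: Date.now().toString(),     // when it happened
  });
  // Requesting a 1x1 transparent image sends this data (plus any cookies
  // previously set for the tracker's domain) to the tracking company.
  const img = new Image(1, 1);
  img.src = `https://tracker.example/pixel.gif?${params.toString()}`;
}

// A site embedding the pixel might fire it on every click:
document.addEventListener("click", () => firePixel("click"));

Because the same tracker domain is embedded across many unrelated sites, those requests can be stitched together into a single cross-site profile of the user, which is the behavior the report describes.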

The report also warns that AI is complicating the picture even more, with companies feeding data into their artificial intelligence training without consistent approaches to monitoring or testing standards.

The report lists things the FTC would like policymakers to do, emphasizing that “self-regulation is not the answer,” while also laying out changes the big tech companies should make. On the policymaker side, the FTC says Congress should pass comprehensive federal privacy legislation to limit surveillance and give consumers rights over their data. The FTC also advocates for new privacy legislation that it says would “fill in the gap in privacy protections” left by the Children’s Online Privacy Protection Act of 1998, better known as COPPA.

As for the companies, the FTC wants to see these platforms limit data collection and implement “concrete and enforceable data minimization and retention policies.” The FTC also calls on the companies to limit the sharing of data with third parties and to delete consumer data when it’s no longer needed. The report further calls on companies to “not collect sensitive information through privacy-invasive ad tracking technologies,” which include pixel trackers, and to give better protections to teens.

But, again, this report is likely to only increase the calls for Khan to be fired, which have grown louder in the business community in recent months.

“The report lays out how social media and video streaming companies harvest an enormous amount of Americans’ personal data and monetize it to the tune of billions of dollars a year,” Lina Khan said in a statement published online.

“While lucrative for the companies, these surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking. Several firms’ failure to adequately protect kids and teens online is especially troubling. The Report’s findings are timely, particularly as state and federal policymakers consider legislation to protect people from abusive data practices.”

Gizmodo reached out to all nine of the tech companies mentioned by name in the new report, but only Discord and Google responded immediately, while Meta, which owns Facebook and WhatsApp, declined to comment.

Google gave Gizmodo a very short statement about the 129-page report, focusing only on rather narrow issues like reselling data and ad personalization for kids.

“Google has the strictest privacy policies in our industry—we never sell people’s personal information and we don’t use sensitive information to serve ads,” Google spokesperson José Castañeda said over email. “We prohibit ad personalization for users under 18 and we don’t personalize ads to anyone watching ‘made for kids content’ on YouTube.”

Discord sent a more robust statement and believes its business is very different from those of the other eight companies mentioned in the report.

“The FTC report’s intent and focus on consumers is an important step. However, the report lumps very different models into one bucket and paints a broad brush, which might confuse consumers and portray some platforms, like Discord, inaccurately,” said Kate Sheerin, Head of US/Canada Public Policy for Discord.

“The report itself says ‘the business model varies little across these nine companies.’ Discord’s business model is very different—we are a real-time communications platform with strong user privacy controls and no feeds for endless scrolling. At the time of the study, Discord did not run a formal digital advertising service, which is a central pillar of the report. We look forward to sharing more about Discord and how we protect our users.”

We’ll update this post if we hear back from any of the other companies referenced in the FTC report.
