The FTC wants to stop Meta from debuting new products and services that profit from the data it collects until the company complies with privacy requirements for minors.
The agency on Tuesday proposed changes to its 2020 privacy order with Facebook, citing the company's incomplete compliance with that order. It also accused Facebook of deceiving parents about the control they had over their children's communication on the Messenger Kids app and of misrepresenting the access it granted certain app developers to users' private data.
The FTC reached a $5 billion settlement with the company in 2019 over allegations that it violated a 2012 FTC privacy order by misleading users about their ability to control the privacy of their data.
"Facebook has repeatedly violated its privacy promises," said Samuel Levine, Director of the FTC's Bureau of Consumer Protection. "The company's recklessness has put young users at risk, and Facebook needs to answer for its failures."
The proposed changes would prevent Meta from profiting from the data it collects from minors, including data acquired through its virtual reality products. The company would also face tighter restrictions on its use of facial recognition technology and be required to provide additional protections for its users.
They include a blanket prohibition on monetizing the data of children and teens under 18, a pause on launching new products and services, an extension of compliance obligations to companies Meta merges with or acquires, limits on future uses of facial recognition technology, and a strengthening of existing requirements.
Facebook believes it's being unfairly targeted.
"Let's be clear about what the FTC is trying to do: usurp the authority of Congress to set industry-wide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil," a Facebook spokesperson said.
"FTC Chair Lina Khan's insistence on using any measure— however baseless— to antagonize American business has reached a new low," they continued.
TikTok probably isn't the best comparison for Meta to invoke in response to its alleged violations of the agreement. The US government has voiced concerns over the platform, and proposed legislation could restrict or ban TikTok outright.
Facebook's orders & violations
The 2020 privacy order required Facebook to pay a $5 billion civil penalty, expand its privacy program, and give an independent third-party assessor a larger role in evaluating the effectiveness of that program. For example, under the 2020 order, Facebook must conduct a privacy assessment of any new or altered product, service, or process before implementation and document its risk mitigation determinations.
According to the Order to Show Cause, the independent assessor found several weaknesses in Facebook's privacy program, which the order says pose significant risks to the public.
Furthermore, the FTC has asked Facebook to address accusations that, from late 2017 to mid-2019, it misrepresented to parents the extent of control they had over their children's communication in the Messenger Kids product. Facebook promised that the app would let kids communicate only with parent-approved contacts.
In certain circumstances, however, kids could communicate with unapproved contacts in group text chats and video calls. The FTC says these misrepresentations violated the 2012 order, the FTC Act, and the COPPA Rule.
The COPPA Rule requires operators of websites or online services directed at children under 13 to notify parents and obtain verifiable parental consent before collecting personal data from those children.
The FTC has formally requested that Meta respond within 30 days to the proposed findings from the agency's investigation as it seeks to modify the 2020 order. Accordingly, the Commission voted 3-0 to issue the Order to Show Cause, which marks the beginning of a process in which Meta will have a chance to respond.
After reviewing the facts and any arguments from Meta, the FTC will determine whether modifying the 2020 order is in the public interest or justified by changed conditions of fact or law.
Comments
My first thought for the FTC is, "Yeah, good luck getting Facebook to truly care about privacy." The irony here is that market forces are already taking care of what the FTC probably can't: the number of people under 30 using Facebook (the original blue app and website) is much smaller than it was 10-15 years ago.
FB long ago became pretty useless. I only get a few updates from friends and the groups I'm in (where the value lies) and a feed filled with sponsored and “suggested for you” articles, most by people I’ve never heard of, most poorly written with very little information (just filler text), and the page with the article is 75% ads.
LOL. The only thing I trust less than Facebook is the FTC.
Facebook can break the law and federal regulations and get away with it. As mentioned in the story, Facebook has been collecting data on your minor children illegally since at least 2012. Of course, nobody under 13 is supposed to be on FB in the first place, but negligent/lazy parents exist — and Facebook is there to take advantage of that.