The Federal Trade Commission wants to significantly expand a record-breaking $5 billion consent order with the company from 2020, saying that Meta had failed to fully meet the legal commitments it made to overhaul its privacy practices to better protect its users.
Regulators also said Meta had misled parents about their ability to control whom their children communicated with on its Messenger Kids app and misrepresented the access it gave some app developers to users’ private data.
The proposed changes mark the third time the agency has taken action against the social media giant over privacy issues.
“The company’s recklessness has put young users at risk,” Samuel Levine, the director of the FTC’s Bureau of Consumer Protection, said in a press statement. “Facebook needs to answer for its failures.”
The FTC’s administrative action, called an “order to show cause,” lays out the commission’s accusations against Meta as well as its proposed restrictions. The FTC’s proposed changes would bar Meta from profiting from the data it collects from users younger than 18, including through Facebook, Instagram, Oculus headsets and Horizon Worlds, the company’s new virtual reality platform. Regulators want to ban the company from using that data even after those young users turn 18.
That means Meta could be prohibited from using the details about young people’s activities to show them ads based on their behavior or nudge them to buy digital items, like virtual clothes for their avatars. The proposed changes could have significant financial repercussions because user data is a key to Meta’s advertising business, which is how it makes the vast majority of its money.
The aggressive action marks the first time that the commission has proposed such a blanket ban on the use of data in order to protect the online privacy of minors. And it arrives amid the most sweeping government drive to insulate young Americans online since the 1990s, when the commercial internet was still in its infancy.
Fueled by mounting concerns about depression among children and the role that potentially harmful online experiences could play in exacerbating it, lawmakers in at least two dozen states over the past year have introduced bills that would require certain sites, like social networks, to bar or limit young people on their platforms. Regulators are also intensifying their efforts, imposing fines on online services whose use or misuse of data could expose children to risks.
Over the past few years, critics have faulted Meta for recommending content on self-harm and extreme dieting to teenage girls on Instagram, as well as for failing to sufficiently protect young users from child sexual exploitation.
Meta, which has 30 days to challenge the filing, was not given advance notice of the action by the FTC.