Meta removed 23 million pieces of harmful content on Facebook, Instagram in November


Meta has announced that it took action against a total of 23 million pieces of harmful content created by users in India across its social media apps Facebook and Instagram in November. As per the company’s India Monthly Report, it removed about 19.52 million pieces of content on Facebook and over 3.39 million pieces of content on Instagram between November 1 and November 30.
Action taken on Facebook, Instagram
The policy areas under which action was taken on Facebook content include “Spam” (14.9 million), “Adult Nudity and Sexual Activity” (1.8 million), and “Violent and Graphic Content” (1.2 million), among others.
Similarly, the policy areas under which action was taken on Instagram content include “Suicide and Self-Injury” (1 million), “Violent and Graphic Content” (727.2K), and “Adult Nudity and Sexual Activity” (712K), among others.

By “action taken”, the company means the number of pieces of content (such as posts, photos, videos or comments) that Meta acted on for violating its standards. “This metric shows the scale of our enforcement activity. Taking action could include removing a piece of content from Facebook or Instagram or covering photos or videos that may be disturbing to some audiences with a warning,” the company said.
Complaints received on Facebook
Meta says that it received 889 complaints on Facebook under the IT Rules, 2021, and that it provided tools for users to resolve their issues in 511 of the reported cases.
Facebook users complained the most about their accounts being hacked, followed by lost access to pages that the users managed, bullying or harassment, and content showing users in nudity or partial nudity or in a sexual act, according to the report.

Complaints received on Instagram
Meta said in its report that it received 2,368 complaints from Instagram users under the IT Rules, 2021. The most common complaints related to hacked accounts (939), followed by fake profiles (891), bullying or harassment (136), and content showing the user in nudity or partial nudity or in a sexual act (94), among others.
The company provided tools for users to resolve their issues in 1,124 cases.
Under the IT Rules, 2021, large digital and social media platforms with more than 5 million users must publish monthly compliance reports.



