
WASHINGTON (dpa-AFX) - Meta Platforms, Inc. (META) issued an apology on Wednesday after a technical error in Instagram's recommendation algorithm caused users to encounter an influx of violent and graphic videos in their Reels feed.
The malfunction affected a broad audience, including minors, and exposed them to distressing footage of shootings, accidents, and severe injuries.
A Wall Street Journal reporter's Instagram feed was overwhelmed with graphic clips, including videos of people being shot, crushed by machinery, and ejected from amusement park rides.
These videos originated from accounts with names like 'PeopleDyingHub' and 'ShockingTragedies,' which the journalist did not follow. Data indicated that Instagram's algorithm had significantly amplified these posts, with some videos reaching millions of views, far exceeding the usual engagement on similar accounts.
The incident comes as Meta revises its content moderation policies to reduce perceived over-censorship. In a statement earlier this year, the company announced it would limit automated moderation to the most serious violations, such as terrorism and child exploitation, while relying on user reports for less severe content.
Critics argue that these changes, coupled with major layoffs in Meta's trust and safety teams, may have weakened the platform's ability to prevent harmful content from reaching users.
As Meta recalibrates its moderation approach, questions persist about the company's ability to protect users while maintaining a commitment to free expression.
Copyright(c) 2025 RTTNews.com. All Rights Reserved