By Clare Duffy, CNN
Photo: Fabian Sommer/picture alliance/dpa/Getty Images via CNN Newsource
Meta is apologizing for a technical error after some users said they saw violent, graphic videos in their Instagram Reels feed.
"We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended," a Meta spokesperson said in a statement. "We apologize for the mistake."
Numerous Instagram users reported on Tuesday seeing a stream of recommended videos in their Reels feeds showing people being beaten or killed, including many that the platform had labeled "sensitive content."
Meta did not comment further on what caused the glitch.
The issue comes as Meta has been pushing to boost short-form video engagement on its platforms, with rival TikTok's future in the United States in question. TikTok has just over a month remaining to find a new, non-Chinese owner for its US operations or face a ban in the country, under a law passed by Congress and signed last year by then-President Joe Biden; President Donald Trump extended the compliance deadline in January.
Instagram has been seeking to attract users wary about the possibility of losing TikTok by introducing features similar to those popular on its biggest competitor, such as a longer time limit for videos and a "tap to pause" option for Reels.
Meta is also set to launch a new video creation app called Edits in the coming weeks, similar to CapCut, the ByteDance-owned app that many creators use to make short-form videos. (ByteDance is TikTok's parent company.)
Meta has also recently made significant and controversial changes to its content moderation policies and practices. The company said in January it would do away with fact checkers in favor of a user-generated, "Community Notes" model of adding context to posts, and that it would scale back its automated systems for removing content to focus on only the most extreme rules violations, such as terrorism, child sexual exploitation, drugs, fraud and scams.
When CEO Mark Zuckerberg announced the moderation changes, he acknowledged that the company would "catch less bad stuff" on its platforms, but said the tradeoff would allow more free speech.
However, a Meta spokesperson said the error that caused users to see violent videos Wednesday was unrelated to the content moderation changes.
- CNN