Facebook's Content Takedowns Take So Long They 'Don't Matter Much', Researchers Find

An anonymous reader shared this report from the Washington Post: Facebook's loosening of its content moderation standards early this year got lots of attention and criticism. But a new study suggests that it might matter less what is taken down than when. The research finds that Facebook posts removed for violating standards or other reasons have already been seen by at least three-quarters of the people who would be predicted to ever see them.

"Content takedowns on Facebook just don't matter all that much, because of how long they take to happen," said Laura Edelson, an assistant professor of computer science at Northeastern University and the lead author of the paper in the Journal of Online Trust and Safety. Social media platforms generally measure how many bad posts they have taken down as an indication of their efforts to suppress harmful or illegal material. The researchers advocate a new metric: How many people were prevented from seeing a bad post by Facebook taking it down...?

"Removed content we saw was mostly garden-variety spam — ads for financial scams, [multilevel marketing] schemes, that kind of thing," Edelson said... The new research is a reminder that platforms inadvertently host lots of posts that everyone agrees are bad.

Read more of this story at Slashdot.

May 3, 2025 - 17:53