Facebook to crack down on spam and engagement farming
Meta is cracking down on spam and engagement farming on the Facebook Feed. Now, if someone could please tell Elon to do the same over at X.


If you're bombarded with spam every time you log onto Facebook, you aren't alone: Meta said so itself.
Meta announced on Thursday that it's making changes to the Facebook Feed to get rid of spammy content and give users more control over what shows up in their feeds. Meta acknowledged that some accounts game the system to inflate views, followers, and, as a result, monetization, leading to a fairly unpleasant scrolling experience. To fight back, the platform is lowering the reach of accounts sharing spammy content, removing accounts that coordinate fake engagement or impersonate others, and protecting and elevating creators who share original content.
So, what does this mean for creators and users? Well, creators' content might get seen more easily, and your feed should, in theory, get more enjoyable. But if you're an engagement farmer or part of a network built around the success of spammy content, you might want to pivot (please).
"Meta’s platforms are built to be places where people can express themselves freely," the company wrote in a blog post. "Spammy content can get in the way of one’s ability to ultimately have their voices heard, regardless of one’s viewpoint, which is why we’re targeting the behavior that’s gaming distribution and monetization. We want the creator community to know that we’re committed to rewarding creators who create and share engaging content on Facebook. This is just one of the many investments to ensure creators can succeed on Facebook."
The social media platform said it is lowering the reach of accounts sharing spammy content, like when a user posts a particularly long caption with too many hashtags or a caption that doesn't have anything to do with the video. Meta also said it would take "more aggressive steps" to prevent accounts from coordinating fake engagement, like networks of fake comments or fake pages that exist only to inflate reach.
"Along with these efforts, we’re also exploring ways to elevate more meaningful and engaging discussions. For example, we’re testing a comments feature so people can signal ones that are irrelevant or don’t fit the spirit of the conversation," Meta wrote.
The platform will also take down profiles that impersonate content creators. In 2024, it removed 23 million such profiles, and now it intends to take down more.
"In addition to the proactive detection and enforcements we have in place to identify and remove imposters, we’ve added features to Moderation Assist, Facebook’s comment management tool, to detect and auto-hide comments from people potentially using a fake identity. Creators will also be able to report impersonators in the comments," Meta wrote.
Finally, the company said it would enhance Rights Manager to help creators protect their original content from being reused and shared on other accounts without their permission, as this "takes unfair advantage of [creators'] hard work."
This all comes shortly after Meta CEO Mark Zuckerberg's 2022 comments were uncovered, showing he feared that Facebook might be losing cultural relevancy. And Facebook really is losing cultural relevancy, along with users. All the while, more AI slop is taking over social media platforms, and Meta is facing a massive antitrust trial.