Senior Holocaust Memorial Historian Won’t Press Like on Changes to Facebook’s Moderation Policy
Experts warn that the policy changes may lead to increased online antisemitism and even election meddling
In a seismic shift from its past moderation policies, Meta, Facebook’s parent company, announced the end of its fact-checking program and a move to a community notes model. Meta CEO Mark Zuckerberg mentioned the reelection of President Donald Trump during the announcement, noting that it marked “a cultural tipping point toward prioritizing free speech,” and many Trump supporters have celebrated the news. According to experts in antisemitism and bias, though, the policy may end up amplifying hate speech, with potentially profound consequences for Jews, Israelis, and vulnerable communities everywhere.
Meta said that the new community notes model will “take a more personalized approach to political content” and will remove restrictions on some topics “that are part of mainstream discourse.” It said that professional moderators’ biases under the previous model had led to unwarranted censorship.
Robert Rozett, senior historian at Yad Vashem’s International Institute for Holocaust Research, described the new policy as “a step in the wrong direction,” noting that free speech has its limits.
“It’s essential to refine these boundaries in a way that protects individuals and groups, such as Jews, who are frequently targeted online. Achieving a balance between preserving free expression and safeguarding against harm is crucial,” he told The Media Line. “Everyone has the right to live without being subjected to hatred and attacks.”
Tehilla Shwartz Altshuler, head of the Democracy in the Information Age program at the Israel Democracy Institute, said that the policy change will have significant effects on Jews and on transgender people, who are often victims of online hate.
“Obviously Zuckerberg is trying to align himself with the Trump administration,” Shwartz Altshuler told The Media Line. “It’s not about ideology; it’s political calculus. Meta is pivoting hard to the right, mirroring Silicon Valley’s capital shift since Donald Trump’s November victory.”
Arik Segal, founder of the Conntix tech mediation agency and an expert in digital diplomacy, told The Media Line that financial incentives played a part in Meta’s decision as well. “The less content that is deleted for being fake means more money for Meta, and fake and controversial content is known to drive more traffic, which means more screen time and more money from advertisers,” he explained.
Facebook’s shift away from content moderation has precedent in changes to the social media platform X, formerly known as Twitter. When businessman Elon Musk acquired the platform, he imposed an approach of free speech absolutism, leading to the proliferation of antisemitism and other hate speech.
Segal pointed to influencer Dan Bilzerian, who has nearly 2 million followers on X, as an example of the content that has thrived under Musk’s approach. Bilzerian has said that Jews were behind the September 11 attacks, that Nazi Germany “flourished” following persecution of Jews, and that Jews control the media, among many other antisemitic comments.
“We will probably see similar profiles and pages on Facebook,” Segal said.
Shwartz Altshuler similarly said that X’s move to a community notes model turned it into “the biggest neo-Nazi platform in history.”
“Meta’s new plan is to outdo Elon Musk’s anything-goes approach on X, turning Facebook and Instagram into platforms of unchecked chaos,” she said.
She warned that the change in policy would lead not just to increased hate but to all-out digital warfare. “It’s about election meddling, social unrest, and real wars,” she said.
The problems of community-led moderation predate X. Rozett pointed to Wikipedia, which is moderated entirely by community members. “There are issues there, too,” he said. “Because Wikipedia grants privileges to long-term editors rather than experts, it’s tough for a Holocaust scholar like me to challenge distortions effectively. This demonstrates a broader limitation of community regulation: self-regulation is only as good as the community behind it, enabling pockets of hate speech to persist and grow.”
Under Facebook’s new system, bad actors may weaponize the community notes tool, the experts warned. Hateful content will have a larger reach and will stay up for longer—if ever taken down at all.
Under the new system, low-level infractions will be addressed only if flagged by a user, rather than being automatically marked for review. “This is going to take weeks to be flagged, which is like an eternity in internet time,” Shwartz Altshuler said. “This will further divide America politically.”
Similarly, the Anti-Defamation League warned that relying on user reports was no replacement for a proactive moderation policy. “We know user reporting is broken, so for them to say they’re going to rely on that for antisemitism and hate without announcing some kind of big overhaul of reporting is tantamount to giving up,” a representative for the organization told The Media Line.
While experts criticized Meta’s new moderation policies, a notable percentage of Facebook, Instagram, and LinkedIn users celebrated the decision on social media. Many hailed the decision as a victory for free speech, with some saying that Facebook, as a private entity, should not attempt to behave like a government. Others said that Facebook isn’t meant to edit or curate content but simply to provide an open platform for users.
Nadja Bar-Ner, a security analyst specializing in cognitive warfare, welcomed Facebook’s new moderation policies. “They tried to moderate speech, and look what happened. If I go on Instagram, if I go on Facebook, I’m overwhelmed by antisemitism and straight-out hate speech and terror apologists,” he told The Media Line. He accused Facebook’s moderators of being biased against Israel and censoring pro-Israel points of view.
“There should be limits to what can be said or shared, but these are very extreme things, such as planning violence, for example,” he continued. “Beyond that, moderation not only was biased but was also acting like makeup, and people could not realize how much hate there really is out there.”
Meta has experienced a significant stock rally in recent months, surging over 65% in 2024 and maintaining a positive outlook for 2025. Analysts have raised their price target for Meta stock to $715. That increase has to do with Meta’s plans to introduce generative AI tools to its vast user base of nearly 4 billion individuals and over 200 million businesses. Despite anticipated increases in capital expenditures to support AI initiatives, analysts project Meta’s earnings per share to exceed $26 in 2025 and surpass $30 in 2026, indicating sustained growth.
According to Segal, Meta’s increased use of AI may have unpredictable consequences. “With the ease of using AI apps, things will become even more chaotic. Some people will leave the platforms, like what is happening now with X, and some will see it as an opportunity to make a profit and perhaps offer solutions,” he said. “However, it all depends on the dynamics of politics-tech relations.”
Rozett said that the problem of hate speech on social media platforms requires a more coordinated response. “What’s missing is a serious, ongoing international effort involving experts from different fields and platforms to figure out effective solutions that truly address the intricacies of this problem, which isn’t going away anytime soon,” he said. “We need all available tools: technology, education, regulations, and thoughtful policy.”