Social media giants Meta and X approved ads targeting users in Germany with violent anti-Muslim and anti-Jewish hate speech in the run-up to the country’s federal elections, according to new research by Eko, a corporate responsibility nonprofit campaign group.
The group’s researchers tested whether the two platforms’ ad review systems would approve or reject submissions for ads containing hateful and violent messaging targeting minorities ahead of an election in which immigration has taken center stage in mainstream political discourse — including ads containing anti-Muslim slurs; calls for immigrants to be imprisoned in concentration camps or gassed; and AI-generated imagery of mosques and synagogues being burned.
Most of the test ads were approved within hours of being submitted for review in mid-February. Germany’s federal elections are set to take place on Sunday, February 23.
Hate speech ads scheduled
Eko said X approved all 10 of the hate speech ads its researchers submitted just days before the federal election is due to take place, while Meta approved half (five ads).
The reason Meta gave for the five rejections indicated the platform believed there could be risks of political or social sensitivity that might influence voting.
However, the five ads Meta approved included violent hate speech likening Muslim refugees to a “virus,” “vermin,” or “rodents,” branding Muslim immigrants as “rapists,” and calling for them to be sterilized, burnt, or gassed. Meta also approved an ad calling for synagogues to be torched to “stop the globalist Jewish rat agenda.”
As a side note, Eko says that none of the AI-generated imagery it used to illustrate the hate speech ads was labeled as artificially generated — yet half of the 10 ads were still approved by Meta, despite the company having a policy requiring disclosure of the use of AI imagery in ads about social issues, elections, or politics.
X, meanwhile, approved all five of these hateful ads — along with a further five containing similarly violent hate speech targeting Muslims and Jews.
These additional ads included messaging attacking “rodent” immigrants that the ad copy claimed are “flooding” the country “to steal our democracy,” and an antisemitic slur suggesting that Jews are lying about climate change in order to destroy European industry and accumulate economic power.
The latter ad was paired with AI-generated imagery depicting a group of shadowy men sitting around a table surrounded by stacks of gold bars, with a Star of David on the wall behind them — visuals that also lean heavily on antisemitic tropes.
Another ad X approved contained a direct attack on the SPD, the center-left party currently leading Germany’s coalition government, with a bogus claim that the party wants to take in 60 million Muslim refugees from the Middle East, before going on to try to whip up a violent response. X also duly scheduled an ad suggesting “leftists” want “open borders” and calling for the extermination of Muslim “rapists.”
Elon Musk, X’s owner, has used the social media platform, where he has close to 220 million followers, to intervene personally in the German election. In a December tweet, he called on German voters to back the far-right AfD party to “save Germany.” He has also hosted a livestream with AfD leader Alice Weidel on X.
Eko’s researchers disabled all the test ads before any that had been approved were scheduled to run, to ensure no platform users were exposed to the violent hate speech.
The group says the tests highlight glaring flaws in the ad platforms’ approach to content moderation. Indeed, in the case of X, it is not clear whether the platform is doing any moderation of ads at all, given that all 10 of the violent hate speech ads were quickly approved for display.
The findings also suggest the ad platforms could be earning revenue as a result of distributing violent hate speech.
The EU’s Digital Services Act in the frame
Eko’s tests suggest that neither platform is properly enforcing the bans on hate speech that both claim to apply to ad content in their own policies. Moreover, in the case of Meta, Eko reached the same conclusion after conducting a similar test in 2023, before the EU’s new online governance rules came into force — suggesting the regime has had no effect on how the company operates.
“Our findings suggest that Big Tech’s AI-driven ad moderation systems remain fundamentally broken, despite the Digital Services Act (DSA) now being in full effect,” an Eko spokesperson told TechCrunch.
“Rather than strengthening its ad review process or hate speech policies, Meta appears to be backtracking across the board,” they added, pointing to the company’s recent announcement about rolling back its moderation and fact-checking policies as a sign of “active regression” that they suggested puts it on a direct collision course with DSA rules on systemic risks.
Eko has submitted its latest findings to the European Commission, which oversees enforcement of key aspects of the DSA on the two social media giants. It also said it shared the results with both companies, but neither responded.
The EU has open DSA investigations into Meta and X that include concerns about election security and illegal content, but the Commission has yet to conclude these proceedings. Back in April, though, it said it suspects Meta of inadequate moderation of political ads.
A preliminary decision on a portion of its DSA probe of X, announced in July, included suspicions that the platform is failing to comply with the regulation’s ad transparency rules. However, the full investigation, which kicked off in December 2023, also concerns illegal content risks, and the EU has yet to arrive at any findings on the bulk of the probe well over a year later.
Confirmed breaches of the DSA can attract penalties of up to 6% of global annual turnover, while systemic non-compliance could even lead to regional access to violating platforms being temporarily blocked.
But, for now, the EU is still taking its time to make up its mind on the Meta and X probes, so — pending final decisions — any DSA sanctions remain up in the air.
Meanwhile, it is now just a matter of hours before German voters go to the polls — and a growing body of civil society research suggests that the EU’s flagship online governance regulation has failed to shield the democratic process of the bloc’s biggest economy from a range of tech-fueled threats.
Earlier this week, Global Witness released the results of tests of X’s and TikTok’s algorithmic “For You” feeds in Germany, which suggest the platforms are biased in favor of promoting AfD content over that of other political parties. Civil society researchers have also accused X of blocking their access to data to prevent them from studying election security risks in the run-up to the German poll — access the DSA is supposed to enable.
“The European Commission has taken important steps by opening DSA investigations into both Meta and X; now we need to see the Commission take strong action to address the concerns raised as part of these investigations,” the Eko spokesperson also told us.
“Our findings, alongside mounting evidence from other civil society groups, show that Big Tech will not clean up its platforms voluntarily. Meta and X continue to allow illegal hate speech, incitement to violence, and election disinformation to spread at scale, despite their legal obligations under the DSA,” the spokesperson added. (We have withheld the spokesperson’s name to prevent harassment.)
“Regulators must take strong action — both in enforcing the DSA and, for example, in implementing mitigation measures ahead of elections, along with other appropriate ‘break-glass’ measures to prevent the algorithmic amplification of borderline content, such as hateful content, in the run-up to elections.”
The campaign group also warns that the EU is now facing pressure from the Trump administration to soften its approach to regulating Big Tech. “In the current political climate, there’s a real risk that the Commission doesn’t fully enforce these new laws as a concession to the U.S.,” they suggest.