How Can the EU Define Illegal Online Content?


By Shakeel Amjad

As disinformation proliferates in the wake of Hamas’s recent deadly assault in Israel, Brussels is taking a firm stance, demanding that X (formerly known as Twitter) and Meta (which includes Facebook and Instagram) clamp down on fake and misleading online posts.

Thierry Breton, the EU Commissioner known as the “digital enforcer,” has sounded the alarm. He sent letters to Elon Musk, the owner of X, and Mark Zuckerberg, the head of Meta, insisting they provide details within 24 hours about how they are removing “illegal content and disinformation” from their platforms in accordance with the EU’s new Digital Services Act (DSA). This legislation, in effect for large platforms since August, imposes fines of up to six percent of a company’s global turnover for hosting illegal online content.

Breton’s warning triggered an online exchange between him and Musk. Musk requested that the EU commissioner “please list the violations you allude to on X” and emphasized his platform’s policy of being open source and transparent, a stance he claimed aligns with EU principles.

Breton countered that it was Musk’s responsibility to “demonstrate that you walk the talk” and noted that his team was prepared to “enforce rigorously” the DSA compliance rules.

Musk responded on X: “No back room deals. Please post your concerns explicitly on this platform.” He also expressed his confusion, stating, “I still don’t know what they’re talking about!”

In a bold move, Breton used his X account to endorse Bluesky, a rival platform he had just joined, whose board includes Twitter co-founder Jack Dorsey. Musk has shown sensitivity to competing microblogging platforms that are siphoning off X/Twitter users disgruntled with the direction he is taking the platform. At one point, Musk, a self-proclaimed free-speech “absolutist,” even had X block links directing users to one open-source rival, Mastodon, and he initiated legal action against Threads, a new Meta alternative that is not available in Europe owing to concerns about EU regulatory oversight.

X has attracted significant attention under the DSA’s scrutiny, especially due to Musk’s cost-cutting measures that led to the removal of content moderation teams and other employees. In a pilot EU analysis of illegal online content, X fared the worst among platforms and withdrew from a voluntary EU code of practice for combating disinformation.

In his letter to Musk, Breton highlighted the “violent and terrorist content that appears to circulate on your platform” following the Hamas operation targeting Israelis living near the Gaza border. Breton’s letter to Zuckerberg urged him to ensure that moderation systems on Meta platforms were effective and to maintain vigilance regarding DSA compliance amid the ongoing conflict.

In response, a Meta spokesperson explained that after the Hamas attack, they swiftly established a special operations center staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to the evolving situation.

In the ongoing battle against disinformation, the EU has also set its sights on Meta, urging the social media giant to step up its efforts to curb the spread of false and misleading information following Hamas’s assault in Israel.

A spokesperson for Meta confirmed that their teams are actively engaged in ensuring compliance with “our policies or local law” and are working in coordination with fact-checkers to combat disinformation. The spokesperson emphasized their commitment to continuing these efforts as the conflict unfolds.

Beyond the immediate concerns surrounding the Israel-Hamas conflict, EU Commissioner Thierry Breton also highlighted the critical need for Meta to address disinformation related to upcoming elections in the EU. The EU is gravely concerned about the potential impact of “fake and manipulated images and facts generated with the intention to influence elections,” and Breton underscored the seriousness of the issue.

It’s worth noting that Commissioner Breton does not unilaterally determine what constitutes illegal online content; this definition is established by EU laws or legislation in EU member countries. Nevertheless, Breton plays an active role in drawing attention to the responsibilities of online platforms in adhering to these regulations.

Breton, who has likened the chaotic landscape of online content to an “online Wild West,” has been a vocal proponent of the EU’s new rules and has played a central role in championing the Digital Services Act, a set of regulations designed to establish greater accountability among digital platforms.

One notable consequence of the increased scrutiny on X has been the withdrawal of advertisers and the departure of several prominent celebrities and newsmakers from the platform.

The recent unrest in the Israel-Gaza border region has inundated major online platforms with a deluge of distressing content, including videos depicting violent deaths, hostage situations, and bombings. On X, the authenticity and accuracy of many of these posts have come under question, primarily due to changes in the platform’s verification system. X has shifted from its blue-tick verification system to one where any user can purchase verification, making it challenging to distinguish legitimate sources from unverified ones.

Furthermore, X has begun stripping headlines and descriptions from links to news articles, displaying only an image with no context. This shift has sparked concerns about the potential for misinformation to spread unchecked.

In a separate dispute over the use of news content, the international news agency AFP has initiated legal action against X in France, seeking to enforce EU rules on payments for the use of online content. The move underlines the growing demand for accountability and adherence to regulation in the ever-evolving digital landscape.

In conclusion, the ongoing battle against disinformation in the digital age demands increased vigilance and regulatory measures. The EU’s call for Meta to address the surge in misleading information highlights the need for social media platforms to uphold their responsibilities. As the landscape of online content continues to evolve, it is essential for both platforms and regulators to adapt swiftly to combat disinformation effectively.
