X Marks the Spot: Unpacking the EU's First DSA Fine

This blog post follows Laureline’s webinar on the topic with EU DisinfoLab.

On 5 December 2025, the European Commission issued its first financial penalty under the Digital Services Act, fining X €120 million for three distinct breaches of the regulation. The decision is consequential beyond its headline number: it signals how the Commission intends to read and enforce DSA obligations and sets the evidentiary and procedural template for cases to come. It also arrived at a moment of acute transatlantic tension, with the DSA itself under sustained political attack from the United States.

X immediately appealed the decision, but has paid the fine and is reportedly complying so far. Its pleas were only published recently, on 4 May 2026.

This post unpacks the substance of the infringements, the Commission's legal reasoning, X’s arguments and what this tells us about where DSA enforcement is heading.

Why X was fined – straightforward infringements:

The first non-compliance concerns Article 25(1) DSA on online interface design, and specifically the blue checkmark. X fundamentally changed how verification works: the badge, once reserved for notable public figures, journalists, and organisations, now simply comes with a paid X Premium subscription. The Commission's main objection is that X misappropriated the checkmark's historical meaning and departed from cross-industry standards, deceiving users as to the authenticity and reliability of checkmarked accounts.

Throughout the decision, the Commission repeatedly benchmarks X's interpretation of its legal obligations against other very large online platforms, a pattern that carries across the other infringements.

X was also found in breach of Article 39 DSA, which requires a public ad repository: the Commission found X's version unsearchable, unreliable, and missing most of the required information. The final infringement involves Article 40(12) DSA on researcher access to public data, where X was fined for imposing overly restrictive eligibility requirements and for significant shortcomings in its application and communication processes. Here, the comparison with other platforms shows that X is adopting unnecessarily narrow readings of its obligations and putting in place intentional barriers.

One of X’s main arguments against the findings is that the Commission provided no guidance on the interpretation and implementation of the relevant DSA provisions. X also points to a lack of legal clarity and to insufficient or misconstrued evidence. On this basis, X argued that it did what it could and that it would be unfair to deem it non-compliant.

The Commission's response is notable: it argues that Articles 25, 39, and 40(12) are self-executing provisions, requiring no further implementing measures or guidelines before they apply. Any guidelines the Commission might issue are discretionary and do not affect the articles' direct applicability, meaning their legal effect is immediate and binding regardless. The absence of Commission-issued guidelines cannot prevent individuals, companies, or national authorities from relying on these provisions or asserting rights and obligations under the DSA.

This first DSA case therefore presents relatively “easy” and straightforward infringements: the provisions are self-executing and clear enough that the violations are apparent on their face. They can be factually well documented and legally unambiguous, which is not the case for all DSA provisions. The risk assessment obligation under Article 34, for example, presents far greater complexity.

This leads us to X’s argument of insufficient evidence, and to the question of what evidentiary threshold is required to prove the infringements. With these provisions, the European Commission faces a comparatively lower burden of proof: it does not need to rely on novel interpretations or complex technical assessments, but can instead point to direct, observable non-compliance.

The Commission pushes back on X's arguments by asserting that straightforward infringements should not require a disproportionate burden of proof. It also explains that in terms of threshold, the evidentiary benchmark must be tailored to the specifics of each case, ensuring that the requirements for proof are proportionate to the nature and gravity of the infringement. Moreover, evidence should be assessed based on its “methodological soundness” and its factual relevance to interpreting and applying the provisions of the DSA. This is how the Commission decides which external study or report to rely on.

In such cases, third parties, such as researchers and civil society organisations, can play a valuable role in supporting the Commission, since evidence of non-compliance for these infringements can be easily documented without investigatory powers or access to confidential business information. The case against X illustrates that dynamic: the Commission drew on civil society reports, complaints, and interviews in building its findings.

A decision challenged in the courts:

X, X.AI and Elon Musk all appealed the decision, raising between six and eight grounds of challenge each, most of them already argued during the procedure and addressed in the Commission’s decision.

The challenges first target the joint liability of X, X.AI, and Elon Musk as a "single economic entity". X argues that only TIUC (Twitter International Unlimited Company) was designated as a VLOP and that the concept of "provider" cannot extend to parent companies or shareholders.

X also raises serious procedural complaints, alleging it was not properly notified of charges, was denied access to evidence, and had its rights of defence violated. This includes allegedly overly restrictive access to the file and the data room, which X said was subject to burdensome procedures. While acknowledging that challenging procedural aspects is a common legal tactic, and pushing back on the substance of X's arguments, a Commission representative still noted that lessons from ongoing cases (including issues around access to documents, confidentiality, and deadlines) could feed into the DSA's scheduled 2027 revision.

The second set of pleas challenges the Commission’s “unlawfully broad” interpretation of the DSA's substantive provisions, arguing that it imposed broader obligations regarding the ad repository and researcher access to data than the text actually requires. X.AI also argues that the Commission's interpretation of Article 25(1) breaches free speech and the freedom to conduct a business.

Finally, X contests the fine itself, arguing it was poorly reasoned and miscalculated.

The leaked decision already sets out the Commission's counterarguments to most of X's pleas, which make for compelling reading. As we have seen, given how straightforward the relevant DSA provisions and violations appear to be, X's challenge looks less like a genuine legal dispute and more like an exercise in bad faith.

In any case, it will be interesting to see how the courts respond, and what other decisions will be issued by the Commission, especially when it comes to more complex assessments and provisions.

Get in touch. Send an email or book a call directly with our specialists.