On 2 December 2025, the Court of Justice of the European Union delivered a judgment that will reverberate far beyond the facts of the case at hand. In Russmedia Digital and Inform Media Press (C-492/23), the Court fundamentally recalibrated the role of online platforms under EU data protection law. The message is clear and uncompromising: where online marketplaces process personal data contained in user advertisements, they cannot shield themselves behind the traditional hosting privilege. The GDPR applies in full, and it applies first.
The decision is not limited to classical classified advertisement portals. Its reasoning potentially affects any platform that enables user-generated content containing personal data and monetises the dissemination of that content. In that sense, Russmedia Digital marks a decisive step away from the long-standing notion of the “neutral intermediary”.
A fake advertisement with very real consequences
The case arose from an advertisement published on the Romanian classified ads platform publi24.ro. The advertisement falsely portrayed a woman, using her real photographs and telephone number, as offering sexual services. The content was published without her knowledge and without any form of consent. Shortly after publication, the advertisement was copied verbatim and republished on other websites, each referencing the original source.
Russmedia Digital removed the advertisement from its platform less than one hour after receiving a complaint. At that point, however, the damage had already been done: the content continued to circulate on other sites beyond the control of the original platform. The affected individual brought an action for damages, claiming non-material harm resulting from the unlawful processing of her personal data and the violation of her rights to privacy, honour, reputation and personal portrayal.
Against this background, the referring Romanian court essentially asked whether the platform operator could rely on the liability exemptions of the E-Commerce Directive (now mirrored in the Digital Services Act), or whether the GDPR independently imposed responsibility and liability for the processing of the personal data contained in the advertisement.
The questions referred: GDPR responsibility versus provider privilege
The questions referred to the Court sought clarification on two closely connected points. First, whether the operator of an online marketplace that allows users to post advertisements (anonymously and either free of charge or for remuneration) fails to comply with its obligations under the GDPR where an advertisement contains personal data, including sensitive personal data, in breach of that regulation. Second, whether the liability exemptions for intermediary service providers under Articles 12 to 15 of the E-Commerce Directive are applicable in such a situation.
In essence, the Court was asked to decide whether data protection responsibility can be displaced by sector-specific privileges for intermediaries (such as platform providers). The Court’s answer leaves little room for doubt.
Platform operators as (joint) controllers under the GDPR
The CJEU qualifies the operator of an online marketplace such as Russmedia Digital as a (joint) controller within the meaning of the GDPR, notwithstanding the fact that the concrete content of the advertisement originates from a user. The decisive criterion is not authorship of the content, but influence over the purposes and means of processing.
Russmedia Digital did not merely provide technical storage “for” advertisers. According to its general terms and conditions, the platform reserved extensive rights to use the published content (including the personal data contained therein) for its own commercial purposes. These rights covered dissemination, transmission, reproduction, modification, translation, transfer to partners and removal of content at any time, without the need to justify such actions. In doing so, the platform pursued its own economic interests linked to the circulation of the data.
Beyond this contractual framework, the platform shaped the processing in multiple ways. It defined categories and headings for advertisements, determined presentation, duration, visibility, ranking and target audience, and organised the overall structure through which the data were disseminated. It also enabled anonymous postings, thereby facilitating the publication of personal data without any built-in assurance that the advertiser was entitled to disclose them.
Taken together, these elements led the Court to conclude that the platform exercised decisive influence over the essential elements of the processing. It therefore participated in determining both purposes and means and could not escape responsibility by arguing that it did not itself determine the content of the advertisement. Such an argument, the Court emphasised, would be incompatible with the broad, functional and protection-oriented concept of “controller” enshrined in the GDPR.
Why special categories of data are at the core of the decision
A central pillar of the judgment concerns the application of Article 9 GDPR. The Court reiterates that data relating to a natural person’s sex life or sexual orientation fall squarely within the special categories of personal data that benefit from enhanced protection. This concept must be interpreted broadly, in light of the particularly serious interference with fundamental rights that such processing may entail.
Crucially, the Court clarifies that the classification of data as sensitive does not depend on their truthfulness. Even false information alleging sexual behaviour or services retains its character as data concerning sex life. The fact that the content is untrue and harmful does not diminish, but rather reinforces, the need for heightened protection, given the severe impact such data can have on the affected person’s rights and dignity.
Concrete obligations for platforms dealing with sensitive data
Where sensitive data are concerned, the Court formulates concrete and proactive obligations for platform operators acting as controllers. Before publication, they must implement appropriate technical and organisational measures enabling them to identify advertisements that contain special categories of data, verify whether the advertiser is the person whose sensitive data appear in the content, and refuse publication where this is not the case unless the advertiser can demonstrate explicit consent or another applicable exception under Article 9(2) GDPR.
This shifts data protection firmly into the design phase of platform services. For sensitive data, “data protection by design” is no longer a general aspiration but a requirement to screen, assess and, where necessary, block content before it goes live.
From a practical perspective, this raises significant challenges. Proof of valid consent is inherently fragile: consent can be forged, must be freely revocable and, under Article 7(3) GDPR, must be as easy to withdraw as it was to give. Platforms must therefore implement mechanisms ensuring that withdrawals reach them effectively and are acted upon without delay.
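The "acted upon without delay" requirement can be illustrated with a minimal sketch of a withdrawal handler. Again, everything here is hypothetical: the in-memory dictionaries stand in for a platform's real data stores, and a production system would need durable storage, propagation to caches and partners, and an auditable record.

```python
from datetime import datetime, timezone

# Hypothetical in-memory stores; a real platform would use durable storage.
ads_by_consent = {"consent-123": ["ad-1", "ad-7"]}
live_ads = {"ad-1", "ad-7", "ad-9"}
audit_log = []

def withdraw_consent(consent_id: str) -> list[str]:
    """Take every advertisement relying on the withdrawn consent offline
    immediately, and record when the withdrawal took effect."""
    affected = ads_by_consent.pop(consent_id, [])
    for ad_id in affected:
        live_ads.discard(ad_id)
        audit_log.append((ad_id, "unpublished", datetime.now(timezone.utc)))
    return affected
```

The design choice worth noting is that unpublishing happens synchronously in the withdrawal path itself, rather than in a deferred batch job, mirroring the "without delay" standard.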
The judgment leaves open how to assess situations in which users upload their own sensitive data. The Court does not explicitly address whether such conduct constitutes “explicit consent” or whether the data should be regarded as “manifestly made public” within the meaning of Article 9(2)(e) GDPR. While there are arguments in favour of this interpretation, significant uncertainty remains as to whether and to what extent such a legal basis would also cover onward transfers and further dissemination to third parties. In light of purpose limitation and good faith processing, a narrow interpretation will often be the safer approach.
In addition, the Court requires platform operators to implement security measures aimed at preventing sensitive advertisements from being copied and unlawfully republished on other websites, taking into account the risks involved and the state of the art. While such copying can never be entirely prevented, the obligation is to make it meaningfully more difficult through appropriate safeguards.
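What "making copying meaningfully more difficult" might mean in practice can be sketched at the HTTP layer. The function below is an illustrative assumption, not a measure named in the judgment: it composes response headers that a platform might attach to sensitive listing pages to keep them out of search caches and archives, which is one plausible safeguard among several (rate limiting, watermarking, rendering contact data as images would be others).

```python
def protective_headers() -> dict[str, str]:
    """Headers a platform might send with sensitive listing pages to make
    automated copying and re-indexing harder (not impossible)."""
    return {
        "X-Robots-Tag": "noindex, noarchive, nosnippet",  # keep out of search indexes and caches
        "Cache-Control": "no-store, private",             # discourage intermediary caching
        "Referrer-Policy": "no-referrer",                 # do not leak the page URL onward
    }
```

None of these measures prevents deliberate scraping, which is consistent with the Court's framing: the obligation is one of appropriate safeguards relative to risk and the state of the art, not of guaranteed prevention.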
Implications for “ordinary” personal data
Although the case revolves around sensitive data, its implications go further. The Court recalls that all personal data processed on online marketplaces are subject to the general principles of the GDPR. Processing must be lawful, fair and transparent, based on a valid legal basis, accurate, kept up to date and secured by appropriate technical and organisational measures.
The enhanced ex ante duties formulated by the Court (such as prior identification of problematic content and identity verification) are explicitly articulated only for special categories of data. However, this does not absolve platforms from assessing the legality of processing “ordinary” personal data. Where users post information about themselves, implicit consent or self-initiated publication will often be available. The situation is far more delicate when data relate to third parties who have no relationship with the platform.
Here, reliance on legitimate interests frequently reaches its limits. The judgment in Mousse v CNIL and SNCF Connect (C-394/23) underscores that legitimate interests can serve as a legal basis only if they are communicated to the data subject clearly and in good time. For platform-external individuals, such transparency is often practically impossible. As a result, reliance on legitimate interests for third-party data will, in many cases, be difficult to defend robustly.
GDPR versus E-Commerce Directive and DSA
The Court unequivocally rejects the idea that the liability privilege of the E-Commerce Directive can displace GDPR responsibility. Questions of personal data processing are governed exclusively by the GDPR. The host provider privilege neither limits nor replaces data protection obligations and offers no shield against GDPR violations.
This reasoning carries over directly to the Digital Services Act. While the DSA limits content-related or civil liability for third-party content, it does not alter the classification of platform operators as controllers where they process personal data. The Court stresses that sector-specific (liability and content) privileges must not undermine the GDPR’s protective framework. This approach aligns with the EDPB’s understanding of the DSA as complementary to, and without prejudice to, data protection law.
In practice, this means that DSA privileges are layered on top of existing GDPR obligations; they do not dilute them.
Practical significance and liability exposure
The Russmedia Digital judgment is likely to trigger significant reassessment among platform operators. It requires a genuine paradigm shift. Platforms that allow sensitive personal data to be published must actively ensure their protection, even where content originates from users. In particular, operators must verify whether the person depicted or described is actually the advertiser or whether valid consent exists. Absent such verification, publication must be refused.
This will fundamentally alter platform workflows, moderation processes and technical architectures and may significantly reduce the volume of permissible content. At the same time, the liability stakes are high. Platforms that continue to rely on the fiction of neutrality risk not only reputational harm but also substantial administrative fines and civil damages claims.
Conclusion
The Russmedia Digital judgment marks a turning point in EU data protection law for online platforms. It leaves no room for exceptions based on intermediary status and confirms that the provider privilege has no place in the GDPR context.
Those who set the framework for publication must also bear responsibility for its consequences. Allowing false and harmful content to circulate under the guise of technical neutrality is no longer acceptable under EU law. Platform operators will need to rethink their business models, technical systems and compliance strategies to meet this new reality.

