The European Union has accused Meta and TikTok of violating their legal transparency obligations under the bloc’s landmark Digital Services Act (DSA), intensifying regulatory scrutiny of how large online platforms manage access to public data and handle reports of harmful content. The European Commission’s preliminary findings, released this week, outline concerns that both companies failed to provide researchers with sufficient access to public information—one of the key mechanisms designed to ensure accountability and safeguard users in the digital space.
The DSA, which took full effect in 2024, is the EU’s most ambitious regulatory effort to curb online harms, disinformation, and opaque algorithmic practices. Under its rules, platforms with at least 45 million monthly active users in the EU—classified as “Very Large Online Platforms” (VLOPs)—are required to make data transparency a central feature of their operations. The Commission’s findings mark a new stage in the bloc’s digital governance campaign, moving from general oversight to targeted enforcement.
Why Transparency Is Central to the DSA
Transparency is the cornerstone of the DSA’s regulatory architecture. EU policymakers view independent research access as vital to exposing the societal impact of algorithm-driven systems—particularly their influence on mental health, misinformation, and political discourse. Platforms like Facebook, Instagram, and TikTok are obligated to provide a secure and structured way for academics and approved organizations to study their data.
However, the Commission’s preliminary review suggests that Meta and TikTok have introduced obstructive and overly complicated processes that discourage legitimate researchers from accessing the data they need. Meta’s systems, for example, reportedly impose “unnecessary steps and additional demands,” including interface designs that are “confusing and dissuasive.” TikTok, meanwhile, is accused of creating procedural barriers that make public data requests unduly difficult or slow.
The Commission warned that these measures “undermine transparency and limit public scrutiny,” preventing independent experts from assessing how platform algorithms amplify or suppress certain types of content. For regulators, this is more than a procedural issue—it’s a matter of public accountability in an era when digital ecosystems influence everything from elections to adolescent behaviour.
How the Breach Allegations Developed
The EU’s investigation stems from months of dialogue and repeated compliance checks since the DSA began applying to major online platforms. Regulators had signalled early concerns that Meta’s and TikTok’s transparency tools were inconsistent with the law’s intent. Despite the platforms’ assurances, officials found that the companies’ “researcher access portals” did not deliver meaningful access to the raw or anonymized datasets necessary for public-interest studies.
In addition, both platforms were found lacking in their reporting mechanisms for illegal or harmful content, particularly in relation to the detection of child sexual abuse material and terrorist propaganda. The Commission noted that Facebook and Instagram, both owned by Meta, did not provide an intuitive or easily accessible system for users to flag such content. These design flaws, regulators say, risk allowing harmful material to circulate for longer than necessary, weakening the credibility of internal moderation efforts.
The EU’s stance is that transparency and content safety are not separable obligations—they function together to create a safer, more accountable digital environment. Weakness in one undermines the other.
The Companies Push Back
Both Meta and TikTok have defended their practices, arguing that they are committed to meeting DSA obligations while balancing other legal requirements. Meta maintains that it has upgraded its content-reporting tools and data-access procedures since the DSA took effect. Company representatives say they are “confident” that recent changes meet the EU’s legal standards and that ongoing engagement with regulators will clarify any remaining issues.
TikTok, for its part, acknowledges the importance of transparency but warns that easing access to platform data may conflict with the EU’s General Data Protection Regulation (GDPR)—the bloc’s signature privacy law. Company officials argue that compliance with both frameworks can create contradictions, since transparency may require sharing user data while GDPR imposes strict limits on its disclosure. TikTok has called for regulators to provide “greater clarity” on how to reconcile these obligations without violating user privacy.
Nevertheless, the Commission’s position is clear: compliance with one law does not justify breaching another. Platforms must design systems that uphold both transparency and privacy simultaneously, using anonymization and consent-based mechanisms to protect personal data while still enabling public oversight.
This latest move underscores how the EU is redefining global tech governance by turning its regulatory focus toward platform accountability rather than content control. Unlike earlier EU rules that dealt with copyright or competition, the DSA seeks to make systemic design choices—such as algorithms, recommendation systems, and data access—subject to public oversight.
The investigations into Meta and TikTok serve as a test case for how far Brussels can push multinational companies to open their digital black boxes. If the findings are upheld, the Commission can impose fines of up to 6% of a company’s global annual turnover, a potentially multibillion-dollar penalty given the scale of these firms. The process also sets a precedent for other large platforms, including X (formerly Twitter), YouTube, and Amazon, which are under similar review for transparency and safety compliance.
For the EU, enforcing the DSA is not just about policing Big Tech—it’s about reinforcing democratic oversight in a digital economy dominated by algorithmic decision-making. Regulators argue that researchers must be able to scrutinize how these systems shape social trends, influence elections, and affect users’ mental health.
The Challenge of Balancing Privacy and Openness
The debate around data access is not simply technical—it’s philosophical. Regulators insist that transparency is essential for public trust, but companies argue that unrestrained access could expose users to privacy risks or lead to misuse of information. The challenge lies in designing systems that anonymize data effectively while still offering meaningful insights into how platforms function.
In practice, compliance requires advanced data-sharing frameworks that separate identifiable information from aggregated trends. The EU has encouraged the use of “data sandboxes” and controlled-access repositories, where accredited researchers can study content patterns without downloading personal data. Yet many platforms have been slow to implement these measures, citing security risks and intellectual-property concerns.
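To make the idea of aggregation-only access concrete, here is a minimal illustrative sketch, not any platform’s actual system: the post records, field names, and suppression threshold are hypothetical, and the point is simply that a controlled-access tool can return de-identified content trends to researchers without ever exposing individual users’ data.

```python
from collections import Counter

def aggregate_hashtag_trends(posts, k_threshold=100):
    """Return de-identified hashtag counts from public posts.

    Only aggregate counts leave this function: user identifiers and
    post text are never included in the output, and tags seen fewer
    than k_threshold times are suppressed so rare content cannot be
    traced back to small groups of users.
    """
    counts = Counter()
    for post in posts:
        for tag in post.get("hashtags", []):
            counts[tag.lower()] += 1
    return {tag: n for tag, n in counts.items() if n >= k_threshold}

# Hypothetical sample records; real researcher access would run inside
# a controlled environment rather than on exported raw data.
sample_posts = [
    {"user_id": "u1", "hashtags": ["Election2024", "news"]},
    {"user_id": "u2", "hashtags": ["election2024"]},
]
print(aggregate_hashtag_trends(sample_posts, k_threshold=2))
# -> {'election2024': 2}
```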
Meta’s use of so-called “dark patterns”—interface designs that nudge users toward certain choices—also came under scrutiny. The Commission believes that these design elements may discourage users from reporting harmful content or exercising privacy rights, thereby undermining the DSA’s intent to empower digital citizens.
Both companies will now have an opportunity to respond formally to the Commission’s findings before any final decision is made. If they can demonstrate that new systems or policy adjustments bring them into compliance, penalties may be mitigated or avoided. However, the EU’s track record suggests limited tolerance for noncompliance: previous cases under the GDPR have resulted in record fines for major tech firms.
The broader lesson for digital platforms is unmistakable—the era of voluntary transparency is over. Under the DSA, openness is a legal duty, not a public-relations choice. As the EU consolidates its reputation as a global standard-setter in digital regulation, tech giants face mounting pressure to align their systems, processes, and corporate culture with European expectations of accountability and consumer protection.
The findings against Meta and TikTok mark another step in the EU’s push to bring order to the online world. In Brussels’ view, transparency is not a courtesy—it is the foundation of digital democracy.
(Source: www.reuters.com)