Our Work in Public Policy and Institutional Development
We monitor regulatory developments and provide evidence-based analysis to inform policies that strengthen information integrity, platform accountability, and institutional transparency.
Digital Services and Platform Regulation
We analyze the implementation of regulations such as the Digital Services Act and monitor platform compliance with regulatory frameworks at local, national, European, and international levels, providing evidence on their effectiveness and areas for improvement.
- DSA implementation analysis
- Content moderation standards monitoring
- Systemic risks framework assessment
AI Governance and Transparency
We research the responsible development and deployment of artificial intelligence systems, especially in content recommendation and automated analysis, contributing to the debate on algorithmic transparency.
- AI Act compliance analysis
- Algorithmic transparency in recommendation systems
- Automated decision-making assessment
Protection of Information Integrity
We develop and analyze strategies to strengthen information integrity while protecting fundamental rights and freedom of expression, based on evidence about which measures are effective.
- Evaluation of strategies for information integrity
- Media literacy framework analysis
- Information crisis response protocols
Media Literacy and Education
We promote the integration of media literacy and critical thinking skills in educational curricula and public awareness campaigns, contributing to the development of national frameworks.
- Contribution to national media literacy strategies
- Development of educator training programs
- Design of public awareness campaigns
Documents and Reports
Access our public policy reports, regulatory analyses, and evidence-based position papers.
Truth be dammed: One year after the Valencia floods, a deluge of disinformation persists. A study on climate dis/misinformation on YouTube and TikTok
The study shows that climate disinformation related to the 2024 Valencia DANA, spread on TikTok and YouTube, reached a wide audience and was amplified by the platforms’ own algorithms. The report demonstrates how this type of content distorts public debate during times of crisis and highlights that, despite fact-checking efforts, it continues to circulate even a year later, in some cases in direct violation of the platforms’ own policies.
Input on recurrent and prominent systemic risks in the EU and on measures for their mitigation
Contribution of Fundación Maldita.es to the first annual report on the most prominent and recurring systemic risks on large online platforms and search engines, which the European Board for Digital Services is to produce each year. In our contribution, we describe how disinformation can be particularly harmful to the safety and well-being of Europeans in various situations, including natural disasters and other emergencies, as well as elections.
Disinformation in online platforms targeting weather agencies: Analysis of systemic risk under the EU's Digital Services Act
Disinformation targeting entities such as AEMET undermines their public image, a crucial factor given their role in issuing warnings about adverse weather events. Although the DSA sets clear obligations, platforms are still not fully prepared to effectively mitigate the risks posed by such content, though there are differences among them. Measures such as promoting information from official authorities or labeling disinformative content can help improve the quality of online information during natural disasters.
Faster, trusted, and more useful: The Impact of Fact-Checkers in X's Community Notes
A study by Fundación Maldita.es on more than 1,175,000 Community Notes proposed by X users throughout 2024 reveals that 1 in every 27 included a link to an article from a fact-checker accredited by the European Fact-Checking Standards Network (EFCSN) or the International Fact-Checking Network (IFCN). This makes independent fact-checkers the third most cited source in Community Notes, only behind X itself and Wikipedia.
Standards for Advocacy, Public Policy and Lobbying Activities
Our public policy work is guided by strict ethical standards and principles of transparency. We maintain the highest levels of integrity in all our institutional interactions.
1. Transparency and Disclosure
We fully disclose our funding sources, policy positions, and institutional relationships. All lobbying activities are duly registered and documented according to applicable regulations.
2. Evidence-Based Advocacy
Our policy recommendations are based on rigorous research, data analysis, and expert consultation. We provide decision-makers with accurate and verified information.
3. Non-Partisan Approach
We maintain strict political neutrality and engage with all relevant stakeholders regardless of their political affiliation. Our focus is on evidence-based policy solutions.
4. Conflict of Interest Management
We identify and manage potential conflicts of interest through clear protocols and disclosure mechanisms, ensuring our advocacy remains independent and credible.
5. Respectful Engagement
We engage with institutional representatives professionally and respectfully, following established protocols in all our interactions.
6. Public Interest Focus
All our advocacy efforts are directed at promoting the public interest, democratic values, and the common good, rather than narrow sectoral interests.
Get Regulatory Updates Affecting Information Integrity Straight to Your Inbox
What will you receive each month?
- ✓ DSA and AI Act implementation analysis
- ✓ Monitoring of critical platform decisions
- ✓ Insights based on our fact-checking experience
- ✓ Early access to our research reports
Contact Us
Interested in information integrity policies? Get in touch.