CeSIA (Centre pour la Sécurité de l'IA) is an independent French non-profit dedicated to preventing major risks from AI through research, education, and advocacy. The organization offers Europe's first accredited university-level AI safety courses at ENS Ulm and Paris-Saclay University, develops open-source evaluation tools like the BELLS benchmark for testing AI monitoring systems, and publishes the AI Safety Atlas textbook. CeSIA also engages in institutional advocacy, contributing to EU AI Act implementation and collaborating with bodies like the OECD and UNESCO. It runs the international ML4Good bootcamp program and led the Global Call for AI Red Lines, which was presented at the UN General Assembly in September 2025.
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: -
- Funding Raised to Date: -
- Fiscal Sponsor: EffiSciences
Theory of Change
CeSIA believes that reducing catastrophic AI risks requires building a robust culture of AI safety in France and Europe, regions that play a decisive role in global AI governance. Their theory of change operates through three channels: (1) training the next generation of AI safety researchers and engineers through university courses, bootcamps, and open educational resources, thereby growing the field; (2) developing technical evaluation tools like the BELLS benchmark that enable third-party assessment of AI safeguards, creating accountability mechanisms for AI developers; and (3) translating technical AI safety concerns into actionable policy recommendations for European and international institutions, helping establish regulatory frameworks and red lines that prevent the most dangerous AI capabilities from being deployed without adequate safeguards.
Grants Received
- From the Survival and Flourishing Fund
Projects
No linked projects.
People
No linked people.
Details
- Last Updated
- Apr 2, 2026, 10:09 PM UTC
- Created
- Mar 18, 2026, 11:18 PM UTC
