Centre for AI Security and Access
The Centre for AI Security and Access (CASA) addresses the tension between equitable AI access and necessary security. Through research and diplomacy, CASA works to ensure that advanced AI systems can be distributed widely and responsibly across the globe while maintaining appropriate safeguards. The organization prioritizes meaningful participation in global AI governance by historically underrepresented regions, particularly Global Majority countries in Africa and Southeast Asia. CASA publishes policy research, op-eds, and white papers, and runs programs such as the Africa AI Safety Prize Competition.
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: -
- Funding Raised to Date: -
- Fiscal Sponsor: Rethink Priorities
Theory of Change
CASA believes that AI safety and security are not barriers to equitable access but enablers of responsible innovation in Global Majority countries. By researching how to operationalize international AI benefit-sharing, and by bridging security with access, CASA aims to secure meaningful participation in global AI governance for historically underrepresented regions. Through policy research, prize competitions, and regional capacity building, CASA works to create practical safety tools tailored to local contexts and to make global AI governance frameworks more inclusive, ultimately reducing the risk that advanced AI systems are deployed without appropriate safeguards in underserved regions.
Grants Received
- Grant from Survival and Flourishing Fund
Details
- Last Updated: Apr 2, 2026, 9:49 PM UTC
- Created: Mar 18, 2026, 11:18 PM UTC