AI Safety Camp (AISC) is a non-profit initiative that runs a 3-month online research program where participants form teams to work on pre-selected AI safety projects. Each edition, AISC connects safety researchers and advocates with talented newcomers from diverse backgrounds and geographies, who commit at least 10 hours per week to tackle research or policy directions that are often neglected. The program serves as both an incubator for new AI safety research collaborations and a talent pipeline into the field, with alumni going on to take positions at organizations like Anthropic, DeepMind, and OpenAI, and to found new safety-focused organizations.
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: $126,000
- Funding Raised to Date: $600,000
- Fiscal Sponsor: -
Theory of Change
AI Safety Camp believes that the field of AI safety is severely talent-constrained, with far fewer researchers working on safety compared to capabilities. Their theory of change is to expand and diversify the AI safety talent pipeline by providing a low-barrier, structured entry point where newcomers from varied backgrounds and geographies can try their hand at concrete AI safety research under experienced mentors. By running a part-time, virtual program, they reach people outside the traditional Bay Area and London hubs who might never otherwise enter the field. The causal chain runs from recruitment and team formation, through hands-on research experience, to participants either producing novel safety research directly, transitioning into full-time AI safety careers, or founding new safety-focused organizations. At roughly $12,000-$30,000 per new researcher entering the field, AISC aims to be one of the most cost-effective talent development programs in AI safety.
Grants Received
from Survival and Flourishing Fund
from Survival and Flourishing Fund
Projects
No linked projects.
People
No linked people.
Details
- Last Updated: Apr 3, 2026, 4:48 PM UTC
- Created: Mar 18, 2026, 11:18 PM UTC