
Cambridge Boston Alignment Initiative
The Cambridge Boston Alignment Initiative (CBAI) advances research and education aimed at ensuring a safe and beneficial transition to advanced AI systems. Founded in 2022, the organization operates fully funded research fellowship programs for students and early-career researchers in both technical AI safety and AI governance, along with CAMBRIA, a three-week ML upskilling bootcamp focused on interpretability and alignment. CBAI also serves as fiscal sponsor for MIT AI Alignment (MAIA) and the Harvard AI Safety Student Teams (AISST), supporting Boston-area community infrastructure, office space, and programming for the broader AI safety ecosystem.
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: -
- Funding Raised to Date: -
- Fiscal Sponsor: -
Theory of Change
CBAI's theory of change centers on the idea that a primary bottleneck to reducing existential risk from advanced AI is the shortage of talented, well-trained researchers working on alignment and safety. By running research fellowships and technical bootcamps in the Boston/Cambridge area — home to Harvard, MIT, and Northeastern — CBAI aims to accelerate the career trajectories of promising students and early-career researchers, exposing them to mentorship from leading safety researchers and giving them hands-on research experience. Supporting student organizations at Harvard and MIT through fiscal sponsorship builds durable institutional communities that recruit and train new cohorts of safety-minded researchers each year. Taken together, these programs are intended to meaningfully increase the quantity and quality of people capable of producing high-impact AI safety research, thereby improving the field's ability to make AI development go well for humanity.
Grants Received
Seven grants from Open Philanthropy (amounts not listed).
Projects
No linked projects.
People
No linked people.
Details
- Last Updated: Apr 2, 2026, 9:58 PM UTC
- Created: Mar 20, 2026, 2:34 AM UTC