
ML Alignment & Theory Scholars (MATS)
MATS (ML Alignment & Theory Scholars) is an independent research and educational program that connects talented researchers with top mentors in AI alignment, interpretability, governance, and security. Each cohort brings together approximately 100-120 fellows for an intensive 12-week research program in Berkeley, California, or London, UK, with an optional 6-12 month funded extension. Fellows receive stipends, compute budgets, housing, meals, and dedicated research management support. Since late 2021, MATS has trained 446+ researchers, producing 170+ publications with 9,500+ collective citations, and approximately 80% of alumni now work directly in AI safety.
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: -
- Funding Raised to Date: -
- Fiscal Sponsor: -
Theory of Change
MATS operates on the premise that AI alignment research is pre-paradigmatic, with diverse potentially promising research agendas. The program's theory of change centers on identifying exceptionally talented individuals, pairing them with established alignment researchers as mentors, and accelerating their development into independent researchers capable of pursuing original agendas. By supporting many different alignment research agendas simultaneously, MATS aims to decorrelate failure across approaches. The pipeline from fellowship to extension to full-time positions creates a sustained talent flow into AI safety organizations and labs. At scale, MATS functions as the primary feeder program for the AI safety research ecosystem, with 80% of alumni going on to work in the field and 10% co-founding new safety organizations.
Grants Received
from Open Philanthropy
from Open Philanthropy
from Survival and Flourishing Fund
from Open Philanthropy
from Open Philanthropy
from Survival and Flourishing Fund
Projects
No linked projects.
People
No linked people.
Details
- Last Updated: Apr 2, 2026, 10:10 PM UTC
- Created: Mar 18, 2026, 11:18 PM UTC