Harvard University hosts a constellation of AI safety and governance programs across its schools and institutes. The Kempner Institute for the Study of Natural and Artificial Intelligence, founded with a $500 million pledge from the Chan Zuckerberg Initiative, brings together 420+ researchers to study the foundations of intelligence in natural and artificial systems alike. The Berkman Klein Center for Internet & Society conducts research on AI ethics, governance, and accountability. Harvard also supports the Harvard AI Safety Team (HAIST/AISST), a student-led group focused on technical AI safety that has received multiple grants from Open Philanthropy. The Harvard Kennedy School and the School of Engineering and Applied Sciences (SEAS) contribute further through AI governance policy research, interpretability labs, and dedicated AI safety coursework.
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: -
- Funding Raised to Date: -
- Fiscal Sponsor: -
Theory of Change
Harvard's AI safety-relevant programs operate through multiple pathways. The Kempner Institute aims to deepen fundamental understanding of how intelligence works in artificial systems, on the premise that scientific understanding of AI systems is a prerequisite to making them safe and beneficial. The Berkman Klein Center's theory of change is that rigorous, evidence-based governance research can steer policymakers, corporations, and regulators toward better AI oversight frameworks before harms become entrenched. HAIST and AISST focus on growing the pipeline of technically skilled AI safety researchers: they identify talented students early and provide mentorship, community, and research experience, on the belief that more researchers working on AI safety dramatically increases the odds of solving alignment problems before transformative AI is deployed.
Grants Received
- from Open Philanthropy
- from Open Philanthropy
- from Open Philanthropy
- from Open Philanthropy
Projects
No linked projects.
People
No linked people.
Details
- Last Updated: Apr 2, 2026, 10:10 PM UTC
- Created: Mar 20, 2026, 2:34 AM UTC
