Meaning Alignment Institute
The Meaning Alignment Institute (MAI) researches how to align AI, markets, and democracies with what people truly value, a problem it calls full-stack alignment. Founded in 2023 by Joe Edelman, Ellie Hain, and Oliver Klingefjord, the institute develops novel approaches including Democratic Fine-Tuning (DFT), which gathers moral information from diverse populations to shape language-model behavior, and Moral Graph Elicitation, a process for democratically mapping human values. Its work bridges philosophy, AI alignment, and democratic theory, drawing on a network of 30+ collaborating researchers across institutions including MIT, Google DeepMind, Carnegie Mellon, and Oxford.
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: -
- Funding Raised to Date: -
- Fiscal Sponsor: The Hack Foundation (Hack Club)
Theory of Change
MAI believes that current AI alignment approaches are incomplete because they reduce human values to simple preference signals, yielding systems that optimize for engagement or stated preferences rather than genuine human flourishing. Its theory of change operates at two levels. First, at the AI level, MAI develops Democratic Fine-Tuning and Moral Graph Elicitation to create AI systems that embody collective human wisdom rather than narrow value proxies, producing models that are wise rather than merely compliant. Second, at the institutional level, its full-stack alignment framework argues that safe AI requires aligned institutions, so MAI researches new economic and democratic mechanisms built around meaning rather than attention or consumption. By providing rigorous philosophical foundations for what human values actually are (Thick Models of Value) and practical methods for eliciting and encoding them, MAI aims to create AI systems and institutions that support human flourishing rather than undermine it.
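MAI's published moral graph work represents elicited values as nodes connected by directed "wiser than" edges that participants endorse in a given context. A minimal sketch of such a structure follows; the class names, fields, and example values are illustrative assumptions for this page, not MAI's actual schema or method.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ValueCard:
    """A named value plus the attentional criteria participants report using."""
    title: str
    criteria: tuple

@dataclass
class MoralGraph:
    values: set = field(default_factory=set)
    # Directed edges per context: (a, b) means participants judged
    # value b as wiser than value a in that context.
    edges: dict = field(default_factory=dict)

    def add_wisdom_edge(self, context: str, a: ValueCard, b: ValueCard) -> None:
        """Record an endorsed judgment that b is wiser than a in `context`."""
        self.values.update({a, b})
        self.edges.setdefault(context, []).append((a, b))

    def wisest(self, context: str) -> set:
        """Values in this context that no other value dominates."""
        pairs = self.edges.get(context, [])
        dominated = {a for a, _ in pairs}
        endorsed = {v for pair in pairs for v in pair}
        return endorsed - dominated

# Hypothetical usage: two values elicited for an advice-giving context.
graph = MoralGraph()
engagement = ValueCard("Engagement", ("time on site", "clicks"))
informed = ValueCard("Informed choice", ("options considered", "trade-offs named"))
graph.add_wisdom_edge("giving advice", engagement, informed)
print(graph.wisest("giving advice"))  # the undominated value(s) in that context
```

The point of the structure is that it preserves contextual wisdom judgments between values rather than collapsing them into a single preference score, which is the distinction the theory of change draws against preference-signal approaches.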
Grants Received
from Survival and Flourishing Fund
from Survival and Flourishing Fund
Projects
No linked projects.
People
No linked people.
Details
- Last Updated: Apr 2, 2026, 9:49 PM UTC
- Created: Mar 18, 2026, 11:18 PM UTC