Dovetail is a research group whose mission is to help humanity safely navigate the creation of powerful AI systems through foundational mathematics research into the nature of AI agents. The group works in a pre-paradigmatic field, agent foundations, which draws on dynamical systems, probability theory, information theory, causal inference, and related mathematical domains. Rather than building or evaluating machine learning systems empirically, Dovetail pursues purely theoretical work aimed at formalizing core AI safety concepts as mathematical theorems. The group is funded by the Long-Term Future Fund (LTFF) and the UK's Advanced Research + Invention Agency (ARIA), and runs a research fellowship program for graduate-level mathematicians.
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: -
- Funding Raised to Date: -
- Fiscal Sponsor: -
Theory of Change
Dovetail believes that many of the core arguments about AI danger can be formalized as precise mathematical theorems, and that doing so would provide clearer safety guidance, improve coordination among researchers, and legitimize policy interventions. Its two key bets are: (1) that conceptual arguments about existential risk from AI can be converted into rigorous mathematical statements, eliminating ambiguity and enabling better coordination; and (2) that safety-relevant properties of AI systems can be understood through general architectural properties, such as optimizer strength and proximity to ideal agent designs, using mathematical tools rather than purely empirical exploration. By establishing foundational paradigms for agent theory and deconfusing researchers about the nature of AI agents, Dovetail aims to make downstream safety research more targeted and effective.
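To make the second bet concrete, "optimizer strength" already has at least one well-known candidate formalization in the agent-foundations literature: Yudkowsky's notion of optimization power. The sketch below illustrates the kind of theorem-ready definition this research program trades in; it is an illustration from the broader literature, not a definition drawn from Dovetail's own publications.

```latex
% Illustrative only: Yudkowsky-style "optimization power", one candidate
% formalization of optimizer strength from the agent-foundations literature.
% Not asserted to be the definition Dovetail itself uses.
%
% Let O be the set of outcomes, U : O -> R a preference ordering, and mu a
% "null" baseline distribution over outcomes (e.g., outcomes under random
% action). If an optimizer reliably achieves outcome o, its optimization
% power is the negative log-probability that the baseline does at least as
% well:
\[
  \mathrm{OP}(o) \;=\; -\log_2 \Pr_{x \sim \mu}\bigl[\, U(x) \ge U(o) \,\bigr].
\]
% Reaching an outcome in the top 1/1024 of the baseline distribution thus
% corresponds to 10 bits of optimization power: stronger optimizers steer
% the world into exponentially smaller high-value regions.
```

Definitions of this shape are what allow informal claims like "stronger optimizers are harder to constrain" to be restated, and potentially proved or refuted, as precise mathematical statements.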
Grants Received
No grants recorded.
Projects
No linked projects.
People
No linked people.
Details
- Last Updated: Apr 2, 2026, 9:59 PM UTC
- Created: Mar 19, 2026, 10:30 PM UTC
