Palisade Research is a nonprofit that studies the capabilities and motivations of AI agents to better understand the risk of losing control to AI systems. Founded in 2023, the organization conducts empirical research on frontier AI models, focusing on areas including autonomous hacking, shutdown resistance, spear phishing and deception, and scalable disinformation. Palisade creates concrete demonstrations of dangerous capabilities to advise policymakers, the public, and AI developers on the risks posed by agentic AI systems. Their work spans technical research, policy engagement in Washington DC, and a growing science communication program.
Funding Details
- Annual Budget: $3,232,257
- Monthly Burn Rate: -
- Current Runway: 7 months
- Funding Goal: $1,133,000
- Funding Raised to Date: $4,500,000
- Fiscal Sponsor: -
Theory of Change
Palisade Research believes that demonstrating concrete, empirically verified examples of dangerous AI capabilities is essential for motivating appropriate policy responses and technical safety measures. By researching and publicly demonstrating risks such as autonomous hacking, shutdown resistance, and deception capabilities in frontier AI models, they aim to create a shared understanding among policymakers, the public, and AI developers of the threats posed by agentic AI systems. Their theory of change operates through three channels: technical research that produces rigorous evidence of AI risks, policy engagement that translates research findings into actionable governance recommendations, and science communication that builds broad public understanding. The ultimate goal is to help humanity maintain control over increasingly capable AI systems and avoid scenarios where AI agents could permanently disempower human decision-making.
Grants Received
- from Open Philanthropy
- from Survival and Flourishing Fund
- from Open Philanthropy
- from Survival and Flourishing Fund
- from Survival and Flourishing Fund
Details
- Last Updated: Apr 2, 2026, 9:50 PM UTC
- Created: Mar 18, 2026, 11:18 PM UTC