AIRE provides continuously updated, curated information on the AI risk landscape to enable effective decision-making by policymakers, researchers, and journalists. The platform systematically gathers emerging evidence across four prominent risk categories — cyber offense, biological risk, loss of control, and manipulation — and delivers risk overviews, model evaluation databases, benchmarks, and threat-monitoring dashboards. AIRE is a project of Observatorio de Riesgos Catastróficos Globales (ORCG), a Spain-based network of professionals dedicated to the prevention of global catastrophes, and is fiscally sponsored through Players Philanthropy Fund.
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: -
- Funding Raised to Date: $212,510
- Fiscal Sponsor: Players Philanthropy Fund, Inc.
Theory of Change
AIRE believes that improving the quality and accessibility of information about AI risks is a lever for better policy decisions and risk management. By systematically monitoring AI capabilities, incidents, and threat vectors and making this evidence available to policymakers, researchers, and journalists, AIRE aims to close the information gap that currently hinders effective governance responses to advanced AI risks. Better-informed decision-makers are more likely to enact timely, evidence-based safeguards, reducing the probability of catastrophic AI-related outcomes.
Grants Received
No grants recorded.
Projects
No linked projects.
People
No linked people.
Details
- Last Updated: Apr 2, 2026, 9:50 PM UTC
- Created: Mar 19, 2026, 10:31 PM UTC