
AI Safety Initiative at Georgia Tech (AISI)
The AI Safety Initiative at Georgia Tech (AISI) is a community of technical and policy researchers working to reduce risks from advanced artificial intelligence, train the next generation of researchers, and steer the trajectory of AI development for the better. The group runs an introductory AI safety fellowship based on BlueDot Impact's curriculum, rapid upskilling cohorts using the ARENA technical curriculum, and supervised research projects targeting arXiv or workshop publication. AISI also organizes hackathons, speaker events, and policy programs, and has partnered with organizations including Anthropic and Apart Research. Its membership spans undergraduates, master's and PhD students, and working professionals from across the Georgia Tech community.
Funding Details
- Annual Budget: —
- Monthly Burn Rate: —
- Current Runway: —
- Funding Goal: —
- Funding Raised to Date: —
- Fiscal Sponsor: —
Theory of Change
AISI believes that training and supporting the next generation of AI safety researchers and policy professionals at a top technical university is a high-leverage intervention for reducing long-term AI risk. By providing structured education through fellowships and upskilling programs, AISI moves talented students from awareness into active AI safety research and careers. By facilitating original research projects and connecting students to funding and publication opportunities, AISI aims to grow the pipeline of competent researchers working on alignment and governance. Hosting hackathons and policy programs brings AI safety framing to the broader technical community at Georgia Tech, expanding the base of people engaging seriously with these risks. AISI's consulting and grant-writing support services multiply impact by enabling other researchers and departments to pursue safety-relevant work they might not have undertaken independently.
Grants Received
No grants recorded.
Projects
No linked projects.
People
No linked people.
Details
- Last Updated: Apr 2, 2026, 10:10 PM UTC
- Created: Mar 19, 2026, 10:31 PM UTC