Live Theory is an AI safety research initiative led by Sahil Kulshrestha, operating under the Groundless Alignment umbrella (groundless.ai). The project develops approaches to theorization and AI interface design that do not rely on fixed formal foundations, instead using abundant, fluid AI intelligence to adapt to and connect intellectual contributions. Core work areas include developing alternative theoretical frameworks for AI safety (diffuse concepts), creating adaptive interfaces and infrastructure (including Live Conversational Threads, Vibe Decoding, and a Soloware Platform), and identifying emerging AI risks related to deception, hallucination, and systemic insensitivity. The initiative operates in partnership with AI Safety Camp and is fiscally sponsored by the Czech organization Epistea, z.s.
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: -
- Funding Raised to Date: -
- Fiscal Sponsor: Epistea, z.s.
Theory of Change
Live Theory posits that as AI systems become more adaptive and capable, traditional systematic and formal approaches to understanding and governing them will fail because they cannot handle context-dependent problems. The initiative aims to develop "live theories": adaptive theoretical frameworks, powered by AI, that can scale insights without requiring a rigid common structure. By redesigning the infrastructure of research methodology and interface design to take advantage of AI assistance, the project seeks to keep human sensemaking at pace with AI development. This includes building interfaces that support postformal reasoning for AI safety, enabling researchers and communities to better understand, anticipate, and respond to risks from increasingly adaptive AI systems. The theory of change runs from better sensemaking tools, to improved threat modeling, to more effective AI safety research and governance.
Grants Received
- From Survival and Flourishing Fund
Projects
No linked projects.
People
No linked people.
Details
- Last Updated
- Apr 2, 2026, 9:49 PM UTC
- Created
- Mar 18, 2026, 11:18 PM UTC