Cyborgism is a research agenda, memeplex, and online community centered on the idea of building human-in-the-loop systems that pair humans with large language models to enhance human cognitive capabilities and advance AI alignment research. The core concept treats LLMs as 'simulators' rather than autonomous agents, keeping humans as the sole goal-directed entities in any research process. Rather than outsourcing alignment work to AI, cyborgism proposes training 'cyborgs' — humans skilled at steering base models through specialized prompt engineering — to perform alignment research at unprecedented scale and quality. The community has developed tools like the Loom (a branching interface for exploring LLM-generated multiverses) and maintains a collaborative wiki at cyborgism.wiki. Research activities include a 2023 track at AI Safety Camp, a mentorship stream at SERI MATS 2023, and the Act I project exploring multi-human multi-AI interaction on Discord.
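The Loom's core idea can be illustrated as a tree of alternative completions branching from a shared prompt. The sketch below is a minimal, hypothetical illustration of that data structure, not the actual Loom codebase; the `stub_generate` function is a stand-in assumption for a real base-model sampling call.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class LoomNode:
    """One node in a Loom-style tree: the text added at this step plus child branches."""
    text: str
    children: List["LoomNode"] = field(default_factory=list)

    def branch(self, generate: Callable[[str], str], n: int, prefix: str = "") -> None:
        """Sample n alternative continuations of the accumulated text, one child each."""
        context = prefix + self.text
        for _ in range(n):
            self.children.append(LoomNode(generate(context)))

    def paths(self, prefix: str = "") -> List[str]:
        """Enumerate every root-to-leaf branch of the 'multiverse' as a full string."""
        full = prefix + self.text
        if not self.children:
            return [full]
        return [p for child in self.children for p in child.paths(full)]

# Stub standing in for a base-model sampling call (an assumption, not a real API);
# a cyborg workflow would call an LLM with `context` here.
counter = iter(range(100))
def stub_generate(context: str) -> str:
    return f" [continuation {next(counter)}]"

root = LoomNode("Once upon a time,")
root.branch(stub_generate, n=2)                      # two alternative continuations
for child in root.children:
    child.branch(stub_generate, n=2, prefix=root.text)  # branch each branch again

all_paths = root.paths()
print(len(all_paths))  # 4 leaf branches explored
```

The human steers by choosing which branches to expand further and which to prune, which is what keeps the person, not the model, as the goal-directed entity in the loop.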
Funding Details
- Annual Budget: -
- Monthly Burn Rate: -
- Current Runway: -
- Funding Goal: -
- Funding Raised to Date: -
- Fiscal Sponsor: -
Theory of Change
Cyborgism argues that current LLMs — especially base models used as simulators rather than autonomous agents — can dramatically expand the cognitive capabilities of human alignment researchers without introducing dangerous AI agency. By training 'cyborgs' (humans with deep LLM intuition and specialized tools like the Loom), the agenda aims to differentially accelerate alignment research relative to capabilities work. The key causal chain is: better human-AI collaboration tools -> more cognitively capable alignment researchers -> faster progress on hard alignment problems -> reduced risk that AI capabilities outpace safety. Crucially, keeping humans as the only agentic entities in the loop avoids the alignment risks that would arise from delegating research to autonomous AI systems.
Grants Received
No grants recorded.
Projects
No linked projects.
People
No linked people.
Details
- Last Updated: Apr 2, 2026, 9:59 PM UTC
- Created: Mar 19, 2026, 10:30 PM UTC
