Effective Altruism Forum
About
The Effective Altruism Forum (forum.effectivealtruism.org) is the central online platform for discussion, research sharing, and community building within the effective altruism movement. It is operated by the Centre for Effective Altruism (CEA), a UK-based charity headquartered in Oxford. The Forum traces its origins to a community group blog that ran from 2012 to 2014; the first dedicated EA Forum launched in 2014 under the stewardship of Ryan Carey and Rethink Charity volunteers, using software hosted by Trike Apps. As the community grew and the platform became harder to maintain, CEA took over management in 2017. In 2018, CEA launched a fully rebuilt version, Forum 2.0, at forum.effectivealtruism.org, built as a fork of the open-source LessWrong codebase and supported by a dedicated paid team.

The Forum is a place for anyone to publish long-form posts, read curated content, and engage in threaded discussions on topics including AI safety, global health, animal welfare, career guidance, cause prioritization, and EA community updates. Major EA organizations routinely post research, funding announcements, strategic updates, and job listings there, making the Forum a key node in the EA information ecosystem.

As of 2025, the Forum is led by Sarah Cheng as Project Lead within CEA's Online Team. Roughly 2.5 of the Online Team's 6 FTE are dedicated to the Forum, and the Online Team's 2025 budget is roughly $1.3 million. The Forum has grown significantly since its relaunch, peaking at approximately 4,500 monthly active logged-in users in 2023, with an estimated 20,000–30,000 total readers including logged-out visitors. After a period of declining engagement in 2024, metrics stabilized and improved in 2025.
Theory of Change
The EA Forum creates impact by serving as shared intellectual infrastructure for the effective altruism community. By providing a platform where researchers, practitioners, and community members can publish and discuss ideas, the Forum accelerates the development and spread of high-quality thinking about how to do the most good. The core mechanism is good content reaching the right readers: well-reasoned posts can shift priorities, surface neglected opportunities, attract talent to important problems, and hold organizations publicly accountable. The Forum also helps build and reinforce EA community norms around rigor, cause-neutrality, and evidence-based reasoning. In the context of AI safety and existential risk, the Forum is a key venue where safety researchers share findings, funding opportunities are announced, and the broader EA community stays informed about progress and challenges in the field.
Details
- Start Date
- -
- End Date
- -
- Expected Duration
- -
- Funding Raised to Date
- -
- Last Updated
- Apr 3, 2026, 1:21 AM UTC
- Created
- Apr 3, 2026, 1:21 AM UTC