Bryce Meyer
Bio
Bryce Meyer is a software engineer and the primary maintainer of TransformerLens, the leading open-source library for mechanistic interpretability research on GPT-style language models. Originally created by Neel Nanda, TransformerLens lets researchers load more than 50 open-source language models and inspect their internal activations, making it the de facto standard tool for mechanistic interpretability work at organizations including Anthropic, Meta Research, Redwood Research, and Apollo Research. Meyer maintains the library with consistent contributions, rapid iteration to support newly released models, and active community engagement, including a weekly live-coding stream in the Open Source Mechanistic Interpretability Slack. In 2023 he received a $50,000 grant from the Long-Term Future Fund to build and enhance open-source mechanistic interpretability tooling, followed by a $90,000 year-long LTFF stipend to serve as TransformerLens's primary maintainer. He is also the president of Pomelo Productions, an independent software development studio based in Milwaukee, Wisconsin, and a self-taught developer with many years of professional engineering experience.
Grants
- from Long-Term Future Fund
- from Long-Term Future Fund
Details
- Last Updated
- Mar 22, 2026, 2:53 PM UTC
- Created
- Mar 20, 2026, 2:48 AM UTC