Inference by Turing Post
Podcast Description
Inference is Turing Post’s way of asking the big questions about AI — and refusing easy answers. Each episode starts with a simple prompt: “When will we…?” – and follows it wherever it leads.
Host Ksenia Se sits down with the people shaping the future firsthand: researchers, founders, engineers, and entrepreneurs. The conversations are candid, sharp, and sometimes surprising – less about polished visions, more about the real work happening behind the scenes.
It’s called Inference for a reason: opinions are great, but we want to connect the dots – between research breakthroughs, business moves, technical hurdles, and shifting ambitions.
If you’re tired of vague futurism and ready for real conversations about what’s coming (and what’s not), this is your feed. Join us – and draw your own inference.
Podcast Insights
Content Themes
The podcast explores themes centered on AI development, coding philosophy, and the open challenges facing AI. Key episodes include Amjad Masad on the future of coding and the role of AI agents in software development, Sharon Zhou dissecting AI hallucinations and the importance of grounding benchmarks in reality, and Mati Staniszewski on real-time language translation and preserving emotional nuance in AI voice synthesis – all reflecting a commitment to thoughtful, nuanced exploration of AI's impact on society.

In the first episode of Inference’s quarterly series on Open Source AI, we talk to Raffi Krikorian, CTO of Mozilla, about when open source AI stops being aspirational and becomes an operational choice.
We explore why stories like Pinterest saving $10 million by moving to open models are real, but often misunderstood, and why timing matters more than ideology. Raffi lays out his view of a missing “LAMP stack for AI” and explains why the hardest problem to solve isn’t models or data, but the connective glue that holds AI systems together.
Along the way, he shares how Mozilla is navigating these tradeoffs in practice, why even open-source-first organizations still rely on closed tools during experimentation, and what the browser era taught Mozilla about defaults, user choice, and long-term control.
He also shares a few practical recommendations in this episode that apply even if you’re still experimenting. Listen closely.
This conversation kicks off our Open Source AI series for 2026, focused on real tradeoffs, real economics, and the decisions companies are making right now. Follow along at https://www.turingpost.com/
*Did you like the episode? You know the drill:*
📌 Subscribe for more conversations with the builders shaping real-world AI.
💬 Leave a comment if this resonated.
👍 Like it if you liked it.
🫶 Thank you for watching and sharing!
*Guest:* Raffi Krikorian, CTO at Mozilla
LinkedIn: https://www.linkedin.com/in/rkrikorian/
Mozilla AI: https://mozilla.ai/
Mozilla Blog: https://blog.mozilla.org/en/mozilla/mozilla-open-source-ai-strategy/
*Links mentioned:*
Raffi’s post on Mozilla’s open source AI strategy: https://blog.mozilla.org/en/mozilla/mozilla-open-source-ai-strategy/
🌐 #1: Mastering Open Source AI in 2026: Essential Decisions for Builders https://www.turingpost.com/p/opensource1
Mozilla Data Collective: https://data.mozilla.org/
Langchain: https://www.langchain.com/
OpenRouter: https://openrouter.ai/
AI2 (Allen Institute for AI): https://allenai.org/
Flower AI (Federated Learning): https://flower.dev/
Einstein’s Dreams by Alan Lightman: https://www.goodreads.com/book/show/14376.Einstein_s_Dreams
📰 The transcript and edited version at https://www.turingpost.com/krikorian
*Chapters:*
0:00 Cold Open — Values vs Economics in Open Source AI
0:28 Intro: Why This Season Focuses on Open Source AI
0:54 When Open Source Becomes a Business Decision
1:44 Pinterest Saved $10M + The Shift From Prototyping to Production
2:42 Mozilla’s “Choice Suite” + The Terraform “Exit Door”
5:21 Mozilla’s Mission: Do for AI What Mozilla Did for the Web
7:09 The “LAMP Stack” for AI + Standards Across the Stack
9:52 Small Models, Specialization, and Model Composability
15:45 Data, Privacy, and “I Own My Context”
18:36 “This Is a Fight Worth Having” + The Signal Analogy
21:42 1–2–3 Steps for Companies to Start (Instrument Choice Early)
24:22 Book Pick: Einstein’s Dreams + Closing
Turing Post is a newsletter about AI’s past, present, and future. Ksenia Se explores how intelligent systems are built – and how they’re changing how we think, work, and live.
*Follow us →*
Turing Post: https://x.com/TheTuringPost
Ksenia Se: https://www.linkedin.com/in/ksenia-se
https://huggingface.co/Kseniase
#OpenSourceAI #LAMPStackForAI #AIEconomics #MozillaAI #AIInfrastructure #DataProvenance #FederatedLearning #OpenModels

Disclaimer
This podcast’s information is provided for general reference and was obtained from publicly accessible sources. The Podcast Collaborative neither produces nor verifies the content, accuracy, or suitability of this podcast. Views and opinions belong solely to the podcast creators and guests.
For a complete disclaimer, please see our Full Disclaimer on the archive page. The Podcast Collaborative bears no responsibility for the podcast’s themes, language, or overall content. Listener discretion is advised. Read our Terms of Use and Privacy Policy for more details.