AI x DevOps by Facets.cloud
Podcast Description
Engineering teams are under pressure to move faster, do more with less, and stay ahead of an increasingly complex stack. AI is becoming a key piece of that equation — not just as a tool, but as a shift in how DevOps is done.
At Facets.cloud, we’re building infrastructure orchestration for the AI era. And with the AI x DevOps Podcast, we’re creating the space for honest, technical, forward-looking conversations about that shift, from early experiments to long-term visions.
This podcast is about sharing what’s real: what’s working, what’s not, and what’s next. Whether you’re building internal copilots, streamlining CI/CD with AI, or rethinking developer experience — we want to learn from your story.
Podcast Insights
Content Themes
The podcast explores themes at the intersection of AI and DevOps: AI-driven infrastructure orchestration, practical uses of AI in CI/CD pipelines, and the evolving developer experience. Episodes dig into specific experiments, such as using LLMs for coding, the trade-offs between deterministic and vibe-coded infrastructure, and perspectives on the future of AI-driven platform engineering.

In this episode of AI x DevOps, Rohit sits down with Görkem Ercan, CTO at Jozu, a company building a DevOps platform for AI agents and models. Görkem, a veteran with over two decades of software experience (including contributions to the Eclipse Foundation), explains why MLOps is fundamentally different from traditional, deterministic DevOps, and how that difference has driven extreme fragmentation in ML pipelines.
Here are some of our favourite takeaways:
• Standardization is Key: Why OCI is the recognized standard for packaging AI/ML artifacts, and how the Model Packs project (with ByteDance, Red Hat, and Docker) is defining the artifact structure.
• Open Source Headaches: The critical challenge maintainers face when receiving large amounts of untested, verbose, AI-generated code.
• LLM Economics: Discover why running small, fine-tuned LLMs in-house can be cheaper, and can deliver more predictable, consistent results, than relying on large general-purpose providers.
• KitOps Solution: How KitOps creates an abstraction that allows data scientists to focus on training while leveraging existing DevOps platforms for deployment.
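To make the KitOps abstraction above concrete: KitOps bundles a model, its datasets, and code into a single OCI artifact (a ModelKit) described by a `Kitfile` manifest, which is what lets standard registries and DevOps tooling handle deployment. A minimal sketch is below; the project name, file paths, and layout are hypothetical, and the authoritative schema lives in the KitOps documentation:

```yaml
# Kitfile — declares the contents of a ModelKit (hypothetical project layout)
manifestVersion: "1.0"
package:
  name: churn-predictor
  description: Example model packaged as an OCI artifact
model:
  name: churn-model
  path: ./models/churn.onnx      # the trained model file
datasets:
  - name: training-data
    path: ./data/train.csv       # data used to train the model
code:
  - path: ./src/                 # training and inference scripts
```

The ModelKit can then be packed and pushed to an OCI-compatible registry with the `kit` CLI (roughly `kit pack . -t registry.example.com/demo/churn:v1` followed by `kit push` — registry reference shown here is illustrative), after which existing container-native pipelines can pull and deploy it like any other artifact.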
Tune in now to understand the standardization movement reshaping the future of AI development!
