Technically Speaking with Chris Wright

Podcast Description
Struggling to keep pace with the ever-changing world of technology? For experienced tech professionals, making sense of this complexity to find real strategic advantages is key. This series offers a clear path, featuring insightful, casual conversations with leading global experts, innovators, and key voices from Red Hat, all cutting through the hype.
Drawing from Red Hat's deep expertise in open source and enterprise innovation, each discussion delves into new and emerging technologies, from artificial intelligence and the future of cloud computing to cybersecurity, data management, and beyond. The focus is on understanding not just the 'what,' but the important 'why' and 'how': exploring how these advancements can shape long-term strategic developments for your organization and your career. Gain an insider's perspective that humanizes complex topics, helping you anticipate what's next and make informed decisions. Equip yourself with the knowledge to turn today's emerging tech into valuable, practical strategies and apply innovative thinking in your work.
Tune in for forward-looking discussions that connect the dots between cutting-edge technology and real-world application, leveraging a rich understanding of the enterprise landscape. Learn to navigate the future of tech with confidence.
Podcast Insights
Content Themes
The podcast explores a variety of technology topics, including artificial intelligence, cloud computing, open source innovation, and cybersecurity. Specific episodes include discussions on AI optimization strategies with experts like Nick Hill and insights into enterprise AI implementations with Brian Stevens, focusing on real-world applications and strategic developments.

Explore what it takes to run massive language models efficiently with Red Hat’s Senior Principal Software Engineer in AI Engineering, Nick Hill. In this episode, we go behind the headlines to uncover the systems-level engineering making AI practical, focusing on the pivotal challenge of inference optimization and the transformative power of the vLLM open-source project.
Nick Hill shares his experiences working in AI, including:
• The evolution of AI optimization, from early handcrafted systems like IBM Watson to the complex demands of today’s generative AI.
• The critical role of open-source projects like vLLM in creating a common, efficient inference stack for diverse hardware platforms.
• Key innovations like PagedAttention that solve GPU memory fragmentation and manage the KV cache for scalable, high-throughput performance.
• How the open-source community is rapidly translating academic research into real-world, production-ready solutions for AI.
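To make the PagedAttention idea mentioned above concrete: instead of reserving one large contiguous GPU buffer per sequence, the KV cache is split into fixed-size blocks handed out from a shared pool as tokens arrive, which avoids fragmentation and lets memory scale with actual sequence length. The following is a minimal, simplified sketch of that bookkeeping (class and parameter names are illustrative, not vLLM's actual API):

```python
# Simplified sketch of paged KV-cache block allocation, the core idea
# behind PagedAttention. Names here are hypothetical, not vLLM's API.

BLOCK_SIZE = 16  # tokens per cache block; illustrative value


class PagedKVCache:
    """Maps each sequence to a list of block IDs drawn from a shared pool."""

    def __init__(self, num_blocks: int):
        self.free_blocks = list(range(num_blocks))  # shared free pool
        self.block_tables = {}  # seq_id -> [block_id, ...]
        self.seq_lens = {}      # seq_id -> tokens stored so far

    def append_token(self, seq_id: int) -> None:
        """Record one new token, grabbing a fresh block only when needed."""
        length = self.seq_lens.get(seq_id, 0)
        if length % BLOCK_SIZE == 0:  # current block is full (or none yet)
            if not self.free_blocks:
                raise MemoryError("KV cache exhausted")
            self.block_tables.setdefault(seq_id, []).append(self.free_blocks.pop())
        self.seq_lens[seq_id] = length + 1

    def free_sequence(self, seq_id: int) -> None:
        """Return a finished sequence's blocks to the shared pool."""
        self.free_blocks.extend(self.block_tables.pop(seq_id, []))
        self.seq_lens.pop(seq_id, None)


cache = PagedKVCache(num_blocks=8)
for _ in range(20):              # 20 tokens need ceil(20/16) = 2 blocks
    cache.append_token(seq_id=0)
print(len(cache.block_tables[0]))  # 2 blocks allocated, 6 remain free
```

Because blocks are only claimed when a sequence actually grows into them, many concurrent sequences can share one pool, which is what enables the high-throughput batching the episode discusses.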
Join us to explore the infrastructure and optimization strategies making large-scale AI a reality. This conversation is essential for any technologist, engineer, or leader who wants to understand the how and why of AI performance. You’ll come away with a new appreciation for the clever, systems-level work required to build a truly scalable and open AI future.
