IEEE TEMS Radio
Podcast Description
Engaging Conversations - Education, Engineering, Management & Technology
Podcast Insights
Content Themes
The podcast focuses on a range of themes including augmented AI, engineering leadership, innovation in technology, social impact, and ethical considerations in AI. For example, Dr. Saurabh Sinha discusses augmented AI's role in 5G and ethical implications, while Jeff Perry shares insights on career development for engineering professionals. Each episode emphasizes the intersection of technology with human values and future organizational practices.

Stephen Ibaraki has been building at the intersection of AI and global impact since before most people had heard the term. He built his first AI-enabled computer in 1965. He co-created the networked AI medical database that took a Canadian industry to 80% global market share — all under NDA. He spent over a year working weekly with ITU leadership before the UN ITU AI for Good Global Summit launched in 2017, bringing in speakers, sponsors, and funding largely from his own network. He advises a community of 38,000 CEOs representing 22 million employees across 150 countries. And more than 90% of what he does today, you will never read about.
In this episode, host Madhusudan Bangalore Nagaraja sits down with Stephen — Chairman of REDDS Capital, Founder of AI for Good, IEEE TEMS board member, and IEEE Systems Council Industry Ambassador — for a candid conversation on what it actually takes to close the gap between AI adoption and real-world impact.
They cover:
— Why 88% of organizations use AI but fewer than 40% show a measurable result — and the specific leadership failure at the root of that gap, drawn from Stephen's experience at a private summit in Sydney for his community of 38,000 CEOs
— The structured, step-by-step process that separates organizations getting quantifiable AI results from those stuck in pilot theater
— What builders and innovators should stop doing (overcomplicating what they can't control) and start doing: Cognitorial thinking — free, combinatorial idea generation — and reading on the edge daily
— Why 2026 is the year CEOs must personally use AI every day, shorten their planning cycles to 90-day windows, and think in 10x innovation terms
— Human in the loop as the irreducible governance floor — and why responsible AI frameworks from IEEE, UNESCO, and Microsoft all converge on the same principle
— The S11: Stephen's framework of 11 converging technologies to monitor between now and 2028 — from Zetta-scale computing and 6G to quantum utility, autonomous AI scientists, biomedical rejuvenation, and the energy flywheel that underpins all of it
— The one action every engineering manager should take this week: read on the edge — IEEE Spectrum, ACM news, and the outlier signals that most leaders miss
Connect with Stephen Ibaraki:
LinkedIn: linkedin.com/in/sibaraki
Twitter / X: @sibaraki
IEEE Systems Council Interview Series: ieeesystemscouncil.org/education/tech-vision-interviews-stephen-ibaraki
Connect with your host Madhusudan Bangalore Nagaraja:
LinkedIn: linkedin.com/in/madhusudannagaraja
IEEE Senior Member, Technical Delivery Manager, eSystems Inc. | PMI Infinity Advisory Committee | Researcher, Agentic AI Systems | Irving, Texas, USA
IEEE TEMS Radio — Engaging Conversations in Education, Engineering, Management and Technology.
Produced by: IEEE Technology and Engineering Management Society
Please write to us at [email protected] to share your feedback.
