Practicing with AI
Podcast Description
Hippo Education Presents: Practicing with AI -- conversations about medicine, AI, and the people navigating both. Join Vicky Pittman, a practicing clinician, and Rob Taves, a technology expert, as they explore how AI is shaping clinical practice, medical education, and patient care—and what it means for clinicians at every stage of their journey.
Explore more of Hippo's audio resources at hippoed.com/audio
Podcast Insights
Content Themes
The podcast examines the intersection of AI and healthcare, covering topics such as AI technology in clinical practice, medical education, and patient care. Episodes include discussions on defining AI, practical tips for selecting AI tools, and enhancing medical education through technology, all aimed at demystifying AI for healthcare professionals.

Welcome to Hippo Education’s Practicing with AI, conversations about medicine, AI, and the people navigating both. This month, Rob and Vicky take on a common concern among clinicians about AI in medicine: bias and fairness. How might AI perpetuate health disparities rooted in bias? Or could AI help mitigate bias in medicine? To dig deeper, Vicky interviews data journalist, author, and NYU professor Dr. Meredith Broussard about bias in technology. They explore how and why bias exists in technology and what can be done about it.
Visit speakpipe.com/hippoed to leave a voice message about anything related to AI and medicine: your excitement, your concerns, your own experiences with AI… anything. Your voice might even make it onto a future episode.
References:
1. Broussard M. More Than A Glitch: Confronting Race, Gender, and Ability Bias in Tech. The MIT Press; 2023.
2. Broussard M. Artificial Unintelligence: How Computers Misunderstand the World. The MIT Press; 2018.
3. Bhakta NR, et al. Race and Ethnicity in Pulmonary Function Test Interpretation: An Official American Thoracic Society Statement. Am J Respir Crit Care Med. 2023;207(8):978-995. PMID: 36973004
4. Liu M, et al. Early detection of sexually transmitted infections from skin lesions with deep learning: a systematic review and meta-analysis. Lancet Digit Health. 2025;7(7):100894. PMID: 40769792
5. Rao P. SCIN: A new resource for representative dermatology images. Google Research Blog. 2024. Available at: https://research.google/blog/scin-a-new-resource-for-representative-dermatology-images/. Accessed on September 25, 2025.
7. Good Machine Learning Practice for Medical Device Development: Guiding Principles. US Food & Drug Administration. 2025. Available at: https://www.fda.gov/medical-devices/software-medical-device-samd/good-machine-learning-practice-medical-device-development-guiding-principles. Accessed on September 25, 2025.
8. Rickman S. Evaluating gender bias in large language models in long-term care. BMC Med Inform Decis Mak. 2025;25(1):274. PMID: 40784946
9. Challapally A, Pease C, Raskar R, Chari P. The GenAI Divide: State of AI in Business 2025. MIT NANDA. 2025. Available at: https://mlq.ai/media/quarterly_decks/v0.1_State_of_AI_in_Business_2025_Report.pdf
