
Sustainable AI

Explore "sustainable AI" through episodes like "How Autism Can Look Very Different, Even in Identical Twins", "Our tech has a climate problem: How we solve it", and "AI's New Era: Understanding Mistral's Sparse Mixture of Experts Approach", from podcasts such as "Short Wave", "TED Radio Hour", and "A Beginner's Guide to AI".

    Episodes (3)

    How Autism Can Look Very Different, Even in Identical Twins
    Sam and John Fetters, 19, are identical twins on different ends of the autism spectrum. Sam is a sophomore at Amherst College and runs marathons in his free time. John attends a school for people with special needs and loves to watch Sesame Street in his free time. Identical twins like Sam and John pose an important question for scientists: How can a disorder that is known to be highly genetic look so different in siblings who share the same genome?

    Check out more of NPR's series on the Science of Siblings.

    More science questions? Email us at shortwave@npr.org.

    Learn more about sponsor message choices: podcastchoices.com/adchoices

    NPR Privacy Policy

    Our tech has a climate problem: How we solve it
    AI, EVs, and satellites are tackling the climate crisis. But they have environmental downsides. This hour, TED speakers explain how to use these tools without making global warming worse. Guests include AI researchers Sasha Luccioni and Sims Witherspoon, climate researcher Elsa Dominish and astrodynamicist Moriba Jah.

TED Radio Hour+ subscribers now get access to bonus episodes, with more ideas from TED speakers and a behind-the-scenes look with our producers. A Plus subscription also lets you listen to regular episodes (like this one!) without sponsors. Sign up at plus.npr.org/ted.


    AI's New Era: Understanding Mistral's Sparse Mixture of Experts Approach
In this episode of "A Beginner's Guide to AI," we delve into Sparse Mixture of Experts (MoE) models, with a special focus on Mistral, a French AI company pioneering this approach. We unpack the concept of Sparse MoE, highlighting its efficiency, adaptability, and scalability in AI development. We explore Mistral's work applying Sparse MoE to language models, emphasizing its potential for more accessible and sustainable AI technologies. Through a detailed case study, we illustrate the real-world impact of Mistral's innovations. We also invite AI enthusiasts to join our conversation and provide an interactive element for deeper engagement with the topic. The episode concludes with thoughts on the future of AI and a reflective quote from Geoff Hinton.
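The sparse-routing idea described here can be sketched in a few lines: a gating network scores all experts, but only the top-k actually run per input, which is the source of the efficiency gains. This is an illustrative NumPy toy under assumed shapes and a softmax gate, not Mistral's actual implementation:

```python
import numpy as np

def sparse_moe_forward(x, gate_w, experts, k=2):
    """Route input x through the top-k experts, weighted by softmax gate scores.

    Minimal sparse Mixture-of-Experts sketch: only k of the len(experts)
    experts compute for this input; the rest are skipped entirely.
    """
    logits = x @ gate_w                       # one gate score per expert
    top_k = np.argsort(logits)[-k:]           # indices of the k highest-scoring experts
    weights = np.exp(logits[top_k])
    weights /= weights.sum()                  # softmax over the selected experts only
    # Combine the outputs of just the chosen experts.
    return sum(w * experts[i](x) for w, i in zip(weights, top_k))

rng = np.random.default_rng(0)
d, n_experts = 4, 8
x = rng.normal(size=d)
gate_w = rng.normal(size=(d, n_experts))
# Each "expert" is a small linear map here, standing in for a feed-forward block.
experts = [(lambda W: (lambda v: v @ W))(rng.normal(size=(d, d)))
           for _ in range(n_experts)]

y = sparse_moe_forward(x, gate_w, experts, k=2)
print(y.shape)  # (4,)
```

With k=2 of 8 experts active, roughly a quarter of the expert computation runs per input, while total parameter count (and thus model capacity) stays at the full 8-expert size.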


    This podcast was generated with the help of ChatGPT and Claude 2. We do fact-check with human eyes, but there might still be hallucinations in the output.


    Music credit: "Modern Situations" by Unicorn Heads