
    foundation models

    Explore "foundation models" with insightful episodes like "Hyperscaler strategy in AI, the application landscape heats up, and what we know now about agents with Sarah and Elad", "Building AI Models Faster And Cheaper Than You Think | Lightcone Podcast", "The AI Revolution: How Foundation Models Are Shaping Our World", "Smart Talks With IBM: Transformations in AI: why foundation models are the future", and "Smart Talks with IBM: AI for Business: Multiplying the impact of AI" from podcasts like "No Priors: Artificial Intelligence | Machine Learning | Technology | Startups", "Y Combinator", "A Beginner's Guide to AI", and "Stuff To Blow Your Mind" and more!

    Episodes (6)

    Hyperscaler strategy in AI, the application landscape heats up, and what we know now about agents with Sarah and Elad
    This week on a host-only episode of No Priors, Sarah and Elad discuss the AI wave as compared to the internet wave, the current state of AI investing, the foundation model landscape, voice and video AI, advances in agentic systems, prosumer applications, and the Microsoft/Inflection deal.

    Have a question for our next host-only episode or feedback for our team? Reach out to show@no-priors.com

    Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil

    Show Notes:
    (0:00) Intro
    (0:32) How to think about scaling in 2024
    (3:21) Microsoft/Inflection deal
    (5:28) Voice cloning
    (7:02) Investing climate
    (12:50) Whitespace in AI
    (16:36) AI video landscape
    (19:54) Agentic user experiences
    (22:21) Prosumer as the first wave of application AI

    Building AI Models Faster And Cheaper Than You Think | Lightcone Podcast

    If you read articles about companies like OpenAI and Anthropic training foundation models, it would be natural to assume that if you don’t have a billion dollars or the resources of a large company, you can’t train your own foundation model. But the opposite is true. In this episode of the Lightcone Podcast, we discuss strategies for building a foundation model from scratch in less than three months, with examples of YC companies doing just that. We also get an exclusive look at OpenAI's Sora!

    The AI Revolution: How Foundation Models Are Shaping Our World

    In this episode of "A Beginner's Guide to AI," we delve into the revolutionary world of Foundation Models, large-scale AI systems designed to understand and interact with the world in ways that were once the realm of science fiction. Our journey explores how these models are trained on diverse datasets to perform a myriad of tasks, from writing and art creation to solving complex scientific problems, offering a glimpse into the future of AI and its potential to reshape every aspect of our lives.


    Discover how Foundation Models are breaking new ground in AI research and application, promising to accelerate innovation across industries while raising important ethical questions about privacy, bias, and the future of work. As we navigate the possibilities and challenges of this AI frontier, we'll examine real-world case studies that highlight both the transformative impact of Foundation Models and the critical debates surrounding their development and deployment.

    Want more AI info for beginners? 📧 Join our Newsletter! This podcast was generated with the help of ChatGPT and Claude 2. We do fact-check with human eyes, but there still might be hallucinations in the output.


    Music credit: "Modern Situations" by Unicorn Heads


    Join us for an engaging exploration of Foundation Models, where we uncover the potential of these AI giants to revolutionize the world, posing the question: How will we harness this technology to benefit humanity while safeguarding against its risks?

    Smart Talks With IBM: Transformations in AI: why foundation models are the future

    Major breakthroughs in artificial intelligence research often reshape the design and utility of AI in both business and society. In this episode of Smart Talks with IBM, Malcolm Gladwell and Jacob Goldstein explore the conceptual underpinnings of modern AI with Dr. David Cox, VP of AI Models at IBM Research. They talk foundation models, self-supervised machine learning, and the practical applications of AI and data platforms like watsonx in business and technology.

    Visit us at: https://www.ibm.com/thought-leadership/smart/talks/

    Learn more about watsonx: https://www.ibm.com/watsonx

    This is a paid advertisement from IBM.

    See omnystudio.com/listener for privacy information.

    Smart Talks with IBM: AI for Business: Multiplying the impact of AI

    As businesses adopt AI, a new era of problem-solving, innovation, and creative decision-making can be brought to scale. In this episode of Smart Talks with IBM, Malcolm Gladwell and Jacob Goldstein explore the future of AI for enterprise business with Kareem Yusuf, senior vice president of product management and growth for IBM Software. They discuss the advent of foundation models, how AI can transform data storage and decision-making, and how next-generation AI platforms like watsonx from IBM can empower businesses to use AI at scale.

    This is a paid advertisement from IBM.

    See omnystudio.com/listener for privacy information.

    What is the role of academia in modern AI research? With Stanford Professor Dr. Percy Liang
    When AI research is evolving at warp speed and takes significant capital and compute power, what is the role of academia? Dr. Percy Liang, Stanford computer science professor and director of the Stanford Center for Research on Foundation Models, talks about training costs, distributed infrastructure, model evaluation, alignment, and societal impact. Sarah Guo and Elad Gil join Percy at his office to discuss the evolution of research in NLP, why AI developers should aim for superhuman levels of performance, the goals of the Center for Research on Foundation Models, and Together, a decentralized cloud for artificial intelligence.

    No Priors is now on YouTube! Subscribe to the channel on YouTube and like this episode.

    Show Links:
    See Percy’s research on Google Scholar
    See Percy’s bio on Stanford’s website
    Percy on Stanford’s blog: What to Expect in 2023 in AI
    Together, a decentralized cloud for artificial intelligence
    Foundation AI models GPT-3 and DALL-E need release standards - Protocol
    The Time Is Now to Develop Community Norms for the Release of Foundation Models - Stanford

    Sign up for new podcasts every week. Email feedback to show@no-priors.com
    Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil | @PercyLiang

    Show Notes:
    [1:44] - How Percy got into machine learning research and started the Center for Research on Foundation Models at Stanford
    [7:23] - The role of academia and academia’s competitive advantages
    [13:30] - Research on natural language processing and computational semantics
    [27:20] - Smaller-scale architectures that are competitive with transformers
    [35:08] - HELM (Holistic Evaluation of Language Models), a project to evaluate language models
    [42:13] - Together, a decentralized cloud for artificial intelligence

    © 2024 Podcastworld. All rights reserved