
    Podcast Summary

    • The Impact of AI on Society: A New Frontier
      Geoffrey Hinton's curiosity and determination led him to pioneer AI, illustrating the importance of questioning the unexplainable and the role of curiosity in scientific discovery. The first episode of Black Box explores the significance of AI's impact on society and humanity.

      We are living in a time of great change as artificial intelligence (AI) begins to collide with humanity. This collision is compared to the invention of railroads, electricity, and even the nuclear bomb in terms of its impact on society. Geoffrey Hinton, a pioneer in AI, shares his personal story of being fascinated by a seemingly impossible phenomenon at a young age, which later fueled his obsession with understanding the world and the brain. This anecdote illustrates the importance of questioning the unexplainable and the significant role that curiosity and determination play in scientific discovery. The first episode of Black Box explores this concept further, asking who pushed us into this new world of AI and what it means for humanity.

    • The Beginnings of AI: A Man's Curiosity and the Mystery of the Human Brain
      Geoffrey Hinton's curiosity about the human brain and how it communicates led him to pursue a degree in psychology, where he encountered the idea that neurons firing together strengthen their connections, a concept that later formed the foundation for artificial neural networks and deep learning in AI.

      The story of artificial intelligence begins with the curiosity of a man named Geoffrey Hinton, who was determined to understand the human brain despite the scientific community's limited knowledge of the subject during his childhood in the 1960s. The brain, made up of neurons, was a mystery, in particular how those neurons communicate and give rise to learning and memory. Hinton's refusal to accept the "black box" nature of the brain led him to pursue a degree in experimental psychology and, later, to encounter a new theory about neurons. This theory suggested that when neurons fire together, they wire together, strengthening the connection between them and leading to learning. During his intellectual exile after university, Hinton latched onto this concept, which would later become the foundation for artificial neural networks and deep learning, revolutionizing the field of artificial intelligence.
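
      As an illustration of the "fire together, wire together" idea described above, here is a minimal sketch of a Hebbian weight update in Python. It is not code from the podcast; the array shapes, variable names, and learning rate are assumptions made purely for illustration.

      import numpy as np

      def hebbian_update(weights, pre, post, learning_rate=0.01):
          # Hebbian rule: the change in a connection's strength is proportional to
          # the product of the sending (pre) and receiving (post) activity, so
          # neurons that are repeatedly active together become more strongly wired.
          return weights + learning_rate * np.outer(post, pre)

      # Toy usage: two neurons that keep firing together strengthen their link.
      w = np.zeros((1, 1))
      for _ in range(10):
          w = hebbian_update(w, pre=np.array([1.0]), post=np.array([1.0]))
      print(w)  # the single weight has grown from 0.0 to about 0.1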

    • Geoffrey Hinton's Brain Theory, Conceived While Working as a Carpenter
      Hinton's theory of how the brain learns, inspired by neurons firing together, led him to study AI. The Perceptron, an early neural network, failed to live up to unrealistic expectations, but Hinton continued his quest to understand the brain and build intelligent machines.

      Geoffrey Hinton's theory of how the brain learns was inspired by the idea that neurons that fire together wire together. This theory, which he first had while working as a carpenter, led him to apply to Edinburgh to study Artificial Intelligence. However, he wasn't the first to attempt to build a computer modeled on the brain. In the 1950s, Frank Rosenblatt created the Perceptron, an early artificial neural network that could learn to distinguish between males and females, among other things. The hype around this technology was huge, but the unrealistic expectations and subsequent failure to deliver led to a backlash against AI research. Despite these setbacks, Hinton persisted in his quest to understand the brain and build intelligent machines.
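
      Rosenblatt's Perceptron learned in a similarly simple way: it nudged its weights only when it misclassified an example. The following is a minimal sketch of that learning rule; the toy data and parameter choices are illustrative assumptions rather than details from the episode.

      import numpy as np

      def train_perceptron(X, y, epochs=20, learning_rate=0.1):
          # Classic perceptron rule: weights change only on mistakes, nudging the
          # decision boundary toward each misclassified example.
          w = np.zeros(X.shape[1])
          b = 0.0
          for _ in range(epochs):
              for xi, target in zip(X, y):
                  prediction = 1.0 if (w @ xi + b) >= 0 else -1.0
                  if prediction != target:
                      w += learning_rate * target * xi
                      b += learning_rate * target
          return w, b

      def predict(X, w, b):
          return np.where(X @ w + b >= 0, 1, -1)

      # Toy usage on a linearly separable problem (an AND-style rule).
      X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
      y = np.array([-1, -1, -1, 1])
      w, b = train_perceptron(X, y)
      print(predict(X, w, b))  # [-1 -1 -1  1]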

    • The 1980s and 1990s: The AI Winter
      Despite skepticism and funding shortages during the 1980s and 1990s, Geoffrey Hinton's persistence led to a breakthrough in training multi-layered neural networks, reviving AI research and leading to advances such as NETtalk.

      During the 1980s and 1990s, artificial intelligence, and neural networks in particular, faced a significant setback known as the AI winter. This period was marked by a lack of funding, a lack of academic positions, and general skepticism towards neural networks. Geoffrey Hinton, a pioneer in the field, faced numerous challenges in his pursuit of neural networks during this time: he was told they were rubbish and could not get an academic position in Britain. However, Hinton's persistence paid off when he moved to California, where he found a community of believers in neural networks. There they made a breakthrough, finding a way to train multi-layered neural networks, something many at the time considered mathematically out of reach. This breakthrough opened up the potential for the abilities we see in neural networks today. Despite the skepticism of the AI winter, the field started to change after this breakthrough, leading to advances such as NETtalk and paving the way for the current state of AI research.
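
      The importance of having more than one layer is easiest to see on a problem that a single-layer Perceptron provably cannot solve, such as XOR. Below is a minimal sketch of a small two-layer network trained with backpropagation; the architecture, learning rate, iteration count, and random seed are illustrative assumptions, not details from the episode.

      import numpy as np

      rng = np.random.default_rng(0)

      # XOR: output 1 only when exactly one input is 1; a single layer cannot solve it.
      X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
      y = np.array([[0], [1], [1], [0]], dtype=float)

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-z))

      # One hidden layer of four units, randomly initialised.
      W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
      W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

      lr = 1.0
      for _ in range(5000):
          # Forward pass through both layers.
          h = sigmoid(X @ W1 + b1)
          out = sigmoid(h @ W2 + b2)
          # Backward pass: propagate the output error back through the layers.
          d_out = (out - y) * out * (1 - out)
          d_h = (d_out @ W2.T) * h * (1 - h)
          W2 -= lr * (h.T @ d_out)
          b2 -= lr * d_out.sum(axis=0)
          W1 -= lr * (X.T @ d_h)
          b1 -= lr * d_h.sum(axis=0)

      print(out.round(2).ravel())  # typically close to [0, 1, 1, 0] after training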

    • Significant Advances in Neural Networks from the Late 80s to the 90s
      Neural networks evolved from the Neocognitron to deep learning, enabling technologies like speech synthesis and image recognition. However, privacy concerns and job displacement were overlooked because of the technology's limited capabilities at the time.

      The late 1980s and 1990s marked significant advances in neural networks, notably the Neocognitron and the development of deep learning. These innovations, such as speech synthesis from text and image recognition, laid the groundwork for later technologies like facial recognition and unlocking smartphones. However, the implications of these advances for privacy, jobs, and society went largely unconsidered because of the limited capabilities of the technology at the time. The major obstacles were the lack of sufficient computing power and training data. The turn of the millennium brought exponential growth in computing power and the rise of the internet, making vast amounts of data accessible. This digital mirror of humanity, with its potential for good and bad, eventually led to a breakthrough in 2012, when a neural network created by Geoffrey Hinton, Ilya Sutskever, and Alex Krizhevsky won the ImageNet competition, significantly outperforming previous methods. This marked the transition of neural networks from academic labs to the real world.

    • Procrastination Led to a Groundbreaking Discovery in Computer Science
      Procrastinating on a paper led a student to significantly improve a neural network's performance, sparking interest from tech companies and changing the course of computer science.

      Sometimes, procrastination can lead to groundbreaking discoveries. Alex Krizhevsky, a student at the University of Toronto, was putting off writing a paper for his class, instead focusing on improving the performance of a neural network on the ImageNet database. He made a deal with himself that he could delay the paper if he improved the neural network's score by 1%. This went on for weeks, and when they entered the ImageNet contest, they won by a large margin. This achievement, which Krizhevsky described as "pretty big for the size for the type of model that you can train in your bedroom," changed the trajectory of computer science. The potential of neural networks to teach themselves to identify virtually anything in a picture sparked interest from tech companies, leading to a bidding war for the services of Krizhevsky, Ilya Sutskever, and Geoffrey Hinton. The trio eventually sold their company for $44 million to one of the four interested parties: Microsoft, Google, Baidu, and DeepMind. The unconventional auction for their services took place at a neural networks conference held in a casino, much to the dismay of the casino's staff. This story serves as a reminder that sometimes, straying from the expected path can lead to remarkable achievements.

    • Google Buys Hinton's Company for $44 Million, in a Field Now Projected to Be Worth Over $15 Trillion
      Google bought the trio's company for a fraction of what the technology is now projected to be worth, marking a turning point for AI investment and advances

      The auction for the small company founded by Hinton, Sutskever, and Krizhevsky resulted in a sale to Google for $44 million. This was a significant amount of money at the time for an idea that had previously struggled to attract research grants. With the rapid advance of AI and the involvement of big tech companies, the technology is now estimated to be worth over $15 trillion by the end of this decade. The founders had no idea of their worth before the auction and thought they might be worth only $1 million. The auction process involved bidding through Gmail, with the bids increasing significantly over several days. Ultimately, Google and Baidu were the final bidders, but Google made an offer that the trio couldn't refuse, ending the auction. The sale marked a turning point for AI, with big tech companies investing heavily and networks becoming more complex, rivaling and in some tasks exceeding human abilities.

    • AlphaGo Beats the World Champion at Go, Demonstrating Neural Networks' Ability to See Patterns Beyond Human Understanding
      AlphaGo's victory at Go showcased neural networks' potential to make connections and see patterns beyond human comprehension, potentially leading to advances in many fields.

      The AlphaGo neural network, developed by DeepMind, made history by defeating the world champion, Lee Sedol, in the ancient Chinese board game Go. Go, with its vast number of possible combinations, surpasses the ability of any person or computer to calculate every possibility. AlphaGo, through self-play and learning from human matches, developed an intuition that allowed it to make surprising moves that left human experts in awe. This moment raised the possibility that neural networks could see patterns and make connections beyond human understanding, potentially leading to breakthroughs in various fields, from medicine to physics and beyond. The victory of AlphaGo marked a significant milestone in artificial intelligence and its ability to surpass human capabilities.

    • The Irony of AI Surpassing Human Understanding
      Despite creating AI to understand the human brain, we now face the challenge of comprehending its complex thought processes, raising concerns about potential autonomy and control.

      We have created advanced artificial intelligence systems, such as AlphaGo, that have surpassed human understanding. These AI systems, often referred to as black boxes, are so complex that even their creators cannot explain their thought processes. The irony is that the initial goal was to understand the human brain through AI. Now, as we face the growing power and complexity of these systems, concerns about their potential autonomy and control have emerged. This collision of human and artificial intelligences brings unpredictable consequences, as seen in a small town where children's use of AI had unexpected results. The creators of these systems, like Geoffrey Hinton, are raising awareness of the risks and of the need for continued exploration and understanding.

    • Tools for Growth and Excellence: Shopify, 1800flowers.com, and the Lexus GX
      Shopify empowers businesses, 1800flowers.com simplifies gift-giving, and the Lexus GX inspires beyond driving

      Shopify and 1800flowers.com are presented as tools for businesses and individuals looking to grow and excel in different areas of life. Shopify provides a comprehensive solution for businesses selling online, offering a user-friendly platform with checkout capabilities that convert browsers into buyers, and it supports businesses at every stage, from launching an online shop to managing a physical store and handling high order volumes. 1800flowers.com's Celebrations Passport offers a one-stop solution for gift-givers, with free shipping and rewards for frequent purchases. Meanwhile, the all-new Lexus GX pairs luxury and intuitive technology with exceptional capability, encouraging drivers to go beyond their comfort zones. In short, all three sponsors are pitched as offering more than their stated functions, inspiring individuals and businesses to grow, excel, and reach new possibilities.

    Recent Episodes from Today in Focus

    Rishi Sunak staggers on – but for how long?
    The prime minister is another MP down after Natalie Elphicke crossed the floor to join Labour. With the Conservatives trailing by 30 points after heavy local election losses, what options does Rishi Sunak now have? Guardian political correspondent Kiran Stacey tells Helen Pidd what these losses mean for the PM, and looks at what calculation Keir Starmer made in taking in a rightwing Tory. Help support our independent journalism at theguardian.com/infocus

    The London Bridge ‘hero’ who could go to prison for 99 years
    In 2019, ex-offender Marc Conway helped hold down a knifeman who killed two people in a terror attack. But by doing so he risked being recalled to prison. Marc Conway risked his life to stop the London Bridge terror attack, so why did he fear being sent to prison for it? Simon Hattenstone reports. Help support our independent journalism at theguardian.com/infocus

    Related Episodes

    Black Box episode one: The connectionists
    This is the story of Geoffrey Hinton, a man who set out to understand the brain and ended up working with a group of researchers who invented a technology so powerful that even they don’t truly understand how it works. This is about a collision between two mysterious intelligences – two black boxes – human and artificial. And it’s already having profound consequences

    "Inside the Lab" | Stephanie Koziej interviewed by Dietrich Stout

    "Inside the Lab" | Stephanie Koziej interviewed by Dietrich Stout

    Stephanie Koziej talks with Dietrich Stout about her work and upcoming gallery show, "Tender Rhythms" 
    Stephanie Koziej, PhD, is an award-winning interdisciplinary researcher, artist, educator, curator and activist working at the intersection of the humanities, arts, science and technology. She specializes in theorizing intimate connections through interactive art installations that use brain-computer interfaces, sound and visuals, and is looking for new opportunities to continue her research and to teach young artists the foundations of critical theory, so they can subvert problematic ideologies through their own artistic practice. (https://koziejstephanie.com/)

    Lecture | Cecilia Heyes | Cognitive Gadgets, the cultural evolution of thinking

    High Church evolutionary psychology casts the human mind as a collection of cognitive instincts - organs of thought shaped by genetic evolution and constrained by the needs of our Stone Age ancestors. This picture was plausible 25 years ago but, I argue, it no longer fits the facts. Research in psychology and neuroscience - involving nonhuman animals, infants and adult humans - now suggests that genetic evolution has merely tweaked the human mind, making us more friendly than our pre-human ancestors, more attentive to other agents, and giving us souped-up, general-purpose mechanisms of learning, memory and cognitive control. Using these resources, our special-purpose organs of thought are built in the course of development through social interaction. They are products of cultural rather than genetic evolution, cognitive gadgets rather than cognitive instincts. In making the case for cognitive gadgets, I’ll suggest that experimental evidence from computational cognitive science is an important and neglected resource for research on cultural evolution.

    Mini-Conference (3 of 3) | Susan Healy | Building, Making, Creating: From Etymology to Behaviour and Intelligence

    Tool making and use are often considered a hallmark of intelligence: the discovery that New Caledonian crows made tools caused a flurry of excitement in the world of animal cognition with much talk of 'feathered apes’.  Of the explanations for the rarity of tool making across the animal kingdom (e.g. brain size, group size, sociality), none appear satisfactory.  The rarity of the behaviour makes it difficult to study in an evolutionary context, but a phenotypically similar behaviour, nest building, is not at all rare. And it is increasingly amenable to investigation: I will present evidence of decision making with regard to appropriate materials and local environmental conditions, associating building decisions with reproductive success and the possibility of cultural evolution of built structures.

    Our science predictions for 2024
    Last year was a bumper year for science news, with the rise of weight-loss drugs such as Wegovy, record-high global temperatures, not to mention an attempted orca uprising. So what will this year bring? Ian Sample and science correspondent Hannah Devlin discuss the big stories likely to hit the headlines and share their predictions for 2024. And environment reporter Patrick Greenfield reveals his top climate stories for 2024. Help support our independent journalism at theguardian.com/sciencepod