
Aza Raskin

Explore "Aza Raskin" with insightful episodes like "Talking With Animals… Using AI", "The AI Dilemma", "Behind the Curtain on The Social Dilemma — with Jeff Orlowski-Yang and Larissa Rhodes", "A Conversation with Facebook Whistleblower Frances Haugen", and "Spotlight — A Whirlwind Week of Whistleblowing", all from the podcast "Your Undivided Attention" — and more!

    Episodes (17)

Talking With Animals… Using AI

    Despite our serious concerns about the pace of deployment of generative artificial intelligence, we are not anti-AI. There are uses that can help us better understand ourselves and the world around us. Your Undivided Attention co-host Aza Raskin is also co-founder of Earth Species Project, a nonprofit dedicated to using AI to decode non-human communication. ESP is developing this technology both to shift the way that we relate to the rest of nature, and to accelerate conservation research.

Significant recent breakthroughs in machine learning have opened up ways both to encode human languages and to map out patterns of animal communication. The research, while slow and incredibly complex, is very exciting. Picture being able to tell a whale to dive to avoid ship strikes, or to forge cooperation in conservation areas.

    These advances come with their own complex ethical issues. But understanding non-human languages could transform our relationship with the rest of nature and promote a duty of care for the natural world.

    In a time of such deep division, it’s comforting to know that hidden underlying languages may potentially unite us. When we study the patterns of the universe, we’ll see that humanity isn’t at the center of it.

     

    Corrections:

    Aza refers to the founding of Earth Species Project (ESP) in 2017. The organization was established in 2018.

    When offering examples of self-awareness in animals, Aza mentions lemurs that get high on centipedes. They actually get high on millipedes. 

     

    RECOMMENDED MEDIA 

    Using AI to Listen to All of Earth’s Species

    An interactive panel discussion hosted at the World Economic Forum in San Francisco on October 25, 2022. Featuring ESP President and Cofounder Aza Raskin; Dr. Karen Bakker, Professor at UBC and Harvard Radcliffe Institute Fellow; and Dr. Ari Friedlaender, Professor at UC Santa Cruz

    What A Chatty Monkey May Tell Us About Learning to Talk

    The gelada monkey makes a gurgling sound that scientists say is close to human speech

    Lemurs May Be Making Medicine Out of Millipedes

    Red-fronted lemurs appear to use plants and other animals to treat their afflictions

    Fathom on AppleTV+

    Two biologists set out on an undertaking as colossal as their subjects – deciphering the complex communication of whales 

    Earth Species Project is Hiring a Director of Research

    ESP is looking for a thought leader in artificial intelligence with a track record of managing a team of researchers

     

    RECOMMENDED YUA EPISODES 

    The Three Rules of Humane Tech

    The AI Dilemma

    Synthetic Humanity: AI & What’s At Stake

     

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

The AI Dilemma

You may have heard about the arrival of GPT-4, OpenAI’s latest large language model (LLM) release. GPT-4 surpasses its predecessor in reliability, creativity, and its ability to process intricate instructions. It can handle more nuanced prompts than previous releases, and it is multimodal, meaning it was trained on both images and text. We don’t yet fully understand its capabilities, yet it has already been deployed to the public.

    At Center for Humane Technology, we want to close the gap between what the world hears publicly about AI from splashy CEO presentations and what the people who are closest to the risks and harms inside AI labs are telling us. We translated their concerns into a cohesive story and presented the resulting slides to heads of institutions and major media organizations in New York, Washington DC, and San Francisco. The talk you're about to hear is the culmination of that work, which is ongoing.

AI may help us achieve major advances like curing cancer or addressing climate change. But the point we're making is: if our dystopia is bad enough, it won't matter how good the utopia we want to create is. We only get one shot, and we need to move at the speed of getting it right.

    RECOMMENDED MEDIA

    AI ‘race to recklessness’ could have dire consequences, tech experts warn in new interview

    Tristan Harris and Aza Raskin sit down with Lester Holt to discuss the dangers of developing AI without regulation

    The Day After (1983)

    This made-for-television movie explored the effects of a devastating nuclear holocaust on small-town residents of Kansas

    The Day After discussion panel

    Moderated by journalist Ted Koppel, a panel of present and former US officials, scientists and writers discussed nuclear weapons policies live on television after the film aired

    Zia Cora - Submarines 

    “Submarines” is a collaboration between musician Zia Cora (Alice Liu) and Aza Raskin. The music video was created by Aza in less than 48 hours using AI technology and published in early 2022

    RECOMMENDED YUA EPISODES 

Synthetic Humanity: AI & What’s At Stake

    A Conversation with Facebook Whistleblower Frances Haugen

    Two Million Years in Two Hours: A Conversation with Yuval Noah Harari

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

Behind the Curtain on The Social Dilemma — with Jeff Orlowski-Yang and Larissa Rhodes

    How do you make a film that impacts more than 100 million people in 190 countries in 30 languages?

This week on Your Undivided Attention, we're going behind the curtain on The Social Dilemma — the Netflix documentary about the dark consequences of the social media business model, which featured the Center for Humane Technology. On the heels of the film's one-year anniversary and its two Emmy Award wins, we're talking with Exposure Labs' Director Jeff Orlowski-Yang and Producer Larissa Rhodes. What moved Jeff and Larissa to shift their focus from climate change to social media? How did the film transform countless lives, including ours and possibly yours? What might we do differently if we were producing the film today?

Join us as we explore the reverberations of The Social Dilemma, whose effects we're still feeling more than a year later.

A Conversation with Facebook Whistleblower Frances Haugen

    We are now in social media's Big Tobacco moment. And that’s largely thanks to the courage of one woman: Frances Haugen.

    Frances is a specialist in algorithmic product management. She worked at Google, Pinterest, and Yelp before joining Facebook — first as a Product Manager on Civic Misinformation, and then on the Counter-Espionage team. But what she saw at Facebook was that the company consistently and knowingly prioritized profits over public safety. So Frances made the courageous decision to blow the whistle — which resulted in the biggest disclosure in the history of Facebook, and in the history of social media.

    In this special interview, co-hosts Tristan and Aza go behind the headlines with Frances herself. We go deeper into the problems she exposed, discuss potential solutions, and explore her motivations — along with why she fundamentally believes change is possible. We also announce an exciting campaign being launched by the Center for Humane Technology — to use this window of opportunity to make Facebook safer.

Spotlight — A Whirlwind Week of Whistleblowing

In seven years of working on the problems of runaway technology, we’ve never experienced a week like this! In this bonus episode of Your Undivided Attention, we recap this whirlwind of a week — from Facebook whistleblower Frances Haugen going public on 60 Minutes on Sunday, to the massive outage of Facebook, Instagram, and WhatsApp on Monday, to Haugen’s riveting Congressional testimony on Tuesday. We also make some exciting announcements — including our planned episode with Haugen up next, the Yale social media reform panel we’re participating in on Thursday, and a campaign we’re launching to pressure Facebook to make one immediate change.

    This week it truly feels like we’re making history — and you’re a part of it.

Do You Want to Become a Vampire? — with L.A. Paul

How do we decide whether to undergo a transformative experience when we don’t know how that experience will change us? This is the central question explored by Yale philosopher and cognitive scientist L.A. Paul.

    Paul uses the prospect of becoming a vampire to illustrate the conundrum: let's say Dracula offers you the chance to become a vampire. You might be confident you'll love it, but you also know you'll become a different person with different preferences. Whose preferences do you prioritize: yours now, or yours after becoming a vampire? Similarly, whose preferences do we prioritize when deciding how to engage with technology and social media: ours now, or ours after becoming users — to the point of potentially becoming attention-seeking vampires? 

    In this episode with L.A. Paul, we're raising the stakes of the social media conversation — from technology that steers our time and attention, to technology that fundamentally transforms who we are and what we want. Tune in as Paul, Tristan Harris, and Aza Raskin explore the complexity of transformative experiences, and how to approach their ethical design.

You Will Never Breathe the Same Again — with James Nestor

    When author and journalist James Nestor began researching a piece on free diving, he was stunned. He found that free divers could hold their breath for up to 8 minutes at a time, and dive to depths of 350 feet on a single breath. As he dug into the history of breath, he discovered that our industrialized lives have led to improper and mindless breathing, with cascading consequences from sleep apnea to reduced mobility. He also discovered an entire world of extraordinary feats achieved through proper and mindful breathing — including healing scoliosis, rejuvenating organs, halting snoring, and even enabling greater sovereignty in our use of technology. What is the transformative potential of breath? And what is the relationship between proper breathing and humane technology?

A Facebook Whistleblower — with Sophie Zhang

    In September of 2020, on her last day at Facebook, data scientist Sophie Zhang posted a 7,900-word memo to the company's internal site. In it, she described the anguish and guilt she had experienced over the last two and a half years. She'd spent much of that time almost single-handedly trying to rein in fake activity on the platform by nefarious world leaders in small countries. Sometimes she received help and attention from higher-ups; sometimes she got silence and inaction. “I joined Facebook from the start intending to change it from the inside,” she said, but “I was still very naive at the time.” 

    We don’t have a lot of information about how things operate inside the major tech platforms, and most former employees aren’t free to speak about their experience. It’s easy to fill that void with inferences about what might be motivating a company — greed, apathy, disorganization or ignorance, for example — but the truth is usually far messier and more nuanced. Sophie turned down a $64,000 severance package to avoid signing a non-disparagement agreement. In this episode of Your Undivided Attention, she explains to Tristan Harris and Aza Raskin how she ended up here, and offers ideas about what could be done at these companies to prevent similar kinds of harm in the future.

Mr. Harris Zooms to Washington

    Back in January 2020, Tristan Harris went to Washington, D.C. to testify before the U.S. Congress on the harms of social media. A few weeks ago, he returned — virtually — for another hearing, Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds. He testified alongside Dr. Joan Donovan, Research Director at the Harvard Kennedy School’s Shorenstein Center on Media Politics and Public Policy and the heads of policy from Facebook, YouTube and Twitter. The senators’ animated questioning demonstrated a deeper understanding of how these companies’ fundamental business models and design properties fuel hate and misinformation, and many of the lawmakers expressed a desire and willingness to take regulatory action. But, there’s still room for a more focused conversation. “It’s not about whether they filter out bad content,” says Tristan, “but really whether the entire business model of capturing human performance is a good way to organize society.” In this episode, a follow-up to last year’s “Mr. Harris Goes to Washington,” Tristan and Aza Raskin debrief about what was different this time, and what work lies ahead to pave the way for effective policy. 

Mind the (Perception) Gap — with Dan Vallone

    What do you think the other side thinks? Guest Dan Vallone is the Director of More in Common U.S.A., an organization that’s been asking Democrats and Republicans that critical question. Their work has uncovered countless “perception gaps” in our understanding of each other. For example, Democrats think that about 30 percent of Republicans support "reasonable gun control," but in reality, it’s about 70 percent. Both Republicans and Democrats think that about 50 percent of the other side would feel that physical violence is justified in some situations, but the actual number for each is only about five percent. “Both sides are convinced that the majority of their political opponents are extremists,” says Dan. “And yet, that's just not true.” Social media encourages the most extreme views to speak the loudest and rise to the top—and it’s hard to start a conversation and work together when we’re all arguing with mirages. But Dan’s insights and the work of More in Common provide a hopeful guide to unraveling the distortions we’ve come to accept and correcting our foggy vision.

Come Together Right Now — with Shamil Idriss

    How many technologists have traveled to Niger, or the Balkans, or Rwanda, to learn the lessons of peacebuilding? Technology and social media are creating patterns and pathways of conflict that few people anticipated or even imagined just a decade ago. And we need to act quickly to contain the effects, but we don't have to reinvent the wheel. There are people, such as this episode’s guest, Shamil Idriss, CEO of the organization Search for Common Ground, who have been training for years to understand human beings and learn how to help them connect and begin healing processes. These experts can share their insights and help us figure out how to apply them to our new digital habitats. “Peace moves at the speed of trust, and trust can’t be fast-tracked,” says Shamil. Real change is possible, but as he explains, it takes patience, care, and creativity to get there. 

Disinformation Then and Now — with Camille François

    Disinformation researchers have been fighting two battles over the last decade: one to combat and contain harmful information, and one to convince the world that these manipulations have an offline impact that requires complex, nuanced solutions. Camille François, Chief Information Officer at the cybersecurity company Graphika and an affiliate of the Harvard Berkman Klein Center for Internet & Society, believes that our common understanding of the problem has recently reached a new level. In this interview, she catalogues the key changes she observed between studying Russian interference in the 2016 U.S. election and helping convene and operate the Election Integrity Partnership watchdog group before, during and after the 2020 election. “I'm optimistic, because I think that things that have taken quite a long time to land are finally landing, and because I think that we do have a diverse set of expertise at the table,” she says. Camille and Tristan Harris dissect the challenges and talk about the path forward to a healthy information ecosystem.

A Renegade Solution to Extractive Economics — with Kate Raworth

    When Kate Raworth began studying economics, she was disappointed that the mainstream version of the discipline didn’t fully address many of the world issues that she wanted to tackle, such as human rights and environmental destruction. She left the field, but was inspired to jump back in after the financial crisis of 2008, when she saw an opportunity to introduce fresh perspectives. She sat down and drew a chart in the shape of a doughnut, which provided a way to think about our economic system while accounting for the impact to the world around us, as well as for humans’ baseline needs. Kate’s framing can teach us a lot about how to transform the economic model of the technology industry, helping us move from a system that values addicted, narcissistic, polarized humans to one that values healthy, loving and collaborative relationships. Her book, “Doughnut Economics: Seven Ways to Think Like a 21st Century Economist,” gives us a guide for transitioning from a 20th-century paradigm to an evolved 21st-century one that will address our existential-scale problems.

Are the Kids Alright? — with Jonathan Haidt

    We are in the midst of a teen mental health crisis. Since 2011, the rate of U.S. hospitalizations for preteen girls who have self-harmed is up 189 percent, and with older teen girls, it’s up 62 percent. Tragically, the numbers on suicides are similar — 151 percent higher for preteen girls, and 70 percent higher for older teen girls. NYU social psychologist Jonathan Haidt has spent the last few years trying to figure out why, working with fellow psychologist Jean Twenge, and he believes social media is to blame. Jonathan and Jean found that the mental health data show a stark contrast between Generation Z and Millennials, unlike any demographic divide researchers have seen since World War II, and the division tracks with a sharp rise in social media use. As Jonathan explains in this interview, disentangling correlation and causation is a persistent research challenge, and the debate on this topic is still in full swing. But as TikTok, Instagram, Snapchat and the next big thing fine-tune the manipulative and addictive features that pull teens in, we cannot afford to ignore this problem while we sit back and wait for conclusive results. When it comes to children, our standards need to be higher, and our burden of proof lower.

Stranger than Fiction — with Claire Wardle

    How can tech companies help flatten the curve? First and foremost, they must address the lethal misinformation and disinformation circulating on their platforms. The problem goes much deeper than fake news, according to Claire Wardle, co-founder and executive director of First Draft. She studies the gray zones of information warfare, where bad actors mix facts with falsehoods, news with gossip, and sincerity with satire. “Most of this stuff isn't fake and most of this stuff isn't news,” Claire argues. If these subtler forms of misinformation go unaddressed, tech companies may not only fail to flatten the curve — they could raise it higher. 

Mr. Harris Goes to Washington

What difference does a few hours of Congressional testimony make? Tristan takes us behind the scenes of his January 8th testimony to the Energy and Commerce Committee on disinformation in the digital age. With just minutes to answer each lawmaker’s questions, he found that conveying the urgency and complexity of humane technology issues is an immense challenge. Tristan returned hopeful, and though it sometimes feels like Groundhog Day, each trip to DC reveals evolving conversations, advancing legislation, deeper understanding and stronger coalitions.

Rock the Voter — with Brittany Kaiser

Brittany Kaiser, a former Cambridge Analytica insider, witnessed a two-day presentation at the company that shocked her and her co-workers. It laid out a new method of campaigning, in which candidates greet voters with a thousand faces and speak in a thousand tongues, automatically generating messages increasingly aimed at an audience of one. She explains how these methods of persuasion have shaped elections worldwide, enabling candidates to sway voters in strange and startling ways.