
    Podcast Summary

    • Exploring Compassion and Open-Mindedness with Dylan Marron: Through engaging with Internet haters, Dylan Marron learned the importance of empathy and understanding, inspiring us to break free from negative online environments and contribute to a more compassionate digital world.

      We have the power to break free from the negative and polarized online environment and choose compassion and open-mindedness instead. Dylan Marron, a modern-day Mister Rogers for the digital age, shares his experiences of engaging with Internet haters through his podcast, "Conversations with People Who Hate Me." In 2015, Marron gained international attention for his video series, "Every Single Word," which highlighted racial representation in films. With newfound fame came hate, and Marron found himself clicking on the profile pictures of his haters out of curiosity. Through these conversations, Marron learned valuable lessons about empathy and understanding. The Center for Humane Technology's Humane Innovation Lead role, currently open for applications, aims to support the emerging humane technology ecosystem and transform the online space into a more compassionate and open-minded environment. By stepping out of the game of outrage and hate, we can make a difference and rediscover our true selves in the digital world.

    • Unexpected human connection through online hate: Online negativity can lead to unexpected human connections, but its overwhelming nature requires better tools and resources to process and respond.

      Social media platforms can often create a sense of distance between users, leading individuals to construct fictional backstories and seek human connection in unexpected ways. This was the experience of a content creator who, after receiving hate messages, found himself on the phone with one of his haters. Initially, he coped by making fun of the messages in his comedy routine, but when one of the haters reached out, he accepted the invitation for a conversation. This unexpected connection led him to question how many other people might be willing to engage in similar conversations, despite the initial negativity. The experience also highlighted the overwhelming nature of online negativity and the need for better tools and resources to help individuals process and respond to it.

    • Connecting with those who have caused harm can lead to understanding and transformation: Connecting with people who have caused harm or negativity can help reduce fear and transform them from distant others into reachable individuals through open and empathetic communication.

      Connecting with people, even those who have caused harm or negativity, can help reduce fear and transform them from distant others into reachable individuals. Dylan, in the podcast "Conversations with People Who Hate Me," shared an experience where he left a positive review for someone, Josh, who was believed to have threatened him. Josh, who had seen Dylan's videos making fun of his typos, reached out to Dylan for a call, and the conversation led to a deeper understanding of each other's experiences. Dylan shared that the conversation felt like a bridge that wasn't on any maps and felt hopeful, leading him to share it with a wider audience. Both Dylan and Josh had faced bullying in high school, and their conversation provided a flavor of their shared experiences. By connecting with Josh, Dylan was able to address the negativity and look the fear in the face, making the other person no longer a distant other but a reachable person.

    • The Internet as a mirror of society and the cycle of negativity: Recognize that online negativity isn't always caused by "hurt people hurting people." The gamification and ease of expression online can also contribute to the cycle. Seek understanding and empathy in digital interactions, expanding our definition of common ground.

      The Internet can serve as a mirror of society, amplifying the hurt and negativity that exists offline. However, not all online negativity can be attributed to "hurt people hurting people." The gamification of online spaces and the ease of expression can also contribute to the cycle of negativity. It's essential to recognize that awareness brings the opportunity for choice and that common ground is not always necessary for connection. Instead, we need to expand our definition of common ground and strive to find understanding and empathy in our digital interactions.

    • Focusing on common ground and shared curiosity can lead to profound connections: Emphasizing shared experiences and curiosity can bridge gaps and transform platforms into spaces for healthy interactions.

      Curiosity and shared experiences can bridge the gaps between people, even without agreement on certain topics. The speaker believes that focusing on common ground and shared curiosity can lead to profound connections, rather than seeking out communities based on shared hate. The example of the compassion team at Facebook, which used nonviolent communication principles to help resolve conflicts, illustrates the potential for technology to facilitate positive interactions. The ultimate goal is to transform platforms like Twitter from "fault line for profit machines" into "bridge for profit machines," where common ground is highlighted and conflicts are resolved, making the platforms more conducive to healthy and meaningful interactions.

    • Limiting the spread of harmful content online: Awareness and limiting the number of engagements can prevent pile-ons and reduce the spread of harmful content online. Social media platforms could implement systems to show users how many others have engaged with a post or conversation, and creators should consider the potential impact of their work going viral.

      Awareness and limiting the spread of information can help prevent pile-ons and reduce the spread of harmful content online. The example given was the Chelsea Piers golf range in New York City, where golfers are isolated in their own pods and unaware of how many others are golfing around them. Similarly, on social media, people often join pile-ons without realizing the extent of the conversation or the harm it may cause. A potential solution could be implementing a system to show users how many others have engaged with a post or conversation, helping to prevent pile-ons and promote more thoughtful engagement. Another example given was the discovery by Facebook's integrity team that limiting the number of reshares to two could significantly reduce the spread of disinformation and hateful content. For creators, this means considering the potential impact of their work going viral and the importance of promoting thoughtful engagement and limiting the spread of harmful content.

    • The danger of oversimplifying complex stories: Social media can lead to misunderstandings and shaming, often due to the overwhelming amount of information and the pressure to belong. It's important to take the time to understand the full context before engaging in shaming cultures, and to practice empathy towards those targeted.

      The sharing of information, especially in the age of social media, can lead to misunderstandings, hate, and shaming, often without a full understanding of the situation. This can result in people being piled on for not being the "right" version of a like-minded person, which can be more threatening than hate based on differences. It's important to remember that the people engaging in these behaviors may not be doing so out of malice, but rather due to the overwhelming amount of information and the pressure to belong. However, this can lead to a dangerous loss of nuance and complexity in the stories we engage with, and it's crucial to take the time to understand the full context before jumping to conclusions or participating in shaming cultures. Empathy towards those who may be the targets of these cultures is essential, as the stories are often much more complicated than they appear at first glance.

    • Trauma Inflation and Empathy Deflation: Constant exposure to traumatic content online leads to sensitivity to certain triggers and less empathy for others, causing a fragmented global psychosis. Bridging the divide requires bringing people into conversations and understanding and validating each other's experiences.

      The constant exposure to traumatic content online, unique to each individual, leads to a phenomenon known as "trauma inflation" and "empathy deflation." This means that society is becoming more sensitive to certain triggers while having less empathy for those who have experienced different traumas. As a result, we end up in a fragmented global psychosis where different realities clash, leading to further offense and mistrust. To combat this, it's crucial to bring people into conversations with each other, as empathy is a natural byproduct of such interactions. The only way to scale this up is through more conversations and creating opportunities for diverse perspectives to come together. This may seem like a simple solution, but it's essential to recognize the importance of understanding and validating each other's experiences to bridge the divide and build a more empathetic society.

    • The Importance of Empathy in Online Interactions: Empathy is crucial in online interactions; we should create interfaces that deepen connections, allow people to define themselves, and practice empathy even when it's challenging.

      Technology can sometimes create a disconnect between people, but it's essential to remember that there's a human on the other side of the screen. Empathy is crucial in online interactions, even when we disagree. The more opportunities we have to connect with each other through technology, the better. However, there's a need for more choices on the menu, such as a "why" option when responding to messages. Designers and technologists should focus on creating interfaces that deepen connections and empower users to engage in meaningful conversations. Additionally, allowing people to define themselves on their own terms can be a powerful tool for disarming potential conflicts. Empathy is not endorsement, and it's essential to practice it in our online interactions, even when it's challenging. By fostering human connections through technology, we can create a more empathetic and understanding world.

    • Empathy as a tool for connecting with people: Empathy acknowledges humanity without endorsing harmful ideologies; it focuses on listening and understanding, requires patience and persistence, and can lead to less intimidating conversations.

      A key lesson from Dylan Marron's conversation is that empathy is a powerful tool for connecting with people, even those who hold opposing views. Empathy does not mean endorsing harmful ideologies, but rather acknowledging someone's humanity. Marron encourages avoiding debate and instead focusing on listening and understanding. He also emphasizes that change is a slow process and requires patience and persistence. Additionally, Marron shares his observation that many people are hesitant to engage in conversation due to fear and discomfort, but the experience can be less intimidating than expected. Overall, Marron's message is one of hope and the importance of human connection, even in the face of disagreement and difference.

    • Lead donors' support: Generous contributions from Omidyar Network, Craig Newmark Philanthropies, and the Evolve Foundation have enabled significant impact and progress towards goals.

      The success of our project would not have been possible without the generous support of our lead donors, such as the Omidyar Network, Craig Newmark Philanthropies, and the Evolve Foundation, and the unwavering attention of our audience. These contributions have enabled us to make a significant impact and move closer to achieving our goals. We are deeply grateful for their belief in our mission and their commitment to our cause. Additionally, we would like to express our gratitude to everyone who has taken the time to learn about our project and engage with us. Your support, whether through donations or attention, is essential to our continued progress. Together, we can make a difference.

    Recent Episodes from Your Undivided Attention

    Why Are Migrants Becoming AI Test Subjects? With Petra Molnar

    Climate change, political instability, hunger. These are just some of the forces behind an unprecedented refugee crisis that’s expected to include over a billion people by 2050. In response to this growing crisis, wealthy governments like the US and the EU are employing novel AI and surveillance technologies to slow the influx of migrants at their borders. But will this rollout stop at the border?

    In this episode, Tristan and Aza sit down with Petra Molnar to discuss how borders have become a proving ground for the sharpest edges of technology, and especially AI. Petra is an immigration lawyer and co-creator of the Migration and Technology Monitor. Her new book is “The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence.”

    RECOMMENDED MEDIA

    The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence

    Petra’s newly published book on the rollout of high risk tech at the border.

    Bots at the Gate

    A report co-authored by Petra about Canada’s use of AI technology in their immigration process.

    Technological Testing Grounds

    A report authored by Petra about the use of experimental technology in EU border enforcement.

    Startup Pitched Tasing Migrants from Drones, Video Reveals

    An article from The Intercept, containing the demo for Brinc’s taser drone pilot program.

    The UNHCR

    Information about the global refugee crisis from the UN.

    RECOMMENDED YUA EPISODES

    War is a Laboratory for AI with Paul Scharre

    No One is Immune to AI Harms with Dr. Joy Buolamwini

    Can We Govern AI? With Marietje Schaake

    CLARIFICATION:

    The iBorderCtrl project referenced in this episode was a pilot project that was discontinued in 2019.

    Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn

    This week, a group of current and former employees from OpenAI and Google DeepMind penned an open letter accusing the industry's leading companies of prioritizing profits over safety. This comes after a spate of high-profile departures from OpenAI, including co-founder Ilya Sutskever and senior researcher Jan Leike, as well as reports that OpenAI has gone to great lengths to silence would-be whistleblowers.

    The writers of the open letter argue that researchers have a “right to warn” the public about AI risks and laid out a series of principles that would protect that right. In this episode, we sit down with one of those writers: William Saunders, who left his job as a research engineer at OpenAI in February. William is now breaking the silence on what he saw at OpenAI that compelled him to leave the company and to put his name to this letter. 

    RECOMMENDED MEDIA 

    The Right to Warn Open Letter 

    My Perspective On "A Right to Warn about Advanced Artificial Intelligence": A follow-up from William about the letter

    Leaked OpenAI documents reveal aggressive tactics toward former employees: An investigation by Vox into OpenAI’s policy of non-disparagement.

    RECOMMENDED YUA EPISODES

    1. A First Step Toward AI Regulation with Tom Wheeler 
    2. Spotlight on AI: What Would It Take For This to Go Well? 
    3. Big Food, Big Tech and Big AI with Michael Moss 
    4. Can We Govern AI? with Marietje Schaake

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

    War is a Laboratory for AI with Paul Scharre

    Right now, militaries around the globe are investing heavily in AI weapons and drones. From Ukraine to Gaza, weapons systems with increasing levels of autonomy are being used to kill people and destroy infrastructure, and the development of fully autonomous weapons shows little sign of slowing down. What does this mean for the future of warfare? What safeguards can we put up around these systems? And is this runaway trend toward autonomous warfare inevitable, or will nations come together and choose a different path? In this episode, Tristan and Daniel sit down with Paul Scharre to try to answer some of these questions. Paul is a former Army Ranger, the author of two books on autonomous weapons, and he helped the Department of Defense write much of its policy on the use of AI in weaponry.

    RECOMMENDED MEDIA

    Four Battlegrounds: Power in the Age of Artificial Intelligence: Paul’s book on the future of AI in war, which came out in 2023.

    Army of None: Autonomous Weapons and the Future of War: Paul’s 2018 book documenting and predicting the rise of autonomous and semi-autonomous weapons as part of modern warfare.

    The Perilous Coming Age of AI Warfare: How to Limit the Threat of Autonomous Warfare: Paul’s article in Foreign Affairs based on his recent trip to the battlefield in Ukraine.

    The night the world almost ended: A BBC documentary about Stanislav Petrov’s decision not to start nuclear war.

    AlphaDogfight Trials Final Event: The full simulated dogfight between an AI and human pilot. The AI pilot swept, 5-0.

    RECOMMENDED YUA EPISODES

    1. The AI ‘Race’: China vs. the US with Jeffrey Ding and Karen Hao
    2. Can We Govern AI? with Marietje Schaake
    3. Big Food, Big Tech and Big AI with Michael Moss
    4. The Invisible Cyber-War with Nicole Perlroth

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

    AI and Jobs: How to Make AI Work With Us, Not Against Us With Daron Acemoglu

    Tech companies say that AI will lead to massive economic productivity gains. But as we know from the first digital revolution, that’s not what happened. Can we do better this time around?

    RECOMMENDED MEDIA

    Power and Progress by Daron Acemoglu and Simon Johnson: Professor Acemoglu co-authored a bold reinterpretation of economics and history that will fundamentally change how you see the world

    Can We Have Pro-Worker AI? Professor Acemoglu co-authored this paper about redirecting AI development onto the human-complementary path

    Rethinking Capitalism: In Conversation with Daron Acemoglu: The Wheeler Institute for Business and Development hosted Professor Acemoglu to examine how technology affects the distribution and growth of resources while being shaped by economic and social incentives

    RECOMMENDED YUA EPISODES

    1. The Three Rules of Humane Tech
    2. The Tech We Need for 21st Century Democracy
    3. Can We Govern AI?
    4. An Alternative to Silicon Valley Unicorns

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

    Jonathan Haidt On How to Solve the Teen Mental Health Crisis

    Suicides. Self harm. Depression and anxiety. The toll of a social media-addicted, phone-based childhood has never been more stark. It can be easy for teens, parents and schools to feel like they’re trapped by it all. But in this conversation with Tristan Harris, author and social psychologist Jonathan Haidt makes the case that the conditions that led to today’s teenage mental health crisis can be turned around – with specific, achievable actions we all can take starting today.

    This episode was recorded live at the San Francisco Commonwealth Club.  

    Correction: Tristan mentions that 40 Attorneys General have filed a lawsuit against Meta for allegedly fostering addiction among children and teens through their products. However, the actual number is 42 Attorneys General who are taking legal action against Meta.

    Clarification: Jonathan refers to the Wait Until 8th pledge. By signing the pledge, a parent  promises not to give their child a smartphone until at least the end of 8th grade. The pledge becomes active once at least ten other families from their child’s grade pledge the same.

    Chips Are the Future of AI. They’re Also Incredibly Vulnerable. With Chris Miller

    Beneath the race to train and release more powerful AI models lies another race: a race by companies and nation-states to secure the hardware to make sure they win AI supremacy. 

    Correction: The latest available Nvidia chip is the Hopper H100 GPU, which has 80 billion transistors. Since the first commercially available chip had four transistors, the Hopper actually has 20 billion times that number. Nvidia recently announced the Blackwell, which boasts 208 billion transistors - but it won’t ship until later this year.

    RECOMMENDED MEDIA 

    Chip War: The Fight For the World’s Most Critical Technology by Chris Miller

    To make sense of the current state of politics, economics, and technology, we must first understand the vital role played by chips

    Gordon Moore Biography & Facts

    Gordon Moore, the Intel co-founder behind Moore's Law, passed away in March of 2023

    AI’s most popular chipmaker Nvidia is trying to use AI to design chips faster

    Nvidia's GPUs are in high demand - and the company is using AI to accelerate chip production

    RECOMMENDED YUA EPISODES

    Future-proofing Democracy In the Age of AI with Audrey Tang

    How Will AI Affect the 2024 Elections? with Renee DiResta and Carl Miller

    The AI ‘Race’: China vs. the US with Jeffrey Ding and Karen Hao

    Protecting Our Freedom of Thought with Nita Farahany

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

    Future-proofing Democracy In the Age of AI with Audrey Tang

    What does a functioning democracy look like in the age of artificial intelligence? Could AI even be used to help a democracy flourish? Just in time for election season, Taiwan’s Minister of Digital Affairs Audrey Tang returns to the podcast to discuss healthy information ecosystems, resilience to cyberattacks, how to “prebunk” deepfakes, and more. 

    RECOMMENDED MEDIA 

    Testing Theories of American Politics: Elites, Interest Groups, and Average Citizens by Martin Gilens and Benjamin I. Page

    This academic paper addresses tough questions for Americans: Who governs? Who really rules? 

    Recursive Public

    Recursive Public is an experiment in identifying areas of consensus and disagreement among the international AI community, policymakers, and the general public on key questions of governance

    A Strong Democracy is a Digital Democracy

    Audrey Tang’s 2019 op-ed for The New York Times

    The Frontiers of Digital Democracy

    Nathan Gardels interviews Audrey Tang in Noema

    RECOMMENDED YUA EPISODES 

    Digital Democracy is Within Reach with Audrey Tang

    The Tech We Need for 21st Century Democracy with Divya Siddarth

    How Will AI Affect the 2024 Elections? with Renee DiResta and Carl Miller

    The AI Dilemma

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

    U.S. Senators Grilled Social Media CEOs. Will Anything Change?

    Was it political progress, or just political theater? The recent Senate hearing with social media CEOs led to astonishing moments — including Mark Zuckerberg’s public apology to families who lost children following social media abuse. Our panel of experts, including Facebook whistleblower Frances Haugen, untangles the explosive hearing, and offers a look ahead, as well. How will this hearing impact protocol within these social media companies? How will it impact legislation? In short: will anything change?

    Clarification: Julie says that shortly after the hearing, Meta’s stock price had the biggest increase of any company in the stock market’s history. It was the biggest one-day gain by any company in Wall Street history.

    Correction: Frances says it takes Snap three or four minutes to take down exploitative content. In Snap's most recent transparency report, they list six minutes as the median turnaround time to remove exploitative content.

    RECOMMENDED MEDIA 

    Get Media Savvy

    Founded by Julie Scelfo, Get Media Savvy is a non-profit initiative working to establish a healthy media environment for kids and families

    The Power of One by Frances Haugen

    The inside story of Frances Haugen’s quest to bring transparency and accountability to Big Tech

    RECOMMENDED YUA EPISODES

    Real Social Media Solutions, Now with Frances Haugen

    A Conversation with Facebook Whistleblower Frances Haugen

    Are the Kids Alright?

    Social Media Victims Lawyer Up with Laura Marquez-Garrett

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

    Taylor Swift is Not Alone: The Deepfake Nightmare Sweeping the Internet

    Over the past year, a tsunami of apps that digitally strip the clothes off real people has hit the market. Now anyone can create fake non-consensual sexual images in just a few clicks. With cases proliferating in high schools, guest presenter Laurie Segall talks to legal scholar Mary Anne Franks about the AI-enabled rise in deep fake porn and what we can do about it. 

    Correction: Laurie refers to the app 'Clothes Off.' It’s actually named Clothoff. There are many clothes remover apps in this category.

    RECOMMENDED MEDIA 

    Revenge Porn: The Cyberwar Against Women

    In a five-part digital series, Laurie Segall uncovers a disturbing internet trend: the rise of revenge porn

    The Cult of the Constitution

    In this provocative book, Mary Anne Franks examines the thin line between constitutional fidelity and constitutional fundamentalism

    Fake Explicit Taylor Swift Images Swamp Social Media

    Calls to protect women and crack down on the platforms and technology that spread such images have been reignited

    RECOMMENDED YUA EPISODES 

    No One is Immune to AI Harms

    Esther Perel on Artificial Intimacy

    Social Media Victims Lawyer Up

    The AI Dilemma

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

    Can Myth Teach Us Anything About the Race to Build Artificial General Intelligence? With Josh Schrei

    We usually talk about tech in terms of economics or policy, but the casual language tech leaders often use to describe AI — summoning an inanimate force with the powers of code — sounds more... magical. So, what can myth and magic teach us about the AI race? Josh Schrei, mythologist and host of The Emerald podcast,  says that foundational cultural tales like "The Sorcerer's Apprentice" or Prometheus teach us the importance of initiation, responsibility, human knowledge, and care.  He argues these stories and myths can guide ethical tech development by reminding us what it is to be human. 

    Correction: Josh says the first telling of "The Sorcerer’s Apprentice" myth dates back to ancient Egypt, but it actually dates back to ancient Greece.

    RECOMMENDED MEDIA 

    The Emerald podcast

    The Emerald explores the human experience through a vibrant lens of myth, story, and imagination

    Embodied Ethics in The Age of AI

    A five-part course with The Emerald podcast’s Josh Schrei and School of Wise Innovation’s Andrew Dunn

    Nature Nurture: Children Can Become Stewards of Our Delicate Planet

    A U.S. Department of the Interior study found that the average American kid can identify hundreds of corporate logos but not plants and animals

    The New Fire

    AI is revolutionizing the world - here's how democracies can come out on top. This upcoming book was authored by an architect of President Biden's AI executive order

    RECOMMENDED YUA EPISODES 

    How Will AI Affect the 2024 Elections?

    The AI Dilemma

    The Three Rules of Humane Tech

    AI Myths and Misconceptions

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

    Related Episodes

    Spotlight — Conversations With People Who Hate Me with Dylan Marron

    This week on Your Undivided Attention, we’re doing something different: we’re airing an episode of another podcast that’s also part of the TED Audio Collective.

    Backing up for a moment: we recently aired an episode with Dylan Marron — creator and host of the podcast, Conversations With People Who Hate Me. On his show, Dylan calls up the people behind negative comments on the internet, and asks them: why did you write that?

    In our conversation with Dylan, we played a clip from episode 2 of Conversations With People Who Hate Me. In that episode, Dylan talks with a high school student named Josh, who’d sent him homophobic messages online. This week, we're airing that full episode — the full conversation between Dylan Marron and Josh.

    If you didn’t hear our episode with Dylan, do give it a listen. Then, enjoy this second episode of Conversations With People Who Hate Me.

    RECOMMENDED YUA EPISODES 

    Transcending the Internet Hate Game with Dylan Marron: https://www.humanetech.com/podcast/52-transcending-the-internet-hate-game

    A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen

    The Cure for Hate. Guest: Tony McAleer: https://www.humanetech.com/podcast/11-the-cure-for-hate

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

    Always, Always On: Technology, Digital Life, and New Media / Angela Gorrell

    How do visions of flourishing life converge in the new media landscape? Theologian Angela Gorrell (Baylor University) reflects on the challenges and opportunities of technology and digital life, especially those that reveal to us who we are, who we are becoming, and to whom we belong.

    Show Notes

    • The purpose of Always On: Practicing Faith in a New Media Landscape
      • New media: not just social media, but entertainment, productivity, tools, and more
    • How to develop interested conversations about the impact of new media on moral, relational, political, and spiritual life.
    • How do visions of flourishing life converge in the new media landscape?
    • Understanding (and exploiting) human psychology in new media business
    • Seeking joy through affirmation and recognition
    • Becoming curious and open to conversations about new media.
    • The idolatry of technology
    • The chief task of adolescence, growing into healthy adulthood: Identity and belonging—Who am I? Whose am I?
    • Recognition has become malformed in the new media landscape.
    • The threat of diminished humanity through new media
    • Being one’s real self online and in-person
    • The importance of participation in order to act redemptively online
    • Numbness, anxiety, and depression that comes through passivity
    • When will you disengage from new media? When will you engage and participate?
    • Developing a rhythm of life that appreciates human hybridity of physical and mental mediated life
    • Ask: How can I nurture connection in digital spaces in meaningful ways? 

    About Angela Gorrell

    Dr. Angela Williams Gorrell is Assistant Professor of Practical Theology at Baylor University's George W. Truett Theological Seminary. Prior to joining the faculty at Baylor University, she was an Associate Research Scholar at the Yale Center for Faith & Culture, working on the Theology of Joy and the Good Life Project, and a lecturer in Divinity and Humanities at Yale University in New Haven, Connecticut. She is an ordained pastor with 14 years of ministry experience. Dr. Gorrell is passionate about finding issues that matter to people and shining the light of the Gospel on them. She is currently working on a book that shares findings of the joy project while addressing America’s opioid and suicide crises. Dr. Gorrell’s expertise is in the areas of theology and contemporary culture, education and formation, new media, and youth and emerging adults.

    Ep 45 - Call the Executioner

    We have more dating profile messages, hot off the griddle for ya! Face tattoos, Buttman, and Trogdor, oh my!

    The Internet is a weird place!

    We all know this. We all accept this. And yet, people across this nightmarish global network are still surprising us by finding the most bizarre ways to make us uncomfortable. How many times have you opened a message on your social media, email, or otherwise and thought, ‘what the literal fuck was that?’

    Join us at Hate Male Podcast as we delve into our menagerie of stories from people all over the world, and read aloud the unsolicited (and often hilarious) messages received on the internet.

    The world is terrible! So let’s laugh at it together in what could be considered a potentially unhealthy coping mechanism. Follow us on social media to stay updated! https://www.facebook.com/hatemalepodcast/ 
    https://www.instagram.com/hatemalepodcast/ 
    https://twitter.com/hatemalepodcast 

    Producer: Steve Labedz 

    Music: Robert Bock 
    https://www.thewelltempered.com/ 

    Art: Rowan Gray 
    https://www.rowangray.net/