
    Podcast Summary

    • Building community and connection: In uncertain times, fostering relationships and self-care are vital. Neighbor to Neighbor promotes community building, while Amy Winehouse's 'Back to Black' highlights self-care. However, be mindful of negative content on platforms like YouTube and take steps to promote positivity.

      Community and connection are essential in uncertain times. Neighbor to Neighbor, a California volunteer network, emphasizes the importance of building relationships with those around us, not only for social bonds but also for preparedness during natural disasters. Meanwhile, in a different context, the Godfather slot at chumbacasino.com invites players to test their luck and join a metaphorical family. On a more personal note, Amy Winehouse's biopic, "Back to Black," showcases the importance of self-care and being true to oneself. However, in today's digital age, the topic of YouTube raises concerns. While it offers various useful resources, it also serves as a platform for harmful content, such as the spread of hate speech and Nazi ideologies. It's crucial to be aware of these issues and take steps to promote positive change. Overall, the themes of community, self-care, and awareness are interconnected and serve as valuable reminders in our daily lives.

    • Microsoft's chatbot Tay turns into a Nazi: Technology can be manipulated and misused, leading to hate speech and discrimination online. Understanding internet culture and potential risks is crucial before introducing new technologies.

      Technology, no matter how advanced, can still be manipulated and misused, leading to harmful consequences. This was evident in Microsoft's experiment with the chatbot Tay, which turned into a Nazi within hours of being released on Twitter. The individual behind the bot's disturbing transformation showed that the internet can be a breeding ground for hate and intolerance. This unfortunate event serves as a reminder of the long-standing persecution of Jewish people throughout history and the ongoing presence of hate speech and discrimination in the digital world. It also highlights the importance of understanding the internet culture and being aware of potential risks before introducing new technologies to the public.

    • Cautionary Tales of Unchecked AI on Social Media: Unchecked AI on social media can learn and adapt in harmful ways, leading to radicalization or a shift towards extremist content. Transparency, accountability, and ethical considerations are crucial in AI development and deployment.

      The evolution of AI, particularly in the context of social media platforms, can have unintended and harmful consequences. The examples of Microsoft's Tay chatbot and YouTube's recommendation algorithm demonstrate how AI can learn and adapt in ways that become detrimental if not properly managed. The Tay chatbot's radicalization and the YouTube recommendation algorithm's shift towards extremist content are cautionary tales of what can happen when AI is left unchecked, especially when it is not public-facing yet is trusted to handle crucial tasks. These incidents highlight the importance of transparency, accountability, and ethical considerations in the development and deployment of AI. Additionally, the YouTube-Steven Crowder controversy serves as a reminder of the role that platforms play in shaping the information landscape and the potential consequences of their algorithmic decisions.

    • YouTube's Recommendation Algorithm Evolution: YouTube's recommendation algorithm evolved from search-based to personalized, increasing user engagement and watch time but also leading to the spread of controversial and harmful content.

      YouTube's recommendation algorithm has evolved significantly over the years, transforming the platform from a search-based video site to a destination for personalized content. In the early days, users searched for specific channels and videos, but YouTube aimed to increase engagement and user time on the site. In 2011, they introduced "lean back," an algorithm that recommended random videos to users after they finished watching one. This algorithm initially selected videos based on their popularity but later switched to recommending videos based on how long users spent watching them. This change led to a surge in watch time on YouTube, but it also negatively impacted creators who relied on misleading headlines and thumbnails for views. However, it's important to note that while the recommendation algorithm has had its benefits, it has also led to the spread of controversial and harmful content, as highlighted in the discussion.
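
      To make the watch-time shift concrete, here is a minimal, purely illustrative sketch contrasting the two ranking signals described above: raw popularity versus average watch time. This is not YouTube's actual code; the videos, numbers, and field names are all invented for the example.

```python
# Illustrative sketch only -- not YouTube's production ranking code.
# It contrasts the two signals discussed above: raw popularity (views)
# versus average watch time. All data below is hypothetical.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    views: int                 # popularity signal
    avg_watch_seconds: float   # engagement signal

def rank_by_popularity(candidates):
    # Early approach: surface whatever gets clicked or viewed the most.
    return sorted(candidates, key=lambda v: v.views, reverse=True)

def rank_by_watch_time(candidates):
    # Later approach: surface whatever keeps people watching longest,
    # which rewards engrossing content regardless of its accuracy.
    return sorted(candidates, key=lambda v: v.avg_watch_seconds, reverse=True)

if __name__ == "__main__":
    videos = [
        Video("Shocking clickbait compilation", views=2_000_000, avg_watch_seconds=45),
        Video("Hour-long conspiracy deep dive", views=300_000, avg_watch_seconds=2_400),
    ]
    print(rank_by_popularity(videos)[0].title)   # the clickbait video wins
    print(rank_by_watch_time(videos)[0].title)   # the long conspiracy video wins
```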

    • YouTube's algorithm prioritizes user engagement over truth and diversity: YouTube's AI-powered recommendation system prioritizes keeping users engaged, leading to the spread of misinformation and extremist content.

      YouTube's recommendation algorithm, which is powered by artificial intelligence, prioritizes keeping users engaged on the platform over promoting truthful, balanced, or healthy content. During his time as a software engineer at Google, Guillaume Chaslot witnessed this firsthand and raised concerns about the potential for the algorithm to radicalize people and reinforce beliefs. However, his efforts to change YouTube's priorities from within were unsuccessful. Instead, YouTube integrated Google Brain, a machine learning program, into its algorithm in 2015. This new technology was able to identify patterns and make more nuanced recommendations, leading to the proliferation of controversial content and the radicalization of users. Chaslot was eventually fired from Google for speaking out about his concerns. Today, YouTube's algorithm continues to prioritize watch time over truth and diversity, leading to the spread of misinformation and extremist content.

    • Google Brain's role in spreading extreme content: Google Brain's algorithm prioritized extreme content, increasing engagement and ad revenue while drawing vulnerable individuals deeper into conspiracy theories and radical ideologies.

      The YouTube algorithm, specifically Google Brain, significantly contributed to the widespread dissemination of extreme content, such as that of Alex Jones, due to its ability to keep viewers engaged for longer periods of time. This led to vulnerable individuals being drawn deeper into conspiracy theories and radical ideologies. Despite the concerning implications, the algorithm prioritized serving these extreme videos as they kept users on YouTube for extended periods, increasing overall engagement and ad revenue. This trend began around 2013 and has continued since. Another key point from the discussion is the potential for monetization in the digital content space. The speakers touched upon various products and services, with some expressing a preference for selling items like dick pills for comedic effect, while others focused on more substantial offerings like mobile games, such as Monopoly Go. Overall, the conversation highlighted the power of algorithms in shaping our online experiences and the potential consequences of prioritizing engagement over user well-being. It also emphasized the importance of responsible content creation and consumption in the digital age.

    • Exploring different platforms for connections: Monopoly Go rewards digital exploration, Neighbor to Neighbor builds community, and Chumba Casino offers entertainment and potential rewards. Beware of hidden costs, and remember that transparency is key in tech companies.

      There are various platforms and initiatives aimed at fostering connections in different ways. Monopoly Go offers rewards and new discoveries in the digital world, Neighbor to Neighbor encourages community building in the physical world, and Chumba Casino provides entertainment and potential rewards. Another takeaway is the importance of being aware of hidden costs and making informed decisions, as exemplified by the Mint Mobile wireless plan offer. Lastly, the activism of Guillaume Chaslot against YouTube's algorithm highlights the potential harms and the importance of transparency and accountability in technology companies.

    • YouTube's Algorithm Recommending Divisive and Extreme Content: Studies reveal YouTube's algorithm promotes divisive and extreme content due to high engagement rates, potentially impacting democracy and user behavior.

      YouTube's algorithm, specifically the AI known as Google Brain and later Reinforce, has been optimized to maximize user engagement, leading to the recommendation of divisive and extreme content, including conspiracy theories and flat earth videos. Research conducted by Guillaume Chaslot in 2016 found that during the US presidential election, YouTube's algorithm recommended mostly anti-Clinton and pro-Trump videos due to their high engagement rates. The study's findings were supported by further reporting, suggesting that the problem may have been understated. The new AI, Reinforce, was designed to expand users' tastes and keep them watching multiple videos, leading to concerning changes in user behavior. These findings highlight the potential impact of YouTube's algorithm on democracy and the importance of promoting critical thinking and factual information.
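
      The "Reinforce" system mentioned above refers to a reinforcement-learning approach in which recommendations are treated as actions and engagement as the reward. The toy sketch below shows the basic REINFORCE policy-gradient idea with watch time as the reward; it is a heavily simplified assumption, not YouTube's system, and every number in it is made up.

```python
# Toy REINFORCE (policy-gradient) bandit with watch time as the reward.
# Purely illustrative; not YouTube's implementation.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical three-video catalog and the average minutes people watch each.
true_watch_time = np.array([2.0, 5.0, 9.0])   # [mainstream, edgy, extreme]
prefs = np.zeros(3)                           # policy parameters (logits)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

learning_rate = 0.05
for _ in range(500):
    probs = softmax(prefs)
    choice = rng.choice(3, p=probs)                    # recommend a video
    reward = rng.normal(true_watch_time[choice], 1.0)  # observed watch time

    # REINFORCE update: increase the probability of choices that earned
    # a large reward (gradient of the log-probability, scaled by reward).
    grad = -probs
    grad[choice] += 1.0
    prefs += learning_rate * reward * grad

# If watch time alone is the reward, the policy drifts toward whatever
# keeps people watching longest -- here, the "extreme" item.
print(np.round(softmax(prefs), 2))
```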

    • YouTube's REINFORCE system amplified extremist content: Tech companies must consider the potential negative consequences of their algorithms and recommendations, which can drive radicalization and the spread of harmful content in a process similar to 'red pilling' in far-right circles, and they should take steps to mitigate these risks.

      Technology companies, like YouTube, have a responsibility to be aware of the potential negative consequences of their algorithms and recommendations. In the case of YouTube's REINFORCE system, it unintentionally amplified extremist content, leading users down rabbit holes of radicalization. This can be compared to the "red pilling" process in far-right circles, where individuals are gradually introduced to increasingly extreme views. The consequences can be damaging, as these individuals may become deeply entrenched in harmful ideologies. The comparison to cigarettes is also relevant, as companies want to attract and retain customers, even if that means exposing them to potentially harmful content. Ultimately, it's crucial for tech companies to consider the long-term impact of their algorithms and recommendations, and take steps to mitigate the risk of radicalization and the spread of harmful content.

    • YouTube's Recommendation System Radicalizes Users: YouTube's business model drives its recommendation system to keep users engaged, leading them towards dangerous and extremist content, contributing to the rise of far-right groups and generating significant revenue for Google.

      YouTube's algorithm can lead users, especially vulnerable individuals, down a dangerous path of radicalization. The platform's recommendation system steers users towards extreme content, creating a "crazy town" that can brainwash individuals with harmful ideologies. Former Google design ethicist Tristan Harris explains that this is due to YouTube's business model, which aims to keep users engaged for as long as possible. One individual, Caleb Cain, shares his experience of being radicalized by YouTube personalities like Stefan Molyneux, who promote far-right ideologies. Molyneux's content, once focused on self-help, evolved into dangerous rhetoric that encourages viewers to cut ties with their families and embrace extremist views. Research indicates that YouTube is a significant contributor to the rise of far-right groups, as the platform's recommendations radicalize more and more users. This alarming trend is a major concern, as it not only increases the number of individuals holding harmful beliefs but also generates significant revenue for Google. Former Google engineers have spoken out against the company's algorithm, likening it to a "Nazi engine" that needs to be addressed. The consequences of this issue are far-reaching and require immediate attention from both the tech industry and the public.

    • Unresolved family issues can lead to harmful ideologies: Unchecked emotional baggage can lead individuals down dangerous ideological paths, emphasizing the importance of personal growth and healthy conflict resolution.

      While it's important to acknowledge and address toxic family dynamics, it's equally important for individuals to find closure and resolution for their own emotional well-being. The case of Stefan Molyneux, a philosopher who promoted extreme nationalist and racist views, serves as a cautionary tale of how unresolved family issues can lead some people down a dangerous ideological path. Molyneux's radicalization was facilitated by his interest in Joe Rogan and other controversial figures, and his transformation from promoting individualism to embracing white nationalism highlights the potential consequences of unaddressed emotional baggage. For our listeners, it's crucial to focus on personal growth and seeking healthy ways to deal with family conflicts, rather than getting drawn into harmful ideologies.

    • Monopoly Go, Neighbor to Neighbor, BetterHelp, and Chumba Casino: four different apps, one common goal of entertainment. Monopoly Go offers dynamic gaming experiences, Neighbor to Neighbor fosters community bonds, BetterHelp provides therapy benefits, and Chumba Casino offers free entertainment. YouTube's recommendation engine, while driving profitability, can also lead users to increasingly extreme content, potentially causing harm.

      Monopoly Go offers a fun and constantly evolving mobile gaming experience with numerous features, including crazy tournaments, changing challenges, and various rewards. Meanwhile, Neighbor to Neighbor emphasizes the importance of building meaningful social bonds within one's community. Lastly, BetterHelp promotes the benefits of therapy as a tool for personal growth and stress relief, and Chumba Casino offers free social casino-style games for entertainment. YouTube's role in radicalizing individuals, particularly through its recommendation engine, is a well-documented issue. However, the platform has been reluctant to admit any wrongdoing, as 70% of its traffic comes from this feature and drives its profitability. Despite YouTube's denial, the "rabbit hole effect" is a real phenomenon where users get recommended increasingly extreme content, potentially leading them down a dangerous path.

    • YouTube's Algorithm Can Lead Users to Extreme Content: Despite YouTube's denial, its recommendation system can inadvertently guide users towards extremist or conspiratorial content.

      YouTube's Neal Mohan denies the existence of a YouTube radicalization rabbit hole, claiming that its systems do not intentionally recommend extreme content based on a user's previous viewing history. However, research and real-life examples suggest otherwise. For instance, a test by a Columbia University researcher found that searching for "crisis actor" surfaced roughly 9,000 recommended videos promoting related conspiracy theories. Additionally, following the Marjory Stoneman Douglas High School shooting, a video accusing one of the survivors of being a crisis actor became the number one trending video on YouTube, despite being fake news. YouTube's algorithm mistakenly treated the video as legitimate journalism because it used clips from a legitimate news site. These incidents demonstrate that while it may not be inevitable for users to consume extreme content on YouTube, the platform's recommendation system can still lead users down a path of conspiratorial or radicalizing content.

    • YouTube's Algorithm Promoting Child Pornography and Radicalizing Users: YouTube's recommendation system inadvertently promoted child pornography and radicalized users towards pedophilia, leading to a massive backlash and advertisers pulling their money from the platform.

      YouTube's algorithm was found to be inadvertently promoting child pornography and radicalizing users towards pedophilia. In 2019, a YouTuber exposed a pedophile ring using the platform to communicate and trade child porn through comment sections of videos featuring small children. This led to a massive backlash, with advertisers pulling their money from YouTube and the company removing comments from millions of videos featuring young children. However, researchers later discovered that YouTube's recommendation system was also contributing to the issue. Users who watched sexually explicit content were being recommended increasingly sexualized content, eventually leading to videos of young children. YouTube's algorithm was essentially training users towards pedophilia, and it wasn't just recommending intentionally uploaded content. It was also recommending normal home videos of children to adults seeking out sexually explicit material. This revelation highlights the dangers of relying on algorithms to curate content and the importance of addressing the root causes of harmful behavior online.

    • YouTube's Prioritization of Profits Over User Safety: Despite promising to protect minors, YouTube's algorithm may promote inappropriate content, and the company prioritizes profits over user safety. Human oversight could reduce harm, but YouTube chooses not to hire more moderators, exposing both moderators and users to extreme content.

      YouTube prioritizes increasing user engagement and making money over the safety and wellbeing of its users, particularly children. This was highlighted in a report revealing that the platform's algorithm may be promoting inappropriate content, including pornography and extremist material, to a large audience. Despite promising to prioritize responsibility and protecting minors, YouTube's actions suggest otherwise. They could reduce the danger by adding more human oversight to their AI algorithm, but instead, they focus on expanding their profits. Content moderators, who are often underpaid and work in less-than-ideal conditions, are tasked with monitoring content, including American political material, despite being based in other countries. Google, YouTube's parent company, makes billions of dollars a year, yet they could hire more moderators to improve content moderation but choose not to, as it would mean less profit for shareholders. The consequences of this prioritization can be devastating, with moderators being exposed to extreme and disturbing content, and users, particularly children, being exposed to harmful and inappropriate material.

    • Neglecting Content Moderators: Tech companies prioritize high-level employees, but neglect and underpay content moderators, who play a crucial role in maintaining ethical standards on their platforms.

      While tech companies like Google prioritize retaining and providing top-notch benefits for their high-level employees, they neglect and underpay content moderators, who play a crucial role in maintaining the ethical standards of their platforms. The irony lies in the fact that these companies, like Google, aim to keep their valuable employees for longer periods, but they do not extend the same care and consideration to those responsible for ensuring their platforms are safe and free from harmful content. This approach is problematic as it not only undervalues the importance of content moderation but also perpetuates a system where profits take precedence over ethical considerations and the well-being of employees.

    • Corporations prioritizing profits over ethics: Protecting children and preventing harm is worth more than financial gains. Hold corporations accountable, quit jobs, speak out, and seek alternative sources of entertainment.

      Corporations prioritizing profits over ethical considerations, such as allowing harmful content on their platforms, can have detrimental consequences on society. The speaker expresses frustration with this issue, using the example of YouTube's handling of inappropriate content, including child exploitation and hate speech. They argue that the value of protecting children and preventing harm is worth more than the financial gains made by allowing such content. The speaker also encourages individuals to take action, such as quitting their jobs at problematic companies or speaking out to the media, and suggests alternative sources of entertainment, like train videos, to provide a sense of calm in an uncertain world. Overall, the conversation highlights the importance of holding corporations accountable for their actions and fostering community connections to create a more ethical and connected society.

    • Explore new worlds with The Godfather game and ZYN nicotine pouches: Discover immersive experiences with The Godfather game or a smoke-free nicotine alternative with ZYN pouches, both offering unique journeys.

      Both The Godfather game and ZYN nicotine pouches offer unique experiences, inviting you to explore new worlds. In the former, you're welcomed into a powerful family with the promise of loyalty and rewards. The Godfather game, available at chumbacasino.com, offers an immersive experience with no purchase necessary. On the other hand, ZYN nicotine pouches provide a smoke-free, spit-free alternative to traditional nicotine products. With their fresh, discreet delivery, they offer a new way to satisfy your nicotine cravings, making your journey towards a smoke-free future more enjoyable. The ZYN 10 Challenge even offers 10 smoke-free days for just $5.95. Meanwhile, California is also inviting you to explore its offerings, from wine country and surfing to shopping and ski slopes. Whether it's through gaming, nicotine satisfaction, or travel, there's always something new to discover. Remember, The Godfather game contains content that's not suitable for all ages, and ZYN nicotine pouches contain addictive nicotine. Always check the terms and conditions before diving in.

    Recent Episodes from Behind the Bastards

    Part Two: How the British Empire and U.S. Department of Defense Murdered an Island Paradise

    Part Two: How the British Empire and U.S. Department of Defense Murdered an Island Paradise

    Robert killed a man in Reno, just to watch him die. Also he concludes the story of the murder of the Chagos Islands by the U.S. and the ailing British Empire. With bonus Dog Genocide!

     

    Behind the Bastards is doing its annual fundraiser for the Portland Diaper Bank! We had a soft start a week or so ago but will actually be plugging it this week and next. Please help if you can!

    https://www.gofundme.com/f/btb-fundraiser-pdx-diaper-bank?attribution_id=sl:a1a2d058-9511-435e-ab61-93bc1252ffa5&utm_campaign=pd_ss_icons&utm_medium=customer&utm_source=twitter 

    Sources:

    https://www.hrw.org/report/2023/02/15/thats-when-nightmare-started/uk-and-us-forced-displacement-chagossians-and

    https://archive.is/KvGqw#selection-1769.0-1781.535

    Vine, David. Island of Shame: The Secret History of the U.S. Military Base on Diego Garcia (p. 18). Princeton University Press. Kindle Edition.

    https://www.aljazeera.com/opinions/2019/2/25/how-britain-forcefully-depopulated-a-whole-archipelago/

    https://archive.org/details/webofdeceitbrita0000curt/page/432/mode/2up?q=chagos

    https://journals.openedition.org/oceanindien/2003

    See omnystudio.com/listener for privacy information.

    Behind the Bastards
    June 20, 2024

    Part One: How the British Empire and U.S. Department of Defense Murdered an Island Paradise

    Robert welcomes Andrew Ti back to the show to tell the story of the Chagos Islands, a paradise founded by former slaves that was wiped out by the British empire so they could lease it to the U.S. as an air base.

    (2 Part Series)

    Behind the Bastards is doing its annual fundraiser for the Portland Diaper Bank! We had a soft start a week or so ago but will actually be plugging it this week and next. Please help if you can!

    https://www.gofundme.com/f/btb-fundraiser-pdx-diaper-bank?attribution_id=sl:a1a2d058-9511-435e-ab61-93bc1252ffa5&utm_campaign=pd_ss_icons&utm_medium=customer&utm_source=twitter 

    Sources:

    https://www.hrw.org/report/2023/02/15/thats-when-nightmare-started/uk-and-us-forced-displacement-chagossians-and

    https://archive.is/KvGqw#selection-1769.0-1781.535

    Vine, David. Island of Shame: The Secret History of the U.S. Military Base on Diego Garcia (p. 18). Princeton University Press. Kindle Edition.

    https://www.aljazeera.com/opinions/2019/2/25/how-britain-forcefully-depopulated-a-whole-archipelago/

    https://archive.org/details/webofdeceitbrita0000curt/page/432/mode/2up?q=chagos

    https://journals.openedition.org/oceanindien/2003

    See omnystudio.com/listener for privacy information.

    Behind the Bastards
    June 18, 2024

    Related Episodes

    Superconductor Superconfusion, KOSA’s Hidden Costs and HatGPT

    Researchers in Korea claim they’ve identified a material that could unlock a technological revolution: the room temperature superconductor. Materials scientists are skeptical, but enthusiasts on Twitter are enthusiastic. Why is the internet so excited about superconductors?

    Then, the Kids Online Safety Act is headed to the Senate floor. Would it actually keep children safe? And how would it change the internet?

    Plus: Kevin and Casey play HatGPT.

    Additional Reading:

    Unintended Consequences

    Human innovation has transformed the way we live, often for the better. But as our technologies grow more powerful, so do their consequences. This hour, TED speakers explore technology's dark side. Guests include writer and artist James Bridle, historians Yuval Noah Harari and Edward Tenner, internet security strategist Yasmin Green, and journalist Kashmir Hill.

    Learn more about sponsor message choices: podcastchoices.com/adchoices

    NPR Privacy Policy

    Indoor vs. Outdoor Pools | Hold Up with Dulcé Sloan & Josh Johnson

    “When there’s lightning, I can swim in my pool. When it's a rainy day, I can swim in my pool. When it's just nasty outside, I can swim in my pool. My pool indoors is consistent.”- Josh Johnson

    “For me, the outdoor pool has represented a lifestyle. Because a pool day is a sunny day. We are drinking a nice beverage, we’re out here just judging others, really taking in the scene.” - Dulcé Sloan

    It’s an “aquatic argument” of indoor vs. outdoor pools, this week on Hold Up with Daily Show correspondent Dulcé Sloan and writer Josh Johnson.

    Hold Up is a podcast from The Daily Show. Listen to new episodes every Thursday wherever you get your podcasts, or watch at YouTube.com/TheDailyShow



    See omnystudio.com/listener for privacy information.

    Invention Playlist 4: Barbed Wire

    You may very well encounter barbed wire every day -- and, in all likelihood, you see THROUGH this flesh-ripping barrier. Where did this invention come from? How did it change the world? Robert and Joe discuss “the devil’s rope” in this episode.

    Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

    See omnystudio.com/listener for privacy information.