
    Podcast Summary

    • Anonymous messages target cheerleaders and parents in Bucks County
      Deepfake technology was used to spread disturbing content and threats to cheerleaders and their parents, raising concerns about privacy, security, and the potential consequences of anonymous online communication.

      The world of cheerleading has evolved into a highly competitive travel-team sport, where parents and athletes face unique challenges. In Bucks County, a group of cheerleaders and their parents were targeted by anonymous messages containing disturbing content. The messages included photos and videos of the girls engaging in risky behaviors, as well as threatening language. The perpetrator was unknown, and it was unclear whether the same person was behind the messages sent to the parents and those sent to the cheerleaders. This case, involving deepfake technology, is a unique and concerning development for journalism and community safety, and it raises important questions about privacy, security, and the potential consequences of anonymous online communication.

    • Mom uses deepfakes to spread lies among cheerleading squad
      Deepfakes can manipulate media, causing harm and confusion, even within close-knit communities. Digital literacy and awareness are crucial to prevent potential consequences.

      Technology can be used to create deepfakes, manipulated media that can cause harm and confusion, even within a seemingly close-knit community like a cheerleading squad. In this case, a mom used deepfakes to spread lies and discord among the team members, causing distress and leading to police involvement. The deepfakes, presented as real images, were created using existing photos from social media and manipulated with technology. This incident highlights the evolving nature of deepfakes and the potential consequences they can have on individuals and their relationships. It also underscores the importance of digital literacy and the need to be aware of the potential for manipulated media in our increasingly digital world.

    • Deepfakes: Creating Convincing Fake Media with Ease
      Deepfakes pose significant threats to privacy and security, particularly for women and marginalized communities, with potential consequences ranging from embarrassment to political instability and war.

      Deepfakes, manipulated media that can create convincing fake videos, images, or audio, are becoming more accessible and easier to produce, posing significant threats to privacy and security, particularly for women and marginalized communities. The history of manipulated media dates back to the 1930s, but deepfakes, which emerged from a Reddit community in 2017, have raised new concerns due to their potential for creating nonconsensual porn and political manipulation. Experts like Danielle Citron and Hany Farid warn that the ability to create deepfakes is increasingly in the hands of the masses, requiring fewer and fewer images or videos of a person. The consequences of deepfakes can range from embarrassment and humiliation to more serious harm, such as political instability or even war. It's essential to be aware of this emerging threat and take steps to protect ourselves and our online presence.

    • Deepfakes beyond pornography: fraudulent activities and offline creation
      Deepfakes pose a significant concern due to their ease of creation and potential for fraudulent activities, including voice cloning for financial gain. Stay informed and take precautions to protect against their negative impact.

      Deepfake technology has become more accessible and is being used in ways that go beyond nonconsensual pornography. Deepfakes are not limited to online videos; they can also be created offline, enabling fraudulent activities such as voice cloning for financial gain. The potential misuse of the technology is a significant concern, and the ease with which deepfakes can be created raises questions about accountability and trust. The consequences can be severe, including financial losses, damage to reputation, and even criminal charges. Staying informed and taking precautions is crucial, and the growing prevalence of deepfakes highlights the need for greater awareness, education, and regulation to mitigate their negative impact.

    • Deepfake technology: Creating convincing manipulations, but with limitations
      Deepfake technology can generate convincing visual and audio manipulations, but the results may not perfectly resemble the original person, and the technology raises ethical concerns.

      Deepfake technology is accessible and can be used to create convincing visual and audio manipulations, but the results may not perfectly resemble the original person. Amory and Anne Marie attempted to deepfake each other using various methods: Amory created a visual deepfake of Anne Marie as Rebecca Black, and Anne Marie created an audio deepfake of herself. Both ran into the technology's challenges and limitations; the process was time-consuming, and neither was fully satisfied with the results. The experience highlights the current state of deepfake technology and the difficulty of creating realistic, convincing manipulations. The discussion also touched on ethical concerns and the potential misuse of this technology.

    • The Power of Deepfakes to Deceive
      Deepfakes, even if not authentic, can spread as truth, enabling liars to deny reality, highlighting the need for awareness and expert consultation to prevent misinformation.

      Deepfakes, while a growing concern in their own right, can also be exploited as a tool for deception even when no fake exists. In the case of Madi Hime, allegations that a video of her circulating online was a deepfake turned out to be unfounded. The lack of evidence confirming the original video's authenticity, combined with authorities' inability to consult forensic experts, allowed the lie to persist. This incident illustrates the power of the "liar's dividend," a term coined by law professors Danielle Citron and Robert Chesney, whereby the mere existence of deepfakes allows liars to dismiss real evidence as fake. This was evident in the case of former President Trump, who denied the authenticity of a tape revealing his controversial remarks about women, even after its release. As deepfakes become more prevalent, it is crucial to increase awareness and consult experts to verify authenticity, so that misinformation and deception do not spread unchecked.

    • Humans' ability to detect deepfakes is better than expected
      While deepfakes pose risks, humans can still detect average ones, but the threat to democracy remains significant. Efforts to combat deepfakes are ongoing.

      While deepfakes pose significant risks, particularly in the areas of nonconsensual imagery, fraud, and election interference, humans' ability to discern fact from fiction may not be as poor as some reports suggest. Research by a PhD student at the MIT Media Lab found that people are relatively good at detecting average deepfakes, particularly when the videos are uncontroversial and only a few seconds long. Even so, deepfakes pose an existential threat to democracy, as the ability to deny basic facts can undermine the very foundation of society. Efforts to combat deepfakes are underway, including proposals to add authentication tokens to raw original photos and the use of AI detection by tech companies like Facebook and Google. Ultimately, the collective desire for safety and authenticity may drive marketplace and regulatory responses, but that process could take years.

    • Struggling to Identify Deepfakes
      People find it hard to distinguish deepfakes from real videos, making it essential to stay informed and vigilant against manipulated media.

      While people can be quite accurate in distinguishing deepfakes from real videos when the two are presented side by side, they struggle to identify manipulations when viewing a single video on its own. Deepfakes, which can involve hiring actors and visual effects artists to create convincing but fake videos, can be difficult to detect even for experts. However, as the technology and awareness around deepfakes continue to evolve, there is hope that this is a problem we can collectively solve. Creating and distributing deepfakes is not only a technical challenge but also a legal and ethical one, as seen in the case of Raffaela Spone, who was convicted of cyberbullying for sending anonymous messages and real images, but not for creating deepfakes. Overall, deepfakes pose a significant threat to individuals and society, and it is crucial that we remain vigilant and continue to explore solutions to this complex issue.

    • Markets vs. Regulation: Balancing Profit and Externalities
      The profit motive and ESG initiatives can address externalities, but regulation is still necessary for meaningful progress.

      The profit motive and environmental, social, and governance (ESG) initiatives can work together to address externalities and promote positive change, but they are not a complete solution on their own. During a recent event at the BU Questrom School of Business, speakers Andy King and Witold Henisz debated the role of markets versus regulation in addressing externalities. Some argue that regulation is necessary to solve externalities, while others believe the profit motive can be a powerful tool for change. However, the profit motive is not directly linked to externalities, and regulation will still be necessary to drive meaningful progress. At the same time, the massive capital reallocation required for the climate transition and other social justice issues necessitates the involvement of investors and the profit motive. To learn more about this topic, listen to the full episode of "Is Business Broken" on your preferred podcast platform.

    Recent Episodes from Endless Thread

    This is Not a Pyramid Scheme

    Every year, thousands of Americans lose money participating in multi-level marketing (MLM). So, last year, when a new business idea that promised to correct MLM's sins bubbled up on Instagram and TikTok, a lot of people hopped off the MLM train, and onto this new one, lured by the promise of a low-lift and lucrative side hustle.

    This new business idea is called "master resell rights." But what exactly is it? Where did it come from? And does it actually solve any of MLM's problems? Endless Thread investigates.

    *****

    Credits: This episode was produced by Grace Tatter. Mix and sound design by Emily Jankowski. It was hosted by Ben Brock Johnson, Amory Sivertson, and Grace Tatter.

    Endless Thread
    June 27, 2024

    Worm Wars

    When Endless Thread producer Nora Saks learned that a "toxic, self-cloning worm that poops out of its mouth" was invading Maine, she started sounding the alarm about the impending eco-doom.

    Until, that is, state experts clued her in to the "real threat": a different creepy crawly wriggling towards the Pine Tree State's gardens and precious forests, and fast. In this rebroadcast from January 2023, Endless Thread tunnels down a wormhole, encountering a long history of xenophobic rhetoric about so-called invasive species, and some hard truths about the field of invasion biology itself.

    Endless Thread
    June 21, 2024

    Looking for a Man, Finding a Record Deal

    In April, a TikTok creator mused, "Did I just write the song of the summer?" Girl on Couch's "Looking for a man in finance" song spawned hundreds of remixes, and won her a record deal. While it might seem remarkable that a five-second TikTok sound can command the attention of pop music kingmakers, the industry has been capitalizing on internet memes for decades. Endless Thread takes a crash course in internet meme pop music history.

    Credits: This episode was produced by Grace Tatter. Mix and sound design by Emily Jankowski. The hosts are Amory Sivertson, Ben Brock Johnson, and Grace Tatter.

    Endless Thread
    June 14, 2024

    Scamming the Scammers

    Border Patrol is calling: A drug cartel has your bank information, so you need to transfer all your money to a safe Bitcoin account—right now!

    Millions of people will be familiar with calls like this, in which scammers, often in other countries, use threats or promises to rob you. In 2023, individuals and businesses lost an estimated $485 billion to fraud schemes, according to Nasdaq's Global Financial Crime Report.

    Law enforcement will only do so much to recover losses. That is why some online streamers are taking matters into their own hands. And they have become famous for fighting back.

    Endless Thread's Ben Brock Johnson and Amory Sivertson explore the complicated, criminal world of scambaiters.

    *****

    Credits: This episode was produced by Ben Brock Johnson and Dean Russell. Mix and sound design by Emily Jankowski. It was hosted by Ben Brock Johnson and Amory Sivertson.

    Endless Thread
    June 07, 2024

    SwordTube, En Garde!

    Sword influencers abound on YouTube. Those who specialize in historical European martial arts, or HEMA, have gained legions of fans by showcasing the fantastic bladed techniques of yore.

    But talk of parries and pommels has recently given way to bigotry. Endless Thread's Ben Brock Johnson speaks with co-host Amory Sivertson about one valiant influencer fighting back.

    *****

    Credits: This episode was produced by Ben Brock Johnson and Dean Russell. Mix and sound design by Emily Jankowski. The hosts are Amory Sivertson and Ben Brock Johnson.

    Endless Thread
    May 31, 2024

    Gen Z wants you to take political action, one TikTok at a time

    Gen Z is over it. The youngest generation of adults is inheriting a climate crisis, the ongoing fallout from a global pandemic, a polarized political landscape, and a tenuous economic reality.  And many Gen Z members, a generation more likely to identify as progressive than conservative, are ready for something to give.

    Enter: Gen Z for Change — a youth-led non-profit that brands itself as "the place where the creator economy and progressive politics intersect on social media." The group leverages a hundreds-deep network of social media creators to spread calls to action over TikTok. They've also pulled on the programming expertise within their team to develop a cache of semi-automated tools that take the guesswork out of engaging with their political agenda.

    Their latest tool, "Ceasefire Now!!" takes these efforts one step further — resulting in, by Gen Z for Change's count, two million emails calling for a ceasefire in Gaza hitting the inboxes of elected representatives in Washington every day.


    Endless Thread
    May 24, 2024

    Catfish for dinner

    After Taylor Paré was stood up on a date, she turned to TikTok. In a now-viral video, she claimed to have uncovered a new scheme to scam singles looking for love on the internet. Endless Thread investigates.

    =====

    Credits: This episode was written and produced by Grace Tatter. Mix and sound design by Paul Vaitkus. The hosts are Ben Brock Johnson and Grace Tatter.

    Endless Thread
    May 17, 2024

    Hype Cycle

    The Vision Pro is Apple's new $3,500 virtual reality headset.

    Since its debut in February, users have found new ways to use this latest iteration of a decades-old technology: scrolling TikTok at work, driving Tesla's Cybertruck, recording their kid's birth.

    But can VR truly integrate into our daily lives? Or will it forever remain a niche technology for geeks and gamers?

    Endless Thread dives into the history of VR and its potential for the future.

    =====

    Credits: This episode was written and produced by Cici Yongshi Yu. Mix and sound design by Emily Jankowski. The hosts are Ben Brock Johnson and Amory Sivertson.

    Episodes We Love: Doom Jelly

    Imagine sitting in a hospital room for 24 consecutive hours in the most agonizing pain you can possibly imagine. You feel a sense of impending doom. You have a feeling this won’t end well. Then, the pain subsides and you walk away. Jamie Seymour has had that experience eleven different times. He’s a leading expert on one of the world’s most frightening creatures and he’s paid the price.

    This episode originally aired on Oct 12, 2018.

    The Jackie Show

    Our interactions with nature are increasingly mediated by technology. We scroll through wildlife feeds on TikTok. We use Instagram to plan hikes. Even in the wilderness, we religiously bring our phones to document the experience. And then there are animal cams.

    Since the 1990s, people have fawned over livestreams of cute pandas and colorful fish. One could argue that animal cams are another example of how we've jammed a screen between ourselves and the wild. But the story of Jackie the bald eagle presents a different perspective: one in which technology might bring us closer to our fellow creatures.

    Producer Dean Russell speaks with Endless Thread co-host Ben Brock Johnson about the potential upsides of technonaturalism.

    =====

    Credits: This episode was written and produced by Dean Russell. Mix and sound design by Emily Jankowski. The hosts are Ben Brock Johnson and Dean Russell.

    Related Episodes

    Warped Reality
    False information on the internet makes it harder and harder to know what's true, and the consequences have been devastating. This hour, TED speakers explore ideas around technology and deception. Guests include law professor Danielle Citron, journalist Andrew Marantz, and computer scientist Joy Buolamwini.


    The Dark Side of Deepfakes: How Misinformation Can Be Weaponized

    "Deepfakes: The Opportunities and Risks of Synthetic Media"Deepfakes are a form of synthetic media that use artificial intelligence to manipulate audio and video to create realistic simulations of real people. While deepfake technology has the potential to be used for harmless entertainment or practical purposes, it also presents significant risks to society. The potential for the spread of misinformation or disinformation, harm to individuals, perpetration of fraud or other crimes, and devaluation of visual evidence are all potential concerns. As deepfake technology continues to evolve, it will be important for society to carefully consider its potential uses and implications.

    KURIOUS - FOR ALL THINGS STRANGE

    EP49 | Be careful with online dating! How can you protect yourself? feat. teachers 玫如 & 姿瑩
    [Episode highlights] Have you ever used a dating app or website? Do you fully trust the photos and self-introductions you see on dating apps and sites? When meeting an online friend in person, where should you meet, and what should you watch out for? There is no such thing as a free lunch; what's free is often the most expensive. In an online world where truth and falsehood mix, stay alert to scams so that nothing regrettable happens. [Guests] Section Chief 侯玫如, Taipei Municipal Zhongzheng Senior High School; teacher 吳姿瑩, Taipei Municipal Datong High School. 📢 Want to learn more about information literacy and ethics? Come find us on the eliteracy website! 👉 eliteracy link: https://linktr.ee/funsurfing

    142: YouTube's Updated Creator Priorities (October 2018), Tim Cook on Privacy & Security, and More!
    On episode 142 of the BSP I talk about YouTube's updates to the Creator Priorities in October 2018, Apple blocking the Gray Key exploit, Tim Cook's speech regarding privacy and security regulation in the United States and the implications it would have, Facebook going the way of Myspace, Logitech launching new camera capture software, PewDiePie vs. T-Series and why it matters, and a lot more.
     
    Subscribe to the full audio podcast: http://www.bandrewsays.com
     
    Twitter: @bandrewsays
    Ask Questions: AskBandrew@gmail.com
     
    00:00 - Intro
    01:10 - YouTube Updates 2018 Priorities (October 2018)
    12:30 - Apple Blocked Gray Key
    13:25 - Tim Cook Gives a Middle Finger to Google & Facebook (Privacy in the United States) 
    24:18 - Facebook Going the Way of Myspace
    25:14 - Logitech Launching a new Camera Capture Software
    26:54 - Sennheiser MKH416 Summary
    27:53 - PewDiePie vs T-Series and Why It’s Important
    30:52 - Is Facebook Really Aperture?
    31:21 - What do you do to prepare for your podcast?
    33:21 - Should I get a Zoom Recorder or Can I Connect a Scarlett Solo to the Samsung S9?
    36:55 - Ask Bandrew
    37:23 - Email 1
    38:15 - Should Every Modern Gentleman Own a Suit, and How Much Should They Pay?
    43:12 - Email 2
    43:46 - Mixer or Interface for Music?
    44:38 - Interface for 4 Monitors?
    46:06 - Email 3
    47:03 - How Do I Get the Universal Audio Arrow to Work with Discord using Loopback Audio?
    49:48 - Outro

    #95 – Dawn Song: Adversarial Machine Learning and Computer Security
    Dawn Song is a professor of computer science at UC Berkeley with research interests in security, most recently with a focus on the intersection between computer security and machine learning.

    Support this podcast by signing up with these sponsors:
    – Cash App – use code "LexPodcast" and download:
    – Cash App (App Store): https://apple.co/2sPrUHe
    – Cash App (Google Play): https://bit.ly/2MlvP5w

    EPISODE LINKS:
    Dawn's Twitter: https://twitter.com/dawnsongtweets
    Dawn's Website: https://people.eecs.berkeley.edu/~dawnsong/
    Oasis Labs: https://www.oasislabs.com

    This conversation is part of the Artificial Intelligence podcast. If you would like to get more information about this podcast, go to https://lexfridman.com/ai or connect with @lexfridman on Twitter, LinkedIn, Facebook, Medium, or YouTube, where you can watch the video versions of these conversations. If you enjoy the podcast, please rate it 5 stars on Apple Podcasts, follow on Spotify, or support it on Patreon.

    Here's the outline of the episode. On some podcast players you should be able to click the timestamp to jump to that time.

    OUTLINE:
    00:00 - Introduction
    01:53 - Will software always have security vulnerabilities?
    09:06 - Humans are the weakest link in security
    16:50 - Adversarial machine learning
    51:27 - Adversarial attacks on Tesla Autopilot and self-driving cars
    57:33 - Privacy attacks
    1:05:47 - Ownership of data
    1:22:13 - Blockchain and cryptocurrency
    1:32:13 - Program synthesis
    1:44:57 - A journey from physics to computer science
    1:56:03 - US and China
    1:58:19 - Transformative moment
    2:00:02 - Meaning of life