
    Podcast Summary

    • Search Engines and Tech Platforms as Surveillance and Manipulation Tools: Avoid Google and similar platforms to protect privacy and maintain autonomy by using alternative search engines and de-googled devices.

      Search engines like Google, and other tech platforms such as Android phones and devices with personal assistants, function as surveillance and manipulation tools, collecting vast amounts of personal data and influencing opinions, purchases, and even elections. These companies exploit user data to target individuals with ads and manipulate their behavior. The speaker recommends avoiding Google and similar platforms to protect privacy and maintain autonomy. Instead, consider using alternative search engines and de-googled devices. For instance, the speaker uses a phone with a different operating system that doesn't track or upload user data. Since 2014, the speaker has not received targeted ads on their devices, demonstrating the effectiveness of this approach.

    • Online services come with a cost to privacy: Consider using privacy-focused alternatives like the Brave browser and search engine for greater control over online data, despite potential monthly fees.

      While using "free" online services may seem convenient, they often come with a cost to your privacy. These services collect data about you, which can be used for targeted advertising or even sold to third parties. The speaker suggests using privacy-focused alternatives like the Brave browser and search engine, which have minimal tracking and are faster than popular browsers like Chrome. These services may cost a small monthly fee but offer greater control over your online data. The speaker's personal experience with Google's censorship and control over access to the internet through its search engine highlights the importance of being aware of the potential risks and limitations of relying on a single dominant player in the tech industry.

    • Google's power to manipulate information and influence public opinion: Google's search engine, and voice assistants such as Amazon's Alexa, can manipulate information and potentially sway public opinion through blacklists and algorithms, as demonstrated by an incident in which Google reportedly shut down the internet and by experiments showing that a single interaction can shift opinions significantly.

      Google, driven by its geek culture, has the power to manipulate information and potentially influence public opinion through its search engine, and the same concern extends to voice assistants such as Amazon's Alexa. This was exemplified by an incident in which Google reportedly shut down the internet during an early Saturday morning hour when all stock markets were closed, demonstrating that it could do so without attracting unwanted attention. The company's use of blacklists and algorithms can subtly manipulate users: experiments show that a single question-and-answer interaction with a voice assistant like Alexa can shift opinions by significant margins. While it is unclear whether such manipulation is intentional or simply a byproduct of surfacing the best answer, it highlights the immense power Google holds in shaping public discourse.

    • Online algorithms and technology aren't neutral: Human biases and intentions can influence online algorithms, shaping what we see and think, with potential consequences for opinions and behaviors. Google's power to control internet access also raises transparency and accountability concerns.

      While algorithms and technology play a significant role in shaping the information we receive online, they are not neutral entities. Human biases and intentions can influence how these systems are programmed and what they surface. From the Google Street View scandal involving rogue programmers to executive-level mandates altering algorithms, the impact on our thinking, opinions, and behaviors can be substantial and often goes unnoticed. It is crucial to recognize that what we don't see online may be just as important as what we do see. Additionally, the extent of Google's power to control internet access, such as shutting down entire domains, raises questions about transparency and accountability.

    • Google's Control Over the Internet: Google's dominance in browsing and search allows it to exert significant control over internet access, with no clear regulations limiting its actions.

      Google holds significant control over the internet through its popular browser (Chrome), its search engine (used for roughly 92% of global searches), and the fact that many other search engines and browsers rely on Google for safety checks and information. This control extends to the ability to block access to websites through these platforms. The discussion also touched on the differences in how search engines operate: Google actively crawls the internet for updates and links, while DuckDuckGo functions largely as a database aggregator. The conversation also highlighted the lack of regulation or laws restricting Google's actions, which makes this control even more consequential, and hinted at the suppression of conservative content in recent years, though that was not a central focus of the discussion.

    • Google's Power to Block Websites: Google holds immense power to control access to information through its search engine and websites, with no legal restrictions or transparency in blocking websites, particularly those of political figures.

      Google, as a private company, holds significant power over what information is accessible through its search engine and websites. This power extends to making decisions about the quality and accessibility of websites, including those of political figures such as Donald Trump's campaign site. Google can block access to websites without justification or transparency, and there are currently no relevant laws or regulations in place to prevent this. Google's founders, Sergey Brin and Larry Page, may have initially had utopian intentions, but the company has since transformed into a profit-driven advertising giant that collects and monetizes user data. Former Google executive James Whittaker famously quit the company, stating that it had become a brutal, profit-driven ad company and was no longer the cool place it once was. Google's slogan "Don't be evil" was dropped in 2015.

    • Google's Secretive Blacklists Targeting Conservatives: Google's blacklists, like the quarantine list, manipulate access to information for financial gain and value imposition, primarily targeting conservative organizations and raising concerns about suppression of certain viewpoints.

      Google, a tech giant, has been manipulating access to information through various blacklists, including a quarantine list that affects even browsers like Safari and Firefox. At a 2019 Senate hearing, a top Google executive denied under oath that such blacklists existed; leaked internal documents later showed otherwise, and Google pursued the employee who had leaked them. These blacklists primarily target conservative organizations, raising questions about financial motivations, deals with politicians, or the deliberate suppression of certain viewpoints. The speaker argues that Google's primary motives are financial gain and the imposition of values the company believes are superior to others. This immense power to influence global thinking raises serious concerns.

    • Google's Influence on Public Opinion and Consciousness: Google's search engine and suggestions can shift opinions and behaviors, while its collaboration with intelligence agencies raises privacy concerns in democratic societies.

      Google, through its advanced divisions and tools like search engines and suggestions, has the capability to significantly influence public opinion and consciousness, and to collaborate with intelligence agencies. This influence extends beyond search engines to other tools and techniques, some of which have been studied extensively to understand their impact on thinking and behavior. The search engine manipulation effect (SEME) and the search suggestion effect (SSE) are two such examples, each capable of shifting the balance of opinion among undecided voters without their knowledge. Google's collaboration with intelligence agencies, dating back to the 1990s and involving the tracking and preservation of search histories, was described as a legitimate use of the technology in the context of national security. However, the implications of this influence for democratic processes and individual privacy are significant and warrant further discussion.

    • Google manipulates search suggestions to influence public opinion: By suppressing negative suggestions for favored candidates or causes while allowing them to appear for others, Google can steer attention toward negative results and potentially shape opinion, particularly among undecided voters.

      Search engines like Google can manipulate search suggestions by suppressing negative suggestions for certain candidates or causes while allowing negatives to appear for others. This draws people's attention toward negative search results for the opposing candidate or cause, leading them to websites that make that person or cause look bad. The manipulation exploits the negativity bias, which draws our attention to negatives over positives. The search suggestion feature was first rolled out around 2009 as an opt-in feature, but it later became automatic and could no longer be turned off. The number of suggestions was also cut from ten to four, which the speaker describes as the optimal number for maximizing control over people's searches. This manipulation can significantly influence public opinion, particularly among undecided voters.
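
      The mechanism is easy to see with a toy model. The sketch below assumes that negative suggestions attract clicks more often than neutral ones, then compares a candidate whose negative suggestions are suppressed with one whose are not; every number in it is invented purely for illustration and does not come from the speaker's research.

```python
import random

random.seed(0)

# Toy parameters -- invented for illustration only, not taken from any study.
P_CLICK_NEGATIVE = 0.45   # negativity bias: negative suggestions attract more clicks
P_CLICK_NEUTRAL = 0.15
N_USERS = 100_000

def negative_exposure_rate(n_negative_slots: int, total_slots: int = 4) -> float:
    """Fraction of simulated users who click a negative suggestion about this candidate."""
    exposed = 0
    for _ in range(N_USERS):
        for slot in range(total_slots):
            is_negative = slot < n_negative_slots
            p_click = P_CLICK_NEGATIVE if is_negative else P_CLICK_NEUTRAL
            if random.random() < p_click:
                if is_negative:
                    exposed += 1
                break  # the user follows the first suggestion they click
    return exposed / N_USERS

# Candidate A: negative suggestions suppressed; Candidate B: two of four slots are negative.
print("Candidate A (0 negative slots):", negative_exposure_rate(0))
print("Candidate B (2 negative slots):", negative_exposure_rate(2))
```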

    • Google manipulates search suggestions based on business relationships and advertising agreements: Google's search suggestions are influenced by business deals and personalized based on user history, potentially overshadowing new companies and limiting internet freedom.

      Our online search experience is manipulated by companies like Google, and this manipulation begins with the very first character we type into the search box. Four is the number of suggestions that maximizes control over what people search for, and these suggestions are heavily influenced by business relationships and advertising agreements. For instance, typing "A" surfaces Amazon, reflecting the business relationship between the two companies: Amazon is Google's largest advertiser, and vice versa. The first suggestions are personalized based on our search history and interests, but for most people, four out of five suggestions will be for Google itself; if you're starting a new company, avoiding the letter "G" in its name might be a good idea to avoid being overshadowed by these dominant players. The internet was meant to be a public library, but it has evolved into a surveillance business model controlled by a few large monopolies, including Google, Apple, and Microsoft. While some companies like Apple and Microsoft have made efforts to limit data collection, they still collect user information and can change their practices at any time. Ultimately, it is essential to be aware of these manipulations and make informed decisions about our online behavior.
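
      Readers who want to see what the suggestion box actually returns for a given prefix can sample it directly. The minimal sketch below queries Google's autocomplete endpoint, with caveats: suggestqueries.google.com is an unofficial, undocumented endpoint whose format and availability can change at any time, and a logged-out script sees only generic, non-personalized suggestions, not the personalized ones described above.

```python
import json
import urllib.parse
import urllib.request

# Unofficial, undocumented autocomplete endpoint; response format and
# availability are not guaranteed and may change without notice.
SUGGEST_URL = "https://suggestqueries.google.com/complete/search?client=firefox&q="

def fetch_suggestions(prefix: str) -> list[str]:
    """Return the autocomplete suggestions offered for a query prefix."""
    url = SUGGEST_URL + urllib.parse.quote(prefix)
    with urllib.request.urlopen(url, timeout=10) as resp:
        # For this client format the payload is [prefix, [suggestion, ...]].
        payload = json.loads(resp.read().decode("utf-8", errors="replace"))
    return payload[1]

if __name__ == "__main__":
    for prefix in ["a", "g"]:
        # Only the first four suggestions are typically shown to users.
        print(prefix, "->", fetch_suggestions(prefix)[:4])
```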

    • Google and Microsoft's secret pact and privacy concerns: Under a secret 2016 deal, Google and Microsoft dropped their complaints against each other, and Bing's search results may eventually be sourced from Google. Modern operating systems' aggressive tracking capabilities raise privacy concerns, echoing Eisenhower's warning about a technological elite controlling public policy.

      The relationship between tech giants Google and Microsoft, and the growing dominance of technology companies in our lives, has raised significant concerns about privacy, manipulation, and control. In 2016, Google and Microsoft signed a secret pact under which their complaints against each other were dropped, and there was speculation that Bing's search results could eventually be sourced from Google. Additionally, the aggressive tracking capabilities of modern operating systems like Windows 10 and 11 raise privacy concerns. Eisenhower's warning in his 1961 farewell address about public policy becoming captive to a technological elite seems particularly relevant today, with companies like Google, Apple, and Facebook holding immense power and influence. The potential for negative consequences, such as the creation and monetization of turmoil and chaos, highlights the need for regulation and transparency in the tech industry.

    • Algorithms influencing users' interests: Algorithms, like YouTube's "Up Next," can manipulate users' decisions by suggesting content based on their past engagement, potentially influencing their opinions and choices without their knowledge.

      Algorithms, such as those used by social media platforms like Facebook and YouTube, reflect users' interests, both positive and negative. While it is not the platforms' fault that people tend to engage more with negative content, it is their responsibility not to manipulate users with this information. For instance, 70% of the YouTube videos watched worldwide are suggested by the platform's "Up Next" algorithm, which can be manipulated to influence users' decisions, as studies and experiments have shown. Websites used for political opinion matching, as well as dating apps like Tinder, employ similar algorithms that can sway users' choices without their knowledge. It is crucial to be aware of this manipulation and to insist on transparency in these systems.

    • Manipulation in Technology: Illusions of Personalized Attention. Technology companies manipulate users with illusions of personalized attention, exploiting vulnerabilities and uncertainty, with potentially serious consequences for policy, elections, and public narratives, yet political parties take little action because of the financial support these companies provide.

      Technology companies, particularly those that offer quizzes and recommendations, manipulate users by creating an illusion of personalized attention and credibility, even when they're not considering individual answers. This manipulation is effective because it targets people who are unsure and vulnerable, and the implications of this kind of manipulation can have significant effects on policy, elections, and public narratives. Despite the potential harm, there seems to be a lack of urgency or action from political parties due to the financial support these companies provide. The situation is further complicated as technology continues to advance, giving these companies even more control over our virtual lives. For instance, the metaverse and potential cryptocurrencies could further increase their influence. It's crucial to remember the ethical guidelines these companies claim to uphold, such as "don't be evil," but the reality may be more complex. It's essential to stay informed and question the intentions behind the technology we use.

    • The Intersection of Corporate Values and Politics: Google's "don't be evil" ethos and wokeness faced scrutiny, with some questioning whether it is a profit strategy. Dr. Robert Epstein shared personal experiences of political labeling and a mysterious death.

      The intersection of corporate values and politics can lead to controversial situations. The discussion centered on Google, a company known for its "don't be evil" ethos but also for controversial policies and employee demonstrations. The company's wokeness was questioned, with some suggesting it may be a strategy for profitability. Dr. Robert Epstein shared his experience of being labeled a conservative right-wing figure despite describing himself as politically neutral. He also recounted an unsettling encounter with a state attorney general who warned him that he might be the victim of an arranged accident; his wife later died in a crash, and the circumstances surrounding her death were mysterious, with her truck disappearing and later being sold to someone in Mexico. The conversation underscores the potential risks and complexities that arise when corporate values and politics collide.

    • Crimes Committed with Technology and Algorithms: Technology and algorithms can be used to commit crimes, suppress information, and manipulate public opinion, highlighting the importance of ethical use and oversight.

      Technology and algorithms, when manipulated by powerful entities, can be used to commit crimes, suppress information, and even potentially assassinate individuals. This was discussed in relation to the journalist Michael Hastings and the suspicious circumstances surrounding his death. The concern raised is that these crimes can go undetected and unpunished due to the complexity of technology and the power held by those who control it. Another concern is the potential for these entities to suppress information and manipulate public opinion through algorithms and targeted advertising. The discussion also touched upon the possibility of hacking into hospital computers to alter medication dosages as a means of assassination. It's important to remember that algorithms are written and modified by people, and their impact on our lives should not be taken lightly.

    • Google's online profiling raises concerns of bias and manipulation: A 2016 study found that Google search results showed pro-Clinton bias, potentially impacting the election and highlighting the need for transparency and accountability when technology is used to influence public opinion.

      Technology companies like Google have the ability to create detailed profiles on users based on their online activity, which allows them to distinguish between real people and bots or automated traffic. This profiling enables these companies to tailor content to individual users, but it also raises concerns about potential bias and manipulation of information. In 2016, a study was conducted to investigate this issue by recruiting field agents and monitoring their online searches. The results showed significant pro-Hillary Clinton bias on Google search results, but not on Bing or Yahoo. This bias could have potentially shifted millions of votes in the 2016 election without anyone's knowledge. Similar findings were observed in the 2018 midterms and 2020 election. The study's findings suggest the importance of transparency and accountability in the use of technology to influence public opinion and elections.
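
      To make the idea of measuring such bias concrete, here is a minimal sketch of a rank-weighted scoring scheme: raters label each result on a captured results page as favoring one candidate, the other, or neither, and the labels are weighted by rank because the top results receive most of the attention. The weights and labels are illustrative assumptions, not the methodology of the study described above.

```python
# Hypothetical rank-weighted bias score; the weights and labels are
# illustrative assumptions, not the study's actual scoring method.

# Rough click-through weight by rank: top results dominate attention.
RANK_WEIGHTS = [0.30, 0.15, 0.10, 0.08, 0.06, 0.05, 0.04, 0.03, 0.02, 0.02]

def bias_score(labels: list[str]) -> float:
    """labels[i] is 'A', 'B', or 'neutral' for the result at rank i + 1.

    Returns a value in [-1, 1]: positive means the page leans toward
    candidate A, negative toward candidate B, 0 means balanced or all-neutral.
    """
    score, total = 0.0, 0.0
    for weight, label in zip(RANK_WEIGHTS, labels):
        if label == "A":
            score += weight
        elif label == "B":
            score -= weight
        total += weight
    return score / total if total else 0.0

# Example: a results page where the top three results favor candidate A.
page = ["A", "A", "A", "neutral", "B", "neutral", "B", "neutral", "A", "neutral"]
print(round(bias_score(page), 3))
```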

    • Monitoring tech companies during elections: Public exposure of manipulation led to change. A permanent monitoring system covering all 50 states is proposed to prevent future bias and manipulation.

      Monitoring and transparency are key to addressing potential manipulation and bias in tech companies' practices, particularly during elections. The example of Google's actions in Georgia shows the impact of public exposure: once the manipulation was made public, it changed. A permanent, large-scale monitoring system covering all 50 states is proposed as a solution, with a consortium of non-profit organizations potentially leading the effort. The discussion also emphasized the importance of monitoring emerging technologies, including AI-powered personal assistants, and pointed to the potential for targeted manipulation through vote reminders as a further reason to demand transparency in tech companies' practices.

    • Manipulating public opinion through ephemeral experiences: Google's vast reach and influence allow it to manipulate public opinion through ephemeral experiences; preventing this requires capturing and analyzing what platforms show to a representative sample of users, with proper security measures.

      Ephemeral experiences, brief online interactions that affect users and then disappear without a trace, are a powerful tool for manipulating public opinion and potentially influencing election outcomes. Google, with its vast reach and influence, has the capability to leverage these ephemeral experiences to shape people's views, and because of their transient nature the manipulation often goes undetected. The only way to detect and deter such manipulation is to capture and analyze what these platforms show to a representative sample of users, which requires careful recruitment, training, and security measures to ensure the integrity of the data. The ultimate goal is to raise awareness of these issues and promote transparency and accountability in the tech industry.
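
      In practice, "capturing ephemeral experiences" boils down to having consenting panel members run software that records what a platform shows them and forwards it, with a pseudonymous identifier and timestamp, to a central archive for later analysis. The sketch below illustrates only that logging step in generic form; the field names, hashing scheme, and local archive are assumptions for illustration, not details of any real monitoring project.

```python
import hashlib
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class EphemeralCapture:
    """One observed, otherwise-unrecorded experience (e.g., a results page)."""
    panelist_id: str        # pseudonymous ID, never the person's identity
    platform: str           # e.g., "search", "video", "assistant"
    query: str
    content: str            # raw HTML or text exactly as shown to this user
    captured_at: float      # Unix timestamp of the observation
    content_sha256: str     # lets auditors verify the record was not altered

def make_capture(panelist_id: str, platform: str, query: str, content: str) -> EphemeralCapture:
    digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
    return EphemeralCapture(panelist_id, platform, query, content, time.time(), digest)

def archive(capture: EphemeralCapture, path: str = "captures.jsonl") -> None:
    """Append the capture to a local archive; a real system would transmit it securely."""
    with open(path, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(asdict(capture)) + "\n")

if __name__ == "__main__":
    archive(make_capture("panelist-0042", "search", "candidate x", "<html>...</html>"))
```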

    • Regulating Google with a light touch: Providing public access to Google's search index could lead to increased competition and innovation, aligning with Google's "don't be evil" philosophy.

      Light-touch regulation, such as making Google's search index public, could lead to increased competition and innovation in the search engine market. This idea, proposed in response to concerns about Google's dominance, has precedent in the regulation of AT&T in the 1950s. It could happen in the EU, where Google has faced significant fines for bias in search results, and it is also a possibility in the US, where an antitrust action against Google is ongoing. Such an intervention would not necessarily bankrupt Google, but it could give rise to thousands of search engines catering to niche audiences. Proponents argue that it would align with Google's "don't be evil" philosophy and could even benefit the company in the long run, though Google is likely to resist regulation for as long as possible. The speaker has advocated for this approach in Brussels and believes it could have a global impact.

    • Impact of Technology Platforms on Elections and Public Discourse: Bias at Google and control of social media algorithms can shift votes and spread misinformation, threatening democracy and humanity. Vigilance, research, and monitoring systems are necessary to combat these issues.

      The manipulation of information and algorithms on technology platforms, such as Google and social media, can significantly affect the outcome of elections and public discourse. The speaker pointed out that bias at Google could potentially shift votes, and that control of social media algorithms determines what goes viral, enabling the spread of misinformation. He emphasized the importance of vigilance against powers that can threaten democracy and humanity as a whole, and the need for research and monitoring systems to combat these issues. These platforms have the power to influence public opinion and elections, and addressing these concerns is crucial to preserving the integrity of our democratic processes.

    • Google's China negotiations and protecting children online: Google prioritized business interests over privacy by censoring search results in China, and the majority of the online content children see goes unnoticed by parents. Monitoring systems are needed to protect democracy and children from tech's negative impacts, but engagement with tech executives has been met with resistance.

      Tech companies like Google prioritize their own interests above all else, even if it means suppressing information or compromising individual privacy. This was evident during Google's negotiations with China, where the company agreed to censor search results to avoid having its technology copied. A related blind spot concerns children: the majority of the online content children are exposed to goes unnoticed by parents. Solving these problems is not optional; we must set up monitoring systems to protect our democracy and our children from the potential negative impacts of technology. Unfortunately, attempts to engage tech executives on these issues have been met with resistance, with some even going as far as sending private investigators to the homes of critics. It is essential that we remain vigilant and continue the conversation about the importance of privacy, transparency, and accountability in the tech industry.

    • Backlash from Opponents: The speaker faced backlash, including being blacklisted and defamed, but remains committed to exposing a hidden player in elections that can shift millions of votes.

      The speaker's research and testimony about potential manipulation of voter data has led to significant backlash, including being blacklisted by certain organizations and individuals. This includes an intern who was unable to join the project due to her family's disapproval, and a high-profile executive withdrawing from a conference. The speaker also faced defamatory statements from prominent figures, which they could have sued over but chose not to due to the distraction it would cause from their work. Despite these challenges, the speaker remains committed to their research, which has been well-received in the scientific community, and has ambitious plans for the future. The speaker believes that the importance of their work, which exposes a hidden player in elections that can shift millions of votes, outweighs the personal and professional costs they have faced.

    • The power and reach of tech giants raise concerns about privacy, data ownership, and freedom of expression: Tech giants collect personal data, have the ability to suppress information, and their impact on industries and individuals is a growing concern. They should be subject to constitutional freedoms and the protection of free speech.

      The power and reach of tech giants like Google and social media platforms raise valid concerns about privacy, data ownership, and freedom of expression. These companies collect vast amounts of personal data and have the ability to suppress information, making it essential to view them as more than just private corporations. The fear of Google and its impact on industries and individuals is a growing concern, as is the challenge of opting out of digital profiles that are built based on personal history. The shutting down of accounts, such as that of former President Trump, highlights the need for these platforms to be subject to constitutional freedoms and the protection of free speech.

    • Freedom of speech and its importance in American society: Big tech companies must uphold free speech principles, prevent manipulation and misinformation, and be accountable to the public for the consequences of their actions.

      The ability to express diverse ideas freely and engage in open debate is a critical aspect of being American. Suppressing or deleting speech, even if it's deemed "bad," goes against the principles of free speech and can lead to manipulation and misinformation. Big tech companies, who now control a significant amount of information, have a responsibility to uphold these principles and be accountable to the public. The manipulation of information can have serious consequences, especially for vulnerable groups and young people. It's essential to monitor and regulate these companies to ensure they are acting in the best interests of society and not just their shareholders. For more information and ways to get involved, visit tamebigtech.com.

    • Investigating the impact of big tech on society: Journalist Matt Crawford is uncovering the potential dangers of big tech for democracy, mental health, and young people, feeling a great responsibility to inform the public despite the heavy burden and lack of attention.

      Matt Crawford, an investigative journalist, is currently the only one delving into the potential dangers of big tech for democracy, mental health, and young people. He feels a great responsibility to uncover this information, but it is a heavy burden. He wonders about the disappearance of a key figure in his research and is concerned about the lack of attention given to the issue despite its potential impact on society. He likens his role to being the one who cleans up the "poop" while others gain credentials and responsibilities. It is a tough job, but a necessary one, and he hopes that by sharing this information, more people will be inspired to investigate and shed light on this important issue.

    Recent Episodes from The Joe Rogan Experience

    #2170 - Max Lugavere

    Max Lugavere is a filmmaker, health and science journalist, author, and host of The Genius Life podcast. His debut film Little Empty Boxes is out now. http://littleemptyboxes.com www.maxlugavere.com

    #2169 - Protect Our Parks 12

    Shane Gillis, Mark Normand, and Ari Shaffir are stand-up comics, writers, and podcasters. Shane is the co-host of "Matt and Shane's Secret Podcast" with Matt McCusker and one half of the sketch comedy duo "Gilly and Keeves" with John McKeever. Watch his new comedy series, "Tires," and special, "Beautiful Dogs" on Netflix. www.shanemgillis.com Mark is the co-host of the podcasts "Tuesdays with Stories" with Joe List and "We Might Be Drunk" with Sam Morril. Watch his latest stand-up special, "Soup to Nuts," on Netflix. www.marknormandcomedy.com Ari is the host of the "You Be Trippin'" podcast. His latest comedy special, "Ari Shaffir: Jew," is available now via YouTube. www.arishaffir.com

    #2168 - Tyler Fischer

    Tyler Fischer is a stand-up comic, actor, and filmmaker. His latest special, "The Election Special | LIVE at Comedy Mothership," is available now via YouTube. https://youtu.be/FmvJjMGX7hw?si=PyOsFVH4as8HMHBD www.tylerfischer.com

    #2167 - Noland Arbaugh

    Noland Arbaugh is the first human recipient of Neuralink’s brain-computer interface implant: an innovative new technology that allows him to control digital devices with his thoughts. Noland Arbaugh: https://x.com/ModdedQuad Neuralink www.neuralink.com

    #2166 - Enhanced Games

    Christian Angermayer and Dr. Aron D’Souza are the co-founders of the Enhanced Games, an upcoming Olympic-style event that brings together the world’s top athletes to compete without arbitrary bans on performance-enhancing substances. www.enhanced.org

    #2165 - Jack Carr

    Jack Carr is a bestselling author, retired Navy SEAL, and host of the “Danger Close” podcast. His newest book, "Red Sky Mourning," is available now. www.officialjackcarr.com

    #2164 - Action Bronson

    Action Bronson is a musician, chef, painter, and author. Look out for his forthcoming album "Johann Sebastian Bachlava the Doctor" and watch his series "F*ck, That's Delicious" on YouTube. www.actionbronson.com

    #2163 - Freeway Rick Ross

    Freeway Rick Ross is a former eighties drug kingpin who is now an author, motivational speaker, and community advocate. www.freewayrickyross.com

    #2162 - Tim Dillon

    Tim Dillon is a stand-up comic, actor, and host of "The Tim Dillon Show" podcast. His latest comedy special, "Tim Dillon: A Real Hero," is available on Netflix. Look for his book "Death by Boomers: How the Worst Generation Destroyed the Planet, but First a Child" in 2024. www.timdilloncomedy.com

    Related Episodes

    Data Revolution?! Can TARTLE REALLY Fix Tech's Biggest Problem?


    Is Big Tech hoarding your data like Scrooge with his gold? Are algorithms playing puppet master with your life? Buckle up, because the TARTLE Data Commitment is CRASHING the scene with a radical plan to put YOU back in control of YOUR data!

    Consent, Ethics, Equality, Inclusion - these ain't just buzzwords, folks. TARTLE's got a four-pronged attack on tech's data dragon:

    • Take Back Your Power: No more shady data deals! TARTLE gives you crystal-clear control over how your information is used.
    • ⚖️ Justice for All: Ditch the biased algorithms! TARTLE champions data equality, ensuring everyone's voices are heard, not just the tech giants' chosen few.
    • Everyone In: No more data deserts! TARTLE bridges the digital divide, making sure everyone has access to the benefits of the data age.
    • Ethics Before Profits: Forget creepy surveillance! TARTLE prioritizes data privacy and ethical use, so your info stays safe and sound.

    But can TARTLE REALLY walk the walk? Is this just another Silicon Valley pipe dream? Dive into this video and decide for yourself! We'll dissect TARTLE's promises, expose the dark side of Big Tech's data game, and chart a course for a fairer, more ethical digital future.

    Join the data revolution! Share this video, smash that subscribe button, and let's make TARTLE's vision a reality!

    TCAST is a tech and data podcast, hosted by Alexander McCaig and Jason Rigby. Together, they discuss the most exciting trends in Big Data, Artificial Intelligence, and Humanity. It’s a fearless examination of the latest developments in digital transformation and innovation. The pair also interview data scientists, thought leaders, and industry experts. Pioneers in the skills and technologies we need for human progress. Explore our extensive TCAST selection at your pace, on your channel of choice.

    What's your data worth? Find out at ( https://tartle.co/ ) Share our Facebook Page | https://go.tartle.co/fb Watch our Instagram | https://go.tartle.co/ig Hear us Tweet | https://go.tartle.co/tweet

    The importance of values in regulating emerging technology to protect human rights with Ed Santow


    In today’s episode no. 25, Edward Santow, Australia’s Human Rights Commissioner, speaks to Reimagining Justice about one of the many projects he is responsible for, namely the Commission’s Human Rights and Technology project.

    Whether you know a little or a lot about human rights or artificial intelligence, you will gain something from listening to our conversation about the most extensive consultation into AI and Human Rights anywhere in the world. Ed explains exactly what human rights are and why they should be protected, how technology is both enhancing and detracting from human rights and the best approach to take in regulating emerging technology in the future.

    We talked about protecting the rights of the most marginalized people, automated decision making and how to combat bias and something I found particularly fascinating, the tension between the universality of human rights, ubiquitous technology and how differing cultural contexts and historical experiences are shaping the principles that will guide both the development and application of technology.

    Ed Santow has been Human Rights Commissioner at the Australian Human Rights Commission since August 2016 and leads the Commission’s work on technology and human rights; refugees and migration; human rights issues affecting LGBTI people; counter-terrorism and national security; freedom of expression; freedom of religion; and implementing the Optional Protocol to the Convention Against Torture (OPCAT).

    Andrea Perry-Petersen – LinkedIn - Twitter @winkiepp – andreaperrypetersen.com.au

    Twitter - @ReimaginingJ

    Facebook – Reimagining Justice group