
    Section 230: Everything You Need to Know -- Tweets, Free Speech, Beyond

    January 09, 2021

    Podcast Summary

    • Section 230: Protecting Websites from Liability for User Content
      Section 230 of the Communications Decency Act shields websites from liability for their users' illegal content and moderation decisions, ensuring the Internet's functionality and fostering diverse online communities.

      Section 230 of the Communications Decency Act is a crucial law that allows interactive websites to avoid liability for their users' content and for their content moderation decisions. This law, which was passed in 1996 and has survived numerous legal challenges, places the liability for illegal content on the person who posted it rather than the platform. Additionally, it protects websites from lawsuits if they choose to moderate or remove content. This was a response to a series of lawsuits in the 1990s where websites were held liable for defamatory statements made on their platforms because they had attempted to moderate the content. Section 230 is essential for the functioning of the modern Internet and allows for the existence of various online communities and platforms.

    • Section 230: Protecting Interactive Computer Services Hosting Third-Party Content
      Section 230 of the Communications Decency Act protects interactive computer services, including social media platforms, from liability for third-party content, allowing them to moderate and present content as they see fit, promoting free speech online.

      Section 230 of the Communications Decency Act has been a crucial factor in the development and thriving of the modern internet, allowing websites to moderate content in good faith and present it in ways that align with their goals. The law does not distinguish between "publishers" and "platforms"; it protects interactive computer services that host third-party content, whether user-generated or otherwise. The recent addition of a fact-check feature on Twitter, which links to third-party news sites, is an example of how the law functions in practice: despite ongoing debate about platforms versus publishers, the statute makes no such distinction. Ultimately, Section 230 is a significant gift to free speech for everyone, enabling a wide range of online content and activity.

    • Twitter Adding Context and Limiting Interactions on Political Content
      Twitter can add fact checks and limit interactions on political content under Section 230, balancing free speech and terms of service.

      Social media platforms like Twitter are navigating a complex line between enforcing their terms of service and upholding free speech principles, particularly when it comes to content from politicians. During the George Floyd protests, Twitter added a fact-check label and limited interactions on a tweet from the president that violated its terms of service around violent speech. This action was permitted under Section 230: it didn't remove the content or take down the speech, but rather added context and limited interactions. The First Amendment protects individuals from government infringement on speech; it does not bind private companies like Twitter. The distinction between platforms and publishers continues to be debated, but the ability of companies to add context and limit interactions on potentially harmful content is a crucial part of the ongoing conversation.

    • Limited Applicability of Public Utility and Public Square Analogies to Social Media Platforms
      The public utility and public square analogies have limited applicability to social media platforms: these services are not one-to-one replaceable commodities, and social media companies cannot be forced to act as public forums without infringing on free speech rights.

      While there are some interesting parallels between the current debate around content moderation on social media platforms and the historical context of public utilities and the concept of a public square, the applicability of these analogies is limited. The Internet and edge providers like Google, YouTube, Twitter, and Facebook do not meet the requirements of traditional public utilities because they are not commodified services that are one-to-one replaceable. Instead, they offer unique features and experiences that cannot be easily switched out. Furthermore, the argument that social media platforms are public forums and cannot engage in content moderation has consistently been rejected in courts. While these debates are important to understand, it's crucial to recognize their limitations and consider the unique characteristics of digital platforms.

    • Courts Have Ruled That Social Media Platforms Aren't Public Squares or State Actors
      Courts have consistently found that social media companies don't have the same responsibilities as traditional public spaces or government entities regarding free speech.

      The concepts of the public square and state action apply only to a very limited set of circumstances in which private companies take over functions traditionally performed by the government. Cases like Pruneyard, Packingham, and Manhattan Community Access Corp. v. Halleck (the Manhattan Neighborhood Network case) have set precedents that social media platforms like Twitter, Facebook, and YouTube do not qualify as public squares or state actors. The recent executive order on social media regulation may generate hype, but its constitutionality is doubtful, as courts have consistently rejected attempts to regulate social media in this way.

    • FCC Has No Authority to Regulate Websites
      The recent executive order asking the FCC to regulate websites goes against the law and decades of case law, and is unlikely to bring websites under FCC jurisdiction.

      The recent executive order asking the Federal Communications Commission (FCC) to reinterpret Section 230 of the Communications Decency Act (CDA) is concerning because it goes against both the statute and decades of case law, which establish that the FCC has no authority to regulate websites. The order can request a rulemaking, but the FCC is an independent agency and the president cannot instruct it to act. A rulemaking and enforcement push could create a nuisance, but the FCC cannot regulate websites because they are not within its jurisdiction. The executive order also misreads CDA 230 by importing the "good faith" limits on the moderation provision into the separate provision that shields sites from responsibility for third-party content. This misinterpretation could lead to a lengthy and distracting rulemaking process, but it is unlikely to change the legal status of websites.

    • A Confusing Interpretation of Section 230 of the CDA
      The recent executive order proposes a confusing interpretation of Section 230 of the CDA, potentially leading to misuse and legal challenges, while the EARN IT Act links the encryption debate to the same fight over 230 protections.

      The recent executive order regarding Section 230 of the Communications Decency Act (CDA) proposes a confusing and potentially bad-faith interpretation of the law. The order asks the FCC to examine whether the provision protecting websites from liability for their users' libelous or otherwise harmful content should be tied to the separate "good faith" moderation provision, conflating two distinct parts of the statute. Additionally, the order instructs the attorney general to draft legislation reinterpreting CDA 230 in a way that diminishes its power, even though the law already does not limit federal criminal liability for sites hosting illegal content. This could lead to confusion and potential misuse of the law. The encryption debate, which pits end-to-end encryption against conditions on 230 protections, is also being linked to this issue through the EARN IT Act, which could create significant legal and practical challenges.

    • Impact of the Executive Order on Social Media Companies' Advertising
      The executive order's impact is limited by the FCC's jurisdiction and the First Amendment; its threat to cut government advertising spending is small for the platforms, but the knock-on effect on agencies that rely on that advertising, such as the Census Bureau, could be significant.

      Social media companies like Twitter, Facebook, and YouTube may face increased scrutiny and potential restrictions from the government, but the executive order's impact is limited due to the FCC's jurisdiction and the First Amendment. The order's threat to limit government advertising spending on these platforms is minimal, as federal procurement records show that it's a small portion of their revenues. However, the potential impact on agencies, such as the Census Bureau, that use social media advertising to reach a wider audience could be significant, potentially limiting their ability to collect data required by the Constitution. Additionally, Congress could introduce legislation to address this issue, but it would likely face First Amendment challenges and may not be successful. Overall, the executive order represents a complex issue with significant implications for free speech, government spending, and data collection.

    • The Complexity of Content Moderation
      The Masnick Impossibility Theorem holds that content moderation at scale is subjective and mistakes will be made, but it's crucial to continue the conversation and find a balance between free speech and safety.

      Content moderation on the internet, as enabled by Section 230 of the Communications Decency Act, is a complex and subjective issue with no easy solutions. Mike Masnick's "impossibility theorem" holds that content moderation at scale is impossible to do well: any decision will upset someone, and the vast amount of content generated daily ensures that mistakes will be made. Despite these challenges, it's important to continue the conversation around content moderation and its role in protecting users and maintaining healthy online communities. Solutions may include a combination of human moderation, AI tools, and clearer content policies. Ultimately, finding a balance between free speech and safety will require ongoing dialogue and collaboration between tech companies, policymakers, and users.

    • Section 230 Enables Online Experimentation and Innovation
      Section 230's protection for websites allows for user-generated content and community moderation, fostering innovation and competition, but potential regulations could unintentionally limit smaller online services.

      Section 230 of the Communications Decency Act plays a crucial role in enabling diverse experimentation and innovation across various online platforms, including social media sites, blogs, and search engines. This protection extends to all websites, regardless of their size or popularity, and allows for user-generated content and community moderation. However, proposed regulations, such as the recent executive order, could inadvertently limit the growth of new and smaller online services by making compliance too costly and complex. It's essential to remember that regulations, like GDPR, have unintended consequences, often benefiting larger companies at the expense of smaller players. Therefore, striking a balance between necessary regulations and maintaining the openness and accessibility of the internet is crucial for fostering innovation and competition.

    • Push to Change Internet Moderation Rules
      The ongoing debate and experimentation with various moderation approaches in crypto and decentralized systems is crucial to finding the best solutions for different communities and purposes, recognizing that there isn't a one-size-fits-all approach.

      There's a growing push to change the rules of internet moderation, with the recent executive order being just one example. While the order itself may not bring about significant changes, the broader trend of legislation targeting content moderation is a cause for concern. This could potentially limit the types of moderation approaches available and impact freedom of speech online. It's crucial to allow for experimentation with various governance models in the crypto and decentralized systems space, as different approaches will work best for different communities and purposes. The ongoing debate and experimentation are essential to finding the best solutions for various services and recognizing that there isn't a one-size-fits-all approach.

    • Effective Communication: Active Listening, Empathy, and Clarity
      Effective communication involves active listening, empathy, clarity, and nonverbal cues to build stronger relationships, improve collaboration, and enhance productivity.

      Effective communication is key to building strong relationships, both in personal and professional settings. During our discussion, we explored various aspects of communication, including active listening, empathy, and clarity. Active listening involves fully concentrating on the speaker, understanding their perspective, and responding appropriately. Empathy, on the other hand, involves putting oneself in someone else's shoes and understanding their emotions. Clarity is essential to ensure that messages are conveyed effectively and that there is no room for misinterpretation. Furthermore, we discussed the importance of body language and tone in communication, as they can often convey more than words alone. In conclusion, effective communication requires a combination of active listening, empathy, clarity, and nonverbal cues. By mastering these skills, we can build stronger relationships, improve collaboration, and enhance overall productivity.

    Recent Episodes from a16z Podcast

    Cybersecurity's Past, Present, and AI-Driven Future

    Is it time to hand over cybersecurity to machines amidst the exponential rise in cyber threats and breaches?

    We trace the evolution of cybersecurity from minimal measures in 1995 to today's overwhelmed DevSecOps. Travis McPeak, CEO and Co-founder of Resourcely, opens the discussion with the industry's historical shifts. Kevin Tian, CEO and Founder of Doppel, highlights the rise of AI-driven threats and deepfake campaigns. Feross Aboukhadijeh, CEO and Founder of Socket, provides insights into sophisticated attacks like the XZ Utils incident. Andrej Safundzic, CEO and Founder of Lumos, discusses the future of autonomous security systems and their impact on startups.

    Recorded at a16z's Campfire Sessions, these top security experts share the real challenges they face and emphasize the need for a new approach. 

    Resources: 

    Find Travis McPeak on Twitter: https://x.com/travismcpeak

    Find Kevin Tian on Twitter: https://twitter.com/kevintian00

    Find Feross Aboukhadijeh on Twitter: https://x.com/feross

    Find Andrej Safundzic on Twitter: https://x.com/andrejsafundzic

     

    Stay Updated: 

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://twitter.com/stephsmithio

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

     

    The Science and Supply of GLP-1s

    Brooke Boyarsky Pratt, founder and CEO of knownwell, joins Vineeta Agarwala, general partner at a16z Bio + Health.

    Together, they talk about the value of obesity medicine practitioners, patient-centric medical homes, and how Brooke believes the metabolic health space will evolve over time.

    This is the second episode in Raising Health’s series on the science and supply of GLP-1s. Listen to last week's episode to hear from Carolyn Jasik, Chief Medical Officer at Omada Health, on GLP-1s from a clinical perspective.

     

    Listen to more from Raising Health’s series on GLP-1s:

    The science of satiety: https://raisinghealth.simplecast.com/episodes/the-science-and-supply-of-glp-1s-with-carolyn-jasik

    Payers, providers and pricing: https://raisinghealth.simplecast.com/episodes/the-science-and-supply-of-glp-1s-with-chronis-manolis

     

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://twitter.com/stephsmithio

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

    The State of AI with Marc & Ben

    In this latest episode on the State of AI, Ben and Marc discuss how small AI startups can compete with Big Tech’s massive compute and data scale advantages, reveal why data is overrated as a sellable asset, and unpack all the ways the AI boom compares to the internet boom.

     

    Subscribe to the Ben & Marc podcast: https://link.chtbl.com/benandmarc

     

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://twitter.com/stephsmithio

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

    Predicting Revenue in Usage-based Pricing

    Over the past decade, usage-based pricing has soared in popularity. Why? Because it aligns cost with value, letting customers pay only for what they use. But that flexibility is not without issues - especially when it comes to predicting revenue. Fortunately, with the right process and infrastructure, usage-based revenue can become more predictable than revenue under the traditional seat-based SaaS model.

    In this episode from the a16z Growth team, Fivetran’s VP of Strategy and Operations Travis Ferber and Alchemy’s Head of Sales Dan Burrill join a16z Growth’s Revenue Operations Partner Mark Regan. Together, they discuss the art of generating reliable usage-based revenue. They share tips for avoiding common pitfalls when implementing this pricing model - including how to nail sales forecasting, adopt the best tools to track usage, and deal with the initial lack of customer data.

    Resources: 

    Learn more about pricing, packaging, and monetization strategies: a16z.com/pricing-packaging

    Find Dan on Twitter: https://twitter.com/BurrillDaniel

    Find Travis on LinkedIn: https://www.linkedin.com/in/travisferber

    Find Mark on LinkedIn: https://www.linkedin.com/in/mregan178

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://twitter.com/stephsmithio

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

    California's Senate Bill 1047: What You Need to Know

    On May 21, the California Senate passed bill 1047.

    This bill – which sets out to regulate AI at the model level – wasn’t garnering much attention until it slid through an overwhelming bipartisan vote of 32 to 1 and is now queued for an assembly vote in August that would cement it into law. In this episode, a16z General Partner Anjney Midha and Venture Editor Derrick Harris break down everything the tech community needs to know about SB-1047.

    This bill really is the tip of the iceberg, with over 600 new pieces of AI legislation swirling in the United States. So if you care about one of the most important technologies of our generation and America’s ability to continue leading the charge here, we encourage you to read the bill and spread the word.

    Read the bill: https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202320240SB1047

    a16z Podcast
    June 06, 2024

    The GenAI 100: The Apps that Stick

    Consumer AI is moving fast, so who's leading the charge? 

    a16z Consumer Partners Olivia Moore and Bryan Kim discuss our GenAI 100 list and what it takes for an AI model to stand out and dominate the market.

    They discuss how these cutting-edge apps are connecting with their users and debate whether traditional strategies like paid acquisition and network effects are still effective. We're going beyond rankings to explore pivotal benchmarks like D7 retention and introduce metrics that define today's AI market.

    Note: This episode was recorded prior to OpenAI's Spring update. Catch our latest insights in the previous episode to stay ahead!

     

    Resources:

    Link to the Gen AI 100: https://a16z.com/100-gen-ai-apps

    Find Bryan on Twitter: https://twitter.com/kirbyman

    Find Olivia on Twitter: https://x.com/omooretweets

     

    Stay Updated: 

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://twitter.com/stephsmithio

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

    Finding a Single Source of AI Truth With Marty Chavez From Sixth Street

    a16z General Partner David Haber talks with Marty Chavez, vice chairman and partner at Sixth Street Partners, about the foundational role he’s had in merging technology and finance throughout his career, and the magical promises and regulatory pitfalls of AI.

    This episode is taken from “In the Vault”, a new audio podcast series by the a16z Fintech team. Each episode features the most influential figures in financial services to explore key trends impacting the industry and the pressing innovations that will shape our future. 

     

    Resources: 
    Listen to more of In the Vault: https://a16z.com/podcasts/a16z-live

    Find Marty on X: https://twitter.com/rmartinchavez

    Find David on X: https://twitter.com/dhaber

     

    Stay Updated: 

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://twitter.com/stephsmithio

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

    A Big Week in AI: GPT-4o & Gemini Find Their Voice

    This was a big week in the world of AI, with both OpenAI and Google dropping significant updates. So big that we decided to break things down in a new format with our Consumer partners Bryan Kim and Justine Moore. We discuss the multi-modal companions that have found their voice, but also why not all audio is the same, and why several nuances like speed and personality really matter.

     

    Resources:

    OpenAI’s Spring announcement: https://openai.com/index/hello-gpt-4o/

    Google I/O announcements: https://blog.google/technology/ai/google-io-2024-100-announcements/

     

    Stay Updated: 

    Let us know what you think: https://ratethispodcast.com/a16z

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://twitter.com/stephsmithio

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

     

     

    Remaking the UI for AI

    Make sure to check out our new AI + a16z feed: https://link.chtbl.com/aiplusa16z
     

    a16z General Partner Anjney Midha joins the podcast to discuss what's happening with hardware for artificial intelligence. Nvidia might have cornered the market on training workloads for now, but he believes there's a big opportunity at the inference layer — especially for wearable or similar devices that can become a natural part of our everyday interactions. 

    Here's one small passage that speaks to his larger thesis on where we're heading:

    "I think why we're seeing so many developers flock to Ollama is because there is a lot of demand from consumers to interact with language models in private ways. And that means that they're going to have to figure out how to get the models to run locally without ever leaving without ever the user's context, and data leaving the user's device. And that's going to result, I think, in a renaissance of new kinds of chips that are capable of handling massive workloads of inference on device.

    "We are yet to see those unlocked, but the good news is that open source models are phenomenal at unlocking efficiency.  The open source language model ecosystem is just so ravenous."

    More from Anjney:

    The Quest for AGI: Q*, Self-Play, and Synthetic Data

    Making the Most of Open Source AI

    Safety in Numbers: Keeping AI Open

    Investing in Luma AI

    Follow everyone on X:

    Anjney Midha

    Derrick Harris

    Check out everything a16z is doing with artificial intelligence here, including articles, projects, and more podcasts.

     

    Stay Updated: 

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://twitter.com/stephsmithio

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

    a16z Podcast
    May 16, 2024

    How Discord Became a Developer Platform

    In 2009, Discord cofounder and CEO Jason Citron started building tools and infrastructure for games. Fast forward to today, and the platform has over 200 million monthly active users.

    In this episode, Jason, alongside a16z General Partner Anjney Midha—who merged his company Ubiquiti 6 with Discord in 2021—shares insights on the nuances of community-driven product development, the shift from gamer to developer, and Discord’s longstanding commitment to platform extensibility. 

    Now, with Discord's recent release of embeddable apps, what can we expect now that it's easier than ever for developers to build? 

    Resources: 

    Find Jason on Twitter: https://twitter.com/jasoncitron

    Find Anjney on Twitter: https://twitter.com/AnjneyMidha

     

    Stay Updated: 

    Find a16z on Twitter: https://twitter.com/a16z

    Find a16z on LinkedIn: https://www.linkedin.com/company/a16z

    Subscribe on your favorite podcast app: https://a16z.simplecast.com/

    Follow our host: https://twitter.com/stephsmithio

    Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

     

    Related Episodes

    Section 230 CDA: Content Moderation, Free Speech, the Internet

    In this special "2x" episode (#32) of our news show 16 Minutes -- where we quickly cover the headlines and tech trends, offering analysis, frameworks, explainers, and more -- we cover the tricky but important topic of Section 230 of the Communications Decency Act. The 1996 law has been in the headlines a lot recently, in the context of Twitter, the president's tweets, and an executive order put out by the White House just this week on "preventing online censorship". All of this is playing out against the broader, more profound cultural context and events around the death of George Floyd in Minnesota and beyond, and ongoing old-new debates around content moderation on social media.

    To make sense of only the technology and policy aspects of Section 230 specifically -- and where the First Amendment, content moderation, and more come in -- a16z host Sonal Chokshi brings on our first-ever outside guest for 16 Minutes, Mike Masnick, founder of the digital-native policy think tank Copia Institute and editor of the longtime news & analysis site Techdirt.com (which also features an online symposium for experts discussing difficult policy topics). Masnick has written extensively about these topics -- not just recently but for years -- along with others in media recently attempting to explain what's going on and dissect what the executive order purports to do (some are even tracking different versions as well).

    So what's hype/ what's real -- given this show's throughline! -- around what CDA 230 precisely does and doesn't do, the role of agencies like the FCC, and more? What are the nuances and exceptions, and how do we tease apart the most common (yet incorrect) rhetorical arguments such as "platform vs. publisher", "like a utility/ phone company", "public forum/square" and so on? Finally: how does and doesn't Section 230 connect to the First Amendment when it comes to companies vs. governments; what does "good faith" really mean and what are possible paths and ways forward among the divisive debates around content moderation? All this and more in this 2x+ long explainer episode of 16 Minutes.

    All about Section 230: What It Does and Doesn't Say

    We cover the tricky but important topic of Section 230 of the Communications Decency Act. The 1996 law has been in the headlines a lot recently, in the context of Twitter, the president’s tweets, and an executive order put out by the White House on “preventing online censorship”. All of this is playing out against the broader, more profound cultural context and events around the death of George Floyd in Minnesota and beyond, and ongoing old-new debates around content moderation on social media. [Please note this episode was first published May 31.]

    To make sense of only the technology and policy aspects of Section 230 specifically — and where the First Amendment, content moderation, and more come in — a16z host Sonal Chokshi brings on our first-ever outside guest for 16 Minutes, Mike Masnick, founder of the digital-native policy think tank Copia Institute and editor of the longtime news & analysis site Techdirt.com (which also features an online symposium for experts discussing difficult policy topics). Masnick has written extensively about these topics — not just recently but for years — along with others in media recently attempting to explain what’s going on and dissect what the executive order purports to do (some are even tracking different versions as well).

    So what’s hype/ what’s real — given this show’s throughline! — around what CDA 230 precisely does and doesn’t do, the role of agencies like the FCC, and more? What are the nuances and exceptions, and how do we tease apart the most common (yet incorrect) rhetorical arguments such as “platform vs. publisher”, “like a utility/ phone company”, “public forum/square” and so on? Finally: how does and doesn’t Section 230 connect to the First Amendment when it comes to companies vs. governments; what does “good faith” really mean and what are possible paths and ways forward among the divisive debates around content moderation? All this and more in this extra-long explainer episode of 16 Minutes, shared here for longtime listeners of the a16z Podcast.

     

    image: presidential tweet activity/ Wikimedia Commons

    Capitalism, Free Speech, and Social Media Censorship with Professor Anders Walker of SLU Law

    Sean has an insightful discussion with SLU Law Professor Anders Walker. Topics discussed include: What is Free Speech? What speech is protected? Can social media companies really ban politicians (such as President Trump) and other forms of political speech? Will the Supreme Court intervene? And more!

    Professor Anders Walker is the Lillie Myers Professor of Law at Saint Louis University School of Law (SLU Law). Professor Walker’s research and teaching focus on intersections between constitutional law, criminal law, and legal history.

    Professor Walker earned his PhD in African American Studies and History from Yale and his law degree from Duke. He has been voted SLU Law’s Teacher of the Year five different times, including for three consecutive years 2017-2019. He was Sean's professor for Constitutional Law II, and is currently his professor for a course on American Legal History.

    Follow Sean on Instagram and TikTok: @seansandifer

    Like Us on Facebook: facebook.com/theseansandifershow

    Join Our Email List: theseansandifershow.com

    10+ Years of #SocialMediaDay: The Good, The Bad & The Law

    On this #SocialMediaDay2021 episode, Chris McCarty of Knoxville-based law firm Lewis Thomason delves into the larger issues at hand for companies, organizations, and individuals alike in managing the good, the bad, and the law when it comes to leveraging the power of social media in productive and legally compliant ways.

    Kelly and Mary Beth set the stage with a flashback to a November 2020 episode of #MsInterPReted (Social Media & Surveillance Capitalism: Dr. Candace White), and Kelly also shares a recent run-in with a Facebook hacker -- indicative of the largely unchecked power that the biggest social media companies wield, which can derail any company's ability to do business at the drop of a hat.

    Chris discusses legal implications of these and other challenges, as well as broader issues of free speech and regulatory facts that every PR person and professional communicator should know, in today's dynamic social media environment.

    About Chris McCarty of Lewis Thomason, P.C.:

    Chris W. McCarty, a shareholder in Lewis Thomason's Knoxville office, practices in the areas of employment law, education law and civil litigation. Chris handles matters before state and federal courts throughout Tennessee, and has argued before the Tennessee Court of Appeals. Chris also presents on employment and education law topics. His articles on those topics have been seen in numerous publications, including HR Magazine, the Tennessee Bar Journal and the Knoxville Business Journal. Chris is approved as a member of the American Arbitration Association’s (AAA) Panel of Employment Arbitrators.

     AFFILIATIONS

    • Federation of Defense & Corporate Counsel
    • DRI – Employment & Labor Law Committee
    • Society for Human Resource Management
    • American Bar Association
    • Knoxville Bar Association
    • Tennessee Bar Association
    • National School Boards Association Council of School Attorneys


    PROFESSIONAL HONORS AND ACTIVITIES

    • Tennessee Council of School Board Attorneys, President, 2018-2020
    • Best Lawyers® 2020 Employment Law – Management “Lawyer of the Year” in Knoxville
    • Named to The Best Lawyers in America©, Employment Law Management and Education Law
    • Named a Cityview Magazine Top Attorney
    • Federation of Defense & Corporate Counsel, Elected Member, 2018
    • Knoxville Bar Foundation, Fellows Program, 2016
    • Member, American Arbitration Association (AAA) Panel of Employment Arbitrators
    • Mid-South Super Lawyers® Rising Star, 2015, 2016
    • Leadership Sevier, 2015
    • Knoxville Zoo, Circle of Friends Leadership Council, 2015
    • Nucleus Knoxville, President, 2014
    • “40 Under 40”, Knoxville Business Journal, 2013
    • Tennessee Bar Association Leadership Law, 2012
    • Leadership Tomorrow Sevier, 2010
    • Tennessee Bar Association Young Lawyers Division, President’s Award, 2009
    • Introduction Knoxville, 2007
    • Knoxville Bar Association, Publications Committee

    FOLLOW Chris McCarty and Lewis Thomason:
    Twitter: @LewisThomasonTN
    LinkedIn: Lewis Thomason / LinkedIn and Chris McCarty / LinkedIn
    Website: https://www.lewisthomason.com/

    FOLLOW FLETCHER MARKETING PR:

    All about Section 230, In the News

    All about section 230 of the Communications Decency Act -- what does and doesn't it say? How does this law play out against broader questions and debates around platforms, content moderation, and free speech? 

    This conversation between Mike Masnick (founder and editor in chief of Techdirt) and a16z editor in chief Sonal Chokshi was originally published May 2020, in the context of previous protests and presidential tweets (and an executive order then to prevent “online censorship”) -- but is exactly as relevant today... perhaps now more than ever.

    https://a16z.com/2020/05/31/16mins-section-230-communications-decency-act-content-moderation-free-speech-internet-past-present-future/

    image: presidential tweet activity/ Wikimedia Commons