Podcast Summary
How algorithms personalize ads and recommendations: Algorithms use our data to make ads and recommendations feel personalized, but this comes at the cost of privacy.
Algorithms used by apps and online services can make advertising and recommendations feel eerily personalized, sometimes even telepathic. That personalization, however, is built on the data these companies collect about us. An episode of The Indicator from Planet Money explored how difficult it is for government regulators to force companies to get rid of this data. Our entertainment choices feed the same machinery: after a viewer watched the classic film "Point Break" on HBO Max, the service suggested other crime thrillers from the same era. Algorithms use our past behavior to shape our future choices, making our experiences feel tailor-made, but every such recommendation rests on data collection and analysis, with real implications for our privacy.
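The "watched Point Break, got similar crime thrillers" behavior described above can be illustrated with a toy item-similarity sketch. This is not HBO Max's actual system; the titles, users, and viewing data below are invented, and real recommenders are far more sophisticated, but the core idea (score other titles by how much their audiences overlap with what you watched) is the same:

```python
# Toy item-based recommender: rank other titles by viewer overlap
# (Jaccard similarity) with a title the user just watched.
# All titles and viewing histories here are invented for illustration.

watch_history = {
    "alice": {"Point Break", "Heat", "Speed"},
    "bob":   {"Point Break", "Heat"},
    "carol": {"Speed", "Twister"},
    "dave":  {"Point Break", "Speed"},
}

def viewers_of(title):
    # Set of users who have watched the given title.
    return {user for user, seen in watch_history.items() if title in seen}

def jaccard(a, b):
    # Overlap between two viewer sets: |intersection| / |union|.
    return len(a & b) / len(a | b) if (a | b) else 0.0

def recommend(title, k=2):
    # Rank every other title by audience overlap with `title`.
    base = viewers_of(title)
    others = {t for seen in watch_history.values() for t in seen} - {title}
    ranked = sorted(others, key=lambda t: jaccard(base, viewers_of(t)), reverse=True)
    return ranked[:k]

print(recommend("Point Break"))  # → ['Heat', 'Speed']
```

Note that the recommendation is driven entirely by stored behavioral data: delete the watch histories and the "telepathy" disappears, which is exactly why regulators care about what data companies retain.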
Algorithms shaping our digital experiences, especially in health apps like Kurbo: Algorithms are used in various digital services to provide personalized recommendations, but their use with sensitive data like health information requires careful consideration and prioritization of privacy and security.
Algorithms, computer programs that make decisions based on user inputs, have become an integral part of our daily online experiences. From shopping recommendations to personalized fitness apps, algorithms shape our interactions with digital services. In the case of Kurbo, a weight loss app for children and families, algorithms analyzed user data to provide personalized health, fitness, and weight loss recommendations. The use of such algorithms, particularly when dealing with sensitive data like health information, has attracted scrutiny from government regulators: the Federal Trade Commission (FTC) took action against Kurbo's parent company, WW International, alleging that the app collected children's personal information without parental consent. As our reliance on digital services and the data they collect grows, it is crucial to consider the potential implications and to ensure that privacy and security are prioritized.
FTC fines Kurbo $1.5 million for collecting data from children without parental consent: Companies risk severe consequences, including fines, algorithmic destruction, and reputation damage, for collecting data from children under 13 without parental consent. FTC is closely monitoring and enforcing COPPA with algorithmic destruction as a deterrent.
Companies that collect and use data from children under the age of 13 without parental consent risk severe consequences, including fines, algorithmic destruction, and damage to their reputation. The Children's Online Privacy Protection Act (COPPA) requires such companies to obtain parental consent before collecting and using personal information from children. The FTC recently settled with Kurbo, a health and wellness app company, for allegedly marketing to and collecting data from children under 13 without parental consent and encouraging them to lie about their age during sign-up. The settlement includes a $1.5 million penalty and a requirement for Kurbo to delete any illegally collected data and destroy any algorithms built with that data. This is the third time in under three years that the FTC has required a company to destroy its algorithms because they were built on ill-gotten data; previous cases involved Cambridge Analytica and a photo-sharing app. The FTC is closely monitoring tech companies and using algorithmic destruction as a deterrent to enforce COPPA and protect children's privacy.
Deleting Data: A Complex Process: Companies must be proactive about data practices, map out data usage, and carefully consider what data they collect and retain to avoid the destruction of valuable algorithms and potential upheaval in tech operations, while also complying with data deletion regulations.
Data deletion, as mandated by regulatory bodies like the FTC, is not as simple as it sounds. A company like Kurbo, facing an FTC settlement for illegally collecting data, holds vast amounts of data, and separating illegitimately sourced data from legitimately sourced data is a complex process. Data can be copied, sliced, and reused across many parts of a company, making its lineage hard to trace after the fact. Companies need to be proactive about good data practices, map out how data is being used, and be thoughtful about what data they collect and retain. Failure to do so can end in the destruction of valuable algorithms and significant upheaval in how a tech company operates. For users of these apps, the experience may also change once the data and algorithms are gone.
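The lineage problem described above is easier to see in miniature. In this hypothetical sketch (the record fields, source labels, and model names are all invented), every record is tagged with its source at ingestion; purging a tainted source then becomes a simple filter, and any model derived from the dataset is flagged for destruction or retraining, much as the FTC ordered. Without such tags, the same purge would require forensic reconstruction of where every copy of the data went:

```python
# Sketch of provenance tagging: label each record with its source at
# ingestion so a tainted source can later be purged by filtering, and
# models derived from the data can be flagged. Names are illustrative.

from dataclasses import dataclass, field

@dataclass
class Record:
    user_id: str
    payload: dict
    source: str  # e.g. "adult_signup" vs "child_signup_no_consent"

@dataclass
class Dataset:
    records: list = field(default_factory=list)
    trained_models: list = field(default_factory=list)  # models built from this data

    def purge_source(self, bad_source):
        # Drop records from the tainted source; every derived model is
        # suspect, so flag all of them for destruction/retraining.
        kept = [r for r in self.records if r.source != bad_source]
        removed = len(self.records) - len(kept)
        self.records = kept
        to_destroy = list(self.trained_models)
        self.trained_models = []
        return removed, to_destroy

ds = Dataset(
    records=[
        Record("u1", {"weight_kg": 70}, "adult_signup"),
        Record("u2", {"weight_kg": 38}, "child_signup_no_consent"),
    ],
    trained_models=["reco_v1"],
)

removed, to_destroy = ds.purge_source("child_signup_no_consent")
print(removed, to_destroy)  # → 1 ['reco_v1']
```

The catch, as the episode notes, is that real pipelines copy and transform data freely; unless lineage is recorded up front, there is no `source` field to filter on, which is why regulators push companies to map data usage before trouble arrives.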
Impact on lawful users if controversial algorithms are eliminated: Elimination of controversial algorithms could lead to loss of personalized services for lawful users, including minors and those who have paid for the service, potentially resulting in a less innovative and less responsive digital landscape.
The elimination of controversial algorithms used in apps, as seen in the case of Kurbo, could negatively affect a large number of lawful users. John Verdi of the Future of Privacy Forum explains that these users, who may include minors and people who have paid for the service, could lose a service they enjoy and rely upon. Widespread destruction of such algorithms could make the internet less personalized and less anticipatory, and tech companies would need to adapt quickly to new regulations or find new ways to maintain the current level of personalization and convenience. That process could be lengthy and uncertain, potentially leading to a less innovative and less responsive digital landscape.