
    Podcast Summary

• Pairing LLMs with knowledge graphs: Pairing large language models with knowledge graphs and vector search can enhance retrieval methods, prompt engineering, and overall performance.

While large language models (LLMs) like ChatGPT offer impressive capabilities, they need reliable data and context to function effectively. The team at Neo4j is exploring how to enhance LLMs by pairing them with knowledge graphs and vector search, an approach that can improve retrieval methods, prompt engineering, and overall performance. Enterprise AI adoption is ongoing, with a growing trend toward using multiple model providers and open models, and companies are still navigating the complexities of implementing and integrating these solutions. Data science teams, meanwhile, are shifting toward more specialized roles, reflecting the growing complexity of the field.
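As an illustrative sketch of the retrieval pattern described above, the toy example below combines vector similarity search with a knowledge-graph lookup to assemble context for an LLM prompt. All document names, embeddings, and graph entries here are invented for illustration; a real system would use an embedding model and a graph database such as Neo4j rather than in-memory dictionaries.

```python
import math

# Toy 3-dimensional "embeddings" standing in for a real embedding model.
DOC_EMBEDDINGS = {
    "neo4j_intro": [0.9, 0.1, 0.0],
    "vector_search": [0.8, 0.3, 0.1],
    "llm_basics": [0.1, 0.9, 0.2],
}

# A tiny knowledge graph: each document links to related entities.
KNOWLEDGE_GRAPH = {
    "neo4j_intro": ["graph_database", "cypher"],
    "vector_search": ["embeddings", "similarity"],
    "llm_basics": ["transformers"],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def hybrid_retrieve(query_embedding, top_k=2):
    """Rank documents by vector similarity, then expand each hit with
    its graph neighbours to enrich the context handed to the LLM."""
    ranked = sorted(
        DOC_EMBEDDINGS,
        key=lambda doc: cosine(query_embedding, DOC_EMBEDDINGS[doc]),
        reverse=True,
    )[:top_k]
    return {doc: KNOWLEDGE_GRAPH.get(doc, []) for doc in ranked}

# A query embedding close to the graph/vector-search documents.
context = hybrid_retrieve([0.85, 0.2, 0.05])
print(context)
```

The graph expansion step is what distinguishes this from plain vector search: the retrieved entities give the LLM structured relationships to ground its answer, not just raw text chunks.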

• AI industry maturation: The AI industry is shifting from hype and marketing to practical applications, recognizing the strengths and limitations of different models, integrating software and AI teams, and focusing on unique value propositions.

The software industry is entering a maturing phase: the focus is shifting from hype and marketing to practical applications and combinations of models and technologies. Data science teams still exist and are expanding, but there is now clearer recognition of the strengths and limitations of different AI models. Companies are also starting to integrate software and AI teams operationally, making more efficient use of resources, and the idea of full-stack data science or AI agent development is materializing as software and data science roles merge in larger organizations. Data science continues to grow as a function, increasingly focused on each organization's unique value proposition, and the industry as a whole is moving toward a more holistic, efficient approach to AI and software development.

• Apple's approach to AI: Apple positions AI as a feature enhancement, integrates it into devices and tasks, addresses privacy concerns, and provides tools for building advanced AI features.

Apple's approach to AI differentiates it from other tech companies by positioning AI as a feature enhancement rather than the product itself. This was evident in the WWDC 2024 announcements, where Apple focused on integrating AI into devices and everyday tasks. The shift in focus drew a positive response, especially from those skeptical of the hype surrounding AI products. Apple also addressed privacy concerns about its use of external AI models by letting users decide, on a per-use basis, whether to send a request to those models. That choice empowers users and reflects the broader trade-off between closed model providers and open models, which organizations navigate based on their own needs and resources. Separately, early adoption of Plumb, a tool for building advanced AI features, is helping product teams transform data and produce reliable, high-quality structured output, enabling faster idea validation. Together, Apple's approach and tools like Plumb illustrate the ongoing evolution of AI integration across industries and applications.

• Internal hosting vs third-party APIs: Organizations are prioritizing data control and privacy by hosting models internally despite the added effort and resources required, while third-party APIs offer convenience and advanced functionality at the cost of data leaving the network and potentially biased processing.

Organizations are increasingly turning to internally hosted models because of privacy and data-control concerns with third-party APIs. Third-party APIs offer convenience and advanced functionality, but they carry significant risks: data leaves the organization's network, and closed, productized systems can process user inputs in biased or opinionated ways, making decisions about that data without the user's full knowledge or control. Internal hosting lets organizations keep full control over their data and models, even though it demands more effort and resources. Open models may not match the performance of closed, productized systems, but for many organizations that trade-off is worth it to retain control over their data and AI features. Both approaches have pros and cons, the choice depends on an organization's specific needs and priorities, and this control-first mindset is still developing across the industry.

• Retrieval Augmented Generation: RAG techniques have evolved beyond direct response generation, using hypothetical documents and query transformation to reshape queries for better retrieval.

Retrieval Augmented Generation (RAG) techniques have evolved beyond the naive approach of retrieving on the raw input query and generating a response directly. More advanced methods use hypothetical documents and query transformation, regenerating queries to better suit the retrieval task at hand. The discussion sketched this wider picture of RAG without covering every aspect in detail. The typical approach may be sufficient in some cases, but many practitioners get stuck at that baseline, so it is worth knowing that more advanced tools exist to move beyond it. The conversation underscored the importance of staying informed about the latest RAG developments and weighing the potential benefits of more sophisticated methods.
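One concrete instance of the query-transformation idea mentioned above is HyDE (Hypothetical Document Embeddings): rather than embedding the raw query, the system asks an LLM to write a plausible answer passage and retrieves against that passage instead. The sketch below stubs out the LLM call and uses a toy character-frequency embedding; every function and name here is illustrative, not a real library API.

```python
import math

def embed(text):
    """Toy embedding: normalized letter-frequency vector.
    A real system would call an embedding model here."""
    alphabet = "abcdefghijklmnopqrstuvwxyz"
    counts = [text.lower().count(c) for c in alphabet]
    norm = math.sqrt(sum(n * n for n in counts)) or 1.0
    return [n / norm for n in counts]

def generate_hypothetical_document(query):
    """Stand-in for an LLM call (hypothetical): HyDE asks the model
    to draft a plausible answer passage, which is then embedded and
    used for retrieval in place of the raw query."""
    return f"A detailed passage answering the question: {query}"

def hyde_retrieve(query, corpus, top_k=1):
    """Transform the query into a hypothetical document, then rank
    corpus documents by similarity to that document's embedding."""
    hypo_vec = embed(generate_hypothetical_document(query))
    ranked = sorted(
        corpus,
        key=lambda doc: sum(a * b for a, b in zip(hypo_vec, embed(doc))),
        reverse=True,
    )
    return ranked[:top_k]

corpus = [
    "retrieval augmented generation combines search with language models",
    "bananas are a yellow fruit",
]
print(hyde_retrieve("what is retrieval augmented generation", corpus))
```

The intuition is that a hypothetical answer lives closer in embedding space to the real answer documents than a short question does, so retrieval against it tends to surface better context.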

    Recent Episodes from Practical AI: Machine Learning, Data Science

    Stanford's AI Index Report 2024

    We’ve had representatives from Stanford’s Institute for Human-Centered Artificial Intelligence (HAI) on the show in the past, but we were super excited to talk through their 2024 AI Index Report after such a crazy year in AI! Nestor from HAI joins us in this episode to talk about some of the main takeaways including how AI makes workers more productive, the US is increasing regulations sharply, and industry continues to dominate frontier AI research.

    Apple Intelligence & Advanced RAG

    Daniel & Chris engage in an impromptu discussion of the state of AI in the enterprise. Then they dive into the recent Apple Intelligence announcement to explore its implications. Finally, Daniel leads a deep dive into a new topic - Advanced RAG - covering everything you need to know to be practical & productive.

    The perplexities of information retrieval

    Daniel & Chris sit down with Denis Yarats, Co-founder & CTO at Perplexity, to discuss Perplexity’s sophisticated AI-driven answer engine. Denis outlines some of the deficiencies in search engines, and how Perplexity’s approach to information retrieval improves on traditional search engine systems, with a focus on accuracy and validation of the information provided.

    Using edge models to find sensitive data

We’ve all heard about breaches of privacy and leaks of private health information (PHI). For healthcare providers and those storing this data, knowing where all the sensitive data is stored is non-trivial. Ramin, from Tausight, joins us to discuss how they deploy edge AI models to help companies search through billions of records for PHI.

    Rise of the AI PC & local LLMs

    We’ve seen a rise in interest recently and a number of major announcements related to local LLMs and AI PCs. NVIDIA, Apple, and Intel are getting into this along with models like the Phi family from Microsoft. In this episode, we dig into local AI tooling, frameworks, and optimizations to help you navigate this AI niche, and we talk about how this might impact AI adoption in the longer term.

    AI in the U.S. Congress

    At the age of 72, U.S. Representative Don Beyer of Virginia enrolled at GMU to pursue a Master’s degree in C.S. with a concentration in Machine Learning. Rep. Beyer is Vice Chair of the bipartisan Artificial Intelligence Caucus & Vice Chair of the NDC’s AI Working Group. He is the author of the AI Foundation Model Transparency Act & a lead cosponsor of the CREATE AI Act, the Federal Artificial Intelligence Risk Management Act & the Artificial Intelligence Environmental Impacts Act. We hope you tune into this inspiring, nonpartisan conversation with Rep. Beyer about his decision to dive into the deep end of the AI pool & his leadership in bringing that expertise to Capitol Hill.

    Full-stack approach for effective AI agents

There’s a lot of hype about AI agents right now, but developing robust agents isn’t yet a reality in general. Imbue is leading the way towards more robust agents by taking a full-stack approach, from hardware innovations through to user interface. In this episode, Josh, Imbue’s CTO, tells us more about their approach and some of what they have learned along the way.

    Private, open source chat UIs

    We recently gathered some Practical AI listeners for a live webinar with Danny from LibreChat to discuss the future of private, open source chat UIs. During the discussion we hear about the motivations behind LibreChat, why enterprise users are hosting their own chat UIs, and how Danny (and the LibreChat community) is creating amazing features (like RAG and plugins).