How to Think About Gen AI Use Cases
Abstract
The article argues for a thoughtful, strategic approach to implementing Generative AI (Gen AI) in businesses, drawing on lessons from roughly 30 years of traditional AI implementation. It contends that businesses should start with employee-facing Gen AI use cases, rather than rushing to deploy customer-facing chatbots, in order to manage risk and extract measurable near-term value.
Q&A
[01] Getting the Foundations Right
1. What are the key differences between traditional AI and Gen AI?
- Traditional AI was developed to reduce uncertainty, whereas Gen AI was developed to solve a different set of problems: reducing time wasted searching for information, speeding up and improving the accuracy of information summarization, and lowering the cost and effort of creating non-unique content.
- The foundations of traditional AI are data marts, data warehouses, and data lakes, while the foundation for Gen AI is the large language model (LLM), which represents a "single source of truth" similar to the data warehouse.
2. What are the analogous operating principles for Gen AI to achieve the sustainability, scalability, and reusability of traditional AI? The author does not yet hold strong opinions on this, but early observations suggest the operating principles may center on the following (a brief illustrative sketch in code follows the list):
- Establishing a common set of LLMs as the foundational layer
- Developing standards and practices for prompt engineering and prompt libraries
- Ensuring robust data governance and model management practices
- Enabling seamless integration of Gen AI into existing workflows and systems
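These principles amount to a thin, shared platform layer. As a purely illustrative example, here is a minimal Python sketch of a common LLM interface plus a small, versioned prompt library; the names (`LLMClient`, `PromptTemplate`, `PROMPT_LIBRARY`, `summarize_report`) are hypothetical and not from the article, and a real implementation would sit behind the organization's approved model endpoints.

```python
from dataclasses import dataclass
from typing import Protocol


class LLMClient(Protocol):
    """Common interface over the organization's approved LLMs (the foundational layer)."""
    def complete(self, prompt: str) -> str: ...


@dataclass(frozen=True)
class PromptTemplate:
    """A reviewed, versioned prompt kept in a shared library rather than ad hoc in each team."""
    name: str
    version: str
    template: str

    def render(self, **fields: str) -> str:
        return self.template.format(**fields)


# A tiny "prompt library": teams reuse governed templates instead of re-inventing them.
PROMPT_LIBRARY = {
    "summarize_report": PromptTemplate(
        name="summarize_report",
        version="1.0",
        template="Summarize the following internal report in 5 bullet points:\n\n{report_text}",
    ),
}


def summarize_report(llm: LLMClient, report_text: str) -> str:
    """Employee-facing use case: summarization built on the shared LLM layer."""
    prompt = PROMPT_LIBRARY["summarize_report"].render(report_text=report_text)
    return llm.complete(prompt)
```

In a real deployment, `LLMClient.complete` would route to the organization's governed model endpoint and log each call, so that prompt versions and model usage remain auditable.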
3. Why should businesses start with employee-facing use cases for Gen AI, rather than customer-facing chatbots?
- Employee-facing use cases provide a high-yield, low-risk opportunity to learn from and mature Gen AI solutions, as employees have sufficient "working sense" to evaluate the output.
- Customer-facing chatbots are a higher-risk application, and businesses should only pursue them after sufficient learnings and implementation experience with employee-facing use cases.
[02] Lessons from Traditional AI Implementation
1. How did the application of traditional AI evolve across industries?
- Traditional AI first found a footing in marketing analytics and customer science, as these were high-yield, low-risk use cases.
- Over time, the application of traditional AI moved "inwards" towards organizational risk management, operations decision automation, and financial optimization, as businesses gained more experience and learnings.
2. What are the key lessons from the implementation of traditional AI that can be applied to Gen AI?
- The foundations of traditional AI, such as data marts, data warehouses, and data lakes, provide lessons on how to structure, index, and run efficient computations on data to achieve sustainability, scalability, and reusability.
- Similarly, the large language model (LLM) represents the foundational layer for Gen AI, and understanding how to manage and govern this layer will be crucial for successful Gen AI implementation.
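As a loose analogy to data-warehouse governance, one hedged illustration of "managing and governing" the LLM layer is a simple model registry that records which models are approved, by whom, and for which use cases. The sketch below is hypothetical; `ModelRecord`, `MODEL_REGISTRY`, and `resolve_model` are illustrative names, not anything described in the article.

```python
from dataclasses import dataclass
from typing import Dict, Tuple


@dataclass(frozen=True)
class ModelRecord:
    """Metadata for one approved LLM, analogous to a governed table in a data warehouse."""
    model_id: str
    version: str
    owner: str
    approved_use_cases: Tuple[str, ...]


# Hypothetical registry: the single place that answers "which model may this use case call?"
MODEL_REGISTRY: Dict[str, ModelRecord] = {
    "internal-summarizer": ModelRecord(
        model_id="internal-summarizer",
        version="2024-01",
        owner="data-platform-team",
        approved_use_cases=("employee_report_summaries", "policy_search"),
    ),
}


def resolve_model(use_case: str) -> ModelRecord:
    """Return the approved model for a use case, or fail loudly if none is governed."""
    for record in MODEL_REGISTRY.values():
        if use_case in record.approved_use_cases:
            return record
    raise LookupError(f"No approved LLM registered for use case: {use_case}")
```

This mirrors the "single source of truth" idea: use cases look up their approved model in one governed place rather than embedding model choices ad hoc across teams.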