Please join Philip Ninan (Alliance Development Manager, AI, Pure Storage) and Rajiv Mandal (AI Solutions Architect, Intel) for Getting Your Data AI-Ready: Planning Considerations for Enterprise AI Models.
Generative Artificial Intelligence (GenAI) applications have the potential to increase operational efficiency, improve business process responsiveness, and drive new product innovation. Realizing this potential requires enterprises to leverage massive amounts of their corporate data in a Retrieval-Augmented Generation (RAG) pipeline that combines with a Large Language Model (LLM) to fully enable a GenAI solution. Successfully aggregating significant amounts of corporate data and then processing that data for RAG inference requires addressing a number of data challenges, including:
• GenAI System Design: Compute Performance and Availability, Energy Efficiency and Power Budget, Storage Capacity, Performance, and Scalability
• Data Preparation: Staging Structured and Unstructured Data, Data Cleansing and Labeling
• RAG/Inference Pipeline Building: Document and Data Tokenizing, Embedding, Checkpointing
• Governance: Access Policies, Policy Enforcement, Backup/Recovery
• GenAI System Growth: GPU/CPU Scaling, Data Storage Scaling
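The RAG pipeline steps listed above (tokenizing, embedding, retrieval) can be sketched with a toy bag-of-words retriever. This is an illustration only, under simplified assumptions: production pipelines use learned embedding models, a vector database, and chunked documents rather than whole-document word counts.

```python
# Toy RAG retrieval step: tokenize documents, embed them as bag-of-words
# vectors, and retrieve the closest document for a query by cosine similarity.
# Illustrative only -- real deployments use learned embeddings and a vector store.
import math
from collections import Counter


def tokenize(text: str) -> list[str]:
    # Naive whitespace tokenizer; real pipelines use subword tokenizers.
    return text.lower().split()


def embed(tokens: list[str]) -> Counter:
    # Bag-of-words "embedding": a sparse term-frequency vector.
    return Counter(tokens)


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(query: str, corpus: list[str]) -> str:
    # Return the corpus document most similar to the query; in a real RAG
    # system this context would be prepended to the LLM prompt.
    q = embed(tokenize(query))
    return max(corpus, key=lambda d: cosine(q, embed(tokenize(d))))


corpus = [
    "storage capacity planning for ai training clusters",
    "quarterly sales figures for the retail division",
]
print(retrieve("how much storage does ai training need", corpus))
```

The retrieved text would then be injected into the LLM prompt as grounding context, which is the step that distinguishes RAG from plain LLM inference.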
Intel and Pure Storage will discuss these topics and present a range of recommendations and best practices to address these challenges and to accelerate your organization’s GenAI project deployment.
- Posted: Aug 27, 2024
- Type: Replay