Stop Thinking, Just Do!

Sungsoo Kim's Blog

Building an LLMOps Stack for Large Language Models


24 January 2024


Article Source


Building an LLMOps Stack for Large Language Models

Abstract

In this talk, speakers Rafael and Puneet from Databricks discuss the construction and optimization of an LLMOps architecture. They cover components including MLflow for Large Language Models (LLMs), vector databases, embeddings, and compute optimizations.

Topics that are covered:

✅ MLflow for LLMs: Discover the role of MLflow in managing and streamlining LLM workflows (a minimal tracking sketch follows this list).

✅ Vector Databases: Learn why vector databases matter in the LLMOps stack and how they enable efficient storage, indexing, and retrieval of high-dimensional vector data (see the in-memory index sketch below).

✅ RAG Strategies & Techniques: Explore retrieval-augmented generation approaches for grounding LLM responses in retrieved context (a bare-bones RAG loop is sketched below).

✅ Prompt Tracking & Evaluation: Learn the key metrics and methods for evaluating the effectiveness of your large language models and for tracking prompts across experiments (a toy evaluation loop is sketched below).
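
As a rough illustration of the MLflow topic above, here is a minimal sketch of tracking an LLM prompt experiment with MLflow's standard tracking APIs. The model name, prompt template, and metric values are placeholders for illustration, not details from the talk.

```python
# Minimal sketch: track an LLM prompt experiment with MLflow.
# All parameter and metric values below are illustrative placeholders.
import mlflow

PROMPT_TEMPLATE = (
    "Answer the question using only the context.\n\n"
    "Context: {context}\nQuestion: {question}"
)

with mlflow.start_run(run_name="llm-prompt-experiment"):
    # Log the configuration that defines this prompt/model variant.
    mlflow.log_params({
        "model_name": "example-llm-7b",   # hypothetical model identifier
        "temperature": 0.2,
        "max_tokens": 256,
    })
    # Version the prompt template as a text artifact alongside the run.
    mlflow.log_text(PROMPT_TEMPLATE, "prompt_template.txt")

    # ... run the model over an evaluation set here ...

    # Log evaluation metrics so prompt/model variants can be compared later.
    mlflow.log_metrics({
        "exact_match": 0.61,    # placeholder numbers
        "avg_latency_s": 1.8,
    })
```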
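
The vector-database item concerns storing, indexing, and retrieving high-dimensional embeddings. The toy in-memory index below sketches that idea with NumPy and a placeholder embedding function; a production stack would use a real embedding model and a dedicated vector database rather than this stand-in.

```python
# Minimal sketch of what a vector database does: store embeddings, index them,
# and retrieve nearest neighbours for a query. embed() is a placeholder.
import numpy as np

def embed(texts: list[str], dim: int = 384) -> np.ndarray:
    """Placeholder embedding model: deterministic random vectors per text."""
    vecs = []
    for text in texts:
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        vecs.append(rng.normal(size=dim))
    return np.asarray(vecs, dtype="float32")

class TinyVectorIndex:
    """In-memory stand-in for a vector database with cosine-similarity search."""

    def __init__(self, dim: int = 384):
        self.dim = dim
        self.vectors = np.empty((0, dim), dtype="float32")
        self.texts: list[str] = []

    def add(self, texts: list[str]) -> None:
        vecs = embed(texts, self.dim)
        vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)  # normalise -> cosine
        self.vectors = np.vstack([self.vectors, vecs])
        self.texts.extend(texts)

    def search(self, query: str, k: int = 3) -> list[tuple[str, float]]:
        q = embed([query], self.dim)[0]
        q /= np.linalg.norm(q)
        scores = self.vectors @ q                 # cosine scores against all docs
        top = np.argsort(-scores)[:k]
        return [(self.texts[i], float(scores[i])) for i in top]

index = TinyVectorIndex()
index.add([
    "MLflow tracks parameters, prompts, and metrics for each run.",
    "Vector databases store and index high-dimensional embeddings.",
])
print(index.search("Where are embeddings stored?", k=1))
```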
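
For the RAG item, the following sketch shows the bare retrieve-augment-generate loop. Here retrieve() and generate() are assumed placeholders; in practice they would query a vector store (as in the sketch above) and call an actual LLM endpoint.

```python
# Minimal sketch of the basic RAG loop: retrieve context, build an augmented
# prompt, and pass it to the generator. Both helpers are placeholders.
def retrieve(question: str, k: int = 3) -> list[str]:
    # Placeholder: return the k most relevant chunks from a vector store.
    return ["MLflow can log prompts, parameters, and evaluation metrics."][:k]

def generate(prompt: str) -> str:
    # Placeholder: call an LLM (e.g. a model serving endpoint).
    return f"[model answer based on a prompt of {len(prompt)} characters]"

def answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return generate(prompt)

print(answer("How can I track prompt experiments?"))
```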
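
Finally, for prompt tracking and evaluation, here is a toy evaluation loop with two simple metrics (exact match and token overlap). The eval set and predict() function are invented for illustration; a real setup would use richer metrics or an LLM-as-judge, and would log the results to MLflow as in the first sketch.

```python
# Toy evaluation loop over a tiny, made-up eval set.
def predict(question: str) -> str:
    return "MLflow tracks experiments."   # stand-in for a real model call

EVAL_SET = [
    {"question": "What does MLflow do?", "reference": "MLflow tracks experiments."},
    {"question": "What stores embeddings?", "reference": "A vector database stores embeddings."},
]

def token_overlap(pred: str, ref: str) -> float:
    p, r = set(pred.lower().split()), set(ref.lower().split())
    return len(p & r) / max(len(r), 1)

exact = overlap = 0.0
for row in EVAL_SET:
    pred = predict(row["question"])
    exact += float(pred.strip() == row["reference"].strip())
    overlap += token_overlap(pred, row["reference"])

n = len(EVAL_SET)
print({"exact_match": exact / n, "token_overlap": overlap / n})
```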

About LLMOps Space

LLMOps.Space is a global community for LLM practitioners. 💡📚

The community focuses on content, discussions, and events related to deploying LLMs into production. 🚀

