
Emerging architectures for LLM applications


17 November 2023


Article Source: Emerging architectures for LLM applications

Abstract

This session covers everything from training models from scratch and fine-tuning open-source models to using hosted APIs, with a particular emphasis on the in-context learning design pattern. Key topics we’ll cover include:

  • Data preprocessing and embedding, focusing on the role of contextual data, embeddings, and vector databases in creating effective LLM applications (illustrated in the sketch after this list).
  • Strategies for prompt construction and retrieval, which are becoming increasingly complex and critical for product differentiation.
  • Prompt execution and inference, analyzing the leading language model providers, their models, and tools used for logging, tracking, and evaluation of LLM outputs.
  • Hosting solutions for LLMs, comparing the common solutions and emerging tools for easier and more efficient hosting of LLM applications.
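
To make the pattern behind the first three points concrete, here is a minimal Python sketch of the in-context learning (retrieval-augmented) flow: embed contextual data, store it in a vector index, retrieve the most similar pieces at query time, and assemble them into a prompt for inference. The `embed` and `complete` functions and the in-memory index are illustrative placeholders standing in for a real embedding model, LLM provider, and vector database; they are assumptions for this sketch, not code from the webinar.

```python
import numpy as np

# --- Hypothetical stand-ins --------------------------------------------------
# In a real application these would call an embedding model and an LLM API
# (hosted provider or self-hosted open-source model). Here they are toy
# placeholders so the sketch runs end to end.

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy deterministic embedding: hash character bigrams into a unit vector."""
    vec = np.zeros(dim)
    for a, b in zip(text.lower(), text.lower()[1:]):
        vec[(ord(a) * 31 + ord(b)) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def complete(prompt: str) -> str:
    """Placeholder for a call to a language model provider."""
    return f"[LLM response to a prompt of {len(prompt)} characters]"

# --- 1. Data preprocessing and embedding -------------------------------------
documents = [
    "Vector databases store embeddings and support similarity search.",
    "In-context learning puts retrieved context directly into the prompt.",
    "Fine-tuning adapts an open-source model's weights to a task.",
]
index = [(doc, embed(doc)) for doc in documents]  # in-memory "vector database"

# --- 2. Retrieval and prompt construction ------------------------------------
def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents whose embeddings are most similar to the query."""
    q = embed(query)
    scored = sorted(index, key=lambda item: float(np.dot(q, item[1])), reverse=True)
    return [doc for doc, _ in scored[:k]]

def build_prompt(query: str) -> str:
    """Assemble retrieved context and the user question into a single prompt."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query))
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}"
    )

# --- 3. Prompt execution and inference ---------------------------------------
if __name__ == "__main__":
    question = "How does in-context learning use a vector database?"
    prompt = build_prompt(question)
    print(prompt)
    print(complete(prompt))
```

In production, the in-memory index would typically be replaced by a dedicated vector database, and the placeholder calls by a hosted or self-hosted model endpoint, with logging and evaluation wrapped around the inference step.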

Whether you’re a seasoned AI professional, a developer beginning your journey with LLMs, or simply an enthusiast interested in the applications of AI, this webinar offers valuable insights that can help you navigate the rapidly evolving landscape of LLMs.

