The code behind the context: RAG compared to 1MM context LLMs

Priya Dwivedi
Published in Generative AI
11 min read · Apr 19, 2024


Cover image from Unsplash

Introduction

Augmenting language models with external knowledge is a critical challenge in natural language processing. The main approach so far has been Retrieval Augmented Generation (RAG), but can long-context LLMs replace RAG? RAG employs a two-step process: first retrieving relevant information from a knowledge base, then generating output…
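The two-step retrieve-then-generate loop can be sketched in a few lines. This is a toy illustration, not the article's actual pipeline: the corpus, the word-overlap scoring (a stand-in for real vector search), and the prompt template are all assumptions for demonstration.

```python
# Toy sketch of RAG's two steps: (1) retrieve relevant documents,
# (2) build an augmented prompt for the generator LLM.
# Corpus, scoring, and template are illustrative assumptions.

def retrieve(query, corpus, k=2):
    """Rank documents by naive word overlap with the query
    (a stand-in for embedding-based vector search)."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, docs):
    """Step two: assemble the context-augmented prompt the LLM receives."""
    context = "\n".join(f"- {d}" for d in docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "RAG retrieves relevant passages before generation.",
    "Long context models can read entire documents directly.",
    "Tokenizers split text into subword units.",
]
query = "How does RAG use retrieval before generation?"
docs = retrieve(query, corpus)
prompt = build_prompt(query, docs)
```

In a production system the overlap scorer would be replaced by an embedding model plus a vector index, but the control flow stays the same: retrieval narrows the knowledge base down to a few passages, and only those passages enter the model's context window.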


CEO of Deep Learning Analytics, an AI/ML consultancy. We build custom models for different use cases. Check us out at: https://deeplearninganalytics.org