Building with LangChain

LLM session management with Redis

About

When you’re building with LLMs, memory matters. Here, Ricardo Ferreira shows you how to give your AI app a brain—by storing and reusing conversation history with LangChain and Redis. See how to connect chat memory to an OpenAI-powered LLM so your app can pick up right where it left off.

18 minutes
Key topics
  1. Build chat memory using LangChain and Redis
  2. Reuse past messages to make LLM responses smarter and more contextual
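The session walks through this pattern in Python. As a rough sketch of the idea (not the exact code from the video), the snippet below wires an OpenAI chat model to a Redis-backed message history using LangChain's RunnableWithMessageHistory; the model name, Redis URL, and session ID are illustrative assumptions, and it presumes a local Redis instance plus an OPENAI_API_KEY in the environment.

```python
# Minimal sketch: LangChain chat memory backed by Redis, reused across turns.
# Package names (langchain-openai, langchain-community), the model, and the
# session_id below are assumptions for illustration, not taken from the session.
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_community.chat_message_histories import RedisChatMessageHistory

llm = ChatOpenAI(model="gpt-4o-mini")

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder(variable_name="history"),  # prior turns loaded from Redis
    ("human", "{input}"),
])

chain = prompt | llm

def get_history(session_id: str) -> RedisChatMessageHistory:
    # Each session_id maps to its own Redis key, so conversations stay isolated.
    return RedisChatMessageHistory(session_id, url="redis://localhost:6379/0")

chat = RunnableWithMessageHistory(
    chain,
    get_history,
    input_messages_key="input",
    history_messages_key="history",
)

config = {"configurable": {"session_id": "user-42"}}
chat.invoke({"input": "My name is Ricardo."}, config=config)

# A later call with the same session_id pulls the stored messages back in,
# so the model answers with context from the earlier turn.
print(chat.invoke({"input": "What's my name?"}, config=config).content)
```

Because the history lives in Redis rather than in process memory, restarting the app and invoking with the same session ID picks the conversation up where it left off.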
Speakers
Ricardo Ferreira

Principal Developer Advocate

