~50 min read · updated 2026-05-10

Memory

Short-term, long-term, episodic, semantic — what each memory type does and where production agents actually use them.

The agent loop in module 01 already has in-session memory via its message history. That covers a single task. The moment you want an agent that remembers what happened yesterday, or that knows facts beyond its training data, you need something more.
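That in-session memory can be sketched in a few lines. Everything here is illustrative, not the module-01 code: the `remember` helper and the message cap are invented for this sketch, and production agents usually budget by tokens rather than message count.

```python
# Short-term memory: a rolling message history, capped so the context
# window never overflows. Names and the cap are illustrative.

MAX_MESSAGES = 20  # toy cap; real agents budget by tokens, not messages

def remember(history: list[dict], role: str, content: str) -> list[dict]:
    """Append a message, then trim the oldest turns past the cap,
    always preserving the first (system) message."""
    history = history + [{"role": role, "content": content}]
    if len(history) > MAX_MESSAGES:
        # keep the system prompt, drop the oldest user/assistant turns
        history = [history[0]] + history[-(MAX_MESSAGES - 1):]
    return history

history = [{"role": "system", "content": "You are a helpful agent."}]
for i in range(30):
    history = remember(history, "user", f"message {i}")

# After 30 turns, the system prompt survives and only the most
# recent turns remain in context.
```

The trimming policy is the interesting design choice: dropping oldest-first is the simplest option, and the summarization approach covered later in this module is the usual upgrade when those dropped turns still matter.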

This module is being expanded with code examples for short-term context management, vector-backed long-term memory using pgvector, episodic memory for conversation history, and the trade-offs between Mem0, Letta (formerly MemGPT), Zep, and rolling your own.
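Until those examples land, here is a taste of the "rolling your own" end of that spectrum: an in-memory vector store with cosine-similarity retrieval. The `MemoryStore` class and the toy character-frequency embedding are stand-ins invented for this sketch; a real store would call an embedding model and persist to something like pgvector, but the add/search interface is the same shape.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors (0.0 if either is zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    """Minimal long-term memory: store (text, embedding) pairs,
    retrieve the top-k most similar to a query. Illustrative only."""

    def __init__(self, embed):
        self.embed = embed
        self.items: list[tuple[str, list[float]]] = []

    def add(self, text: str) -> None:
        self.items.append((text, self.embed(text)))

    def search(self, query: str, k: int = 3) -> list[str]:
        q = self.embed(query)
        ranked = sorted(self.items, key=lambda it: cosine(q, it[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

# Toy "embedding": letter frequencies. A real system would call an
# embedding model here; only the vector quality changes, not the flow.
def toy_embed(text: str) -> list[float]:
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - 97] += 1.0
    return vec

store = MemoryStore(toy_embed)
store.add("user prefers dark mode")
store.add("deploy runs on Fridays")
store.add("user's name is Priya")
```

With a real embedding model, `store.search("when do deploys run")` surfaces the deploy note even without shared keywords; the hosted options above (Mem0, Letta, Zep) wrap this same store-and-retrieve loop with persistence, scoping, and ranking built in.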

Coming in the next revision: short-term vs long-term primitives, vector store basics, retrieval ranking, summarization for context compression, and a worked example wiring an OpenAI-embedding-backed pgvector store into the agent from module 01.
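One of those topics, summarization for context compression, is simple enough to preview now: when the history outgrows its budget, collapse the older turns into a single summary message and keep only the recent ones verbatim. The `compress` helper and `fake_summarize` below are hypothetical; a real implementation would replace `fake_summarize` with a chat-model call.

```python
def compress(history: list[dict], keep_recent: int, summarize) -> list[dict]:
    """Collapse everything except the system prompt and the most recent
    `keep_recent` turns into one summary message. `summarize` stands in
    for an LLM call ("summarize these messages in two sentences")."""
    if len(history) <= 1 + keep_recent:
        return history  # nothing old enough to compress
    system = history[0]
    old, recent = history[1:-keep_recent], history[-keep_recent:]
    summary = {
        "role": "system",
        "content": f"Summary of earlier conversation: {summarize(old)}",
    }
    return [system, summary] + recent

# Placeholder summarizer -- a real agent calls a chat model here.
def fake_summarize(messages: list[dict]) -> str:
    return f"{len(messages)} earlier messages about the user's task."

history = [{"role": "system", "content": "You are a helpful agent."}]
history += [{"role": "user", "content": f"turn {i}"} for i in range(10)]
compressed = compress(history, keep_recent=4, summarize=fake_summarize)
# compressed: system prompt, one summary message, then the last 4 turns
```

The trade-off is lossy recall for bounded context: details the summarizer drops are gone, which is exactly the gap the long-term vector store above exists to cover.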

Next: Module 05 — Planning patterns.