Memory Buffer Window

Conversation memory for AI agents. 1-hour TTL, KV-backed, blockchain-scoped.

Written by pvdyck

The Memory Buffer Window stores recent conversation history for AI agents, enabling multi-turn conversations where the agent remembers previous messages.

How It Works

The node maintains a sliding window of the last N message pairs (human + AI). When the window is full, the oldest messages are dropped. Memory is persisted in Cloudflare KV and automatically expires after 1 hour of inactivity.
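The sliding-window behavior described above can be sketched as follows. This is an illustrative model, not the actual implementation; the class and field names are assumptions, and persistence to KV is omitted.

```typescript
// Illustrative sketch of the sliding window: keep the last N
// human/AI message pairs and drop the oldest once the window is full.
type MessagePair = { human: string; ai: string };

class MemoryBufferWindow {
  private pairs: MessagePair[] = [];

  // contextWindowLength corresponds to the Context Window Length
  // parameter (default: 5).
  constructor(private contextWindowLength: number = 5) {}

  add(human: string, ai: string): void {
    this.pairs.push({ human, ai });
    // Sliding window: discard the oldest pair when over capacity.
    if (this.pairs.length > this.contextWindowLength) {
      this.pairs.shift();
    }
  }

  // The history supplied to the model on the next turn.
  history(): MessagePair[] {
    return [...this.pairs];
  }
}
```

With a window of 2, adding a third message pair evicts the first, so the agent only "remembers" the two most recent exchanges.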

Parameters

  • Context Window Length: Number of past message pairs to retain (default: 5). Higher values give more context but increase token usage.
  • Session ID: Custom session identifier (V1.1+). Defaults to an auto-generated ID. Use a fixed ID to share memory across executions.
  • Session Key: Additional scoping key for memory isolation (V1.2+).

Technical Details

  • Storage: Cloudflare KV (key-value store)
  • Scope: Per service instance: chainId + contractAddress + tokenId + sessionId
  • TTL: 1 hour. Memory expires 1 hour after the last message; this is not configurable.
  • Isolation: Different service NFTs and sessions have completely separate memory.
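The per-instance scoping can be pictured as a composite storage key. A minimal sketch, assuming a colon-joined key format (the field names come from the scope description above; the actual key layout is not documented):

```typescript
// Hypothetical key composition for the KV-backed memory.
// Every distinct service NFT + session combination yields a distinct
// key, which is what provides the isolation described above.
interface MemoryScope {
  chainId: number;
  contractAddress: string;
  tokenId: string;
  sessionId: string;
}

// Fixed 1-hour expiry; Workers KV supports this natively via the
// `expirationTtl` option on put().
const TTL_SECONDS = 3600;

function memoryKey(s: MemoryScope): string {
  return `memory:${s.chainId}:${s.contractAddress}:${s.tokenId}:${s.sessionId}`;
}
```

Because the session ID is part of the key, changing it (or letting it auto-generate) starts a fresh, empty conversation history.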

Connects To

  • Agent: Provides conversation memory for multi-turn agent interactions.
  • Chain LLM: Adds conversation context to basic LLM chain calls.

Limitations

  • 1-hour TTL: memory is lost after 1 hour of inactivity, and the expiry cannot be extended.
  • Memory cannot persist across different service contracts.
  • Larger context windows increase token costs on every request.
  • Not compatible with queue mode: in n8n queue mode, multiple Simple Memory calls may not route to the same worker, causing memory loss. (This is an n8n-level limitation; indie.money uses KV-backed storage, which avoids the issue.)

Tips

  • Start with a Context Window Length of 3-5 for most use cases. Only increase if the agent needs extensive conversation history.
  • Use a fixed Session ID (e.g., user wallet address) to maintain conversation continuity across separate workflow executions.
  • For longer-lived memory, use an external key-value store via an HTTP Request node.
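The fixed-Session-ID tip can be as simple as deriving the ID from the caller's wallet address. A sketch (the helper name is illustrative):

```typescript
// Derive a stable Session ID from a wallet address so separate
// workflow executions share one conversation history.
function sessionIdFor(walletAddress: string): string {
  // Normalizing case keeps "0xAbC..." and "0xabc..." in the same
  // session, since hex addresses are case-insensitive.
  return walletAddress.toLowerCase();
}
```

Pass the result as the node's Session ID parameter (V1.1+); as long as the same wallet calls the service within the 1-hour TTL, the agent retains its history.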

Related