In AgentCoreMemorySessionManager.retrieve_customer_context, retrieved long-term memory (LTM) context is appended as an assistant message AFTER the last user message:
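A minimal sketch of that behavior, using simplified message dicts (the function signature and field names here are illustrative, not the session manager's actual API):

```python
def retrieve_customer_context(messages: list[dict], retrieved_memories: list[str]) -> list[dict]:
    """Current pattern: append retrieved LTM context AFTER the last user message."""
    context_block = "\n".join(retrieved_memories)
    memory_message = {
        "role": "assistant",
        "content": f"<user_context>\n{context_block}\n</user_context>",
    }
    # The memory dump becomes the final message in the prompt,
    # pushing the actual user request away from the end.
    return messages + [memory_message]
```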
Why change:
- The last message in the prompt becomes the memory dump rather than the user request, which can hurt instruction-following.
- Long-context positional bias: models over-weight information near the beginning and end of the prompt, so keep memory near the end but keep the USER message last. ("Lost in the Middle", Liu et al. 2023: https://arxiv.org/abs/2307.03172)
- The tag <user_context> is misleading: this content is retrieved memory, not user-authored.
Proposed fix:
- Insert the retrieved memory message immediately BEFORE the last user message, so the user message stays last (see the sketch after this list).
- Rename wrapper tag to <retrieved_memory> (or similar).
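A minimal sketch of the proposed ordering, under the same simplified message shape as above (names are illustrative, not the library's actual API):

```python
def inject_retrieved_memory(messages: list[dict], retrieved_memories: list[str]) -> list[dict]:
    """Proposed pattern: insert retrieved LTM context immediately BEFORE the last user message."""
    context_block = "\n".join(retrieved_memories)
    memory_message = {
        "role": "assistant",
        "content": f"<retrieved_memory>\n{context_block}\n</retrieved_memory>",
    }
    user_indices = [i for i, m in enumerate(messages) if m["role"] == "user"]
    if not user_indices:
        # No user turn to anchor on; fall back to appending.
        return messages + [memory_message]
    last_user_idx = user_indices[-1]
    # Memory sits just before the final user message, so the user request
    # remains the last thing the model reads.
    return messages[:last_user_idx] + [memory_message] + messages[last_user_idx:]
```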
References:
- OpenAI prompt structuring best practice (instructions + context separation): https://help.openai.com/en/articles/6654000-best-practices-for-prompt-engineering-with-the-openai-api
- Anthropic context engineering (hybrid context + retrieval): https://www.anthropic.com/engineering/effective-context-engineering-for-ai-agents