Chat-twin
From Computer Laboratory Group Design Projects
Proposed to: Matthew Postgate <matthew.postgate@infometa.com>
LLMs struggle to remember their history of interaction with you, and they never update their training weights with real knowledge about your life. However, recent research shows that they can be made to act like an intelligent agent by maintaining your own description of a simple game world (including other human or LLM players): a highly compact version of this world state is fed back with the prompt for each new round of play. You will use the same strategy to turn an LLM into a digital twin that keeps the most important records of your life and helps you to prioritise and complete tasks through natural conversation.
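
A minimal sketch of the loop described above, assuming nothing beyond the proposal itself: the names (TwinState, chat_turn, call_llm) are illustrative, and the LLM call is stubbed out rather than wired to any particular API. The point is only that a compact, external record of the user's life is serialised and fed back with every prompt.

<pre>
import json
from dataclasses import dataclass, field


@dataclass
class TwinState:
    """Compact record of the user's life that survives between LLM calls."""
    facts: list[str] = field(default_factory=list)   # durable facts about the user
    tasks: list[str] = field(default_factory=list)   # open tasks, most important first

    def compact(self, max_items: int = 10) -> str:
        """Serialise only the most important items, keeping the prompt small."""
        return json.dumps({
            "facts": self.facts[:max_items],
            "tasks": self.tasks[:max_items],
        })


def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call (e.g. an HTTP request to a hosted model)."""
    return "Noted. Here is what I know right now: " + prompt.splitlines()[1]


def chat_turn(state: TwinState, user_message: str) -> str:
    """One round of conversation: feed the compact world state back with the prompt."""
    prompt = (
        "You are the user's digital twin. Current state:\n"
        + state.compact()
        + "\nUser says: " + user_message
        + "\nUpdate priorities and reply conversationally."
    )
    reply = call_llm(prompt)
    # A full system would also parse the reply to update `state` for the next round.
    return reply


if __name__ == "__main__":
    state = TwinState(facts=["Works at the Computer Laboratory"],
                      tasks=["Finish project proposal", "Book dentist"])
    print(chat_turn(state, "What should I do first this morning?"))
</pre>

In a real prototype the stubbed call would be replaced by a request to whichever model the group chooses, and the state-update step would decide which new facts and tasks are important enough to keep within the compact record.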