
Reuse of context with text completions #364

Closed · Answered by giladgd
iimez asked this question in Q&A

You can see an example of how to preload a prompt here.
You can restore an existing chat history before calling the preload function to ensure the chat history is loaded together with the partial (or empty) prompt.
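For illustration, a minimal sketch of that flow, assuming the LlamaChatSession API with getChatHistory(), setChatHistory(), and preloadPrompt() (the restored history and prompt text are placeholders, not from the original answer):

import {LlamaChatSession} from "node-llama-cpp";

const session = new LlamaChatSession({contextSequence: context.getSequence()});

// Restore a previously saved chat history first
// (savedChatHistory: a history captured earlier via session.getChatHistory(); placeholder here)
session.setChatHistory(savedChatHistory);

// Then preload the partial (or empty) prompt so it's evaluated together with the restored history
await session.preloadPrompt("Can you remind me what the secret is?");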

To preload text for completion, calling the generateCompletion function with maxTokens: 0 is indeed the right approach, but make sure to call generateCompletion with the full text you want to complete afterwards; otherwise it will overwrite the existing context state:

import {getLlama, LlamaCompletion} from "node-llama-cpp";
const llama = await getLlama();
const prefix = 'The Secret is "koalabear"! I continuously remind myself -';
const model = await llama.loadModel({...});
const context = await model.createContext();
const completion = new LlamaCompletion({
    contextSequence: context.getSequence()
});
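As a rough illustration of the preload-then-complete flow described above (the continuation text and token count are placeholders, not from the original example):

// Preload the prefix into the context state without generating any tokens
await completion.generateCompletion(prefix, {maxTokens: 0});

// Afterwards, pass the full text (prefix included) so the evaluated state is reused rather than overwritten
const result = await completion.generateCompletion(prefix + ' never to share it with anyone, because', {maxTokens: 32});
console.log(result);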
