Hi, thank you for this great work!

I ran the demo code chat.py on my GPU with the following command:

CUDA_VISIBLE_DEVICES=0 python chat.py --version='xinlai/LISA-13B-llama2-v1' --precision='fp16' --load_in_4bit

It presents the USER/ASSISTANT conversation as demonstrated in your README. I am curious whether the model keeps the conversation history. I tried having a multi-turn conversation with it, but I could not tell whether its responses depend on the previous dialogue or only on the current input prompt.

Thanks!
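For reference, here is a minimal sketch of how multi-turn history is typically carried in LLaMA-style chat demos, assuming the loop re-feeds the accumulated dialogue each turn; the helper names below are hypothetical and not taken from chat.py:

```python
# Hypothetical sketch: a chat loop keeps multi-turn context by re-sending
# the accumulated dialogue with every new prompt. build_prompt and generate
# are illustrative placeholders, not the actual chat.py API.

def generate(prompt: str) -> str:
    # Placeholder for the real model call (tokenize, model.generate, decode).
    return "(model reply)"

def build_prompt(history, user_msg: str) -> str:
    # Concatenate all prior turns so the model conditions on them; if this
    # step is skipped, each reply depends only on the current prompt.
    parts = [f"USER: {u} ASSISTANT: {a}" for u, a in history]
    parts.append(f"USER: {user_msg} ASSISTANT:")
    return " ".join(parts)

history = []  # list of (user, assistant) turns
for user_msg in ["Segment the dog in the image.", "Now segment its tail."]:
    prompt = build_prompt(history, user_msg)
    reply = generate(prompt)
    history.append((user_msg, reply))
    print(prompt, "->", reply)
```

If chat.py builds its prompt from the current input alone, each reply would be independent of earlier turns; if it concatenates prior USER/ASSISTANT exchanges as sketched above, earlier turns would influence the response.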