About the contextual dialogue ability (or the memory ability) #301

Open
mcddddz opened this issue Jul 31, 2024 · 0 comments
Comments


mcddddz commented Jul 31, 2024

Thank you for the llama3 model. I'm now running inference with Meta-Llama-3.1-8B, but it seems to lack contextual dialogue ability. To be more specific, when I talk to it a second time, it has already forgotten what I said the first time, even though the previous exchange was only a few seconds earlier. Am I doing something wrong, or does Meta-Llama-3.1-8B simply not have memory? Maybe adding the earlier context to the new conversation would improve it, but is that the correct way to use this language model?
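For reference, the model itself is stateless between calls, so any "memory" has to come from resending the earlier turns in the prompt. Below is a minimal sketch of that pattern, assuming the `Llama.build` / `chat_completion` interface used in the repo's example chat script; the checkpoint paths, generation parameters, and the `chat` helper are placeholders for illustration, not the only correct setup.

```python
# Minimal sketch: the model does not remember previous calls, so we keep the
# conversation in a Python list and resend all prior turns on every request.
# Paths and parameters below are assumptions; adjust them to your setup.
from llama import Llama

generator = Llama.build(
    ckpt_dir="Meta-Llama-3.1-8B-Instruct/",                     # hypothetical local path
    tokenizer_path="Meta-Llama-3.1-8B-Instruct/tokenizer.model",
    max_seq_len=2048,
    max_batch_size=1,
)

dialog = []  # accumulated turns; the model only "sees" what is in this list


def chat(user_message: str) -> str:
    # Append the new user turn, then send the whole dialog so far.
    dialog.append({"role": "user", "content": user_message})
    result = generator.chat_completion(
        [dialog],            # batch of one dialog, including all earlier turns
        max_gen_len=256,
        temperature=0.6,
        top_p=0.9,
    )[0]
    answer = result["generation"]["content"]
    # Store the assistant's reply so the next call has the full history.
    dialog.append({"role": "assistant", "content": answer})
    return answer


print(chat("My name is Alice."))
print(chat("What is my name?"))  # works only because the first turn is resent
```

Note that this pattern assumes an instruction/chat-tuned checkpoint; with the plain base model the same idea applies, but the conversation would need to be formatted into a single prompt string instead of role-tagged messages.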
