Without reporting any error, LongAlpaca-7B only responds to the first paragraph of the text #158

Open
waleyW opened this issue Dec 14, 2023 · 0 comments

Comments


waleyW commented Dec 14, 2023

Hello, the file I need to process is about 30k in size and contains multiple paragraphs. My instruction asks the model to polish this text, but judging from the returned result it only processed my first paragraph. What could be the reason for this?
My parameters are as follows:

python3 inference.py  \
        --base_model /data/models/LongAlpaca-13B \
        --question "Please polish the article below" \
        --context_size 32768 \
        --max_gen_len 32768 \
        --flash_attn True \
        --material "materials_path"
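As a side note (not part of the original report), a minimal sketch for checking whether the material actually fits in the context window, assuming the LongAlpaca model directory ships a standard Hugging Face tokenizer and that "materials_path" is the placeholder path from the command above:

# Count how many tokens the material occupies, to check whether the prompt,
# the material, and the generated tokens together fit in --context_size.
from transformers import AutoTokenizer

MODEL_DIR = "/data/models/LongAlpaca-13B"   # model path from the command above
MATERIAL = "materials_path"                 # placeholder material path from the command above

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)

with open(MATERIAL, "r", encoding="utf-8") as f:
    text = f.read()

n_tokens = len(tokenizer.encode(text))
print(f"material tokens: {n_tokens}")
# All tokens share the same window, so material tokens + prompt tokens +
# generated tokens should stay below context_size (32768 here).

This only measures the input side; it does not change how inference.py truncates or splits the material.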