feat: Add support for 01 model platform #1093
Conversation
Force-pushed from e5d8e88 to eebbc57
Thanks for @MuggleJinx's contribution! Left some comments below.
camel/configs/yi_config.py (Outdated)
stop (Union[str, Sequence[str], NotGiven], optional): Up to `4`
    sequences where the API will stop generating further tokens.
    (default: :obj:`NOT_GIVEN`)
presence_penalty (float, optional): Number between :obj:`-2.0` and
    :obj:`2.0`. Positive values penalize new tokens based on whether
    they appear in the text so far, increasing the model's likelihood
    to talk about new topics. (default: :obj:`0.0`)
frequency_penalty (float, optional): Number between :obj:`-2.0` and
    :obj:`2.0`. Positive values penalize new tokens based on their
    existing frequency in the text so far, decreasing the model's
    likelihood to repeat the same line verbatim. (default: :obj:`0.0`)
logit_bias (dict, optional): Modify the likelihood of specified tokens
    appearing in the completion. Accepts a json object that maps tokens
    (specified by their token ID in the tokenizer) to an associated
    bias value from :obj:`-100` to :obj:`100`. (default: :obj:`{}`)
user (str, optional): A unique identifier representing your end-user,
    which can help monitor and detect abuse. (default: :obj:`""`)
From Yi's docs, I didn't see these arguments mentioned, so I think they are not supported yet.
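A config trimmed along the lines the reviewer suggests could look like the sketch below, which keeps only a few commonly supported sampling parameters and drops unset values before they reach the API. The class name, fields, and defaults here are illustrative stand-ins, not the actual `YiConfig`:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical trimmed config sketch: only parameters the Yi API is
# known to accept. Field names/defaults are illustrative, not YiConfig's.
@dataclass
class YiConfigSketch:
    temperature: float = 0.3
    max_tokens: Optional[int] = None
    top_p: float = 0.9
    stream: bool = False

    def as_dict(self) -> dict:
        # Drop None values so unset keys are never sent to the API.
        return {k: v for k, v in self.__dict__.items() if v is not None}

cfg = YiConfigSketch(max_tokens=512)
print(cfg.as_dict())
# → {'temperature': 0.3, 'max_tokens': 512, 'top_p': 0.9, 'stream': False}
```

Unsupported OpenAI-style knobs (`presence_penalty`, `logit_bias`, etc.) simply never appear in the payload this way, rather than being silently ignored server-side.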
camel/utils/token_counting.py (Outdated)
model_type (UnifiedModelType): Model type for which tokens will be
    counted.
"""
self._internal_tokenizer = OpenAITokenCounter(model_type)
How could Yi's `model_type` be used by `OpenAITokenCounter`?
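The question above points at a real mismatch: Yi model names are not known to OpenAI's tokenizer, so passing them straight through would fail. One common workaround is to delegate to a counter built for a comparable OpenAI model as an approximation. The sketch below uses a stub class in place of the real `OpenAITokenCounter` (the stand-in tokenization and the fallback model name are assumptions, not camel's implementation):

```python
# Stub standing in for the real OpenAITokenCounter; a real version would
# resolve the model name to a tiktoken encoding.
class OpenAITokenCounterStub:
    def __init__(self, model_type: str):
        self.model_type = model_type

    def count_tokens_from_messages(self, messages) -> int:
        # Crude stand-in: whitespace tokens plus a per-message overhead.
        return sum(len(m["content"].split()) + 3 for m in messages)

class YiTokenCounter:
    def __init__(self, model_type: str):
        # Yi models have no tiktoken encoding, so approximate with a
        # counter for a comparable OpenAI model (name is illustrative).
        self._internal_tokenizer = OpenAITokenCounterStub("gpt-4o-mini")

    def count_tokens_from_messages(self, messages) -> int:
        return self._internal_tokenizer.count_tokens_from_messages(messages)

counter = YiTokenCounter("yi-lightning")
print(counter.count_tokens_from_messages(
    [{"role": "user", "content": "hello world"}]
))  # → 5
```

The counts are only estimates for Yi, which is worth noting in the docstring if this delegation is kept.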
Force-pushed from eebbc57 to cc067bc
Co-authored-by: Wendong-Fan <[email protected]>
elif model_platform.is_yi and model_type.is_yi:
    model_class = YiModel
Would be better to move this above the `ModelType.STUB` branch.
Thanks @MuggleJinx!! That's awesome!
Description
This PR adds support for the Yi-series LLM models, including yi-lightning, yi-large, yi-medium, and yi-large-turbo. The addition of Yi-series models enhances the platform’s capabilities, providing support for a broader range of language models and enabling users to leverage different performance tiers.
Motivation and Context
Closes issue #1066.
Types of changes
What types of changes does your code introduce? Put an `x` in all the boxes that apply:

Implemented Tasks
Checklist
Go over all the following points, and put an `x` in all the boxes that apply. If you are unsure about any of these, don't hesitate to ask. We are here to help!