feat: Dynamic Service creation #4498
base: main
Conversation
* fix: current directory
* fix
* fix
* fix
* ci: auto fixes from pre-commit.ci (see https://pre-commit.ci), co-authored by pre-commit-ci[bot]
```python
from typing import Any

import bentoml
from bentoml import Runner
```
Hi holzweber. We are deprecating the Runner API from 1.2, but I see the value of the dynamic service example. Would you like to update this once we finish the 1.2 docs?
In that case, may I mark this as merge-hold?
@bojiang I just checked the new documentation, but I don't know if my idea will still work, as I cannot access the service object with the new service-annotation approach. I'm afraid we may really need to generate a Python file for dynamic services, which is kind of an ugly solution...
Any other idea how to get this working with 1.2?
Dynamic service in 1.2 will still work, as OpenLLM has to do something similar.
You can probably hook directly into the service object, since `bentoml.Service` will return the new service object, which treats every runnable as a normal Python class.
One big difference is that the lifecycle is now just a class, so it should be a lot simpler.
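Since the point above is that the 1.2 service object treats runnables as normal Python classes, the general pattern can be sketched without any BentoML specifics: build each endpoint function in a factory (so every closure captures its own values) and attach it to the class before the service decorator runs. All names below are illustrative, not the actual BentoML API.

```python
# Minimal sketch of attaching endpoints to a class dynamically.
# `make_endpoint` and `Base` are illustrative names, not BentoML APIs.
def make_endpoint(name: str):
    def endpoint(self, input_doc: str) -> str:
        # Each closure captures its own `name`.
        return f"{name}: {input_doc}"

    endpoint.__name__ = name
    return endpoint


class Base:
    """Placeholder for a service class before decoration."""


# Attach one method per route before any decorator sees the class.
for route in ("predict_a", "predict_b"):
    setattr(Base, route, make_endpoint(route))
```

The factory function is the important part: creating the closure inside `make_endpoint` avoids the classic late-binding bug where every endpoint would see only the last loop value.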
Will send more examples once I finish the OpenLLM revamp 😄
@aarnphm ... Any more updates on this?
I was trying to do something like this, but it does not work: only the last service's endpoints end up in the service list afterwards (based on what I see in the OpenAPI spec)...
```python
def wrap_service_methods(
    model: bentoml.Model,
    targets: Any,
    predict_route: str,
    predict_name: str,
    predict_proba_route: str,
    predict_proba_name: str,
):
    """Wrap models in service methods and annotate as api."""

    @bentoml.api(route=predict_route, name=predict_name)
    async def predict(input_doc: str):
        predictions = await model.predict.async_run([input_doc])
        return {"result": targets[predictions[0]]}

    @bentoml.api(route=predict_proba_route, name=predict_proba_name)
    async def predict_proba(input_doc: str):
        predictions = await model.predict_proba.async_run([input_doc])
        return predictions[0]

    return predict, predict_proba


@bentoml.service(workers=1, resources={"cpu": "1"})
class DynamicService:
    def __init__(self):
        """Nothing to do here."""


for idx, available_model in enumerate(bentoml.models.list()):
    if "twenty_news_group" in available_model.tag.name:
        print(f"Creating Endpoint {idx}")
        bento_model = bentoml.sklearn.get(f"{available_model.tag.name}:latest")
        print(bento_model)
        target_names = bento_model.custom_objects["target_names"]
        path_predict = f"predict_model_{idx}"
        path_predict_proba = f"predict_proba_model_{idx}"
        predict, predict_proba = wrap_service_methods(
            bento_model,
            target_names,
            predict_route=path_predict,
            predict_name=path_predict,
            predict_proba_route=path_predict_proba,
            predict_proba_name=path_predict_proba,
        )
```
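A plausible reason only the last endpoints show up: each loop iteration rebinds the same two names (`predict`, `predict_proba`), and nothing attaches the earlier function objects to the class, so only the final pair is still referenced when the service is assembled. A minimal illustration of the rebinding, in plain Python with no BentoML involved:

```python
def make_handler(i: int):
    def handler() -> int:
        return i

    return handler


handlers_kept = []
for i in range(3):
    handler = make_handler(i)      # rebinds the same name every iteration
    handlers_kept.append(handler)  # only an explicit reference survives

# The bare name `handler` now points at the last function only,
# while the list still holds all three.
```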
Edit: I added the methods via the `locals()` function. Can you check and resolve the conversation if this is okay?
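For reference, the `locals()` trick works because, in CPython, a class body's `locals()` is the actual namespace dict later handed to `type`, so writes to it become class attributes. This is an implementation detail rather than a documented guarantee. A minimal sketch with illustrative names:

```python
def make_method(i: int):
    def method(self) -> int:
        return i

    return method


class Dynamic:
    # Inside a class body, locals() is the namespace dict used to
    # build the class (a CPython detail), so these become methods.
    for _i in range(3):
        locals()[f"predict_{_i}"] = make_method(_i)
    del _i  # drop the loop variable from the class namespace
```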
You can probably use `types.new_class` here, or even `type()`, to construct the subclass.
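`types.new_class` populates the namespace through an `exec_body` callback, which keeps the dynamic-endpoint loop out of the class body entirely. A generic sketch with illustrative names, no BentoML involved:

```python
import types


def make_method(i: int):
    def method(self) -> int:
        return i

    return method


def fill_namespace(ns: dict) -> None:
    # Called with the namespace dict while the class is being built.
    for i in range(3):
        ns[f"predict_{i}"] = make_method(i)


Dynamic = types.new_class("Dynamic", (), exec_body=fill_namespace)
```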
True, I tried it with `type()` and it seems to work. Updated my latest push, so it is visible now :)
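For completeness, the `type()` variant could look roughly like this: build the full namespace dict up front, then create the class in one call. Names are illustrative; the commented-out decorator line marks where a real service decorator would be applied.

```python
def make_method(i: int):
    def method(self) -> int:
        return i * 10

    return method


# Build the complete namespace first, then create the class in one call.
namespace = {f"predict_{i}": make_method(i) for i in range(2)}
DynamicService = type("DynamicService", (object,), namespace)
# A real service decorator would be applied afterwards, e.g.:
# DynamicService = some_service_decorator(DynamicService)
```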
Commits merged into this branch:
* Add the lcm lora use case doc; fix space (Signed-off-by: Sherlock113)
* fix(sdk): clean bentoml version; ci: auto fixes from pre-commit.ci
* …l#4491) Fixes bentoml#4489 (Signed-off-by: Frost Ming)
* Update the get started docs
* Add client code examples without context manager
* Update docs
* Add authorization docs
* Change sample input to oneline
* Update ControlNext use case docs
* Update the distributed services and get started docs
* feat(cli): make CLI commands available as modules; fix: handle bentoml group; fix: simplify aliases; fix: rename bento_command
* Refactor BentoCloud docs
* …er/BentoML into dynamic-service-creation
* ci: auto fixes from pre-commit.ci (see https://pre-commit.ci)
What does this PR address?
Fixes #4460
Before submitting:
* Did you follow the guide on how to create a pull request?
* Has the `pre-commit run -a` script passed (instructions)?
* Did you update the docs accordingly? Here are documentation guidelines and tips on writing docs.