v0.9.11
🚀 Added
YOLO World in inference
Have you heard about the YOLO World model? 🤔 If not, you will probably be interested in learning more about it! Our blog post 📰 may be a good starting point❗
The great news is that YOLO World is already integrated with inference. The model is capable of performing zero-shot detection of the classes specified as an inference parameter. Thanks to that, you can start making videos like the one below right away 🚀
*(demo video: yellow-filling-output-1280x720.mp4)*
Simply install the dependencies:

```bash
pip install inference-sdk inference-cli
```
Start the server:

```bash
inference server start
```
And run inference against our HTTP server:

```python
from inference_sdk import InferenceHTTPClient

client = InferenceHTTPClient(api_url="http://127.0.0.1:9001")

# YOUR_IMAGE is a placeholder for your input image
result = client.infer_from_yolo_world(
    inference_input=YOUR_IMAGE,
    class_names=["dog", "cat"],
)
```
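To inspect the detections, you can iterate over the returned predictions. This is a minimal sketch that assumes the response uses the standard object-detection format with a top-level predictions list - check the returned dict for the exact keys:

```python
# Illustrative only: assumes the standard object-detection response layout.
for prediction in result.get("predictions", []):
    print(
        prediction.get("class"),
        prediction.get("confidence"),
        prediction.get("x"),
        prediction.get("y"),
        prediction.get("width"),
        prediction.get("height"),
    )
```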
Active Learning 🤝 workflows
Active Learning data collection made simple with workflows
🔥 Now, with just a little bit of configuration, you can start collecting data to improve your model over time. Just take a look at how easy it is:
*(demo video: active_learning_in_workflows.mp4)*
Key features:
- works with all models supported on the Roboflow platform, including the ones from Roboflow Universe - making it trivial to use an off-the-shelf model during the project kick-off stage to collect a dataset while serving meaningful predictions
- combines well with multiple workflows blocks - including DetectionsConsensus - making it possible to sample based on the predictions of a model ensemble 💥
- the Active Learning block may use the project-level Active Learning config or define the Active Learning strategy directly in the block definition (refer to the Active Learning documentation 📖 for details on how to configure data collection)
See the documentation 📖 of the new ActiveLearningDataCollector block for detailed info.
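For illustration, a data-collection step inside a workflow definition might look roughly like the sketch below. The field names used here (image, predictions, target_dataset) are assumptions made for this example - consult the Active Learning and workflows documentation for the exact schema.

```python
# A rough, illustrative fragment of a workflow specification.
# Field names below are assumptions for this sketch, not the authoritative schema.
active_learning_step = {
    "type": "ActiveLearningDataCollector",           # the new data-collection block
    "name": "data_collection",
    "image": "$inputs.image",                        # image flowing into the workflow
    "predictions": "$steps.detection.predictions",   # output of an upstream model block
    "target_dataset": "my_project",                  # Roboflow dataset to collect images into
}
```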
🌱 Changed
InferencePipeline now works with all models supported on the Roboflow platform 🎆

For a long time, InferencePipeline worked only with object-detection models. This is no longer the case - from now on, the other types of models supported on the Roboflow platform (including stubs, like my-project/0) work under InferencePipeline. No changes are required in existing code. Just pass the model_id of your model and the pipeline should work. Sinks suited for detection-only models were adjusted to ignore non-compliant prediction formats and produce warnings about the incompatibility.
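As a quick sketch of what that looks like in code - the model_id and video source below are placeholders, and the sink simply prints whatever predictions arrive, so it works for any model type:

```python
from inference import InferencePipeline

# Prints raw predictions regardless of model type (detection, classification, stub, ...).
def print_predictions(predictions, video_frame):
    print(predictions)

pipeline = InferencePipeline.init(
    model_id="my-project/1",              # placeholder - use your own model_id
    video_reference="path/to/video.mp4",  # placeholder - file path, stream URL, or camera id
    on_prediction=print_predictions,
    # an api_key may be needed for models hosted on the Roboflow platform
)
pipeline.start()
pipeline.join()
```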
🔨 Fixed
- Bug in yolact model in #266
🏆 Contributors
@paulguerrie (Paul Guerrie), @probicheaux (Peter Robicheaux), @PawelPeczek-Roboflow (Paweł Pęczek)
Full Changelog: v0.9.10...v0.9.11