Predict on a model from huggingface #3609
Unanswered
RaulPPelaez asked this question in Q&A
Replies: 2 comments
-
I got some help on Slack and was able to get it working by setting the trainer type to `none` and calling `model.train` with a dummy dataset:

```python
import logging

import pandas as pd
import yaml

from ludwig.api import LudwigModel

config = yaml.safe_load(
    """
model_type: llm
base_model: meta-llama/Llama-2-7b-hf
input_features:
  - name: question
    type: text
output_features:
  - name: answer
    type: text
trainer:
  type: none
"""
)
model = LudwigModel(config=config, logging_level=logging.INFO)

# Dummy dataset: feature values should be plain strings, not lists.
qa_entry = pd.DataFrame([{"question": "a", "answer": "a"}] * 3)
model.train(qa_entry)

# predict expects a dataset with the input feature column, not a bare string.
model.predict(pd.DataFrame([{"question": "The capital of France is "}]))
```
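To predict on several prompts at once, they go in as rows of the same input column; a minimal pandas sketch (the column name `question` comes from the config above, and the `model.predict` call is commented out because it needs the `LudwigModel` built earlier):

```python
import pandas as pd

# One row per prompt, in the column named after the input feature.
prompts = pd.DataFrame(
    {"question": ["The capital of France is ", "The capital of Spain is "]}
)

# model.predict(prompts) would return one prediction per row;
# it is omitted here because it requires the loaded LudwigModel from above.
print(len(prompts))  # 2
```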
-
Glad you found a workaround @RaulPPelaez. We should definitely clean this up to make it more intuitive. Added #3614 to track the issue.
-
How can I call prediction on a model from HuggingFace?
I am trying to run this:
However, I get an error: ValueError: Model has not been trained or loaded
If I call train first on some dataset, it works: Ludwig automatically downloads the model and loads it before training, and after training I can call predict.
How can I just load a model and predict without training first?