worker_process_entrypoint FAILED #15
Hi @prashanthcheemala, could you provide reproducible steps for this? There have been major updates to llama-stack since then, so this issue may already have been resolved.
llama inference start
/opt/LLama_Agentic_System/llama3_1venv/lib/python3.11/site-packages/llama_toolchain/utils.py:43: UserWarning:
The version_base parameter is not specified.
Please specify a compatability version level, or None.
Will assume defaults for version 1.1
initialize(config_path=relative_path)
Loading config from : /root/.llama/configs/inference.yaml
Yaml config:
inference_config:
  impl_config:
    impl_type: inline
    checkpoint_config:
      checkpoint:
        checkpoint_type: pytorch
        checkpoint_dir: /root/.llama/checkpoints/Meta-Llama-3.1-8B-Instruct/original/
        tokenizer_path: /root/.llama/checkpoints/Meta-Llama-3.1-8B-Instruct/original/tokenizer.model
        model_parallel_size: 1
        quantization_format: bf16
    quantization: null
    torch_seed: null
    max_seq_len: 16384
    max_batch_size: 1
Listening on :::5000
INFO: Started server process [6765]
INFO: Waiting for application startup.
Failures:
<NO_OTHER_FAILURES>
Root Cause (first observed failure):
[0]:
time : 2024-07-25_12:48:53
host : ip-119-181-1-31.ec2.internal
rank : 0 (local_rank: 0)
exitcode : -9 (pid: 6774)
error_file: <N/A>
traceback : Signal 9 (SIGKILL) received by PID 6774
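The root cause block shows exitcode -9, i.e. the worker was killed with SIGKILL before the server finished startup. On an EC2 host this is most often the kernel OOM killer firing while the bf16 checkpoint for Meta-Llama-3.1-8B-Instruct (roughly 16 GB of weights) is loaded into RAM. Below is a minimal diagnostic sketch, not part of llama-stack, assuming psutil is installed and using the checkpoint_dir from the config above:

```python
import subprocess
from pathlib import Path

import psutil  # third-party: pip install psutil

# checkpoint_dir from inference.yaml above
CKPT_DIR = Path("/root/.llama/checkpoints/Meta-Llama-3.1-8B-Instruct/original/")

# The consolidated *.pth shards must fit in RAM while they are loaded.
ckpt_bytes = sum(f.stat().st_size for f in CKPT_DIR.glob("*.pth"))
avail_bytes = psutil.virtual_memory().available
print(f"checkpoint size : {ckpt_bytes / 2**30:.1f} GiB")
print(f"available RAM   : {avail_bytes / 2**30:.1f} GiB")

# If the worker was OOM-killed, the kernel log usually records it
# (reading dmesg may require root).
dmesg = subprocess.run(["dmesg", "-T"], capture_output=True, text=True).stdout
oom = [line for line in dmesg.splitlines()
       if "Out of memory" in line or "oom-killer" in line]
print("\n".join(oom) or "no OOM-killer entries found in dmesg")
```

If available RAM is not comfortably above the checkpoint size, moving to an instance with more memory (or adding swap) is the usual fix; the large max_seq_len: 16384 also adds to the server's memory footprint once requests arrive.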
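Separately, the UserWarning near the top of the log comes from Hydra, triggered by the initialize() call in llama_toolchain/utils.py: no version_base is passed, so Hydra falls back to its 1.1-compatibility defaults. It is harmless and unrelated to the crash, but as a minimal sketch (the config path and config name below are illustrative, not the toolchain's actual values), passing version_base explicitly silences it:

```python
from hydra import compose, initialize

# Passing version_base explicitly (None = current Hydra defaults)
# avoids the "version_base parameter is not specified" UserWarning.
with initialize(version_base=None, config_path="configs"):
    cfg = compose(config_name="inference")
    print(cfg)
```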