
bug: [0.5.11] Model failed to start with error Could not load engine llama-cpp: Default engine variant for cortex.llamacpp is not set yet! #4331

Open
idealsceneprod opened this issue Dec 24, 2024 · 5 comments
Labels: category: cortex.cpp (Related to cortex.cpp) · type: bug (Something isn't working)

Comments

@idealsceneprod

Jan version

0.5.11

Describe the Bug

This is my first time running Jan. I downloaded the AppImage, ran it, downloaded the TinyLlama model, and typed "hello" in the chat, which produced the following error message:

Model tinyllama-1.1b failed to start. Model failed to start: Could not load engine llama-cpp: Default engine variant for cortex.llamacpp is not set yet!

All settings are as default. No changes have been made.

I tried importing a local LM Studio model and got the same error. I can't find a setting anywhere to set the default engine variant. I'm new to LLMs, so maybe I'm just doing something wrong. I followed the YouTube videos and searched the online docs, and a Startpage search for the error message came up empty.

I'm on Ubuntu 22.04 LTS. I have 64 GB RAM and an RTX 2060 (6 GB).

Steps to Reproduce

  1. Download a model, e.g. TinyLlama
  2. Go into chat and type "Hello" and hit enter

Screenshots / Logs

2024-12-24T14:24:29.644Z [SPECS]::0, 6144, NVIDIA GeForce RTX 2060
2024-12-24T14:24:29.646Z [APP]::{"notify":true,"run_mode":"gpu","nvidia_driver":{"exist":true,"version":"550.120"},"cuda":{"exist":true,"version":"12"},"gpus":[{"id":"0","vram":"6144","name":"NVIDIA GeForce RTX 2060","arch":"unknown"}],"gpu_highest_vram":"0","gpus_in_use":["0"],"is_initial":false,"vulkan":false}
2024-12-24T14:24:29.668Z [CORTEX]:: Spawning cortex subprocess...
2024-12-24T14:24:29.669Z [CORTEX]:: Cortex engine path: /tmp/.mount_jan-liGvWI8R/resources/app.asar.unpacked/shared
2024-12-24T14:24:29.669Z [CORTEX]:: Spawn cortex at path: jan-data-folder/extensions/@janhq/inference-cortex-extension/dist/bin/cortex-server
2024-12-24T14:24:29.670Z [APP]::Terminating watched process...
2024-12-24T14:24:29.670Z [APP]::Starting process: jan-data-folder/extensions/@janhq/inference-cortex-extension/dist/bin/cortex-server --start-server --port 39291 --config_file_path jan-data-folder/.janrc --data_folder_path /home/val/.config/Jan/data
2024-12-24T14:24:29.912Z [APP]::Process error: terminate called after throwing an instance of 'std::filesystem::__cxx11::filesystem_error'
what(): filesystem error: cannot create directories: Read-only file system [/tmp/.mount_jan-liGvWI8R/resources/app.asar.unpacked/shared/engines/cortex.tensorrt-llm/deps]

2024-12-24T14:24:29.913Z [APP]::Error: Error [ERR_UNHANDLED_ERROR]: Unhandled error. ("terminate called after throwing an instance of 'std::filesystem::__cxx11::filesystem_error'\n" +
' what(): filesystem error: cannot create directories: Read-only file system [/tmp/.mount_jan-liGvWI8R/resources/app.asar.unpacked/shared/engines/cortex.tensorrt-llm/deps]\n')
2024-12-24T14:24:34.914Z [APP]::Process output: 20241224 14:24:29.697219 UTC 2418171 INFO Config file not found. Creating one at jan-data-folder/.janrc - file_manager_utils.h:215
20241224 14:24:29.697291 UTC 2418171 INFO Default data folder path: /home/val/.config/Jan/data - file_manager_utils.h:217
Host: 127.0.0.1 Port: 39291

2024-12-24T14:24:34.914Z [APP]::Process error: Could not start server

2024-12-24T14:24:34.915Z [APP]::Error: Error [ERR_UNHANDLED_ERROR]: Unhandled error. ('Could not start server\n')
2024-12-24T14:24:34.916Z [APP]::Process exited with code 0
2024-12-24T14:24:34.916Z [APP]::Restarting process in 5000ms (Attempt 1/5)
2024-12-24T14:24:39.841Z [CORTEX]: CPU instruction: avx2
2024-12-24T14:24:39.842Z [CORTEX]: Engine variant: linux-amd64-avx2-cuda-12-0
2024-12-24T14:24:39.917Z [APP]::Starting process: jan-data-folder/extensions/@janhq/inference-cortex-extension/dist/bin/cortex-server --start-server --port 39291 --config_file_path jan-data-folder/.janrc --data_folder_path /home/val/.config/Jan/data
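
A note on the log above: before the engine-variant error ever appears, cortex-server crashes on an uncaught std::filesystem error, because it tries to create engines/cortex.tensorrt-llm/deps inside the AppImage mount (/tmp/.mount_*), which is a read-only filesystem. Below is a minimal C++ sketch of that failure mode, using a hypothetical path and the non-throwing create_directories overload as a guard; this is illustrative only, not cortex's actual code:

```cpp
#include <filesystem>
#include <iostream>

namespace fs = std::filesystem;

int main() {
    // Hypothetical path for illustration; in the log it resolves to a
    // directory inside the read-only AppImage mount under /tmp/.mount_*.
    const fs::path deps_dir =
        "/tmp/.mount_example/resources/app.asar.unpacked/shared/"
        "engines/cortex.tensorrt-llm/deps";

    // The throwing overload of create_directories() is what would raise the
    // uncaught filesystem_error ("Read-only file system") seen in the log.
    // The non-throwing overload reports the failure via an error_code instead.
    std::error_code ec;
    fs::create_directories(deps_dir, ec);
    if (ec) {
        std::cerr << "cannot create " << deps_dir << ": " << ec.message() << '\n';
        // A fix could fall back to a writable location here, e.g. the
        // Jan data folder (~/.config/Jan/data) rather than the mount.
    }
    return 0;
}
```

Run on a system where that path is read-only, this prints the error message instead of terminating the process, which appears to be what the unguarded call in the log is missing.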

What is your OS?

  - [ ] MacOS
  - [ ] Windows
  - [x] Linux
@idealsceneprod added the type: bug (Something isn't working) label Dec 24, 2024
@github-project-automation bot moved this to Investigating in Jan & Cortex Dec 24, 2024
@SmokeShine

I'm facing a similar issue. Mine is an update, not a fresh install.

@imtuyethan assigned namchuai and louis-jan and unassigned namchuai Dec 26, 2024
@imtuyethan changed the title bug: Model Failed to start → bug: Model failed to start with error Could not load engine llama-cpp: Default engine variant for cortex.llamacpp is not set yet! Dec 26, 2024
@imtuyethan added the category: cortex.cpp (Related to cortex.cpp) label Dec 26, 2024
@imtuyethan changed the title bug: Model failed to start with error Could not load engine llama-cpp: Default engine variant for cortex.llamacpp is not set yet! → bug: [0.5.11] Model failed to start with error Could not load engine llama-cpp: Default engine variant for cortex.llamacpp is not set yet! Dec 26, 2024
@imtuyethan (Contributor)

Thanks for reporting, @SmokeShine and @idealsceneprod. We'll take a look!

@idealsceneprod (Author) commented Dec 26, 2024

I added instructions in the assistant tab, and it was then able to load the model. However, the assistant responded to "hello" with a blank message. I then tried loading another model, dolphin-2.9.3-mistral-7B-32k.Q5_K_S, and it gave me the same error, with the instructions in place.

I was able to go back to TinyLlama and ask it a question, which it did answer. When I typed "Hello" after that, it expounded on the initial answer.

The dolphin model runs fine and pretty fast in LM Studio, so I'm not sure why it fails in Jan.

@louis-jan (Contributor)

Hi @idealsceneprod, could you kindly share the cortex.log file located in the app data folder? You can access the app logs by clicking System Monitoring in the app footer bar and then selecting App Logs.

@idealsceneprod (Author)

Here you go:

cortex.log
