search_llm_actions provides a simple template for collecting action trajectories by search from local LLMs (vLLM servers on multiple GPUs are supported) or remote LLMs (the Together AI API is supported).🫠
```shell
git clone [email protected]:bebetterest/search_llm_actions.git
cd search_llm_actions
pip install -r requirements.txt
pip install .
# # to edit the code, install in editable mode instead:
# pip install -e .
# pip install search_llm_actions
```
- You can find a minimal example in `example.py`.
- You need to override the `init_root_node`, `take_parallel_actions`, `simulate` & `detect_end` functions in `search_llm_actions/search.py` to adapt to your own task.
- You need to override `deploy_vllm_multi.sh` & `test_vllm_multi.sh` in `search_llm_actions/scripts` to adapt to your own llm server.
- You need to implement a new subclass of `Caller` in `search_llm_actions/llm_caller.py` to adapt to your own llm server.
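The exact interface of `Caller` is defined in `search_llm_actions/llm_caller.py` and is not documented here, so the sketch below uses a minimal stand-in base class with an assumed `generate(prompt, n)` method; the subclass shape (override the generation method, point it at your own server) is the idea, not the real API.

```python
# Hedged sketch: `Caller` here is a stand-in for the real base class in
# search_llm_actions/llm_caller.py; its method name and signature are
# assumptions for illustration only.

class Caller:
    """Stand-in base class (assumed interface)."""

    def generate(self, prompt: str, n: int = 1) -> list[str]:
        raise NotImplementedError


class EchoCaller(Caller):
    """Toy subclass: 'calls' an llm server by echoing the prompt.

    A real subclass would send `prompt` to your own llm server
    (e.g. an HTTP endpoint) and return its n completions.
    """

    def generate(self, prompt: str, n: int = 1) -> list[str]:
        return [f"echo: {prompt}"] * n


caller = EchoCaller()
print(caller.generate("hello", n=2))
```

A real implementation would replace the echo with a request to your server and map its response back to a list of strings, keeping the rest of the search code unchanged.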
enjoy:)
🤯betterest🤯