LlamaIndex is a data framework for your LLM applications
Updated Dec 21, 2024 · Python
Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
Low-code framework for building custom LLMs, neural networks, and other AI models
Run any open-source LLM, such as Llama or Mistral, as an OpenAI-compatible API endpoint in the cloud.
Multilingual large voice generation model, providing full-stack inference, training, and deployment capabilities.
Use low-rank adaptation (LoRA) to quickly fine-tune diffusion models.
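The core idea behind low-rank adaptation is small enough to sketch directly: the pretrained weight matrix W stays frozen, and training updates only two small matrices A and B whose product forms a low-rank correction, W_eff = W + (alpha / r) * B @ A. The sketch below is an illustrative pure-Python rendering of that formula, not the API of any listed library; the toy matrices and the `lora_effective_weight` helper are assumptions for demonstration.

```python
# Minimal sketch of Low-Rank Adaptation (LoRA), assuming the standard
# formulation: frozen weight W plus a trainable low-rank update,
# W_eff = W + (alpha / r) * B @ A, with A of shape (r x in) and
# B of shape (out x r). Plain lists of lists keep it dependency-free.

def matmul(X, Y):
    """Multiply two matrices given as lists of lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def lora_effective_weight(W, A, B, alpha, r):
    """Return the merged weight W + (alpha / r) * (B @ A)."""
    BA = matmul(B, A)
    scale = alpha / r
    return [[W[i][j] + scale * BA[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

# Toy example (hypothetical values): out = 2, in = 2, rank r = 1,
# so only 4 adapter parameters are trained instead of all 4 of W --
# the savings grow dramatically at real model sizes.
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen base weight
A = [[1.0, 2.0]]               # r x in, trainable
B = [[0.5], [1.0]]             # out x r, trainable
W_eff = lora_effective_weight(W, A, B, alpha=1.0, r=1)
# B @ A = [[0.5, 1.0], [1.0, 2.0]], so W_eff = [[1.5, 1.0], [1.0, 3.0]]
```

Because the update can be merged into W after training, inference adds no extra latency, which is why so many of the fine-tuning tools in this list build on it.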
Scalable and flexible workflow orchestration platform that seamlessly unifies data, ML and analytics stacks.
H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://docs.h2o.ai/h2o-llmstudio/
Fine-tuning ChatGLM-6B with PEFT | Efficient PEFT-based fine-tuning of ChatGLM
Open source data anonymization and synthetic data orchestration for developers. Create high fidelity synthetic data and sync it across your environments.
🔥🔥High-Performance Face Recognition Library on PaddlePaddle & PyTorch🔥🔥
RAG (Retrieval Augmented Generation) Framework for building modular, open source applications for production by TrueFoundry
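The RAG pattern these frameworks implement has a simple core: embed the documents and the query, retrieve the top-k most similar documents, and prepend them to the prompt so the LLM answers from retrieved context. The sketch below illustrates only that retrieval-and-prompt step; the bag-of-words `embed` function and the sample documents are stand-in assumptions for a real embedding model and corpus, not part of any listed framework's API.

```python
# Minimal sketch of the retrieval step in Retrieval Augmented Generation:
# rank documents by cosine similarity to the query, then build a
# context-grounded prompt. A production system would swap the toy
# bag-of-words embedding for a learned embedding model and a vector DB.
import math
from collections import Counter

def embed(text):
    """Toy embedding: bag-of-words counts (stand-in for a real model)."""
    return Counter(text.lower().split())

def cosine(u, v):
    dot = sum(u[t] * v[t] for t in u if t in v)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

# Hypothetical corpus and query for illustration.
docs = [
    "LoRA fine-tunes large models with low-rank adapters",
    "Tokyo is the capital of Japan",
]
context = retrieve("how does LoRA fine-tuning work", docs, k=1)
prompt = f"Context: {context[0]}\n\nQuestion: how does LoRA fine-tuning work"
```

The retrieved passage is injected as context, which is what lets the generator cite fresh or private data the base model was never trained on.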
🦖 Learn about LLMs, LLMOps, and vector DBs for free by designing, training, and deploying a real-time financial advisor LLM system ~ source code + video & reading materials
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Your Automatic Prompt Engineering Assistant for GenAI Applications
Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
"大语言模型" (Large Language Models), a book by 赵鑫, 李军毅, 周昆, 唐天一, and 文继荣
Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs
LLM Finetuning with peft