A single runnable file for LLM training. Everything you need can be executed from the one file. Pick your training method from the file name, fill in every marked place in the file with what you need to train, and run `python File_Name.py` from a command prompt opened in the same directory as the file.

As long as you have some sort of graphics card and train a model that fits in your VRAM, the training should work well.
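If you want to confirm that a usable GPU is visible and see how much VRAM it has before you start, a minimal check with PyTorch (assuming PyTorch with CUDA support is already installed for training) looks like this:

```python
# Minimal GPU/VRAM check (assumes PyTorch with CUDA support is installed).
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, VRAM: {props.total_memory / 1024**3:.1f} GiB")
else:
    print("No CUDA-capable GPU detected; training will not run on this machine.")
```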

It's as simple as that.

For better LoRA training, use my method below:

Continuous Fine-tuning Without Loss Using Lora and Mergekit

https://docs.google.com/document/d/1OjbjU5AOz4Ftn9xHQrX3oFQGhQ6RDUuXQipnQ9gn6tU/edit?usp=sharing
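The linked document describes the full method. As background only, and not the method itself, one common building block in LoRA-plus-merge workflows is folding a trained adapter back into its base model with the `peft` library; a minimal sketch (the paths are placeholders, not files from this repository) looks like this:

```python
# Sketch only: merge a trained LoRA adapter into its base model weights.
# "path/to/lora_adapter" and "path/to/merged_model" are placeholder paths.
from peft import AutoPeftModelForCausalLM

model = AutoPeftModelForCausalLM.from_pretrained("path/to/lora_adapter")
merged = model.merge_and_unload()  # fold the LoRA weights into the base model
merged.save_pretrained("path/to/merged_model")
```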