- Introduction
- Prerequisites
- Installation
- Demo Notebook
- Manual Quick Start
- Usage
- Command Overview
- Azure OpenAI Services
- Configuration
- Examples
- Removing Files and Models
finetuna is designed to simplify the fine-tuning of OpenAI models. Train your models and interact with them, all through PowerShell.

Its headline feature: pipe a whole directory of JSONL files into a single command that trains a new model/bot on those files.
- PowerShell (any version)
- OpenAI API Key or Azure OpenAI API Key
- Set Execution Policy: Run the following command if needed:

  ```powershell
  Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
  ```
- Install from the PowerShell Gallery: Run the following command:

  ```powershell
  Install-Module -Name finetuna
  ```

  Dependencies will be installed automatically.
- Get and set your OpenAI API key: Get or create your OpenAI API key at https://platform.openai.com/account/api-keys, then set it as plain text in `$env:OPENAI_API_KEY`:

  ```powershell
  $env:OPENAI_API_KEY = "sk-fake-T3BlbFJi7vpHiKhyYKy8aUT3Blbk"
  ```

  You may also want to put it in your `$profile`. `$env:OpenAIKey` also works, for compatibility with other PowerShell modules.
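To make the key available in every new session, one option is to append the same line to your PowerShell profile. This is a sketch; it reuses the fake placeholder key from the example above, and assumes your profile file already exists:

```powershell
# Append the key assignment to your profile so new sessions pick it up
# (the key below is a fake placeholder; substitute your real key)
Add-Content -Path $profile -Value '$env:OPENAI_API_KEY = "sk-fake-T3BlbFJi7vpHiKhyYKy8aUT3Blbk"'
```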
finetuna comes with a demo Jupyter notebook that showcases the main functionality of the module. The notebook, `demo.ipynb`, is located in the root directory of the module.

To launch the demo notebook, use the `Start-TuneDemo` command:

```powershell
Start-TuneDemo
```

This command will attempt to launch the notebook in Visual Studio Code (regular or Insiders edition). If Visual Studio Code is not available, it falls back to platform-specific launchers such as Jupyter or the default application associated with `.ipynb` files.
The demo notebook covers the following topics:
- Setting the provider configuration
- Grabbing a tuning file
- Checking the validity of the training file
- Uploading the file
- Checking the uploaded file
- Starting a tuning job
- Waiting for the job to complete
- Retrieving the job details
- Retrieving events for a specific fine-tuning job
- Retrieving and deleting custom models
- Retrieving the default model
- Measuring token count for a given text
- Getting the current provider configuration
By following along with the demo notebook, you can get a hands-on understanding of how to use finetuna to fine-tune OpenAI models and interact with them through PowerShell.
Use a batch of JSONL files to train, or "fine-tune", a model to make a bot. The `-Append` parameter lets you fine-tune an initial model, save it as a new model, then train the rest of the files against that.

```powershell
Get-ChildItem .\sample\totbot*.jsonl | Create-CustomModel -Append
```

*** "train" used in a casual manner, as I am not an ML scientist and you probably aren't either because they don't use PowerShell for these tasks 💔 Anyway, now I can chat with my cat.

Drop the `-Append` parameter to train a separate model from each JSONL file:

```powershell
Get-ChildItem .\sample\totbot*.jsonl | Create-CustomModel
```
- Send Files: Use `Send-TuneFile` to upload your training files.

  ```powershell
  Get-ChildItem .\sample\totbot*.jsonl | Send-TuneFile
  ```

- Verify File Upload: Use `Get-TuneFile` to make sure your file was uploaded correctly.

  ```powershell
  Get-TuneFile | Select-Object -Last 1
  ```

- Start Training: Once your files are ready, start the training job.

  ```powershell
  Get-TuneFile | Out-GridView -Passthru | Start-TuneJob
  ```
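Once the job is started, the rest of the loop can be sketched with `Wait-TuneJob`, `Set-TuneModelDefault`, and `Invoke-TuneChat` from the command overview below. Treat this as a sketch: whether these commands accept pipeline input this way is an assumption, so check `Get-Help` for the real signatures.

```powershell
# Wait for the most recent fine-tuning job to finish
# (pipeline support here is assumed, not confirmed)
Get-TuneJob | Select-Object -Last 1 | Wait-TuneJob

# Make the newest custom model the default for Invoke-TuneChat
Get-TuneModel -Custom | Select-Object -Last 1 | Set-TuneModelDefault

# Chat with the fine-tuned model
Invoke-TuneChat -Message "Tell me about cats"
```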
Command Name | Description |
---|---|
Clear-TuneProvider | Clears the OpenAI provider configuration for finetuna |
Compare-Embedding | Calculates the similarity between two embedding vectors |
Create-CustomModel | An alias for New-TuneModel |
Get-Embedding | Creates an embedding vector for the given text |
Get-TuneFile | Retrieves a list or a specific tuning file |
Get-TuneFileContent | Reads the content of a list or specific tuning file |
Get-TuneJob | Retrieves a list or details of a specific tuning job |
Get-TuneJobEvent | Fetches events for a list or specific tuning job |
Get-TuneModel | Retrieves a list or a specific tuning model |
Get-TuneModelDefault | Gets the default model that Invoke-TuneChat uses |
Get-TuneProvider | Retrieves the current OpenAI provider configuration for finetuna |
Invoke-TuneChat | Initiates a chat session with a tuning model |
Invoke-TunedChat | Initiates a chat session with any model (alias for Invoke-TuneChat) |
Measure-TuneToken | Measures the token usage of the provided text |
New-TuneModel | Creates a new tuning model |
Remove-TuneFile | Deletes a specific tuning file |
Remove-TuneModel | Deletes a specific tuning model |
Request-TuneFileReview | Submits a file to Invoke-TuneChat for improvement suggestions |
Send-TuneFile | Sends a file for tuning |
Set-TuneModelDefault | Sets the default model that Invoke-TuneChat will use |
Set-TuneProvider | Configures the OpenAI or Azure OpenAI service context for finetuna |
Start-TuneDemo | Launches the finetuna demo notebook |
Start-TuneJob | Starts a new tuning job |
Stop-TuneJob | Stops a running tuning job |
Test-TuneFile | Validates tune files before sending to model for training |
Wait-TuneJob | Waits for a fine-tuning job to complete |
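As a quick illustration of how these compose, here is a hedged sketch for checking on a running job; pipeline support between these two commands is an assumption, not something the table confirms.

```powershell
# Grab the most recent tuning job and list its events
Get-TuneJob | Select-Object -Last 1 | Get-TuneJobEvent
```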
finetuna also supports Azure OpenAI services. To use Azure, you need to set up the provider using the `Set-TuneProvider` command:

```powershell
$splat = @{
    Provider   = "Azure"
    ApiKey     = "your-azure-api-key"
    ApiBase    = "https://your-azure-endpoint.openai.azure.com/"
    Deployment = "your-deployment-name"
}
Set-TuneProvider @splat
```
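To confirm the context took effect, you can read back the active configuration with `Get-TuneProvider` from the command overview above:

```powershell
# Show the provider configuration finetuna will use
Get-TuneProvider
```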
Note that fine-tuned models can be surprisingly expensive to host on Azure ($3/hr hosting plus training costs). Be sure to check the pricing before you start training.
- Default Quota: The default quota for Azure OpenAI services is relatively low. If you're experiencing limitations, consider requesting a quota increase through the Azure portal.
- Model Compatibility: If your assistant isn't working as expected, try changing the model in your deployment to ensure it is using a compatible model.
- API Versions: Make sure you're using a compatible API version. Check the Azure documentation for the latest supported versions.
- Deployment Name: When setting up the OpenAI provider for Azure, ensure you're using the correct deployment name. This is different from the model name and is specific to your Azure OpenAI resource.
- Region Availability: Azure OpenAI services may not be available in all Azure regions. Ensure you're using a supported region for your deployment.
You can persist the OpenAI provider configuration to a JSON file using the `Set-TuneProvider` command. This allows you to save the configuration for future sessions:

```powershell
Set-TuneProvider -Provider OpenAI -ApiKey "your-openai-api-key"
```
To reset the OpenAI provider configuration, use the `Clear-TuneProvider` command:

```powershell
Clear-TuneProvider
```

This command removes the persisted configuration file and resets the configuration to its default state.
```powershell
# Upload training files
Get-ChildItem .\sample\totbot*.jsonl | Send-TuneFile

# Start a training job
Get-TuneFile | Out-GridView -Passthru | Start-TuneJob

# Chat with your fine-tuned model
Invoke-TunedChat -Message "Tell me about cats"
```
Use `Remove-TuneFile` to delete a specific file that you have uploaded.

```powershell
Get-TuneFile | Where-Object Name -eq "sample_file.jsonl" | Remove-TuneFile
Get-TuneFile | Select-Object -Last 1 | Remove-TuneFile
```
Use `Remove-TuneModel` to delete a specific tuning model.

```powershell
Get-TuneModel | Where-Object Name -eq "sample_model" | Remove-TuneModel
Get-TuneModel -Custom | Remove-TuneModel -Confirm:$false
```
Learn more about fine-tuning here: https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/fine-tuning-considerations