
Create Synthetic Data using Large Language Models

Augment datasets using Large Language Models by generating synthetic data. This repository contains simple examples you can use to generate synthetic data and persist it to disk as a CSV file.

Set up your environment

This example can run in Codespaces, but if you are cloning this repository locally, use the following steps:

Install the dependencies

Create the virtual environment and install the dependencies:

python3 -m venv .venv
source .venv/bin/activate
.venv/bin/pip install -r requirements.txt

Here is a summary of what this repository will use:

  1. Llamafile for the LLM (alternatively, you can use an OpenAI API compatible key and endpoint)
  2. OpenAI's Python API to connect to the LLM (see the connection sketch after this list)
  3. A large language model (LLM), such as Mixtral running through Llamafile or a model behind an OpenAI API based service, to generate the synthetic data
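
The following is a minimal sketch of item 2 above: connecting to an OpenAI API compatible endpoint with the openai Python package. The base URL, API key placeholder, and model name are assumptions for a locally running Llamafile; swap in your own endpoint, key, and model name if you use an OpenAI API based service.

from openai import OpenAI

# Assumed local Llamafile endpoint; Llamafile serves an OpenAI-compatible
# API (commonly at http://localhost:8080/v1) and does not validate the key.
client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="sk-no-key-required",
)

response = client.chat.completions.create(
    model="LLaMA_CPP",  # placeholder model name; use your model or deployment name
    messages=[
        {"role": "system", "content": "You generate realistic synthetic data."},
        {"role": "user", "content": "Generate one example customer record as JSON."},
    ],
)
print(response.choices[0].message.content)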

Lessons

  1. Connect to an OpenAI based service
  2. Generate Synthetic Data
  3. Persist and verify the Synthetic Data (a combined sketch of lessons 2 and 3 follows below)
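
As a rough sketch of lessons 2 and 3, the snippet below asks the model for a few CSV rows, writes them to disk, and reads the file back as a quick check. It reuses the client from the previous sketch; the prompt, column names, and output path are illustrative and not taken from the repository.

import csv

# Illustrative prompt and columns; adjust them to the dataset you want to augment.
prompt = (
    "Generate 5 rows of synthetic customer data as CSV with the columns "
    "name,age,city. Return only the rows, with no header and no extra text."
)

response = client.chat.completions.create(
    model="LLaMA_CPP",  # same placeholder model name as above
    messages=[{"role": "user", "content": prompt}],
)

# Parse the model output into rows, skipping any blank lines.
rows = [
    line.split(",")
    for line in response.choices[0].message.content.strip().splitlines()
    if line.strip()
]

# Persist the synthetic data to disk as a CSV file.
with open("synthetic_data.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "age", "city"])
    writer.writerows(rows)

# Verify: read the file back and count the data rows.
with open("synthetic_data.csv") as f:
    print(sum(1 for _ in f) - 1, "synthetic rows written")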
