
Dockerify.AI: generate docker and docker-compose files with just one command!




Overview

Dockerify.ai leverages Docker, AI, and TypeScript to streamline the creation of Docker-related files. With OpenAI integration, it can ask you interactive questions and generate a working Docker configuration for your project.


How does it work

This is an early-stage version of the project; new features and behaviours will be added over time.

At the moment, it works as follows:

  • First, it reads the names of the files at the root of your project (not their contents) and sends them to OpenAI to get an idea of the project type.
  • Often a particular file or folder is enough for the AI to understand what kind of project it is (for example, if your project contains a package.json file and a .next folder, it is most probably a NextJS project).
  • If it cannot guess the project type, it will ask for the contents of one or more files (you will be asked for permission before anything is sent).
  • It may also ask you some clarifying questions.
  • Finally, it generates a Dockerfile and a docker-compose file.
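As a rough illustration, for a simple Node.js project the generated docker-compose file might resemble the following (a hypothetical sketch; the actual output depends on what the AI infers about your project):

```yaml
services:
  app:
    build: .
    ports:
      - "3000:3000"       # assumed application port
    volumes:
      - .:/app            # sync the codebase into the container
      - /app/node_modules # keep node_modules out of the host sync for performance
```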

Repository Structure

└── dockerify.ai/
    ├── .github
    │   └── workflows
    │       ├── onPushToMain.yml
    │       ├── onRelease.yml
    │       └── test.yml
    ├── Dockerfile
    ├── README.md
    ├── bin
    │   ├── dev.cmd
    │   ├── dev.js
    │   ├── run.cmd
    │   └── run.js
    ├── docker-compose.yml
    ├── package.json
    ├── src
    │   ├── ai
    │   │   ├── ai.ts
    │   │   └── impl
    │   │       └── openai.impl.ts
    │   ├── api.ts
    │   ├── appConfig.ts
    │   ├── commands
    │   │   └── index.ts
    │   └── index.ts
    ├── test
    │   ├── commands
    │   │   └── hello
    │   │       ├── index.test.ts
    │   │       └── world.test.ts
    │   └── tsconfig.json
    ├── tsconfig.json
    └── yarn.lock

Modules

.

  • docker-compose.yml: Defines the Docker environment for running the application. It sets up an app service, mounts the application codebase into the container, and excludes node_modules from the mount to improve performance.
  • Dockerfile: Builds the Docker image for the dockerify.ai application: it installs the Node dependencies, bundles the app source, and starts execution from run.js in the bin directory. This encapsulates the entire application for seamless deployment and a consistent runtime environment.
  • tsconfig.json: The TypeScript configuration for the project. It tells the compiler which JavaScript version to target, where the project root is, and where to output the compiled JavaScript files.
  • package.json: Outlines the project metadata, including dependencies, script commands, and the main entry point (./bin/run.js). Notable dependencies such as axios, openai, and inquirer point to network requests, interaction with the OpenAI API, and terminal user input, respectively.
  • yarn.lock: Pins the exact versions of all dependencies so that installs are reproducible across environments.
bin

  • run.cmd: A Windows-specific command script that launches the Node.js runtime to execute the run script with the provided arguments.
  • dev.js: The development entry point, executed via Node.js, that runs the application in a development environment. It uses oclif core's execute function for command processing.
  • dev.cmd: Bootstraps the development environment on Windows: it starts a Node process with ts-node as the module loader, suppressing experimental warnings.
  • run.js: The production entry point: it performs script execution in the oclif CLI framework, facilitating the command-line operation of the application.
.github.workflows

  • onPushToMain.yml: Triggers automated tasks whenever changes are pushed to the main branch, including testing and building the software to ensure code quality and functionality before any release.
  • onRelease.yml: Central to the CI/CD pipeline of the repository. Upon new releases, it triggers automated processes such as code testing, building Docker images, and pushing them to a Docker registry, ensuring streamlined updates and deployment.
  • test.yml: A GitHub Actions workflow that executes the test suite when specific events occur, validating changes before they are merged into the main codebase.
test

  • tsconfig.json: The test suite's TypeScript configuration. It extends the base config with testing-specific options, ensuring no new JavaScript files are generated during testing and linking to the source code in the parent directory, which enables type safety checks and tooling support during automated testing.
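A minimal version of such a test configuration might look like this (illustrative only; the real file may set different options):

```json
{
  "extends": "../tsconfig",
  "compilerOptions": {
    "noEmit": true
  }
}
```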
test.commands.hello

  • index.test.ts: Tests the hello command, confirming its successful execution and correct output.
  • world.test.ts: A unit test for the hello world command; it checks whether running the command yields the expected "hello world!" output.
src

  • appConfig.ts: Maintains application-specific configuration using Configstore. It creates a Configstore instance named dockerify.ai, providing a centralized store for settings that can alter application behavior without code changes.
  • api.ts: The communication bridge between the user and the AI assistant module: it receives user messages, sends them to the AI module, and delivers the AI's responses back, while showing a spinner to indicate when the AI is processing a request.
  • index.ts: The application entry point. It re-exports the run command from @oclif/core, which handles command-line interface operations in the project's command-based architecture.
src.ai

  • ai.ts: Provides an abstraction layer for interacting with different AI engines through an IAI interface. OpenAI is currently the default implementation; this design makes it easy to replace or add AI services by swapping implementations.
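The abstraction pattern described above can be sketched as follows (a hypothetical illustration; the actual method names in ai.ts may differ):

```typescript
// Hypothetical sketch of an engine-agnostic AI abstraction.
// The method name `ask` is an assumption, not the real IAI signature.
interface IAI {
  ask(question: string): Promise<string>;
}

// A stand-in engine; in the real project, openai.impl.ts plugs in OpenAI here.
class EchoAI implements IAI {
  async ask(question: string): Promise<string> {
    return `echo: ${question}`;
  }
}

// Swapping AI services only requires changing what this factory returns.
function getAI(): IAI {
  return new EchoAI();
}

getAI()
  .ask("What kind of project is this?")
  .then((reply) => console.log(reply)); // prints "echo: What kind of project is this?"
```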
src.ai.impl

  • openai.impl.ts: Implements the OpenAI backend. The AppOpenAi class provides the core chat interaction: creating, running, and managing asynchronous conversations with the AI assistant, collecting user inputs, and returning AI-generated replies. The API key used for authentication is read from the application configuration.
src.commands

  • index.ts: Defines the MainCommand class responsible for generating Docker configurations for a project. It gathers the required inputs (file contents or project details) through interactive inquiries, supports cleaning the project and specifying the project path, and uses OpenAI's API for its interactions, underpinning the core functionality of the repository.

Getting Started

Instant usage without installation

npx dockerify.ai@latest

Usage

USAGE
  $ dockerify  [--openAiApiKey <value>] [-c] [-P <value>]

FLAGS
  -P, --path=<value>          The root path of the project you want to generate the docker configuration for
  -c, --clean                 Clean the project
      --openAiApiKey=<value>  OpenAI API key

DESCRIPTION
  Generates a working docker configuration

Installation

  1. Clone the dockerify.ai repository:
git clone https://github.com/apperside/dockerify.ai
  2. Change to the project directory:
cd dockerify.ai
  3. Install the dependencies:
npm install

Running dockerify.ai

Use the following command to run dockerify.ai:

./bin/dev.js

Project Roadmap

  • ► Add tests
  • ► Improve project type detection
  • ► Handle more complicated scenarios (like monorepos)
  • ► Custom prompt to add any required customization (such as adding a database, or any other publicly known Docker image, to the docker-compose file)

Contributing

Contributions are welcome! Here are several ways you can contribute:

Contributing Guidelines
  1. Fork the Repository: Start by forking the project repository to your GitHub account.
  2. Clone Locally: Clone the forked repository to your local machine using a Git client.
    git clone https://github.com/apperside/dockerify.ai
  3. Create a New Branch: Always work on a new branch, giving it a descriptive name.
    git checkout -b new-feature-x
  4. Make Your Changes: Develop and test your changes locally.
  5. Commit Your Changes: Commit with a clear message describing your updates.
    git commit -m 'Implemented new feature x.'
  6. Push to GitHub: Push the changes to your forked repository.
    git push origin new-feature-x
  7. Submit a Pull Request: Create a PR against the original project repository. Clearly describe the changes and their motivations.

Once your PR is reviewed and approved, it will be merged into the main branch.


License

This project is released under the MIT License. For more details, refer to the LICENSE file.


Acknowledgments
