If you have not already, please review CONTRIBUTING.md for more complete information on expectations for contributions.
To contribute to this project, developers must use Python 3.11 and create a virtual environment from one of the requirements.txt files in the `requirements/` directory.
Ensure that you have Python 3.11 installed and that it is available in your PATH, and then identify the requirements file that you want to use:
Filename | OS | Architecture | TensorFlow | PyTorch |
---|---|---|---|---|
linux-amd64-py3.11-requirements-dev.txt | Linux | x86-64 | ❌ | ❌ |
linux-amd64-py3.11-requirements-dev-tensorflow.txt | Linux | x86-64 | ✅ | ❌ |
linux-amd64-py3.11-requirements-dev-pytorch.txt | Linux | x86-64 | ❌ | ✅ |
linux-arm64-py3.11-requirements-dev.txt | Linux | arm64 | ❌ | ❌ |
linux-arm64-py3.11-requirements-dev-tensorflow.txt | Linux | arm64 | ✅ | ❌ |
linux-arm64-py3.11-requirements-dev-pytorch.txt | Linux | arm64 | ❌ | ✅ |
macos-amd64-py3.11-requirements-dev.txt | macOS | x86-64 | ❌ | ❌ |
macos-amd64-py3.11-requirements-dev-tensorflow.txt | macOS | x86-64 | ✅ | ❌ |
macos-amd64-py3.11-requirements-dev-pytorch.txt | macOS | x86-64 | ❌ | ✅ |
macos-arm64-py3.11-requirements-dev.txt | macOS | arm64 | ❌ | ❌ |
macos-arm64-py3.11-requirements-dev-tensorflow.txt | macOS | arm64 | ✅ | ❌ |
macos-arm64-py3.11-requirements-dev-pytorch.txt | macOS | arm64 | ❌ | ✅ |
win-amd64-py3.11-requirements-dev.txt | Windows | x86-64 | ❌ | ❌ |
win-amd64-py3.11-requirements-dev-tensorflow.txt | Windows | x86-64 | ✅ | ❌ |
win-amd64-py3.11-requirements-dev-pytorch.txt | Windows | x86-64 | ❌ | ✅ |
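The table follows a predictable naming convention, so the base filename for your machine can be composed from `uname` output. This is a sketch for macOS/Linux only (on Windows, pick the `win-amd64` file from the table directly):

```shell
# Sketch: compose the base requirements filename for this machine.
# macOS/Linux only; on Windows use the win-amd64 file from the table.
case "$(uname -s)" in
  Linux)  os_tag="linux" ;;
  Darwin) os_tag="macos" ;;
esac
case "$(uname -m)" in
  x86_64|amd64)  arch_tag="amd64" ;;
  arm64|aarch64) arch_tag="arm64" ;;
esac
echo "${os_tag}-${arch_tag}-py3.11-requirements-dev.txt"
```

For the TensorFlow or PyTorch variants, insert `-tensorflow` or `-pytorch` before `.txt`.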
Next, use the `venv` module to create a new virtual environment:
python -m venv .venv
Activate the virtual environment after creating it. To activate it on macOS/Linux:
source .venv/bin/activate
To activate it on Windows:
.venv\Scripts\activate
Next, upgrade `pip` and install `pip-tools`:
python -m pip install --upgrade pip pip-tools
Finally, use `pip-sync` to install the dependencies in your chosen requirements file and to install `dioptra` in development mode.
On macOS/Linux:
# Replace "linux-amd64-py3.11-requirements-dev.txt" with your chosen file
pip-sync requirements/linux-amd64-py3.11-requirements-dev.txt
On Windows:
# Replace "win-amd64-py3.11-requirements-dev.txt" with your chosen file
pip-sync requirements\win-amd64-py3.11-requirements-dev.txt
If the requirements file you used is updated, or if you want to switch to another requirements file (for example, because you need access to the TensorFlow library), just run `pip-sync` again using the appropriate filename.
It will install, upgrade, and uninstall all packages accordingly and ensure that you have a consistent environment.
- Clone the repository at https://github.com/usnistgov/dioptra:
  git clone git@github.com:usnistgov/dioptra.git ~/dioptra/dev
  or
  git clone https://github.com/usnistgov/dioptra.git ~/dioptra/dev
- Change into the cloned directory:
  cd ~/dioptra/dev
- Check out the dev branch:
  git checkout dev
- Create a work directory for files:
  mkdir -p ~/dioptra/deployments/dev
The following describes commands to execute in four different terminal windows:
- Flask Terminal
  - Environment variables that must be set for Flask:
    DIOPTRA_RESTAPI_DEV_DATABASE_URI="sqlite:////home/<username>/dioptra/deployments/dev/dioptra-dev.db"
    DIOPTRA_RESTAPI_ENV=dev
    DIOPTRA_RESTAPI_VERSION=v1
    N.B.: replace `<username>` with your username. On some systems the home path may also be different. Verify the expansion of `~` with the `pwd` command while in the appropriate directory.
  - Activate the Python environment set up in the prior steps
  - Upgrade the database:
    dioptra-db autoupgrade
  - Run the Flask app:
    flask run
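For reference, one way to set these variables in a POSIX shell before running `flask run` (the database path shown is an example; adjust it for your system):

```shell
# Example only: export the Flask settings in the same terminal
# that will run `flask run`. Adjust the database path for your system.
export DIOPTRA_RESTAPI_DEV_DATABASE_URI="sqlite:////home/<username>/dioptra/deployments/dev/dioptra-dev.db"
export DIOPTRA_RESTAPI_ENV=dev
export DIOPTRA_RESTAPI_VERSION=v1
```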
- Frontend UI Terminal
  - Commands to get the frontend running:
    cd src/frontend
    npm install
    npm run dev
- Redis Terminal
  - Start Redis:
    redis-server
- Dioptra Worker
  - Starting a Dioptra Worker requires the following environment variables:
    DIOPTRA_WORKER_USERNAME="dioptra-worker"  # This must be a registered user in the Dioptra app
    DIOPTRA_WORKER_PASSWORD="password"  # Must match the username's password
    DIOPTRA_API="http://localhost:5000"  # This is the default API location when you run `flask run`
    RQ_REDIS_URI="redis://localhost:6379/0"  # This is the default URI when you run `redis-server`
    MLFLOW_S3_ENDPOINT_URL="http://localhost:35000"  # If you're running an MLflow Tracking server, update this to point at it; otherwise, this is a placeholder
    OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES  # macOS only, needed to make the RQ worker (i.e., the Dioptra Worker) work
  - Activate the Python environment set up in the prior steps (e.g. source .venv/bin/activate)
  - With the environment variables above set, execute the following commands:
    mkdir -p ~/dioptra/deployments/dev/workdir/
    cd ~/dioptra/deployments/dev/workdir/
    dioptra-worker-v1 'Tensorflow CPU'  # Assumes 'Tensorflow CPU' is a registered queue name
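These variables can be exported in the worker's terminal like so (a sketch using the default values described above; the fork-safety variable is only needed on macOS):

```shell
# Sketch: export the worker settings in the terminal that will run
# dioptra-worker-v1. Values are the defaults described above.
export DIOPTRA_WORKER_USERNAME="dioptra-worker"
export DIOPTRA_WORKER_PASSWORD="password"
export DIOPTRA_API="http://localhost:5000"
export RQ_REDIS_URI="redis://localhost:6379/0"
export MLFLOW_S3_ENDPOINT_URL="http://localhost:35000"
export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES  # macOS only
```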
- The frontend app is available by default at http://localhost:5173 (the frontend terminal window should also indicate the URL to use)
- Create the Dioptra worker user in the Frontend UI or through the API. The curl command for interacting with the API (assuming you have the environment variables from Step iv set) is:
  curl http://localhost:5000/api/v1/users/ -X POST --data-raw "{\"username\": \"$DIOPTRA_WORKER_USERNAME\", \"email\": \"dioptra-worker@localhost\", \"password\": \"$DIOPTRA_WORKER_PASSWORD\", \"confirmPassword\": \"$DIOPTRA_WORKER_PASSWORD\"}"
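The escaped quoting in that command is easy to get wrong. One way to build the same JSON body more readably is a quoted heredoc (a sketch; it falls back to the default values if the Step iv variables are unset):

```shell
# Sketch: build the user-creation JSON body with a heredoc instead of
# escaped quotes. Falls back to defaults if the variables are unset.
: "${DIOPTRA_WORKER_USERNAME:=dioptra-worker}"
: "${DIOPTRA_WORKER_PASSWORD:=password}"
body=$(cat <<EOF
{"username": "$DIOPTRA_WORKER_USERNAME", "email": "dioptra-worker@localhost", "password": "$DIOPTRA_WORKER_PASSWORD", "confirmPassword": "$DIOPTRA_WORKER_PASSWORD"}
EOF
)
echo "$body"
# Then: curl http://localhost:5000/api/v1/users/ -X POST --data-raw "$body"
```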
- Create the 'Tensorflow CPU' Queue -- this needs to agree with the queue name used in Step iv.
This project uses Sphinx to generate the documentation published at https://pages.nist.gov/dioptra. To build the documentation locally, activate your virtual environment if you haven't already and run:
python -m tox run -e web-compile,docs
Alternatively, you can also use `make` to do this:
make docs
This project uses `black` and `isort` to automatically format Python code.
Developers are expected to run `black` and `isort` over their contributions before opening a Pull Request.
To do this, activate your virtual environment if you haven't already and run:
# Run black to reformat code
python -m tox run -e black -- src/dioptra task-plugins/dioptra_builtins tests
# Run isort to reformat import statements
python -m tox run -e isort -- src/dioptra task-plugins/dioptra_builtins tests
Alternatively, you can also use `make` to do this:
make beautify
This project uses `flake8` as a code linter and `mypy` to perform static type checking.
Developers are expected to run `flake8` and `mypy` and resolve all issues before opening a Pull Request.
To do this, activate your virtual environment if you haven't already and run:
# Lint the code
python -m tox run -e flake8
# Perform static type checking
python -m tox run -e mypy
Alternatively, you can also use `make` to do this:
make code-check
This project has a commit style guide that is enforced using the `gitlint` tool.
Developers are expected to run `gitlint` and validate their commit message before opening a Pull Request.
After committing your contribution, activate your virtual environment if you haven't already and run:
python -m tox run -e gitlint
Alternatively, you can also use `make` to do this:
make commit-check
This project stores its unit tests in the `tests/` folder and runs them using pytest.
Developers are expected to create new unit tests to validate any new features or behavior that they contribute and to verify that all unit tests pass before opening a Pull Request.
To run the unit tests, activate your virtual environment if you haven't already and run:
python -m tox run -e py311-pytest -- tests/unit
python -m tox run -e py311-cookiecutter
Alternatively, you can also use `make` to do this:
make tests-unit
Run the following to clear away the project's temporary files, including the sentinel dotfiles that are created in `build/` when using `make`:
make clean