If you discover a bug, want to propose a new feature, or have other feedback about requests-cache, please create an issue!
If you want to discuss ideas about the project in general, or if you have an issue or PR that hasn't received a response in a timely manner, please reach out on the Code Shelter chat server, under projects/requests-cache.
Requests-cache is under active development! Contributions are very welcome, and will be attributed on the Contributors page.
If you are interested in helping out, here are a few ways to get started:
- Give feedback on open issues
- Make or suggest improvements for the documentation; see #355 for details.
- See the help-wanted issue label
- See the shelved issue label for features that have been previously proposed and aren't currently planned, but haven't been completely ruled out either
- If you find an issue you want to work on, please comment on it so others know it's in progress
To set up for local development (requires poetry):
git clone https://github.com/requests-cache/requests-cache.git
cd requests-cache
poetry install -v -E all
CI jobs will run code style checks, type checks, linting, etc. If you would like to run these same checks locally, you can use pre-commit. This is optional but recommended.
To install pre-commit hooks:
pre-commit install
To manually run checks on all files:
pre-commit run --all-files
# Alternative alias with nox:
nox -e lint
To disable pre-commit hooks:
pre-commit uninstall
- Tests are divided into unit and integration tests:
- Unit tests can be run without any additional setup, and don't depend on any external services.
- Integration tests depend on additional services, which are easiest to run using Docker (see Integration Tests section below).
- See conftest.py for pytest fixtures that apply the most common mocking steps and other test setup.
- Run pytest to run all tests
- Run pytest tests/unit to run only unit tests
- Run pytest tests/integration to run only integration tests
For CI jobs (including PRs), these tests will be run for each supported python version. You can use nox to do this locally, if needed:
nox -e test
Or to run tests for a specific python version:
nox -e test-3.10
To generate a coverage report:
nox -e cov
See nox --list for a full list of available commands.
A live web server and backend databases are required to run integration tests, and a docker-compose config is included to make this easier. First, install docker and docker-compose.
Then, run:
docker-compose up -d
pytest tests/integration
If you can't easily run Docker containers in your environment but still want to run some of the integration tests, you can use pytest-httpbin instead of the httpbin container. This just requires installing an extra package and setting an environment variable:
pip install pytest-httpbin
export USE_PYTEST_HTTPBIN=true
pytest tests/integration/test_cache.py
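The actual switching logic lives in the project's test setup, but as a rough sketch, a suite might read the environment variable like this (the helper name here is illustrative, not part of requests-cache):

```python
import os

# Hedged sketch: decide whether to use pytest-httpbin instead of the
# httpbin Docker container, based on the env var set above
def use_pytest_httpbin() -> bool:
    return os.environ.get("USE_PYTEST_HTTPBIN", "").lower() in ("1", "true", "yes")

os.environ["USE_PYTEST_HTTPBIN"] = "true"
print(use_pytest_httpbin())  # → True
```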
For backend databases, you can install and run them on the host instead of in a container, as long as they are running on the default port.
Sphinx is used to generate documentation.
To build the docs locally:
nox -e docs
To preview:
# MacOS:
open docs/_build/html/index.html
# Linux:
xdg-open docs/_build/html/index.html
You can also use sphinx-autobuild to rebuild the docs and live reload them in the browser whenever doc contents change:
nox -e livedocs
Sometimes, there are differences in the Readthedocs build environment that can cause builds to
succeed locally but fail remotely. To help debug this, you can use the
readthedocs/build container to build
the docs. A configured build container is included in docs/docker-compose.yml
to simplify this.
Run with:
# Optionally add --build to rebuild with updated dependencies
docker-compose -f docs/docker-compose.yml up -d
docker exec readthedocs make all
Here are some general guidelines for submitting a pull request:
- If the changes are trivial, just briefly explain the changes in the PR description
- Otherwise, please submit an issue describing the proposed change prior to submitting a PR
- Add unit test coverage for your changes
- If your changes add or modify user-facing behavior, add documentation describing those changes
- Submit the PR to be merged into the main branch
- Releases are built and published to PyPI based on git tags.
- Milestones will be used to track progress on major and minor releases.
- GitHub Actions will build and deploy packages to PyPI on tagged commits on the main branch.
Release steps:
- Update the version in requests_cache/__init__.py
- Update the release notes in HISTORY.md
- Generate a sample cache for the new version (used by unit tests) with python tests/generate_test_db.py
- Merge changes into the main branch
- Push a new tag, e.g.: git tag v0.1 && git push origin --tags
- This will trigger a deployment. Verify that it completes successfully and that the new version can be installed from PyPI with pip install
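Before pushing, it can be worth sanity-checking the tag name, since only correctly named tags trigger a deployment. A minimal check, assuming the pattern inferred from the example tag above (v0.1, plus an optional patch version like v1.2.3) — this is an illustration, not an official spec:

```python
import re

# Tag pattern inferred from the example above: 'v' + major.minor, optional patch
TAG_PATTERN = re.compile(r"^v\d+\.\d+(\.\d+)?$")

print(bool(TAG_PATTERN.match("v0.1")))    # → True
print(bool(TAG_PATTERN.match("v1.2.3")))  # → True
print(bool(TAG_PATTERN.match("0.1")))     # → False (missing 'v' prefix)
```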
Pre-release builds are convenient for letting testers try out in-development changes. Versions with the suffix .dev (among others) can be deployed to PyPI and installed by users with pip install --pre, and are otherwise ignored by pip install:
# Install latest pre-release build:
pip install -U --pre requests-cache
# Install latest stable build:
pip install -U requests-cache
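The reason plain pip install skips these builds is that PEP 440 orders a .devN version *before* the final release with the same number. A toy comparison to illustrate the ordering (real tools should use packaging.version.Version, not this sketch, and real PEP 440 parsing handles many more forms):

```python
# Toy sort key for versions like "1.0" and "1.0.dev1", for illustration only
def version_key(v: str):
    release, _, dev = v.partition(".dev")
    numbers = tuple(int(x) for x in release.split("."))
    # dev builds sort before the final release with the same release number
    return numbers + ((0, int(dev)) if dev else (1, 0))

versions = ["1.0.dev1", "1.0", "1.0.dev0", "0.9"]
print(sorted(versions, key=version_key))
# → ['0.9', '1.0.dev0', '1.0.dev1', '1.0']
```

Since "1.0" sorts above its own dev builds, a resolver that ignores pre-releases by default will never pick "1.0.dev1" unless explicitly asked to with --pre.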
Notes:
- See python packaging docs on pre-release versioning for more info on how this works
- requests-cache pre-release docs can be found here: https://requests-cache.readthedocs.io/en/latest/
- Any collaborator can trigger a pre-release build for requests-cache by going to Actions > Deploy > Run workflow
- A complete list of builds can be found on PyPI under 'Release History'