Update readme and create sample env file for local dev #5651

Draft
wants to merge 8 commits into main
93 changes: 64 additions & 29 deletions README.md
@@ -22,11 +22,10 @@ This project uses Docker compose to setup and run all the necessary components.
cd data-hub-api
```

2. Create `.env` files from `sample.env`
2. Create a `.env` file

```shell
cp sample.env .env
cp config/settings/sample.env config/settings/.env
cp sample-docker-dev.env .env
```

If you're working with data-hub-frontend and mock-sso, `DJANGO_SUPERUSER_SSO_EMAIL_USER_ID` should be the same as
@@ -87,11 +86,10 @@ There is now a `make` command to bring up the three environments on a single doc
cd data-hub-api
```

2. Create `.env` files from `sample.env`
2. Create a `.env` file

```shell
cp sample.env .env
cp config/settings/sample.env config/settings/.env
cp sample-docker-dev.env .env
```

Ensure `DJANGO_SUPERUSER_SSO_EMAIL_USER_ID` is the same as `MOCK_SSO_EMAIL_USER_ID`
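
   For example, reusing the placeholder value from `sample-docker-dev.env` (the ID itself is arbitrary, as long as both files agree):

   ```shell
   # data-hub-api .env
   [email protected]

   # mock-sso .env
   [email protected]
   ```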
@@ -142,7 +140,7 @@ There is now a `make` command to bring up the three environments on a single doc
Dependencies:

- Python 3.10.x
- PostgreSQL 12
- PostgreSQL 16.x
- redis 6.x
- OpenSearch 1.x

@@ -171,7 +169,7 @@ Dependencies:
brew install libpq
```

4. Install _postgres_, if not done already, as this is required by **psycopg2** in the requirements below
4. Install Postgres

On Ubuntu:

@@ -183,52 +181,85 @@ Dependencies:

```shell
brew install postgresql
brew services start postgresql
```

5. Create and activate the virtualenv:
5. Set up the postgres user and database

On macOS:
```shell
createdb
```

```shell
psql
CREATE DATABASE datahub;
CREATE DATABASE test_datahub;
CREATE USER datahubuser WITH SUPERUSER PASSWORD 'datahubpassword';
\q
```

NB: SUPERUSER is needed to create the 'citext' extension.
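
   As a minimal sketch (the migrations normally take care of this when run as the superuser role above), the extension can also be created manually in the two databases:

   ```shell
   # Create the citext extension in both databases (no-op if it already exists)
   psql -d datahub -c "CREATE EXTENSION IF NOT EXISTS citext;"
   psql -d test_datahub -c "CREATE EXTENSION IF NOT EXISTS citext;"
   ```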

On macOS:

The default table collation ('C') is case- and accent-sensitive, whereas the default ('en_US.utf8') on Debian (which the Python Docker image is based on) is not. See [this](https://postgresql.verite.pro/blog/2019/10/14/nondeterministic-collations.html) for more information on collation types. Without this change, a couple of tests will fail.

```shell
psql
update pg_database set datcollate='en_US.UTF-8', datctype='en_US.UTF-8' where datname = 'test_datahub';
update pg_database set datcollate='en_US.UTF-8', datctype='en_US.UTF-8' where datname = 'datahub';
\q
brew services restart postgresql
```

6. Create and activate the virtualenv:

```shell
python3.10 -m venv env
source env/bin/activate
pip install -U pip
# Or, to match the pip version used by the Cloud Foundry Python buildpack
# (https://github.com/cloudfoundry/python-buildpack/releases), pin it explicitly, e.g.:
# python -m pip install pip==22.1.2
```

6. Install the dependencies:
7. Install the dependencies:

```shell
pip install -r requirements-dev.txt
```

7. Create an `.env` settings file (it’s gitignored by default):
8. Create a `.env` file (it’s gitignored by default):

```shell
cp config/settings/sample.env config/settings/.env
cp sample-local-dev.env .env
```

8. Set `DOCKER_DEV=False` and `LOCAL_DEV=True` in `.env`

9. Make sure you have OpenSearch running locally. If you don't, you can run one in Docker:
11. Make sure you have OpenSearch running locally.

```shell
docker run -p 9200:9200 -e "http.host=0.0.0.0" -e "transport.host=127.0.0.1" -e "plugins.security.disabled=true" opensearchproject/opensearch:1.2.4
brew install opensearch
brew services start opensearch
```
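
    To sanity-check that OpenSearch is up on the default port used above (9200), and assuming the security plugin is disabled as in the Docker example:

    ```shell
    curl http://localhost:9200
    ```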

12. Install and run redis

10. Make sure you have redis running locally and that the REDIS_BASE_URL in your `.env` is up-to-date.
```shell
brew install redis
brew services start redis
```
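
    For local development the `REDIS_BASE_URL` in your `.env` would typically point at this local instance on the default port; an illustrative value (check your sample env for the exact one) is:

    ```shell
    REDIS_BASE_URL=redis://localhost:6379
    ```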

11. Populate the databases and initialise OpenSearch:
13. Populate the databases and initialise OpenSearch:

```shell
./manage.py migrate
./manage.py migrate_search

./manage.py loadinitialmetadata
# Force is required as `migrate` already loads some metadata
# but loading fixtures will fail without this
./manage.py loadinitialmetadata --force
./manage.py createinitialrevisions
```

12. Optionally, you can load some test data:
14. Optionally, you can load some test data:

```shell
./manage.py loaddata fixtures/test_data.yaml
@@ -238,27 +269,31 @@
and hence the loaded records won't be returned by search endpoints until RQ is
started and the queued tasks have run.

13. Create a superuser:
15. Create a superuser:

```shell
./manage.py createsuperuser
```

(You can enter any valid email address as the username and SSO email user ID.)

14. Start the server:
16. Start the server:

```shell
./manage.py runserver
```
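
    Once it is running, a quick way to confirm the server responds (illustrative; any endpoint will do) is:

    ```shell
    curl -I http://localhost:8000/
    ```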

15. Start RQ (Redis Queue):
17. Start RQ (Redis Queues):

Start the short-running queue worker:
```shell
python rq/rq-worker.py
./rq-run.sh short-running-worker.py
```

Note that in production the cron-scheduler (1/1), short-running-worker (4/4) and long-running-worker (4/4) are run as separate instances.
Start the long-running queue worker:
```shell
./rq-run.sh long-running-worker.py
```

## API documentation

@@ -600,4 +635,4 @@ Example of environment variables for updating the interaction notification email
```
INTERACTION_NOTIFICATION_API_KEY=<notify-api-key>
EXPORT_NOTIFICATION_NEW_INTERACTION_TEMPLATE_ID=<template-id>
```
36 changes: 0 additions & 36 deletions config/settings/sample.env

This file was deleted.

127 changes: 64 additions & 63 deletions sample.env → sample-docker-dev.env
@@ -1,87 +1,88 @@
# Environment variables specific to usage with docker-compose
DATABASE_CREDENTIALS={"username": "postgres", "password": "datahub", "engine": "postgres", "port": 5432, "dbname": "datahub", "host": "postgres", "dbInstanceIdentifier": "db-instance"}
POSTGRES_URL=tcp://postgres:5432
### Local dev ###
PAAS_IP_WHITELIST=1.2.3.4
DISABLE_PAAS_IP_CHECK=true
DEBUG=True
DJANGO_SECRET_KEY=changeme
DJANGO_SETTINGS_MODULE=config.settings.local
COV_TOKEN=${COV_TOKEN}

### Postgres ###
DATABASE_URL=postgresql://postgres:datahub@postgres/datahub
DATABASE_CREDENTIALS={"username": "postgres", "password": "datahub", "engine": "postgres", "port": 5432, "dbname": "datahub", "host": "postgres", "dbInstanceIdentifier": "db-instance"}

### Redis ###
REDIS_BASE_URL=redis://redis:6379

### OpenSearch ###
OPENSEARCH_URL=http://opensearch:9200
OPENSEARCH_INDEX_PREFIX=test_index
REDIS_BASE_URL=redis://redis:6379
DEFAULT_BUCKET_AWS_DEFAULT_REGION=eu-west-2
DEFAULT_BUCKET_AWS_ACCESS_KEY_ID=foo
DEFAULT_BUCKET_AWS_SECRET_ACCESS_KEY=bar
DEFAULT_BUCKET=baz
ES_APM_ENABLED=False

### Django ###
DJANGO_SECRET_KEY=changeme
DJANGO_SETTINGS_MODULE=config.settings.local
# OAuth2 settings for Django Admin access
ADMIN_OAUTH2_ENABLED=False
ADMIN_OAUTH2_TOKEN_FETCH_URL=http://localhost:8100/o/token
ADMIN_OAUTH2_USER_PROFILE_URL=
ADMIN_OAUTH2_AUTH_URL=http://localhost:8100/o/authorize
ADMIN_OAUTH2_CLIENT_ID=oauth2-client-id
ADMIN_OAUTH2_CLIENT_SECRET=oauth2-secret-id
ADMIN_OAUTH2_REDIRECT_URL=http://localhost:8000/oauth/callback
# If you’re working with data-hub-frontend and mock-sso, DJANGO_SUPERUSER_EMAIL should
# be the same as MOCK_SSO_USERNAME in mock-sso’s .env file, and
# DJANGO_SUPERUSER_SSO_EMAIL_USER_ID the same as DJANGO_SUPERUSER_EMAIL in data-hub-api .env file otherwise the user may not exist
[email protected]
DJANGO_SUPERUSER_PASSWORD=foobarbaz
[email protected]
# If SUPERUSER_ACCESS_TOKEN is given a value, an access token for the
# superuser with that value will be created when the server comes up.
# The superuser should have an SSO email user ID set for this to work.
SUPERUSER_ACCESS_TOKEN=ditStaffToken

### Email Domain Allowlist ###
DIT_EMAIL_DOMAINS=trade.gov.uk,digital.trade.gov.uk

### PyTest ###
ALLOW_TEST_FIXTURE_SETUP=True

### Disable output buffering in Python ###
PYTHONUNBUFFERED=1

### AWS ###
AWS_DEFAULT_REGION=eu-west-2
AWS_ACCESS_KEY_ID=foo
AWS_SECRET_ACCESS_KEY=bar
DEFAULT_BUCKET=upload.datahub.dev.uktrade.io
SSO_ENABLED=True
STAFF_SSO_BASE_URL=http://mock-sso:8080/
STAFF_SSO_AUTH_TOKEN=sso-token

### Activity Stream config ###
ACTIVITY_STREAM_ACCESS_KEY_ID=some-id
ACTIVITY_STREAM_SECRET_ACCESS_KEY=some-secret
ACTIVITY_STREAM_OUTGOING_URL=http://activity.stream/
ACTIVITY_STREAM_OUTGOING_ACCESS_KEY_ID=some-outgoing-id
ACTIVITY_STREAM_OUTGOING_SECRET_ACCESS_KEY=some-outgoing-secret

### Market Access service config ###
MARKET_ACCESS_ACCESS_KEY_ID=market-access-id
MARKET_ACCESS_SECRET_ACCESS_KEY=market-access-key
PAAS_IP_ALLOWLIST=1.2.3.4
# Set this when using local environment
# DISABLE_PAAS_IP_CHECK=true

DIT_EMAIL_DOMAINS=trade.gov.uk,digital.trade.gov.uk
### Data Flow API Config ###
DATA_FLOW_API_ACCESS_KEY_ID=data-flow-api-id
DATA_FLOW_API_SECRET_ACCESS_KEY=data-flow-api-access-key
DATA_HUB_ENQUIRY_MGMT_HAWK_ID=data-hub-enquiry-mgmt-hawk-id
DATA_HUB_ENQUIRY_MGMT_HAWK_SECRET_KEY=data-hub-enquiry-mgmt-hawk-secret-key

### Datahub Frontend config ###
DATA_HUB_FRONTEND_ACCESS_KEY_ID=frontend-key-id
DATA_HUB_FRONTEND_SECRET_ACCESS_KEY=frontend-key

# Determines the docker-compose project - by default, containers with the same
# project name share a network and are able to communicate with each other
COMPOSE_PROJECT_NAME=data-hub
# Some extra ENV variables to make superuser creation easier on docker copies
# If you're working with data-hub-frontend and mock-sso, DJANGO_SUPERUSER_EMAIL should
# be the same as MOCK_SSO_USERNAME in mock-sso's .env file, and
# DJANGO_SUPERUSER_SSO_EMAIL_USER_ID the same as DJANGO_SUPERUSER_EMAIL in data-hub-api .env file otherwise the user may not exist
[email protected]
DJANGO_SUPERUSER_PASSWORD=foobarbaz
[email protected]

# If SUPERUSER_ACCESS_TOKEN is given a value, an access token for the
# superuser with that value will be created when the container comes up.
# The superuser should have an SSO email user ID set for this to work.
SUPERUSER_ACCESS_TOKEN=ditStaffToken

# Settings for Elasticsearch APM.
ES_APM_ENABLED=False
# ES_APM_SERVICE_NAME=datahub
# ES_APM_SECRET_TOKEN=
# ES_APM_SERVER_URL=http://localhost:8200
# ES_APM_ENVIRONMENT='develop'

# OAuth2 settings for Django Admin access
ADMIN_OAUTH2_ENABLED=False
ADMIN_OAUTH2_TOKEN_FETCH_URL=http://localhost:8100/o/token
ADMIN_OAUTH2_USER_PROFILE_URL=
ADMIN_OAUTH2_AUTH_URL=http://localhost:8100/o/authorize
ADMIN_OAUTH2_CLIENT_ID=oauth2-client-id
ADMIN_OAUTH2_CLIENT_SECRET=oauth2-secret-id
ADMIN_OAUTH2_REDIRECT_URL=http://localhost:8000/oauth/callback

# dnb-service settings
DNB_SERVICE_BASE_URL=http://api-dnb:8000/api/
# Generated through django-rest-framework on dnb-service
DNB_SERVICE_TOKEN=cc373a2a49ce7143817a9d036fa2a0be92da0d6a
DNB_MAX_COMPANIES_IN_TREE_COUNT=1000

# Python specific env vars
PYTHONUNBUFFERED=1
### Dun & Bradstreet service Config ###
DNB_SERVICE_BASE_URL=https://dnb-service.dev.datahub.uktrade.digital/api/
DNB_SERVICE_TOKEN=some-token

# Consent Service settings
### Consent service config ###
CONSENT_SERVICE_BASE_URL=http://mock-third-party-services:8555
CONSENT_SERVICE_HAWK_ID=dummyId
CONSENT_SERVICE_HAWK_KEY=dummyKey

DATAHUB_NOTIFICATION_API_KEY=
OMIS_NOTIFICATION_API_KEY=
INVESTMENT_NOTIFICATION_API_KEY=
INTERACTION_NOTIFICATION_API_KEY=
EXPORT_WIN_NOTIFICATION_API_KEY=
### Overseas Market Introduction Service config ###
OMIS_PUBLIC_ACCESS_KEY_ID=access-key-id
OMIS_PUBLIC_SECRET_ACCESS_KEY=secret-access-key