
Dev #1 (Merged)
40 commits, merged Dec 3, 2023
f7dbaa1  Init commit, created the backbone structure for the test (alex-au-922, Dec 1, 2023)
b92505f  Small Commit (alex-au-922, Dec 1, 2023)
259b4f4  Adding tests for producer and consumer's code (alex-au-922, Dec 2, 2023)
8a702ee  Added test for the producer (alex-au-922, Dec 2, 2023)
d075762  Debugging the outputs for the step load-dotenv (alex-au-922, Dec 2, 2023)
6a1fc3b  Debugging the ports for docker network in github actions (alex-au-922, Dec 2, 2023)
4b10b63  Debugging the error of queue not found in rabbitmq (alex-au-922, Dec 2, 2023)
1cba6c4  Updated the code coverage testing (alex-au-922, Dec 2, 2023)
f8cf512  Updated the test (alex-au-922, Dec 2, 2023)
69c5d73  Updated the id-token permission (alex-au-922, Dec 2, 2023)
6b02030  Updated the permissions (alex-au-922, Dec 2, 2023)
2ec6ac2  Updated coverage.svg (github-actions[bot], Dec 2, 2023)
0460810  Updated the CICD test pipeline (alex-au-922, Dec 2, 2023)
be39f26  Updated the coverage step (alex-au-922, Dec 2, 2023)
f4bca34  Solving the cache hit problem (alex-au-922, Dec 2, 2023)
770eda2  Debugging the coverage file (alex-au-922, Dec 2, 2023)
35fd2dd  Debugging the permissions (alex-au-922, Dec 2, 2023)
dec5d53  Updated coverage.svg (github-actions[bot], Dec 2, 2023)
550a3a1  Ignore the coverage on test files (alex-au-922, Dec 2, 2023)
e7d667a  Merge branch 'dev' of https://github.com/alex-au-922/producer_consume… (alex-au-922, Dec 2, 2023)
1290ec4  Updated the coverage report scope to remove test files (alex-au-922, Dec 2, 2023)
76778c0  Updated the postgresql test case (alex-au-922, Dec 2, 2023)
c2ccbf4  Updated the cicd (alex-au-922, Dec 2, 2023)
c446ac2  Init database (alex-au-922, Dec 2, 2023)
e31dee1  Added the init db test script (alex-au-922, Dec 2, 2023)
a3f20fa  Updated the env parameter (alex-au-922, Dec 2, 2023)
cead93e  Updated coverage.svg (github-actions[bot], Dec 2, 2023)
6155219  updated the report command (alex-au-922, Dec 2, 2023)
4ec9480  Merge branch 'dev' of https://github.com/alex-au-922/producer_consume… (alex-au-922, Dec 2, 2023)
53e4a08  Added tests for rabbitmq (alex-au-922, Dec 2, 2023)
b4d0c70  Updated the tests (alex-au-922, Dec 2, 2023)
bcb7b18  Updated coverage.svg (github-actions[bot], Dec 2, 2023)
08d3cd5  changing the names for better consistency (alex-au-922, Dec 3, 2023)
b4ec3d8  Merge branch 'dev' of https://github.com/alex-au-922/producer_consume… (alex-au-922, Dec 3, 2023)
7964eeb  Updated the main test (alex-au-922, Dec 3, 2023)
ea6b124  Updated coverage.svg (github-actions[bot], Dec 3, 2023)
fa557c5  Updated the deps (alex-au-922, Dec 3, 2023)
d3dcfdf  Merge branch 'dev' of https://github.com/alex-au-922/producer_consume… (alex-au-922, Dec 3, 2023)
60a4a6f  udpated the test (alex-au-922, Dec 3, 2023)
dbe4a10  Updated the test and readme (alex-au-922, Dec 3, 2023)
49 changes: 49 additions & 0 deletions .env
@@ -0,0 +1,49 @@
TZ=Asia/Hong_Kong

POSTGRES_VERSION_TAG=13
POSTGRES_PORT=5432
POSTGRES_USERNAME=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DATABASE=records
POSTGRES_BATCH_UPSERT_SIZE=5000
POSTGRES_VOLUME_DIR=./postgres-data

RABBITMQ_VERSION_TAG=3.12.10-management
RABBITMQ_USERNAME=rabbitmq
RABBITMQ_PASSWORD=rabbitmq
RABBITMQ_PORT=5672
RABBITMQ_WEBAPP_PORT=15672
RABBITMQ_POLLING_TIMEOUT=60
RABBITMQ_SOCKET_TIMEOUT=86400
RABBITMQ_VOLUME_DIR=./rabbitmq-data

RABBITMQ_QUEUE_NAME=filenames

AMAZON_LINUX_VERSION_TAG=2023.2.20231113.0

TARGET_FILE_DIR=./records_test
TARGET_FILE_EXTENSION=.csv

PRODUCER_LOG_LEVEL=INFO
PRODUCER_LOG_FORMAT="[%(asctime)s | %(levelname)s] {%(filename)s:%(lineno)d} >> %(message)s"
PRODUCER_LOG_DATE_FORMAT="%Y-%m-%d %H:%M:%S"
PRODUCER_LOG_DIR=./logs/producer
PRODUCER_LOG_RETENTION=7
PRODUCER_LOG_ROTATION=midnight

CONSUMER_LOG_LEVEL=INFO
CONSUMER_LOG_FORMAT="[%(asctime)s | %(levelname)s] {%(filename)s:%(lineno)d} >> %(message)s"
CONSUMER_LOG_DATE_FORMAT="%Y-%m-%d %H:%M:%S"
CONSUMER_LOG_DIR=./logs/consumer
CONSUMER_LOG_RETENTION=7
CONSUMER_LOG_ROTATION=midnight
CONSUMER_REPLICAS=16

CSV_PARSER_DELIMITER=","
CSV_PARSER_FILE_EXTENSION=.csv

GEN_NUM_SENSORS=1000
GEN_NUM_RECORDS=100000
GEN_START_DATE=2021-01-01
GEN_RECORD_INTERVAL=5
GEN_TIMEZONE=Asia/Hong_Kong
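For local runs outside Docker and CI, a file like this can be loaded with the python-dotenv package, or parsed by hand with a few lines of stdlib. A minimal sketch (a hypothetical helper, not part of this PR; the quote-stripping mirrors how a shell `source .env` treats the quoted log-format values):

```python
def load_dotenv(path: str = ".env") -> dict[str, str]:
    """Parse simple KEY=VALUE lines; blank lines and comments are skipped."""
    env: dict[str, str] = {}
    with open(path) as fh:
        for raw in fh:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Strip one layer of surrounding double quotes, as `source` would.
            env[key.strip()] = value.strip().strip('"')
    return env
```

`os.environ.update(load_dotenv())` then makes the values visible to any library that reads the process environment.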
257 changes: 257 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,257 @@
name: Producer Consumer CI Test
on:
  push:
    branches: ["dev"]
  workflow_dispatch:
jobs:
  load-dotenv:
    runs-on: ubuntu-latest
    outputs:
      target-file-dir: ${{ steps.load-dotenv.outputs.TARGET_FILE_DIR }}
      target-file-extension: ${{ steps.load-dotenv.outputs.TARGET_FILE_EXTENSION }}
      postgres-version-tag: ${{ steps.load-dotenv.outputs.POSTGRES_VERSION_TAG }}
      postgres-port: ${{ steps.load-dotenv.outputs.POSTGRES_PORT }}
      postgres-username: ${{ steps.load-dotenv.outputs.POSTGRES_USERNAME }}
      postgres-password: ${{ steps.load-dotenv.outputs.POSTGRES_PASSWORD }}
      postgres-database: ${{ steps.load-dotenv.outputs.POSTGRES_DATABASE }}
      rabbitmq-version-tag: ${{ steps.load-dotenv.outputs.RABBITMQ_VERSION_TAG }}
      rabbitmq-port: ${{ steps.load-dotenv.outputs.RABBITMQ_PORT }}
      rabbitmq-username: ${{ steps.load-dotenv.outputs.RABBITMQ_USERNAME }}
      rabbitmq-password: ${{ steps.load-dotenv.outputs.RABBITMQ_PASSWORD }}
      rabbitmq-queue-name: ${{ steps.load-dotenv.outputs.RABBITMQ_QUEUE_NAME }}
      rabbitmq-socket-timeout: ${{ steps.load-dotenv.outputs.RABBITMQ_SOCKET_TIMEOUT }}
      csv-parser-delimiter: ${{ steps.load-dotenv.outputs.CSV_PARSER_DELIMITER }}
      csv-parser-file-extension: ${{ steps.load-dotenv.outputs.CSV_PARSER_FILE_EXTENSION }}
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Load dotenv
        id: load-dotenv
        run: |
          set -o allexport
          source .env
          set +o allexport
          echo "TARGET_FILE_DIR=$TARGET_FILE_DIR" >> $GITHUB_OUTPUT
          echo "TARGET_FILE_EXTENSION=$TARGET_FILE_EXTENSION" >> $GITHUB_OUTPUT
          echo "POSTGRES_VERSION_TAG=$POSTGRES_VERSION_TAG" >> $GITHUB_OUTPUT
          echo "POSTGRES_PORT=$POSTGRES_PORT" >> $GITHUB_OUTPUT
          echo "POSTGRES_USERNAME=$POSTGRES_USERNAME" >> $GITHUB_OUTPUT
          echo "POSTGRES_PASSWORD=$POSTGRES_PASSWORD" >> $GITHUB_OUTPUT
          echo "POSTGRES_DATABASE=$POSTGRES_DATABASE" >> $GITHUB_OUTPUT
          echo "RABBITMQ_VERSION_TAG=$RABBITMQ_VERSION_TAG" >> $GITHUB_OUTPUT
          echo "RABBITMQ_PORT=$RABBITMQ_PORT" >> $GITHUB_OUTPUT
          echo "RABBITMQ_USERNAME=$RABBITMQ_USERNAME" >> $GITHUB_OUTPUT
          echo "RABBITMQ_PASSWORD=$RABBITMQ_PASSWORD" >> $GITHUB_OUTPUT
          echo "RABBITMQ_QUEUE_NAME=$RABBITMQ_QUEUE_NAME" >> $GITHUB_OUTPUT
          echo "RABBITMQ_SOCKET_TIMEOUT=$RABBITMQ_SOCKET_TIMEOUT" >> $GITHUB_OUTPUT
          echo "CSV_PARSER_DELIMITER=$CSV_PARSER_DELIMITER" >> $GITHUB_OUTPUT
          echo "CSV_PARSER_FILE_EXTENSION=$CSV_PARSER_FILE_EXTENSION" >> $GITHUB_OUTPUT
  test-producer:
    needs: load-dotenv
    runs-on: ubuntu-latest
    env:
      WATCH_FILE_PATTERNS: |
        producer/**/*.py
        producer/requirements-dev.txt
      COVERAGE_FILE: .coverage_producer
      WORKDIR: producer
    outputs:
      coverage-file-cache-path: ${{ steps.output-coverage-file.outputs.COVERAGE_FILE_CACHE_PATH }}
      coverage-file-cache-key: ${{ steps.output-coverage-file.outputs.COVERAGE_FILE_CACHE_KEY }}
    services:
      rabbitmq:
        image: rabbitmq:${{ needs.load-dotenv.outputs.rabbitmq-version-tag }}
        env:
          RABBITMQ_DEFAULT_USER: ${{ needs.load-dotenv.outputs.rabbitmq-username }}
          RABBITMQ_DEFAULT_PASS: ${{ needs.load-dotenv.outputs.rabbitmq-password }}
        options: >-
          --health-cmd "rabbitmq-diagnostics -q check_running"
          --health-interval 5s
          --health-timeout 30s
          --health-retries 3
        ports:
          - ${{ needs.load-dotenv.outputs.rabbitmq-port }}:5672
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - uses: actions/cache@v2
        id: cache
        with:
          path: ${{env.COVERAGE_FILE}}
          key: ${{ runner.os }}-coverage-producer-${{ hashFiles(env.WATCH_FILE_PATTERNS) }}
          restore-keys: |
            ${{ runner.os }}-coverage-producer-
      - uses: actions/setup-python@v4
        if: steps.cache.outputs.cache-hit != 'true'
        with:
          python-version: '3.11'
          cache: 'pip'
          cache-dependency-path: ${{env.WORKDIR}}/requirements-dev.txt
      - name: Install dependencies
        if: steps.cache.outputs.cache-hit != 'true'
        working-directory: ${{env.WORKDIR}}
        run: pip install -r requirements-dev.txt
      - name: Run tests
        if: steps.cache.outputs.cache-hit != 'true'
        run: |
          coverage run -m pytest -v producer/tests
        env:
          RABBITMQ_HOST: localhost
          RABBITMQ_PORT: ${{ needs.load-dotenv.outputs.rabbitmq-port }}
          RABBITMQ_USERNAME: ${{ needs.load-dotenv.outputs.rabbitmq-username }}
          RABBITMQ_PASSWORD: ${{ needs.load-dotenv.outputs.rabbitmq-password }}
          RABBITMQ_QUEUE_NAME: ${{ needs.load-dotenv.outputs.rabbitmq-queue-name }}
          RABBITMQ_SOCKET_TIMEOUT: ${{ needs.load-dotenv.outputs.rabbitmq-socket-timeout }}
          TARGET_FILE_DIR: ${{ needs.load-dotenv.outputs.target-file-dir }}
          TARGET_FILE_EXTENSION: ${{ needs.load-dotenv.outputs.target-file-extension }}
      - name: Output coverage file
        id: output-coverage-file
        run: |
          echo "COVERAGE_FILE_CACHE_PATH=${{env.COVERAGE_FILE}}" >> $GITHUB_OUTPUT
          echo "COVERAGE_FILE_CACHE_KEY=${{ runner.os }}-coverage-producer-${{ hashFiles(env.WATCH_FILE_PATTERNS) }}" >> $GITHUB_OUTPUT
  test-consumer:
    needs: load-dotenv
    runs-on: ubuntu-latest
    env:
      WATCH_FILE_PATTERNS: |
        consumer/**/*.py
        consumer/requirements-dev.txt
      COVERAGE_FILE: .coverage_consumer
      WORKDIR: consumer
    outputs:
      coverage-file-cache-path: ${{ steps.output-coverage-file.outputs.COVERAGE_FILE_CACHE_PATH }}
      coverage-file-cache-key: ${{ steps.output-coverage-file.outputs.COVERAGE_FILE_CACHE_KEY }}
    services:
      rabbitmq:
        image: rabbitmq:${{ needs.load-dotenv.outputs.rabbitmq-version-tag }}
        env:
          RABBITMQ_DEFAULT_USER: ${{ needs.load-dotenv.outputs.rabbitmq-username }}
          RABBITMQ_DEFAULT_PASS: ${{ needs.load-dotenv.outputs.rabbitmq-password }}
        options: >-
          --health-cmd "rabbitmq-diagnostics -q check_running"
          --health-interval 5s
          --health-timeout 30s
          --health-retries 3
        ports:
          - ${{ needs.load-dotenv.outputs.rabbitmq-port }}:5672
      postgres:
        image: postgres:${{ needs.load-dotenv.outputs.postgres-version-tag }}
        env:
          POSTGRES_USER: ${{ needs.load-dotenv.outputs.postgres-username }}
          POSTGRES_PASSWORD: ${{ needs.load-dotenv.outputs.postgres-password }}
          POSTGRES_DB: ${{ needs.load-dotenv.outputs.postgres-database }}
        options: >-
          --health-cmd pg_isready
          --health-interval 5s
          --health-timeout 30s
          --health-retries 3
        ports:
          - ${{ needs.load-dotenv.outputs.postgres-port }}:5432
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - uses: actions/cache@v2
        id: cache
        with:
          path: ${{env.COVERAGE_FILE}}
          key: ${{ runner.os }}-coverage-consumer-${{ hashFiles(env.WATCH_FILE_PATTERNS) }}
          restore-keys: |
            ${{ runner.os }}-coverage-consumer-
      - uses: actions/setup-python@v4
        if: steps.cache.outputs.cache-hit != 'true'
        with:
          python-version: '3.11'
          cache: 'pip'
          cache-dependency-path: ${{env.WORKDIR}}/requirements-dev.txt
      - name: Install dependencies
        if: steps.cache.outputs.cache-hit != 'true'
        working-directory: ${{env.WORKDIR}}
        run: pip install -r requirements-dev.txt
      - name: Run tests
        if: steps.cache.outputs.cache-hit != 'true'
        run: |
          coverage run -m pytest -v consumer/tests
        env:
          POSTGRES_HOST: localhost
          POSTGRES_PORT: ${{ needs.load-dotenv.outputs.postgres-port }}
          POSTGRES_USERNAME: ${{ needs.load-dotenv.outputs.postgres-username }}
          POSTGRES_PASSWORD: ${{ needs.load-dotenv.outputs.postgres-password }}
          POSTGRES_DATABASE: ${{ needs.load-dotenv.outputs.postgres-database }}
          RABBITMQ_HOST: localhost
          RABBITMQ_PORT: ${{ needs.load-dotenv.outputs.rabbitmq-port }}
          RABBITMQ_USERNAME: ${{ needs.load-dotenv.outputs.rabbitmq-username }}
          RABBITMQ_PASSWORD: ${{ needs.load-dotenv.outputs.rabbitmq-password }}
          RABBITMQ_QUEUE_NAME: ${{ needs.load-dotenv.outputs.rabbitmq-queue-name }}
          RABBITMQ_SOCKET_TIMEOUT: ${{ needs.load-dotenv.outputs.rabbitmq-socket-timeout }}
          CSV_PARSER_DELIMITER: ${{ needs.load-dotenv.outputs.csv-parser-delimiter }}
          CSV_PARSER_FILE_EXTENSION: ${{ needs.load-dotenv.outputs.csv-parser-file-extension }}
      - name: Output coverage file
        id: output-coverage-file
        run: |
          echo "COVERAGE_FILE_CACHE_PATH=${{env.COVERAGE_FILE}}" >> $GITHUB_OUTPUT
          echo "COVERAGE_FILE_CACHE_KEY=${{ runner.os }}-coverage-consumer-${{ hashFiles(env.WATCH_FILE_PATTERNS) }}" >> $GITHUB_OUTPUT
  coverage:
    needs: [test-producer, test-consumer]
    runs-on: ubuntu-latest
    permissions:
      contents: write
      id-token: write
      pages: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Retrieve producer coverage file
        uses: actions/cache@v2
        id: producer-cache
        with:
          path: ${{ needs.test-producer.outputs.coverage-file-cache-path }}
          key: ${{ needs.test-producer.outputs.coverage-file-cache-key }}
          restore-keys: |
            ${{ runner.os }}-coverage-producer-
      - name: Retrieve consumer coverage file
        uses: actions/cache@v2
        id: consumer-cache
        with:
          path: ${{ needs.test-consumer.outputs.coverage-file-cache-path }}
          key: ${{ needs.test-consumer.outputs.coverage-file-cache-key }}
          restore-keys: |
            ${{ runner.os }}-coverage-consumer-
      - uses: actions/setup-python@v4
        with:
          python-version: '3.11'
      - name: Install dependencies
        run: pip install coverage
      - name: Combine coverage files
        run: |
          coverage combine ${{ needs.test-producer.outputs.coverage-file-cache-path }} ${{ needs.test-consumer.outputs.coverage-file-cache-path }}
      - name: Generate coverage report
        run: |
          coverage report --omit="*/tests/*" -m
          coverage html --omit="*/tests/*"
      - name: upload artifact
        uses: actions/upload-pages-artifact@v1
        with:
          path: ./htmlcov/
      - name: deploy to Github Pages
        uses: actions/deploy-pages@v2
        id: deployment
      - name: Coverage Badge
        uses: tj-actions/coverage-badge-py@v2
      - name: Verify Changed files
        uses: tj-actions/verify-changed-files@v16
        id: verify-changed-files
        with:
          files: coverage.svg
      - name: Commit files
        if: steps.verify-changed-files.outputs.files_changed == 'true'
        run: |
          git config --local user.email "github-actions[bot]@users.noreply.github.com"
          git config --local user.name "github-actions[bot]"
          git add coverage.svg
          git commit -m "Updated coverage.svg"
      - name: Push changes
        if: steps.verify-changed-files.outputs.files_changed == 'true'
        uses: ad-m/github-push-action@master
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          branch: ${{ github.ref }}
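Both test jobs key their coverage cache on `hashFiles(env.WATCH_FILE_PATTERNS)`, so tests are skipped entirely when no watched file has changed since the last cached run. A rough local analogue of that keying, sketched with hypothetical helper names (GitHub's built-in `hashFiles` is also SHA-256 based, though its exact construction may differ):

```python
import glob
import hashlib

def cache_key(patterns: list[str], prefix: str = "coverage-producer") -> str:
    """Hash the contents of every file matched by the glob patterns
    into one hex digest, mimicking the workflow's cache key."""
    digest = hashlib.sha256()
    for pattern in patterns:
        # Sort so the key is stable across runs regardless of filesystem order.
        for path in sorted(glob.glob(pattern, recursive=True)):
            with open(path, "rb") as fh:
                digest.update(fh.read())
    return f"{prefix}-{digest.hexdigest()}"
```

The same idea explains the `restore-keys` prefix fallback: when no exact key matches, the most recent cache entry whose key starts with `coverage-producer-` is restored instead.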
9 changes: 9 additions & 0 deletions .gitignore
@@ -0,0 +1,9 @@
.pytest_cache
__pycache__
.mypy_cache
records
records_test
logs
postgres-data
postgres-logs
rabbitmq-data
80 changes: 80 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,80 @@
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: check-yaml
      - id: end-of-file-fixer
      - id: trailing-whitespace
      - id: check-added-large-files
      - id: check-json
      - id: check-toml
      - id: detect-aws-credentials
        args: [--allow-missing-credentials]
      - id: detect-private-key
  - repo: https://github.com/psf/black
    rev: 23.11.0
    hooks:
      - id: black
  - repo: https://github.com/python-poetry/poetry
    rev: 1.7.0
    hooks:
      - id: poetry-check
        name: poetry check producer
        args: ["-C", "./producer"]
      - id: poetry-lock
        name: poetry lock producer
        args: ["-C", "./producer"]
      - id: poetry-export
        name: poetry export producer dev dependencies
        args: [
          "-C", "./producer",
          "-f", "requirements.txt",
          "-o", "./producer/requirements-dev.txt",
          "--without-hashes",
          "--with", "dev"
        ]
        always_run: true
      - id: poetry-export
        name: poetry export producer dependencies
        args: [
          "-C", "./producer",
          "-f", "requirements.txt",
          "-o", "./producer/requirements.txt",
          "--without-hashes"
        ]
        always_run: true
      - id: poetry-check
        name: poetry check consumer
        args: ["-C", "./consumer"]
      - id: poetry-lock
        name: poetry lock consumer
        args: ["-C", "./consumer"]
      - id: poetry-export
        name: poetry export consumer dev dependencies
        args: [
          "-C", "./consumer",
          "-f", "requirements.txt",
          "-o", "./consumer/requirements-dev.txt",
          "--without-hashes",
          "--with", "dev"
        ]
        always_run: true
      - id: poetry-export
        name: poetry export consumer dependencies
        args: [
          "-C", "./consumer",
          "-f", "requirements.txt",
          "-o", "./consumer/requirements.txt",
          "--without-hashes"
        ]
        always_run: true
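With this config in place, the hooks run automatically on `git commit`; they can also be invoked by hand. A typical local workflow (assumes the `pre-commit` CLI is already installed, e.g. via `pip install pre-commit`):

```shell
pre-commit install          # register the hooks in .git/hooks
pre-commit run --all-files  # run every hook against the whole repository once
```

The `poetry-export` hooks are marked `always_run`, so the `requirements*.txt` files consumed by the CI jobs above are regenerated on every commit, keeping them in sync with the Poetry lock files.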