- sync the logs you want locally
First, download the logs you want, e.g.:
export YEAR=2020
export MONTH=03
export DAY=06
export HOUR=16
export S3BUCKET=my-alb-log-bucket
export AWS_ACCOUNT_ID=123321321321
export REGION=us-east-1
export ALB_NAME=ingress
export SOMETHING=alb-foo  # the S3 prefix configured when access logging was enabled on the ALB
mkdir -p ./logs/${YEAR}/${MONTH}/${DAY}/
aws s3 sync \
s3://${S3BUCKET}/${SOMETHING}/AWSLogs/${AWS_ACCOUNT_ID}/elasticloadbalancing/${REGION}/${YEAR}/${MONTH}/${DAY}/ \
./logs/${YEAR}/${MONTH}/${DAY}/ \
--exclude="*" \
--include "*${ALB_NAME}*${YEAR}${MONTH}${DAY}T${HOUR}*"
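If you'd rather do the same download from Python, here is a rough boto3 equivalent. It's only a sketch that assumes the same bucket layout and the env vars exported above; boto3 is not necessarily part of this repo's requirements:
# sketch only: the same targeted download with boto3 instead of the aws cli.
# assumes the env vars from the step above are exported and boto3 is installed.
import os
import boto3

bucket = os.environ["S3BUCKET"]
prefix = (
    f"{os.environ['SOMETHING']}/AWSLogs/{os.environ['AWS_ACCOUNT_ID']}/"
    f"elasticloadbalancing/{os.environ['REGION']}/"
    f"{os.environ['YEAR']}/{os.environ['MONTH']}/{os.environ['DAY']}/"
)
stamp = f"{os.environ['YEAR']}{os.environ['MONTH']}{os.environ['DAY']}T{os.environ['HOUR']}"
dest = os.path.join("logs", os.environ["YEAR"], os.environ["MONTH"], os.environ["DAY"])
os.makedirs(dest, exist_ok=True)

s3 = boto3.client("s3")
for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        # same filter as the --include above: ALB name plus the hour we care about
        if os.environ["ALB_NAME"] in key and stamp in key:
            s3.download_file(bucket, key, os.path.join(dest, os.path.basename(key)))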
- docker-compose up
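Before running the parser, it's worth checking that Elasticsearch is actually answering. A quick sketch (not part of this repo), assuming the compose file publishes it on localhost:9200 and the requests package is available:
# sketch only: poll Elasticsearch until it answers before feeding it logs.
import time
import requests

for _ in range(30):
    try:
        health = requests.get("http://localhost:9200/_cluster/health", timeout=2).json()
        print("elasticsearch is up, cluster status:", health["status"])
        break
    except requests.exceptions.RequestException:
        time.sleep(2)
else:
    raise SystemExit("elasticsearch never came up on localhost:9200")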
- parse logs into elastic
find ./logs/${YEAR}/${MONTH}/${DAY} -type f -name "*.gz" -exec ./main.py -f {} \;
- get the ES + Kibana data dir ready; the Elasticsearch image runs as UID 1000 (group 0), so the bind-mounted dir needs to be owned by 1000:0
mkdir data-elasticsearch
sudo chmod g+rwx data-elasticsearch
sudo chown -R 1000:0 data-elasticsearch
- docker-compose up
- disable kibana telemetry
Once it's up, you can disable telemetry like so:
curl -v http://localhost:5601/api/telemetry/v2/optIn \
-H "kbn-version: 7.6.0" \
-H "Accept: application/json, text/plain, */*" \
-H "Content-Type: application/json;charset=utf-8" \
--data '{"enabled":false}'
- create a python venv in the localvenv dir
python3 -m venv localvenv
- activate the venv (or use direnv allow)
source localvenv/bin/activate
- install the deps
pip install --upgrade pip
pip install -r requirements.txt
- write code
- don't forget to freeze deps
pip freeze --path ./localvenv/lib/python3.8/site-packages > requirements.txt
The fastest way I've found to parse the lines is with the regex by @jweyrich here
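To give an idea of the overall shape, here is a simplified sketch of a parser along those lines. It is not the actual main.py: the pattern below only captures a handful of fields (the real @jweyrich regex covers the full ALB log format), and the index name and endpoint are assumptions.
#!/usr/bin/env python3
# simplified sketch: read one gzipped ALB log file, pull a few fields out of
# each line with a regex, and bulk-index them into Elasticsearch.
# the regex here is a cut-down illustration; the @jweyrich regex referenced
# above handles every ALB field. index name and localhost:9200 are assumptions.
import argparse
import gzip
import re

from elasticsearch import Elasticsearch, helpers  # assumes the elasticsearch client is in requirements.txt

LINE_RE = re.compile(
    r'(?P<type>[^ ]+) (?P<timestamp>[^ ]+) (?P<elb>[^ ]+) '
    r'(?P<client>[^ ]+) (?P<target>[^ ]+) .*? "(?P<request>[^"]+)"'
)

def docs(path):
    with gzip.open(path, "rt") as fh:
        for line in fh:
            m = LINE_RE.match(line)
            if m:
                yield {"_index": "alb-logs", "_source": m.groupdict()}

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("-f", "--file", required=True, help="gzipped ALB log file")
    args = parser.parse_args()
    es = Elasticsearch(["http://localhost:9200"])
    helpers.bulk(es, docs(args.file))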