| UNIX name | Owner | Status |
|---|---|---|
| platformsh-store-logs-at-s3 | 7x | stable |

| Version | Compatible with |
|---|---|
| N/A | N/A |
On Platform.sh, all log files are automatically trimmed to 10 MB. In some cases, though, it is vital to have complete access logs, for example to analyze traffic.

This solution provides a cron job that uploads the logs to AWS S3, writing a separate file for each day, so you end up with a complete log history. It handles only access logs, but it is relatively easy to adapt to any other log.
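The upload script itself is not reproduced here. A minimal sketch of what `cron/upload_logs_to_s3.sh` might do, assuming the project variables described below and a hypothetical access-log path, could look like this:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of cron/upload_logs_to_s3.sh -- the real script ships
# with the extension; only the overall shape is illustrated here.
set -eu

# Defaults mirror the example values used when creating the project variables.
LOGS_S3_BUCKET="${LOGS_S3_BUCKET:-platform.sh-logs}"
LOGS_S3_FOLDER="${LOGS_S3_FOLDER:-my-project}"
LOGS_TMP_PATH="${LOGS_TMP_PATH:-/app/var}"

# One object per day: <bucket>/<folder>/access-YYYY-MM-DD.log
DAY="$(date -u +%Y-%m-%d)"
DEST="s3://${LOGS_S3_BUCKET}/${LOGS_S3_FOLDER}/access-${DAY}.log"
echo "uploading access log to ${DEST}"

# The actual upload (requires the AWS CLI installed in the build hook):
# aws s3 cp "${LOGS_TMP_PATH}/access-${DAY}.log" "${DEST}"
```

Writing one object per day is what makes the daily files easy to list and fetch later.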
Create the required project-level variables with the Platform.sh CLI (the credential values are left empty here; fill in your own):

```bash
platform variable:create --level=project --name=AWS_ACCESS_KEY_ID --value= --json=false --sensitive=true --prefix=env --visible-build=true --visible-runtime=true
platform variable:create --level=project --name=AWS_SECRET_ACCESS_KEY --value= --json=false --sensitive=true --prefix=env --visible-build=true --visible-runtime=true
platform variable:create --level=project --name=LOGS_S3_BUCKET --value=platform.sh-logs --json=false --sensitive=false --prefix=env --visible-build=false --visible-runtime=true
platform variable:create --level=project --name=LOGS_S3_FOLDER --value=my-project --json=false --sensitive=false --prefix=env --visible-build=false --visible-runtime=true
platform variable:create --level=project --name=LOGS_TMP_PATH --value=/app/var --json=false --sensitive=false --prefix=env --visible-build=false --visible-runtime=true
```
In the build hook, install the AWS CLI whenever the credentials are set:

```bash
# Install the AWS CLI only when S3 credentials are available.
if [ ! -z "$AWS_ACCESS_KEY_ID" ] && [ ! -z "$AWS_SECRET_ACCESS_KEY" ]; then
    pip install futures
    pip install awscli --upgrade --user 2>/dev/null
fi
```
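Note that `pip install --user` places the `aws` binary under `~/.local/bin`, which is not on `PATH` by default. One way to make it callable from the cron job (an assumption about your setup, not part of the extension itself) is to export the path before the job runs:

```shell
# Make the user-installed AWS CLI reachable; ~/.local/bin is pip's default
# --user script directory on Linux.
export PATH="$HOME/.local/bin:$PATH"
```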
Finally, add the cron job to the application configuration, restricted to the `master` branch:

```yaml
upload_logs_to_s3:
    spec: '0 * * * *'
    cmd: |
        if [ "$PLATFORM_BRANCH" = master ]; then
            bash cron/upload_logs_to_s3.sh
        fi
```
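Once uploads are running, the daily objects can be pulled back down for analysis. A hypothetical example using the sample bucket and folder names from the variables above; the command is only printed here, so run it on a machine where the AWS CLI is configured with the same credentials:

```shell
# Build the S3 source prefix for one month of access logs and print the
# corresponding fetch command. MONTH is a hypothetical placeholder.
MONTH="2024-05"
SRC="s3://platform.sh-logs/my-project/"
echo "aws s3 cp ${SRC} ./logs/ --recursive --exclude '*' --include 'access-${MONTH}-*.log'"
```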