# AEMO - Fribourg
## Description

TODO
## Dev environment with docker-compose
Prerequisites:

- Docker & docker-compose installed on your system
- A "docker" group exists on your system (`sudo groupadd docker` will simply fail if it already exists)
- Your user is in the "docker" group (`sudo usermod -aG docker $USER`, then restart your system)
### Setup
You first have to create the untracked file `settings/__init__.py` to be able to run the docker environment.

Use the command line: `echo "from settings.dev_docker import *" > settings/__init__.py`

Then add your POST API credentials to it. `settings/__init__.py` should look like:
```python
from settings.dev_docker import *

POST_API_USER = 'your user'
POST_API_PASSWORD = 'your password'
```
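If you would rather not hard-code the credentials in the file, the same two settings can be filled from environment variables. This is only a sketch of an alternative, not something the project requires; the variable names are assumptions:

```python
# settings/__init__.py -- hypothetical variant reading credentials from the environment
import os

from settings.dev_docker import *

POST_API_USER = os.environ.get('POST_API_USER', '')
POST_API_PASSWORD = os.environ.get('POST_API_PASSWORD', '')
```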
### Quick start
- To run a dev environment: `docker-compose up -d`
- Then you'll be able to reach your local environment at `localhost:8000`
- To collect static files: `docker-compose exec web /src/manage.py collectstatic`
- To restart & rebuild your environment: `docker-compose down --remove-orphans && docker-compose up -d --build`
- To use the Python debugger (breakpoint): `docker attach $(docker-compose ps -q web)` (see the sketch below)
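To actually hit a breakpoint, add a `breakpoint()` call in the code you want to inspect before attaching to the web container. A minimal sketch; the view name and module are hypothetical and not part of this project:

```python
# views.py -- hypothetical view, only to illustrate the debugger workflow
from django.http import HttpResponse


def debug_me(request):
    breakpoint()  # pdb pauses here; interact with it through `docker attach`
    return HttpResponse("ok")
```

When you are done, detach with `Ctrl-p Ctrl-q` so the container keeps running.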
### Clone production data in your dev environment
There are three Fabric tasks you can use to clone & prepare production data in your dev environment:

- `fab -r docker-dev -H <HOST> download-remote-data` dumps the remote (production) DB, downloads it and synchronizes media
- `fab -r docker-dev import-db-in-dev` overwrites the local DB with the production DB
- `fab -r docker-dev create-admin-in-dev` creates a superuser with credentials admin/admin

You can also run all tasks in a single command:

`fab -r docker-dev -H <HOST> download-remote-data import-db-in-dev create-admin-in-dev`
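The real tasks live in the repo's `docker-dev` directory, which is what `-r docker-dev` points `fab` at. As an illustration only, a task like `create-admin-in-dev` could be written roughly as follows, assuming Fabric 2 and Django 3.0+'s non-interactive `createsuperuser`; the actual fabfile in the repository may differ:

```python
# docker-dev/fabfile.py -- illustrative sketch only, not the project's real fabfile
from fabric import task


@task
def create_admin_in_dev(c):
    """Create an admin/admin superuser in the running dev containers."""
    # Fabric exposes underscores as dashes, so this is invoked as
    # `fab -r docker-dev create-admin-in-dev`.
    c.run(
        "docker-compose exec -T web sh -c "
        '"DJANGO_SUPERUSER_PASSWORD=admin /src/manage.py createsuperuser '
        '--noinput --username admin --email admin@example.com"'
    )
```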