Gestion AEMO Fribourg

AEMO - Fribourg

Description

TODO

Dev environment with docker-compose

Prerequisites:

  • Docker & docker-compose installed on your system
  • A "docker" group exists on your system (sudo groupadd docker; the command simply fails if the group already exists)
  • Your user is in the "docker" group (sudo usermod -aG docker $USER, then log out and back in, or restart your system)
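A quick way to check the last two prerequisites on your host (a sketch; the suggested remediation commands mirror the list above):

```shell
# Check that the "docker" group exists and that the current user is a member.
getent group docker || echo 'docker group missing: run "sudo groupadd docker"'
if id -nG | grep -qw docker; then
    echo "current user is in the docker group"
else
    echo 'not in the docker group: run "sudo usermod -aG docker $USER", then log out and back in'
fi
```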

Setup

You first have to create the untracked file "settings/__init__.py" to be able to run the docker environment. From the project root: echo "from settings.dev_docker import *" > settings/__init__.py

Then, you have to add your POST API credentials to it. settings/__init__.py should look like:

from settings.dev_docker import *


POST_API_USER = 'your user'
POST_API_PASSWORD = 'your password'
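Why this layout works: from settings.dev_docker import * pulls every name from the base module into settings/__init__.py, and anything defined afterwards overrides it. A minimal, self-contained illustration of that pattern (the package name, module contents and values below are invented for the demo; only the layout mirrors the project):

```python
import os
import sys
import tempfile
import textwrap

# Build a throwaway package that mimics the settings/ layout described above.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "demo_settings"))

# Stand-in for settings/dev_docker.py (contents invented for the demo).
with open(os.path.join(root, "demo_settings", "dev_docker.py"), "w") as f:
    f.write("DEBUG = True\nPOST_API_USER = ''\n")

# Stand-in for settings/__init__.py: import everything, then override.
with open(os.path.join(root, "demo_settings", "__init__.py"), "w") as f:
    f.write(textwrap.dedent("""\
        from demo_settings.dev_docker import *

        POST_API_USER = 'your user'
        POST_API_PASSWORD = 'your password'
    """))

sys.path.insert(0, root)
import demo_settings

print(demo_settings.DEBUG)          # base value survives the star import: True
print(demo_settings.POST_API_USER)  # overridden after the import: 'your user'
```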

Quick start

To run a dev environment: docker-compose up -d
Then you'll be able to reach your local environment on localhost:8000
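The containers can take a few seconds to come up, so the page may not answer immediately after docker-compose up -d. A small helper (not part of the repo; a sketch) that waits until something is listening on the port:

```python
import socket
import time

def wait_for_port(host: str = "localhost", port: int = 8000, timeout: float = 30.0) -> bool:
    """Return True once a TCP connection to host:port succeeds, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(0.5)
    return False

# Example: call wait_for_port() and then open http://localhost:8000
```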

To collect static: docker-compose exec web /src/manage.py collectstatic

To restart & rebuild your environment: docker-compose down --remove-orphans && docker-compose up -d --build

To use the Python debugger (breakpoint()): docker attach $(docker-compose ps -q web)
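Attaching is needed because breakpoint() drops into pdb on the container's stdin/stdout, which docker-compose up -d detaches. Independent of docker, the hook behind breakpoint() is replaceable (PEP 553), which this self-contained sketch demonstrates by swapping pdb for a no-op logger:

```python
import sys

# Replace the default hook (pdb.set_trace) with a no-op logger for this demo.
def quiet_hook(*args, **kwargs):
    print("breakpoint() hit (skipped)")

sys.breakpointhook = quiet_hook

def add(a, b):
    breakpoint()  # calls quiet_hook instead of dropping into pdb
    return a + b

print(add(2, 3))  # prints 5 without ever entering the debugger
```

The PYTHONBREAKPOINT environment variable offers the same control from outside the process: setting it to 0 disables all breakpoint() calls for a one-off run.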

Clone production data in your dev environment

There are three fabric tasks you can use to clone & prepare production data in your dev environment:

  • fab -r docker-dev -H <HOST> download-remote-data: dumps the remote (production) DB, downloads it and synchronizes media
  • fab -r docker-dev import-db-in-dev: overwrites the local DB with the production DB
  • fab -r docker-dev create-admin-in-dev: creates a superuser with credentials admin/admin

You can also run all three tasks in a single command: fab -r docker-dev -H <HOST> download-remote-data import-db-in-dev create-admin-in-dev