
Commit 314cccfd authored by Liam Hayes

updated docs

parent 7b542a09
# Local Deployment Guide
How to get this django project running on your local machine.
## pyenv and poetry
pyenv and poetry are requirements of this project, but also just useful tools to have when developing with python.
Install `pyenv`:
- `curl | bash`
- add the following to your `~/.bashrc`:
  ```
  # pyenv
  export PATH="$HOME/.pyenv/bin:$PATH"
  eval "$(pyenv init -)"
  eval "$(pyenv virtualenv-init -)"
  ```
- `source ~/.bashrc`
Install `python-3.8.2` with `pyenv` (the apt packages are build dependencies for compiling python):
```
sudo apt update && sudo apt install -y make build-essential \
    libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev \
    wget curl llvm libncurses5-dev libncursesw5-dev xz-utils tk-dev \
    libffi-dev liblzma-dev python-openssl git
pyenv install 3.8.2
```
Install `poetry`:
- `curl -sSL | python`
- add `source $HOME/.poetry/env` to your `~/.bashrc`
- `source ~/.bashrc`
## Install
Add the following to your `~/.bashrc`:
```
# django settings
export DJANGO_LOCATION='home'
export DJANGO_SECRET_KEY='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
```
Create your own secret key of random characters.
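A quick way to generate one (a standard-library sketch; Django itself ships `django.core.management.utils.get_random_secret_key`, which draws from this same alphabet):

```python
import secrets
import string

# 50 random characters, using the character set Django's own
# get_random_secret_key helper uses
chars = string.ascii_lowercase + string.digits + "!@#$%^&*(-_=+)"
secret_key = "".join(secrets.choice(chars) for _ in range(50))
print(secret_key)
```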
Then do the following:
```
source ~/.bashrc
git clone
poetry install  # install all python packages
./manage migrate
./manage runserver
```
Browse to `localhost:8000` to see the website.
## Import the relevant data
The script `scripts/manage_data/` fills the django database with useful data
(validation wind farms, plots and results, as well as wind turbine power curves).
It gets this data from `drive1`, a hard drive that I (Liam Hayes) filled with
wind-related data from "", "ENTSO-E transparency platform", ""
and "Global Wind Atlas".
# Remote Deployment Guide
## AWS EC2 setup
This explains how to deploy this website to run on an Amazon Web Services EC2 instance.
It assumes that you've got this django project working on your local machine first.
- Launch a t2.small Ubuntu 18.04 instance.
On local machine:
- SSH: "ssh windatlas"
On remote machine:
- Install pyenv, poetry, postgres, nginx.
- `sudo apt install libpq-dev`, a package required for postgres.
- git clone the repo
- `cd windatlas`
- `poetry install` to install the required python packages
## Database
The remote django project uses a PostgreSQL database.
postgres and libpq-dev should be installed system packages.
psycopg2 should be an installed python package.
On remote machine `sudo -u postgres psql` (create a real password for the postgres database):
```
GRANT ALL PRIVILEGES ON DATABASE windatlas_db TO windatlas_user;
```
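Only the final GRANT of the psql session survives above. The usual Django/Postgres sequence it belongs to looks something like this (a sketch: the database and user names come from the GRANT statement, the ALTER ROLE settings are the standard ones Django expects):

```sql
CREATE DATABASE windatlas_db;
CREATE USER windatlas_user WITH PASSWORD 'real-password-here';
-- connection settings Django recommends for its database user
ALTER ROLE windatlas_user SET client_encoding TO 'utf8';
ALTER ROLE windatlas_user SET default_transaction_isolation TO 'read committed';
ALTER ROLE windatlas_user SET timezone TO 'UTC';
GRANT ALL PRIVILEGES ON DATABASE windatlas_db TO windatlas_user;
```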
On remote machine:
- `./manage migrate`

If it complains that `django.db.utils.ProgrammingError: relation "farms_powercurve" does not exist`,
then try commenting out `initial=PowerCurve.objects.get(name='Vestas V126-3450'),` in `farms/`.
Or maybe there is a similar offending line somewhere. I don't understand why this happens.
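A likely explanation: module-level code runs when django imports the module, before `migrate` has created the table, so the `PowerCurve` query executes against a table that doesn't exist yet. A minimal stub illustrating the difference (the ORM classes here are fakes; in real Django code, passing a callable as `initial` defers the query instead of commenting it out):

```python
# Stand-in for the ORM manager: the table does not exist until
# "migrations" have run.
class MissingTable(Exception):
    pass

class FakeManager:
    table_exists = False  # pretend migrate has not run yet

    def get(self, name):
        if not self.table_exists:
            raise MissingTable('relation "farms_powercurve" does not exist')
        return name

objects = FakeManager()

# Eager lookup, like initial=PowerCurve.objects.get(...), runs at import time:
try:
    initial = objects.get("Vestas V126-3450")
except MissingTable as exc:
    eager_error = str(exc)

# A callable defers the query until the form is actually used,
# by which time migrate has created the table:
initial_callable = lambda: objects.get("Vestas V126-3450")
FakeManager.table_exists = True      # migrations applied
deferred_value = initial_callable()  # safe now
```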
## S3
The remote machine needs the credentials to access the S3 bucket which contains the ERA5 wind speed data.
The Amazon resource name (ARN) of the S3 bucket is `arn:aws:s3:::windsite-files`.
It is owned by my own AWS account (Liam Hayes), so only I should have the credentials for it.
Create `~/.aws/config` (something like this):
```
aws_access_key_id = LSKEJFLKJESLKJFLKEJJ
aws_secret_access_key = LKj34lknol2krlkjlij+lkjkjkeLLkwnLkngleks
```
## Gunicorn
Follow the DigitalOcean tutorial starting from "Testing Gunicorn's Ability to Serve the Project":
- ``

This is what it should end up with (the only major difference is the inclusion of the environment variables in `gunicorn.service`):
```
curl --unix-socket /run/gunicorn.sock localhost  # test the socket is working
sudo systemctl restart gunicorn  # restart the service after updating the .service file or the django project
```
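The unit file itself is elided above. Based on that tutorial, `gunicorn.service` would look roughly like this; the user, paths, worker count and the `windatlas.wsgi` module name are assumptions, and the `Environment=` lines are the notable addition, since systemd services do not read `~/.bashrc`:

```ini
[Unit]
Description=gunicorn daemon
Requires=gunicorn.socket
After=network.target

[Service]
User=ubuntu
Group=www-data
WorkingDirectory=/home/ubuntu/windatlas
; django settings restated here because systemd ignores shell profiles
Environment=DJANGO_LOCATION=aws
Environment=DJANGO_SECRET_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
ExecStart=/home/ubuntu/.poetry/bin/poetry run gunicorn \
          --workers 3 \
          --bind unix:/run/gunicorn.sock \
          windatlas.wsgi:application

[Install]
WantedBy=multi-user.target
```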
## Nginx
`sudo apt install nginx`
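The nginx `server` block is also elided on this page. Per the same tutorial it would be roughly as follows (the domain and paths are assumptions):

```nginx
server {
    listen 80;
    # must match the domain you browse from, otherwise nginx
    # serves its default "welcome" page instead
    server_name example.com;

    location = /favicon.ico { access_log off; log_not_found off; }
    location /static/ {
        root /home/ubuntu/windatlas;
    }
    location /media/ {
        root /home/ubuntu/windatlas;
    }
    location / {
        include proxy_params;
        proxy_pass http://unix:/run/gunicorn.sock;
    }
}
```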
The django application will only be displayed by nginx if browsing from ``.
If you see a "welcome to nginx" page where the webapp should be, check that the `server_name` parameter matches the domain in your URL.
Then `sudo nginx -t` (to test the config file) and `sudo systemctl restart nginx`.
## Domain name
In AWS EC2, allocate an elastic IP to the instance.
In the domain name provider settings (godaddy), point the "" domain at the elastic IP address (find an online tutorial).
## Migrate data
Move database and media files from local to remote.
This is a **dangerous** step!! Make sure both the local and remote machine have the same models and migrations, and all migrations
have been applied. Consider making a snapshot of the EC2 instance!
Move the database first (the bit that can go wrong). On local machine:
```
mkdir datadumps/
./manage dumpdata --natural-primary --natural-foreign > datadumps/db_4Dec20.json
scp datadumps/db_4Dec20.json windatlas:~/datadumps/db_4Dec20.json
```
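Before copying the dump across, it is worth checking that it parses. A sketch (the sample record stands in for the real `datadumps/db_4Dec20.json`, and the model name is illustrative):

```python
import json

# Django fixtures are a JSON list of records, each carrying "model",
# usually "pk", and a "fields" dict.
sample = '[{"model": "farms.powercurve", "pk": 1, "fields": {"name": "Vestas V126-3450"}}]'
records = json.loads(sample)  # use open("datadumps/db_4Dec20.json") for the real file

assert isinstance(records, list)
assert all("model" in r and "fields" in r for r in records)
```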
On remote machine:
```
./manage flush
git pull
./manage migrate
./manage loaddata ~/datadumps/db_4Dec20.json
```
The remote server should now have data on it, but the database entries may refer to media files that
don't yet exist.
Now move the media files (on local machine):
- `rsync -rvv --delete media/ windatlas:/home/ubuntu/` (~330MB).
  - `-r` for recursive, `-vv` for very verbose, `--delete` to delete files on the remote that aren't on local
- For big transfers it'd be faster to make `` file, and copy that across instead.
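The archive alternative could look like this (a sketch with illustrative filenames; the stand-in `media/` directory is created here only so the commands run end to end):

```shell
# Create a stand-in media directory (use the real media/ in practice)
mkdir -p media && echo demo > media/example.png
# Bundle everything into one file: a single scp of this archive is faster
# than many small per-file rsync transfers
tar -czf media.tar.gz media/
# Verify the archive before copying it across, e.g. with
#   scp media.tar.gz windatlas:~
tar -tzf media.tar.gz
```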
The globals section of the data-import script (`scripts/manage_data/`) builds every path from a single `drive1` root:

```python
import boto3
import pandas as pd

from scripts.manage_data import config

# globals
drive1 = '/media/liam/drive1/'
ninja_dir = drive1 + 'ninja/data/'
entsoe_dir = drive1 + 'entsoe/processed/ActualGenerationOutputPerUnit/'
thewindpower_dir = drive1 + 'thewindpower/'
coffshore_fname = drive1 + '4coffshore/windfarms-data.csv'
gwastat_dump = drive1 + 'globalwindatlas/website-backup/'
coffshore_data = pd.read_csv(coffshore_fname, encoding='utf-16').set_index('name')

s3 = boto3.resource("s3")
bucket = s3.Bucket('windsite-files')
```