Update reqs and readme (#445)
* Update python to 3.13; update all requirements

* Make readme more up to date

* Add instructions to set the PGTZ environment variable

* Add instructions for opening QGIS project

---------

Co-authored-by: Lauri Kajan <[email protected]>
Rikuoja and LKajan authored Feb 6, 2025
1 parent 03f63f2 commit b1b7032
Showing 14 changed files with 165 additions and 134 deletions.
66 changes: 43 additions & 23 deletions README.md
@@ -1,12 +1,17 @@
# hame-ryhti
# ARHO-Ryhti

[![CI/CD](https://github.com/GispoCoding/hame-ryhti/actions/workflows/ci.yml/badge.svg)](https://github.com/GispoCoding/hame-ryhti/actions/workflows/ci.yml)
[![Tests](https://github.com/GispoCoding/hame-ryhti/actions/workflows/tests.yml/badge.svg)](https://github.com/GispoCoding/hame-ryhti/actions/workflows/tests.yml)
[![Code-style](https://github.com/GispoCoding/hame-ryhti/actions/workflows/code-style.yml/badge.svg)](https://github.com/GispoCoding/hame-ryhti/actions/workflows/code-style.yml)
[![Deploy](https://github.com/GispoCoding/hame-ryhti/actions/workflows/deploy.yml/badge.svg)](https://github.com/GispoCoding/hame-ryhti/actions/workflows/deploy.yml)
[![pre-commit](https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white)](https://github.com/pre-commit/pre-commit)

HAME regional land use planning database and QGIS project compatible with [national Ryhti data model](https://ryhti.syke.fi/alueidenkaytto/tietomallimuotoinen-kaavoitus/) -
[Ryhti-yhteensopiva](https://ryhti.syke.fi/alueidenkaytto/tietomallimuotoinen-kaavoitus/) tietokanta ja QGIS-projekti maakuntakaavoitukseen.
---

The database and functions can be run on AWS (Amazon Web Services) cloud platform.
[Ryhti-yhteensopiva](https://ryhti.syke.fi/alueidenkaytto/tietomallimuotoinen-kaavoitus/) ARHO-kaavoitustietokanta, joka käyttää [kansallisia Ryhti-rajapintoja](https://ryhti.syke.fi/tietoa-jarjestelmasta/miten-tieto-liikkuu/). Tietokantaa voi käyttää QGISillä, johon on asennettu [ARHO-kaavoituslisäosa](https://github.com/GispoCoding/arho-feature-template). Tietokanta ja sen tarvitsemat funktiot toimivat AWS (Amazon Web Services) -pilvialustalla.

---

ARHO land use planning database compatible with the [national Ryhti data model](https://ryhti.syke.fi/alueidenkaytto/tietomallimuotoinen-kaavoitus/), connecting to the [national Ryhti APIs](https://ryhti.syke.fi/tietoa-jarjestelmasta/miten-tieto-liikkuu/). The database can be used with QGIS with the [ARHO land use planning plugin](https://github.com/GispoCoding/arho-feature-template) installed. The database and the required functions run on the AWS (Amazon Web Services) cloud platform.

- [Architecture](#architecture)
- [Data model](#data-model)
@@ -22,15 +27,15 @@ The database and functions can be run on AWS (Amazon Web Services) cloud platfor

## Architecture

HAME-Ryhti consists of
ARHO-Ryhti consists of
1. a PostGIS database,
2. various AWS Lambda functions to manage the database and import or export planning data,
3. [X-Road security server sidecar container](https://gofore.com/en/benefits-of-the-x-road-security-server-sidecar/) to connect to Ryhti through the Finnish X-Road, and
4. QGIS project to connect to the database and create regional land use plans.
4. [QGIS plugin](https://github.com/GispoCoding/arho-feature-template) to connect to the database and create regional land use plans.

![diagram of AWS resources and their connections to software and APIs](infra/architecture.svg)

To manage Hame-Ryhti AWS resources, check the [infra README](https://github.com/GispoCoding/hame-ryhti/blob/main/infra/README.md#hame-infra) in the infra directory.
To manage ARHO-Ryhti AWS resources, check the [infra README](https://github.com/GispoCoding/hame-ryhti/blob/main/infra/README.md#hame-infra) in the infra directory.

## Data model

@@ -40,7 +45,7 @@ To look closer at our data model, check the autogenerated [data model documentat

## Development requirements

- Python 3.12
- Python 3.13
- Docker (Install Docker based on [your platform's instructions](https://docs.docker.com/get-started/#download-and-install-docker).)

## Development
@@ -56,13 +61,24 @@ If you also want to test the Ryhti API client, you have to
6. Register at the [SYKE API portal](https://api-developer.ymparisto.fi) and subscribe to their Ryhti product. Your subscription details will contain your Ryhti API key.
7. Insert your Ryhti API key on the SYKE_APIKEY line in the `.env` file. Do *not* modify `.env.dev`; it is committed to GitHub and should only contain public example data, not your actual API key.

If you also want to import municipality and region boundaries from [MML](https://www.maanmittauslaitos.fi/), you have to

8. Register at [MML](https://www.maanmittauslaitos.fi/rajapinnat/api-avaimen-ohje) and create a new API key for the MML APIs.
9. Insert your MML API key on the MML_APIKEY line in the `.env` file. Do *not* modify `.env.dev`; it is committed to GitHub and should only contain public example data, not your actual API key.
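
For reference, after steps 7 and 9 the relevant lines of your local `.env` could look roughly like this (placeholder values only, not real keys):

```
# Local secrets only; do not commit this file. .env.dev holds the public example values.
SYKE_APIKEY=<your Ryhti API key from the SYKE API portal>
MML_APIKEY=<your MML API key>
```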

### Database and functions

1. Run tests with `make pytest`. (If you have not specified a Ryhti API key, some `test_services` will fail, because some Ryhti client tests try out calling the SYKE Ryhti API.)
2. Build and start the development containers with `docker-compose -f docker-compose.dev.yml up -d` (or `make rebuild`).
3. Fill the database with current data model by `make test-create-db`.
4. Populate national code tables from [koodistot.suomi.fi](https://koodistot.suomi.fi) by `make test-koodistot`.
5. Edit the lambda functions under [database](./database), run tests and rebuild again.
1. Run tests with `make pytest`. (If you have not specified a Ryhti API key, some `test_services` tests will fail, because some Ryhti client tests call the SYKE Ryhti open validation API.)
2. Edit the lambda functions under [database](./database), run tests and rebuild again.

If you want to use the local development database with a PostGIS client or QGIS:

3. Build and start the development containers with `make rebuild` (or `docker-compose -f docker-compose.dev.yml up -d`).
4. Fill the database with the current data model by running `make test-create-db`.
5. Populate the national code tables from [koodistot.suomi.fi](https://koodistot.suomi.fi) by running `make test-koodistot`. (If you have not specified an MML API key, the code tables will be populated, but municipality and regional geometries will be left empty, and you will get an error telling you that the MML API key is missing.)
6. To create plans in the database, you must add at least one `organization` to the organization table (i.e. a test region or test municipality), with a foreign key to the national code table that contains the geometry of your region or municipality; a hedged SQL sketch is shown after this list. All plans that you create must have a foreign key to a valid region or municipality.
7. Once you have created plan data in the database, you can validate your database contents against the SYKE Ryhti open validation API with `make test-ryhti-validate`.
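
To illustrate step 6, something along the lines of the following SQL can be run against the local development database with `psql` or any other client. This is only a hedged sketch: the `hame` schema and the `organization` table name come from the documentation above, but the column names and the referenced code table are hypothetical, so check the organization model in [models.py](./database/models.py) for the real schema before running anything.

```
-- Hedged sketch only: the column names below are hypothetical,
-- see the organization model in database/models.py for the real ones.
INSERT INTO hame.organization (id, name, administrative_region_id)
VALUES (
    gen_random_uuid(),
    'Test region',
    '<id of the code table row that carries your region geometry>'
);
```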

If tests using pytest-docker get stuck, you can remove the dangling containers with:

@@ -78,7 +94,7 @@ docker network ls --format {{.Name}} |grep pytest | awk '{print $1}' | xargs -I
2. The database is divided into two schemas: `codes` contains all the Ryhti-specific [national code lists](https://ryhti.syke.fi/ohjeet-ja-tuki/tietomallit/), while `hame` contains all the data tables (plans, plan objects, plan regulations etc.).
3. If you want to change *all* tables in a schema (i.e. edit *all* the code tables, or add a field to *all* the data tables), the abstract base classes are in [base.py](./database/base.py).
4. If you only want to change/add *one* code table or one data table, please edit/add the right table in [codes.py](./database/codes.py) or [models.py](./database/models.py).
5. To get the changes tested and usable in your functions, create a new database revision with `make revision name="describe_your_changes"`, e.g. `make revision name="add_plan_object_table"`. This creates a new random id (`uuid`) for your migration, and a revision file `YYYY-MM-DD-HHMM-uuid-add_plan_object_table` in the [alembic versions dir](./database/migrations/versions). Please check that the autogenerated revision file seems to do approximately sensible things.
5. To get the changes tested and usable in your functions, you must have an up-to-date test database running via `make rebuild` and `make test-create-db`. When the database is up, you can create a new database revision with `make revision name="describe_your_changes"`, e.g. `make revision name="add_plan_object_table"`. This creates a new random id (`uuid`) for your migration and a revision file `YYYY-MM-DD-HHMM-uuid-add_plan_object_table` in the [alembic versions dir](./database/migrations/versions). Please check that the autogenerated revision file seems to do approximately sensible things.
- Specifically, when adding geometry fields, please note the [GeoAlchemy2 bug with Alembic](https://geoalchemy-2.readthedocs.io/en/latest/alembic.html#interactions-between-alembic-and-geoalchemy-2), which means you will have to *manually remove* `op.create_index` and `op.drop_index` from the revision file (a hedged sketch is shown after this list). This is because GeoAlchemy2 already creates the geometry index automatically whenever a geometry column is added.
6. Run tests with `make pytest` to check that the revision file runs correctly. At a minimum, you may have to change the tested table counts (codes_count and hame_count) in the [database test setup](./database/test/conftest.py) to reflect the correct number of tables in the database.
7. Run `make rebuild` and `make test-create-db` to start a development instance with the new model.
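
To illustrate the GeoAlchemy2 note in step 5, an autogenerated revision for a new geometry column might look roughly like the sketch below. The table, column and index names are made up and the SRID is only an assumption; the commented-out index operations are the ones to delete by hand, since GeoAlchemy2 creates and drops the geometry index together with the column.

```
"""Illustrative sketch of an autogenerated Alembic revision, not a real file in this repository."""
import sqlalchemy as sa
from alembic import op
from geoalchemy2 import Geometry


def upgrade() -> None:
    op.add_column(
        "plan_object",
        sa.Column("geom", Geometry("MULTIPOLYGON", srid=3067), nullable=True),
    )
    # Delete this autogenerated line by hand; GeoAlchemy2 already created the index above:
    # op.create_index("idx_plan_object_geom", "plan_object", ["geom"], postgresql_using="gist")


def downgrade() -> None:
    # Delete this autogenerated line by hand; the index is dropped together with the column:
    # op.drop_index("idx_plan_object_geom", table_name="plan_object")
    op.drop_column("plan_object", "geom")
```
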
@@ -120,7 +136,7 @@ To update requirements to latest versions:
[Database documentation](./database/dbdoc/README.md) -->

## Connecting to the test database
## Connecting to the AWS database

Connecting to the database is done with the secure shell protocol (SSH). To be able to connect to the database, you will have to
1. Create an SSH key pair on your computer (this has to be done only once)
@@ -143,16 +159,17 @@ you have to provide to the database administrator, and the private key in file `
![screenshot of ssh key pair creation dialog](docs/img/ssh-keygen.png)
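
If you do not have a key pair yet, a command along these lines creates one; the key path matches the one used by the tunnel command and scripts below, so adjust it if you store your key elsewhere:

```
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519
```

The command writes the public key to `~/.ssh/id_ed25519.pub` (the file you give to the database administrator) and the private key to `~/.ssh/id_ed25519` (which you keep to yourself).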


### Opening an SSH tunnel to the test server
### Opening an SSH tunnel to AWS

Once the administrator has added your public key to the server, you can connect to the database using SSH:
- On *Windows*, the easiest way to open the SSH tunnel to the server is by using a batch script named `create_tunnel.bat` found [here](docs/create_tunnel.bat) in this repository. Save the file to your computer in a convenient location. After this you can open the tunnel by executing this script by double clicking the file. On *Linux/Mac OS* (or if you want to use a command prompt), just copy-paste the command
- On *Windows*, the easiest way to open the SSH tunnel to the server is to use a batch script named `create_tunnel.vsl.bat`, found [here](docs/create_tunnel.vsl.bat) in this repository. Save the file to your computer in a convenient location. After this you can open the tunnel by double-clicking the file. On *Linux/Mac OS* (or if you want to use a command prompt), just copy-paste the command
```
ssh -N -L 5433:hame-devdb.ctcspesmxrh1.eu-central-1.rds.amazonaws.com:5432 -L 5443:kfhh24yii6.execute-api.eu-central-1.amazonaws.com:443 -i "~/.ssh/id_ed25519" [email protected]
ssh -N -L 5433:hame-devdb.ctcspesmxrh1.eu-central-1.rds.amazonaws.com:5432 -D localhost:5443 -i "~/.ssh/id_ed25519" [email protected]
```
In addition to the SSH tunnel to the database, the command creates a SOCKS5 proxy that allows the ARHO plugin to connect to the Lambda functions in AWS.

- Enter the passphrase for the key (if set) and hit enter. If no error messages appear, the tunnel is connected. Do not close the command prompt window; otherwise the SSH tunnel is disconnected.
- Now you can connect to the database using `localhost` as the host and `5433` as the port. The details how to do this with
different software are given in the following sections.
- Now you can connect to the database using `localhost` as the host and `5433` as the port. Details on how to do this with different software are given in the following sections; a quick command-line check is sketched after this list.
- Additional tips: the connection can terminate automatically, for example due to a server reboot or network issues (this is usually accompanied by a message such as `client_loop: send disconnect: Connection reset`). If this happens, simply double-click the file again to reopen the tunnel. In case you want to close an open SSH tunnel, press `Ctrl+C` and answer the confirmation by pressing `Y`.
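
As a quick command-line check that the tunnel works (before configuring QGIS), a query along these lines should succeed; the username and database name are placeholders provided by the database administrator, not values defined in this repository:

```
psql -h localhost -p 5433 -U <your-username> -d <database-name> -c "SELECT postgis_full_version();"
```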

### Connecting to the database from QGIS
@@ -176,11 +193,11 @@ NOTE: the Postgres service file for the dev environment is also included in t

![screenshot of new profile menu](docs/img/qgis-new-profile.png)

3. In QGIS settings add a `PGSERVICEFILE` environment variable and fill the file path of corresponding service file as a value.
3. In QGIS settings, add two environment variables. Add a `PGSERVICEFILE` variable and fill in the file path of the corresponding service file as the value. Add a second variable `PGTZ` and set its value to your local time zone, using the time zone name from the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones), probably `Europe/Helsinki`. Choose `Overwrite` for both variables in the first column of the variable list. (A hedged example of a service file entry is shown after step 4.)

![screenshot of menu location](docs/img/qgis-settings.png)

![screenshot of the setting dialog](docs/img/qgis-pgservicefile-environment-variable.png)
![screenshot of the setting dialog](docs/img/qgis-environment-variables.png)

4. Restart QGIS to make the environment variables take effect.
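
For reference, an entry in the service file pointed to by `PGSERVICEFILE` uses the standard PostgreSQL connection service format. The sketch below is only an example; the service name, database name and username are placeholders, so use the service file shipped in the repository, or the values given by your administrator, rather than inventing your own:

```
[hame-dev]
host=localhost
port=5433
dbname=<database name from the administrator>
user=<your database username>
```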

@@ -201,3 +218,6 @@ Now you can proceed with the database authentication details. As in step 3, open
Add the necessary parameters as follows. You can also test the connection at this point and when done, press OK.

![screenshot of the new connection dialog](docs/img/qgis-create-connection.png)

7. Open the [ARHO QGIS project](https://github.com/GispoCoding/arho-feature-template/blob/main/qgisprojekti.qgz) in QGIS. It opens the plan map layers contained in the ARHO database. If you wish, you can save the latest QGIS project version in your ARHO database, so that you can edit it if needed and everyone can find the project in the database: Project > Save as... > PostgreSQL...
8. Install the [ARHO QGIS plugin](https://github.com/GispoCoding/arho-feature-template).
2 changes: 1 addition & 1 deletion database/db_manager.Dockerfile
@@ -1,4 +1,4 @@
FROM public.ecr.aws/lambda/python:3.12
FROM public.ecr.aws/lambda/python:3.13

# Install Python dependencies
COPY requirements.txt ${LAMBDA_TASK_ROOT}/requirements.txt
2 changes: 1 addition & 1 deletion database/koodistot_loader.Dockerfile
@@ -1,4 +1,4 @@
FROM public.ecr.aws/lambda/python:3.12
FROM public.ecr.aws/lambda/python:3.13

# Install Python dependencies
COPY requirements.txt ${LAMBDA_TASK_ROOT}/requirements.txt
2 changes: 1 addition & 1 deletion database/mml_loader.Dockerfile
@@ -1,4 +1,4 @@
FROM public.ecr.aws/lambda/python:3.12
FROM public.ecr.aws/lambda/python:3.13

# Install Python dependencies
COPY requirements.txt ${LAMBDA_TASK_ROOT}/requirements.txt
2 changes: 1 addition & 1 deletion database/ryhti_client.Dockerfile
@@ -1,4 +1,4 @@
FROM public.ecr.aws/lambda/python:3.12
FROM public.ecr.aws/lambda/python:3.13

# Install Python dependencies
COPY requirements.txt ${LAMBDA_TASK_ROOT}/requirements.txt
19 changes: 0 additions & 19 deletions docs/create_tunnel.bat

This file was deleted.

18 changes: 18 additions & 0 deletions docs/create_tunnel.paimio.bat
@@ -0,0 +1,18 @@
@echo off
set SERVER_ADDRESS=arhodb.ctcspesmxrh1.eu-central-1.rds.amazonaws.com
set LOCAL_PORT=5433
set REMOTE_PORT=5432
set LAMBDA_PROXY_HOST=localhost
set LAMBDA_PROXY_PORT=5443
set TUNNEL_USER=ec2-tunnel
set TUNNEL_ADDRESS=arho-test.bastion.gispocoding.fi
set SSH_KEY_PATH=~/.ssh/id_ed25519

echo Creating SSH tunnel to %SERVER_ADDRESS%:%REMOTE_PORT%...
echo Tunneling local port %LOCAL_PORT% to remote port %REMOTE_PORT%...
echo Setting up SOCKS proxy at %LAMBDA_PROXY_HOST%:%LAMBDA_PROXY_PORT%...

ssh -N -L %LOCAL_PORT%:%SERVER_ADDRESS%:%REMOTE_PORT% -D %LAMBDA_PROXY_HOST%:%LAMBDA_PROXY_PORT% -i "%SSH_KEY_PATH%" %TUNNEL_USER%@%TUNNEL_ADDRESS%

echo SSH tunnel closed.
pause
18 changes: 18 additions & 0 deletions docs/create_tunnel.vsl.bat
@@ -0,0 +1,18 @@
@echo off
set SERVER_ADDRESS=hame-devdb.ctcspesmxrh1.eu-central-1.rds.amazonaws.com
set LOCAL_PORT=5433
set REMOTE_PORT=5432
set LAMBDA_PROXY_HOST=localhost
set LAMBDA_PROXY_PORT=5443
set TUNNEL_USER=ec2-tunnel
set TUNNEL_ADDRESS=hame-dev.bastion.gispocoding.fi
set SSH_KEY_PATH=~/.ssh/id_ed25519

echo Creating SSH tunnel to %SERVER_ADDRESS%:%REMOTE_PORT%...
echo Tunneling local port %LOCAL_PORT% to remote port %REMOTE_PORT%...
echo Setting up SOCKS proxy at %LAMBDA_PROXY_HOST%:%LAMBDA_PROXY_PORT%...

ssh -N -L %LOCAL_PORT%:%SERVER_ADDRESS%:%REMOTE_PORT% -D %LAMBDA_PROXY_HOST%:%LAMBDA_PROXY_PORT% -i "%SSH_KEY_PATH%" %TUNNEL_USER%@%TUNNEL_ADDRESS%

echo SSH tunnel closed.
pause
Binary file added docs/img/qgis-environment-variables.png
Binary file removed docs/img/qgis-pgservicefile-environment-variable.png
