sakisv d6b78e6373 Add internal routes for api and api-sms-receipts
These routes will be used by prometheus to scrape the `/metrics` endpoint.

Currently:

The shared prometheus scrapes the `/metrics` endpoint using
the public routes.

The `/metrics` endpoint is provided by [gds_metrics_python][], which
comes with [bearer-token authentication][] where the token is expected
to equal the PaaS app id.

Each app is configured as a separate target in the shared prometheus
with its app id configured as a GET parameter (e.g.
http://notify-api-production.cloudapps.digital/metrics?cf_app_guid=69c87503-6b53-4c35-XXXX-XXXXXXXXXXXX&cf_app_instance=69c87503-6b53-4c35-XXXX-XXXXXXXXXXXX%3A1&cf_app_instance_index=1)

Each scrape request goes through an nginx proxy, which retrieves this
argument from the query string and sets it as a header [[source][]]. This
lets the request pass authentication and also instructs the gorouter to
target a specific instance of the app.
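The query-string-to-header translation the proxy performs can be sketched in shell. This is illustrative only: the URL below uses a fake GUID, and the real rewrite lives in the nginx config referenced by [source][].

```shell
# Illustrative only: a scrape URL with a fake cf_app_guid, and the
# transformation the nginx proxy performs on it.
url='http://notify-api-production.cloudapps.digital/metrics?cf_app_guid=69c87503-6b53-4c35-0000-000000000000&cf_app_instance_index=1'

# Extract cf_app_guid from the query string...
guid=$(printf '%s' "$url" | sed -n 's/.*cf_app_guid=\([^&]*\).*/\1/p')

# ...and present it as the bearer token the metrics endpoint expects.
echo "Authorization: Bearer $guid"
```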

In the future:

Since we're moving away from the shared prometheus and towards an
approach where we [run our own prometheus on PaaS][], we can skip the
nginx proxy, use the internal routes instead, and add a
[preshared-token][] for authentication if we need to.

[gds_metrics_python]: https://github.com/Crown-Commercial-Service/gds_metrics_python
[bearer-token authentication]: https://github.com/Crown-Commercial-Service/gds_metrics_python/blob/master/gds_metrics/__init__.py#L47-L52
[source]: https://github.com/alphagov/prometheus-aws-configuration-beta/blob/master/terraform/modules/prom-ec2/prometheus/cloud.conf#L111-L123
[run our own prometheus on PaaS]: https://github.com/alphagov/notifications-cf-monitoring/pull/1
[preshared-token]: https://github.com/Crown-Commercial-Service/gds_metrics_python/pull/18

GOV.UK Notify API

Contains:

  • the public-facing REST API for GOV.UK Notify, which teams can integrate with using our clients
  • an internal-only REST API built using Flask to manage services, users, templates, etc (this is what the admin app talks to)
  • asynchronous workers built using Celery to put things on queues and read them off to be processed, sent to providers, updated, etc

Setting Up

Python version

We run Python 3.9 both locally and in production.

psycopg2

Follow these instructions on Mac M1 machines.

AWS credentials

To run the API you will need appropriate AWS credentials. See the Wiki for more details.

environment.sh

Create and edit an environment.sh file:

echo "
export NOTIFY_ENVIRONMENT='development'

export MMG_API_KEY='MMG_API_KEY'
export FIRETEXT_API_KEY='FIRETEXT_ACTUAL_KEY'
export REACH_API_KEY='REACH_API_KEY'
export NOTIFICATION_QUEUE_PREFIX='YOUR_OWN_PREFIX'

export FLASK_APP=application.py
export FLASK_ENV=development
export WERKZEUG_DEBUG_PIN=off
" > environment.sh

Things to change:

  • Replace YOUR_OWN_PREFIX with local_dev_<first name>.
  • Run the following in the credentials repo to get the API keys.
notify-pass credentials/firetext
notify-pass credentials/mmg
notify-pass credentials/reach
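Once the values are filled in, the file just needs sourcing into your shell. A minimal check, using a stand-in environment.sh with a single variable (placeholder content, not the full file above):

```shell
# Stand-in environment.sh with one variable, for illustration only.
printf "export NOTIFY_ENVIRONMENT='development'\n" > environment.sh

# Source it and confirm the variable took effect in the current shell.
. ./environment.sh
echo "$NOTIFY_ENVIRONMENT"
```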

Postgres

Install Postgres.app.

Currently the API works with PostgreSQL 11. After installation, open the Postgres app, open the sidebar, and update or replace the default server with a compatible version.

Note: you may need to add the following directory to your PATH in order to bootstrap the app.

export PATH=${PATH}:/Applications/Postgres.app/Contents/Versions/11/bin/

Redis

To switch Redis on you'll need to install it locally. On a Mac you can do:

# assuming you use Homebrew
brew install redis
brew services start redis

To use Redis caching you need to switch it on with an environment variable:

export REDIS_ENABLED=1
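Before enabling the cache it can be worth checking that Redis is actually reachable. A small sketch (assumes redis-cli is on PATH when Redis is installed):

```shell
# Enable the Redis cache only if a local Redis answers PING.
if command -v redis-cli >/dev/null 2>&1 && [ "$(redis-cli ping 2>/dev/null)" = "PONG" ]; then
  export REDIS_ENABLED=1
  echo "redis reachable - caching enabled"
else
  echo "redis not reachable - caching stays off"
fi
```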

To run the application

# install dependencies, etc.
make bootstrap

# run the web app
make run-flask

# run the background tasks
make run-celery

# run scheduled tasks (optional)
make run-celery-beat

We've had problems running Celery locally due to one of its dependencies: pycurl. Due to the complexity of the issue, we also support running Celery via Docker:

# install dependencies, etc.
make bootstrap-with-docker

# run the background tasks
make run-celery-with-docker

# run scheduled tasks
make run-celery-beat-with-docker

To test the application

# install dependencies, etc.
make bootstrap

make test

To run one off tasks

Tasks are run through the flask command - run flask --help for more information. There are two command groups to care about: flask db contains alembic migration commands, and flask command contains all of our custom commands. For example, to purge all dynamically generated functional test data, do the following:

Locally

flask command purge_functional_test_data -u <functional tests user name prefix>

On the server

cf run-task notify-api "flask command purge_functional_test_data -u <functional tests user name prefix>"

All commands and command options have a --help option if you need more information.

Further documentation
