restructure readme & docs

stvnrlly
2022-10-20 14:05:23 -04:00
parent 2ce19fd502
commit a45e02d6e5
6 changed files with 193 additions and 124 deletions

README.md

@@ -1,153 +1,87 @@
# US Notify API

This project is the core of [Notify](https://notifications-admin.app.cloud.gov/). It's cloned from the brilliant work of the team at [GOV.UK Notify](https://github.com/alphagov/notifications-api), cheers!

This repo contains:

- A public-facing REST API for Notify, which teams can integrate with using [API clients built by the UK team](https://www.notifications.service.gov.uk/documentation)
- An internal-only REST API built using Flask to manage services, users, templates, etc., which the [admin UI](http://github.com/18F/notifications-admin) talks to
- Asynchronous workers built using Celery to put things on queues and read them off to be processed, sent to providers, updated, etc.

## Local setup

### Direct installation

1. Set up Postgres && Redis
1. Install dependencies into a virtual environment and set up the database:

   ```
   pipenv install --dev
   createdb notification_api
   flask db upgrade
   ```

1. Create the .env file:

   ```
   cp sample.env .env
   # follow the instructions in .env
   ```

1. Run Flask:

   ```
   pipenv run make run-flask
   ```

1. Run Celery:

   ```
   pipenv run make run-celery
   ```
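Once Flask and Celery are both up, you can sanity-check the API from another terminal; the local app listens on port 6011 and exposes a `/_status` endpoint (the same one the OWASP scan in the testing doc targets):

```
curl -i http://localhost:6011/_status
# expect an HTTP 200 response
```
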
### VS Code && Docker installation

If you're working in VS Code, you can also leverage Docker for a containerized dev environment. Make sure the Docker daemon is running and that no other Postgres daemon is listening on port 5432.

1. Create the .env file as described in the `.env` section below.
1. Install the Remote-Containers plug-in in VS Code.
1. With Docker running, create the network:
   `docker network create notify-network`
1. Using the command palette (shift+cmd+p) or the green button in the bottom left, search for and select “Remote Containers: Open Folder in Container...”. When prompted, choose the **devcontainer-api** folder (note: this is a *subfolder* of notification-api). This will start up the container in a new window, replacing the current one.
1. Wait a few minutes while things happen.
1. Open a VS Code terminal and run the Flask application:
   `make run-flask`
1. Open another VS Code terminal and run Celery:
   `make run-celery`

NOTE: when you change .env in the future, you'll need to rebuild the devcontainer for the change to take effect. VS Code _should_ detect the change and prompt you with a toast notification during a cached build. If not, you can trigger a manual rebuild from the command palette or just `docker rm` the notifications-api container.
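If VS Code doesn't prompt you, removing the container is the blunt but reliable option (the container name is assumed here; check `docker ps -a` if yours differs):

```
docker ps -a                     # confirm the devcontainer's actual name
docker rm -f notifications-api   # assumed name; forces a rebuild on next open
```
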
### `.env` file

Create and edit a .env file, based on sample.env. Things to change:

- If you're not the first to deploy, only replace the AWS creds; get these from a team lead
- Replace `NOTIFY_EMAIL_DOMAIN` with the domain your emails will come from (i.e., the "origination email" in your SES project)
- Replace `SECRET_KEY` and `DANGEROUS_SALT` with high-entropy secret values
- Set up AWS SES and SNS as described in [the infrastructure setup doc](./docs/infra-setup.md), then fill in the missing AWS env vars
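For illustration only, a filled-in .env might look roughly like this. The variables named above come from these docs; everything else (environment name, credential variable names, database and Redis URLs) is an assumption about what sample.env asks for, so trust sample.env over this sketch:

```
# .env (illustrative placeholder values; never commit real credentials)
NOTIFY_ENVIRONMENT=development                    # assumed variable name
NOTIFY_EMAIL_DOMAIN=example.gov                   # your SES "origination email" domain
SECRET_KEY=replace-with-a-high-entropy-value
DANGEROUS_SALT=replace-with-a-different-high-entropy-value
AWS_REGION=us-west-2
AWS_PINPOINT_REGION=us-west-2
AWS_ACCESS_KEY_ID=ask-your-team-lead              # assumed credential variable names
AWS_SECRET_ACCESS_KEY=ask-your-team-lead
SQLALCHEMY_DATABASE_URI=postgresql://localhost/notification_api   # assumed
REDIS_URL=redis://localhost:6379                  # assumed
```
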
### Secrets Detection

Ideally, you'll install `detect-secrets` so that it's accessible from any environment from which you _might_ commit. You can use `brew install` to make it available globally. You could also install via `pip install` inside a virtual environment, if you're sure you'll _only_ commit from that environment.

```
brew install detect-secrets  # or pip install detect-secrets
detect-secrets scan
# review the output of the above; make sure none of the baseline entries are sensitive
detect-secrets scan > .secrets.baseline
# creates the baseline file
```

If you open .git/hooks/pre-commit you should see a simple bash script that runs the command below, reads the output, and aborts the commit if detect-secrets finds a secret. You should be able to test it by staging a file with any high-entropy string like `"bblfwk3u4bt484+afw4avev5ae+afr4?/fa"` (detect-secrets also has other ways to find secrets; this is just the most straightforward to test).

You can permit exceptions by adding an inline comment containing `pragma: allowlist secret`.

The command actually run by the pre-commit hook is: `git diff --staged --name-only -z | xargs -0 detect-secrets-hook --baseline .secrets.baseline`

You can also run it against all tracked files, staged or not: `git ls-files -z | xargs -0 detect-secrets-hook --baseline .secrets.baseline`
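If you do need to commit a string that trips the detector but isn't actually sensitive, the allowlist comment goes inline on the offending line; a hypothetical example in a shell script:

```
# known-fake value, safe to commit
FAKE_API_KEY="bblfwk3u4bt484+afw4avev5ae+afr4?/fa"  # pragma: allowlist secret
```
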
### Postgres && Redis

Local Postgres and Redis are both handled by [docker compose](https://github.com/18F/notifications-api/blob/main/docker-compose.devcontainer.yml) in the devcontainer setup.
## Deeper documentation

### Infrastructure

- [Checklist for onboarding to all of the things](./docs/infra-onboarding.md)
- [Setting up the initial infrastructure using AWS](./docs/infra-setup.md)
- [Database management](./docs/database-management.md)

### Common dev work

- [Testing](./docs/testing.md)
- [Running one-off tasks](./docs/one-off-tasks.md)

## UK docs that may still be helpful
- [Writing public APIs](docs/writing-public-apis.md)
- [Updating dependencies](https://github.com/alphagov/notifications-manuals/wiki/Dependencies)

docs/database-management.md Normal file

@@ -0,0 +1,55 @@
# Database management
## Initial state
In Notify, several aspects of the system are loaded into the database via migration. This means that
application setup requires loading and overwriting historical data in order to arrive at the current
configuration.
[Here are notes](https://docs.google.com/document/d/1ZgiUtJFvRBKBxB1ehiry2Dup0Q5iIwbdCU5spuqUFTo/edit#)
about what is loaded into which tables, and some plans for how we might manage that in the future.
Flask-Migrate/Alembic does not seem to have a great way to squash migrations; instead, it wants you to recreate them
from the DB structure. This means it's easy to recreate the tables, but hard to recreate the initial data.
## Migrations
Create a migration:
```
flask db migrate
```
Trim any auto-generated stuff down to what you want, and manually rename it to be in numerical order.
We should only have one migration branch.
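For instance, the flow for a new migration might look roughly like this (the generated filename and the `0123` prefix are made up; match whatever the latest file in `migrations/versions/` actually is):

```
flask db migrate -m "add example column"
# alembic writes something like migrations/versions/a1b2c3d4e5f6_add_example_column.py;
# trim the generated operations, then rename the file into numerical order:
mv migrations/versions/a1b2c3d4e5f6_add_example_column.py migrations/versions/0123_add_example_column.py
```
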
Running migrations locally:
```
flask db upgrade
```
This should happen automatically on cloud.gov, but if you need to run a one-off migration for some reason:
```
cf run-task notifications-api-staging --command "flask db upgrade" --name db-upgrade
```
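To check that the task ran (and see its state), the standard cf task listing works against the same app:

```
cf tasks notifications-api-staging
```
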
## Purging user data
There is a Flask command to wipe user-created data (users, services, etc.).
The command should stop itself if it's run in a production environment, but, you know, please don't run it
in a production environment.
Running locally:
```
flask command purge_functional_test_data -u <functional tests user name prefix>
```
Running on cloud.gov:
```
cf run-task notify-api "flask command purge_functional_test_data -u <functional tests user name prefix>"
```

docs/infra-onboarding.md Normal file

@@ -0,0 +1,7 @@
# Infrastructure onboarding
- [ ] Join [the GSA GitHub org](https://github.com/GSA/GitHub-Administration#join-the-gsa-organization)
- [ ] Get permissions for the repos
- [ ] Get access to the cloud.gov org && space
- [ ] Get access to AWS, if necessary
- [ ] Pull down creds from cloud.gov and create the local .env file
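For that last step, a typical flow with the cf CLI looks something like this (org, space, and app names are placeholders; ask the team for the real ones):

```
cf login -a api.fr.cloud.gov --sso
cf target -o <org> -s <space>
cf env notify-api   # inspect bound service credentials to copy into your local .env
```
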

docs/infra-setup.md Normal file

@@ -0,0 +1,20 @@
# Setting up the infrastructure
## Steps to prepare SES
1. Go to the SES console for \$AWS_REGION and create new origin and destination emails. AWS will send a verification via email, which you'll need to complete.
2. Find and replace instances in the repo of "testsender", "testreceiver" and "dispostable.com" with your origin and destination email addresses, which you verified in step 1 above.
TODO: create env vars for these origin and destination email addresses for the root service, and create new migrations to update postgres seed fixtures
## Steps to prepare SNS
1. Go to the Pinpoint console for \$AWS_PINPOINT_REGION and choose "create new project", then "configure for sms"
2. Tick the box at the top to enable SMS, choose "transactional" as the default type, and save
3. In the left-hand sidebar, go to "SMS and Voice" (bottom) and choose "Phone Numbers"
4. Under "Number Settings" choose "Request Phone Number"
5. Choose a toll-free number, tick SMS, untick Voice, choose "transactional", hit next, and then "request"
6. Go to the SNS console for \$AWS_PINPOINT_REGION, look in the left-hand sidebar under "Mobile", and go to "Text Messaging (SMS)"
7. Scroll down to "Sandbox destination phone numbers", tap "Add phone number", and then follow the steps to verify (you'll need to be able to retrieve a code sent to each number)
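If you prefer the CLI to clicking through the console, roughly equivalent commands exist; this assumes AWS CLI v2 with credentials already configured, and the console steps above remain the source of truth:

```
# SES: kick off verification for an email address (AWS sends a confirmation link)
aws ses verify-email-identity --email-address you@example.gov --region $AWS_REGION

# SNS: add and then verify a sandbox destination phone number
aws sns create-sms-sandbox-phone-number --phone-number +15551234567 --region $AWS_PINPOINT_REGION
aws sns verify-sms-sandbox-phone-number --phone-number +15551234567 --one-time-password 123456 --region $AWS_PINPOINT_REGION
```
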
At this point, you _should_ be able to complete both the email and phone verification steps of the Notify user sign up process! 🎉

docs/one-off-tasks.md Normal file

@@ -0,0 +1,22 @@
# One-off tasks
For these, we're using Flask commands, which live in [`/app/commands.py`](../app/commands.py).
This includes things that might be one-time operations! Using a command allows the operation to be tested,
both with `pytest` and with trial runs.

Commands are run through the `flask` CLI. There are two groups to care about: `flask db` contains the alembic
migration commands, and `flask command` contains all of our custom commands. Every command and command option
supports `--help` if you need more information.
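For example, to see what's available from inside the virtualenv or devcontainer:

```
flask command --help                              # list our custom commands
flask command purge_functional_test_data --help   # options for a specific command
```
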
To run a command on cloud.gov, use this format:
```
cf run-task APP-NAME --command "YOUR COMMAND HERE" --name YOUR-COMMAND
```
[Here's more documentation](https://docs.cloudfoundry.org/devguide/using-tasks.html) about Cloud Foundry tasks.
## Celery scheduled tasks
After scheduling some tasks, run celery beat to get them moving:
```
make run-celery-beat
```

docs/testing.md Normal file

@@ -0,0 +1,31 @@
# Testing
```
# install dependencies, etc.
make bootstrap
make test
```
This will run:
- flake8 for code styling
- isort for import styling
- pytest for the test suite
On GitHub, in addition to these tests, we run:
- bandit for code security
- pip-audit for dependency vulnerabilities
- OWASP for dynamic scanning
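For quicker local iteration, you can also point pytest at a subset of the suite instead of running everything through make (the path and filter below are placeholders):

```
# placeholder path and -k filter; aim them at the tests you're changing
pipenv run pytest tests/app -k "some_test_name"
```
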
## CI testing
We're using GitHub Actions. See [/.github](../.github/) for the configuration.
## To run a local OWASP scan
1. Run `make run-flask` from within the dev container.
2. On your host machine run:
```
docker run -v $(pwd):/zap/wrk/:rw --network="notify-network" -t owasp/zap2docker-weekly zap-api-scan.py -t http://dev:6011/_status -f openapi -c zap.conf
```