1. Create a separate billing blueprint to house these endpoints
2. Return the monthly breakdown in the same format as before
3. Return the yearly breakdown but only return {billing_units, rate,
   notification_type}. Only admin makes use of these.
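A minimal sketch of how the billing blueprint might be laid out, assuming hypothetical route names and dao helpers (none of these identifiers come from the real code):

```python
# billing/rest.py -- hypothetical sketch of the billing blueprint
from flask import Blueprint, jsonify

billing_blueprint = Blueprint(
    'billing', __name__, url_prefix='/service/<uuid:service_id>/billing'
)


@billing_blueprint.route('/monthly-usage')
def get_monthly_usage(service_id):
    # same response shape as the existing monthly breakdown
    return jsonify(results=get_monthly_billing_data(service_id))  # assumed dao helper


@billing_blueprint.route('/yearly-usage-summary')
def get_yearly_usage(service_id):
    # admin-only view: return just the three fields it uses
    rows = get_yearly_billing_data(service_id)  # assumed dao helper
    return jsonify(results=[
        {
            'billing_units': row.billing_units,
            'rate': row.rate,
            'notification_type': row.notification_type,
        }
        for row in rows
    ])
```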
Two endpoints:
* get all inbound sms for a service (you can limit to the X most
  recent, or filter by the user's phone number [which will be normalised])
* get a summary of inbound sms for a service - returns the count of
inbound sms in the database, and the date that the most recent was
sent
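A hedged sketch of what the two inbound sms endpoints could look like, with assumed dao helpers and query parameters:

```python
# inbound_sms/rest.py -- hypothetical sketch of the two endpoints
from flask import Blueprint, jsonify, request

inbound_sms_blueprint = Blueprint(
    'inbound_sms', __name__, url_prefix='/service/<uuid:service_id>/inbound-sms'
)


@inbound_sms_blueprint.route('')
def get_inbound_sms_for_service(service_id):
    # optionally limit the number of rows, or filter by a (normalised) phone number
    limit = request.args.get('limit', type=int)
    user_number = request.args.get('user_number')
    rows = dao_get_inbound_sms_for_service(  # assumed dao helper
        service_id, limit=limit, user_number=user_number
    )
    return jsonify(data=[row.serialize() for row in rows])


@inbound_sms_blueprint.route('/summary')
def get_inbound_sms_summary_for_service(service_id):
    # the count of inbound sms plus the timestamp of the most recent one
    count, most_recent = dao_get_inbound_sms_summary(service_id)  # assumed dao helper
    return jsonify(
        count=count,
        most_recent=most_recent.isoformat() + 'Z' if most_recent else None,
    )
```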
- uses the reference field on the notifications table to store a 16-character random string used to cross-reference DVLA letters back to the notification
- used because the letter barcode does not have space for a UUID notification id
Depends on https://github.com/alphagov/notifications-utils/pull/149
Renamed numeric_id to notification_reference in utils and changed the validation rules to match.
Note also that the persist_notification method was setting "reference" to "client_reference", which is confusing because they are different things, so that is fixed too.
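A minimal sketch of generating such a reference, assuming an uppercase alphanumeric alphabet; the real helper (and its exact character set) lives in notifications-utils, so this is illustrative only:

```python
# Hypothetical sketch: generate the 16-character reference stored in
# notifications.reference and printed in the DVLA letter barcode.
import secrets
import string

REFERENCE_ALPHABET = string.ascii_uppercase + string.digits


def create_letter_reference(length=16):
    return ''.join(secrets.choice(REFERENCE_ALPHABET) for _ in range(length))
```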
There are three authentication methods:
- requires_no_auth - public endpoints that do not require an Authorisation header
- requires_auth - public endpoints that need an API key in the Authorisation header
- requires_admin_auth - private endpoints that require an Authorisation header containing the API key for the client defined as the admin user
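A decorator-style sketch of the three levels of access, assuming a token can be pulled out of the Authorisation header and checked; the helper names are made up and the real wiring may well differ:

```python
# Hypothetical sketch of the three authentication levels as decorators.
from functools import wraps

from flask import abort, request


def requires_no_auth(f):
    # public endpoint: no Authorisation header required
    return f


def requires_auth(f):
    @wraps(f)
    def wrapper(*args, **kwargs):
        token = get_token_from_header(request)   # assumed helper
        validate_service_api_key(token)          # assumed helper: any valid API key
        return f(*args, **kwargs)
    return wrapper


def requires_admin_auth(f):
    @wraps(f)
    def wrapper(*args, **kwargs):
        token = get_token_from_header(request)   # assumed helper
        if not is_admin_client_token(token):     # assumed helper: admin client's key only
            abort(403)
        return f(*args, **kwargs)
    return wrapper
```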
We want to log the usage of the various API clients we have so that we understand when they can be cycled.
To this end we are going to count usage in statsd.
All notify clients have a user agent of the format NOTIFY-API-{LANGUAGE}-CLIENT/version.number
For example, NOTIFY-API-PYTHON-CLIENT/3.0.0
We convert that into a statsd/graphite-friendly string of the format notify-api-python-client.3-0-0,
so that we can subdivide by client and client version on our dashboards.
User agents that are present but not recognised are recorded as "non-notify-user-agent".
Missing user agents are recorded as "unknown".
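A sketch of the user agent to statsd key conversion described above; the function name and regex are illustrative rather than the actual implementation:

```python
import re

NOTIFY_CLIENT_RE = re.compile(r'^(NOTIFY-API-[A-Z]+-CLIENT)/([\d.]+)$')


def statsd_key_from_user_agent(user_agent):
    if user_agent is None:
        return 'unknown'
    match = NOTIFY_CLIENT_RE.match(user_agent)
    if not match:
        return 'non-notify-user-agent'
    client, version = match.groups()
    # e.g. NOTIFY-API-PYTHON-CLIENT/3.0.0 -> notify-api-python-client.3-0-0
    return '{}.{}'.format(client.lower(), version.replace('.', '-'))
```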
We're formally using the ISO 8601 UTC datetime format, and so the
correct way to output the data is by appending the timezone.
("Z" in the case of UTC*).
Unfortunately, Python's `datetime` formatting omits the timezone when the
datetime is naive (which ours are), so we just have to append the string
"Z" to the end of every datetime string we output.
Should be fine, as we only ever output UTC timestamps anyway.
* https://en.wikipedia.org/wiki/ISO_8601#UTC
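In practice that convention looks something like this (the helper name is illustrative):

```python
from datetime import datetime


def format_datetime(dt):
    # naive UTC datetime -> ISO 8601 string with an explicit UTC designator
    return dt.isoformat() + 'Z'


# e.g. format_datetime(datetime(2016, 11, 10, 12, 0)) == '2016-11-10T12:00:00Z'
```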
The new 'v2' API wants to return less data than the previous one,
which was sending back tons of fields the clients never used.
This new route returns only useful information, with the JSON
response dict being built up in the model's `.serialize()` method.
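A hypothetical sketch of what such a `.serialize()` method could look like; the field names here are assumptions, not the real v2 schema:

```python
class Notification:
    def __init__(self, id, client_reference, status, template_id,
                 template_version, created_at, sent_at=None):
        self.id = id
        self.client_reference = client_reference
        self.status = status
        self.template_id = template_id
        self.template_version = template_version
        self.created_at = created_at
        self.sent_at = sent_at

    def serialize(self):
        # only the fields clients actually use, instead of the full v1 payload
        return {
            'id': str(self.id),
            'reference': self.client_reference,
            'status': self.status,
            'template': {'id': str(self.template_id), 'version': self.template_version},
            'created_at': self.created_at.isoformat() + 'Z',
            'sent_at': self.sent_at.isoformat() + 'Z' if self.sent_at else None,
        }
```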
Note that writing the test for this was a bit painful because of
having to treat loads of keys differently. Hopefully this is a good
way to write the test; if we decide it isn't, we should start
thinking of a better way to check that the values are what we
expect.
- Use these validation methods in post_sms_notification and version 1 of post_notification.
- Create v2 error handlers.
- InvalidRequest has a to_dict method for the private and v1 error responses and a to_dict_v2 method to create the v2 error responses (see the sketch after this list).
- Each validation method has extensive unit tests, so the unit tests for the endpoint do not need to check every error case, only that the error handler formats the message correctly.
- The format of the error messages is still a work in progress.
- This version of the API can be deployed without causing problems for the application.
- The new endpoint is still a work in progress and is not being used yet.
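A hedged sketch of how InvalidRequest and a v2 error handler might fit together; since the error format is still in flux, the field names here are assumptions:

```python
from flask import jsonify


class InvalidRequest(Exception):
    def __init__(self, message, status_code):
        super().__init__()
        self.message = message
        self.status_code = status_code

    def to_dict(self):
        # private / v1 error responses
        return {'result': 'error', 'message': self.message}

    def to_dict_v2(self):
        # v2 error responses (exact shape still being worked out)
        return {
            'status_code': self.status_code,
            'errors': [{'error': self.__class__.__name__, 'message': self.message}],
        }


def register_errors(blueprint):
    @blueprint.errorhandler(InvalidRequest)
    def invalid_request(error):
        # the v2 error handler only has to format the message consistently
        return jsonify(error.to_dict_v2()), error.status_code
```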
- This allows us to send a notification to a provider by means of an API call.
- This is in addition to the Celery code.
- The idea is that we can use this method to help speed up throughput by generating API traffic from node/lambda etc. to supplement the Celery code in times of high load.
It was causing a bug where a local variable `service` was not being
instantiated and we were trying to operate on the blueprint instead.
It's being used in so few places that it makes sense to rename it.