Years ago we started to implement a way to schedule a notification. We hit a problem but we never came up with a good solution and the feature never made it back to the top of the priority list.
This PR removes the code for `scheduled_for`. There will be another PR to drop the `scheduled_notifications` table and remove the `schedule_notifications` service permission.
Unfortunately, I don't think we can remove the `scheduled_for` attribute from the notification.serialized method because our clients might fail if something is missing. For now I have left it in but defaulted the value to None.
By serialising these straight away we can:
- not go back to the database later, potentially closing the connection sooner
- potentially cache the serialised data, meaning we don’t touch the database at all
This commit changes the code in the post notification endpoint to handle a
serialised template (i.e. a `dict`) rather than a database object.
This is the first step towards being able to cache the template and not
hit the database on every request.
There should be no functional changes here, it’s just refactoring.
There are some changes to the tests where the signature of functions
has changed.
Importing the template schema has to be done at function level,
otherwise Marshmallow gets weird.
This commit also copies the `JSONModel` class from the admin app, which
turns serialised data (a dict made from JSON) into an object on which
certain predefined properties are allowed.
This means we can still do the caching of serialised data, without
having to change too much of the code in the app, or make it ugly by
sprinkling dict lookups everywhere.
We’re not copying all of JSONModel from the admin app, just the bits we
need. We don’t need to compare or hash these objects, they’re just used
for lookups. And redefining `__getattribute__` scares Leo.
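For illustration, here is a minimal, hedged sketch of the kind of class being copied, not the actual admin-app code; the `ALLOWED_PROPERTIES` names and the `SerialisedTemplate` subclass are assumptions.
```
# Minimal sketch of the copied idea, not the actual admin-app code.
class JSONModel:

    ALLOWED_PROPERTIES = set()

    def __init__(self, _dict):
        # Keep the raw serialised data and expose only the predefined properties
        self._dict = _dict or {}
        for prop in self.ALLOWED_PROPERTIES:
            setattr(self, prop, self._dict.get(prop))


class SerialisedTemplate(JSONModel):
    # Property names here are illustrative assumptions
    ALLOWED_PROPERTIES = {'id', 'name', 'template_type', 'content', 'subject', 'version'}


# Attribute access instead of sprinkling dict lookups everywhere
template = SerialisedTemplate({'id': '1234', 'template_type': 'email', 'content': 'Hello ((name))'})
assert template.template_type == 'email'
```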
In the previous PR I removed the `update_notification` method to reduce the need for another update query. However, that meant the notification was marked as delivered without an updated_at timestamp.
It is weird to set the updated_at when we create the notification. So is this a better fix? Or do I put the update back now?
I recommend we push this fix now.
After the commit we issue two calls to the db, to get the service and to get the notification. This is because, after a commit, the ORM wants to ensure that the data model objects are the latest.
So far this is just a proof of concept: the letter flow needs to be updated, we should be able to get rid of research mode, and it needs some tidying up.
Two new metrics (sketched below):
- `auth_db_connection_duration_seconds` (histogram): wraps the first DB call of post notifications. This includes waiting to get a connection from the pool, and also making the actual request to the db to retrieve the service and api keys. (I'm not sure there's an easy way to separate these two things.)
- `post_notification_json_parse_duration_seconds`: wraps the v2 post notifications JSON parsing and schema validation. Shouldn't include any async code.
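A hedged sketch of recording these timings, assuming the prometheus_client library; the real metrics plumbing in the app may differ, and the helper functions here are illustrative stand-ins.
```
import json

from prometheus_client import Histogram

AUTH_DB_CONNECTION_DURATION_SECONDS = Histogram(
    'auth_db_connection_duration_seconds',
    'Time to check out a db connection and fetch the service and api keys',
)

POST_NOTIFICATION_JSON_PARSE_DURATION_SECONDS = Histogram(
    'post_notification_json_parse_duration_seconds',
    'Time to parse and schema-validate the v2 post notification JSON',
)


def fetch_service_and_api_keys(fetch_from_db):
    # Covers waiting for a connection from the pool as well as running the query itself
    with AUTH_DB_CONNECTION_DURATION_SECONDS.time():
        return fetch_from_db()


def parse_post_notification_json(raw_body, validate):
    # Covers the v2 JSON parsing and schema validation; no async code inside
    with POST_NOTIFICATION_JSON_PARSE_DURATION_SECONDS.time():
        data = json.loads(raw_body)
        validate(data)
        return data
```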
For services that have permission to send international letters we
should not reject letters that are addressed to another country. We
should still reject letters that are badly-addressed.
We’ve added some new properties to the templates in utils that we can
use instead of doing weird things like
`WithSubjectTemplate.__str__(another_instance)`
This is part of moving away from `postcode` and towards `address line 7`.
We think it will be easier for people to map their existing data to our
API if we let them fill in any 3 lines, instead of requiring at least
1, 2, and postcode specifically.
We don’t need to reformat the postcode here, since template preview takes
care of it when rendering the PDF.
It’s better (and less code) to store what people give us, so we give
them back the same thing.
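A hedged sketch of the relaxed "any 3 lines" check described above; the personalisation key names and the error raised are assumptions, not the API's actual behaviour.
```
ADDRESS_LINE_KEYS = ['address_line_' + str(i) for i in range(1, 8)]  # address_line_1 .. address_line_7


def validate_address_lines(personalisation):
    filled_lines = [
        personalisation[key].strip()
        for key in ADDRESS_LINE_KEYS
        if personalisation.get(key) and personalisation[key].strip()
    ]
    # Any 3 non-empty lines are enough, rather than requiring lines 1, 2 and a postcode.
    # We store whatever the caller gives us; template preview reformats the postcode
    # when rendering the PDF.
    if len(filled_lines) < 3:
        raise ValueError('Address must be at least 3 lines')
    return filled_lines
```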
This has been replaced by a new task, `sanitise-letter`, so this deletes
all the code in the old task and ensures that when antivirus is not
enabled locally we call the new task.
SQS fails if the message body is over 256 KB. Normally our messages are
quite small, but if we're using the new save-api-email task with an
email that has a large body, we can go over that limit. If so, handle
the exception and fall back to the existing code path (saving to the
database and sending a deliver-email task, which only has a notification
id).
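A hedged sketch of the fallback, not the real implementation: the task names, the helper functions and the exact exception raised by the SQS broker are all assumptions.
```
from botocore.exceptions import ClientError  # assumed to be what the SQS broker raises


def queue_or_persist(serialised_notification, save_api_email, deliver_email, persist_notification):
    """save_api_email / deliver_email stand in for Celery tasks; persist_notification writes to the db."""
    try:
        # Preferred path: hand the whole serialised notification to the queue so a
        # worker can persist it later, keeping this request off the database
        save_api_email.apply_async([serialised_notification], queue='save-api-email')
    except ClientError:
        # SQS rejects message bodies over 256 KB (e.g. an email with a very large body),
        # so fall back to saving now and queuing a deliver-email task that only
        # carries the notification id
        notification = persist_notification(serialised_notification)
        deliver_email.apply_async([str(notification.id)], queue='send-email')
```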
Instead of saving the email notification to the db, add it to a queue to be saved later.
This is an attempt to alleviate pressure on the db from the api requests.
This initial PR is to trial it and see if we get an improvement in api performance and a reduction in queue pool errors. If we are happy with this we could remove the hard coding of the service id.
In a nutshell (see the sketch after this list):
- If POST /v2/notification/email is from our high volume service (hard coded for now) then create a notification and send it to a queue, so the notification can be persisted to the db later.
- create a save_api_email task to persist the notification
- return the notification
- New worker app to process the save_api_email tasks.
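A hedged sketch of that branch; the hard-coded service id shown here is only a placeholder, and the helper and task names are assumptions.
```
HIGH_VOLUME_SERVICE_ID = '00000000-0000-0000-0000-000000000000'  # placeholder for the hard-coded id


def post_email_notification(service, form, create_notification, save_api_email, persist_notification):
    """create_notification builds a notification dict; the other arguments stand in for the task and the db write."""
    notification = create_notification(service, form)
    if str(service.id) == HIGH_VOLUME_SERVICE_ID:
        # Send to a queue; the new worker app runs the save_api_email task to persist it
        save_api_email.apply_async([notification], queue='save-api-email')
    else:
        # Existing path: persist to the db as part of the request
        persist_notification(notification)
    return notification
```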
When a precompiled letter is sent via the admin app, we now pass in the address, which can be set in the Notifications.to field.
Once a precompiled letter sent by the API has passed validation, we can set the address in the Notifications.to field.
The celery tasks to validate precompiled letters sent by the API will be done in another PR.
By adding `force=True` to `request.get_json()` the mime type is ignored. If the data is not valid JSON the method raises a `BadRequestError`; we catch that and raise our own error with a clear error message, "Invalid JSON supplied in POST data".
If the JSON is valid we return the JSON data, or an empty dict if None is passed in.
This PR improves the error messages when the JSON is invalid; previously the error message was a "None object type" message, which is not very helpful.
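A hedged sketch of the pattern described above; the helper name `get_valid_json` and the `BadRequestError` class (with this signature) are assumptions.
```
from flask import request
from werkzeug.exceptions import BadRequest


class BadRequestError(Exception):
    # Assumed shape of the app's own error class
    def __init__(self, message, status_code=400):
        super().__init__(message)
        self.message = message
        self.status_code = status_code


def get_valid_json():
    try:
        # force=True means the Content-Type header is ignored
        data = request.get_json(force=True)
    except BadRequest:
        raise BadRequestError(message="Invalid JSON supplied in POST data", status_code=400)
    return data if data is not None else {}
```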
We want a way of getting the hidden precompiled template from admin,
so this adds an endpoint which gets the template or creates it if it doesn't
exist. The function to get or create the hidden template already existed
but has been moved to the template DAO now that it is used in more
places.
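A hedged sketch of the get-or-create behaviour described; the DAO name, columns and template defaults are assumptions based on the description, not the actual schema.
```
def dao_get_or_create_precompiled_letter_template(db_session, Template, service_id):
    # Look for the hidden letter template for this service
    template = (
        db_session.query(Template)
        .filter_by(service_id=service_id, template_type='letter', hidden=True)
        .first()
    )
    if template is None:
        # Create it on first use so admin always gets a template back
        template = Template(
            service_id=service_id,
            template_type='letter',
            hidden=True,
            name='Pre-compiled PDF',
            content='',
        )
        db_session.add(template)
        db_session.commit()
    return template
```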
Previously we didn't create templated letters, and just marked them as
delivered straight away. However, we may need to return PDFs for these
letters, so we should create them in the same way as live letters, then
update the functions so that they know where to look for these letters.
Antivirus is sometimes tough to get running locally, so in dev
antivirus is now skipped unless `ANTIVIRUS_ENABLED=1` is set on the command
line. In all other environments it is always enabled.
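A hedged sketch of the config switch, assuming Flask-style config classes; the class names here are assumptions.
```
import os


class Config:
    ANTIVIRUS_ENABLED = True  # always on outside development


class Development(Config):
    # Skipped locally unless explicitly enabled by setting ANTIVIRUS_ENABLED=1 on the command line
    ANTIVIRUS_ENABLED = os.environ.get('ANTIVIRUS_ENABLED') == '1'
```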
We need to set the `template_with_content` placeholder values from an
updated personalisation dictionary after processing the document
uploads, to avoid returning base64 file data instead of
the document URLs in the response contents.
Catches the requests exceptions for document-download-api calls, logs
a warning and returns a matching response code and message.
Connection errors to document download result in a 503 response to the
user.
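A hedged sketch of that error handling; the client function, the response shape and the error class are assumptions.
```
import requests


class DocumentDownloadError(Exception):
    def __init__(self, message, status_code):
        self.message = message
        self.status_code = status_code


def upload_document(upload_url, file_data, logger):
    try:
        response = requests.post(upload_url, json={'document': file_data})
        response.raise_for_status()
    except requests.ConnectionError:
        # Connection problems surface to the user as a 503
        logger.warning('Connection error to document-download-api')
        raise DocumentDownloadError('Service is temporarily unavailable', 503)
    except requests.HTTPError as e:
        # Pass through the response code and message from document-download-api
        logger.warning('document-download-api returned %s', e.response.status_code)
        raise DocumentDownloadError(e.response.json().get('error', 'Upload failed'), e.response.status_code)
    return response.json()['document']['url']  # assumed response shape
```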
Adds support for a new personalisation value type: file upload.
File uploads are represented as a dictionary with a "file" key, whose
value is the base64-encoded file data:
```
personalisation={
    'field1': {'file': '<base64-encoded file contents>'}
}
```
The post notification endpoint checks the request personalisation data
for file uploads. If any are found and the service has permission to
upload documents, the files are sent to the document download API and
the personalisation values are replaced with the URLs returned in the
document download response.
A fake document URL is returned for simulated notifications; no
documents are stored in Document Download.
Multiple files can be uploaded for one notification by providing
a file upload in more than one personalisation field.
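A hedged sketch of that flow; the permission name, the client interface and the fake URL used for simulated notifications are assumptions.
```
def process_document_uploads(personalisation, service, document_download_client, simulated=False):
    personalisation = dict(personalisation or {})
    file_keys = [
        key for key, value in personalisation.items()
        if isinstance(value, dict) and 'file' in value
    ]
    if not file_keys:
        return personalisation

    if 'upload_document' not in service.permissions:
        raise ValueError('Service is not allowed to send documents')

    for key in file_keys:
        if simulated:
            # Don't store anything in Document Download for simulated notifications
            personalisation[key] = 'https://documents.example.com/fake-document-url'
        else:
            # Replace the base64 file data with the URL returned by Document Download
            personalisation[key] = document_download_client.upload_document(
                service.id, personalisation[key]['file']
            )
    return personalisation
```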
- Separated the logic of precompiled and templated letters.
- Remove the check for research mode; research mode is not relevant to api calls. The test key is used for testing.
- Refactor `upload_pdf_letter` to accept a precompile boolean, to save a query for the template.