We need to group the letters into several files because the SQS task payload has to stay under a certain size. We also compress the task payload when sending it.
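The compression might look something like this minimal sketch; the helper name and the queue handling are assumptions for illustration, not the app's actual code.

```python
# A minimal sketch of compressing a task payload before sending it to
# SQS. The helper name and queue handling are hypothetical.
import base64
import json
import zlib

import boto3

SQS_MESSAGE_LIMIT_BYTES = 256 * 1024  # SQS caps a single message at 256KB


def send_compressed_task(queue_url, task_body):
    # Serialise, deflate, then base64-encode so the payload is valid
    # text for an SQS message body.
    payload = base64.b64encode(zlib.compress(json.dumps(task_body).encode())).decode()

    if len(payload) > SQS_MESSAGE_LIMIT_BYTES:
        raise ValueError("task payload still too large for SQS after compression")

    boto3.client("sqs").send_message(QueueUrl=queue_url, MessageBody=payload)
```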
Add a collate-letter-pdfs task (name pending). This retrieves a list of letter PDF files from S3 (just the metadata, not the actual data) and loops through them, calling the FTP task zip-and-send-letter-pdfs. It groups the files into lists while counting their total filesize; if the total goes over a certain limit (currently set to 500MB), it breaks off that chunk, sends that list of files to the FTP app, and then starts building up a new list, as sketched below.
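A minimal sketch of that loop, assuming the S3 listing yields objects with `Key` and `Size` fields (as boto3's `list_objects_v2` does) and that `zip_and_send_letter_pdfs` takes a list of filenames; its exact signature and import path are assumptions.

```python
# Sketch of the collate loop described above. `zip_and_send_letter_pdfs`
# is the FTP task named earlier; its signature and import path are
# assumptions for illustration.
from app.celery.tasks import zip_and_send_letter_pdfs  # assumed import path

MAX_CHUNK_BYTES = 500 * 1024 * 1024  # 500MB for now, see the note below


def collate_letter_pdfs(file_metadata):
    chunk, chunk_bytes = [], 0
    for meta in file_metadata:  # S3 object metadata: {"Key": ..., "Size": ...}
        chunk.append(meta)
        chunk_bytes += meta["Size"]
        # Once the running total goes over the limit, break off this chunk,
        # send it to the FTP app, and start building a new list.
        if chunk_bytes > MAX_CHUNK_BYTES:
            zip_and_send_letter_pdfs.apply_async(
                kwargs={"filenames": [m["Key"] for m in chunk]}
            )
            chunk, chunk_bytes = [], 0
    if chunk:  # send whatever is left over
        zip_and_send_letter_pdfs.apply_async(
            kwargs={"filenames": [m["Key"] for m in chunk]}
        )
```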
DVLA have a hard 2GB limit on how big the zip files we send can be. However, we're going to be limited by the amount of memory on the FTP app well before we get around to handling 2GB of PDF data, so the limit is 500MB for now. We'll adjust it after we see how the FTP app performs.
- Introduce a `_raise` flag for `get_notification_by_id` so that SQLAlchemy raises the `NoResultFound` error itself rather than the app raising its own (see the first sketch after this list)
- Refactor `dao_set_created_live_letter_api_notifications_to_pending` to use a join to find services that don't have the `letters_as_pdf` permission, as it's marginally faster (see the second sketch after this list)
- Added a `has_permission` helper in models.py to check whether a service has a given permission (see the third sketch after this list)
- Moved letters PDF tasks to a separate file
- Moved letters PDF tests to their own file
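A minimal sketch of the `_raise` flag, assuming a Flask-SQLAlchemy `Notification` model; the query details are illustrative:

```python
from sqlalchemy.orm.exc import NoResultFound  # what .one() raises on no rows

from app.models import Notification  # assumed import path


def get_notification_by_id(notification_id, _raise=False):
    query = Notification.query.filter_by(id=notification_id)
    # .one() lets SQLAlchemy raise NoResultFound itself; .first() keeps
    # the old behaviour of returning None when there's no match.
    return query.one() if _raise else query.first()


# Callers that want the error can opt in:
# try:
#     notification = get_notification_by_id(some_id, _raise=True)
# except NoResultFound:
#     ...
```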
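A rough sketch of the join: an outer join to the permission table plus an `IS NULL` filter (an anti-join) picks out services without `letters_as_pdf`. This shows only the query side of the dao function; the function and model names beyond the permission value are assumptions:

```python
# Anti-join sketch: services without a matching `letters_as_pdf` row come
# back with NULLs, which the filter keeps. Names here are illustrative.
from sqlalchemy import and_

from app.models import Notification, ServicePermission  # assumed import paths


def _letter_notifications_for_non_pdf_services():
    return (
        Notification.query.outerjoin(
            ServicePermission,
            and_(
                ServicePermission.service_id == Notification.service_id,
                ServicePermission.permission == "letters_as_pdf",
            ),
        )
        .filter(ServicePermission.service_id.is_(None))
        .all()
    )
```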
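And a minimal sketch of the helper, assuming the permission rows hang off the service as a `permissions` relationship; the columns shown are illustrative:

```python
from app import db  # assumed import path


class Service(db.Model):
    __tablename__ = "services"

    id = db.Column(db.Integer, primary_key=True)
    # Assumed relationship to this service's permission rows.
    permissions = db.relationship("ServicePermission")

    def has_permission(self, permission):
        # True if any of this service's permission rows matches the name.
        return permission in (p.permission for p in self.permissions)
```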