When letters are sent to DVLA, we will now put them in a separate
ZIP file for each service, so that if bad files from one service
cause printing issues, other services are hopefully not affected.
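Roughly, the grouping could look something like this - a minimal
sketch, not the real task; the tuple shape and function name are
assumptions:

```python
import io
import zipfile
from collections import defaultdict


def zip_letters_by_service(letters):
    # `letters` is assumed to be an iterable of (service_id, filename,
    # pdf_bytes) tuples; group them so each service gets its own ZIP.
    grouped = defaultdict(list)
    for service_id, filename, pdf_bytes in letters:
        grouped[service_id].append((filename, pdf_bytes))

    zips = {}
    for service_id, files in grouped.items():
        buffer = io.BytesIO()
        with zipfile.ZipFile(buffer, "w") as archive:
            for filename, pdf_bytes in files:
                archive.writestr(filename, pdf_bytes)
        zips[service_id] = buffer.getvalue()
    return zips
```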
When we create a broadcast message, we should invoke the CBC Proxy to
send a CAP message.
Either a function is invoked within AWS, or a no-op function call is
made, depending on the environment.
We have only implemented CB message creation in the CBC Proxy, without
polygons, therefore we:
* only invoke the CBC Proxy during message creation
* only send the description, identifier, and a hard-coded headline
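Very roughly, the environment switch looks something like this (a
sketch: the class names and lambda invocation details are assumptions,
not the actual CBC Proxy client):

```python
import json

import boto3


class CBCProxyNoopClient:
    """Used in environments without a CBC connection: does nothing."""

    def create_and_send_broadcast(self, identifier, headline, description):
        pass


class CBCProxyLambdaClient:
    """Invokes a function within AWS to send the CAP message."""

    def __init__(self, function_name):
        self._lambda = boto3.client("lambda")
        self._function_name = function_name

    def create_and_send_broadcast(self, identifier, headline, description):
        self._lambda.invoke(
            FunctionName=self._function_name,
            InvocationType="Event",
            Payload=json.dumps({
                "identifier": identifier,
                "headline": headline,  # hard-coded by the caller for now
                "description": description,
            }),
        )
```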
Signed-off-by: Toby Lorne <toby.lornewelch-richards@digital.cabinet-office.gov.uk>
Co-authored-by: Pea <pea.tyczynska@digital.cabinet-office.gov.uk>
Co-authored-by: Katie <katie.smith@digital.cabinet-office.gov.uk>
We have had a few instances where letters have caused problems,
particularly for precompiled letters, and often the issue comes from
the same service.
The hope is that adding a sort order will help the print provider
narrow down the problem.
There is a small degradation in the performance of the query, but it's
not enough to concern me.
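As a sketch of what the sort order means in practice (table and column
names are assumptions; the real query lives in the dao layer):

```python
from sqlalchemy import text

# Collate the letters for printing grouped by service, so any problem
# files from one service sit together rather than being scattered
# through the whole print run.
GET_LETTERS_TO_PRINT = text("""
    SELECT id, service_id, reference
      FROM notifications
     WHERE notification_type = 'letter'
       AND notification_status = 'created'
  ORDER BY service_id, created_at
""")
```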
Broadcast services only have broadcast templates. But on the frontend
we show the template type under the name of the template. This is
redundant. It would be better to preview the content of the template
instead.
At the moment we can’t do this because we optimised the template schema
response to not include the content of the templates in
https://github.com/alphagov/notifications-api/pull/2880
This commit reverses that optimisation, for broadcast templates only,
and expands the tests to cover all template types.
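The shape of the change is roughly this - a sketch, not the real
schema code; the field names are assumptions:

```python
def serialize_template(template):
    data = {
        "id": str(template.id),
        "name": template.name,
        "template_type": template.template_type,
    }
    # Only broadcast templates get their content back, so other
    # template types keep the slimmer response from the optimisation.
    if template.template_type == "broadcast":
        data["content"] = template.content
    return data
```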
While the API still has a column and field for personalisation, I
think it makes sense for it to consider the personalisation when
serialising the broadcast, so we should have a test for this.
This way the API still works as a coherent whole, and the admin app just
happens to be a client of the API which doesn’t implement the
personalisation feature.
If we want to remove personalisation from the API at another time we
should do it wholesale.
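The test could look something like this - a sketch only; the fixture
name and the serialize method are assumptions about the shape of the
existing code:

```python
def test_serialized_broadcast_includes_personalisation(
    sample_broadcast_notification,
):
    sample_broadcast_notification.personalisation = {"name": "Test"}

    serialized = sample_broadcast_notification.serialize()

    assert serialized["personalisation"] == {"name": "Test"}
```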
This is so that when users reset their password they are still
redirected to the pages they were meant to visit.
This change was made specifically so that everyone who is meant to see
the broadcast tour sees it, but it will improve the lives of all users
who wanted to visit a page on Notify but then had to reset their
password in the process.
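A minimal sketch of the idea (not the real admin-app views; the route
names are assumptions): carry the `next` parameter through the
reset-password flow and only follow relative URLs.

```python
from flask import Flask, redirect, request, url_for

app = Flask(__name__)


@app.route("/new-password", methods=["POST"])
def new_password():
    # ...reset the user's password here...
    next_url = request.args.get("next")
    if next_url and next_url.startswith("/"):
        # Only follow relative URLs, so we never redirect off Notify.
        return redirect(next_url)
    return redirect(url_for("show_accounts_or_dashboard"))


@app.route("/")
def show_accounts_or_dashboard():
    return "signed in"
```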
At the moment we display the count of scheduled jobs on the dashboard
by sending all the scheduled jobs to the admin app and letting it work
out the stats.
This is inefficient and, because the get jobs response has a page size
of 50, becomes incorrect if a service schedules more than 50 jobs.
This commit adds a separate endpoint which gives the admin app the stats
it needs directly and correctly.
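The query behind the new endpoint could be as simple as this (a
sketch, assuming a SQLAlchemy `Job` model and `db` session like the
rest of the API; exactly which stats are returned is an assumption):

```python
from sqlalchemy import func


def dao_get_scheduled_job_stats(service_id):
    # Count the scheduled jobs (and find the soonest one) in the
    # database, instead of returning a page of 50 jobs for the admin
    # app to count.
    return db.session.query(
        func.count(Job.id),
        func.min(Job.scheduled_for),
    ).filter(
        Job.service_id == service_id,
        Job.job_status == "scheduled",
    ).one()
```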
Since we’ve doubled the number of rows in a job, jobs can take twice as
long to insert all the notifications. We don’t check for missing rows
until we’re pretty confident that the original tasks have finished
processing. This means we need to double the time we wait to still be
as sure.
We don't name letters based on the day we send them, but rather the
day we create them. If we process a letter for a second time for
whatever reason, even if it's a couple of days later, it'll still go
in a folder based on the created_at timestamp. There's still a slight
source of confusion: if the timestamp is after 5:30pm, the folder will
be for the day after. But that's still derived from the day of
creation, so I think created_at still makes the most sense.
Remove the term `sending_date` to try and make this relationship more
apparent.
`_now`? Why would we ever use a different `_now`? Instead say
`created_at`, because that's what it'll always be set to, even if
we're replaying old letters. We always set the folder name based on
when the letter was created, or we might not know where to look to
find it.
`dont_use_sending_date` doesn't really tell us what happens if we
don't use it - the answer is that we return an empty string and ignore
the folder entirely. So let's call it that.
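Put together, the behaviour described above is roughly this (a sketch;
the function and parameter names are assumptions, and the real code
also has to worry about BST, which is glossed over here):

```python
from datetime import time, timedelta


def get_letter_folder_name(created_at, ignore_folder=False):
    # Ignoring the folder means returning an empty string.
    if ignore_folder:
        return ""
    # Letters created after 5:30pm go in the next day's folder.
    if created_at.time() > time(17, 30):
        created_at += timedelta(days=1)
    return created_at.strftime("%Y-%m-%d")
```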
Also, remove the use of freezegun in the tests, to prove that we don't
use the current time in any calculations, and add an assert to a mock
in the get_pdf_for_templated_letter test, because we were mocking but
not asserting before, so the tests didn't fail when the function
signature changed.
We were determining the filename for precompiled letters before we had
checked if the letters were international. This meant that a letter
could have a filename indicating it was 2nd class, but once we had
sanitised the letter and checked the address we were setting the
notification to international.
This stopped these letters from being picked up to be sent to the DVLA,
since the filename and postage of the letter did not match.
We now regenerate the filename after the letter has been sanitised (and when
we know the postage) and use the updated filename when moving the letter
into the live PDF letters bucket.
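A sketch of where the regeneration now happens (the helper names and
response shape are assumptions, not the real task):

```python
def process_sanitised_letter(notification, sanitise_response):
    # Sanitisation tells us whether the address is international, which
    # can change the postage - so only now do we know the right filename.
    if sanitise_response["international"]:
        notification.international = True
        notification.postage = sanitise_response["postage"]

    filename = generate_letter_pdf_filename(
        reference=notification.reference,
        created_at=notification.created_at,
        postage=notification.postage,
    )
    move_letter_to_live_pdf_bucket(filename)
```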
It is not of the form [[lat, long], [lat, long]], as this would only
hold a single polygon. It instead needs to handle multiple polygons,
so it is of the form [[[lat, long], [lat, long]]].
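For example, an area made up of two triangles would look like this
(coordinates made up):

```python
areas = [
    [[51.50, -0.12], [51.51, -0.12], [51.51, -0.11]],  # first polygon
    [[52.20, 0.12], [52.21, 0.12], [52.21, 0.13]],     # second polygon
]
```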
Our code was assuming that any notifications with `international` set to
`True` were text messages. It was then trying to look up delivery
information for a notification which wasn’t sent to a phone number,
causing an exception.
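A sketch of the kind of guard meant here (the names are assumptions,
not the actual Notify code):

```python
def get_international_delivery_info(notification):
    # Only international text messages have phone-number delivery
    # information to look up; international letters do not.
    if notification.international and notification.notification_type == "sms":
        return get_international_phone_info(notification.to)
    return None
```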
`international` for letters in `ft_billing` was always False. Now that
letters can be international, this changes the column value to the value
of `international` for the notification.
We want to display flash messages in admin when invites have been
cancelled. This message needs to display the user's email address, so
this commit adds endpoints to GET a single invited service user and org
user so that we can look up the email address of a cancelled user.
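The new endpoints have roughly this shape (a sketch; the blueprint,
dao function and schema names are assumptions):

```python
from flask import Blueprint, jsonify

service_invite_blueprint = Blueprint("service_invite", __name__)


@service_invite_blueprint.route(
    "/service/<uuid:service_id>/invite/<uuid:invited_user_id>",
    methods=["GET"],
)
def get_invited_user(service_id, invited_user_id):
    # Return a single invited user so the admin app can show their
    # email address in the flash message.
    invited_user = dao_get_invited_service_user(service_id, invited_user_id)
    return jsonify(data=invited_user_schema.dump(invited_user)), 200
```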