`jsonschema[format]` includes all of jsonschema's formatting dependencies,
meaning that we don't have to specify `rfc3339-validator` and `rfc3987`
ourselves in the requirements.in file. It also means that if jsonschema's
underlying formatting packages ever change, we will pick up the new ones
automatically rather than only discovering later that we needed to swap a
package.
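For illustration only (the real file has version pins and many other entries), the requirements.in change amounts to something like:

```
# requirements.in (illustrative - exact contents and pins may differ)
jsonschema[format]    # instead of jsonschema plus rfc3339-validator and rfc3987
```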
beautifulsoup4 now uses `charset-normalizer` instead of `chardet` by default
when it is installed (https://pyup.io/changelogs/beautifulsoup4/#4.11.0). We
do have `charset-normalizer` installed, because it's a subdependency of the
requests library, so it is the detector that gets used.
This caused `test_content_too_long_returns_400` to fail, since the detector
now thinks the encoding of `ŵ` is `{'encoding': 'Big5',
'language': 'Chinese', 'confidence': 1.0}`.
There are two options for fixing this:
- change the test content so that it doesn't just contain a single
letter (the docs state that you shouldn't run character detection on
very tiny content)
- add `chardet` as a requirement, so that the code functions exactly the
same as before
I've chosen the first option, since this avoids adding a dependency and
we should never have messages consisting of a single character.
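As a sketch of the "very tiny content" point (not the actual test code), bs4's `UnicodeDammit` is the layer that calls out to the detector, and a single non-ASCII character simply doesn't give it enough data to guess from:

```python
# Sketch: encoding detection on a single character (not the real test).
from bs4 import UnicodeDammit

tiny = 'ŵ'.encode('utf-8')          # just two bytes of input
dammit = UnicodeDammit(tiny)
print(dammit.original_encoding)     # may well be guessed as Big5, not utf-8
```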
We have three different ways of checking the formats of datetimes.
1. The built-in way that comes with the jsonschema package ("date-time")
2. A new way we added for broadcasts ("datetime") 61a5730596
3. An old way we defined in
"/tests/app/public_contracts/schemas/v0/definitions.json"
In order to simplify things and make it clearer how datetimes are being
validated, this replaces the few places where we were using option 3 with option 1
instead. Option 3 was only being used to validate the initial version of the
API, which is no longer in use.
The big breaking change for our code (not mentioned in the changelog) is
that the built-in validator for the `date-time` format now requires the
`rfc3339-validator` package instead of the `strict-rfc3339` package.
This updates the requirements file to use `rfc3339-validator`. Without
this change, invalid `date-time` values would always silently pass validation.
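As a rough sketch of option 1 (the schema and values here are illustrative, not taken from our codebase), the built-in check only runs when a `FormatChecker` is passed, and only rejects bad values when `rfc3339-validator` is installed:

```python
# Sketch of validating a "date-time" format with jsonschema (option 1).
# Needs rfc3339-validator installed, otherwise the format check is a no-op.
from jsonschema import Draft7Validator, FormatChecker

schema = {"type": "string", "format": "date-time"}
validator = Draft7Validator(schema, format_checker=FormatChecker())

validator.validate("2021-06-01T10:00:00+00:00")   # valid RFC 3339, passes
# validator.validate("01/06/2021 10:00")          # would raise ValidationError
```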
We were using the Draft4Validator in one place, so this updates it to
the Draft7Validator instead.
The schemas were mostly using draft 4 of the JSON schema, though there
were a couple of schemas that were already of version 7. This updates
them all to version 7, which is the latest version fully supported by
the jsonschema Python package. There are some breaking changes in the
newer version of the schema, but I could not see anywhere where these
would affect us. Some of these schemas were not actually valid under
draft 4, because `"required": []` is not allowed there, but they are
valid under draft 7.
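For illustration (a hypothetical schema, not one of ours), the change is essentially the `$schema` bump, and the `"required": []` point can be checked against each draft's meta-schema:

```python
# Hypothetical schema after the bump from draft 4 to draft 7.
from jsonschema import Draft4Validator, Draft7Validator

schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",  # was .../draft-04/schema#
    "type": "object",
    "properties": {"reference": {"type": "string"}},
    "required": [],  # draft 4's meta-schema requires at least one entry here
}

Draft7Validator.check_schema(schema)      # fine from draft 6 onwards
# Draft4Validator.check_schema(schema)    # would raise SchemaError for "required": []
```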
There's no changelog for this, but I've looked through all the commits
and can't see any reason why this needed a major version bump or
anything that should cause us issues.
This is done to make self-service branding easier to implement,
and also because NHS branding makes much more sense for services
in those orgs than GOV.UK branding.
eventlet works by monkey-patching core IO libraries (such as ssl) to be
non-blocking. However, there's currently a bug: the standard ssl library
raises a `socket.timeout` exception when a read times out, but
eventlet.green.ssl's patched version raises an ssl.SSLError('timed out',)
instead. redispy handles socket.timeout but not ssl.SSLError, so we
solve this by monkey patching the monkey-patching code to raise the
correct exception type 😱
Note: This code should _only_ be called when we're using eventlet, or
we'll run into issues with regular code failing with max recursion
errors. With that in mind we put this code in gunicorn_config, as that
isn't imported when we run celery or run flask locally.
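A minimal sketch of the idea (names and placement are illustrative; the real patch may differ), wrapping eventlet's green SSL read so that its 'timed out' `SSLError` comes back out as the `socket.timeout` that redispy expects:

```python
# gunicorn_config.py (sketch) - only ever imported when running under eventlet.
import socket
import ssl

from eventlet.green.ssl import GreenSSLSocket

_green_recv = GreenSSLSocket.recv


def _recv_with_socket_timeout(self, *args, **kwargs):
    try:
        return _green_recv(self, *args, **kwargs)
    except ssl.SSLError as exc:
        if exc.args and exc.args[0] == 'timed out':
            # Translate back to the exception type redispy knows how to handle.
            raise socket.timeout('timed out') from exc
        raise


# "Monkey patch the monkey patching": SSL reads done through eventlet's green
# socket now raise socket.timeout on a read timeout, as the stdlib does.
GreenSSLSocket.recv = _recv_with_socket_timeout
```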
This is to support a migration from Redislabs to PaaS native Redis,
allowing us to toggle between old and new using the env vars for
the instance - without needing to change the code.
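In practice that toggle looks something like the following (the variable name and default are illustrative, not necessarily what our config uses):

```python
# Sketch: the Redis connection string comes from the environment, so switching
# from the Redislabs instance to the PaaS one is a matter of re-pointing the
# env var on that instance - no code change needed.
import os


class Config:
    REDIS_URL = os.environ.get("REDIS_URL", "redis://localhost:6379")
```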
This was inconsistent, with the source data for the fixture being
overridden in some of the tests. We only need to set it in the env
once, so it makes sense to just put the code there.
Fixes this error (in Admin):
File "/home/vcap/app/app/notify_client/notification_api_client.py", line 74, in send_precompiled_letter
return self.post(url='/service/{}/send-pdf-letter'.format(service_id), data=data)
File "/home/vcap/app/app/notify_client/__init__.py", line 59, in post
return super().post(*args, **kwargs)
File "/home/vcap/deps/0/python/lib/python3.9/site-packages/notifications_python_client/base.py", line 48, in post
return self.request("POST", url, data=data)
File "/home/vcap/deps/0/python/lib/python3.9/site-packages/notifications_python_client/base.py", line 64, in request
response = self._perform_request(method, url, kwargs)
File "/home/vcap/deps/0/python/lib/python3.9/site-packages/notifications_python_client/base.py", line 118, in _perform_request
raise api_error
notifications_python_client.errors.HTTPError: 500 - Internal server error
Due to this error (in API):
File "/home/vcap/app/app/service/send_notification.py", line 178, in send_pdf_letter_notification
raise e
File "/home/vcap/app/app/service/send_notification.py", line 173, in send_pdf_letter_notification
letter = utils_s3download(current_app.config['TRANSIENT_UPLOADED_LETTERS'], file_location)
File "/home/vcap/deps/0/python/lib/python3.9/site-packages/notifications_utils/s3.py", line 53, in s3download
raise S3ObjectNotFound(error.response, error.operation_name)
notifications_utils.s3.S3ObjectNotFound: An error occurred (NoSuchKey) when calling the GetObject operation: The specified key does not exist.
I checked the DB to verify the letter does actually exist, i.e. this is
an instance of the problem we're fixing here.
We previously decided against this [^1] but the lambda alerts are
not a replacement for this one.
Even if both sets of alerts go off, we expect PagerDuty to aggregate
them into a single notification.
[^1]: 65c21f694c
It's worth noting this will break the Admin page to edit provider
priorities, which currently works in units of 10. We already have
work planned to fix this [^1] and it's not an immediate problem.
The automatic adjustment algorithm will continue to work properly
as it can cope with increments smaller than 10 [^2].
[^1]: https://www.pivotaltracker.com/story/show/181681739
[^2]: b145a29935/app/dao/provider_details_dao.py (L122)