stream notifications when collating zip files

we hit a problem where, with 150k 2nd class notifications, the collate
task never ran to completion, presumably because the volume of data being
returned was too big.

to try and help with this, we can switch to streaming results rather than
calling `.all` and building up lists of data in memory. This should help,
though the initial query may still be a problem.
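
as a rough sketch of the pattern (illustrative only, not the actual Notify
code: the query object, S3 client, bucket name, `reference` attribute and
function name below are all assumed), the collate helper can yield one
result at a time instead of materialising the whole result set:

from botocore.exceptions import ClientError


def stream_letter_keys_and_sizes(query, s3_client, bucket_name):
    # hypothetical sketch: rather than `rows = query.all()` followed by
    # appending each row to a list, iterate the query lazily and yield each
    # result as it is produced, so memory use stays flat however many
    # notifications match.
    for notification in query.yield_per(1000):  # fetch rows in modest batches
        try:
            metadata = s3_client.head_object(Bucket=bucket_name, Key=notification.reference)
        except ClientError:
            # skip letters whose PDF cannot be found rather than failing the whole run
            continue
        yield {'Key': notification.reference, 'Size': metadata['ContentLength']}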
Leo Hemsted
2020-10-23 10:21:52 +01:00
parent 6fbf12afeb
commit 4b61060d32
4 changed files with 10 additions and 9 deletions


@@ -223,7 +223,9 @@ def test_get_key_and_size_of_letters_to_be_sent_to_print(notify_api, mocker, sam
         {'ContentLength': 3},
     ])
-    results = get_key_and_size_of_letters_to_be_sent_to_print(datetime.now() - timedelta(minutes=30), postage='second')
+    results = list(
+        get_key_and_size_of_letters_to_be_sent_to_print(datetime.now() - timedelta(minutes=30), postage='second')
+    )
     assert mock_s3.call_count == 3
     mock_s3.assert_has_calls(
@@ -284,7 +286,9 @@ def test_get_key_and_size_of_letters_to_be_sent_to_print_catches_exception(
         ClientError(error_response, "File not found")
     ])
-    results = get_key_and_size_of_letters_to_be_sent_to_print(datetime.now() - timedelta(minutes=30), postage='second')
+    results = list(
+        get_key_and_size_of_letters_to_be_sent_to_print(datetime.now() - timedelta(minutes=30), postage='second')
+    )
     assert mock_head_s3_object.call_count == 2
     mock_head_s3_object.assert_has_calls(
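
because the helper now returns a generator rather than a list, the tests above
wrap the call in `list()` to materialise the results before asserting on call
counts and contents.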