Files
notifications-api/app/dao/fact_processing_time_dao.py
Rebecca Law 21edf7bfdd Persist the processing time statistics to the database.
The performance platform is going away soon. Processing time is the only statistic we do not already hold in our database in a form we can query efficiently: any queries on notification_history are too inefficient to run on a web page.
Processing time = for each whole day, the total number of normal/team emails and text messages, plus the number of those messages that went from created to sending within 10 seconds. From these two counts we can easily calculate the percentage of messages that were marked as sending within 10 seconds.
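The percentage calculation described above is straightforward; as a minimal sketch (the helper name is hypothetical, not part of the codebase), it could look like this:

```python
def percentage_within_10_secs(messages_total: int, messages_within_10_secs: int) -> float:
    # Percentage of messages that went from created to sending within 10 seconds.
    # Guard against division by zero on days with no messages.
    if messages_total == 0:
        return 0.0
    return round(messages_within_10_secs / messages_total * 100, 1)
```

For example, 9,500 out of 10,000 messages sent within 10 seconds gives 95.0%.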
2021-02-26 07:49:49 +00:00

30 lines
1.0 KiB
Python

from sqlalchemy.dialects.postgresql import insert

from app import db
from app.dao.dao_utils import transactional
from app.models import FactProcessingTime


@transactional
def insert_update_processing_time(processing_time):
    '''
    This uses the Postgres upsert to avoid race conditions when two threads try to insert
    the same row. The excluded object refers to the values that we tried to insert but were
    rejected.
    http://docs.sqlalchemy.org/en/latest/dialects/postgresql.html#insert-on-conflict-upsert
    '''
    table = FactProcessingTime.__table__
    stmt = insert(table).values(
        bst_date=processing_time.bst_date,
        messages_total=processing_time.messages_total,
        messages_within_10_secs=processing_time.messages_within_10_secs
    )
    stmt = stmt.on_conflict_do_update(
        index_elements=[table.c.bst_date],
        set_={
            'messages_total': stmt.excluded.messages_total,
            'messages_within_10_secs': stmt.excluded.messages_within_10_secs
        }
    )
    db.session.connection().execute(stmt)