In previous iterations of the classPersister, we
found issues where the implementation meant
classes it should have added back to certain
elements were also added to other elements. This
adds tests for this scenario to ensure it doesn't
happen again.
Also includes changes to fix a linting error in
the JS, which complained about a function being
defined in a loop while referencing variables in
the outer scope.
The assumption that the classes you want to
persist will always map one-to-one onto the
elements that have those classes at that point
won't always hold.
Because of that, this changes the way elements
with those classes are stored: they are now kept
in a map from each class to the elements that
have it (at that point).
Also includes an extra test for a scenario where
more than one updating component is in the page
with classes that need to persist through updates.
If the :has pseudo-class becomes available in the
browsers we support in future, the code this
branch/pull request adds will be redundant, as its
behaviour will be possible using CSS alone.
This adds comments noting this against the parts
of the code you'd need to remove.
Includes the following:
Guard for adding duplicate classNames
Stops the code that allows classNames to persist
across updates to the component HTML from adding a
className multiple times to the list of those to
persist. From this comment on the associated pull
request:
https://github.com/alphagov/notifications-admin/pull/4155#discussion_r804639058
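A sketch of the guard, assuming the class names to persist live in a
plain array (the names are illustrative, not the exact implementation):
```javascript
function addClassNameToPersist (classNames, className) {
  // guard: don't add the same class name to the list twice
  if (classNames.indexOf(className) === -1) {
    classNames.push(className);
  }
}
```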
Add comment explaining guard for operations on
elements no longer in the DOM
The value of this guard, and why it is needed,
can be unclear, so we add a comment to explain
this.
From this comment on the associated pull request:
https://github.com/alphagov/notifications-admin/pull/4155#discussion_r804697189
We can't guarantee that elements we stored a
reference to with `classesToPersist.remove` will
still exist, so we need to guard against this.
Note: the guard checks whether the node is still
attached to the DOM rather than whether it exists,
because the standard way to delete a node just
detaches it from the DOM and relies on garbage
collection to remove it from memory.
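A sketch of the guard (`storedElements` and `classNameToPersist` are
placeholders, not the exact variables); `Node.contains` returns false
once a node has been detached from the document, even if the node object
still exists in memory:
```javascript
function isStillInDocument (node) {
  return document.documentElement.contains(node);
}

storedElements.forEach(function (node) {
  if (!isStillInDocument(node)) { return; } // skip nodes removed by an update

  node.classList.add(classNameToPersist);
});
```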
The current updateContent JS replaces the in-page
HTML with the HTML from the server the first time
an AJAX request is fired, even if the HTML from
the server has no changes. This is because the
code that compares the two operates on two
different things:
The HTML in the page is the component HTML, with
all the data attributes and the partial HTML
(marked with the 'ajax-block-container' class) as
its first child:
```
<div data-module="update-content" data-url="...">
<div class="ajax-block-container">
...
</div>
</div>
```
The HTML from the server only contains the
partial:
```
<div class="ajax-block-container">
...
</div>
```
The diffing code just sees them as different at
the top level so replaces the page HTML with the
partial from the server. This means all subsequent
diffs are between partial HTML and partial HTML so
only update on actual changes.
These changes replace the component with the
partial as part of the component initialising.
This means all code that runs on an AJAX response
will only compare like-for-like, so will result in
actual changes (or none at all), not just swapping
one element out for another.
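A sketch of that initialisation step, assuming jQuery and the markup
shown above (the names are illustrative, not the exact code):
```javascript
function replaceComponentWithPartial ($component) {
  // pull out the partial: the first child, marked with ajax-block-container
  var $partial = $component.find('.ajax-block-container').eq(0);

  // swap the component wrapper for the partial so every later diff
  // compares partial HTML against partial HTML
  $component.replaceWith($partial);

  return $partial;
}
```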
Note: this commit also removes the
aria-live="polite" from the ajax_block component.
It has always been overwritten by the first
response so never announces anything to assistive
technologies. Removing it makes this more clear.
Wrap the code that updates the HTML with changes
from the server with code that stores and
re-applies specified classes.
This is to allow other JS to add classes which
change the visual state of the HTML without them
being considered by the code that diffs our
in-page HTML against that from the server.
They are called classesToPersist because this
should make the visual state they create persist
between updates.
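A sketch of the wrapping, assuming morphdom is available to apply the
update and a ClassPersister-style helper with store/restore methods (the
names are illustrative, not the exact API):
```javascript
function updateWithClassesPersisted ($contents, newHTML, classPersister) {
  // remember which elements currently carry the classes we want to persist
  classPersister.store($contents);

  // apply the changes from the server (this may strip those classes)
  morphdom($contents.get(0), newHTML);

  // put the stored classes back so the visual state survives the update
  classPersister.restore();
}
```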
Includes the addition of tests for updateContent
that cover the addition/deletion of elements so we
can write a test for classNames persisting through
updates. The existing tests only cover updates
that change the content of elements. Adding the
test for these changes to those existing tests
would simulate a scenario that doesn't exist in
the app, so writing extra tests for the kind of
updates these changes act on keeps them in line
with the app code.
We added domdiff to replace the DiffDOM library
here:
87f54d1e88
DiffDOM had updated its code to be written to the
ECMAScript 6 (ES6) standard and so needed extra
work to run in the older browsers in our
support matrix. This was recorded as an issue
here:
https://www.pivotaltracker.com/n/projects/1443052/stories/165380360
Domdiff didn't work (see below for more details)
so this replaces it with the morphdom library.
Morphdom supports the same browsers as us and is
relied on by a range of large open source
projects:
https://github.com/patrick-steele-idem/morphdom#what-projects-are-using-morphdom
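A minimal sketch of how morphdom is typically called, assuming morphdom
is available in the page (not the exact updateContent.js code):
```javascript
var existingElement = document.querySelector('.ajax-block-container');
var newHTML = '<div class="ajax-block-container"><p>Updated</p></div>';

// patch existingElement in place so it matches newHTML, only touching
// the parts that actually differ
morphdom(existingElement, newHTML);
```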
It was tricky to find alternatives to DiffDOM so
if we have to source alternatives in future, other
options could be:
- https://github.com/choojs/nanomorph
- https://diffhtml.org/index.html (using its
outerHTML method)
Why domdiff didn't work
Turns out that domdiff was replacing the page HTML
with the HTML from the AJAX response every time,
not just when they differed. This isn't a bug.
Domdiff is bare bones enough that it compares old
DOM nodes to new DOM nodes with ===. With our
code, this is always false because our new nodes
are made from the HTML strings in the AJAX
response, so they are never the same node as the
old one.
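A small illustration of why the comparison always fails:
```javascript
var existingNode = document.querySelector('.ajax-block-container');

// build a node from the HTML string in the AJAX response
var wrapper = document.createElement('div');
wrapper.innerHTML = '<div class="ajax-block-container">new content</div>';
var newNode = wrapper.firstElementChild;

// a freshly parsed node is never the same object as the one in the page
console.log(existingNode === newNode); // false, every time
```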
A while ago diffDOM moved its code to use ES6
modules and started using various language
features specific to ES6 (these two things
happened independently).
The result is that the version of diffDOM
suitable for our build pipeline, structured as an
immediately invoked function expression (IIFE),
now requires polyfills of some ES6 features to
work in the older browsers we support, like IE11.
It's also worth noting that, in the move to ES6,
the maintainers of diffDOM have adopted a process
whereby users who need to support older browsers
now have to add polyfill code for any ES6 features
the library uses.
This commit proposes a move to the domdiff
library instead because:
- it runs on all JavaScript runtimes with no
polyfills
- it is 2KB instead of diffDOM's 25KB
Domdiff takes a different approach to diffDOM, in
that it compares existing nodes and new nodes and
replaces the existing ones with the new ones if
there are differences. By contrast, diffDOM will
make in-place changes to nodes if there are enough
similarities. In other words, in most situations,
diffDOM won't change the node in $component
whereas domdiff will.
Because of this, I've had to change the
updateContent.js code to cache the data-key
attribute's value, so we don't lose access to it
when we overwrite the $component variable with a
different jQuery selection.
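A sketch of the caching, assuming jQuery (illustrative, not the exact
updateContent.js code):
```javascript
var $component = $('[data-module=update-content]');

// read data-key before diffing: domdiff may swap out the node that
// $component points at, so reading the attribute afterwards could fail
var key = $component.data('key');
```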
At the moment the first AJAX call is triggered as soon as the page
loads. We then look at its response time to work out how long to wait
until making the next call.
This isn’t great because:
- stuff is unlikely to have changed straight away, so it’s a waste of a
call
- while the DOM is being updated it seems to introduce a delay in
clicks on links, which is either more pronounced or more noticeable
when it’s happening straight away, making the UI feel less snappy
I chose a value of 2 seconds as a rough proxy for the minimum time we’d
expect to see a notification go from created to delivered. Median
time-to-delivered was 2.9 seconds when we analysed it for
https://github.com/alphagov/notifications-admin/pull/2974#discussion_r286101286
By default our AJAX calls were 2 seconds. Then they were 5 seconds
because someone reckoned 2 seconds was putting too much load on the
system. Then we made them 10 seconds while we were having an incident.
Then we made them 20 seconds for the heaviest pages, but back to 5
seconds or 2 seconds for the rest of the pages.
This is not a good situation because:
- it slows all services down equally, no matter how much traffic they
have, or which features they have switched on
- it slows everything down by the same amount, no matter how much load
the platform is under
- the values are set based on our worst performance, until we manually
remember to switch them back
- we spend time during incidents deploying changes to slow down the
dashboard refresh time because it’s a nothing-to-lose change that
might relieve some symptoms, when we could be spending time digging
into the underlying cause
This pull request makes the Javascript smarter about how long it waits
until it makes another AJAX call. It bases the delay on how long the
server takes to respond (as a proxy for how much load the server is
under).
It’s based on the square root of the response time, so is more sensitive
to slowdowns early on, and less sensitive to slowdowns later on. This
helps us give a more pronounced difference in delay between an AJAX call
that is fast (for example the page for a single notification) and one
that is slow (for example a dashboard for a service with lots of
traffic).
*Some examples of what this would mean for various pages*
Page | Response time | Wait until next AJAX call
---|---|---
Check a reply to address | 130ms | 1,850ms
Brand new service dashboard | 229ms | 2,783ms
HM Passport Office dashboard | 634ms | 5,294ms
NHS Coronavirus Service dashboard | 779ms | 5,977ms
_Example of the kind of slowness we’ve seen during an incident_ | 6,000ms | 18,364ms
GOV.UK email dashboard | `HTTP 504` | 😬
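A sketch of the delay calculation. The constants below are inferred from
the example table above (wait ≈ 250 × √responseTime − 1,000ms) rather
than taken directly from the implementation:
```javascript
function calculateWait (responseTimeInMilliseconds) {
  return (250 * Math.sqrt(responseTimeInMilliseconds)) - 1000;
}

calculateWait(130);  // ≈ 1,850ms
calculateWait(634);  // ≈ 5,294ms
calculateWait(6000); // ≈ 18,364ms
```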
This is mainly because tests don't infer that
global variables are just properties of the window
object, as browsers do, but it also makes this
more explicit.
AJAX requests stop on success or failure, as the waiting page
no longer has to refresh.
Also, on failure, a form that allows the user to try again
is shown.
Since it moved to ES Modules in version 2.3.1,
diff-dom stopped including the `diffDOM.js` file
in its NPM package.
We don't do any kind of bundling in our build yet,
just concatenation of our scripts and some
minification of the results, so we can't take
advantage of this yet.
The `diffDOM.js` file is still available in the
GitHub release, so this moves to referencing that
in the `package.json` instead, until we start
using a bundler.
I opened an issue to check this is what the author
intended:
https://github.com/fiduswriter/diffDOM/issues/84
The latest version also adds Rollup as a peer
dependency.
See parent commit for the reason we’re doing this.
Currently our AJAX requests only work as `GET` requests. So this commit
does a bit of work to make them work as `POST` requests. This is
optional behaviour, and will only happen when the element in the page
that should be updated with AJAX has the `data-form` attribute set. It
will take the form that has the corresponding `id`, serialise it, and
use that data as the body of the `POST` request. If no form is specified
it will not do the serialisation, and will submit a `GET` request as
before.
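A sketch of that optional behaviour, assuming jQuery (the names are
illustrative, not the exact implementation):
```javascript
function requestUpdate ($component, url) {
  var formId = $component.data('form');

  if (formId) {
    // serialise the named form and use it as the body of a POST request
    return $.ajax(url, {
      method: 'POST',
      data: $('#' + formId).serialize()
    });
  }

  // no form specified: plain GET, as before
  return $.ajax(url, { method: 'GET' });
}
```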
The pages with AJAX on were feeling quite sluggish, and it felt like
they were making the whole browser slow down.
Looking at the performance stuff in Chrome, the number of DOM nodes was
going up and up, which is weird because the number of things on the page
wasn’t changing. This was causing the page to consume more and more
memory in order to store all these nodes.
This is kinda beyond my understanding, but I tried a few things and it
looks like the browser was having a hard time garbage collecting the
temporary bits of DOM used to update the page.
By assigning these bits of DOM to variables before using them, it seems
that the browser has an easier time cleaning them up.
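A sketch of the kind of change, using diffDOM's diff/apply pattern
(check diffDOM's own documentation for the exact API; the variable names
here are illustrative):
```javascript
var dd = new diffDOM();

// before (roughly): temporary DOM built inline, inside the call
// dd.apply($contents.get(0), dd.diff($contents.get(0), $(response.html).get(0)));

// after: hold the temporary DOM in variables before using it
var $newContents = $(response.html);
var diff = dd.diff($contents.get(0), $newContents.get(0));

dd.apply($contents.get(0), diff);
```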
With the change in implementation in the previous commit, any error
(eg server responds with `500`) would cause the page to not be updated
again.
This is better than the previous implementation, whereby the browser
would re-request as fast as it could until it got a successful response.
This commit handles errors by clearing the render queue if the server
returns an error. So:
- Any updates that would have been performed based on this request are
dropped
- Subsequent updates will be attempted as if it was the first load of
the page, ie after a delay of `x` seconds
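A sketch of that error handling, assuming jQuery and the per-endpoint
queue structure described in the previous commit (the names, including
`flushQueue`, are illustrative):
```javascript
$.ajax(url)
  .done(function (response) {
    // run every queued render against the successful response
    flushQueue(queues[url], response);
  })
  .fail(function () {
    // drop any renders queued against this request; the next update is
    // attempted as if it were the first load of the page
    queues[url] = [];
  });
```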
The `updateContent` module updates a section of the page based on an
AJAX request.
The current implementation will poll every `x` seconds. If the server
takes a long time to respond, this results in the browser queuing up
requests. If a particular endpoint is slow, this can result in one
client making enough requests to slow down the whole server, to the
point where it gets removed from the load balancer, eventually bringing
the whole site down.
This commit rewrites the module so that it queues up the render
operations, not the requests.
There is one queue per endpoint, so for
`http://example.com/endpoint.json`:
1. Queue is empty
```javascript
{
'http://example.com/endpoint.json': []
}
```
2. Initial re-render is put on the queue…
```javascript
{
'http://example.com/endpoint.json': [
function render(){…}
]
}
```
…AJAX request fires
```
GET http://example.com/endpoint.json
```
3. Every `x` seconds, another render operation is put on the queue
```javascript
{
'http://example.com/endpoint.json': [
function render(){…},
function render(){…},
function render(){…}
]
}
```
4. AJAX request returns; the queue is flushed by executing each queued
render function in sequence
```javascript
render(response); render(response); render(response);
```
```javascript
{
'http://example.com/endpoint.json': []
}
```
5. Repeat
This means the AJAX requests will never fire more than once every `x`
seconds, where `x` defaults to `1.5`.
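A sketch of the queue-then-flush approach described above, assuming
jQuery (the names are illustrative, not the exact module code):
```javascript
var queues = {};

function queueRender (url, renderResponse) {
  queues[url] = queues[url] || [];

  // queue the render operation rather than firing another request
  queues[url].push(renderResponse);

  // only the first queued render triggers a request for this endpoint
  if (queues[url].length === 1) {
    $.ajax(url).done(function (response) {
      // flush: run every queued render against the single response
      queues[url].forEach(function (render) { render(response); });
      queues[url] = [];
    });
  }
}
```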
Currently, when we update a section of the page with AJAX we replace the
entire HTML of the section with the new HTML. This causes problems:
- if you’re trying to interact with that section of the page, eg by
inspecting it, clicking or hovering over an element
- (probably) for screenreaders trying to navigate a page which is
changing more than is necessary
This commit replaces the call to `.html()` with a pretty clever library
called diffDOM[1]. DiffDOM works by taking a diff of the old element and
the new element, then doing a patch update, ie only modifying the parts
that have changed.
This is similar in concept to React’s virtual DOM, while still allowing
us to render all markup from one set of templates on the server-side.
1. https://github.com/fiduswriter/diffDOM
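A minimal sketch of the diff-and-patch pattern diffDOM uses (see its
documentation for the exact API; the element names here are
illustrative):
```javascript
var dd = new diffDOM();

var currentElement = document.querySelector('.ajax-block-container');

var wrapper = document.createElement('div');
wrapper.innerHTML = '<div class="ajax-block-container">new content</div>';
var newElement = wrapper.firstElementChild;

// work out what changed, then patch only those parts in place
var diff = dd.diff(currentElement, newElement);
dd.apply(currentElement, diff);
```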
Some pages with AJAX should update quickly, because the data is likely
to be changing quickly, and be finished changing sooner. Other pages we
want to have tick over a bit slower.
This commit adds an optional ‘interval’ parameter to the updateContent
modules, which sets how often the page should ping the server for an
update.
It then sets the interval for the dashboard page to be 10 seconds,
rather than the default 1.5 seconds.
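A sketch of how such an interval could be read, assuming it is passed as
a data attribute on the component (illustrative, not the exact
implementation):
```javascript
var DEFAULT_INTERVAL_SECONDS = 1.5;

function getIntervalInMilliseconds ($component) {
  // fall back to the default of 1.5 seconds when no interval is set
  return ($component.data('interval') || DEFAULT_INTERVAL_SECONDS) * 1000;
}
```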
This is a first go at having the job page update without refreshing.
The approach I’ve taken is to do all the rendering of HTML on the server side,
rather than use a Javascript templating engine like mustache. This ensures that
we don’t have to maintain two sets of templates.
So the approach is to split the job page into partials. These partials can then:
- be included in the job page to render the whole page
- be rendered individually and then returned as a blob of HTML inside a JSON
response
Then I’ve added a Javascript module which looks for areas of the page that should
be reloaded. For each area of the page it will poll a URL and re-render that
section of the page when it gets new HTML. It implements some throttling so that
API calls will never happen more frequently than 0.67 times/second.
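A sketch of that polling and throttling, assuming jQuery and a JSON
response carrying the blob of HTML (the names are illustrative):
```javascript
var MINIMUM_GAP_IN_MILLISECONDS = 1500; // ie at most 0.67 calls per second

function pollForUpdates ($area, url) {
  $.ajax(url).done(function (response) {
    // re-render this area of the page with the blob of HTML from the JSON
    $area.html(response.html);

    // wait before asking again so calls never exceed the throttle
    setTimeout(function () {
      pollForUpdates($area, url);
    }, MINIMUM_GAP_IN_MILLISECONDS);
  });
}
```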