GDS introduces algorithmic ‘related content’ links to GOV.UK
Tests show users click on automatically chosen links and make less use of search
The Government Digital Service (GDS) is introducing ‘related content’ boxes generated by an algorithm across GOV.UK, following randomised tests of whether users clicked on the links.
Around 2,000 GOV.UK pages already had such boxes, shown on the right-hand side of pages such as that for UK bank holidays. But those links were chosen by human editors, meaning a lot of effort would be required to extend the boxes across the site.
GDS used an A/B test, in which users were randomly served a GOV.UK page either with or without a related content box of automatically generated links. Each algorithm variant was tested for about a week, and staff wrote software to analyse the results as soon as each test finished, allowing them to make adjustments rapidly.
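The blogpost does not publish GDS's analysis code, but the kind of comparison described, checking whether one variant's click-through rate is significantly higher than another's, can be sketched with a standard two-proportion z-test. The click and page-view figures below are invented for illustration.

```python
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: does variant B's click-through rate
    differ significantly from variant A's?"""
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (clicks_b / views_b - clicks_a / views_a) / se

# Hypothetical figures: algorithm A at 2.0% CTR vs. algorithm B at 2.4%.
z = two_proportion_z(clicks_a=2_000, views_a=100_000,
                     clicks_b=2_400, views_b=100_000)
# |z| > 1.96 corresponds to significance at the 5% level.
print(f"z = {z:.1f}")
```

Automating a test like this at the end of each weekly run is what lets a team compare variants quickly rather than eyeballing raw click counts.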
In a blogpost, GDS data scientists Suganya Sivaskantharajah and Mat Gregory said that users clicked on related links in more than 2.4% of cases for the best-scoring algorithms, and used the internal search service 20%–40% less, the latter suggesting they were finding what they wanted through the links. They added that, after considering other criteria, GDS chose the Node2Vec algorithm, created by researchers at Stanford University, to generate the links, as it has more potential for improvement.
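Node2Vec learns vector representations of nodes in a graph (here, pages in GOV.UK's link graph) by running biased random walks and feeding them to a skip-gram model; similar pages end up with similar vectors, which can then drive related-link suggestions. The sketch below shows only the walk-generation step, with the return parameter p and in-out parameter q that bias each move; the toy page graph is invented and the embedding step is omitted.

```python
import random

# Toy graph of page links (invented for illustration): node -> neighbours.
GRAPH = {
    "bank-holidays": ["holiday-entitlement", "maternity-pay"],
    "holiday-entitlement": ["bank-holidays", "maternity-pay", "paternity-pay"],
    "maternity-pay": ["bank-holidays", "holiday-entitlement", "paternity-pay"],
    "paternity-pay": ["holiday-entitlement", "maternity-pay"],
}

def node2vec_walk(graph, start, length, p=1.0, q=2.0, rng=random):
    """One Node2Vec-style biased random walk.

    Stepping back to the previous node is weighted 1/p, moving to a
    neighbour of the previous node is weighted 1, and jumping to a node
    not adjacent to the previous one is weighted 1/q.
    """
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        neighbours = graph[cur]
        if len(walk) == 1:
            walk.append(rng.choice(neighbours))  # first step is uniform
            continue
        prev = walk[-2]
        weights = []
        for nxt in neighbours:
            if nxt == prev:
                weights.append(1 / p)   # return to where we came from
            elif nxt in graph[prev]:
                weights.append(1.0)     # stay close to the previous node
            else:
                weights.append(1 / q)   # explore further away
        walk.append(rng.choices(neighbours, weights=weights)[0])
    return walk

walk = node2vec_walk(GRAPH, "paternity-pay", length=8)
print(walk)
```

In the full algorithm many such walks are collected per node and passed to a word2vec-style trainer; tuning p and q trades off staying within a topical neighbourhood against exploring the wider site, which is one reason the approach leaves room for further improvement.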
“We decided that we would not replace the hand-curated links in those 2,000 pages that had them, as despite how good our algorithms are at the moment, they still do not have the same context as a subject matter expert,” Sivaskantharajah and Gregory added.
The work to introduce this appears to be ongoing, as the GOV.UK page used as an example in the blogpost on paternity pay and leave is currently not showing a related content box.