Government uses natural-language processing to support post-Brexit trade consultation

ONS data scientists are supporting the Department for International Trade in applying AI techniques to public responses


The government is using natural language processing (NLP) techniques to support the policymaking and negotiating teams focused on the UK’s post-Brexit trade agreements.

Following the country’s departure from the European Union on 31 January, the UK has a year to develop and implement a set of independent trade tariffs – the first time it has needed to do so in almost half a century.

A public consultation on the UK Global Tariff was launched on 6 February by the Department for International Trade (DIT).

The department is seeking input from businesses and experts across all industries, who are asked to provide “specific feedback on specific products or commodity codes of importance to you… [and] the importance of tariffs to your sector”. 

Responses to the consultation, which is open until midnight on 5 March, will inform upcoming trade negotiations.

Given the vastness of the consultation’s remit and the huge level of public interest, the number of respondents could reach well into the hundreds of thousands.

To help analysts pick out key themes and sentiments as quickly as possible from such an enormous data set, the DIT is working with the Data Science Campus of the Office for National Statistics to use NLP technologies to process responses.


Natural language processing is a branch of artificial intelligence that enables machine learning-powered software to read text, identify words, and derive meaning from linguistic patterns.

Tom Smith, managing director of the campus, told PublicTechnology that his team have developed NLP techniques based on a mix of off-the-shelf products and tools developed in house. 

“We’ve been using natural language processing techniques that can pull out themes, trends, and patterns in responses, and then look for the sentiment on each of those trends and themes, and use that to give intelligence and information and insight to both the analysis teams to dig further, but also the trade-negotiating teams to inform what they’re looking at as well,” he said. “Now the consultation has started, we’re working with the DIT to help their teams essentially take over these tools.”
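The broad approach Smith describes can be illustrated in miniature: pull out frequently mentioned theme terms from a set of free-text responses, then score the sentiment of responses that mention each theme. The sketch below is purely illustrative, using invented responses and a tiny hand-rolled sentiment lexicon; it is not the ONS/DIT tooling, which combines off-the-shelf and in-house components.

```python
from collections import Counter
import re

# Illustrative word lists only -- real systems use full stopword lists
# and trained sentiment models rather than a handful of keywords.
STOPWORDS = {"the", "a", "on", "of", "to", "for", "is", "are", "we", "our", "and"}
SENTIMENT = {"support": 1, "welcome": 1, "oppose": -1, "harm": -1, "damaging": -1}

def tokenize(text):
    # Lowercase and split into word tokens, dropping stopwords.
    return [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]

def top_themes(responses, n=3):
    # The most frequent non-stopword terms stand in for "themes".
    counts = Counter(w for r in responses for w in tokenize(r))
    return [w for w, _ in counts.most_common(n)]

def theme_sentiment(responses, theme):
    # Sum lexicon scores over responses that mention the theme term.
    score = 0
    for r in responses:
        toks = tokenize(r)
        if theme in toks:
            score += sum(SENTIMENT.get(t, 0) for t in toks)
    return score

responses = [
    "We support lower tariffs on steel imports",
    "Tariffs on steel would harm our supply chain",
    "We welcome a review of ceramics tariffs",
]
print(top_themes(responses))                   # most frequent term is 'tariffs'
print(theme_sentiment(responses, "tariffs"))   # net sentiment across mentions
```

In a production pipeline the frequency count would typically be replaced by topic modelling and the lexicon by a trained classifier, but the shape of the output is the same: themes, each with an associated sentiment signal that analysts can drill into.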

The tools can be reused and applied to similar challenges across government. Smith describes the creation of these tools as “a really, really important piece of work”, and one of the crowning achievements of the campus since it was created three years ago.

ONS data scientists have also applied the NLP system elsewhere, including to patent data in support of policy teams focused on delivering the Industrial Strategy.

The campus was asked to explore the use of big data to help “identify where the UK has strengths or, perhaps, lags behind”.

Smith says: “One way is to look at patent applications and grants; we’ve been working with the Intellectual Property Office and using the global patent database of 19 million patents and, essentially, using the same [NLP] tools and techniques to pull out trends, emerging themes, areas of interest, and then cross-reference them against which organisations are registering those patents, and in which countries. And that gives you an insight for policymaking around how the industrial strategy works to support particular areas of UK industry.”
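The cross-referencing step Smith outlines amounts to tallying, for a given theme, which countries (or organisations) are filing matching patents. The records and keyword below are invented for illustration; the real work draws on a global database of 19 million patents.

```python
from collections import Counter

# Invented patent records for illustration -- each has a title, filing
# country, and registering organisation.
patents = [
    {"title": "battery cell cooling system", "country": "UK", "org": "AcmeEnergy"},
    {"title": "solid-state battery electrolyte", "country": "KR", "org": "CellCo"},
    {"title": "wind turbine blade design", "country": "UK", "org": "BladeWorks"},
]

def countries_for_theme(patents, keyword):
    # Count patents whose title mentions the theme keyword, by country.
    return Counter(p["country"] for p in patents if keyword in p["title"])

print(countries_for_theme(patents, "battery"))
# Counter({'UK': 1, 'KR': 1}) -- shows where 'battery' patents are filed
```

Swapping `"country"` for `"org"` in the counter gives the organisation view; at scale, the same cross-tabulation reveals which countries and companies dominate an emerging technology area.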

He adds: “You could, in principle, also look down to regions and look at how what’s coming out of high-tech areas like Oxford, Cambridge and ‘Silicon Fen’… and see how that compares with some of the other industrial cities, for example.”

Look out on PublicTechnology – and in the next issue of our sister title Civil Service World – for a full interview with Smith

Sam Trendall

