David Currie explains that there is an ‘arms race’ between web platforms and criminals who are equally sophisticated
The chair of the UK’s independent advertising regulator has described how the body and the online platforms it oversees are in an “arms race” against “sophisticated” scammers.
Lord David Currie, who heads the Advertising Standards Authority (ASA), said the organisation was deploying tech-assisted monitoring and enforcement to protect children and vulnerable people from harmful adverts.
This includes the use of avatars — artificial systems which mimic the profiles of children and teens — to see whether these groups are being served targeted ads in a range of online spaces for restricted products such as alcohol and gambling. Other areas being explored include the use of data science to create algorithms which can detect scam adverts with a high degree of accuracy.
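To illustrate the avatar approach, the sketch below is a toy model only, not the ASA's actual system: the profile fields, ad categories, and flagging rule are all assumptions made for illustration. It shows the basic idea of a synthetic under-age profile recording which ads it is served and flagging those for restricted products.

```python
# Toy sketch of avatar-based ad monitoring. All names and rules here are
# illustrative assumptions, not the ASA's real tooling.
from dataclasses import dataclass

# Product categories assumed to be age-restricted for this example.
RESTRICTED_CATEGORIES = {"alcohol", "gambling"}

@dataclass
class AvatarProfile:
    """A synthetic browsing profile mimicking a given age group."""
    name: str
    age: int

@dataclass
class ServedAd:
    """One advert observed by the avatar while browsing."""
    advertiser: str
    category: str

def flag_violations(profile: AvatarProfile, served_ads: list) -> list:
    """Return any restricted-product ads served to an under-18 avatar."""
    if profile.age >= 18:
        return []
    return [ad for ad in served_ads if ad.category in RESTRICTED_CATEGORIES]

# Example: a 14-year-old avatar is served a betting ad alongside others.
child = AvatarProfile(name="avatar-teen-01", age=14)
ads = [ServedAd("AcmeBets", "gambling"), ServedAd("ToyCo", "toys")]
violations = flag_violations(child, ads)
```

In practice the hard part is the browsing side, not the flagging: the avatar must look like a genuine young user to the ad-targeting systems being tested.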
These initiatives have contributed to a 346% increase in ads being amended or withdrawn as a result of ASA intervention, reaching a total of 36,491 in 2020.
“We need to use tech to keep up in this world. It’s growing so fast. We have to rely on tech to advance what we’re doing,” Currie told PublicTechnology sister publication PoliticsHome. “By doing this work, and by doing it regularly, people know we’re doing it and therefore there’s quite a deterrent effect. It’s a very powerful tool that is being emulated by other regulators around the world because we were the pioneers in this.”
But he warned that online scammers were using continually advanced technologies to evade the agency’s most cutting-edge monitoring programmes.
“There is a technical issue there which we are working to overcome, which is that these scammers and these scam ads are themselves very sophisticated,” he said. “The people working for the scammers are every bit as sophisticated as the people working for the tech companies themselves. And they’re probably paid at least as much, if not more, because it is a very lucrative business.”
Currie said scammers were able to engage in “cloaking”: identifying whether the profile accessing their advert is a person or a bot, and serving a less harmful advert to the latter.
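A minimal sketch of how cloaking works is shown below. It is an illustrative toy, not a description of any real scam operation: the bot signals and ad labels are assumptions. The server inspects the visitor's user-agent and serves a benign "cover" creative to anything that looks like an automated monitor.

```python
# Toy model of "cloaking": serve a harmless ad to suspected monitoring
# bots and the real scam ad to ordinary visitors. The detection signals
# and ad identifiers are illustrative assumptions only.
BOT_SIGNALS = ("bot", "crawler", "spider", "headless")

def looks_like_bot(user_agent: str) -> bool:
    """Crude check: does the user-agent carry a known automation marker?"""
    ua = user_agent.lower()
    return any(signal in ua for signal in BOT_SIGNALS)

def select_ad(user_agent: str) -> str:
    """Pick which creative to serve based on who appears to be asking."""
    if looks_like_bot(user_agent):
        return "generic-retail-ad"  # what a regulator's crawler would see
    return "crypto-scam-ad"         # what a human visitor would see

# A monitoring crawler and a normal browser receive different creatives.
crawler_view = select_ad("Mozilla/5.0 (compatible; AdMonitorBot/1.0)")
human_view = select_ad("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0")
```

This is why countering cloaking demands monitoring traffic that is indistinguishable from real users, which is the tech escalation Currie goes on to describe.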
“Getting around that sort of thing requires more tech on our side to fight it. It’s a very good example of the way we need to fight tech with tech,” he said. “It is an arms race, effectively. And we need to make sure we have enough resources to keep up in the arms race.”
The ASA has a “developing relationship” with online giants such as Google and Facebook to tackle scammers and those producing harmful content, he added.
“We are in the early stages of this journey. I think there’s an increasing awareness that this is a necessary journey to be engaged in,” Currie said. “Having said that, if you go back two years, the level of cooperation and effective working together was much less. We’ve developed significantly. Given that their systems are all so different, getting these things to be put in place is not a question of us saying, ‘this should happen’, and it happens. You’ve got to work at it, you’ve got to work on the details of it. So it is, for that reason, a journey.
“But, am I confident that in two years’ time we will be more advanced than we are now? Absolutely, because we’re more advanced than we were two years ago.”
The ASA launched its Scam Ad Alert system in June 2020 after a successful three-month trial. The system allows complaints to be quickly shared with online ad networks to facilitate swift removal of the offending ads, as well as preventing similar ads from appearing.
It has received over 1,100 reports and sent more than 100 Scam Ad Alerts, which primarily concern cryptocurrency investment scams promoted using ‘fake news’ stories and doctored images.