Who: The Advertising Standards Authority (ASA)
Where: United Kingdom
When: 23 February 2023
Law stated as at: 13 March 2023
The ASA is making use of machine learning and artificial intelligence (AI) systems to support its regulation of digital advertising. The sheer scale of the digital advertising ecosystem makes it difficult to search through large volumes of ads and deal with the complaints that arise. Technology used in this way can help human experts deliver more efficient regulation.
As an example, the ASA has used AI to help ensure that claims in ads about climate impacts are transparent and do not mislead consumers. The ASA’s data science team has used these tools to search large volumes of ads from “carbon-intensive” industries, identify those that make climate claims, and put them in front of the relevant human experts for further assessment. Those experts can then act quickly where appropriate and follow up on any claims.
The ASA recognises that the use of these tools can improve its regulation, but not without risk. AI tools can reach decisions that human experts would not agree with, or decisions that could be unfair or discriminatory towards individuals or groups of people. There is scope for unfair outcomes, but the ASA has declared it is committed to considering these potential risks carefully when using the technologies.
The risks themselves vary depending on the tool used and its application, and the ASA will need to review the technical measures that can be implemented at the design phase of these tools. The ASA will also need to consider how to keep its work aligned with its principles of delivering regulation that is “transparent, proportionate, targeted, consistent and accountable”. This will include undertaking risk assessments and making sure that actions to mitigate risks are appropriately followed up.
Why this matters:
The ASA recognises that although AI and machine-learning tools can improve the efficiency of its regulation, their use will not always be appropriate. Some things should not be automated, and this should not be seen as an automated overhaul of all the roles the ASA currently plays. The ASA has many experts with years of experience in reviewing and regulating ads, so it is important to stress that these tools will supplement those experts’ powers rather than completely take over.
AI systems are unlikely to be able to judge how the ASA should respond to a complaint or claim. It will be difficult for a machine-learning or AI tool to assess the wider context of an ad or the way that audiences are likely to interpret it. A well-designed system, which combines automated tools with human oversight, could be used to deal with large volumes of content more efficiently, while making the most of the experience that comes from human experts.