Who: The Data, Technology and Analytics (DaTA) unit of the Competition and Markets Authority (CMA) and any market participant deploying algorithms.
Where: United Kingdom
When: 19 January 2021
Law stated as at: 29 March 2021
On 19 January 2021, the DaTA unit of the UK’s CMA published a paper, and an accompanying call for information, entitled “Algorithms: How they can reduce competition and harm consumers”, which focusses on the potential harm caused by the use of algorithms by market participants.
The paper acknowledges that machine learning, algorithmic systems and artificial intelligence (AI) can provide notable benefits to consumers in the digital world, by personalising recommendations, creating efficiencies and potentially reducing costs.
However, it also raises concerns about the use of algorithms to cause direct and indirect harm to consumers, whether intentionally or unintentionally. For example, the paper specifically discusses the role of algorithmic data collection and analysis in delivering personalised pricing, and the use of personalised search rankings, where algorithmic systems give preference to particular services, products or suppliers. It contemplates that these practices may lead to negative outcomes for consumers by manipulating consumer decision-making. Drawing from other areas of law and policy, the paper also raises concerns where personalisation is based on protected characteristics (such as age, sex and race), which could amount to unlawful discrimination and an infringement of equality legislation. Such practices, in addition to the use of “dark patterns” designed to deceive users into making commercial decisions that they might not otherwise have made (such as buying or signing up to something), are considered particularly unfair given the lack of transparency from a consumer perspective.
Furthermore, the paper states that algorithms are also being used in ways that could facilitate pricing collusion amongst competitors. Potential areas of concern cited in the paper include: the greater volume and availability of pricing data; the use of algorithmic systems and software provided by the same third party, which facilitates information exchange and a “hub and spoke” structure (in which all competitors’ prices move together); and autonomous tacit collusion (where pricing algorithms learn to collude without requiring explicit information sharing).
While markets and market participants have used algorithms for many years, advances in technology and AI have increased the potential for misuse. The paper is highly indicative of the CMA’s likely approach to the use of algorithms and the priorities of its new Digital Markets Unit when it launches in April 2021, and the CMA is expected to harness the full range of its regulatory powers to take a tough stance going forward.
Consequently, there will be a greater regulatory compliance burden on market participants, irrespective of size, who will be expected to conduct robust and transparent audits to justify their use of algorithms, and to maintain accurate records and design documents explaining their reasoning prior to implementation. The paper also contemplates the use of regulatory “sandboxes”, which would enable companies to test their algorithms in a live environment without the risk of the usual regulatory consequences.
Why this matters:
The CMA’s focus on the use of algorithms has obvious implications for companies operating online, and it is clear that the regulatory environment surrounding algorithms is likely to tighten in the near future, especially with the anticipated launch of the Digital Markets Unit in April 2021. This aligns with the CMA’s wider ongoing strategy of tackling misleading online commercial practices that have a potentially detrimental effect on consumer choice and decision-making. Companies should review their use of algorithms at both the pre- and post-deployment stages and assess the potential impact on consumers and the broader market, with the aim of building greater transparency into the process.