Who: Office of Fair Trading
Where: UK
When: May 2013
Law stated as at: 5 June 2013
What happened:
In May 2013 the UK’s Office of Fair Trading (OFT) published a report following its November 2012 call for information on personalised pricing practices – when a trader charges different prices to different people for the same product based on “observed, volunteered, inferred or collected” personal data.
Unlike some other forms of so-called "price discrimination", such as demand-based dynamic pricing models for airline tickets, "personalised pricing" as analysed in the OFT report uses information about the individual customer – for example, their browsing or purchasing history or the device they use – to help assess what they may be willing to pay and to adjust pricing and presentation accordingly.
The OFT had become interested in this concept and wanted to understand how prevalent the practice is, whether it is harming consumers or markets, and how it may develop.
The OFT’s findings
The report considers that personalised pricing has the potential to deliver a net benefit to both consumers and businesses, and it highlights the role technology can play in delivering personalisation. However, it finds no evidence that retailers use information collected about individuals to offer higher prices to specific customers.
As might be expected, the OFT reports that retailers told it they did not want to risk alienating customers and attracting adverse publicity by being seen to use customer data in this way. Rather, the report says, customer data is actually being used to offer personalised discounts, to determine which inventory to promote to particular individuals and/or to refine pricing strategies.
While the worst fears identified by the OFT may have proved unfounded for now, the regulator has nevertheless highlighted a number of concerns – both specifically around personalised pricing and more generally.
In particular, it says it is "disappointed with the level of transparency" around data collection and exploitation, and indicates it is likely to be concerned if consumers cannot easily avoid personalisation, do not know personalisation is occurring, cannot easily see the prices paid by other customers, or are misled by statements (such as a "best price" claim when the consumer is in fact paying more than other consumers) or omissions.
Interestingly, the report states that the OFT’s most significant concern currently is not direct harm to consumers but rather indirect harm caused by “a reduction in trust in online markets”.
Why this matters:
The report flags various ways in which personalised pricing can potentially fall foul of UK legal and regulatory requirements (in a review that closely reflects Osborne Clarke's analysis in our 2012 Data Gold Rush report).
However, a strong sense emerges from the report that the OFT has found little to get truly worried about – for now – in the area of personalised pricing, but that its research has led it to look more closely at transparency issues generally.
Letters to 60 businesses
The OFT claims that many of the websites it examined during its research did not make clear "what information they collected about consumers, how it would be used or how users of their websites could opt out of data collection". Other websites were criticised for "forcing users to accept their privacy policy" and providing no mechanism to opt out of non-essential data collection.
Alongside the publication of its report, the OFT has written to 60 online businesses to raise awareness of its concerns around these kinds of practice. While its letter is headed "OFT's Call for Information on Personalised Pricing", the content of the letter reflects a broader range of transparency concerns around "online practices" generally.
Similarly, the report's analysis under the Consumer Protection from Unfair Trading Regulations 2008 (CPRs) suggests a new enforcement focus in relation to data collection and usage in general – not just in relation to pricing practices.
“Aggressive practices”
The report and the OFT's letter put forward the view that, where non-essential data is collected automatically or used without a genuine opportunity for the consumer to opt out, this may be unlawful as an "aggressive practice" under the CPRs.
This seems a clear statement of intent that data collection and usage should be policed under consumer law as well as by the Information Commissioner's Office (ICO) under data protection law – putting more pressure on online operators to give users greater control over the collection and use of their information, whether or not it falls within the definition of "personal data".
However, the OFT's use of the word "may" is telling. A closer examination of the CPRs shows that the OFT's "aggressive practice" arguments rely on interpretations that a court may be reluctant to follow. For instance, the OFT takes a very broad interpretation of "transactional decision", seeing it as including a decision to visit a website or to click through to another page. Its analysis also seems to assume that information should be treated as a "product" supplied by the consumer to the trader. (But where data is collected automatically, can it really be right to say the consumer has supplied it?) Further, where is the "harassment, coercion or undue influence" required for an aggressive practice? To argue that failure to provide an opt-out amounts to the application of "pressure" on the consumer feels like a pretty aggressive analysis itself…
The OFT’s report and other associated information can be found here.