Nesta cite CAMRI research in their User illusions policy work
John Davies from Nesta, in his piece 'User illusions: Data and algorithms will address long-standing consumer issues, but create new ones too', elaborates on some of the themes discussed at our Regulating the Digital Economy workshop, held by the Policy Observatory in June.
We don’t shop around for the best deals in electricity and banking. We don’t read online terms and conditions. Algorithms making decisions about these on our behalf might do a better job. At the same time, the algorithmic personalisation of pricing and products by firms may make our shopping choices less clear and make it harder to switch supplier. The role of transparency in data processing and of data portability in all this shows the importance of the European data protection regulation (the GDPR), set to be enforced from May 2018.
Delegating decisions that we don’t engage with to algorithms to make on our behalf
Automated purchasing of more homogeneous products where price is the main factor, such as in energy and banking
A characteristic of many regulated sectors with homogeneous products is that consumers often do not switch supplier despite relatively clear financial incentives to do so.[1] This implies a role for digital intermediaries/algorithms to make decisions on behalf of consumers, automating decisions people are not interested in and saving them time and money. Automated purchasing has been occurring in business to business (B2B) transactions for a while, for example in financial services (where stocks and shares are frequently traded programmatically to get the cheapest price and maximise return) and in targeted advertising, where ad agencies automatically place bids to show adverts to people with certain characteristics.[2] It seems plausible this will become more widespread for consumers and that there will be a growth of intermediaries that try to buy consumers the best deal.[3]
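As a toy illustration of the kind of decision such an intermediary might automate, the sketch below switches supplier only when the estimated annual saving clears a threshold the consumer has pre-set. The supplier names, tariff figures and threshold are all hypothetical, not drawn from any real comparison service.

```python
# Hypothetical sketch: an agent that recommends switching energy
# supplier when the annual saving exceeds a consumer-chosen threshold.

def best_switch(current_supplier, tariffs, min_saving=50.0):
    """Return the supplier worth switching to, or None if staying is best.

    tariffs: dict mapping supplier name -> estimated annual cost (GBP).
    min_saving: smallest annual saving (GBP) that justifies the hassle.
    """
    current_cost = tariffs[current_supplier]
    cheapest = min(tariffs, key=tariffs.get)
    if current_cost - tariffs[cheapest] >= min_saving:
        return cheapest
    return None

# Illustrative quotes for one household's usage profile.
quotes = {"SupplierA": 1100.0, "SupplierB": 980.0, "SupplierC": 1040.0}
print(best_switch("SupplierA", quotes))  # saving of 120 clears threshold
print(best_switch("SupplierB", quotes))  # already cheapest: stay put
```

In practice an agent like this would also need to weigh non-price terms and exit fees, which is part of why real comparison tools are harder to build than the sketch suggests.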
Delegating our consent for data processing to the scrutiny of autonomous agents
We routinely click terms and conditions without reading – let alone thinking about – them, but perhaps it would be better if this function were automated, with an artificial agent scrutinising the conditions on our behalf. With the growth of the Internet of Things (IoT), the number of devices processing data about us is likely to increase.[4] This will mean more data processing consent requests for us to consider. An implication may be that widespread use of IoT devices requires some form of automated consent to practically manage the processing of our data to certain agreed standards that we have pre-selected.
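A minimal sketch of what "consent to pre-selected standards" could look like is below. The policy fields (purpose, third-party sharing, retention period) are illustrative assumptions, not taken from any real consent framework or standard.

```python
# Hypothetical sketch: an agent accepts or rejects a data-processing
# request by checking it against standards the user pre-selected.

USER_POLICY = {
    "allowed_purposes": {"service_delivery", "billing"},
    "allow_third_party_sharing": False,
    "max_retention_days": 365,
}

def review_request(request, policy=USER_POLICY):
    """Return True only if the request meets every pre-set condition."""
    return (
        request["purpose"] in policy["allowed_purposes"]
        and (policy["allow_third_party_sharing"]
             or not request["shared_with_third_parties"])
        and request["retention_days"] <= policy["max_retention_days"]
    )

accepted = review_request({"purpose": "billing",
                           "shared_with_third_parties": False,
                           "retention_days": 90})
rejected = review_request({"purpose": "marketing",
                           "shared_with_third_parties": True,
                           "retention_days": 30})
```

The hard part in reality is not the checking logic but getting firms to express their terms in a machine-readable form the agent can trust.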
This is not to imply that the implementation of either of these is straightforward. Competition in utilities is not just about the demand side, terms and conditions are far from standard, and how do you know you can trust an autonomous agent making decisions on your behalf? It seems plausible, though, that progress could be made. Other aspects of consumer choice may become more challenging.
Algorithms personalising the prices and products we are offered, making things less transparent and switching harder
Personalisation of online advertising and websites based on our individual browsing history and sometimes location is a routine part of our lives. However, personalisation is likely to become more widespread as the data that is generated by us starts to offer greater scope to personalise:
The price offered to us, where different people are charged different prices for the same product based on their price sensitivity, so-called price discrimination. This is different from when the price varies according to the cost of supplying the product to the individual, as in insurance or lending, where we are charged different premia based on our estimated risk. Using data to inform either of these is not new, but the increase in available data on us and the flexibility of pricing offered by online platforms make it easier for companies to do this. It’s also not illegal unless it violates existing anti-discrimination legislation. There is, at least in the UK, limited evidence that online personalised price discrimination occurs, and perhaps it will be too unpopular to implement, but it is technically possible and could become more widespread.[5]
The products themselves. The Amazon and Netflix business models are partly based on tailoring their sales offer to us using the data collected on people’s preferences through the sites. More radically, there will probably be much greater personalisation of physical products and services based on the data we generate. Extensive personalisation is already possible through automated production techniques, such as in cars (the Mini, for example, is billed as offering buyers the choice of over 10 million possible design combinations of finished car). The data that companies collect from us and our devices is likely to encourage greater personalisation (devices that recognise us, for example) in conjunction with more flexible means of production such as 3D printing, or through co-creation, where online tools allow us to create our own designs. The move of some of the large tech companies into areas such as personal home assistant devices, like Amazon’s Echo or Google’s Home, and self-driving cars may also accelerate this.
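The distinction drawn above between cost-based pricing and price discrimination can be made concrete with a toy sketch. Both functions and all the numbers in them are hypothetical; the point is only that one price tracks the cost of supplying the individual, while the other tracks what the individual is estimated to be willing to pay.

```python
# Hypothetical sketch contrasting two ways a price can be personalised.
# All coefficients are illustrative, not estimates from any real market.

def cost_based_price(base_cost, estimated_risk, margin=0.10):
    """Insurance-style pricing: the price tracks the cost of supplying you.
    estimated_risk scales the expected cost (e.g. 0.5 = 50% loading)."""
    return round(base_cost * (1 + estimated_risk) * (1 + margin), 2)

def discriminatory_price(list_price, price_sensitivity):
    """Price discrimination: the same product costs more for buyers
    estimated (e.g. from browsing data) to be less price-sensitive.
    price_sensitivity runs from 0 (will pay anything) to 1 (very sensitive)."""
    return round(list_price * (1.2 - 0.4 * price_sensitivity), 2)

# Same product, same cost to supply, different buyers:
print(discriminatory_price(100.0, 0.1))  # less sensitive buyer pays 116.0
print(discriminatory_price(100.0, 0.9))  # more sensitive buyer pays 84.0
```

In the first function, price differences reflect cost differences; in the second, they reflect inferred willingness to pay for an identical product, which is what makes the welfare effects ambiguous.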
Why this shows the importance of the forthcoming data protection legislation (the GDPR)
With homogeneous products, consumers should want to buy the product for the cheapest price possible.[6] With terms and conditions that no one reads, an assessment of standardised conditions by an automated assistant should be an improvement. However, price discrimination is an area which economics has traditionally acknowledged has ambiguous welfare effects (very crudely, some people get higher prices and others lower prices), and it will make things more complicated.[7] Greater personalisation of products based on personal data should hopefully give us better products. Nor is it new: website optimisation aside, although no data is exchanged directly, it is implicitly one of the reasons why we often stick with the same professional service providers – they know us (going back to the same hairdresser, for example). What is new is that as our data becomes more embedded in the products and services we buy, it may become harder for us to switch suppliers – it’s not just about us switching, but about us being able to effectively transfer our data. Personalisation of products combined with greater personalisation of pricing may also make it harder for consumers to understand what the deal they are being offered actually is.
The impending European General Data Protection Regulation (GDPR) – set to be introduced into UK legislation regardless of Brexit – is therefore very important. It contains a right to greater transparency in data processing, a right for people not to be subject to automated decision-making and profiling, and a right to transfer data between suppliers.[8] Whether or not the legislation has got the balance on these right, these are not abstract points of principle, but issues that will shape the future of consumer choice.
Acknowledgements: Thanks go to Theo Bass, Chris Gorst, Juan Mateos-Garcia, and Tom Symons for their comments on this post and its follow up.
Thanks also go to the European Law Institute, the Meaningful consent in the Digital Economy group at Southampton University, the Communication and Media Research Institute at Westminster University and the Digital Catapult for allowing me to attend their workshops on the digital economy/data protection.
[1] The UK Regulators (2014) report on Consumer Engagement and Switching lists 12 key barriers that prevent people switching.
[2] For a discussion of the role of data in online advertising see the interview with Claudia Perlich in Gutierrez (2016), ‘Data Scientists at work’. For a review of the role of algorithmic trading in financial markets see The Government Office for Science (2012), ‘The future of computer trading in financial markets’.
[3] At the moment, there are a number of price comparison websites. These are currently being investigated by the CMA as part of their work on digital comparison tools. Price comparison websites have challenges, but so far the CMA has assessed their effects as broadly positive.
[4] Windsor, G. and Fernando, T. (2016), ‘What does the growing internet of things mean for privacy and regulation’.
When this was examined in 2013 by the Competition and Markets Authority’s predecessor, the Office of Fair Trading (OFT), it found limited evidence of this. OFT (2013), ‘Personalised pricing: Increasing transparency to improve trust in the market’ and ‘The economics of online personalised pricing’.
An exception would be where the conditions of supply are considered important: for example, although electricity is homogeneous, some forms of production have environmental effects that may matter to consumers.
For much more sophisticated discussions of price discrimination see Tirole, J. (1988), ‘The Theory of Industrial Organisation’, on price discrimination, and Belleflamme, P. and Peitz, M. (2010), ‘Industrial Organization: Markets and Strategies’, on group pricing and personalised pricing.
[8] Information Commissioner’s Office (ICO), (2017), ‘Overview of the General Data Protection Regulation’
Photo by and machines on Unsplash