In the age of the mobile phone, online shopping and artificial intelligence, the most valuable asset in the world has become data.
In March, The Economist pointed out that a century ago, oil was the world’s most sought-after commodity, while data has become the asset that dominates business interest today. And that’s why regulators have now turned their focus to the way data is generated, managed and shared.
It’s no surprise. A recent report from technology analyst IDC (International Data Corporation) says that by 2025 the world’s “datasphere” will comprise around 163 zettabytes, or 163 trillion gigabytes; a single gigabyte of internet browsing is enough to view around 3,000 web pages. The numbers imply that the datasphere will become unfathomably large and, according to IDC, almost 60% of that data will be stored by businesses. That makes data a significant regulatory issue.
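The arithmetic behind those figures can be sketched in a few lines (using decimal units, and taking the article’s rough figure of 3,000 web pages per gigabyte as given):

```python
# Back-of-the-envelope scale check for IDC's 2025 "datasphere" estimate.
# Assumptions: decimal units (1 zettabyte = 1e12 gigabytes) and the
# article's rough figure of 3,000 web pages viewable per gigabyte.

ZETTABYTES_2025 = 163
GB_PER_ZB = 10**12          # one zettabyte is a trillion gigabytes
PAGES_PER_GB = 3_000

total_gb = ZETTABYTES_2025 * GB_PER_ZB      # 163 trillion gigabytes
total_page_views = total_gb * PAGES_PER_GB  # equivalent web-page views

print(f"{total_gb:.2e} GB")                 # 1.63e+14 GB
print(f"{total_page_views:.2e} page views") # 4.89e+17 page views
```

Even on these rough assumptions, the projected datasphere is equivalent to nearly half a quintillion web-page views.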
Nigel Parr, head of the competition and antitrust practice at law firm Ashurst, says the critical nature of data is becoming clear to watchdogs. “Data is becoming ever-more important; it’s being generated quickly and effectively, at relatively low cost and confers a competitive advantage. And that raises questions in regulators’ minds about whether the ownership and control of data may restrict or distort competition.”
Impact of ownership
Data has already attracted attention. Over recent years regulators have looked closely at both M&A deals and the functioning of some market sectors, asking themselves what impact the ownership of data has. The European Commission has looked at the role of data in the takeover of messaging app WhatsApp by Facebook, and the acquisition of LinkedIn, the professional networking site, by Microsoft.
In the UK, competition watchdogs have considered the data implications in both the banking and energy markets. A key question is whether a lack of access to data can undermine competition.
The issue has also received public attention. Indeed, it has become so sensitive that ownership of vast quantities of data has prompted calls in some quarters for companies like Amazon, Facebook, Microsoft and Google to be broken up. That may seem unlikely, but it illustrates how provocative a question data access has become.
Parr says data concerns arise across a range of areas. First, there’s a basic competition question. Does exclusive access to vast data pools represent a reward for risk-taking and innovation, or a loss of actual, or potential, competition?
Second, there is huge debate about the role of data in mergers and acquisitions. Does the coming together of two large data-owning companies substantially reduce actual or potential competition in a market?
This is a particularly thorny issue for tech companies because it’s clear that many own highly valuable databases, but have yet to monetise them. Low revenue means such companies could fall below the relevant thresholds for merger control. This, in turn, raises questions for regulators on how to effectively assess the competition implications of two data-rich enterprises joining forces.
Then there is the cartel question. In traditional markets the exchange of data, particularly where it is of strategic importance, is generally discouraged because of the risk of creating cartels. However, in the digital age it is possible that the sharing of data could, in fact, facilitate market entry and better-functioning markets.
There is also the question of whether data might constitute an essential input for certain business activities, such that a refusal to make it available to actual, or potential, competitors might constitute the abuse of a dominant market position.
Lastly, regulators have taken a keen interest in the way data sets can affect market sectors.
Regulators are grappling with all these questions, but it’s in the area of discrete markets, Parr says, that we have so far seen the most radical action.
Last year saw the UK’s Competition and Markets Authority (CMA) publish its market investigation report on retail banking. The inquiry addressed why the leading banks had retained their market shares despite a range of new players entering the market, and why relatively few retail customers switch banks.
Parr says the solution was innovative. Rather than force disproportionate and costly structural change on the banks through forced divestments, the CMA created what was called a “cross-cutting foundation” remedy through the creation of open application programming interfaces (open APIs). These allow new entrants to the banking market to access the customer transaction data controlled by the incumbent banks, enabling them to compete on a level playing field.
“That’s frankly revolutionary,” says Parr. “Through open APIs a new ecosystem will be created whereby market entrants will gain access to data to help them develop new business models and offer innovative services to customers. Other jurisdictions are now examining whether they can use the CMA’s solution as a model for their own banking markets.”
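The mechanism Parr describes can be pictured with a toy sketch. This is not the real UK Open Banking specification — the function names, fields and consent model below are invented for illustration — but it shows the basic idea: the incumbent holds the transaction data, and an authorised new entrant can query it only for customers who have consented.

```python
# Illustrative sketch of the open-API idea: an incumbent bank exposes
# customer transaction data to an authorised third party, subject to the
# customer's consent. All names and fields here are hypothetical, not
# the actual UK Open Banking standard.

from dataclasses import dataclass

@dataclass
class Transaction:
    account_id: str
    amount: float
    description: str

# The incumbent bank's internal records (simplified).
TRANSACTIONS = [
    Transaction("acc-001", -42.50, "Groceries"),
    Transaction("acc-001", 1500.00, "Salary"),
    Transaction("acc-002", -9.99, "Streaming"),
]

# Accounts whose owners have consented to share data with third parties.
CONSENTED_ACCOUNTS = {"acc-001"}

def get_transactions(account_id: str, requester_authorised: bool) -> list[dict]:
    """Return transaction data only to authorised requesters with consent."""
    if not requester_authorised or account_id not in CONSENTED_ACCOUNTS:
        raise PermissionError("access denied: no consent or authorisation")
    return [
        {"amount": t.amount, "description": t.description}
        for t in TRANSACTIONS
        if t.account_id == account_id
    ]
```

The design point is that access, not ownership, is what the remedy changes: the data stays with the incumbent, but a consent-gated interface lets entrants build services on top of it.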
The report demonstrates how access to data can be a solution to structural competition concerns; unsurprisingly, database solutions have been used elsewhere. The energy market, for instance, has seen its own CMA investigation, which required the energy regulator to create a database listing details of customers who have been on their energy supplier’s standard tariff for three or more years, with access being given to rival suppliers.
Again, the regulator’s action is focused on offering access to data in an attempt to enable market participants to target services at disengaged customers.
While particular market sectors have received attention, mergers and acquisitions will also occupy the attention of regulators, says Parr. Last year the merger of IMS and Quintiles—both providers of information services in the pharmaceutical sector—was examined by the European Commission. The merger was cleared, but the case demonstrated the regulator’s concern for data concentration.
Like most issues affected by technology and data, this is a fast-moving area. In assessing most mergers, regulators have used the PQRS model, looking at the impact of a merger on the price, quality, range and service of the goods or services supplied.
A key question, Parr says, is whether this will be an effective way of addressing the issue of big data as products and markets develop with the advance of new technology. For regulators this all adds up to a fast-evolving landscape with which they must keep pace.
“In technological terms we are in a period of rapid change,” says Parr. “We will see new players and services emerge, and much of that will be based on or require access to data, which may encourage M&A activity. Competition regulators are conscious of this.”
He adds: “Mergers is an area where they are concerned that they might have to intervene early, but certain transactions may not satisfy the turnover thresholds for assessment.”
Here the question is whether regulators will need new criteria to investigate a deal—a switch possibly, from turnover, to deal value. That point is emphasised by the sheer scale of some of the acquisitions taking place (WhatsApp was generating comparatively little annual revenue, around $10m, when it was acquired by Facebook for $22bn).
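The WhatsApp figures show why a turnover test can miss such deals; the gap between revenue and deal value is easy to quantify:

```python
# Why a turnover-based merger threshold can miss data-rich targets:
# WhatsApp's annual revenue was tiny relative to its acquisition price
# (figures as cited in the article, in US dollars).

revenue = 10_000_000          # ~ $10m annual revenue at acquisition
deal_value = 22_000_000_000   # $22bn purchase price

ratio = deal_value / revenue
print(f"deal value is {ratio:,.0f}x annual revenue")  # 2,200x
```

A threshold keyed to turnover sees a $10m business; a threshold keyed to deal value sees a $22bn one — a 2,200-fold difference in what the regulator is asked to notice.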
So far, says Parr, traditional competition law tools have proved flexible enough to apply to data questions. The market investigation process (used in the banking and energy sectors in the UK) has been around for many years. However, that doesn’t mean the watchdogs do not have concerns.
Parr says regulators are treading carefully. “They are watching the development of data closely. They need to make sure they don’t disincentivise investment and innovation. But by the same token, they are aware that if they don’t intervene early it may be too late.”
This article has been prepared in collaboration with Ashurst, a supporter of Board Agenda.