UK report warns DeepMind Health could gain ‘excessive monopoly power’

DeepMind's foray into digital health services continues to raise concerns. The latest worries are voiced by a panel of external reviewers appointed by the Google-owned AI company to report on its operations after its initial data-sharing arrangements with the U.K. National Health Service (NHS) ran into a major public controversy in 2016.

The DeepMind Health Independent Reviewers' 2018 report flags a series of risks and concerns, as they see it, including the potential for DeepMind Health to be able to "exert excessive monopoly power" as a result of the data access and streaming infrastructure that's bundled with provision of the Streams app — and which, contractually, positions DeepMind as the access-controlling intermediary between the structured health data and any other third parties that might, in the future, want to offer their own digital assistance solutions to the Trust.
While the underlying FHIR (aka Fast Healthcare Interoperability Resources) standard deployed by DeepMind for Streams uses an open API, the contract between the company and the Royal Free Trust funnels connections via DeepMind's own servers, and prohibits connections to other FHIR servers.
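For illustration only, and assuming nothing about DeepMind's actual deployment, the sketch below shows the kind of generic read request the open FHIR standard defines. The endpoint, patient ID and access token are hypothetical placeholders; any conformant FHIR server would answer the same call:

    # Illustrative sketch only: a standard FHIR "read" interaction in Python.
    # The base URL and credentials are made-up placeholders, not real details
    # of the Streams deployment.
    import requests

    FHIR_BASE = "https://fhir.example-hospital.nhs.uk"  # hypothetical server

    def get_patient(patient_id, token):
        # A FHIR read is an HTTP GET against /Patient/{id}; this is what
        # makes conformant servers interchangeable in principle.
        resp = requests.get(
            f"{FHIR_BASE}/Patient/{patient_id}",
            headers={
                "Accept": "application/fhir+json",
                "Authorization": f"Bearer {token}",
            },
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

In principle a Trust could point the same client at any other FHIR server; per the reviewers' account, it is the contractual clauses rather than the technology that funnel connections through DeepMind's servers.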
It's a commercial structure that seemingly works against the openness and interoperability DeepMind co-founder Mustafa Suleyman has claimed to support.

"There are many examples in the IT arena where companies lock their customers into systems that are difficult to change or replace. Such arrangements are not in the interests of the public. And we do not want to see DeepMind Health putting itself in a position where clients, such as hospitals, find themselves forced to stay with DeepMind Health even if it is no longer financially or clinically sensible to do so; we want DeepMind Health to compete on quality and price, not by entrenching a legacy position," the reviewers write.

Though they point to DeepMind's "stated commitment to interoperability of systems," and "their adoption of the FHIR open API" as positive indications, writing: "This means that there is potential for many other SMEs to become involved, creating a diverse and innovative marketplace which works to the benefit of consumers, innovation and the economy."

"We also note DeepMind Health's intention to implement many of the features of Streams as modules which could be easily swapped, meaning that they will have to rely on being the best to stay in business," they add.

However, stated intentions and future potentials are clearly not the same as on-the-ground reality.
And, as it stands, a technically interoperable app-delivery infrastructure is being encumbered by prohibitive clauses in a commercial contract — and by a lack of regulatory pushback against such behavior.

The reviewers also raise concerns about an ongoing lack of clarity around DeepMind Health's business model — writing: "Given the current environment, and with no clarity about DeepMind Health's business model, people are likely to suspect that there must be an undisclosed profit motive or a hidden agenda. We do not believe this to be the case, but would urge DeepMind Health to be transparent about their business model, and their ability to stick to that without being overridden by Alphabet. For once an idea of hidden agendas is fixed in people's minds, it is hard to shift, no matter how much a company is motivated by the public good."

"We have had detailed conversations about DeepMind Health's evolving thoughts in this area, and are aware that some of these questions have not yet been finalised. However, we would urge DeepMind Health to set out publicly what they are proposing," they add.

DeepMind has suggested it wants to build healthcare AIs that are capable of charging by results. But Streams does not involve any AI.
The service is also being provided to NHS Trusts for free, at least for the first five years — raising the question of how exactly the Google-owned company intends to recoup its investment.

Google of course monetizes a large suite of free-at-the-point-of-use consumer products — such as the Android mobile operating system; its cloud email service Gmail; and the YouTube video sharing platform, to name three — by harvesting people's personal data and using that information to inform its ad targeting platforms. Hence the reviewers' recommendation for DeepMind to set out its thinking on its business model to avoid its intentions vis-a-vis people's medical data being viewed with suspicion.

The company's historical modus operandi also underlines the potential monopoly risks if DeepMind is allowed to carve out a dominant platform position in digital healthcare provision — given how effectively its parent has been able to turn a free-for-OEMs mobile OS (Android) into dominance of the global smartphone OS market, for example.

So, while DeepMind only has a handful of contracts with NHS Trusts for the Streams app and delivery infrastructure at this stage, the reviewers' concerns over the risk of the company gaining "excessive monopoly power" do not seem overblown.

They are also worried about DeepMind's ongoing vagueness about how exactly it works with its parent
Alphabet, and what data could ever be transferred to the ad giant — an inevitably queasy combination when stacked against DeepMind's handling of people's medical records.

"To what extent can DeepMind Health insulate itself against Alphabet instructing them in the future to do something which it has promised not to do today? Or, if DeepMind Health's current management were to leave DeepMind Health, how much could a new CEO alter what has been agreed today?" they write.

"We appreciate that DeepMind Health would continue to be bound by the legal and regulatory framework, but much of our attention is on the steps that DeepMind Health have taken to take a more ethical stance than the law requires; could this all be ended? We encourage DeepMind Health to look at ways of entrenching its separation from Alphabet and DeepMind more robustly, so that it can have enduring force to the commitments it makes."

Responding to the report's publication on its website, DeepMind writes that it's "developing our longer-term business model and roadmap."

"Rather than charging for the early stages of our work, our first priority has been to prove that our technologies can help improve patient care and reduce costs. We believe that our business model should flow from the positive impact we create, and will continue to explore outcomes-based elements so that costs are at least in part related to the benefits we deliver," it continues.

So it has nothing to say to defuse the reviewers' concerns about making its intentions for monetizing health data plain — beyond deploying a few choice PR soundbites.

On its links with
Alphabet, DeepMind also has little to say, writing only that: "We will explore further ways to ensure there is clarity about the binding legal frameworks that govern all our NHS partnerships."

"Trusts remain in full control of the data at all times," it adds. "We are legally and contractually bound to only using patient data under the instructions of our partners. We will continue to make our legal agreements with Trusts publicly available to allow scrutiny of this important point."

"There is nothing in our legal agreements with our partners that prevents them from working with any other data processor, should they wish to seek the services of another provider," it also claims in response to additional questions we put to it.

"We hope that Streams can help unlock the next wave of innovation in the NHS. The infrastructure that powers Streams is built on state-of-the-art open and interoperable standards, known as FHIR. The FHIR standard is supported in the UK by NHS Digital, NHS England and the INTEROPen group. This should allow our partner trusts to work more easily with other developers, helping them bring many more new innovations to the clinical frontlines," it adds in additional comments to us.

"Under our contractual agreements with relevant partner trusts, we have committed to building FHIR API infrastructure within the five-year terms of the agreements."

Asked about the progress it's made on a technical audit infrastructure for verifying access to health data, which it announced last year, it reiterated the wording on its blog, saying: "We will remain vigilant about setting the highest possible standards of information governance. At the beginning of this year, we appointed a full-time Information Governance Manager to oversee our use of data in all areas of our work. We are also continuing to build our Verifiable Data Audit and other tools to clearly show how we're using data."

So developments on that front look as slow as we expected.

The Google-owned U.K.
AI company began its push into digital healthcare services in 2015, quietly signing an information-sharing arrangement with a London-based NHS Trust that gave it access to around 1.6 million people's medical records for developing an alerts app for a condition called Acute Kidney Injury.

It also inked an MoU with the Trust in which the pair set out their ambition to apply AI to NHS data sets. (They even went so far as to get ethical sign-off for an AI project — but have consistently claimed the Royal Free data was not fed to any AIs.)

However, the data-sharing collaboration ran into trouble in May 2016 when the scope of patient data being shared by the Royal Free with DeepMind was revealed (via investigative journalism, rather than by disclosures from the Trust or DeepMind). None of the ~1.6 million people whose non-anonymized medical records had been passed to the Google-owned company had been informed or asked for their consent. And questions were raised about the legal basis for the data-sharing arrangement.

Last summer the U.K.
privacy regulator concluded an investigation of the project — finding that the Royal Free NHS Trust had broken data protection rules during the app's development.

Yet despite ethical questions and regulatory disquiet about the legality of the data sharing, the Streams project steamrollered on. And the Royal Free Trust went on to implement the app for use by clinicians in its hospitals, while DeepMind has also signed several additional contracts to deploy Streams to other NHS Trusts.

More recently, the law firm Linklaters completed an audit of the Royal Free Streams project, after being commissioned by the Trust as part of its settlement with the ICO. Though this audit only examined the current functioning of Streams. (There has been no historical audit of the lawfulness of people's medical records being shared during the build and test phase of the project.)

Linklaters did recommend the Royal Free terminate its wider MoU with DeepMind — and the Trust has confirmed to us that it will be following the firm's advice.

"The audit recommends we terminate the historic memorandum of understanding with DeepMind which was signed in January 2016. The MOU is no longer relevant to the partnership and we are in the process of terminating it," a Royal Free spokesperson told us.

So DeepMind, probably the world's most famous AI company, is in the curious position of being involved in providing digital healthcare services to U.K. hospitals that don't actually involve any AI at all.
(Though it does have some ongoing AI research projects with NHS Trusts too.)

In mid-2016, at the height of the Royal Free DeepMind data scandal — and in a bid to foster greater public trust — the company appointed the panel of external reviewers who have now produced their second report looking at how the division is operating.

And it's fair to say that much has happened in the tech industry since the panel was appointed to further undermine public trust in tech platforms and algorithmic promises — including the ICO finding that the initial data-sharing arrangement between the Royal Free and DeepMind broke U.K. privacy laws.

The eight members of the panel for the 2018 report are: Martin Bromiley OBE; Elisabeth Buggins CBE; Eileen Burbidge MBE; Richard Horton; Dr. Julian Huppert; Professor Donal O'Donoghue; Matthew Taylor; and Professor Sir John Tooke.

In their latest report the external reviewers warn that the public's view of tech giants has "shifted substantially" versus where it was even a year ago — asserting that "issues of privacy in a digital age are, if anything, of greater concern."

At the same time politicians are also gazing rather more critically on the works and
social impacts of tech giants.

Although the U.K. government has also been keen to position itself as a supporter of AI, providing public funds for the sector and, in its Industrial Strategy white paper, identifying AI and data as one of four so-called "Grand Challenges" where it believes the U.K. can "lead the world for years to come" — including specifically name-checking DeepMind as one of a handful of leading-edge homegrown AI businesses for the country to be proud of.

Still, questions over how to manage and regulate public sector data and AI deployments — especially in highly sensitive areas such as healthcare — remain to be clearly addressed by the government.

Meanwhile, the encroaching ingress of digital technologies into the healthcare space — even when the technologies don't involve any AI — is already presenting major challenges by putting pressure on existing information governance rules and structures, and raising the specter of monopolistic risk.

Asked whether it offers any guidance to NHS Trusts around digital assistance for clinicians, including specifically whether it requires multiple options be offered by different providers, the NHS' digital services provider, NHS Digital, referred our question on to the Department of Health (DoH), saying it's a matter of health policy. The DoH in turn referred the question to NHS England, the executive non-departmental body
which commissions contracts and sets priorities and directions for the health service in England. And at the time of writing, we're still waiting for a response from the steering body.

Ultimately it looks like it will be up to the health service to put in place a clear and robust structure for AI and digital decision services that fosters competition by design — by baking in a requirement for Trusts to support multiple independent options when procuring apps and services.

Without that important check and balance, the risk is that platform dynamics will quickly dominate and control the emergent digital health assistance space — just as big tech has dominated consumer tech.

But publicly funded healthcare decisions and data sets should not simply be handed to the single market-dominating entity that's willing and able to burn the most resource to own the space. Nor should government stand by and do nothing when there's a clear risk that a vital area of digital innovation is at risk of being closed down by a tech giant muscling in and positioning itself as a gatekeeper before others have had a chance to show what their ideas are made of, and before even a market has had the chance to form.