Last week, Google announced that it would be buying Fitbit, valuing the 12-year-old company at $2.1 billion. Many have seen this as an attempt to boost Google’s position in the wearables sector. So far, the company’s Wear OS platform has made relatively little impact. The acquisition certainly improves Google’s position, but it is only part of a much larger strategy to scoop up the huge amounts of data that are now being generated in the health sector: “Google aspires to create tools that help people enhance their knowledge, success, health and happiness.” But as the company acknowledges:
to get this right, privacy and security are paramount. When you use our products, you’re trusting Google with your information. We understand this is a big responsibility and we work hard to protect your information, put you in control and give you transparency about your data. Similar to our other products, with wearables, we will be transparent about the data we collect and why. We will never sell personal information to anyone. Fitbit health and wellness data will not be used for Google ads. And we will give Fitbit users the choice to review, move, or delete their data.
That boilerplate privacy pledge still leaves plenty of room for Google to gather and analyze extremely personal information, even if it doesn’t sell access to it directly. Google is acutely aware that privacy is even more of a concern for health data than it is for the many other kinds people hand over to Internet companies, because the company has already had problems in this area. Back in 2017, the UK’s Information Commissioner ruled on a collaboration between Google’s DeepMind AI division and a UK National Health Service (NHS) hospital. The Royal Free NHS Foundation Trust provided personal data of around 1.6 million patients as part of a trial to test an alert, diagnosis and detection system for acute kidney injury. According to the Commissioner, the Trust “failed to comply with the [UK] Data Protection Act when it provided patient details to Google DeepMind.”
DeepMind's health unit was recently folded into Google Health, evidence that the company is taking a closer interest in the health data sector. Google continues to sign deals with the UK's NHS to gain access to patient data. Another major project is a specialized medical record search tool. More generally, the Google Health division is ramping up:
Today we’re studying the use of artificial intelligence to assist in diagnosing cancer, predicting patient outcomes, preventing blindness, and much more. We’re exploring ways to improve patient care, including tools that are already being used by clinicians. And we’re partnering with doctors, nurses, and other healthcare professionals to help improve the care patients receive.
It’s not just Google that wants to muscle into the huge healthcare market. Facebook has announced a major initiative here: “Facebook is developing products and partnerships that can help people connect with resources to support their health. Today we’re sharing an update on some of this work, including a new Preventive Health tool in the US.” Like Google, the company is aware that protecting privacy is crucial:
Personal information about your activity in Preventive Health is not shared with third parties, such as health organizations or insurance companies, so it can’t be used for purposes like insurance eligibility.
We don’t show ads based on the information you provide in Preventive Health — that includes things like setting a reminder for a test, marking it as done or searching for a healthcare location.
However, despite that promise, Facebook will doubtless use health data to refine its analysis of who you are, and what kind of goods and services you might buy. The end result will be Mark Zuckerberg's manipulation machine knowing even more about you, and using that knowledge to make money.
Other digital giants starting to explore the healthcare sector include Amazon, Apple and Uber. They’re even getting together to set common standards. What all these moves have in common is that they will be gathering large quantities of digital information concerning our health. Although the companies involved are aware that data protection is crucial, the mere existence of increasingly complete stores of data about our bodies and health represents a real risk.
As Privacy News Online discussed last year, even apparently innocuous information from Strava, a website and mobile app used to track athletic activity via GPS coordinates, revealed all kinds of highly personal details. Once data relating to health has been captured for one reason, it can be used for quite different purposes. For example, there is a US proposal to create the Health Advanced Research Projects Agency, or HARPA, to develop ways to identify early signs of changes in people with mental illness that might lead to violent behavior. According to the Washington Post:
HARPA would develop “breakthrough technologies with high specificity and sensitivity for early diagnosis of neuropsychiatric violence,” says a copy of the proposal. “A multi-modality solution, along with real-time data analytics, is needed to achieve such an accurate diagnosis.”
The document goes on to list a number of widely used technologies it suggests could be employed to help collect data, including Apple Watches, Fitbits, Amazon Echo and Google Home.
This blog has already written a number of times about the privacy risks of using "smart" speakers; to that can be added the problems of wearables. Today's promises by Google and Facebook that they will strongly protect sensitive health data could easily be nullified by new laws obliging them to provide access to the authorities for this kind of monitoring. As ever, the best way to avoid these possible problems in the future is not to create huge centralized stores of personal data in the first place. Applying powerful analytics to healthcare information is an extremely promising field, and should be welcomed for its potential to improve people's quality of life. At the same time, thought must be given to the huge amounts of data that it draws on. That information needs to be stored locally, under the control of individuals, with limited and calibrated access provided to healthcare services. If that isn't done – and there are few signs at the moment it will be – it is only a matter of time before intimate data about our bodies is used for quite different purposes, either by criminals, or by governments.
Featured image by Paulbourgine.