By Ameesh Divatia, Baffle
Google Healthcare has been the subject of widespread media attention recently after it was reported that Google’s “Project Nightingale” is gathering personal health data on millions of Americans. This raises many questions, not simply about privacy, but also about legal and ethical consequences.
In my opinion, consumers should be concerned about Google having access to patient information. Today, healthcare records are more valuable than financial records. The biggest security threat, especially in cloud environments, is the insider threat. We do not believe that Google itself will be careless with sensitive data, but a rogue employee or an external attacker is always a possibility, as the recent Capital One breach made evident. The fact remains that someone very knowledgeable could, for completely nefarious reasons, compromise the data.
Who Is Responsible For The Data?
The reality is that cloud providers do not take responsibility for their customers’ data. In this case, Google is both the data collector and the cloud provider, and it will take responsibility for the cloud infrastructure. But if someone were to compromise the data while it is being analyzed, Google will not take responsibility, because the cause could always be attributed to user error. A cloud provider cannot protect against a customer’s configuration mistakes or misuse of the infrastructure.
When a medical provider makes a deal like this with Google to collect data, the provider should limit access to patient data. The reason goes beyond healthcare-specific restrictions like HIPAA and HITECH: there are now broader regulations around privacy protection. GDPR, which started in the EU, gives consumers the right to know what their data will be used for, meaning data collectors must state a purpose, as well as the right to be forgotten.
Where Privacy Is Being Enforced
The most important part of the privacy regulations is the scale of their penalties. Breach regulations have been around for a long time, but they only require that actual damage be compensated. Under the privacy regulations, penalties are not tied to the loss a consumer suffers; a dollar amount is attached regardless. GDPR fines are calculated as a percentage of the data collector’s global annual revenue (up to 4%). The penalties under the CCPA (California Consumer Privacy Act), which is already law and goes into effect January 1, 2020, are flat dollar amounts of $100 to $750 per record. And multiple states are following California’s lead, including New York and Massachusetts.
What Ascension is doing with Google is legal. When patients turn over their records to Ascension, the fine print says Ascension can do whatever it wants with that data. So it is probably legal but absolutely not ethical, because compromised medical records can have all sorts of repercussions. If a cancer diagnosis is revealed, for example, that patient might not be able to get insured. That said, every insurer, before covering a group of people, asks for medical records in order to assess its risk.
The data is always going to be shared; there is no way around it in the world we live in right now. The problem is well understood: that data needs to be managed carefully.
The Solution: Privacy-Preserving Analytics
There have been significant advances in an area the industry calls privacy-preserving analytics, defined as the capability to analyze data without compromising its underlying privacy. Existing analytics applications can process data without ever seeing it in the clear. Several technology companies, including Google, are deeply involved in such efforts. What consumers should ethically demand is that Google process the data in a way that nobody, Google included, can see it.
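To make the idea concrete, here is a minimal sketch of one privacy-preserving technique, additive secret sharing, in which an aggregate (a sum across patients) is computed while no single party ever sees an individual record. This is an illustrative toy, not Google’s or any vendor’s actual implementation; the field modulus, party count, and values are all assumptions chosen for the example.

```python
import secrets

PRIME = 2**61 - 1  # field modulus; an illustrative choice

def share(value, n_parties=3):
    """Split `value` into n additive shares; no single share reveals it."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine shares; only the full set reveals the underlying value."""
    return sum(shares) % PRIME

# Each patient value (e.g., a lab result) is split across three servers.
values = [120, 135, 98]
per_server = list(zip(*(share(v) for v in values)))

# Each server sums only its own shares -- it never sees a raw value.
partial_sums = [sum(s) % PRIME for s in per_server]

# Combining the partial sums reveals the aggregate, not individual records.
total = reconstruct(partial_sums)
print(total)  # 353
```

The analytic result (the total) is recovered exactly, yet each server held only random-looking numbers, which is the essence of processing data without seeing it.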
Data is the new oil because of its value, but it is also the new asbestos: used irresponsibly, it can have harmful effects. As an industry, we are up for the challenge. We want to create solutions where this unethical behavior is not only prevented but mathematically impossible, because of safeguards that ensure no data is ever exposed in the clear, even while it is being processed. Given existing and upcoming privacy regulations, the mechanisms by which data is processed must evolve to make sure privacy is not compromised.
About The Author
Ameesh Divatia is cofounder and CEO of Baffle.