Apple Expands Its Work on 'Differential Privacy' to Safeguard Next-Gen Health Record Solutions & Beyond
Last year, Apple Inc. kicked off a massive experiment with new privacy technology aimed at solving an increasingly thorny problem: how to build products that understand users without snooping on their activities. Its answer is differential privacy, a term virtually unknown outside of academic circles until a year ago. Today, other companies such as Microsoft Corp. and Uber Technologies Inc. are experimenting with this technology.
Differential privacy is key to Apple's artificial intelligence efforts, said Abhradeep Guha Thakurta, an assistant professor at the University of California, Santa Cruz, who worked on Apple's differential-privacy systems until January of this year. "Apple has tried to stay away from collecting data from users until now, but to succeed in the AI era they have to collect information about the user," Thakurta said. Apple began rolling out its differential-privacy software in September.
Differential privacy was originally used to understand how Apple customers were using emojis or new slang expressions on the phone. Apple is now expanding its use to cover the collection and analysis of web-browsing and health-related data, Katie Skinner, an Apple software engineer, said at the company's annual developers conference in June.
The company is now receiving millions of pieces of information daily -- all protected via this technique -- from Macs, iPhones and iPads running the latest operating systems, she said.
An Apple spokesman said via email to the Wall Street Journal that "Apple believes that great features and privacy go hand in hand."
Researchers are coming up with "surprisingly powerful" uses of differential privacy, but the technology is only about a decade old, said Benjamin Pierce, a computer science professor at the University of Pennsylvania. "We're really far from understanding what the limits are," he said.
Differential privacy has seen wider adoption since Apple first embraced it. Uber employees, for example, use it to improve services without being overexposed to user data, a spokeswoman said via email.
Microsoft is working with San Diego Gas & Electric Co. on a pilot project to make smart-meter data available to researchers and government agencies for analysis, while making sure "any data set cannot be tied back to our customers," said Chris Vera, head of customer privacy at the utility.
Google, one of differential privacy's earliest adopters, has used it to keep Chrome browser data anonymous. But while the technology is good for some types of analysis, it suffers where precision is required. For example, experts at Google say it doesn't work in so-called A/B tests, in which two versions of a webpage are tested on a small number of users to see which generates the better response.
"In some cases you simply can't answer the questions that developers want answers to," said Yonatan Zunger, a privacy engineer at Google. "We basically see differential privacy as a useful tool in the toolbox, but not a silver bullet."
Why is Differential Privacy Important?
The Wall Street Journal reports that "the problem differential privacy tries to tackle is that modern data-analysis tools are capable of finding links between large databases. Privacy experts worry these tools could be used to identify people in otherwise anonymous data sets."
Two years ago, researchers at the Massachusetts Institute of Technology discovered shoppers could be identified by linking social-media accounts to anonymous credit-card records and bits of secondary information, such as the location or timing of purchases.
"I don't think people are aware of how easy it is getting to de-anonymize data," said Ishaan Nerurkar, whose startup LeapYear Technologies Inc. sells software for leveraging machine learning while using differential privacy to keep user data anonymous.
Differentially private algorithms blur the data being analyzed by adding a measurable amount of statistical noise. This could be done, for example, by swapping out the answer to a sensitive question (have you ever committed a violent crime?) for the answer to a question with a statistically known response rate (were you born in February?).
Someone trying to find links in the data would never be sure which question a particular person was asked. That lets researchers analyze sensitive data such as medical records without being able to tie the data back to specific people.
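As a rough illustration of the randomized-response idea described above, here is a minimal Python sketch. The 50/50 question assignment, the 10% trait rate, and the function names are illustrative assumptions, not a description of Apple's system:

```python
import random

def randomized_response(has_trait: bool, p_sensitive: float = 0.5) -> bool:
    """With probability p_sensitive, truthfully answer the sensitive
    question; otherwise answer the innocuous question
    ("Were you born in February?", known rate of about 1/12)."""
    if random.random() < p_sensitive:
        return has_trait
    return random.random() < 1 / 12

def estimate_rate(responses, p_sensitive=0.5, innocuous_rate=1/12):
    """Recover the aggregate rate of the sensitive trait:
    observed = p * true_rate + (1 - p) * innocuous_rate."""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_sensitive) * innocuous_rate) / p_sensitive

# Simulate 100,000 respondents, 10% of whom have the sensitive trait.
random.seed(42)
truths = [random.random() < 0.10 for _ in range(100_000)]
answers = [randomized_response(t) for t in truths]
print(f"estimated trait rate: {estimate_rate(answers):.3f}")  # close to 0.10
```

Any single "yes" is deniable, since it may simply mean a February birthday, yet the aggregate rate of the sensitive trait is still recoverable from the pooled answers.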
This is important to one of Apple's latest health-related projects, which Patently Apple covered back on June 15 in a report titled "Apple is working on a new Major Endeavor Related to Creating a Centralized Health Management System." We followed up with a secondary report on June 19 titled "Apple is Reportedly Working with Health Gorilla on their Secret Health Project Revealed last Week."
Handling a user's health records will require rock-solid privacy technology, and Apple will need to articulate its privacy policies with clarity if differential privacy is to be a cornerstone of its future health applications.
For more on this, read the full Wall Street Journal report here.