Secret Isolated facilities test Apple Chips under tough conditions to ensure that their 'Secure Enclave' Safeguards Users' Data

Image: Testing an Apple iPhone A-series (AX) processor

 

Apple's iPhone processors are the first line of defense in a battle that Apple never stops fighting as it tries to keep users' data private. To ensure that those processors can withstand attacks, Apple has a secret testing facility near the mothership that houses highly advanced machines where iPhone chips undergo a grueling and intense workout that will heat, cool, push, shock and otherwise abuse them.

 

It's an ongoing battle that is being fought on many fronts: against governments that want to read users' personal data; against hackers who try to break into devices on those governments' behalf; and against other companies that have attacked Apple's stringent privacy policies.

 

The battle extends even to the Chinese government, despite laws that force Apple to store Chinese users' private data on systems that give the country's government nearly unlimited access. The unfortunate drawback, as seen by many, is that the same protections also shield terrorists, criminals, pedophiles, human traffickers and drug dealers from governments trying to keep citizens safe.

 

Further, critics have argued that the approach has meant Apple is overly concerned with privacy in a way that limits features, and is only made possible by its colossal wealth - cash generated by the premium the firm charges for its products, in effect depriving those unable to pay of its benefits.

 

Craig Federighi, Apple's senior vice president of software engineering, spoke to The Independent about Apple's commitment to privacy: "I can tell you that privacy considerations are at the beginning of the process, not the end. When we talk about building the product, among the first questions that come out is: how are we going to manage this customer data?"

 

Federighi points out that not all of the sensitive data on phones is merely personal; some of it has very public consequences. For instance, "If I'm a worker at a power plant, I might have access to a system that has a very high consequence. The protection and security of those devices is actually really critical to public safety. We know that there are plenty of highly motivated attackers who want to, for profit or otherwise, break into these valuable stores of information on our devices."

 

Apple has repeatedly argued that the creation of a master key or back door to allow only governments into secure devices is simply not possible – any entry point that lets law enforcement in will inevitably be exploited by the criminals they are fighting, too. So protecting the owner of the phone as much as possible keeps data private and ensures that devices stay safe, he argues.

 

The report further noted that Apple's commitment to privacy sits at the core of the company's values, even when many customers regard it with indifference or even downright disdain.

 

A part of Apple's processor known as the Secure Enclave, which debuted with the iPhone 5s, acts as something of an inner sanctum: the part of the phone that stores its most sensitive information and is fitted out with all the security required to do so.

 

Inside the enclave are key pieces of information, such as the keys that lock up the biometric data your fingerprint or face is checked against at the sensor, and the keys that lock messages so they can only be read by the people sending or receiving them.
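
Apple doesn't publish the enclave's internal interfaces, but its public Security framework gives a sense of the model from a developer's point of view. The minimal Swift sketch below (the function name and key tag are our own illustrative choices) asks the Secure Enclave to generate a private key that never leaves the chip and can only be used after a biometric check:

```swift
import Foundation
import Security

// Illustrative sketch only: a third-party app asking the Secure Enclave to
// create a private key that never leaves the chip. The caller gets back a
// reference (SecKey), never the key material itself.
func makeSecureEnclaveKey(tag: String) -> SecKey? {
    // The key may only be used for private-key operations, and only after the
    // user passes Face ID or Touch ID with the currently enrolled biometrics.
    guard let access = SecAccessControlCreateWithFlags(
        kCFAllocatorDefault,
        kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
        [.privateKeyUsage, .biometryCurrentSet],
        nil
    ) else { return nil }

    let attributes: [String: Any] = [
        kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom, // 256-bit elliptic-curve key
        kSecAttrKeySizeInBits as String: 256,
        kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave,    // generate it inside the enclave
        kSecPrivateKeyAttrs as String: [
            kSecAttrIsPermanent as String: true,
            kSecAttrApplicationTag as String: Data(tag.utf8),
            kSecAttrAccessControl as String: access
        ]
    ]

    var error: Unmanaged<CFError>?
    return SecKeyCreateRandomKey(attributes as CFDictionary, &error)
}
```

The returned SecKey is only a handle; the actual key material stays inside the enclave, which is exactly the property the article describes.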

 

Those keys must be kept secure if the phone is to stay safe: the keys protect the biometric data, and the biometric data is the thing that ensures what's inside the phone can only be seen by its owner. While Apple has had some scares about both parts of that process being compromised – such as a disproven suggestion that its Face ID facial recognition technology could be fooled by mannequins – security experts say its approach has worked out.
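
To illustrate the other half of that chain, again using only the public API rather than anything Apple has described internally, here is a sketch of actually using such an enclave-backed key. Because the key was created with a biometry requirement, the Secure Enclave refuses to perform the operation until Face ID or Touch ID succeeds:

```swift
import Foundation
import Security

// Illustrative sketch only: signing a message with an enclave-backed key.
// The app hands data in and gets a signature out; the private key itself is
// never exposed, and the enclave won't use it until biometrics succeed.
func sign(message: Data, with privateKey: SecKey) -> Data? {
    var error: Unmanaged<CFError>?
    guard let signature = SecKeyCreateSignature(
        privateKey,
        .ecdsaSignatureMessageX962SHA256,   // ECDSA over SHA-256, supported by enclave keys
        message as CFData,
        &error
    ) else { return nil }
    return signature as Data
}
```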

 

So the aim as the chips are being stress tested is to see if they misbehave in these kinds of extreme scenarios – and, if they do, to ensure that happens in this lab rather than once they are inside the phones of users. Any kind of misbehavior could be fatal to a device.

 

Image: Apple's secret facility for testing A-series (AX) processors

 

It might seem unlikely that any normal phone would be subjected to this kind of beating, given the slim chance of its owner passing through an environment that chills it to -40C or heats it to 110C. But the concern here is not normal use at all. If the chips were found to be insecure under this kind of pressure, then bad actors would immediately start putting phones through it, and all the data they store could be boiled out of them.
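
Apple hasn't said how these rigs actually drive the chips, but the underlying idea can be sketched as a known-answer test: run the same deterministic operation at each temperature setpoint and flag any result that diverges from the room-temperature reference. Everything in the sketch below, in particular the ThermalChamber and DeviceUnderTest types, is hypothetical and only illustrates the technique:

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of a known-answer stress test. Nothing here reflects
// Apple's real tooling; ThermalChamber and DeviceUnderTest are invented
// placeholders for whatever drives the actual rigs.
protocol ThermalChamber {
    func set(temperatureCelsius: Double)      // hold the chamber at a setpoint
}

protocol DeviceUnderTest {
    func sha256(_ input: Data) -> Data        // ask the chip under test for a hash
}

// Runs the same deterministic operation at each extreme setpoint and reports
// every setpoint where the result diverges from the reference answer; such a
// divergence would suggest the chip misbehaves under thermal stress.
func stressTest(chamber: ThermalChamber, device: DeviceUnderTest) -> [Double] {
    let input = Data("known-answer test vector".utf8)
    let reference = Data(SHA256.hash(data: input))        // expected answer, computed on the host
    var failingSetpoints: [Double] = []

    for setpoint in [-40.0, -20.0, 25.0, 85.0, 110.0] {   // the article cites -40C to 110C
        chamber.set(temperatureCelsius: setpoint)
        for _ in 0..<1_000 {                               // repeat to catch intermittent faults
            if device.sha256(input) != reference {
                failingSetpoints.append(setpoint)
                break
            }
        }
    }
    return failingSetpoints
}
```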

 

If such a fault were found after the phones make their way to customers, there would be nothing Apple could do: unlike software, chips can't be changed once they are in people's hands. So Apple looks instead to find any possible dangers in this room, tweaking and fixing to ensure the chips can cope with anything thrown at them.

 

The chips arrive in this room years before they make it into shipping products; the silicon sitting inside the boxes could be years away from making its way into users' hands.

 

Eventually, they will find their way into Apple's shining new iPhones, Macs, Apple Watches and whatever other luxury computing devices the company brings to market in the future. The cost of those products has led to some criticism from Apple's rivals, who have framed it as the price of privacy: that Apple is happy to talk about how little data it collects, but is only able to do so because of the substantial premiums its products command. That was the argument recently made by Google boss Sundar Pichai, in just one of a range of recent broadsides between tech companies about privacy.

 

"Privacy cannot be a luxury good offered only to people who can afford to buy premium products and services," Pichai wrote in an op-ed in the New York Times. He didn't name Apple, but he didn't need to.

 

Privacy is in danger of becoming a marketing term. Like artificial intelligence and machine learning before it, there is the possibility that it becomes just another word that tech firms need to promise users they're thinking about – even if how it is actually being used remains largely unknown.

 

Image: Microsoft pushed privacy and trust in its Build 2019 keynote

 

Patently Apple covered this in a report this month titled "With Apple leading the way, Google & Microsoft attempt to Redefine their Companies as True Advocates of Privacy."

 

You can read a lot more about Apple's privacy stance and its chip testing facility in the full report, titled "Inside Apple's Top Secret Testing Facilities where the iPhone defenses are forged in temperatures of -40C."

 

Apple's stance on privacy is about making no compromises. That's why a new class action lawsuit filed against Apple last week, alleging that the company has sold private customer data and song lists to data aggregators and the like, was so surprising to read about. Apple will have to crush this lawsuit by proving it doesn't participate in such activities for profit, or its reputation for privacy will blow up in its face.

 
