iOS encryption measures largely ‘unused,’ cryptographers say




iOS does not take advantage of its built-in encryption measures as much as it could, leaving potentially unnecessary security vulnerabilities, according to cryptographers at Johns Hopkins University (via Wired).


Using public documentation from Apple and Google, law enforcement reports on bypassing mobile security features, and their own analysis, the cryptographers assessed the strength of iOS and Android encryption. They found that while the encryption infrastructure on iOS “sounds really good,” it goes largely unused:

“On iOS in particular, the infrastructure is in place for this hierarchical encryption which sounds really good,” said Maximilian Zinkus, senior iOS researcher. “But I was really surprised at how unused it is.”

When an iPhone boots, all stored data is in a state of “Complete Protection” and the user must unlock the device before anything can be decrypted. While this is extremely secure, the researchers pointed out that once the device has been unlocked for the first time after a reboot, a large amount of data moves into a state Apple calls “Protected Until First User Authentication.”

Because devices are rarely rebooted, most data sits in the “Protected Until First User Authentication” state rather than “Complete Protection” most of the time. The advantage of this less secure state is that the decryption keys are kept in quick-access memory, where applications can reach them rapidly.
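For developers, these states correspond to the Data Protection classes exposed through Apple’s file APIs. As a rough illustration, here is a minimal Swift sketch of opting a file into the stricter class; the file name and payload are hypothetical, and error handling is kept to a bare minimum:

```swift
import Foundation

// Hypothetical example: write a file under the strictest Data Protection
// class. ".completeFileProtection" keeps the file's key evicted whenever the
// device is locked -- the "Complete Protection" state described above.
let payload = Data("example sensitive payload".utf8)
let url = FileManager.default.urls(for: .documentDirectory,
                                   in: .userDomainMask)[0]
    .appendingPathComponent("secrets.dat")  // hypothetical file name

do {
    try payload.write(to: url, options: .completeFileProtection)
} catch {
    print("Write failed: \(error)")
}

// By contrast, ".completeFileProtectionUntilFirstUserAuthentication" (the
// default class for most app files) leaves the decryption key available from
// the first unlock after boot until the device powers off -- the weaker
// "Protected Until First User Authentication" state the researchers describe.
try? payload.write(to: url,
                   options: .completeFileProtectionUntilFirstUserAuthentication)
```

The trade-off the researchers highlight is visible here: the stricter class makes the file unreadable any time the screen is locked, which is why most files default to the weaker class for convenience.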

In theory, an attacker could find and exploit certain types of security vulnerabilities in iOS to grab the encryption keys held in quick-access memory, allowing them to decrypt large amounts of data from the device. This is believed to be how many smartphone access tools work, such as those from the forensic access company Grayshift.

While it is true that attackers need a specific operating system vulnerability to access the keys, and Apple and Google patch many of these flaws as they are noticed, this kind of attack could be avoided by hiding the encryption keys more deeply.
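Apps can already do some of this key hiding themselves through the Keychain’s accessibility classes, which control when a secret’s key material is available. A hedged Swift sketch, assuming a generic-password item (the account name and token are hypothetical):

```swift
import Foundation
import Security

// Hypothetical example: store a secret so it is only decryptable while the
// device is unlocked, rather than from first unlock onward.
let secret = Data("example token".utf8)
let query: [String: Any] = [
    kSecClass as String: kSecClassGenericPassword,
    kSecAttrAccount as String: "com.example.app.token",  // hypothetical account
    kSecValueData as String: secret,
    // Stricter than the common kSecAttrAccessibleAfterFirstUnlock: with this
    // class, the item's key material is evicted whenever the device locks,
    // and the item never migrates to another device via backups.
    kSecAttrAccessible as String: kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
]

let status = SecItemAdd(query as CFDictionary, nil)
if status != errSecSuccess {
    print("Keychain add failed: \(status)")
}
```

This mirrors the researchers’ point at the app level: the stricter accessibility class narrows the window during which a memory-scraping attacker could recover the secret, at the cost of the secret being unavailable to background tasks while the device is locked.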

“It really shocked me, because I came into this project thinking that these phones really protect user data well,” said Matthew Green, a cryptographer at Johns Hopkins. “Now I’ve come out of the project thinking almost nothing is protected as much as it could be. So why do we need a backdoor for law enforcement when the protections that these phones actually offer are so bad?”

The researchers also shared their findings and a number of technical recommendations directly with Apple. An Apple spokesperson offered a public statement in response:

“Apple devices are designed with multiple layers of security to protect against a wide range of potential threats, and we’re constantly working to add new protections for our users’ data. As customers continue to increase the amount of sensitive information they store on their devices, we will continue to develop additional protections, both hardware and software, to protect their data.”

The spokesperson also told Wired that Apple’s security work is primarily focused on protecting users from hackers, thieves, and criminals seeking to steal personal information. They also noted that the types of attacks the researchers identified are very expensive to develop, require physical access to the target device, and only work until Apple releases a fix. Apple also stressed that its goal with iOS is to balance security and convenience.
