http://www.slate.com/articles/technology/future_tense/2014/09/ios_8_encryption_why_apple_won_t_unlock_your_iphone_for_the_police.html

Last week Apple released its new iOS 8 operating system for iPhones, iPads,
and iPod Touch devices. Most of the coverage of iOS 8 focuses on visible
features that users can interact with. But there’s one major change in iOS
8 that most users probably won’t notice unless they find themselves in a
great deal of trouble. Specifically, Apple has radically improved the way
that data on those devices is encrypted. Once you set a passcode, Apple
will no longer be able to unlock your device—even if ordered to do so by a
court.

While privacy advocates have praised Apple’s move, it has drawn fire from
some notable legal scholars. Writing in the Washington Post on Sept. 19,
Orin Kerr referred to Apple’s new policy as a “dangerous game,” one that
“doesn’t stop hackers, trespassers, or rogue agents” but “only stops lawful
investigations with lawful warrants.” While Kerr has moderated his views
since his initial post, his overarching concern remains the same: By
placing customer interests before those of law enforcement, Apple is working
against the public interest. If you interpret Apple’s motivations as Kerr
does, then Apple’s recent move is pretty surprising. Not only has the
company picked a pointless fight with the United States government, it’s
potentially putting the public at risk.

The only problem is that Kerr is wrong about this. Apple is not designing
systems to prevent law enforcement from executing legitimate warrants. It’s
building systems that prevent everyone who might want your data—including
hackers, malicious insiders, and even hostile foreign governments—from
accessing your phone. This is absolutely in the public interest. Moreover,
in the process of doing so, Apple is setting a precedent that users, and
not companies, should hold the keys to their own devices.

To see why this is the case, you need to know a bit about what Apple is
doing with its new technology. The first time you power up a new iPhone or
iPad, you’ll be asked to set a passcode for unlocking your phone. This can
be a full password or just a 4-digit PIN (though the former is certainly
stronger). On devices with a Touch ID sensor, you’ll also be allowed to use
your fingerprint as a more convenient alternative.

A passcode may look like flimsy security, but it’s not. The minute you set
one, Apple’s operating system immediately begins encrypting your phone’s
sensitive data—including mail, texts, photos, and call records—using a form
of encryption that the U.S. government uses to protect classified military
secrets. The key for this encryption is mathematically derived by combining
your passcode with a unique set of secret numbers that are baked into your
phone.
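To make the idea concrete, here is a minimal sketch of that kind of key derivation. It is not Apple's actual algorithm—the function name, iteration count, and example UID are all assumptions for illustration—but it shows the essential trick: the key depends on both the passcode and a secret that never leaves the device.

```python
import hashlib

# Hedged sketch (not Apple's real implementation): derive a file-encryption
# key by mixing the user's passcode with a per-device hardware secret ("UID").
def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # PBKDF2 makes each guess deliberately slow; using the UID as the salt
    # binds the result to this device, so the passcode alone is useless
    # when read off a different machine.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

uid = bytes.fromhex("00112233445566778899aabbccddeeff")  # made-up example UID
key = derive_key("1234", uid)  # 32-byte key for SHA-256
```

The same passcode entered on a device with a different UID produces a completely different key, which is why guessing the passcode off-device gets an attacker nowhere.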

If all goes well, you’ll never notice this is happening. But the impact on
data raiders is enormous. Even if someone cracks your phone open and
attempts to read data directly off of the memory chips, all she’ll see is
useless, scrambled junk. Guessing your passcode won’t help her—unless she
can also recover the secret numbers that are stored within your phone’s
processor. And Apple’s latest generation of phones makes that very
difficult. Of course, your would-be data thief could try to get in by
exhaustively trying all possible combinations, but according to an iOS
security document, Apple also includes protections to slow this attack
down. (In the same document, Apple estimates that a 6-character alphanumeric
password could take upward of five years to guess.)
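Apple's five-year figure is easy to sanity-check. Assuming (per the iOS security document) that the hardware enforces roughly 80 milliseconds per attempt, and treating "alphanumeric" as lowercase letters plus digits for simplicity:

```python
# Back-of-the-envelope check of Apple's brute-force estimate.
SECONDS_PER_GUESS = 0.08   # ~80 ms hardware delay per attempt
ALPHABET = 36              # assumption: 26 lowercase letters + 10 digits
LENGTH = 6

combinations = ALPHABET ** LENGTH                   # ~2.18 billion passcodes
worst_case_years = combinations * SECONDS_PER_GUESS / (365 * 24 * 3600)
print(round(worst_case_years, 1))                   # roughly 5.5 years
```

Even this conservative alphabet yields about five and a half years to try every combination, consistent with Apple's claim; mixed-case or symbols push it far higher.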

The encryption on Apple devices is not entirely new with iOS 8. What is new
is the amount of data your phone will now encrypt. Apple has extended
encryption protections to nearly all the data you produce on a daily basis
and will also require you to enter the passcode (or fingerprint) each time
you reboot your phone. In addition, if you purchase a recent iPhone (5S, 6,
or 6 Plus), Apple will store your keys within a dedicated hardware
encryption “co-processor” called the Secure Enclave.

Taking Apple’s recent privacy announcements at face value, even Apple
itself can’t break into the Secure Enclave in your phone. While it may seem
“natural” that the designer of a system—in this case Apple—can break its
own encryption, the truth is that such a capability is hardly an inevitable
design outcome. For Apple to maintain such a capability with its newer
security processors, it can’t just be more knowledgeable than its
customers. It would have to literally design in a form of “skeleton key.”
In computer security circles this mechanism is generally known as a
“backdoor.”
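A toy illustration makes the point that a backdoor is an explicit design choice, not a side effect of knowing how the system works. Here XOR stands in for real encryption (it is not secure and not what any phone does); the "escrow" record is the hypothetical skeleton key:

```python
import os

# Toy key-escrow sketch: XOR stands in for real key-wrapping.
data_key = os.urandom(16)    # the key that actually encrypts the files
user_key = os.urandom(16)    # stands in for a key derived from the passcode

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Normal design: the data key is stored only wrapped under the user's key.
wrapped_for_user = xor(data_key, user_key)

# A "skeleton key" design would store a second copy wrapped under a key
# the manufacturer holds. That extra record IS the backdoor.
escrow_key = os.urandom(16)  # hypothetical manufacturer-held key
wrapped_for_escrow = xor(data_key, escrow_key)

# Either key-holder can recover the data key...
assert xor(wrapped_for_user, user_key) == data_key
assert xor(wrapped_for_escrow, escrow_key) == data_key
```

If the escrow record is simply never created—which is what Apple now does—no amount of insider knowledge recreates the manufacturer's access.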

