https://archive.is/1ILVS
Remember...they can make you use touch id...they can't make you give them your password.
https://x.com/runasand/status/2017659019251343763?s=20
The FBI was able to access Washington Post reporter Hannah Natanson's Signal messages because she used Signal on her work laptop. The laptop accepted Touch ID for authentication, meaning the agents were allowed to require her to unlock it.
Link which doesn't directly support website owned by unscrupulous trillionaire: https://xcancel.com/runasand/status/2017659019251343763?s=20
Good reminder to also set up something that does this automatically for you:
https://news.ycombinator.com/item?id=46526010
I generally avoid extensions that can read all sites (even if technically necessary), so use the suggestion found here [1] instead.
A few bookmarklets:
javascript:(function(){var h=location.host;if(h==='x.com'||h.endsWith('.x.com'))location.host='xcancel.com';})()
javascript:(function(){var h=location.host;if(h==='youtube.com'||h.endsWith('.youtube.com'))location.host='inv.nadeko.net';})()
javascript:(function(){var h=location.hostname;if(h==='instagram.com'||h.endsWith('.instagram.com'))location.replace('https://imginn.com'+location.pathname);})()
[1] https://www.reddit.com/r/uBlockOrigin/comments/1cc0uon/addin...
I actually think it is fitting to read about a government agency weaponized by an unscrupulous billionaire going after journalists working for an unscrupulous billionaire, on a platform owned by an unscrupulous trillionaire.
They can hold you in contempt for 18 months for not giving your password, https://arstechnica.com/tech-policy/2020/02/man-who-refused-....
Being held in contempt at least means you got a day in court first. A judge telling me to give up my password is different than a dozen armed, masked secret police telling me to.
> A judge telling me to give up my password is different than a dozen armed, masked secret police telling me to.
Yes, a judge is unlikely to order your execution if you refuse. Based on the recent pattern of their behavior, masked secret police who are living their wildest authoritarian dreams are likely to execute you if you anger them (for example, by refusing to comply with their desires).
That's a very unusual and narrow exception involving "foregone conclusion doctrine", an important fact missed by Ars Technica but elaborated on by AP: https://apnews.com/general-news-49da3a1e71f74e1c98012611aedc...
> Authorities, citing a "foregone conclusion exception" to the Fifth Amendment, argued that Rawls could not invoke his right to self-incrimination because police already had evidence of a crime. The 3rd Circuit panel agreed, upholding a lower court decision.
I do not follow the logic here; what does that even mean? It seems very dubious. And what happens if one legitimately forgets? Do they just get to keep you there forever?
And why do they need to unlock your phone if they already proved you did the crime?
You're delusional. When ICE starts executing people on the spot for not giving up iPhone passwords, I'll eat my words.
???
I previously commented a solution to another problem, but it assists here too:
https://news.ycombinator.com/item?id=44746992
This command will make your MacBook hibernate when the lid is closed or the laptop sleeps, so RAM is written to disk and the system powers down. The downside is that it increases the time it takes to resume.
A nice side benefit, though, is that a fingerprint is not accepted on first unlock; I believe secrets are still encrypted at this stage, similar to a cold boot. A fingerprint still unlocks from the screensaver normally, as long as the system does not sleep (and therefore hibernate).
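For reference, the linked comment presumably boils down to pmset's hibernate settings; a rough sketch (check man pmset before applying, since supported modes vary by model):
sudo pmset -a hibernatemode 25        # on sleep: write RAM to disk, then power memory off
sudo pmset -a destroyfvkeyonstandby 1 # optionally also drop the FileVault key from memory on standby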
As far as I know lockdown mode and BFU prevent touch ID unlocking.
At least with a password or PIN, you choose whether to give it over.
Remember that our rights aren't laws of nature. They have to be fought for to be respected by the government.
Is the knowledge of which finger to use protected as much as a passcode? Law enforcement might have the authority to physically hold the owner's finger to the device, but it seems that the owner has the right to refuse to disclose which finger is the right one. If law enforcement doesn't guess correctly in a few tries, the device could lock itself and require the passcode.
Another reason to use my dog's nose instead of a fingerprint.
I really wish Apple would offer a PIN option on macOS, for this reason precisely. Either that, or an option to automatically disable Touch ID after a short amount of time (e.g. an hour, or if my phone doesn't connect to the laptop).
You can set up a separate account with a long password on macOS and remove your user account from the set of accounts that can unlock FileVault. Then you can change your account to use a short password. You can also change various settings regarding how long the Mac has to sleep before FileVault requires unlocking again.
I didn't understand how a user that cannot unlock FileVault helps. Can you please elaborate on this setup? Thanks.
With that setup, on boot or after a long sleep one must first log in to the account with the longer password. Then one logs out of that and switches to the primary account with the short password.
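The FileVault part of that setup can be scripted; a sketch, assuming the long-password account already exists ("unlocker" and "me" are placeholder names):
sudo fdesetup list                    # show which accounts can currently unlock FileVault
sudo fdesetup add -usertoadd unlocker # grant the long-password account unlock rights
sudo fdesetup remove -user me         # remove the short-password daily account from the unlock list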
As another alternative, rather than using Touch ID you can set up a YubiKey or similar hardware key for login to macOS. Then your login does indeed become a PIN with 3 tries before lockout. That plus a complex password is pretty convenient but not biometric. It's what I've done for a long time on my desktop devices.
On my Macbook Pro, I usually need to use both touch and a password but that might be only when some hours have passed between log ins.
You can script a time out if desired.
uhm, are you saying it's not possible to require an actual password to unlock macOS?
There's only ten possible guesses, and most people use their thumb and/or index finger, leaving four much likelier guesses.
Also, IANAL, but I'm pretty sure that if law enforcement has a warrant to seize property from you, they're not obligated to do so immediately the instant they see you - they could have someone follow you and watch to see how you unlock your phone before seizing it.
A 0.1 chance in itself is already very good odds, and 0.1 * n tries makes it even more laughable. Also, most people register two fingers for Touch ID, which brings the number close to half in reality.
Remember, this isn't how it works in every country.
Also, using biometrics on a device, and your biometrics unlock said device, do wonders for proving to a jury that you owned and operated that device. So you're double screwed in that regard.
I don't get why I can be forced to use my biometrics to unlock but I cannot be forced to give a pin. Doesn't jive in my brain.
When they arrest you, they have physical control of your body. You're in handcuffs. They can put your fingers against the unlock button. You can make a fist, but they can have more strength and leverage to unfist your fist.
There's no known technique to force you to input a password.
It's something you know vs. something you have. That's how the legal system sees it. You might not tell someone the pin to your safe, but if police find the key to it, or hire a locksmith to drill out your safe, it's theirs with a warrant.
It's interesting in the case of social media companies. Technically the data held is the companies data (Google, Meta, etc.) however courts have ruled that a person still has an expectation of privacy and therefore police need a warrant.
The fifth amendment gives you the right to be silent, but they didn't write in anything about biometrics.
> they can't make you give them your password.
Except when they can: https://harvardlawreview.org/print/vol-134/state-v-andrews/
"Allowed to require" is very mild phrasing, which could include torture or abuse of force...
https://xkcd.com/538/
Reminder that you can press the iPhone power button five times to require passcode for the next unlock.
Did you know that on most models of iPhone, saying "Hey Siri, whose iPhone is this?" will disable biometric authentication until the passcode is entered?
hm. didn't work on my 17 pro :( might be due to a setting i have.
They disabled that in like iOS 18.
Serious question. If I am re-entering the US after traveling abroad, can customs legally ask me to turn the phone back on and/or seize my phone? I am a US citizen.
Out of habit, I keep my phone off during the flight and turn it on after clearing customs.
my understanding is that they can hold you for a couple days without charges for your insubordination but as a citizen they have to let you back into the country or officially arrest you, try to get an actual warrant, etc.
they can just break the law
If you are a US citizen, you legally cannot be denied re-entry into the country for any reason, including not unlocking your phone. They can make it really annoying and detain you for a while, though.
Or squeeze the power and volume buttons for a couple of seconds. It's good to practice both these gestures so that they become reflex, rather than trying to remember them when they're needed.
Sad, neither of those works on Android. Pressing the power button activates the emergency call screen with a countdown to call emergency services, and power + volume either just takes a screenshot or enables vibrations/haptics depending on which volume button you press.
Did you check your phone settings? Mine has an option to add it to the power menu, so you get to it by whichever method you use to do that (which itself is sad that phones are starting to differ in what the power key does).
On Pixel phones, Power + Volume Up retrieves a menu where you can select "Lockdown".
Not on my Pixel phone, that just sets it to vibrate instead of ring. Holding down the power button retrieves a menu where you can select "Lockdown".
On my 9 you get a setting to choose if holding Power gets you the power menu or activates the assistant (I think it defaulted to assistant? I have it set to the power menu because I don't really ever use the assistant.)
Yes, that was the default for me, but I changed it in settings.
Oh wow, just going into the "should I shutdown" menu also goes into pre-boot lock state? I didn't know that.
It doesn't reenter a BFU state, but it requires a passcode for the next unlock.
It's close enough, because (most of) the encryption keys are wiped from memory every time the device is locked, and this action makes the secure enclave require PIN authentication to release them again.
> It's close enough
Not really, because tools like Cellebrite are more limited with BFU, hence the manual informing LEO to keep (locked) devices charged, and the countermeasure of iOS forcefully rebooting devices that have been locked for too long.
There is a way now to force BFU from a phone that is turned on, I can't remember the sequence
Eh? BFU ("before first unlock") is, by definition, the state that a phone is in when it is turned on. There's no need to "force" it.
If you mean forcing an iOS device out of BFU, that's impossible. The device's storage is encrypted using a key derived from the user's passcode. That key is only available once the user has unlocked the device once, using their passcode.
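As a loose analogy (not Apple's actual construction), data sealed under a passcode-derived key is just noise until the passcode is presented again:
echo secret | openssl enc -aes-256-ctr -pbkdf2 -pass pass:123456 -out blob.bin # "BFU": only ciphertext exists at rest
openssl enc -d -aes-256-ctr -pbkdf2 -pass pass:123456 -in blob.bin             # "first unlock": the passcode re-derives the key; a wrong passcode yields garbage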
Alternately, hold the power button and either volume button together for a few seconds.
This is the third person advocating button squeezing, so as a reminder: if a gun is on you, the jig is up; you can be shot for resisting or reaching for a potential weapon. Wireless detonators do exist, don't f around please.
In case anyone is wondering: in newer versions of macOS, the user must log out to require a password. Locking the screen no longer requires a password if Touch ID is enabled.
Is that actually true? I'm fairly confident my work Mac requires a password if it's idle more than a few days (typically over the weekend).
Shift+Option+Command+Q is your fastest route there, but unsaved work will block.
Settings -> Lock Screen -> "Require password after screen saver begins or display is turned off"
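On older macOS versions the same setting was scriptable; newer versions may ignore these keys, so treat this as a sketch:
defaults write com.apple.screensaver askForPassword -int 1      # require a password after the screen saver
defaults write com.apple.screensaver askForPasswordDelay -int 0 # ...immediately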
I just searched the case. I'm appalled. It looks like the USA doesn't have legal protection for reporters' sources. Or rather, Biden created some, but it was revoked by the current administration.
The real news here isn't privacy controls in a consumer OS or the right to privacy, but the USA, the leader of the free world, becoming an autocracy.
As if the government is not above breaking the law and using rubber-hose decryption. The current administration's justice department has been caught lying left and right.
Plausible deniability still works
I find it so frustrating that Lockdown Mode is so all-or-nothing.
I want some of the lockdown stuff (No facetime and message attachments from strangers, no link previews, no device connections), but like half of the other ones I don't want.
Why can't I just toggle an iMessage setting for "no link preview, no attachments", or a general setting for "no automatic device connection to untrusted computers while locked"? Why can't I turn off "random dick pics from strangers on iMessage" without also turning off my browser's javascript JIT and a bunch of other random crap?
Sure, leave the "Lockdown mode" toggle so people who just want "give me all the security" can get it, but split out individual options too.
Just to go through the features I don't want:
* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more
* Shared photo albums - I'm okay viewing shared photo albums from friends, but lockdown mode prevents you from even viewing them
* Configuration profiles - I need this to install custom fonts
Apple's refusal to split out more granular options here hurts my security.
I'm with you on the shared photo albums. I'd been using lockdown mode for quite a while before I discovered this limitation, though. For me, this is one I'd like to be able to selectively enable (like the per-website/app settings). In my case, it was a one-off need, so I disabled lockdown mode, shared photos, then enabled it again.
The other feature I miss is screen time requests. This one is kinda weird - I'm sure there's a reason they're blocked, but it's a message from Apple (or directly from a trusted family member? I'm not 100% sure how they work). I still _receive_ the notification, but it's not actionable.
While I share your frustration, though, I do understand why Apple might want to have it as "all-or-nothing". If they allow users to enable even one "dangerous" setting, that ultimately compromises the entire security model. An attacker doesn't care which way they can compromise your device. If there's _one_ way in, that's all they need.
Ultimately, for me the biggest PITA with lockdown mode is not knowing if it's to blame for a problem I'm having. I couldn't tell you how many times I've disabled and re-enabled it just to test something that should work, or whether it's the reason a feature/setting is not showing up. To be fair, most of the time it's not the issue, but sometimes I just need to rule it out.
The profiles language may be confusing -- what you can't do is change them while in Lockdown mode.
Family albums work with lockdown mode. You can also disable web restrictions per app and website.
>* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more
This feature has the benefit of teaching users (correctly) that browsing the internet on a phone has always been a terrible idea.
I'll bite. Why is it so terrible? I'm browsing this site right now on my phone and don't see the horror.
Phone networks by design track you more precisely than is possible over a conventional internet connection, to facilitate automatic connection to the nearest available tower. For similar reasons, the phone network has to know that the phone is yours.
You don't need to connect to the internet for that. It has nothing to do with web browsing at all.
I think that ship has sailed.
Is there an implication here that they could get into an iPhone with lower security settings enabled? There's Advanced Data Protection, which E2EEs more of your data in iCloud. There's the FaceID unlock state, which US law enforcement can compel you to unlock; but penta-click the power button and you go into PIN unlock state, which they cannot compel you to unlock.
My understanding of Lockdown Mode was that it babyifies the device to reduce the attack surface against unknown zero-days. Does the government saying that Lockdown Mode barred them from entering imply that they've got an unknown zero-day that would work in the PIN-unlock state, but not Lockdown Mode?
It's relatively well known that the NSO Group's Pegasus is what governments use to access locked phones.
This was known in the past, but if it's relying on zero-days that Apple & Google are adversarially attempting to keep up with and patch, my assumption would not be that Pegasus is, at any time, always able to breach a fully updated iPhone. Rather, it's a situation where maybe there are periods of a few months at a time where they have a working exploit, until Apple discovers and patches it, repeat indefinitely.
How does Apple discover their exploits? I'm sure they keep some around for extremely high value targets.
Yes
Sadly, they still got to her Signal on her desktop, so her sources might still be compromised. It's inherent to desktop applications, but I'm sad that a lot more people don't know that Signal for Desktop is much, much less secure against adversaries with your laptop.
> I'm sad that a lot more people don't know that Signal for Desktop is much, much less secure against adversaries with your laptop
Educate us. What makes it less secure?
In addition to what the other person who replied said, and ignoring that iOS/Android/iPadOS is far more secure than macOS, laptops have significantly fewer hardware-based protections than Pixel/Samsung/Apple mobile devices do. So really the only way a laptop in this situation would be truly secure from LEO is if it's fully powered off when it's seized.
My assumption is that in the desktop version the key is not always stored in the secure enclave (it definitely supports plaintext storage). Theoretically this makes it possible to extract the key for the message database; a different malicious program can also read it. But this is moot anyway if the FBI can browse through the chats. This isn't what failed here.
Also, last time I looked (less than a year ago), files sent over Signal are stored in the clear, just with obfuscated filenames. So even without access to Signal it's easy to see what message attachments a person has received, and to copy any interesting ones.
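This is easy to check on your own machine. Historically the Signal Desktop database key sat in a plain JSON config (newer builds encrypt it with the OS keystore, and paths may have changed):
cat ~/Library/Application\ Support/Signal/config.json # macOS
cat ~/.config/Signal/config.json                      # Linux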
If people don't have Signal set to delete sensitive messages quickly, then they may as well just be texting.
That's a strong statement. Also imho it's important that we use Signal for normal stuff like discussing where to get coffee tomorrow - no need for disappearing messages there.
Strong and accurate. Considering non-disappearing messages the same as texts is not the same thing as saying all Signal messages ought to be disappearing or else the app is useless.
Telegram allows you to have distinct disappearing settings for each chat/group. Not sure how it works on Signal, but a solution like this could be possible.
I'm weird; I even have disappearing messages for my coffee chats. It's kind of refreshing not having any history.
I'm an inbox-zero person... I set even my personal notes to disappear after 2 days. For conversations, 1 day.
Not if you're using Signal for life-and-death secure messaging; in that scenario it's table stakes.
I would have thought reporters with confidential sources at that level would already exercise basic security hygiene. Hopefully, this incident is a wake up call for the rest.
Yeah, I also want to question the conclusions in the article. Was the issue that they couldn't unlock the iPhone, or that they had no reason to pursue the thread? To my understanding, the Apple ecosystem means that everything is synced together. If they already got into her laptop, wouldn't all of the iMessages, call history, and iCloud material already be synced there? What would be the gain of going after the phone, other than to make the case slightly more watertight?
Did she have BitLocker or FileVault or other disk encryption that was breached? (Or did they take the system while booted, as TLAs seek to do?)
There was a story here the other day: BitLocker keys stored in your Microsoft account will be handed over.
Which Windows does by default, and it makes it hard to turn off.
Depending on your jurisdiction, Face ID is safer than a fingerprint, because Face ID won't unlock while your eyes are closed.
In many European countries, forcing your finger onto a scanner would be permissible under certain circumstances; forcing your eyes open has so far been deemed unacceptable.
> Natanson said she does not use biometrics for her devices, but after investigators told her to try, "when she applied her index finger to the fingerprint reader, the laptop unlocked."
Curious.
Probably enabled it at some point and forgot. Perhaps even during setup when the computer was new.
My recollection is the computers do by default ask the user to set up biometrics
I want to say that is generous of her, but one thing that is weird: if I didn't want someone to get into my laptop and they tried to force me to use my fingerprint to unlock it, I definitely wouldn't use the finger I actually use to unlock it on the first try. Hopefully, Apple locks it out and forces a password if you use the wrong finger "accidentally" a couple of times.
Correct. That's why my Touch ID isn't configured to use the obvious finger.
Very much so, because the question is... did she set it up in the past?
How did it know the print even?
Why is this curious?
There appear to be relatively few possibilities.
* The reporter lied.
* The reporter forgot.
* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).
* The government hacked the computer such that it would unlock this way (probably impossible as well).
* The fingerprint security is much worse than years of evidence suggests.
Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.
My opinion is that she set it up, it didn't work at first, she didn't use it, forgot that it existed, and here we are.
> Apple devices share fingerprint matching details and another device had her details
I looked into this quite seriously for Windows ThinkPads; unless Apple does it differently, you cannot share fingerprints. They live in a local chip and never leave it.
So how does TouchID on an external keyboard work without having to re-set up fingerprints?
Presumably the fingerprint data is stored in the Mac's Secure Enclave, and the external keyboard is just a reader
The reporter lying or forgetting seems to be the clear answer, there's really no reason to believe it's not one of those. And the distinction between the two isn't really important from a technical perspective.
Fingerprint security being poor is also unlikely, because that would only apply if a different finger had been registered.
She has to have set it up before; there is no way to divine a fingerprint otherwise. I guess the only other possibility would be a faulty fingerprint sensor, but that should default to denying entry.
> faulty fingerprint sensor
The fingerprint sensor does not make access control decisions, so the fault would have to be somewhere else (e.g. the software code branch structure that decides what to do with the response from the secure enclave).
If you're interested in this in more detail, check this out:
https://blackwinghq.com/blog/posts/a-touch-of-pwn-part-i/
Could be a parallel construction type thing. They already have access but they need to document a legal action by which they could have acquired it so it doesn't get thrown out of court.
I think this is pretty unlikely here but it's within the realm of possibility.
Seems like it would be hard to fake. The way she tells it, she put her finger on the pad and the OS unlocked the account. Sounds very difficult to do.
I think they mean if they already had her fingerprint from somewhere else, and a secret backdoor into the laptop. Then they could log in, set up biometrics, and pretend they had first access when she unlocked it. All without revealing their backdoor.
"Lockdown Mode is a sometimes overlooked feature of Apple devices that broadly make[sic] them harder to hack."
Funny to see disabling "features" itself described as "feature"
Why not call it a "setting"
Most iPhone users do not change default settings. That's why Google pays Apple billions of dollars for a default setting that sends data about users to Google
"Lockdown Mode" is not a default setting
The phrase "sometimes overlooked" is an understatement. It's not a default setting and almost no one uses it
If it is true Lockdown Mode makes iPhones "harder to hack", as the journalist contends, then it is also true that Apple's default settings make iPhones "easier to hack"
The intention behind Lockdown Mode is protection for a select few groups of people, such as journalists, who are at risk of having software like Pegasus used against them. It's meant to reduce the attack surface. The average user wouldn't want most of it as a default setting, for example: almost no message attachments allowed, no FaceTime calls from people you haven't called, and Safari is kneecapped. Making this a default setting for most people is unrealistic and probably wouldn't help their cybersecurity anyway, as they wouldn't be targeted.
A "reduced attack surface" can also be a reduced surface for telemetry, data collection, surveillance and advertising services, thereby directly or indirectly causing a reduction in Apple revenues
Perhaps this could be a factor in why it's not a default setting
It seems unfortunate that enhanced protection against physically attached devices requires enabling a mode that is much broader, and sounds like it has a noticeable impact on device functionality.
I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown Mode" that Apple tells me means my "device won't function like it typically does".
There is a setting as of iOS 26 under "Privacy & Security > Wired Accessories" in which you can make data connections always prompt for access. Not that there haven't been bypasses for this before, but perhaps still of interest to you.
GrapheneOS does this by default - only power delivery when locked. Also it's a hardware block, not software. Seems to be completely immune to these USB exploit tools.
It also has various options to adjust the behaviour, from no blocks at all, to not even being able to charge the phone (or use the phone to charge something else) -- even when unlocked. Changing the mode of operation requires the device PIN, just as changing the device PIN does.
Note that it behaves subtly differently to how you described in case it was connected to something before being locked. In that case data access will remain -- even though the phone is now locked -- until the device is disconnected.
> I would totally enable an "enhanced protection for external accessories" mode.
Anyone has been able to do this for over a decade now, and it's fairly straightforward:
- 2014: https://www.zdziarski.com/blog/?p=2589
- recent: https://reincubate.com/support/how-to/pair-lock-supervise-ip...
This goes beyond the "wired accessories" toggle.
It isn't. Settings > Privacy & Security > Wired Accessories
Set to ask for new accessories or always ask.
I have to warn you, it does get annoying when you plug in your power-only cable and it still nags you with the question. But it does work as intended!
You might want to check that charger. I have the same option set to ask every time and it never appears for chargers.
> it has a noticeable impact on device functionality.
The lack of optional granularity on security settings is super frustrating because it leads to many users just opting out of any heightened security.
Computer security is generally inversely proportional to convenience. Best opsec is generally to have multiple devices.
> I never attach my iPhone to anything that's not a power source.
It's "attached" to the wifi and to the cell network. Pretty much the same thing.
Can a hacked phone (such as one that was not in Lockdown Mode at one point in time) persist in a hacked state?
Obviously, the theoretical answer is yes, given an advanced-enough exploit. But let's say Apple is unaware of a specific rootkit. If each OS update is a wave, is the installed exploit more like a rowboat or a frigate? Will it likely be defeated accidentally by minor OS changes, or is it likely to endure?
This answer is actionable. If exploits are rowboats, installing developer OS betas might be security-enhancing: the exploit might break before the exploiters have a chance to update it.
Forget OS updates. The biggest obstacle to exploit persistence: a good old hard system reboot.
Modern iOS has an incredibly tight chain-of-trust bootloader. If you shut your device down to a known-off state (using the hardware key sequence), then on power-on you can be 99.999% certain only Apple-signed code will run, all the way from secureROM to iOS userland. The exception is if the secureROM itself is somehow compromised, but exploiting it generally requires hardware access at boot time, so I don't buy a remote attack there.
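To make the chain-of-trust idea concrete, here's a toy Python sketch (nothing like the real secureROM: HMAC stands in for the asymmetric signatures an actual boot chain verifies, and the stage names are just labels):

    import hashlib, hmac

    ROOT_KEY = b"baked-into-silicon"  # stand-in for the ROM's immutable verification key

    def sign(key: bytes, blob: bytes) -> bytes:
        # HMAC stands in for the RSA/ECDSA signatures a real boot chain uses
        return hmac.new(key, blob, hashlib.sha256).digest()

    def boot(stages):
        for name, blob, sig in stages:
            if not hmac.compare_digest(sign(ROOT_KEY, blob), sig):
                raise SystemExit(f"refusing to boot: {name} failed verification")
            print(f"{name}: signature verified, executing")
            # a real chain also rotates keys: each verified stage checks the next

    stages = [(name, b"code-" + name.encode(), sign(ROOT_KEY, b"code-" + name.encode()))
              for name in ("iBoot", "kernel", "userland")]
    boot(stages)

    # tamper with the kernel image and the chain refuses to proceed
    stages[1] = ("kernel", b"code-kernel-IMPLANT", stages[1][2])
    boot(stages)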
So, on a fresh boot, you are almost definitely running authentic Apple code. The easiest path to a form of persistence is reusing whatever vector initially pwned you (malicious attachment, website, etc) and being clever in placing it somewhere iOS will attempt to read it again on boot (and so automatically get pwned again).
But honestly, exploiting modern iOS is already difficult enough (exploits go for tens of millions of dollars); persistence is an order of magnitude more difficult.
It's why I keep my old iPhone XR on 15.x for jailbreaking reasons. I purchased a new phone specifically for the later versions and online banking.
Apple bought out all the jailbreakers, as Denuvo did for the game crackers.
> Apple bought out all the jailbreakers, as Denuvo did for the game crackers
Do you have sources for these statements?
Like anything in that field, it's more NDAs and anecdote.
> in 2018, the prominent Denuvo cracker known as "Voksi" (of REVOLT) was arrested in Bulgaria following a criminal complaint from Denuvo.
https://www.dsogaming.com/news/denuvo-has-sued-revolts-found...
That's how you get off such charges: "I'll work for you if you drop the charges." There was a Reddit post I can't find, from when EMPRESS had one of her episodes, where she said she was asked if she wanted to work for them. It's happened in the cracking scene before.
> The jailbreaking community is fractured, with many of its former members having joined private security firms or Apple itself. The few people still doing it privately are able to hold out for big payouts for finding iPhone vulnerabilities. And users themselves have stopped demanding jailbreaks, because Apple simply took jailbreakers' best ideas and implemented them into iOS.
https://www.vice.com/en/article/iphone-jailbreak-life-death-...
And from the jailbreak community Discord.
Secure boot and verified system partition is supposed to help with that. It's for the same reason jailbreaks don't persist across reboots these days.
Re: reboots: TFA states that recent iPhones reboot every 3 days when inactive, for the same reasons. Of course, now that we know it's linked to inactivity, black hatters will know how to avoid it...
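For illustration, the policy amounts to roughly this sketch (the names here are made up; the real mechanism lives deep in iOS and the Secure Enclave):

    import time

    REBOOT_AFTER = 3 * 24 * 3600  # three days of inactivity, per TFA

    last_unlock = time.time()

    def on_unlock() -> None:
        # every successful unlock resets the clock -- exactly the signal an
        # attacker with code execution would want to spoof to dodge the reboot
        global last_unlock
        last_unlock = time.time()

    def watchdog_tick() -> None:
        if time.time() - last_unlock > REBOOT_AFTER:
            # after reboot, data keys drop back to the before-first-unlock state
            print("inactivity reboot triggered")

    on_unlock()
    watchdog_tick()  # nothing happens: well under three days since last unlock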
You should read up on iOS internals before commenting stuff like this. Your answer is wrong; rootkits have been dead on most OSes for years, but ESPECIALLY on iOS. Not every OS is like Linux, where security comes second.
Even a cursory glance, with even a basic understanding, would show it's practically impossible on iOS.
Can't they just use Pegasus or Cellebrite???
It's unlikely that Pegasus would work since Apple patched the exploit it used.
I think it's unclear whether Cellebrite can or cannot get around Lockdown Mode as it would depend very heavily on whether the technique(s)/exploit(s) Cellebrite uses are suitable for whatever bugs/vulnerabilities remain exposed in Lockdown Mode.
Given Cook's willing displays of fealty to Trump this time around I wouldn't be shocked if they were to remove lockdown mode in a future release.
We need a Lockdown mode for MacBooks as well!
Looks like it's a feature: https://support.apple.com/en-us/105120
To save a click:
* Lockdown Mode needs to be turned on separately for your iPhone, iPad, and Mac.
* When you turn on Lockdown Mode for your iPhone, it's automatically turned on for your paired Apple Watch.
* When you turn on Lockdown Mode for one of your devices, you get prompts to turn it on for your other supported Apple devices.
What is she being investigated for?
They're not actually investigating her; they're investigating a source that leaked classified materials to her.
If they're not investigating her she doesn't have any 5th-amendment protection and can be compelled to testify on anything relevant, including how to unlock her devices.
Did the individual store the classified material in the bathroom at his beach-side resort?
Don't be idiots. The FBI may say that whether or not they can get in:
1. If they can get in, now people - including high-value targets like journalists - will use bad security.
2. If the FBI (or another agency) has an unknown capability, the FBI must say they can't get in or reveal their capabilities to all adversaries, including to even higher-profile targets such as counter-intelligence targets. Saying nothing also risks revealing the capability.
3. Similarly if Apple helped them, Apple might insist that is not revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple and on creating backdoors, even if HN readers will see it the other way.)
Also, the target might think they are safe, which could be a tactical advantage. It also may exclude recovered data from rules of handling evidence, even if it's unusable in court. And at best they haven't got in yet - there may be an exploit to this OS version someday, and the FBI can try again then.
I would not recommend that one trust a secure enclave with full disk encryption (FDE). This is what you are doing when your password/PIN/fingerprint can't contain sufficient entropy to derive a secure encryption key.
The problem with low-entropy security measures arises because the low-entropy input is used to instruct the secure enclave (TEE) to release/use the actual high-entropy key. So the key must be stored physically (e.g. as voltage levels) somewhere in the device.
It's a similar story when the device is locked, on most computers the RAM isn't even encrypted so a locked computer is no major obstacle to an adversary. On devices where RAM is encrypted the encryption key is also stored somewhere - if only while the device is powered on.
RAM encryption doesn't prevent DMA attacks, and performing a DMA attack is quite trivial as long as the machine is running. Secure enclaves do prevent those, and they're a good solution; if implemented correctly, they have no downsides. I'm not referring to TPMs, due to their inherent flaws; I'm talking about SoC crypto engines like those found in Apple's M series or Intel's latest Panther Lake lineup. They prevent DMA attacks and side-channel vulnerabilities. True, I wouldn't trust any secure enclave never to be breached (that's an impossible promise to make, even though it would require a nation-state-level attack), but even this concern can be easily addressed by making the final encryption key depend on both software key derivation and the secret stored within the enclave.
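A minimal sketch of that combined derivation, in stdlib Python (illustrative only: the cost parameters are arbitrary, and a real enclave would perform the final mixing step internally rather than ever exposing its secret):

    import hashlib, hmac, os

    def derive_fde_key(passphrase: bytes, salt: bytes, enclave_secret: bytes) -> bytes:
        # slow KDF over the low-entropy passphrase (iteration count illustrative)
        pw_key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 600_000)
        # mix in the high-entropy device secret: an attacker now needs both
        # the passphrase AND the physical enclave to reconstruct the key
        return hmac.new(enclave_secret, pw_key, hashlib.sha256).digest()

    salt = os.urandom(16)            # stored in the clear on disk
    enclave_secret = os.urandom(32)  # in reality: fused into the SoC, non-exportable
    key = derive_fde_key(b"correct horse battery staple", salt, enclave_secret)
    print(key.hex())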
A little too late for the 1,000 people hacked by Pegasus.
Previously, direct link to the court doc:
FBI unable to extract data from iPhone 13 in Lockdown Mode in high profile case [pdf]
https://storage.courtlistener.com/recap/gov.uscourts.vaed.58...
(https://news.ycombinator.com/item?id=46843967)
It sounds like almost all of our devices default to security-by-annoyance. Where are the promises of E2E encryption and all the privacy measures? When I turned on Lockdown Mode on my iPhone, I got a few notifications showing that the random spam calls I receive were attempting a FaceTime exploit. Why do we have to wait for someone to prove ICE can't get into our devices?
I guess they got a 404
I trust 404 Media more than most sources, but I can't help but reflexively read every story prominently showcasing the FBI's supposed surveillance gaps as an attempted watering hole attack. The NSA almost certainly has hardware backdoors in Apple silicon, as disclosed a couple of years ago by the excellent researchers at Kaspersky. That being the case, Lockdown Mode is not even in play.
The NSA is not going to tip its hand about any backdoors it had built into the hardware for something as small as this.
It depends on whether parallel construction can be used to provide deniability.
Even a parallel construction has limited uses, since you can't use the same excuse every time. The NSA probably doesn't trust the FBI to come up with something plausible.
Samsung phones have the Secure Folder which can have a different, more secure password and be encrypted when the phone is on.
Secure Folder uses, or is in the process of switching to, Android's native Private Space feature, which is available on all Android 15 phones.
I use the Cryptomator app for this, it works as advertised. I keep ~60 GiB of personal files in there that would be an easy button to steal my identity and savings. I'm just hoping it doesn't include an NSA back door.
The NSA definitely has easier ways to steal your identity and savings if they wanted to anyways
You can check the GitHub: https://github.com/cryptomator/ios
Even if I had the skills to confirm the code is secure, how could I know that this is the code running on my phone, without also having the skills to build and deploy it from source?
Also, you need to make sure that the installation process does not insert a backdoor into the code you built from source.
Or the compilation process!
https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_Ref...
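For anyone who hasn't read it, a toy sketch of Thompson's trick (naive string matching on Python source stands in for real code generation, and the master password is made up):

    def evil_compile(source: str) -> str:
        compiled = source  # pretend compilation is the identity on source text
        if "def check_password" in source:
            # payload: accept a hard-coded master password
            compiled = compiled.replace(
                'return entered == stored',
                'return entered == stored or entered == "hunter2"')
        if "def compile(source):" in source:
            # self-propagation: a freshly built compiler gets the same trojan
            # grafted in, so auditing the compiler's *source* finds nothing
            compiled = compiled.replace(
                "def compile(source):",
                "def compile(source):\n    source = evil_compile(source)")
        return compiled

    login_src = 'def check_password(entered, stored):\n    return entered == stored\n'
    print(evil_compile(login_src))  # clean source in, backdoored "binary" out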
For now! They'll get something from the open market, like the last time, when Apple refused to decrypt (or unlock?) a phone for them.
Yeah, this is low-stakes stuff; Pegasus has historically broken into Apple phones easily. Bezos's nudes and Khashoggi know. (Not really, Khashoggi is dead.)
They just need to ask Apple to unlock it. And Apple can't really refuse under US law.
They can refuse, and they have refused. See San Bernardino and the concept of "compelled work".
That was the old US law, not the one where Tim Cook delivered gold bars to Trump
Every time something like this happens I assume it is a covert marketing campaign.
If the government wants to get in, they're going to get in. They can also hold you in contempt until you unlock it.
Don't get me wrong, it's a good thing that law enforcement can't easily access this on their own. It just feels like the government is working with Apple here to help move some phones.
Better to be held in contempt than to give up constitutional rights under pressure. Most functioning democracies have and defend the right to a free press, protect said press's sources, and can't make you incriminate yourself.
Anyway, it's a good thing to be skeptical about claims that iPhones can't be hacked by government agencies, as long as it doesn't drive you to dodgier parties (as those are guaranteed honeypots).
"Government propaganda to help one of the richest companies in the history of the world sell 0.000000001% more phones this quarter" is quite frankly just idiotic.
You only said half the sentence anyway. The full sentence is: "If the government wants to get in they're going to get in, unless they want to utilize the courts in any way, in which case they have to do things the right way."
If this reporter was a terrorist in Yemen they would have just hacked her phone and/or blown up her apartment. Or even if they simply wanted to knock off her source they probably could have hacked it or gotten the information in some other illicit fashion. But that's not what is happening here.