
It's the neighborly thing to do, but people are under no obligation to report vulns privately. The blame lies squarely on Apple, not on the messenger.

The fact that we know about it means we can take steps to mitigate the damage.



The blame lies squarely on Apple, not on the messenger.

There is blame on both.

If you leave your key in your front door lock and I blast out on twitter your address and tell people about it, I think I have some responsibility.


If you leave keys in other people's doors all over the neighbourhood, I damn well have a right, and possibly an obligation, to make it publicly known that such a thing is taking place. So that everyone may take their own precautions.


Let's say keys were hidden around the neighborhood. Would you rather everyone in the whole town know about it or quietly and quickly go pick up all the keys before someone notices and breaks into one of the houses?

Personally I think if you report through the proper channels and nothing is changed THEN broadcast, but not as an opener.


These keys aren't exactly hidden. This is like trying the doorknob a second time.


In determining whether keys-in-locks or keys-under-doormats is the closer analogy, I have to go with the doormats. Various people go door-to-door... delivery people, campaign volunteers, Jehovah's Witnesses, etc. and a key in the lock would be hard to miss. A key under a doormat is easy to miss, being obscure. Sure -- the doormat is one of the first places you look if you're actually trying to break in, but people whose nefarious side doesn't manifest until the opportunity is obvious are indeed thwarted by that obscurity.


I am going to ask: would you want to try this scenario in your own life? Because often we make general statements that we don't actually practice when the issue would hurt us personally.


We all live with this scenario everyday, most consumer locks are ridiculously easy to open. https://en.wikipedia.org/wiki/Lock_bumping


But do YOU want your good neighbor to report to you that your key has been left on the porch before someone takes it and gets into the house while you are away from home on Thanksgiving night celebrating a nice holiday vacation? I bet you do. The fault is on you for dropping the key, isn’t it? Or you were so bluntly careless as to leave a backup key under the mat on your porch, which slid away so the key is now revealed.

The fact that locks are easy to pick doesn’t mean people don’t take more measures to defend their homes. More and more people install steel doors as an extra layer, and home security surveillance systems, all of which can be compromised with the right tools.

Responsible disclosure is something I respect. While he has the right not to disclose this privately first, has he tried? How hard is it to ask someone to get in contact with the Apple security team? There are a bunch of top security researchers constantly tweeting on Twitter who could help escalate this. I think someone on Google's Project Zero team did this to escalate.


A better analogy would be "if the lending bank left the door to your new house open..."

Other than buy an Apple product, the users did nothing intentional to undermine security.

Since this is a subjective argument, based more on historical instances of "responsible disclosure" than on law, I'm gonna lean, in this case, toward it being Apple that failed.

They built the entire "walled garden" without getting outside help. They want the control, they have billions of dollars, can hire whatever talent...

Failed to spot a password-less root login issue.

People need to know today to be even more cautious about using Apple gear in public places or around plain ol' tech jerks that like to fuck with people for a gag.

Society has no legal or moral obligation to make sure Apple stays in business.


Exactly.

Responsible disclosure is an interesting concept. How does this kind of disclosure make sure that the public knows about a company's track record of vulnerabilities, if everyone is under NDA and the company has no obligation to ever publicize it?

Now, if the researcher could give a grace period, that's cool, but there MUST be a deadline by which stuff goes public. Hopefully the company fixes it and issues a postmortem first. If not - too bad!


The problem with that analogy is that the probability that the "bad guys" already know about this vulnerability is vastly higher than the probability that thieves know about how well some random house in the neighborhood is secured.


But do they? And what portion of them do? And are they using it? There's a lot of speculation here. But surely the average person doesn't know and with this being public knowledge, AND easy to execute there is a bigger chance for crime of opportunity.


It’s always reasonable to assume that black-hats (and… what do you call government hackers — black-suits, helicopter-hats, ???) know everything that white-hats know, and that they either have or are already in the process of selling that exploit to less skilled criminals.

It’s not like being good morally correlates with being good at security.


But that's not what I'm saying. I'm saying that since this is so easy, a person that is computer illiterate can now gain root access. You definitely don't post those kinds of things on Twitter.


Computer-illiterate people might now have a new way to shoot themselves in the foot, but they won’t be able to exploit it because they won’t know what root is or what it does.


How many more people now know about this vulnerability because of this knucklehead tweeting it? At least 100k impressions? Now think of how many more "bad guys" have access to this hack who are going to abuse it.


And how many people and companies are now empowered to fix this issue for themselves, immediately.


Not immediately.


You may say so, but really the level of incompetence of not setting a password for a root account is pretty high. The fact that someone reported it in a way you don't agree with shouldn't distract you from the fact that this highlights a serious oversight.

The main question that should be asked is, how did this get overlooked? How is it that your average website has better password security than the OS of one of the richest tech companies in the world?

To be fair to Apple, Microsoft had similar issues back in the 1990s. Perhaps it takes a string of security blunders for some tech companies to take security seriously.


If you sell locks and those locks can be opened by pulling on them twice, the reasonable course of action is to make that fact known to every buyer ASAP, not to tell you privately and wait for you to maybe issue a recall.


Locks don't nag you to decommission them quite as aggressively as OS X asks you to patch it. And an OS update was going to happen anyway, so including this patch doesn't really burden the user with an extra task they wouldn't already be subject to. Therefore, coordinated disclosure has a lot of value in the OS update ecosystem and very little in the physical lock ecosystem.


Wrong. This is Apple -- not the homeowners -- leaving everyone's key in everyone's door without them knowing.


Responsible Disclosure is widely regarded as a good practice in these situations. Blame isn't the key issue - fixing the problem quickly and safely is. Widespread disclosure before Apple has even had a chance to respond in a timely fashion is inherently unsafe.

You would hope the self-described twitter bio "Agile Software Craftsman" might have thought about this a little before tweeting.

> https://en.wikipedia.org/wiki/Responsible_disclosure


The poster practiced Full Disclosure, which is also a valid disclosure policy.

Since we're just making up statements, I guarantee that Apple would never voluntarily disclose this issue if it was reported privately. So Full Disclosure is the only way to put Apple's feet to the fire, as it's the only way in which this issue would have had any visibility whatsoever.

https://en.wikipedia.org/wiki/Full_disclosure_(computer_secu...


Private Disclosure does not prevent you from publicly disclosing it after a grace period.


What grace period is appropriate for a 0 day root login vulnerability?

This guy is, with all probability, not the first one to have found it.


If there was a vulnerability that allowed anyone to open your car and drive off with it, you wouldn't care if he was the first to find it or not. You'd only care about it getting fixed before anyone else knows about it.

I'm not sure what length grace period is appropriate, though.


If there was a vulnerability that allowed anyone to open my car, I'd want to know ASAP, because I wouldn't trust the manufacturer to provide a remedy quickly enough to eliminate my risk.

Same applies for Apple. No reason to believe this guy was the first one to find this exploit, we only know he was the first one to publicize it.


> I wouldn't trust the manufacturer to provide a remedy quickly enough that eliminates my risk.

True, and this is where the analogy breaks down, since they would not be able to remotely send over a fix. But Apple would, and apparently now has.


To be pedantic, this isn't a 0day root login vuln, but a priv escalation vuln.


On what basis do you "guarantee" this?


> On what basis do you "guarantee" this?

I think on the (probably intentionally snarky) basis of "just making up statements":

> Since we're just making up statements, I guarantee that Apple would never voluntarily disclose this issue if it was reported privately.


It's used as an idiom, meaning roughly "I have total confidence".


"Responsible Disclosure" is a term rejected by the industry as a loaded phrase that favors vendors; the preferred term is Coordinated Disclosure. Even so, reasonable professionals still disagree about whether this is the best option, in a debate that has existed for decades, so it's by no means settled as the "proper" way.


Do you have any good history on this you can link to? Sounds very interesting.


This situation seems more like a lock manufacturer selling millions of locks that can be opened with a toothpick.

I'd lay responsibility at the lockmaker's door, not the guy who told everybody they were at the mercy of anyone with a toothpick.


>If you leave your key in your front door lock and I blast out on twitter your address and tell people about it, I think I have some responsibility.

That's not a faithful analogy. Apple isn't your neighbour. They are the landlord. The scenario is more like that the landlord uses bogus locks in your complex, and you post it on twitter. You could complain to them privately too, but given your past experiences perhaps, you thought that twitter would be a more effective medium.


This is different though: the bug is so bad that random, inexpert users can discover it by accident. People that are not going to even be familiar with the term "responsible disclosure" at all. This may have been the case for the guy who tweeted this.

There is no realistic way to keep a lid on something like that and so in this case the blame is entirely on Apple.


Your analogy makes no sense. He's just as vulnerable as you are.


This very excellent comment lies "dead", so I'll repost it:

asejfwe8823 24 minutes ago [dead] [-]

A better analogy would be "if the lending bank left the door to your new house open..."

Other than buy an Apple product, the users did nothing intentional to undermine security.

Since this is a subjective argument, based more on historical instances of "responsible disclosure" and not law, I'm gonna lean in this case of it being Apple that failed

They built the entire "walled garden" without getting outside help. They want the control, they have billions of dollars, can hire whatever talent...

Failed to spot a password-less root login issue.

People need to know today to be even more cautious about using Apple gear in public places or around plain ol' tech jerks that like to fuck with people for a gag.

Society has no legal or moral obligation to make sure Apple stays in business.


Mostly, they're just missing out on an up to $200,000 bug bounty.

https://www.theregister.co.uk/2016/08/05/apple_joins_the_bug...


"Invite only", "provides a full report and a proof of concept that is accepted by Apple engineers"


what the f is the point of this being invite only... i will never understand apple.


... And means that others can utilize this to cause damage.

The idea of responsible disclosure is to minimize harm for you, the user. Not to minimize bad publicity.


In a case like this, I think it would be best to maximize the bad publicity. Bad publicity is the minimum Apple deserves for a bug like this. In my ideal world they'd get a lot of bad publicity, and a significant financial penalty.


I get it, I really do, but it's not like he was complaining about a bad Uber driver. Disclosure in this way has real-world impacts up to and including harming people and we shouldn't ever consider it as something which is remotely acceptable. Is it acceptable to publicly disclose that an airport has a self-destruct switch which can be accessed near the NW mens bathroom? No. You contact someone who can fix the problem, then publicly disclose.


It's as remotely acceptable as "root" with no password, apparently.

The question is large and complicated, and people can agree to disagree. There's nothing wrong with tweeting vulns: The company is at fault, we can defend ourselves now that we know about the vuln, and it's a big PR disaster for Apple.

A past conversation: https://news.ycombinator.com/item?id=14009937

No, no it's not strictly more ethical. It's not even strictly safer, which should be an even easier question to answer. The baked-in assumption in your logic is that users have no options other than waiting to patch. But, obviously, they do, and keeping vulnerabilities secret deprives them of those options.


> The question is large and complicated, and people can agree to disagree

> There's nothing wrong with tweeting vulns

Those two statements seem contradictory. It seems to me there is still quite a lot of debate about which disclosure policy is best.


But everyone can fix this problem by setting a root password. So telling everyone is the right call. Otherwise people would be sitting vulnerable while Apple comes up with a patch.
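For reference, the widely circulated stopgap was exactly that: give root a non-empty password. A minimal sketch, assuming an admin account on the affected Mac (`dsenableroot` is macOS's built-in tool for managing the root account; reports at the time suggested merely disabling root without setting a password was not enough):

```shell
# Give the root account a real password so the blank-password
# login trick stops working (run from an admin account):
sudo passwd root

# Or enable root and set its password in one step; this prompts
# for your admin credentials and a new root password:
dsenableroot

# If you don't use root day-to-day, disable root login afterwards;
# the password set above still sticks:
dsenableroot -d
```

These are privileged one-off admin commands, so treat them as a sketch of the mitigation rather than a script to run blindly.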


But a tweet isn't really the most effective way to tell everyone. Technical people, including those who would use this vulnerability for malice, will find out far far sooner than my grandmother.

It seems to me the right thing to do is to tell Apple privately, tell them to either push a fix or put out some kind of release letting all their customers know how to mitigate this in the next, say, 3 days, or I'll just tweet about it. What's the downside? At the worst case, you just prolonged the status quo for another 3 days.


It's not the most effective, but that doesn't make it bad, or malicious.


I agree this person isn't malicious, certainly. But I do think his decision was bad. Not "bad" in the moral sense, but "bad" in the sense of being sub-optimal.


Was his tweeting strategy more or less "sub-optimal" than Apple's security QA? I think we're focusing on the wrong part of this story.


    > people are under no obligation to
    > report vulns privately
Legal obligation, no, you're right. Moral obligation? Why not?



