Thursday, February 25, 2016

Why Apple vs. FBI is No Contest, and Why Tim Cook is Right about the Danger of Leaving Vulnerabilities Open

On the air today, Lars Larson asked me whether, if I were the CEO of Apple, I would order my engineers to unlock a would-be terrorist's phone if I were told that stopping something like 9/11 was on the line.

I said no, but I have been thinking about it. It's a very tough question. Of course I would want to stop any such horrific attack.

Also, I am pro-law-enforcement, and I support the police. I have many relatives who served in the military, and I work with ex-Marines every day at the software startup where I work. We talk about all these issues. We know the FBI is just trying to do its job, and that it's for our protection.

However, it's impossible to explain the cybersecurity landscape within the confines of a radio call-in, as I had hoped to do.

So I am writing this blog.

Imagine that Apple complies and creates malware capable of hacking the dead terrorist's phone (and the twelve other phones the FBI wants hacked, and however many more they will keep asking Apple to hack).

To comply with the court order by creating this malware, Apple's engineers must first identify a vulnerability in their hardware and software, both of which are specifically designed not to have such vulnerabilities. Then Apple must write malware that depends entirely on the existence of that vulnerability, and keep that malware on hand indefinitely for the FBI.

Even if Apple never releases this malware, never tells anyone about the vulnerability, and never reveals how they exploited it, Apple must still leave the vulnerability open so that the hacking tool they made for the FBI will continue to work.
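
To make it concrete why the protections being bypassed matter, here is a minimal back-of-the-envelope sketch in Python. The numbers are assumptions for illustration only (roughly 80 ms of key-derivation work per guess, a figure in the ballpark of what Apple has described publicly), not measurements of any real device. The point: once the escalating retry delays and the ten-attempt auto-erase are out of the way, a short numeric passcode falls to brute force almost immediately.

```python
# Back-of-the-envelope sketch: brute-force time for a numeric passcode
# once retry delays and the ten-attempt auto-erase are bypassed.
# The 80 ms per guess is an ASSUMED key-derivation cost, for illustration.

SECONDS_PER_GUESS = 0.08  # assumed cost of one passcode attempt

for digits in (4, 6):
    guesses = 10 ** digits                    # every possible passcode
    worst_case_hours = guesses * SECONDS_PER_GUESS / 3600
    print(f"{digits}-digit passcode: {guesses:,} guesses, "
          f"worst case ~{worst_case_hours:.1f} hours")

# Output:
# 4-digit passcode: 10,000 guesses, worst case ~0.2 hours
# 6-digit passcode: 1,000,000 guesses, worst case ~22.2 hours
```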

Now, let me put that into the context of today's cybersecurity landscape.


Apple, Google, Microsoft, and other major tech companies are constantly searching for security holes in their own and each other's products, then quickly and quietly patching them. As soon as a flaw is found by "white-hat" security researchers (i.e., good-guy hackers), it gets patched as quickly as possible. Quite often these flaws are published online in the CVE database; for example, there is a current CVE list of known iOS vulnerabilities and a current CVE list of known Android vulnerabilities.
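
If you want to poke at that data yourself, here is a small sketch of one way to do it programmatically. It assumes NVD's public JSON REST API (services.nvd.nist.gov) as a front end to the CVE database, plus the third-party requests package; the parameter and field names shown are from that API, and error handling is kept minimal.

```python
# Sketch: list a few iOS-related entries from the CVE database via the
# NVD JSON REST API (an assumed front end; see nvd.nist.gov for docs).
import requests

URL = "https://services.nvd.nist.gov/rest/json/cves/2.0"
resp = requests.get(URL, params={"keywordSearch": "iOS", "resultsPerPage": 5})
resp.raise_for_status()

for item in resp.json().get("vulnerabilities", []):
    cve = item["cve"]
    descs = cve.get("descriptions", [])
    summary = descs[0]["value"] if descs else "(no description)"
    print(cve["id"], "-", summary[:100])
```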

More often than not, Apple is not the party that discovers the newest iOS vulnerability. (If Apple had known about it, it would not have made it into iOS in the first place!) So as soon as these vulnerabilities surface, Apple (and Google, and Microsoft) race to patch them, because they are now in a race with cybercriminals looking for ways to exploit them.

All too often, these loopholes get exploited by cybercriminals and hackers before anyone patches them. If the cybercriminals win the race, then hundreds of millions of people's payment information, medical records, and extremely private personal information are put at risk. In other words, people's identities and lives are put at risk.

Everyone believes that Apple is technically capable of making the kind of malware weapon the FBI and the court are demanding. However, Tim Cook has called it "dangerous," and not just for marketing reasons: the FBI is proposing that Apple create a tool that relies on the same kind of vulnerability that cybercriminals (or security researchers at another company) will eventually discover and publish or exploit. Hackers could then create their own version of this malware, probably without Apple's knowledge.

Because Apple would have no way to know when, or if, cybercriminals had found the same loophole Apple used to build the FBI's hacking tool, by the time Apple found out it would probably be far too late. Not only would people's devices get hacked; Apple would also be responsible, because it would have known about the exploit well before it was taken advantage of. Ironically, in that inevitable situation, the FBI itself would be the party enabling cybercriminals to hack everyone's phones.

The Nature of the Risk


You should also realize that such malware, if it fell into the wrong hands (whether by being leaked from within Apple, or because Apple kept the exploit open for the sake of law enforcement) could easily be used against the United States by its enemies, and used more effectively than the FBI could probably use it to stop them.

To use such an exploit, the FBI needs to have both the device and its owner in custody; otherwise the owner can remotely wipe the device. The FBI must then take the time to get a warrant, send the device to Apple, and so on.

However, if cybercriminals have the exploit (unbeknownst to Apple or the FBI), they can use it the moment they get hold of a device: before the owner knows it's missing, and possibly without the device ever going missing. And the owner won't be a criminal who (if they were smart) kept nothing sensitive on it; the owner will be an American citizen who, believing the device secure, kept all sorts of things on it that would be damaging if they fell into the wrong hands.

Suppose a nuclear plant worker plugs his phone into a compromised charging terminal at an airport. Cybercriminals use this backdoor to plant spyware on his phone, then use the phone to get inside the secure systems at the nuclear power station when he connects to its WiFi. Such a technique could also be used to get into the WiFi network of an airplane, which is one theory behind what happened to the Malaysia Airlines flight that went missing.

Terrorists could also use cybercrime to steal the identities of people traveling abroad, then infiltrate the country by using the medical information and payment methods on those people's phones to impersonate them completely.

These are, sadly, very real types of threats. If there is a way to create this hack, you can be sure that cybercriminals and hackers will, sooner rather than later, create it themselves, whether or not Apple creates this tool.

The only line of defense we have against cybercriminals and hackers in this world is the sort of encryption the iPhone has. On the other hand, as Lars pointed out, that encryption can also be used by terrorists to keep certain secrets.
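
To show what that line of defense amounts to in code, here is a minimal sketch of passcode-derived encryption using Python's third-party cryptography package. This is a simplification, not Apple's actual scheme; a real iPhone entangles the passcode with a key fused into the hardware, which no pure-software example can reproduce. But the core idea is the same: without the passcode, the stored bytes are indistinguishable from noise.

```python
# Simplified sketch of passcode-derived encryption (NOT Apple's actual
# scheme): stretch a passcode into a key, then encrypt data with it.
# Requires the third-party 'cryptography' package.
import base64, os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_passcode(passcode: bytes, salt: bytes) -> bytes:
    # The high iteration count makes each brute-force guess expensive.
    kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt,
                     iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode))

salt = os.urandom(16)
token = Fernet(key_from_passcode(b"123456", salt)).encrypt(
    b"medical records, payment info, photos...")

# Only the correct passcode recovers the data; anything else fails.
print(Fernet(key_from_passcode(b"123456", salt)).decrypt(token))
```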

The Surveillance State Is Not The Real Problem


On the air with Lars, I said this issue is about freedom and privacy versus living in a surveillance state. But while I was upset to learn of the NSA spying on American citizens without warrants, I feel the real threat is not our government spying on us; it is the cybercriminals and enemies of the United States who, every day, are already using against us the very type of malware the FBI wants Apple to create and to leave a vulnerability open for.

Lars makes a good point, which is that a threat like 9/11 is also very real. However, it seems to me that 9/11 happened only because the FBI and CIA failed to do their jobs properly. It had nothing at all to do with a lack of access to information, and everything to do with the FBI and CIA being dysfunctional organizations, much as the Challenger disaster was a product of NASA's dysfunction. Not listening to subordinates, and failing to communicate between agencies, let the hijackers fall through the cracks of the very agencies that were supposed to be protecting us.

Of course, the CIA and FBI have made many changes in response to that dysfunction since then, and the Department of Homeland Security was created in response to it as well. These interagency communication improvements are good, but we should not have needed to lose two space shuttles and two WTC towers to get people to wake up and do their damn jobs right.

So... as someone who watched the towers fall and the Challenger blow up, I have a very hard time siding with the FBI when I know that if they had just done their jobs right, this would never have been a problem. However, I also think we should support the FBI and NASA more now than ever, because the missions of both agencies transcend their past mistakes. Nobody is perfect, least of all me, and if we can't persevere, then how can we take pride in the USA?

So I commend the FBI for being worried about stopping terror attacks, but they need to let Apple worry about stopping the legions of cybercriminals and hackers who are working day and night to steal the entire lives that hundreds of millions of people keep on their phones, including medical records, payment information, private information related to their jobs, and more.

If Apple isn't allowed to do its job in the war against cybercrime, then there will be a lot more dead terrorists with phones the FBI needs Apple to hack yet again, because criminals will simply find other forms of encryption to lock up their sensitive data. Regular American citizens, however, will be the ones put at risk.

Cybercrime Is A Greater Threat Than Terrorism


I believe the threat of terrorists using encryption to keep secrets is a much smaller threat, overall, than cybercrime, because of the immense scale of cybercrime and the fact that it can be done to anyone, at any time, anywhere in the world. You have not seen the server logs that I've seen: a server can't be online for more than fifteen minutes before the hack attempts start. It's crazy out there.
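
If you have never watched those logs, here is a small sketch of the kind of thing I mean. It assumes a Linux-style auth log at /var/log/auth.log containing standard sshd "Failed password" lines; the path and log format are assumptions and vary from system to system.

```python
# Sketch: tally failed SSH login attempts by source IP from an auth log.
# Assumes a Linux-style /var/log/auth.log; path and format vary by system.
import re
from collections import Counter

pattern = re.compile(r"Failed password for .* from (\d+\.\d+\.\d+\.\d+)")
attempts = Counter()

with open("/var/log/auth.log") as log:
    for line in log:
        match = pattern.search(line)
        if match:
            attempts[match.group(1)] += 1

for ip, count in attempts.most_common(10):
    print(f"{ip}: {count} failed attempts")
```

Run that on any internet-facing box that has been up for a day, and the counts speak for themselves.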

You might think that Apple could create this malware and keep it safe and secret. By doing so, however, they would be taking on an incredibly large liability. Since that could do irreversible harm to their business, Tim Cook is, technically speaking, legally required not to take that action. And the reason it would harm their business is that it would harm their customers. If they create this tool, history tells us that someday it will escape into the wild, or be independently rediscovered in the wild. No one will know it exists in the wild until it's much too late.

And guess what? We've already been down that road, several times. There have been a number of high-profile vulnerabilities that seemed awfully convenient to leave in place, like Heartbleed, the OpenSSL flaw that the NSA reportedly knew about for years before it was publicly discovered, and which was used against lots of businesses and innocent people, with losses running well into the billions; high-profile big banks and government agencies were hit particularly hard.

These are also the exact same kinds of exploits that have led to the theft of military aircraft schematics from the Defense Department. Do you understand what I'm saying?

The Catch-22 Of Data Security: The FBI Needs To Actually Be Intelligent


It's a horrible catch-22, but if we're going to have security, the simple fact is that government agencies like the FBI and NSA need to do their jobs without relying on companies like Apple to leave security holes open.

They need to work proactively on their interrogation techniques and on ways of detecting threats before they happen, not just rely on companies like Apple to enable them to hack the phones of people in their custody who are uncooperative due to death or other reasons.

The FBI needs to be watching potential terrorists who have recently immigrated from countries known to spawn terrorists. They need to surveil those people using the myriad techniques that exist today.

Frankly, with all the crazy tech available to the FBI for surveillance now, it's almost a joke that they would need Apple to hack someone's phone. They can follow a suspect 24/7 with drones and use special lasers pointed at windows to pick up every single thing said inside a home. They can also tap the suspect's conversations. It's kind of crazy that we are even talking about this.

However, the FBI and Homeland Security allowed a terrorist to immigrate from a known terrorist-threat nation. They let her set up shop, stockpile weapons, and go on a shooting spree. Their demands to Apple frankly seem to me like a way to deflect questions from their own failure to properly identify the threat posed by the San Bernardino terrorists before the attacks.

So it is from that perspective that I answered "no" to Lars Larson's question of whether, if I were the CEO of Apple and the FBI came to me with a phone and said, "we need you to crack this to stop a major terrorist attack," I would do it. Because there is a lot more at stake here than I think Tim Cook is being given credit for realizing.

What's more, Apple cannot hack a phone while its owner is still at large, because the owner could simply wipe it remotely. That means the only time the FBI will need Apple to open a phone is when the suspect is already in custody or dead. The likelihood that the FBI would have a suspect in custody and yet that suspect's phone would be the only key to stopping a major attack seems very low to me.

What seems more likely is that the FBI will be asking Apple to hack phones, potentially full of HIPAA-protected medical records, belonging to suspects who have already pleaded guilty or who are pleading the Fifth. At that point, is the FBI really likely to have no other leads it is intelligent enough to use to stop whatever pending attack the phone might hold the key to?

To ask Apple to betray any of its customers' trust, especially a customer who more than likely has not yet been proven guilty and may be pleading the Fifth, is tricky.

Then there's the fact that if a loophole exists to hack an iPhone, there's no way to know that Apple is the only one who has it. That, in turn, casts reasonable doubt on the authenticity of any information found on such a phone, since it could have been planted there by a hacker.

In Conclusion


I appreciate your willingness to read this. It's just my perspective as someone who has worked in computer programming and network administration (and thus cybersecurity) in the past.

I'm sorry to Lars that I wasn't able to make this point more clearly on the air. It is a complex issue.

But I don't feel this should be a political issue; I think it is clear-cut, and we should all agree that a free and secure society based on the Constitution comes with certain risks.

The legal question of whether the court should be able to tell Apple what to do is a sideshow to the real issue: whether Apple, upon discovering a security loophole, should leave it in place and hope no hackers find it. Because hackers WILL find it.
