I have to say I’m rather intrigued by the TalkTalk hack. First of all, they’ve found the 15-year-old who allegedly did it, arrested him and bailed him pending investigations. Hopefully, if said person did it, he’ll be quite interested in helping the police with their inquiries and with a bit of luck, the customers aren’t going to have their personal or financial details released. TalkTalk have waived cancellation fees for customers who want to leave, but only if a customer has sustained a financial loss.
Meanwhile, Brian Krebs reports that the TalkTalk hackers demanded £80k worth of Bitcoin from the ISP. We’ve now had the media tell us it was “cyberjihadis”, a fifteen-year-old boy, and people holding the company to ransom for Bitcoin.
What’s curious though is how the mainstream media have not really talked very much to security experts. Yesterday, I listened to the BBC Today programme—this clip in particular. It featured an interview with Labour MP Hazel Blears (who was formerly a minister in the Home Office) and Oliver Parry, a senior corporate governance adviser at the Institute of Directors.
Here’s Mr Parry’s response to the issue:
The threat is changing hour by hour, second by second—and that’s one of the real problems, but as I said, I don’t think there’s one way to deal with this. We just need to reassure consumers, shareholders and other wider stakeholder groups that they have something in place.
Just a few things. This attack was a simple SQL injection attack. That threat isn’t “changing hour by hour, second by second”. It’s basic, common-sense security that every software developer should know how to mitigate, that every supervisor should be sure to ask about during code reviews, and that every penetration tester worth their salt will check for (and sadly, usually find).
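To show just how old and well-understood this class of bug is, here is a minimal sketch. The table, field names and data are invented for illustration, not taken from TalkTalk’s systems; the point is that the fix, a parameterised query, has been standard practice for well over a decade.

```python
import sqlite3

# Hypothetical customer table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (email TEXT, phone TEXT)")
conn.execute("INSERT INTO customers VALUES ('alice@example.com', '01632 960001')")

def find_customer_unsafe(email):
    # VULNERABLE: attacker-controlled input is spliced into the SQL text,
    # so an input like "' OR '1'='1" rewrites the query and dumps every row.
    query = "SELECT email, phone FROM customers WHERE email = '%s'" % email
    return conn.execute(query).fetchall()

def find_customer_safe(email):
    # SAFE: a parameterised query keeps the input as data, never as SQL.
    query = "SELECT email, phone FROM customers WHERE email = ?"
    return conn.execute(query, (email,)).fetchall()

print(find_customer_unsafe("' OR '1'='1"))  # dumps the whole table
print(find_customer_safe("' OR '1'='1"))    # returns nothing
```

The difference is one line: let the database driver bind the value instead of gluing strings together. That is the entire mitigation the talking heads are treating as an unknowable, hour-by-hour threat.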
As Paul Moore has pointed out in this excellent blog post, there are countless security issues with TalkTalk’s website: craptastic SSL, no PCI compliance. The talking heads are going on about whether or not the data was “encrypted”. The SSL transport was encrypted, but the server would accept requests to encrypt traffic with an obsolete 40- or 56-bit key rather than the 128-bit keys considered secure.
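Refusing those export-grade keys is not exotic either. As a rough sketch (using Python’s standard `ssl` module; the version floor and cipher string here are illustrative choices, not TalkTalk’s configuration), a client or server can simply decline to negotiate anything weak:

```python
import ssl

# Start from the library's sensible defaults (certificate checking on).
context = ssl.create_default_context()

# Refuse protocol versions from the era of 40/56-bit export ciphers.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Restrict the offered cipher suites to strong ones (128-bit AES and up),
# explicitly excluding export, DES, RC4 and anonymous suites.
context.set_ciphers("HIGH:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5")

print(context.minimum_version)
```

A connection made through this context will fail outright against a peer that only speaks obsolete ciphers, which is exactly what you want: fail loudly rather than silently downgrade.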
There are new and changing threats, but SQL injection isn’t one of them. That’s a golden oldie. And it also doesn’t matter if the data is encrypted if the web application and/or the database is not secured against someone injecting a rogue query.
Mr Parry is right that there is not “one way to deal with this”. There are plenty. First of all, you need to hire people with some security expertise to build your systems. You need to hire independent experts to come and test your systems. That’s the in-house stuff. Unfortunately, TalkTalk seem to have lost a whole lot of their senior technical staff in the last year, including their CIO. Perhaps they weren’t confident in the company’s direction on security and technology matters.
Then there’s the external-facing stuff. Have a responsible disclosure process that works and that gives people an incentive to disclose. Have a reward system. If you can’t afford that, guarantee that you won’t seek prosecution or bring legal action against anyone who engages in responsible disclosure. Keep an open, clear log where you disclose your security issues after fixing them. Actually fix the issues people report to you. Again, Paul Moore’s post linked above notes that he tried to contact TalkTalk and was ignored, disrespected and threatened. That’s not how you should treat security consultants who are helping you fix your issues.
All of this stuff should be simple enough for an ISP that has over four million paying customers. It isn’t rocket science. The fact that they aren’t doing it means they are either incompetent or don’t give a shit.
Brian Krebs nailed this corporate failure to care about security recently in a discussion on Reddit:
I often hear from security people at organizations that had breaches where I actually broke the story. And quite often I’ll hear from them after they lost their job or quit out of frustration, anger, disillusion, whatever. And invariably those folks will say, hey, we told these guys over and over…here are the gaps in our protection, here’s where we’re vulnerable….we need to address these or the bad guys will. And, lo and behold, those gaps turned out to be the weakest link in the armor for the breached organization. Too many companies pay good money for smart people to advise them on how to protect the organization, and then go on to ignore most of that advice.
Mr Parry said that the important thing was reassuring customers that their information was safe, not actually ensuring that customers’ data is safe. This is exactly the problem. I don’t want to be “reassured”; I want it to be safe. I don’t want to be reassured that my flight isn’t going to crash into the Alps; I actually want my flight not to crash into the Alps. Reassuring me requires salespeople and professional bullshitting; not crashing requires well-trained pilots and staff, engineers doing proper checks, the designers of the plane following good engineering practices, and constant testing. Engineering matters, not “reassurance”.
So long as business thinks “reassuring” customers matters more than actually fixing security problems, these kinds of things will keep happening. It would be really nice if the media actually spoke to security experts who could point out how trivially stupid and well-known the attack on TalkTalk was, so this kind of industry avoidance tactic could be properly squelched.