
Policies That Work (Making IT Real)

We talk a lot about IT, but we don’t talk nearly enough about making IT real.

I’ve found that there’s a disconnect between IT – security guys in particular – and the people they’re securing. While this applies across the board, it becomes a real problem when policies are being created.

One of the keys to effective security is to stop trying to create policies or procedures that aren’t going to work. It’s great on paper to require users to have fourteen-character passwords that change every day, only use Internet access for work purposes, turn off their cellphones as they walk in the door, have IT install (and fix) all printer connections, and never connect their iPads to corporate wireless. Unfortunately, those requirements work best on paper, not in reality. Security policies that users don’t buy into weaken security across the board.

When it comes to wireless in particular, it’s hard to tell users to just give up their need for constant Internet access. So I say: give them a clean, safe avenue to feed that need. WiFi is so cheap – Internet connections in general are so cheap – that I suggest offering an open, third-party wireless network. It goes straight out to the Internet and has no connection to the internal network. Users get on the VPN as if they were sitting in a Starbucks – the network is just as hostile – and you back that up with policy: treat this wireless exactly as you would any public hotspot. If you find a user bridging the networks, you apply the appropriate policy enforcement, equivalent to what you would do if you found someone publishing sensitive documents from an open, public network.
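
Policy like that is easier to enforce when the “no bridging” rule can actually be checked. Below is a minimal sketch of one way to flag a machine attached to both networks at once; the subnet ranges and the use of the third-party psutil library are illustrative assumptions, not a description of how any particular shop (or product) does it.

    # Minimal sketch (illustrative subnets): warn when this host holds addresses
    # on both the corporate LAN and the open guest WiFi at the same time,
    # i.e. when it could be bridging the two networks.
    import ipaddress

    import psutil  # third-party: pip install psutil

    CORPORATE_NET = ipaddress.ip_network("10.10.0.0/16")    # assumption: internal range
    GUEST_NET = ipaddress.ip_network("192.168.50.0/24")     # assumption: guest WiFi range

    def local_ipv4_addresses():
        """Yield every IPv4 address assigned to any local interface."""
        for addrs in psutil.net_if_addrs().values():
            for addr in addrs:
                if addr.family.name == "AF_INET":
                    yield ipaddress.ip_address(addr.address)

    def is_bridging():
        addresses = list(local_ipv4_addresses())
        on_corp = any(ip in CORPORATE_NET for ip in addresses)
        on_guest = any(ip in GUEST_NET for ip in addresses)
        return on_corp and on_guest

    if __name__ == "__main__":
        if is_bridging():
            print("WARNING: this host is attached to both the corporate and guest networks.")
        else:
            print("OK: no corporate/guest bridge detected on this host.")

In practice you would push a check like this out through whatever endpoint management you already have and pair the alert with the written policy, rather than relying on either one alone.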

The question, then, is: what does appropriate punishment look like? Personally, I believe in having effective policies that actually result in real change, and for that I have always found that positive reinforcement works best. Whether positive or negative, acknowledgement is effective and very important. When I do a security awareness engagement (a pentest) and I’ve completely destroyed the place, I spend the third day going out of my way to get caught. One time, I walked out with the business processing computer from behind a teller machine. There was a guy who had let me do lots of bad stuff earlier, but this time he caught me. As soon as he did, I said “ooh, you caught me.” Basically, I gave him the win! It was a bad situation, and we had found all these flaws in their security – but these four people were able to find something, and that caught their attention.

We spend too much time in our industry showing people what they did wrong. You can’t find everything that everyone did wrong, but you can show them examples of what to do right. That’s what enforcement policies should be based on – what it looks like to do things right. When I do enforce a punishment, I go to the employee’s desk and have them stand right behind me and watch while I “check” their computer, even if I already know what was wrong. I make them watch the process. And then I say, “you do understand our corporate policies, right?” If it’s a first offense, I usually won’t even report it, but I do publicly show them the right way forward. I’m not just educating that one person – I’m also trying to educate everyone around them.

Unfortunately, not many IT departments have a guy like me.

But every IT guy can be a guy like me. Every quarter, a security professional or IT team doing security needs to physically walk through the company’s buildings. Pick a floor, campus, department. Walk through while people are there. Look under keyboards and monitors for passwords. Let them know what you’re doing, and let them know why you’re doing it. Security is everyone’s job: you’re just the one being obvious about it.

Are we becoming complacent?

The steady drumbeat of security breaches makes me wonder: are people and organizations becoming complacent? Has the mentality become that breaches are inevitable, so “let’s make sure we are compliant and be really good at incident response and PR when we get hit”?

After all, if our credit cards are compromised, the merchants and banks have the liability and we get new cards. The worst that can happen is that we’re inconvenienced.

It makes for sensational news when it happens: “today, X million credit cards were breached in an attack on a large retailer where we all shop.” Then later, after an investigation that takes months and millions of dollars, we learn a bit about how it happened and eventually (if we are still interested) hear how some network of lowlifes worked in the shadows to execute the breach.

The news of these attacks is becoming repetitive, and I think it is making us complacent.

Why is this relevant to us as security professionals? Because if it’s true, it means we will see this complacent attitude in the executive teams we report to. Complacency isn’t new, but it is getting worse. We have all seen the CFO who wants to cover the risk with insurance, meet the minimum compliance bar, and do only what’s required. It’s a typical CYA approach, and it’s the norm. And you and I know it just isn’t enough.

It might seem like the only way the CYA crowd will get it is through a major breach, but another option is to show them what is actually being done to them on their watch.

We recently launched the beta of Pwn Pulse, and it is revealing things that are making management’s heads spin. While we cannot reveal the details of what we are seeing, let’s just say it is well worth your consideration. Let me tell you why:

Though it’s easy for a security professional to say that certain things need to be done, it takes solid proof to actually convince most execs. In “Project Eavesdrop,” Dave Porcello of Pwnie Express worked with NPR to show how easily you can be spied on. That series alone opened the eyes of many execs. Pwn Pulse, a service which goes fully live this Q4, can show you rogue actors at work in your organization today. Show that to the CFO who is responsible for risk. I guarantee he will lose the complacent attitude and you will get the attention you need.

We might need to shake things up.

Employees, Education, and Social Engineering

In conversations with CISOs and others in charge of security, the Pwnies keep hearing the same thing: employees are usually the weakest link.

When people think of hackers, the stereotype is still some guy in a basement, silently, remotely, and independently accessing the world around him. Of course that is sometimes true, but it ignores the simple fact that sometimes the easiest way into a system is to walk right through the front door – sometimes literally, sometimes figuratively.

Lately this threat has become even more visible: many of the recent large breaches used social engineering as the initial attack vector. The now-infamous Target and RSA breaches started with targeted phishing emails. Social engineering’s effectiveness against even established companies is demonstrated every year at DEF CON’s Social Engineering Capture the Flag contest, a competition sponsored by SocialEngineer.org to see how many “flags,” or useful pieces of information, employees at targeted companies will disclose. 2014’s theme was “retail,” and most of the organizations tested failed with flying colors.

The most effective security audits take this into account and use social engineering to test the organization – calling to ask for passwords, looking for devices left lying around, and plugging in things that should never have been let through the door. Both adversaries and auditors use social engineering this way, and employees usually don’t know what hit them – without knowing how people might take advantage of them, they are left unequipped to stop the breach.

These problems may be obvious to security professionals, but it can be considerably more difficult to drive them home with everyone else – those who feel that security is taken care of through compliance, or that cyber attacks are divorced from the physical world. Recalling last week’s post “Scare the CEO,” a crucial part of any effective security plan is education, and the most effective form of education is hands-on. So show your employees and colleagues what social engineering is… as they say, it “takes one to know one.”

As an example of what can go wrong, Pwnie Express has a video called “Don’t Get Pwned,” showing what it looks like when a pentester breaches an office by exploiting common vulnerabilities.

Check out SocialEngineer.org for more.

Derby Con and $100 Off

Did you watch the Pwnies on Security Weekly last week? No? Well then you missed out… on more than a great show! Pwnie Express offered $100 off an R3 to those who watched (the discount expires September 30). You can still catch the show (and the discount code) here or on the Security Weekly site.

Win a red Pwn Phone

Also, Derby Con 4.0 is coming up, September 24-28 in Louisville, Kentucky, and Pwnie Express will be on hand September 25-26 (and we might have stickers), so stop by the booth and say hello! We’ll be holding a drawing for a free red Pwn Phone, one of only a few specially made. To enter, stop by the booth and drop off a business card. Two of the Pwnies will also be leading a workshop called “Make Your Own Pwn Phone” on Friday, Sept. 26 from 2:00pm to 4:00pm, where you can, well, make your own Pwn Phone. We will not be providing phones, however, so remember to bring your own Nexus 5 or Nexus tablet if you want to participate. We will also be selling the “Pwn Pad DIY kit” and the “Pwn Pro DIY kit” – full kits with all adapters, case, velcro, etc. – at the booth.

Scare the CEO

Pwnie Express does not in any way condone fear mongering. That being said, the resignation of Target CEO Gregg Steinhafel (following a data breach that affected 40 million customers and will cost the company at least $148 million) is inherently scary, and not just in the InfoSec world. Those numbers should wake up even the most anti-IT, “we don’t need to spend money on that” executives. And with numbers like that, the business world is opening its eyes to the dangers, with publications like Bloomberg Businessweek and Forbes publishing the “CEO Guide to Cybersecurity” and “Five Smart Cybersecurity Moves from Top Security CEO’s”.

But there’s still a lack of communication between the business world and InfoSec experts, and it is more detrimental to both parties than many realize. Security is not an isolated problem: the results of a failure affect the entire organization, and an effective security posture requires the entire organization’s involvement. The question, then, is how to involve the organization and teach people about a problem that is inherently esoteric – 0’s and 1’s causing real-world trouble.

We recently spoke with a security consultant about the challenges of educating an organization, and his suggestion was both practical and effective. He pointed out that the best way to teach awareness is by getting people’s hands dirty – sometimes quite literally: lockpicking is a popular InfoSec hobby, and it’s a great training tool as well. Not until they actually do something – lockpicking, a hackathon (what can you find with a basic nmap scan?), or a staged attack – will most people understand what they are up against. With services like PhishMe, employees are shown that they are more vulnerable than they realize – and that phishing emails don’t always come from Nigerian princes.
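
For the “basic nmap scan” exercise, even the crudest version of the idea is eye-opening. The sketch below is not nmap – it is a bare-bones TCP connect scan in Python, with an assumed target address and port list – but running something like it against a host you are authorized to test shows how much a network will tell anyone who simply asks.

    # Bare-bones TCP connect scan: a stand-in for "what can a basic scan find?"
    # The target address and port list are assumptions for illustration; only
    # scan hosts you are authorized to test.
    import socket

    TARGET = "192.168.1.10"  # assumption: a lab host you own or may test
    COMMON_PORTS = [21, 22, 23, 25, 80, 139, 443, 445, 3389, 8080]

    def scan(host, ports, timeout=0.5):
        """Return the subset of ports that accept a TCP connection."""
        open_ports = []
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
                sock.settimeout(timeout)
                if sock.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                    open_ports.append(port)
        return open_ports

    if __name__ == "__main__":
        found = scan(TARGET, COMMON_PORTS)
        print(f"{TARGET}: open ports -> {found if found else 'none found'}")

A real nmap run layers service and version detection on top of this, which is usually where the uncomfortable findings start – but even this much is enough to make the point in a workshop.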

Computer Weekly published an article on the usefulness of attack simulation in winning executive buy-in for security. Like a penetration test aimed at people rather than machines, a staged attack can wake up even the most staid executives. With the threat made real, IT security suddenly becomes a necessity, not an optional expenditure. It also helps identify weaknesses in an incident response plan, as business execs unfamiliar with security problems are forced to work out what is actually wrong and who it affects – do they notify clients? the press? freeze accounts? Not unlike Chaos Monkey, these exercises aim to break (or simulate breaking) your systems while the consequences are not dire. Computer Weekly’s source Marco Gercke said that “in a real cyber attack, I once saw a board take nine days to issue a press statement because they did not understand the complexity of their company’s IT systems.” By making security real to everyone in the organization, the organization’s security posture becomes more robust.

Think of it like Halloween – only every trick you dole out is actually a treat.

For another story on frightening CEOs with cybersecurity, see NPR’s “Cyber Briefings Scare the Bejeezus out of CEOs.”

What’s Up, Doc?

Black Hat 2014 hosted a roundtable titled “Medical Devices Roundtable: Is There a Doctor in the House? Security and Privacy in the Medical World.” Rapid7’s Jay Radcliffe presented the major issues facing the healthcare industry as it moves toward increasing automation of both information and devices – an expanding attack surface for all sorts of potential problems.

Though the roundtable was well attended, Forbes’ Dan Munro pointed out that the healthcare industry itself was surprisingly absent from the conference. Healthcare is becoming increasingly automated, and rightly so – bioanalytics and cloud-based monitoring are helping to save lives by giving doctors up-to-date information on patients and remote oversight of their health. As he pointed out, this is not a bad thing: wirelessly controlled pacemakers and insulin pumps are not only saving lives, they are often improving them by giving patients the ability to monitor and control processes that were previously invisible to them. In addition, medical research is far easier when information from thousands of people – all willing participants, of course – can be aggregated instantly.

Radcliffe was quick to point out the main issues: lack of regulatory oversight, lack of understanding even within regulatory organizations, and lack of knowledge within the industry. As things stand, he pointed out, security falls under no one’s domain. The FDA gives cybersecurity “guidance,” a tricky word that lacks the teeth of retail’s PCI regulations and fines. The agency rightly points out that cybersecurity is a shared responsibility, which is simultaneously a problem and an opportunity – if the industry rises to the challenge.

Unfortunately, the industry is already behind. A DEF CON talk by Scott Erven and Shawn Merdinger further explored just how lacking in security many medical devices currently are, and another Munro article noted that over 90% of cloud services used by healthcare could pose a major security risk. New devices being marketed as health monitors also have the potential to be extremely detrimental, since the information they gather is itself sensitive data.

Meanwhile, data breaches at hospitals and health centers are already occurring, as the recent CHS incident attests. Data breaches, surprisingly enough, are one part of the healthcare industry that is regulated – under HIPAA (the Health Insurance Portability and Accountability Act), a federal law administered by Health and Human Services that protects Personally Identifiable Information (PII). Even with HIPAA and the FDA’s guidance, more has to be done in this field.

And with the potential implications of a hack or breach being human life, the stakes could not be higher.

“Advanced Persistent Pentesting” Slides

Slides are available for our “Advanced Persistent Pentesting: Fighting Fire with Fire” talk from Hacker Halted 2012. The central thesis of the presentation is that pentesters already have all the tools they need to simulate an APT, and that the focus on the pentesting report as a binary pass/fail has been wrong. We should still pay some attention to the result, but far more to the process.

Some of the takeaways from the presentation:

  • Both the tester and the test target should work closely together for maximum value.
  • Pentests should not operate in a silo.
  • As a defender, if you don’t want the results, you want the incident response (IR) capability.
  • Adding or enhancing a capability qualifies as actionable results.
  • Offensive capabilities lead, defensive capabilities lag (several years?).

Thanks again to the entire crew at Hacker Halted who made this possible.