Monday, July 21, 2008

Auditable != Secure

It occurred to me this morning that many of the security systems I've been challenged with recently aren't really security systems at all. They are auditing systems. For every piece of the production environment and every change, there must be a way for an outside auditor to track that piece or change to a slew of paperwork that explains why that piece or change is in the production environment.

Processes and rules are put into place to facilitate easy auditing. In the auditors' ideal scenario, a set of responsible parties must authorize every change to the production environment. They do this by signing off on a set of documents that, among other things, describe the change that is going into production. The process is anything but swift. Among the things that slow it down: all approvals must be in place 24 hours before the release window, and any change to the paperwork restarts the approval process, so if you forget one file or step on the day of release, the release will likely have to be postponed. Because approvals are so regular and so frequent, approvers get into the habit of rubber-stamping approval requests. People aren't good at diligently performing detailed, repetitive tasks.

The inefficiency of the system is costly in terms of time, but it also breeds a false sense of security. People believe the system is secure. Psychologically, I think, people equate an obstructive inconvenience with security. And for people who obey the rules, the system is secure.

I know it may be difficult for many people to believe, but there are people who willfully ignore rules. Putting procedural barriers in the path of people who have no intention of following those procedures is more expensive and far less productive than having no barrier at all. The people who follow the procedure are less productive, because the process is by definition obstructive. And the system is not secure, because the obstructive procedures are ignored by the "evildoers".

How do these obstructive systems break down? Mostly through casual disregard for the procedures, social engineering, and incompetence.

Let's take a hypothetical system with a Segregation of Duties (SOD) policy. Under SOD, only a trusted few are allowed access to the production passwords. Those trusted few will give out the passwords through incompetence, social engineering, or simple disregard for the system. It will happen. And once the passwords get out, some people will start using them as shortcuts around the cumbersome process.

The other flaw in the system is that the password holders have nothing keeping their own actions in check. Any one of them could go rogue and run amok.

The last rung on the ladder is the changes themselves. Any developer worth a damn could sneak code into production that behaves very differently from the code that was tested in the QA environment.
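To make that concrete, here's a hypothetical Python sketch (the environment variable and the fee logic are invented for illustration) of how little it takes for promoted code to behave one way in QA and another way in production:

import os

def transfer_fee(amount):
    """Compute the fee charged on a transfer."""
    fee = amount * 0.01
    # Passes every test in QA; this branch only wakes up in production,
    # where nobody is re-reviewing the code that actually got deployed.
    if os.environ.get("APP_ENV") == "production":
        fee += 0.10
    return round(fee, 2)

Nothing in a sign-off document would catch that, because the document describes the change the approvers were told about, not the change that shipped.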

Those are just a few realistic scenarios. A system that relies on segregating the people who make changes from the people who put them into production is not sufficient to protect that system from malice or, our old friend, incompetence.

Having such a system in place actually makes these situations worse, for two big reasons. The first is the perception that the production environment is secure: people may be vigilant about the obstructive processes yet completely lax about other aspects of the system, and a false sense of security breeds complacency. The second is that the process obstructs people from reacting to incidents that damage the production environment, regardless of intent.

Instead of relying on procedure alone, I'd recommend the following practices be put into place. First, develop a practical disaster recovery strategy: the better the strategy, the smaller the window for mishaps.

Next, streamline the promotion process: Make all documentation living and make approvals meaningful.

Put the power of approval into the hands of lower-level people. They know what's going in; let them take responsibility for it. If there's a problem, the director will catch heat for it anyway. And if directors are just approving projects without knowing anything about them, they aren't adding value to the project.

Finally, if you want to see a good model for security, look at banks and other industries that handle money. Banks handle money all the time, and their processes are constantly refined to keep people from exploiting vulnerabilities in those processes. The nice thing about banking and finance is that if there is a hole in the process, that industry is likely to be the first to have someone exploit it.

One idea: why not require two trusted people to provide credentials via multi-factor authentication in order to generate a temporary production password that is good only for the scheduled release window? The intention is that no one person could get into production by themselves.
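A minimal sketch of that idea in Python, assuming a hypothetical verify_mfa callback supplied by whatever identity system is in place; the password is random, tied to no individual, and simply expires with the release window:

import secrets
from datetime import datetime, timedelta

def issue_release_password(approver_a, approver_b, verify_mfa, window_hours=4):
    """Issue a temporary production password, valid only for the release
    window, and only if two distinct approvers both pass MFA.

    `verify_mfa` is a placeholder: it takes an approver id and returns
    True when that person's multi-factor check succeeds.
    """
    if approver_a == approver_b:
        raise PermissionError("Two distinct approvers are required")
    if not (verify_mfa(approver_a) and verify_mfa(approver_b)):
        raise PermissionError("Multi-factor authentication failed")

    password = secrets.token_urlsafe(24)                     # one-time credential
    expires_at = datetime.utcnow() + timedelta(hours=window_hours)
    return password, expires_at                              # revoke after expires_at

The point of the design is that the credential itself enforces the policy: no standing passwords to leak, and every issuance names two people.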

The system I outlined above is more secure than the basic SOD system. Computers are powerful now; information can be tracked for auditors without requiring that people perform multiple obstructive steps. Isn't auditing really about tracking changes and making sure that an organization can explain the reasons behind any one of them?
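The tracking half is the easy part. As a rough example (the file name and fields are placeholders, not anyone's real audit format), an append-only change log captures everything an auditor needs without anyone filling out a form:

import getpass
import json
from datetime import datetime

def log_change(artifact, reason, approvers, logfile="change_audit.jsonl"):
    """Append a record of who changed what, when, and why."""
    entry = {
        "timestamp": datetime.utcnow().isoformat(),
        "deployed_by": getpass.getuser(),
        "artifact": artifact,
        "reason": reason,
        "approvers": approvers,
    }
    with open(logfile, "a") as fh:
        fh.write(json.dumps(entry) + "\n")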

Is it the mandate of the auditor that efficiency be sacrificed so they are able to do their work? If so, they're very mean people.
