Wednesday, July 16, 2014

Transformation: A Future Not Slaved to the Past

In his May 30, 2014 contribution to the Washington Post’s Innovations blog, Dominic Basulto lays out a convincing argument that cyber-warfare represents a new form of unobserved but continuous warfare in which our partners are also our enemies.  The logic within Basulto’s piece is flawless, and his conclusion, that the “mounting cyber-war with China is nothing less than the future of war” and that “war is everywhere, and yet nowhere because it is completely digital, existing only in the ether,” is particularly powerful.

Unfortunately, the argument, and its powerful conclusion, ultimately fails.  Not because of errors in the internal logic, but because of an implicit external premise: that both the architecture of the internet and the processes by which software is developed and deployed are, like the laws of physics, immutable.  From a security perspective, the piece portrays a world where security technology, and those charged with its development, deployment and use, are perpetually one step behind attackers who can, will and do exploit vulnerabilities in both architecture and process to spy, steal and destroy.

It’s a world that is, fortunately, more one of willful science fiction than of predetermined technological fate.  We live in an interesting age.  There are cyber threats everywhere, to be sure.  But our ability to craft a safe, stable and secure cyber environment is very much a matter of choice.  From a security perspective, the next page is unwritten and we get to decide what it says, no matter how disruptive.

As we begin to write, let’s start with some broadly-agreed givens: 

  • There’s nothing magical about cyber security;
  • There are no silver bullets; and
  • Solutions leading to a secure common, distributed computing environment demand investments of time and resources. 

Let’s also be both thoughtful and careful before we allow pen to touch paper.  What we don’t want to do is perpetuate outdated assumptions at the expense of innovative thought and execution.  For example, there’s a common assumption in the information technology (IT) industry in general, and the security industry (ITSec) in particular, that mirrors the flaw in Basulto’s fundamental premise: that new security solutions must be applied to computing and internet architectures comparable or identical to those that exist today.  The premise behind this idea, that “what is, is what must be,” is the driver behind the continued proliferation of insecure infrastructures and compromisable computing platforms.

There’s nothing quixotic – or new – about seeking disruptive change.  “Transformation” has been a buzzword in industry and government for at least a decade.  For example, the North Atlantic Treaty Organization (NATO) has had a command dedicated to just that since 2003.  The “Allied Command Transformation” is responsible for leading the military transformation of forces and capabilities, using new concepts and doctrines in order to improve NATO's military effectiveness.  Unfortunately, many transformation efforts are diverse and fragmented, and yield few tangible benefits.  Fortunately, within the rubric of cyber security, it’s possible to focus on a relatively small number of transformational efforts.

Let’s look at just four examples.  While not a panacea, these four measures would, if implemented, significantly ameliorate the state of global cyber vulnerability.

1. Security as part of the development process

Software security vulnerabilities are essentially flaws in the delivered product.  These flaws are, with rare exception, inadvertent.  Often they are undetectable to the end user.  That is, while the software may fulfill all of its functional requirements, there may be hidden flaws in non-functional requirements such as interoperability, performance or security.  It is these flaws, or vulnerabilities, that are exploited by hackers.

In large part, software vulnerabilities derive from traditional software development lifecycles (SDLC) which either fail to emphasize non-functional requirements, use a waterfall model where testing is pushed to the end of the cycle, don’t have a clear set of required best coding practices, lack code reviews, or suffer from some combination of the four.  These shortcomings are systemic in nature and are not a function of developer skill level.  Addressing them requires a paradigm shift.

The DevOps Platform-as-a-Service (PaaS) represents such a shift.  A cloud-based DevOps PaaS enables a project owner to centrally define the nature of a development environment, eliminating unexpected differences between development, test and operational environments.  Critically, the DevOps PaaS also enables the project owner to define continuous test/continuous integration patterns that push the onus of meeting non-functional requirements back to the developer. 

In a nutshell, both functional and non-functional requirements are instantiated as software tests.  When a developer attempts to check a new or modified module into the version control system, a series of automated processes is executed.  First, the module is vetted against the test regime.  Failures are noted and logged, and the module’s promotion along the SDLC stops at that point.  The developer is notified as to which tests failed, which parts of the software are flawed and the nature of the flaws.  If the module tests successfully, it is automatically integrated into the project trunk and the version number is incremented.
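
To make the gating step concrete, here’s a minimal sketch, in Python, of the kind of commit-stage check such a pipeline might run automatically.  The commands chosen for the test regime (pytest for functional tests, bandit for a static security pass) are illustrative assumptions rather than the interface of any particular DevOps platform.

    # Minimal sketch of a commit-stage gate.  The commands below (pytest for
    # functional tests, bandit for a static security pass) are illustrative
    # assumptions, not the interface of any particular PaaS.
    import subprocess
    import sys

    CHECKS = [
        ("functional tests", ["pytest", "-q"]),
        ("security analysis", ["bandit", "-q", "-r", "src"]),
    ]

    def gate_commit() -> int:
        failures = []
        for name, cmd in CHECKS:
            result = subprocess.run(cmd, capture_output=True, text=True)
            if result.returncode != 0:
                failures.append((name, result.stdout + result.stderr))
        if failures:
            # Promotion along the SDLC stops here; the developer is told
            # which checks failed and why.
            for name, log in failures:
                print(f"FAILED: {name}\n{log}", file=sys.stderr)
            return 1
        # All checks passed: the module can be merged and the version incremented.
        print("All checks passed; module promoted to the project trunk.")
        return 0

    if __name__ == "__main__":
        sys.exit(gate_commit())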

A procedural benefit of a DevOps approach is that requirements are continually reviewed, reevaluated, and refined.  While this is essential to managing and adapting to change, it has the additional benefits of fleshing out requirements that are initially not well understood and identifying previously obscured non-functional requirements.  In the end, requirements trump process; if you don’t have all your requirements specified, DevOps will only help so much.

The net result is that a significantly larger percentage of flaws are identified and remedied during development.  More importantly, flaw and vulnerability identification takes place across the full spectrum of functional and non-functional requirements.  Consequently, the number of vulnerabilities in delivered software products can be expected to drop.

2. Encryption will be ubiquitous, preserving confidentiality and enhancing regulability

For consumers, and many enterprises, encryption is an added layer of security that requires an additional level of effort.  Human nature being what it is, the calculus generally favors a lower level of effort over an intangible security benefit.  Cyber-criminals (and intelligence agencies) bank on this.  What if this paradigm could be inverted such that encryption became the norm rather than the exception?

Encryption technologies offer the twin benefits of 1) preserving the confidentiality of communications and 2) providing a unique (and difficult to forge) means for a user to identify herself.  The confidentiality benefit is self-evident: encrypted communications can be seen and used only by those who hold the necessary key.  Abusing those communications requires significantly more work on an attacker’s part.
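
As a simple illustration of the confidentiality benefit, here’s a minimal sketch using the Fernet symmetric scheme from the Python cryptography package; the message and the key handling are purely illustrative.

    # Confidentiality sketch: only holders of the shared key can read the message.
    from cryptography.fernet import Fernet, InvalidToken

    key = Fernet.generate_key()          # shared secret held by sender and recipient
    channel = Fernet(key)

    ciphertext = channel.encrypt(b"quarterly financials attached")

    # The intended recipient, holding the key, recovers the plaintext.
    print(channel.decrypt(ciphertext))

    # An eavesdropper without the key sees only opaque ciphertext; attempting
    # to decrypt with a different key raises InvalidToken.
    try:
        Fernet(Fernet.generate_key()).decrypt(ciphertext)
    except InvalidToken:
        print("eavesdropper cannot read or forge the message")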

The identification benefit ensures that all users of (and on) a particular service or network are identifiable via the possession and use of a unique credential.  This isn’t new or draconian.  For example, (legal) users of public thoroughfares must acquire a unique credential issued by the state: a driver’s license.  The issuance of such credentials is dependent on the user’s provision of strong proof of identity (such as, in the case of a driver’s license, a birth certificate, passport or social security card).  The encryption-based equivalent of a driver’s license, a digital signature, could be a required element used to positively authenticate users before access to any electronic resources is granted.
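
Here’s a minimal sketch of what signature-based authentication might look like, using the Python cryptography package.  The challenge/response flow and the enrollment step are illustrative assumptions; a real credential scheme would also need key issuance, identity proofing and revocation.

    # Sketch of signature-based authentication.  The flow is illustrative; a
    # production system would also bind keys to proofed identities and handle
    # issuance and revocation.
    import os
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Enrollment: the user generates a key pair and the service records the
    # public half alongside her proofed identity.
    user_key = Ed25519PrivateKey.generate()
    registered_public_key = user_key.public_key()

    # Authentication: the service issues a random challenge ...
    challenge = os.urandom(32)

    # ... the user signs it with the private key only she holds ...
    signature = user_key.sign(challenge)

    # ... and the service verifies the signature against the registered key.
    try:
        registered_public_key.verify(signature, challenge)
        print("Access granted: the credential is authentic.")
    except InvalidSignature:
        print("Access denied: the credential could not be verified.")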

From a security perspective, a unique authentication credential provides the ability to tie actions taken by a particular entity to a particular person.  As a result, the ability to regulate illegal behavior increases while the ability to anonymously engage in such behavior is concomitantly curtailed.

3. Attribute-based authorization management at both the OS and application levels

Here’s a hypothetical.  Imagine that you own a hotel.  Now imagine that you’ve put an impressive and effective security fence around the hotel, with a single locking entry point, guarded by a particularly frightening Terminator-like entity with the ability to make unerring access control decisions based on the credentials proffered by putative guests.  Now imagine that the lock on the entry point is the only lock in the hotel.  Every other room on the property can be entered simply by turning the doorknob. 

The word “crazy” might be among the adjectives used to describe the scenario above.  Despite that characterization, this type of authentication-only security is routinely practiced on critical systems in both the public and private sectors.  Not only does it fail to mitigate the insider threat, but it is also antithetical to the basic information security principle of defense in depth.  Once inside the authentication perimeter, an attacker can go anywhere and do anything.

A solution that is rapidly gaining momentum at the application layer is the employment of attribute-based access control (ABAC) technologies based on the eXtensible Access Control Markup Language (XACML) standard.  In an ABAC implementation, every attempt by a user to access a resource is intercepted and evaluated against a centrally stored (and controlling) access control policy that covers both the requested resource and the attributes a user is required to have in order to access it.  Access requests from users whose attributes match the policy requirements go through; those that do not are blocked.
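
As a rough illustration of the decision logic, here’s a minimal sketch with a toy policy format.  A production deployment would express its policy in XACML and evaluate it in a dedicated policy decision point; the resources and attributes below are invented for the example.

    # Minimal sketch of an attribute-based access decision using a toy policy
    # format.  Real deployments would express policy in XACML; this only
    # shows the matching logic.

    POLICY = {
        # resource -> attributes a requester must possess (illustrative)
        "/records/patients": {"role": "clinician", "clearance": "phi"},
        "/admin/config": {"role": "administrator"},
    }

    def is_permitted(user_attributes: dict, resource: str) -> bool:
        required = POLICY.get(resource)
        if required is None:
            return False  # default deny: requests for unknown resources are blocked
        return all(user_attributes.get(k) == v for k, v in required.items())

    print(is_permitted({"role": "clinician", "clearance": "phi"}, "/records/patients"))  # True
    print(is_permitted({"role": "intern"}, "/records/patients"))                         # False

Note the default-deny behavior: a request for a resource the policy doesn’t cover is blocked rather than waved through, which is consistent with defense in depth.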

A similar solution can be applied at the operating system level to allow or block read/write attempts across inter-process communications (IPC) based on policies matching the attributes of the initiating process and the target.  One example, known as Secure OS, is under development by Kaspersky Lab.  At either level, ABAC makes a system significantly more difficult for an attacker to exploit and helps buy down the risk of operating in a hostile environment.

4. Routine continuous assessment and monitoring of networks and systems

It’s not uncommon for attackers, once a system has been compromised, to exfiltrate large amounts of sensitive data over an extended period.  Often, this activity presents as routine system and network activity.  As it’s considered to be “normal,” security canaries aren’t alerted and the attack proceeds unimpeded. 

Part of the problem is that the quantification of system activity is generally binary: a system is either up or it’s down.  And, while this is important in terms of knowing what capabilities are available to an enterprise at any given time, it doesn’t provide actionable intelligence as to how the system is being used (or abused).  Fortunately, characterizing actual usage is essentially a Big Data problem, and Big Data tools and solutions are well understood.

The solution comprises two discrete components.  First, an ongoing data collection and analysis activity establishes a baseline for normal user behavior, network loading, throughput and other metrics.  Once the baseline is established, collection continues and the behavioral metrics are evaluated against the baseline on a continual basis.  Deviations from the norm that exceed a specified tolerance are reported, trigger automated defensive activity, or both.
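
Here’s a minimal sketch of that two-phase approach for a single metric, hourly outbound transfer volume.  The sample values and the three-sigma tolerance are illustrative assumptions; a real deployment would track many metrics and feed alerts into automated defenses.

    # Baseline-and-deviation monitoring sketch for a single metric (outbound
    # transfer volume per hour).  Sample values and tolerance are illustrative.
    from statistics import mean, stdev

    # Phase 1: establish a baseline from historical observations (MB/hour).
    baseline_samples = [120, 135, 128, 142, 130, 125, 138, 133]
    baseline_mean = mean(baseline_samples)
    tolerance = 3 * stdev(baseline_samples)

    # Phase 2: evaluate ongoing observations against the baseline.
    def check(observation: float) -> None:
        deviation = abs(observation - baseline_mean)
        if deviation > tolerance:
            # Report, trigger automated defensive activity, or both.
            print(f"ALERT: {observation} MB/hour deviates {deviation:.1f} MB from baseline")
        else:
            print(f"ok: {observation} MB/hour is within tolerance")

    check(131)  # routine hourly transfer
    check(940)  # exfiltration-scale transfer exceeds the tolerance and is flagged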

Conclusion

To reiterate, these measures do not comprise a panacea.  Instead, they represent a paradigm shift in the way computing and the internet are conceived, architected and deployed, one that offers the promise of a significant increase in security and stability.  More importantly, they represent a series of choices about how we implement and control our cyber environment.  The future, contrary to Basulto’s assumption, isn’t slaved to the past.