Friday, May 3, 2013

Parallel Tracks: National Cyber Security Policy and the Implementation of Secure Software

When it comes to cyber operations, we’re finally on target with respect to policy. Practical implementation is another matter. Fortunately, there is a way ahead, one that leverages open source offerings to control cost and increase availability.

On 26 April 2013, the Associated Press published a story proclaiming that American military academies are “grooming future officers for warfare in cyberspace.” The article highlighted the increased emphasis being placed on cyber operations by the US Department of Defense, quoting a recently commissioned US Air Force officer who had given up plans to become a fighter pilot: “It’s a challenge, and for people who like a challenge, it’s the only place to be.”

Inspirational as Lieutenant Keefer’s story is, the reality is that American military cyber-preparedness is still in its infancy. It was only in 2012 that the US Naval Academy began requiring freshmen to take a cybersecurity course or, for that matter, first offered a cyber operations major. Upperclassmen will not be required to take additional cyber-focused courses until 2014. A statement made by the Academy’s superintendent, Vice Admiral Michael Miller, that “There’s a great deal of interest, much more than we could possibly, initially, entertain” indicates not only strong student interest but, troublingly, a lack of resource allocation.

Combined, the lack of resourcing and the relatively small academic emphasis placed on cyber operations paint a very different picture from the publicized student enthusiasm. The gap is not for want of institutional agility; this is the military, where commanders and leaders have broad latitude to make rapid, sweeping changes when driven by the demands of politics or tactical realities. Today, the United States has a modular Army, women are authorized to serve in combat roles and a service member’s sexual orientation is a non-issue.

What we do not have is a cohesive national cyber operations policy that addresses force structure, operational doctrine, the implementation of information assurance policies and the cyber-operations education of leadership in the acquisitions, training, doctrine and operational communities. Compounding the problem is an inability to attract the necessary talent to the nation’s premier cyber defense organization, US Cyber Command. According to a recent Defense News article, Cyber Command is still nearly 4,000 personnel short of “a proper cyber force to adequately give capability to the national command authorities, to the COCOMs (Combatant Commands), and defend the nation.”

Fortunately, there is broad recognition at the policy-making level of the need to harden not only the national defense cyber posture, but that of critical commercial infrastructure as well. Conferences such as the recent International Engagement on Cyber, held at Georgetown University, and the upcoming Government Cybersecurity Forum, are well attended by government, industry and academia. Everyone agrees on the nature of the threat and on the need for both proactive and reactive responses. Agreement at the policy level, however, is not the same as the implementation of concrete measures and technologies that mitigate the dangers inherent in today’s connected environment.

Complicating matters is the fact that not all implementations are equal. Setting a standard is not the same as implementing it. Implementations scaled and priced for large businesses or government are often not feasible for small or medium businesses. And implementations that adversely impact productivity are bound to be resisted by organizations operating under temporal or fiscal constraints; in today’s defense and intelligence sector, that means everyone. In the end, it often seems that the only acceptable cyber defense implementations will be those that are temporally transparent to users and, as much as possible, fiscally transparent to the managers and executives of operating organizations.

Fortunately, there is a path ahead for organizations seeking such a transparent defense.  More accurately, there are two paths ahead, one that addresses runtime concerns and another that addresses design-time issues.

Runtime Concerns

Runtime for modern distributed systems is characterized by a constant flow of message traffic between system components. Typically, a message represents a request for a resource, such as a particular data entity or processing capability. These messages may adhere to any of a number of standards. At the most basic level, securing such an environment requires validating the identity of a message sender against a predefined list of users (human or machine) who are permitted to make requests against system resources. This identity validation, or authentication, may take many forms, such as a valid username and password pair, a valid digital certificate or a valid biometric signature.
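As a concrete illustration, here is a minimal authentication check sketched in Python. The sender registry, field names and salting scheme are hypothetical, chosen for brevity; a production system would validate certificates or signed tokens against a directory service rather than comparing password hashes in application code.

    import hashlib
    import hmac

    # Hypothetical registry of permitted senders: ID -> (salt, SHA-256 hash).
    REGISTERED_SENDERS = {
        "analyst01": ("salt-a", hashlib.sha256(b"salt-a" + b"s3cret").hexdigest()),
    }

    def authenticate(sender_id, password):
        """Validate a message sender against the predefined list of users."""
        entry = REGISTERED_SENDERS.get(sender_id)
        if entry is None:
            return False  # unknown sender: reject the request outright
        salt, expected = entry
        supplied = hashlib.sha256(salt.encode() + password.encode()).hexdigest()
        # Constant-time comparison avoids leaking information via timing.
        return hmac.compare_digest(supplied, expected)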

Authentication on its own is not a strong enough security mechanism. Alone, it creates an environment in which any authenticated user can access any system resource; the Wikileaks breach resulted from just such an environment. To harden systems, an access control, or authorization, scheme is often layered on top of the authentication scheme. When properly applied, authorization mechanisms enforce the principle of least privilege: authenticated users have access to only those system resources consistent with, and necessary for, their job duties. The authorization scheme preferred by the US Department of Defense (DoD) is called Policy Based Access Control (PBAC), which is synonymous with Attribute Based Access Control (ABAC).

In a PBAC scenario, an authenticated user makes a request for a resource. The request is halted by a systemic gate guard, or enforcement point. The enforcement point requests an access control decision from a decision point. The decision point evaluates information about the requestor and the resource with respect to a predefined access control policy and renders a decision, which is relayed to the enforcement point. The enforcement point implements the decision as either a go or a no-go for the request.
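The flow is easier to see in code. The sketch below, a deliberately simplified Python model rather than a real product, separates the enforcement point from the decision point; the attribute names and the sample policy are invented for illustration.

    # Hypothetical policy: permit a read only when the subject is cleared
    # to the resource's classification and assigned to its project.
    def decision_point(subject, resource, action):
        cleared = subject["clearance"] >= resource["classification"]
        same_project = resource["project"] in subject["projects"]
        if action == "read" and cleared and same_project:
            return "Permit"
        return "Deny"

    def enforcement_point(subject, resource, action, fetch):
        """Halt the request, obtain a decision, then enforce go / no-go."""
        if decision_point(subject, resource, action) != "Permit":
            raise PermissionError("request denied by access control policy")
        return fetch(resource)  # go: release the requested resource

Note that the enforcement point contains no policy logic of its own. Because every decision is delegated, the access control policy can change without touching the services being protected.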

The PBAC scenario happens transparently with respect to the user. Importantly, with modern interface definition languages such as Apache Thrift, the additional access control processing introduces very little system latency. PBAC is usually implemented through the use of open standards such as the eXtensible Access Control Markup Language (XACML) and the Security Assertion Markup Language (SAML). PBAC implementations are in use with government, military and commercial enterprises throughout the world.
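For a sense of what the standards add, the fragment below shows the shape of an XACML-style decision request, rendered as a Python dictionary instead of the standard’s XML encoding. The attribute identifiers shown are defined by the XACML specification; the values are invented.

    # Who (subject), what (resource) and how (action): the three attribute
    # groups a PBAC decision request carries. A real enforcement point
    # would serialize this per the XACML standard before sending it to
    # the decision point.
    decision_request = {
        "subject": {
            "urn:oasis:names:tc:xacml:1.0:subject:subject-id": "analyst01",
        },
        "resource": {
            "urn:oasis:names:tc:xacml:1.0:resource:resource-id": "/reports/threat-42",
        },
        "action": {
            "urn:oasis:names:tc:xacml:1.0:action:action-id": "read",
        },
    }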

As can be seen, PBAC helps to ensure the core information security principles of confidentiality (only authorized users have access to the requested resources), authenticity (only properly authenticated users can make requests for resources) and non-repudiation (the resource request is tied to a specific, authenticated user). What PBAC doesn’t do is help to ensure message integrity or system availability. As noted, the core of distributed (including service-oriented and Cloud-based) systems is the exchange of messages. It’s not difficult to imagine a scenario in which legitimate messages carry a malware payload, a problem not addressed by traditional PBAC implementations.

However, the PBAC architecture provides a useful archetype for addressing this threat. PBAC is premised on a primary gate guard, or enforcement point, that stops, or mediates, all incoming requests for an access control check. (Mediation is a standard data processing pattern whereby data in transit is operated on before it arrives at its final destination.)

Instead of conceiving of the PBAC scheme as the entirety of the security gateway, architects could conceive of it as phase one of the gateway’s mediation process. Upon successful authorization mediation, the request would pass to phase two mediation, where it would be scanned for malware payloads. Clear messages would be allowed to proceed, while infected messages would be quarantined and the administrator notified.
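Sketched in the same simplified Python style, the two-phase gateway might look like the following. The scanner, quarantine and notification interfaces are assumptions standing in for real malware scanning and alerting infrastructure.

    def security_gateway(message, authorize, scanner, quarantine, notify_admin):
        """Two-phase mediation: access control first, then malware scanning."""
        # Phase one: the PBAC check sketched earlier, applied in transit.
        if not authorize(message):
            raise PermissionError("request denied by access control policy")
        # Phase two: scan the authorized message's payload.
        if scanner.is_infected(message.payload):
            quarantine.store(message)   # infected: hold the message
            notify_admin(message)       # ...and alert the administrator
            return None
        return message                  # clear: proceed to its destination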

Design Time Concerns

For systems earmarked for use by government or military organizations, the completion of coding and functional testing is not, to paraphrase Winston Churchill, the end. It is not even the beginning of the end. It is merely the end of the beginning. Following the development effort, the system is turned over to a certification and accreditation (C&A) process that can take up to eighteen months and cost more than a million dollars. The C&A process is meant to ensure that the system complies with applicable security paradigms and standards and that it is appropriately hardened.

Problematically, the C&A process often produces a laundry list of security holes that must be patched prior to acceptance. This can result in developers closing only the gaps noted, not truly securing the system. Cybersecurity, in this case, becomes an overlay rather than something “baked into” the system from the beginning. What’s really needed is a way to demonstrate that cybersecurity and information assurance requirements are met by the software as it is being developed.

In this case, the defense industry could take a page from commercial industry’s DevOps community. DevOps principles stress continuous delivery, and continuous delivery requires that everything possible be automated, enabling continuous development, continuous integration and continuous test. The critical elements for addressing the defense C&A process are continuous, automated test and integration. In such an environment, not only the software’s functionality but also the organization’s governance principles are embodied in the automated test regime. For the defense community, those principles include the cybersecurity and information assurance requirements flowing from DoD Directive 8500.01E and related documents.

The objective DevOps environment would be instantiated as a governed, distributed, Cloud-based development platform. In this environment, when a developer checks in a code module, it is automatically tested against not only functional requirements but also the security (and interoperability and performance) requirements embodied in the platform’s test regime. If the module fails any requirement, it is rejected and the developer receives a report indicating why. The implications of such an environment are significant. Potentially, the only mechanism that needs to be formally certified and accredited is the DevOps platform itself; any software issuing from that trusted platform would be automatically certified and accredited. Because the platform could be certified independently and prior to the commencement of development activities, no separate C&A test period would be necessary for the delivered system, and fielding could begin as soon as coding was complete. This would add an unprecedented level of agility to the defense software acquisition process.
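A sketch of such a check-in gate, in the same illustrative Python style, appears below. The suite names and report structure are hypothetical; the essential idea is that security and governance requirements run as ordinary automated tests on every check-in.

    def checkin_gate(module, suites):
        """Test a checked-in module against every automated requirement suite.

        `suites` maps a requirement area (functional, security,
        interoperability, performance) to a callable that returns a list
        of failure descriptions; an empty list means the suite passed.
        """
        report = {name: suite(module) for name, suite in suites.items()}
        failures = {name: found for name, found in report.items() if found}
        if failures:
            # Reject the module and tell the developer exactly why.
            return {"accepted": False, "failures": failures}
        # All requirements met: the module joins the certified baseline.
        return {"accepted": True, "failures": {}}

Because the security suite encodes the organization’s information assurance requirements, a module that clears the gate has, in effect, already passed that slice of the C&A review.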

The advantages are magnified when the emerging government and defense mobile environments are considered.  A program might produce dozens of apps each month.  Currently, the C&A overhead associated with such a volume of independent software deliveries is, simply, crushing.  A certified, governed, DevOps style development environment would allow the rapid and continuous delivery of trusted, certified apps.

Affordability

Software packages for organizations seeking to implement transparent, effective cyber-defense mechanisms in both runtime (PBAC plus malware mediation) and design-time environments exist today. The savvy program manager’s first question will, and should, be “How much is this going to cost me?” The short answer is that there doesn’t have to be any acquisition cost at all.
A good example of the runtime solution is the WSO2 Security and Identity Gateway Solution. This solution is an implementation pattern that leverages standard SOA components, including an enterprise service bus (ESB), a governance registry, a business activity monitoring tool and an identity and access management (IdAM) component, to deliver:

  • Centralized authentication;
  • Centralized PBAC;
  • Collaboration between different security protocols;
  • Throttling;
  • Standards-based single sign on;
  • Caching;
  • Content based filtering; and
  • Schema based input validation.
An example of the design-time solution can be seen in the WSO2 App Factory.  App Factory is a governed, distributed development environment designed from the ground up to operate in the Cloud.  Effectively, it is a DevOps Platform-as-a-Service (PaaS).  It provides complete application lifecycle management in a manner consistent with organizational policies and governance.  It does so in a completely automated manner, while maintaining man-in-the-loop control.  Specific capabilities include:

  • Product and team management;
  • Software development workflow;
  • Governance and compliance;
  • Development status monitoring and reporting;
  • Code development;
  • Issue tracking;
  • Configuration management;
  • Continuous build;
  • Continuous integration;
  • Continuous automated test; and
  • Continuous deployment.
All of WSO2’s products are 100% open source, and as a result, there are no licensing fees.  The open source promise doesn’t stop there, of course.  For example:  SUSE provides a complete, open source enterprise Linux operating system as well as a Cloud environment.  PostgreSQL provides an enterprise level, spatially enabled database.  The Apache Accumulo project offers a highly scalable, fast and secure NoSQL product.  All of these products are free – as in both beer and speech.

Conclusion

An overall national policy with respect to cyber operations (and cyber warfare) remains a work in progress. That gap does not lessen the ongoing threat posed by both nation-states and non-state actors, nor should it prevent proactive members of the defense community and commercial industry from adopting software development and implementation patterns that dramatically improve an organization’s security. Such patterns can be implemented both rapidly and cost-effectively through the use of readily available open source products. More importantly, they can be implemented in such a way as to minimize disruption to the user and the organization.

