Standardizing secure by default
Building data secure by default is now technically possible, but a standard must be developed before we’re faced with a stream of fragmented, unsecured data.
"Privacy by design" and "security by design" have become common terms for building privacy and security into technology from the start, rather than bolting them on after the fact. It may seem perplexing to treat security as an afterthought, especially to those of us whose careers are dedicated to information security, but -- human nature favoring functionality first -- developers tend to wait until a technology has matured before integrating security capabilities. That mentality is changing, however, now that data breaches make headlines on a regular basis. We are finally starting to build security into networks, applications and even chips from the get-go.
A lesser-known term -- secure by default -- applies the same principle to securing data at the source. Secure-by-default data makes the case that all data should have embedded security, and that the systems that consume, process and store this data must adhere to the security policies embedded therein. This approach is less well known because it is rarely, if ever, employed. To date, we have failed to embed security into each piece of data as it is created -- a serious problem, particularly for government agencies.
The issue goes back to the connection between cybersecurity and data security. The terms “cybersecurity,” “privacy by design” and “security by design” refer to securing our systems. “Data security” and “secure by default” refer to securing our information. All are related, but securing our systems without securing the information in them is a lost opportunity that leaves us vulnerable. Even if systems are protected, the data inside them may still be compromised. We’ve seen this with many recent high-profile breaches.
Building security into data at its creation must be a priority. The Internet of Things is a reality, and all its elements -- hardware, devices and appliances such as smart watches, connected medical devices, smart thermostats, automobiles and more -- generate a new stream of fragmented, largely unsecured data that must be reined in and managed.
Setting the standard
Secure by default is technologically feasible today. Bandwidth, compute power and data size used to be problematic, but these are now addressable. Secured data storage and processing require little to no overhead. What’s needed is adoption of a standard that ensures the security of data at the time of creation. This standard must guarantee for all data that the source can be identified, the integrity can be verified and the security policies can be applied. But this isn’t as simple as it seems.
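To make those three guarantees concrete, here is a minimal sketch of what "secure by default" could look like in code: each piece of data is wrapped, at creation, in an envelope that names its source and its security policy, and the whole envelope is authenticated so consumers can verify both origin and integrity before honoring the policy. The envelope fields, function names and use of an HMAC are illustrative assumptions, not part of any published standard; a real scheme would use per-source keys from a key-management service and likely asymmetric signatures.

```python
import hmac
import hashlib
import json

# Illustrative shared secret; in practice each source would hold its own
# key, distributed via a key-management system (assumption for this sketch).
SECRET = b"demo-key"

def seal(payload: dict, source: str, policy: dict) -> dict:
    """Embed source and policy metadata at creation, then sign the envelope."""
    envelope = {"source": source, "policy": policy, "payload": payload}
    body = json.dumps(envelope, sort_keys=True).encode()
    tag = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return {"envelope": envelope, "tag": tag}

def verify(sealed: dict) -> dict:
    """Authenticate the envelope; reject data whose source or content
    cannot be verified, returning the envelope (with its policy) on success."""
    body = json.dumps(sealed["envelope"], sort_keys=True).encode()
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sealed["tag"]):
        raise ValueError("integrity or source check failed")
    return sealed["envelope"]
```

A consuming system would call `verify` before processing and then enforce whatever the embedded `policy` dictates (who may see the data, how it is classified, how its metadata is handled) -- the point being that the policy travels with the data rather than living only in the systems around it.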
There used to be organizations solely responsible for setting standards across various industries, particularly in the technology sector. For example, ASCII (the American Standard Code for Information Interchange) standardized how text was stored in binary computers. That standard served its purpose and now must give way to a new one that also addresses who can see the data, how it is classified, how metadata is handled and much more.
However, developing a unified standard today will not be as easy as standardizing ASCII text. There are so many stakeholders -- all of whom want to be involved in the process -- making agreement problematic at best. It’s the proverbial case of “too many cooks in the kitchen.” A good example of how standards-setting has gotten much more complicated is IoT; there are so many different groups creating new standards -- from technologists, to manufacturers, privacy groups and more -- that the standards are already fragmented and lack the authority to be effective.
But because we are still in the early stages, we may have the opportunity to standardize practices for securing data at its inception before the kitchen fills up with more cooks. That's why it is so important to begin the discussion now. Government has been successful in driving standards, particularly around technology. Agencies like the National Institute of Standards and Technology and the Defense Advanced Research Projects Agency have had a strong influence on what technology looks like today. But commercial industry should be involved as well, helping government create a realistic and achievable standard.
A good place to start might be to examine how the Apache Software Foundation was created: a group of individuals who understood the importance of open source software came together, outlined what the standard should look like, put it in the form of a whitepaper and vetted it with other interested groups to gain the support needed to formalize it.
IoT vendors might challenge some of the security features in a new standard or try to slow down the process so they can continue to advance their products without adhering to a specification. But as IoT appears in consumer-facing systems -- such as new cars, where reports of vulnerabilities are getting attention -- the public is becoming acutely aware of security and may start to demand this type of standard.
Security is top of mind for the general public, and industry is beginning to explore how technology can help. Building data secure by default is the way to achieve the utmost in security; today's technology makes it possible, and the standard must be developed now. Government agencies should step up to move this standard forward as IoT and open data become widespread. The big data industry stands ready to support them in this necessary step towards true cybersecurity.