GCN published an article on June 3, 2013 regarding a possible data breach of Customs and Border Protection (CBP) systems operated by third parties for security clearances. The information used to obtain a clearance is not just personally identifiable information (PII); it also recounts ten or more years of an individual's history. The potential compromise of this information is therefore a serious issue.
The recent scandals regarding surveillance by the NSA and other government agencies add to the concern. This is more than a privacy issue; it is a question of the capability to keep data secure. DHS is meant to provide the “cybersecurity” component of the government in conjunction with the DoD, but if DHS and the DoD have trouble maintaining the security of their own systems, what could a breach of the new surveillance information look like? Granted, the phone records collected from the various telecoms reportedly do not include call content, but the associated metadata could expose individuals to even greater risk than is being acknowledged. Most phones record GPS and cell tower information with each call. Add the cell phone number and owner information, and it becomes possible to track an individual's movement patterns in addition to their calls.
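To illustrate the point, here is a minimal sketch in Python, using entirely made-up call records and tower locations, of how even content-free metadata can reconstruct a subscriber's daily movement pattern:

```python
from datetime import datetime

# Hypothetical call-detail records: (phone_number, tower_id, tower_lat_lon, timestamp).
# No call content is present -- only metadata.
records = [
    ("555-0101", "T1", (40.71, -74.00), datetime(2013, 6, 3, 8, 15)),
    ("555-0101", "T2", (40.75, -73.99), datetime(2013, 6, 3, 12, 30)),
    ("555-0101", "T1", (40.71, -74.00), datetime(2013, 6, 3, 18, 5)),
]

def movement_pattern(records, number):
    """Reconstruct a chronological location trail for one subscriber
    using nothing but call metadata."""
    trail = sorted(
        (ts, tower) for num, tower, loc, ts in records if num == number
    )
    return [(ts.strftime("%H:%M"), tower) for ts, tower in trail]

# A repeated morning/evening tower suggests home; a midday tower suggests work.
print(movement_pattern(records, "555-0101"))
```

Even with three records, the pattern (same tower morning and evening, a different one midday) begins to sketch a daily routine.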
While the potential privacy issues around surveillance have their place, the government's ability to protect the data it collects is equally important.
An article by Fahmida Rashid in the November 2012 issue of SC Magazine (pp. 26–28), titled “IPS Grows Up,” discusses the changing landscape for intrusion prevention systems with a variety of experts. It includes a number of interesting topics and statistics regarding IPS, such as the following:
While IPS won’t be able to block attacks exploiting zero-day vulnerabilities or thwart skilled adversaries using sophisticated tactics, it should “prevent 99 percent of push-button or automated attacks, Al-Abdulla says.”
While many can agree with that statement, what probably would not receive a great deal of agreement was the following statement within the article:
Holden predicts IDS will “fall by the wayside” in the next three to five years.
While it is understood that IDS is detective rather than preventive, many businesses and agencies have a hard time tuning an IPS so that mission- or business-critical traffic is not disrupted. The thought that IDS will no longer be necessary seems short-sighted. Granted, most IPS devices can also operate as IDS, but if defense in depth is still a valid concept and risk is a business decision, then IDS will remain in use for the foreseeable future.
There has been a lot of news recently about the potential for a coming “Cyber Pearl Harbor”: a cyber attack that would mirror the devastation that hit the naval base at Pearl Harbor at the beginning of WWII. According to an article in CSO Magazine on October 18, 2012, the United States is concerned about such an attack. The comparison to Pearl Harbor has been around for several years, but it wasn't until a recent speech by U.S. Secretary of Defense Leon Panetta in New York that it became a prominent topic.
The article states the following:
The results of cyberattacks by a hostile nation-state on critical infrastructure like transportation, water supply or the electric grid “could be a cyber Pearl Harbor — an attack that would cause physical destruction and the loss of life,” Panetta said. “In fact, it would paralyze and shock the nation and create a new, profound sense of vulnerability.”
Panetta also invoked the image of a cyberattack on the level of 9/11. “Before September 11, 2001, the warning signs were there. We weren’t organized. We weren’t ready and we suffered terribly for that lack of attention. We cannot let that happen again. This is a pre-9/11 moment,” he said.
In a follow-up article in CSO Magazine on November 7th, the opposing viewpoint was brought forth. Many in the security industry feel that the concept and description of a Cyber Pearl Harbor is nothing more than hot air. Experts including Bruce Schneier have chimed in, and Schneier has not softened the extent to which he believes the threat to be exaggerated. According to the article:
Critics argue that not only is the threat of a catastrophic cyberattack greatly exaggerated, but that the best way to guard against the multiple risks they agree exist is not with better firewalls or offensive strikes against potential attacks, but to “build security in” to the control systems that run the nation’s critical infrastructure.
Bruce Schneier, author, Chief Technology Security Officer at BT and frequently described as a security “guru,” has not backed off of his contention made at a debate two years ago that the cyber war threat “has been greatly exaggerated.” He said that while a major attack would be disruptive, it would not even be close to an existential threat to the U.S.
“This [damage] is at the margins,” he said, adding that even using the term “war” is just a, “neat way of phrasing it to get people’s attention. The threats and vulnerabilities are real, but they are not war threats.”
The reality is probably somewhere in the middle of the two viewpoints. It can be likened to the Y2K issue a little over a decade ago: the world was going to come to an end and the dark ages would re-emerge. In reality, preparation helped minimize what little impact there may have been. Security is a risk decision, but most risk decisions are defensive in nature. The decision on a preemptive cyber capability is another aspect of the decision-making that needs to be addressed. Should the U.S. begin cyber strikes on perceived threats? What is the long-term impact of doing so? The world has already seen a small glimpse of what can be done with Stuxnet; whether these types of state-sponsored cyber attacks become the new nuclear deterrent is yet to be seen.
Regardless of the direction that gets taken, businesses need to look at potential cyber attacks and hacks as a real threat and determine what risk they are willing to accept and what will need to be mitigated. Whether the issue is the size of a country or your home computer, “measure twice, cut once” is still the best direction.
Microsoft’s new flagship operating system, Windows 8, was released at the end of October, and with its release has come a new zero-day. A recent article in SC Magazine describes how the French security firm Vupen is offering the recently discovered zero-day for sale. In fact, a mere $50,000.00 could allow you to obtain the vulnerability, which has been described as affecting the new Internet Explorer 10 browser.
According to the article:
Last week, Vupen CEO Chaouki Bekrar tweeted that “various” IE10 and Windows 8 vulnerabilities had been combined to circumvent exploit mitigation safeguards in Windows 8, which was released to the public on Oct. 26. The exploit was reportedly not disclosed to Microsoft, nor was its price made public. Vupen did reveal that the zero-day could allow a particularly skilled hacker to bypass embedded security measures, which include high-entropy address space layout randomization (HiASLR), anti-return oriented programming (AntiROP), data execution prevention (DEP) and protected-mode sandbox.
According to the article, Vupen only sells vulnerability information to governments and businesses, but this is very concerning. Because Vupen has not shared it with Microsoft, this could become a way to hold applications, businesses, and governments hostage. Secure coding needs to be a priority for developers, and time to market needs to be properly balanced against ensuring limited vulnerabilities.
A recent article in DFI News discusses some interesting research by physicists at Heriot-Watt University and the University of Strathclyde. They are working with tiny particles of light to create a new way of verifying electronic messages and transactions as authentic, helping address the huge cost of e-crime and avoiding potentially catastrophic fraud, online hacking, and theft of digital data.
The article discusses how the research shows that photons can be used to verify the security and authenticity of any transaction or communication with a “digital signature.” The article specifically states that it works as follows:
Quantum-based secure signatures mean that an “eavesdropper” — a malevolent third party listening in — cannot fake a signed message which is being sent to multiple recipients.
- The sender writes the signature with encoded light particles and sends it to the receiver
- The receiver cannot yet read the signature. However, it can be sure it received an authentic signature
- To confirm a message is authentic and to also read it, the receiver has to receive both the message (the “signature”) plus additional information required to decipher it
- The multiple receivers confirm that they have received identical signatures – only then does the sender provide the additional information required to read the signature
- This process takes place without the user (e.g. a shopper) being required to do anything differently from current security methods
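As a rough classical analogue of the steps above, the sequence can be sketched with a hash-based commit-and-reveal in Python. This mirrors only the message flow, not the quantum physics, and the function names and hashing scheme are illustrative assumptions, not the researchers' actual protocol:

```python
import hashlib
import secrets

def make_signature(message: str):
    """Sender creates a 'signature' (commitment) plus the decoding
    info (key), which is withheld until receivers agree."""
    key = secrets.token_hex(16)
    commitment = hashlib.sha256((key + message).encode()).hexdigest()
    return key, commitment

def verify(message: str, commitment: str, key: str) -> bool:
    """With the released key, any receiver can confirm authenticity."""
    return hashlib.sha256((key + message).encode()).hexdigest() == commitment

# 1. Sender distributes the same signature to all receivers.
message = "transfer 100 credits"
key, sig = make_signature(message)
receivers = [sig, sig, sig]

# 2. Receivers compare: only if everyone holds an identical signature...
assert len(set(receivers)) == 1

# 3. ...does the sender release the key needed to verify and read it.
assert verify(message, sig, key)
```

The key design point the protocol shares with this sketch is ordering: the decoding information is released only after all receivers confirm they hold identical signatures, so a third party cannot fake a signed message for some recipients but not others.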
When physicists begin looking at how they can impact and improve e-commerce, you know there is a great deal of money at stake. It will be interesting to see how this can be implemented in the real world, and also how it will be circumvented…
Millions of people are feeling the effects of Hurricane Sandy along the East Coast of the United States. Natural disasters occur all the time, all over the world, but many of the basic precautions often do not get addressed in advance. Many will ask how this topic relates to technology or security, but the answer is simple…it just does. Most businesses will have a process or procedure if a server crashes or the phone system or Internet goes out for a couple of hours, but how many businesses address the longer-term impacts of flooding, or the fact that their cloud provider lost one or more of its data centers?
The reality is that business continuity planning and disaster recovery planning should include these types of scenarios: planning for both short- and long-term outages. Whether it is an earthquake, tornado, hurricane, or flooding, the planning needs to address how the event could impact your business from a safety and financial standpoint. Take this recent storm as an example: many businesses lost power and will be flooded for days, potentially having a strong negative impact on their customers. In fact, the NYSE closed for multiple days as a result of the storm, something that has not occurred for weather-related issues since the early 1900s.
The bottom line is to plan. Make business continuity and disaster recovery a part of your process, and then test those processes. The last thing your business needs during an outage is to fall back on a process that does not work or has never been tested. Security and business both have to evaluate risk; if not being prepared is an acceptable risk, then that is the business decision you will need to make…
Are you a small business? Are you satisfied with your customer and business data security? According to a recent survey of small businesses by Symantec and the National Cyber Security Alliance, 86 percent state that they are. An article in SC Magazine published October 22 discusses some of the interesting details of the survey.
According to the article, those 86 percent are satisfied with the level of security protecting their customer and business data. In addition, 77 percent of the small businesses surveyed believe that their business is safe from any breach. According to the article, the following finding is most concerning:
However, 87 percent of respondents have not written a formal security policy for employees, 83 percent lack any security blueprint at all and 59 percent have no plan in place to respond to a security incident.
These statistics are very concerning. If you take this survey of 1,015 small businesses (250 employees or fewer) as a reasonable representation of all small businesses, the results are frightening. Even taken with a grain of salt, it is scary that most have no planning in place. One can only speculate why a business would not put a plan, even a basic one, in place. Is it the cost of security, or a “this business is too small to be hit” mindset? Whatever the rationale, it was a decision to accept the risk of compromise and breach. But as more and more businesses begin to use cloud services and other mechanisms on the Internet, they are turning from obscure local “mom and pop” businesses into ones with a footprint that can span the globe.
Preparation is always a wise decision. Whether you document that you bought a top-of-the-line next-gen firewall and intrusion prevention system, or just changed your Linksys encryption from WEP to WPA2 and changed the default admin password, a documented plan is a step in the right direction. Remember, it is important to measure twice and cut once.
In closing the following quote is something for everyone to consider:
“Invincibility lies in the defence; the possibility of victory in the attack.” — Sun Tzu
Research from Symantec was published by the ACM on October 16. The research, which was also referenced in articles in SC Magazine and Dark Reading, looks at the number and duration of zero-day attacks. Specifically:
A zero-day attack is characterized by a vulnerability that is exploited in the wild before it is disclosed, i.e., t0 > te. Similarly, a zero-day vulnerability is a vulnerability employed in a zero-day attack. Our goals in this paper are to measure the prevalence and duration of zero-day attacks and to compare the impact of zero-day vulnerabilities before and after t0.
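The timeline condition in the quote can be sketched in Python. The dates below are hypothetical, chosen only to show the calculation:

```python
from datetime import date

# The paper's condition: an attack is "zero-day" if the first exploit
# sighting in the wild (te) precedes public disclosure (t0), i.e. t0 > te.

def is_zero_day(first_exploited: date, disclosed: date) -> bool:
    return disclosed > first_exploited        # t0 > te

def window_days(first_exploited: date, disclosed: date) -> int:
    """How long the vulnerability was exploited before disclosure."""
    return (disclosed - first_exploited).days

te = date(2011, 1, 10)   # first seen in the wild (hypothetical)
t0 = date(2011, 11, 18)  # public disclosure (hypothetical)
print(is_zero_day(te, t0), window_days(te, t0))   # True 312
```

With these made-up dates the window works out to 312 days, which happens to match the average duration the paper reports.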
The research within the paper has some important implications for businesses and the need for effective patching and defense in depth within the enterprise. Specifically, the paper reached the following conclusion:
Zero-day attacks have been discussed for decades, but no study has yet measured the duration and prevalence of these attacks in the real world, before the disclosure of the corresponding vulnerabilities. We take a first step in this direction by analyzing field data collected on 11 million Windows hosts over a period of 4 years. The key idea in our study is to identify executable files that are linked to exploits of known vulnerabilities. By searching for these files in a dataset with historical records of files downloaded on end-hosts around the world, we systematically identify zero-day attacks and we analyze their evolution in time. We identify 18 vulnerabilities exploited in the wild before their disclosure, of which 11 were not previously known to have been employed in zero-day attacks. Zero-day attacks last on average 312 days, and up to 30 months, and they typically affect few hosts. However, there are some exceptions for high profile attacks such as Conficker and Stuxnet, which we respectively detected on hundreds of thousands and millions of the hosts in our study, before the vulnerability disclosure. After the disclosure of zero-day vulnerabilities, the volume of attacks exploiting them increases by up to 5 orders of magnitude. These findings have important implications for future security technologies and for public policy.
Based on these findings, it will be interesting to see whether the various technology vendors, programmers, and businesses will take this to heart and work harder to get less-vulnerable software and systems to market. Follow-on research from this paper could evaluate the cost impact associated with zero-day attacks or with vulnerabilities that were left unpatched. The reality is that security is about risk acceptance, and in some cases the cost may be deemed an acceptable risk by some businesses.