SCAP Frequently Asked Questions

Last month, we began addressing some frequently asked Security Content Automation Protocol (SCAP) questions. Now that we have clarified what SCAP is, what it consists of, and how it helps with compliance issues, let’s look at FAQs about how validation and independent testing factor in.

What is validation? The SCAP Program is responsible for maintaining the established standards and ensuring that validated products comply with them. Validation is granted once it is demonstrated that the laboratory’s testing was carried out correctly.

Who does the independent testing? Test results for validation are accepted from laboratories accredited by the National Voluntary Laboratory Accreditation Program (NVLAP). This accreditation is earned after a full review of the laboratory’s Quality Management System (QMS) and passage of technical proficiency tests.

SCAP Frequently Asked Questions

In our last discussion, we looked at the goals of automated provisioning and continuous monitoring for network security management. The National Institute of Standards and Technology (NIST) has spearheaded Security Content Automation Protocol (SCAP) efforts for the last ten years. NIST, an agency of the U.S. Department of Commerce, was founded in 1901 as the nation's first federal physical science research laboratory. In essence, SCAP is a NIST-sponsored effort that addresses both pieces (automated provisioning and continuous monitoring).

As a refresher: SCAP, pronounced “S-Cap”, combines a number of open standards that are used to enumerate software flaws and configuration issues related to security. These standards measure systems to find vulnerabilities and offer methods to score those findings in order to evaluate their possible impact. SCAP is a method for using those open standards for automated vulnerability management, measurement, and policy compliance evaluation, and it was the next logical step in the evolution of compliance automation tools for federal agencies. SCAP defines how its component standards (referred to as SCAP 'Components') are combined, and it allows results to be easily shared for Federal Information Security Management Act (FISMA) reporting and with the Office of Management and Budget (OMB), the Department of Homeland Security (DHS), and others.
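To make the automation idea concrete, here is a minimal sketch of consuming SCAP-style scan output. The XML snippet and rule identifiers below are invented for illustration; real XCCDF result documents use XML namespaces and a much richer schema.

```python
import xml.etree.ElementTree as ET

# Invented, simplified XCCDF-style result snippet; real SCAP content
# uses XML namespaces and a far richer structure.
SAMPLE = """\
<TestResult>
  <rule-result idref="rule_password_min_length"><result>pass</result></rule-result>
  <rule-result idref="rule_firewall_enabled"><result>fail</result></rule-result>
</TestResult>
"""

def summarize(xml_text):
    """Map each rule identifier to its pass/fail result."""
    root = ET.fromstring(xml_text)
    return {rr.get("idref"): rr.findtext("result")
            for rr in root.findall("rule-result")}

print(summarize(SAMPLE))
# {'rule_password_min_length': 'pass', 'rule_firewall_enabled': 'fail'}
```

Because the results are machine-readable, the same summary can feed FISMA reporting, dashboards, or automated remediation without a human reading each scan.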

Spreading the Word on Cyber Attacks

“It's not the loud pronouncements by hacking groups or the highly visible denial-of-service attacks that scare cybersecurity experts. It's silence,” notes a recent Federal Times article. The article, “Programs aim to get the word out when cyber attacks occur,” highlights the idea that one of the greatest tools against cyber attackers is the “relatively low-tech approach of sharing information about attacks.” The article goes on to describe a push for disclosure, explaining that the DoD has put forth ideas for a new Defense Federal Acquisition Regulation Supplement (DFARS) rule. The proposed DFARS rule would require contractors to provide “adequate security”, report cyber incidents within 72 hours, and review their networks for additional attack information. As always, cost tops the concerns about this communication technique. Not only would there be increased costs for the companies providing the “adequate security”, but government resources would also have to be tapped to provide data analysis and enforcement of any resulting mandates.

Current State of Information Security | Part 2

Part 2 of 2: A few weeks ago, we looked at the current state of information security and implementations from the Ten Domain Model. Using this information, we can now look at where we need to be. Due to the rapidly changing threat landscape, two key requirements for information security are becoming increasingly critical: automation and continuous monitoring.

1) Why automation? Only automated approaches can scale and respond rapidly to large-scale incidents.
   a. Preventative policy enforcement reduces risk by cutting:
      i. the overall number of security vulnerabilities, and
      ii. the success rate of any particular attack technique.
   b. Automated remediation systems have a positive impact on a large number of hosts with a relatively small time investment from computing staff.

2) Why continuous monitoring? A primary goal of continuous monitoring is, as much as is practicable, to apply automated remediation to the security vulnerabilities that are found. That takes the need for human intervention out of the picture. Human intervention, and the errors and delays that result from it, is blamed for many of the lapses in IT security.
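As a rough illustration of the second point, automated remediation can be as simple as a loop that detects a policy violation and applies the fix without waiting on a person. The host records and the `ssh_root_login` setting below are hypothetical stand-ins for state a real monitoring agent would collect:

```python
# Hypothetical host state; a real continuous-monitoring system would pull
# this from agents or scheduled scans.
hosts = [
    {"name": "web01", "ssh_root_login": True},   # violates policy
    {"name": "db01", "ssh_root_login": False},   # compliant
]

def remediate(hosts):
    """Find hosts violating the no-root-SSH policy and fix them in place,
    returning the names of the hosts that were changed (no human in the loop)."""
    fixed = []
    for host in hosts:
        if host["ssh_root_login"]:
            host["ssh_root_login"] = False  # automated remediation step
            fixed.append(host["name"])
    return fixed

changed = remediate(hosts)
# changed == ['web01']; both hosts now satisfy the policy
```

The same pattern scales to thousands of hosts because the detection and the fix are both code, which is exactly why automation is the only approach that keeps up with large-scale incidents.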

DLT Teams up with Symantec for Data Security Solutions

In 1991, we set out to align ourselves with the most prominent IT software and hardware manufacturers in the world. The solutions offered by our best-in-class vendor partners would allow us to confidently support our public sector clients to the best of our ability while helping them achieve their agency missions. That objective took us to 2001, when DLT aligned itself with VERITAS, the market leader in data storage, through VERITAS’s GSA Agent Program. The GSA Agent Program was designed to give any authorized VERITAS reseller a vehicle for Federal Government business when no contract vehicle existed.

Shakin’ IT up at Innovation Nation 2011

It is already the second half of August, and we are quickly approaching a busy conference season for DLT. Upcoming events will take DLT all over the country, but some of the best are local ones happening just down the road. The annual Innovation Nation Forum, hosted by MeriTalk, will take place Tuesday, August 23 at the Washington Convention Center. Aiming to “Shake IT Up,” Innovation Nation will focus on three federal IT hot topics: cloud computing, cybersecurity, and data center consolidation.

Risk vs. Security

It is interesting that there is no equivalent term in Latin for risk beyond the word for danger. While security is the state of being free from danger or threat, risk is a more complex topic and cannot be addressed without the concept of loss. It is the probability, not merely the possibility, of something unpleasant or unwelcome happening that will result in a loss of some kind (life, liberty, property). The term did not even come into existence until the 17th century, after the Medici had leveraged Eastern mathematics to calculate probability in financial terms, and even then the word risk was derived from the word for danger. Big mistake!
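That distinction between possibility and probability is the foundation of quantitative risk analysis. One standard formulation (not from this post, but common in the security field) expresses risk as expected loss, e.g. annualized loss expectancy: ALE = single loss expectancy × annual rate of occurrence.

```python
def annualized_loss_expectancy(single_loss_expectancy, annual_rate_of_occurrence):
    """Risk as expected loss: ALE = SLE * ARO (standard quantitative risk model)."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical numbers: an incident costing $50,000, expected once every 4 years.
ale = annualized_loss_expectancy(50_000, 0.25)
# ale == 12500.0
```

Framing risk this way forces the loss term into the open, which is exactly what the bare notion of "danger" leaves out.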

Keeping Enterprise IT Systems Secure

“Good security doesn’t stop with just an anti-virus client and a perimeter firewall.” Government Security News (GSN) recently published an article by DLT engineer Aaron Payne, “Security back to the basics: Managing the threat,” which makes the point that many layers are necessary to keep enterprise IT systems secure.

Security Back to Basics: Managing the Threat (part 3b)

In previous blogs, we talked about the need to educate end users and to know the details of what activity is occurring on your enterprise’s systems. In part 3, we’re going to talk about compliance and endpoint management. Simply put, compliance is setting a policy and measuring how well you adhere to it. If a policy is set to only allow passwords longer than 8 characters in your enterprise, compliance is the measurement of how well that policy is enforced. Any deviations or exceptions from the policy should be clearly documented and recorded. So why is compliance important? A well-developed endpoint security policy ensures that common attacks and threats can be mitigated before they happen. By adhering to that policy, you are protected from those common attacks before any other controls even come into play. There are many examples of compliance guidelines, such as NIST SP 800-53 and the Federal Desktop Core Configuration (FDCC).
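Using the password-length policy from the example above, compliance measurement reduces to a simple ratio of adherent systems to systems checked. A minimal sketch (the observed lengths are made-up sample data):

```python
MIN_LENGTH = 8  # the policy from the example: only passwords longer than 8 characters

def compliance_rate(password_lengths):
    """Fraction of observed passwords that adhere to the length policy."""
    compliant = sum(1 for n in password_lengths if n > MIN_LENGTH)
    return compliant / len(password_lengths)

# Made-up lengths observed across four accounts.
rate = compliance_rate([12, 9, 6, 15])
# rate == 0.75 -> one account deviates and should be documented as an exception
```

Tracking that number over time is what turns a written policy into measurable enforcement.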