Monday, April 29, 2013

UK likely to outsmart Obama on cyber security? Think again

In the April 26th article for V3 titled "UK government likely to outsmart Obama on cyber security", +Alastair Stevenson opined:
"While the US [cyber security] spending does dwarf that of the UK, I'm still convinced the British government will get more bang for its buck, thanks mainly to its more measured focus on education and collaboration.
Obama is yet to release the full details about where the US money will go, but given the nation's track record when dealing with new threats to its borders or citizens, it's unlikely much of it will reach the country's education system. "

In Stevenson's cursory analysis of U.S. cyber security spending, I believe he has made a number of mistakes. First, he states that "Barack Obama followed suit" in increasing cyber security spending after the U.K. announced its Cyber Strategy in November of 2011. In fact, Obama's focus on cyber security goes back to at least May of 2009, when the White House published its "Cyberspace Policy Review". This 30-page document focuses almost exclusively on cyber security, summarizing the administration's policy and proposing action plans to improve cyber security across both the public and private sectors. Federal funding for cyber security has been increasing steadily year-over-year according to the plans laid out in the policy review.

Stevenson seems to focus exclusively on the increase in cyber security funding within the U.S. Department of Defense, including the Air Force and DARPA, while ignoring the significant increases in funding for other cabinet-level agencies, including the Department of Justice, the Department of Homeland Security, and the Department of Commerce (which includes NIST). No wonder, then, that Stevenson doubts that "much of [the funding] will reach the country's education system".

In the U.S., the Department of Defense isn't responsible for cyber security education. That job falls more to NIST and DHS. In my blog post last week, I broke down the NIST cyber security spending and provided an overview of NIST's already significant cyber security mission. Both NIST and DHS play significant roles in cyber security education and collaboration - this has recently expanded to include NIST's National Initiative for Cybersecurity Education (NICE) and the DHS's National Initiative for Cybersecurity Careers and Studies (NICCS).

The U.S. is already years ahead of the U.K. when it comes to public-private cyber security coordination and education. What remains to be seen is which efforts (in both countries) end up being worth the investment of taxpayer dollars.

Tuesday, April 23, 2013

United States spending on federal cyber security grows in Obama's new 2014 budget (part 1)

Obama's proposed federal budget for 2014 includes broad cuts to a number of departments and programs, including funding cuts of 34.8% for the Department of Homeland Security, 17.7% for the Department of State, and 8% for the Department of Defense.

Despite these cuts, one area the new budget doesn't skimp on is cyber security. The President has consistently called for increased focus on cyber security across both public and private sectors, declaring "The cyber threat is one of the most serious economic and national security challenges we face as a nation".

This policy is reflected in Obama's 2014 budget, the introduction to which states:
"We must also confront new dangers, like cyber attacks, that threaten our Nation’s infrastructure, businesses, and people. The Budget supports the expansion of Government-wide efforts to counter the full scope of cyber threats, and strengthens our ability to collaborate with State and local governments, our partners overseas, and the private sector to improve our overall cybersecurity."

This blog post series will examine the increases in cyber-security spending across each federal agency in the 2014 budget. We will start with the Department of Commerce.

Department of Commerce

The Department of Commerce will allocate $754M (an increase of $131M from the 2012 enacted level) to the National Institute of Standards and Technology (NIST), a good chunk going towards NIST's cyber security mission:
"This funding will accelerate advances in a variety of important areas, ranging from cybersecurity and smart manufacturing to advanced communications and disaster resilience."

NIST's own 2014 budget request contains more details about its cyber-security spending, including the following increases:

When it comes to R&D and Standards (the first line item above), NIST already has a well-established role. NIST is the main agency responsible for approving cryptographic standards used all over the world, including the Advanced Encryption Standard (AES) and the various secure hashing algorithms we've all come to know and love. Much of the rest of the world takes its cue on approved cryptographic practices from NIST.

In addition to its cryptographic mission, NIST is responsible for developing security standards and policies for government agencies through its "Special Publications" series, including most notably:

I recommend that you browse the complete list of NIST's special publications, as there are some good resources there.

NIST runs the NVD (National Vulnerability Database) and the CSRC (Computer Security Resource Center). More information about NIST's computer security initiatives can be found on the NIST Computer Security Division site.

NIST maintains some technical standards related to security automation and the interoperability of security tools like the ones we develop at Rapid7. This family of related standards includes SCAP (Security Content Automation Protocol), OVAL (Open Vulnerability Assessment Language), and XCCDF (Extensible Configuration Checklist Description Format).
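To make the relationship between these standards a bit more concrete, here is a minimal, hypothetical XCCDF rule that delegates its actual test logic to an OVAL definition. Every id, title, and file name below is invented for illustration; real SCAP content is considerably more verbose.

```xml
<!-- Hypothetical XCCDF rule: the ids, href, and title are made up for illustration -->
<Rule id="xccdf_org.example_rule_min-password-length" severity="medium" selected="true">
   <title>Passwords must be at least 12 characters</title>
   <!-- The check itself is expressed in OVAL and referenced from the XCCDF rule -->
   <check system="http://oval.mitre.org/XMLSchema/oval-definitions-5">
      <check-content-ref href="example-oval-definitions.xml" name="oval:org.example:def:1"/>
   </check>
</Rule>
```

An SCAP-capable scanner reads the XCCDF checklist, resolves each check-content-ref into the referenced OVAL document, evaluates the low-level tests there, and reports a pass/fail result for each rule.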

In the next part of this series, we will look at the Department of Defense's proposed increases in cyber-security spending.

Monday, April 22, 2013

Microsoft's EMET 4.0 - a free enterprise security tool for blocking Windows exploits

Last week Microsoft announced their 4.0 beta release of EMET (Enhanced Mitigation Experience Toolkit). If you are responsible for securing Windows systems, you should definitely be looking at this free tool if you haven't already.

EMET is a toolkit provided by Microsoft to configure security controls on Windows systems making it more difficult for attackers to successfully launch exploits. EMET doesn't take the place of antivirus or patch management, but it does provide an important set of safeguards against not only existing exploits, but also against future 0-day exploits which have yet to be developed or released. Even the best signature-based antivirus programs don't do a good job at protecting from 0-days.

EMET allows administrators to exercise fine-grained control over Windows' built-in security features on Windows 7 and higher, including:

While DEP and ASLR have been supported by Microsoft since Windows XP SP2 and Windows Vista (respectively), one of the main weaknesses of these mitigations is that existing applications needed to be recompiled by the developer to "opt in" to these security controls. A great benefit of EMET is that it allows administrators to "force" DEP and ASLR onto existing legacy applications.

While there are many exploits out there which bypass DEP and ASLR, it's worth noting that the first versions of these exploits are sometimes thwarted by these controls, which buys you some time for either patches or antivirus detection to become available. There are good reasons why the Australian DSD (Defense Signals Directorate) has included DEP and ASLR on its "Top 35 Mitigations" for two years running.

EMET 3.0 and 3.5 introduced the ability to manage EMET via GPO, putting installation and configuration within reach of the enterprise. EMET 4.0 builds on this feature set and adds some very useful new protections, including:

  • SSL certificate pinning - allows mitigation of "man-in-the-middle" attacks by detecting situations where the Root CA for an SSL certificate has changed from the "pinned" value configured in EMET. For example, you can configure EMET to say "there is only a single trusted root CA that should ever issue certificates for this domain, and if I see a certificate for any FQDN in that domain from a different CA, report it as a potential man-in-the-middle attack." You can pin the CA for entire domains or for individual certificates. The EMET 4.0 beta ships with pinned certificates for a few well-known sites, but administrators can add their own.
  • Enhanced ROP mitigation. There is a never-ending arms race between OS and application developers on the one side and exploit developers on the other side. When a new mitigation technique is developed by Microsoft, clever exploit developers work hard to find ways to bypass the mitigation. In the case of ROP mitigations, EMET 3.5 included some basic ROP mitigations that blocked assembly language "return" calls to memory addresses corresponding to known lists of low-level memory management functions in certain DLLs. This rendered a common exploit technique ineffective. However, exploit developers responded with adjusted techniques to bypass EMET's ROP mitigations, such as returning into the memory management code a few bytes beyond the function prologue. I don't have enough time or space to do this fascinating topic justice, but you can read a good overview of ROP exploit techniques here.

    EMET 4.0 blocks some of these mitigation bypass techniques, which puts the onus back on exploit developers in this cat-and-mouse game. I'm looking forward to the first white paper detailing how the new mitigations can be bypassed.
  • Improved logging. With the new and improved EMET notifier agent, EMET 4.0 does a much better job at logging events to the Windows event log. This opens up the possibility of using a centralized event log monitoring system such as Microsoft Systems Center Operations Manager (SCOM) 2012 to act as an enterprise-wide early detection system for exploit attempts. Imagine having instantaneous alerting any time EMET blocked an attack on any Windows system across the enterprise.

    One could also use a free tool like event-log-to-syslog to gather event logs centrally, or even something like Splunk (with universal forwarders) if you don't mind breaking the bank.

    Another benefit of centrally logging and analyzing EMET events is that it will give you early warning on EMET compatibility problems. Past versions of EMET have been known to cause problems with certain applications; for example, I found that the LastPass extension for Chrome needed certain EMET settings disabled in order to run. If you haven't used EMET before in your enterprise, you will definitely want to introduce EMET in a limited rollout before going enterprise-wide via GPO. Note any programs requiring exemption or settings customization and make sure those settings are reflected in the GPO policy.
Update 4/22/2013: +gaten guess was nice enough to point out that ASLR was introduced in Vista, not Windows XP, so I clarified my comments above. Many of these controls work poorly or not at all in XP, so it goes without saying that if you're running Windows XP anywhere in your enterprise, EMET should be the least of your worries. :)

Tuesday, April 9, 2013

JavaScript static analysis and syntax validation with Google Closure compiler

A while back, I found myself needing automated syntax checking and static analysis for JavaScript code. I first tried JSLint, which has a Maven plugin, but found it less than ideal: it tends to be noisy, it is fairly hard to tune, and it does a poor job of grokking the syntax and constructs of 3rd-party libraries such as jQuery and YUI.

I played around with a few different options and ultimately settled on the Google Closure Compiler. This is a JavaScript minifier/optimizer/compiler which also does (of necessity) a good job at syntax validation and error checking.

I ended up writing an Apache Ant task to invoke Closure on parts of the project source tree, excluding known third-party libraries from analysis. I'm reasonably happy with the results, although I'm sure one day this should be migrated to a Grunt task using the Grunt Closure Compiler plugin.

Without further ado, here is the Ant task definition. Hopefully the in-line comments make the usage pretty clear -- let me know if you find this useful or if you have any questions! Note that this task definition assumes that the Closure compiler JAR file is located in the Ant lib directory.

<!--
   <timed-audit-task> is a reusable macro to run a specific audit tool against the source code,
   storing its output under @{audit-output-dir}. The output directory will be deleted and recreated
   prior to running the audit tool. Some basic logging and timing statements are added for clarity
   and profiling.

   To skip the running of a specific tool, the person invoking ant can specify -Daudit-skip-<toolname>,
   where <toolname> is the value passed in to the @{audit-task-name} parameter. By convention this should
   be the short name of the tool, for example "findbugs", "checkstyle", or "pmd". Thus, invoking ant with
   -Daudit-skip-findbugs=1 will cause the findbugs audit tool to be skipped. The actual value of the defined
   property is irrelevant.
-->

<macrodef name="timed-audit-task">
   <attribute name="audit-task-name"/>
   <attribute name="audit-output-dir"/>
   <element name="auditTaskBody"/>
   <sequential>
      <!-- <if> and <stopwatch> require the ant-contrib task library -->
      <if>
         <not><isset property="audit-skip-@{audit-task-name}"/></not>
         <then>
            <echo>Running @{audit-task-name} on ${ant.project.name}</echo>
            <stopwatch name="audit.timer.@{audit-task-name}" action="start"/>
            <delete dir="@{audit-output-dir}"/>
            <mkdir dir="@{audit-output-dir}"/>
            <auditTaskBody/>
            <echo>Finished running @{audit-task-name} on ${ant.project.name}, see @{audit-output-dir}</echo>
            <stopwatch name="audit.timer.@{audit-task-name}" action="total"/>
         </then>
         <else>
            <echo>Skipping @{audit-task-name} because the "audit-skip-@{audit-task-name}" property is set</echo>
         </else>
      </if>
   </sequential>
</macrodef>
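
For illustration, here is a hypothetical usage of the macro wrapping a different tool: FindBugs. The taskdef setup is omitted, and the directories and properties below are invented for this sketch -- they are not part of my actual build file.

```xml
<!-- Hypothetical usage: wrap a FindBugs run in the macro.
     The findbugs.* and build.dir properties are illustrative only. -->
<timed-audit-task audit-task-name="findbugs" audit-output-dir="${findbugs.dir}">
   <auditTaskBody>
      <findbugs home="${findbugs.home}" output="xml" outputFile="${findbugs.dir}/findbugs.xml">
         <class location="${build.dir}/classes"/>
         <sourcePath path="${source.dir}/java"/>
      </findbugs>
   </auditTaskBody>
</timed-audit-task>
```

Invoking ant with -Daudit-skip-findbugs=1 would then skip this block entirely, per the macro's convention.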

<target name="audit-js" description="Runs source code auditing tools for JavaScript">
      <!--
         JavaScript auditing with the Google Closure compiler.
         Warning flags are defined at
         The order of parsing JS files is somewhat important here. You should try to pass
         filenames in the rough order they would be parsed by a browser visiting your site or application.

         We use Google's provided "extern" annotated version of jQuery 1.9 to provide additional
         strict error checking. See for more information.

         The best place to find documentation on command-line options for the compiler is
      -->

      <!-- Exclude known 3rd party scripts from analysis by filename or path -->
      <selector id="audit.js.3rdparty.selector">
         <or>
            <filename name="scripts/jquery/jquery-*.js"/>
            <filename name="scripts/yui/**/*.js"/>
         </or>
      </selector>

      <path id="audit.js.3rdparty.path">
         <fileset dir="${source.dir}/html/scripts">
            <selector refid="audit.js.3rdparty.selector"/>
         </fileset>
      </path>

      <!-- Include our JS source code to be analyzed, excluding 3rd-party stuff defined above -->
      <path id="audit.js.source.path">
         <fileset dir="${source.dir}/html/scripts">
            <and>
               <filename name="**/*.js"/>
               <not>
                  <selector refid="audit.js.3rdparty.selector"/>
               </not>
            </and>
         </fileset>
      </path>

      <!-- Pipe compiler output to /dev/null in a platform-sensitive way -->
      <condition property="dev.null" value="NUL" else="/dev/null">
         <os family="windows"/>
      </condition>

      <pathconvert pathsep=" " property="closure.args" refid="audit.js.source.path"/>
      <timed-audit-task audit-task-name="closure-js" audit-output-dir="${closure.dir}">
         <auditTaskBody>
            <java jar="${ant.home}/lib/closure-compiler.jar" output="${dev.null}" error="${closure.dir}/closure-warnings.txt" fork="true">
               <arg value="--jscomp_warning=checkRegExp"/>
               <arg value="--jscomp_off=checkTypes"/>
               <arg value="--jscomp_off=nonStandardJsDocs"/>
               <arg value="--jscomp_warning=internetExplorerChecks"/>
               <arg value="--jscomp_warning=invalidCasts"/>
               <arg value="--jscomp_off=externsValidation"/>
               <arg value="--process_jquery_primitives"/>
               <arg value="--js"/>
               <arg line="${closure.args}"/>
            </java>
         </auditTaskBody>
      </timed-audit-task>
</target>

Mitch McConnell's leaked strategy recording has staff crying "bugged"

Today, Mother Jones magazine features a leaked recording of Senate Minority Leader Mitch McConnell's private strategy session, in which his insiders discuss ways to beat Ashley Judd should she run for his seat. Aside from the Nixonian element to the story, and the frankness with which they discussed using Ashley's mental health issues against her in a campaign, there is an interesting security-related angle here.

The meeting consisted only of a small group of loyal insiders, and all deny having recorded the session. Sen. McConnell's office is asking the FBI to investigate: "Obviously a recording device of some kind was placed in Senator McConnell’s campaign office without consent."

Joan Goodchild writes in her blog for CSO Magazine "McConnell’s campaign all adamantly deny any involvement in the recording of the sessions (and its consequential leaking). They are working with the FBI on an investigation into how it happened. But my gut tells me they need to look inward again and evaluate the people they consider allies and consider who may be a potential insider threat."

Eric Wemple from the Washington Post blogs "Let’s just roll with the bug scenario. For the sake of some legal entertainment, suppose that someone, in the wee hours of Feb. 2, broke into this secure location via ductwork, expertly fiddled with ceiling tiles and planted a pea-size device in one of the room’s grommets."

I wonder whether anyone is considering a simpler scenario. Did the room contain a Polycom conference phone system? Back in 2012, my colleague HD Moore published his research into conference phone vulnerabilities, which was covered widely by the mainstream press. There were several scenarios which allowed anyone with a telephone or web browser to silently call into the Polycom and use it to listen to the room and to watch video (for camera-enabled systems) without anyone knowing. It's not too much of a stretch to think that something similar happened here -- it's certainly more plausible than a Watergate-style bugging of a secure room in the capitol.