DIA ramps up innovation, acquisition efforts
Defense Intelligence Agency Director Lt. Gen. Michael Flynn said one out of every three dollars his agency spends on technology innovations goes to small business. (Photo: Dan Verton)

The Defense Intelligence Agency announced Tuesday that a key component of its multifaceted crowdsourcing effort is ready to begin testing new technologies against a high-fidelity emulation of the agency’s computing environment.
DIA offered a preview of the Open Innovation Gateway during the first day of its annual Innovation Day at its headquarters in Washington, D.C. A demonstration of the system is scheduled for Wednesday.
Under development for the past year, the gateway is an open-development environment that fully emulates DIA’s information architecture and the operational systems used by DIA analysts around the world (without real-world classified data), allowing innovators to test and prove new concepts and technologies. It also supports DIA’s ongoing efforts to cast the widest net possible when searching for new technologies and try new systems and concepts before purchasing them.
The gateway incorporates protections for intellectual property, preventing competitors from stealing each other’s secrets, as well as keeping nation state and terrorist adversaries from learning about DIA’s current operational shortfalls or capabilities. But the central mission of the gateway is to help DIA find innovations and speed up the process of testing and acquiring new technologies.
“Our intelligence community programs operate at such a turtle’s pace that we have a challenge staying ahead of the adjustments that need to be made,” DIA Director Lt. Gen. Michael Flynn said June 24 at the Innovation Day, which attracted more than 500 attendees from 135 companies. “The stuff is not that agile. It does not provide the flexibility.”
Dan Doney, DIA’s chief innovation officer, is the primary force behind the Open Innovation Gateway and a complementary effort known as NeedipeDIA, an online portal that allows DIA to communicate needs to the innovator community and receive proof-of-concept proposals. Doney said the lack of agility in the intelligence community’s acquisition process is a serious weakness that must be fixed.
“I have not done what I’m here to do as long as there are better capabilities out there that are not in the hands of our mission users,” Doney said. “It’s not acceptable to me that the best of what’s available is not available to our people. That is why I’m here.”
DIA launched the NeedipeDIA effort in January in a direct effort to speed up the acquisition process. The agency awarded the first contract under the new model June 20 and released two new mission needs this week dealing with emerging capabilities in activity-based intelligence and future megacity operations.
“We have to become much more agile,” Doney said. “We have to be able to try out new capabilities that exist almost instantaneously. If it’s true that a year from now capabilities that we are seeing right now are going to be displaced by altogether new capabilities, waiting a year to get that capability is simply unacceptable.”
Defense Intelligence Agency Chief Innovation Officer Dan Doney explains the agency’s Open Innovation Gateway and NeedipeDIA initiatives at the 2014 Innovation Day June 24 at DIA headquarters. (Photo: DIA)

But Doney acknowledged that when DIA posts something to NeedipeDIA for industry to review, potential adversaries can also see those requirements. As a result, the needs it outlines in the open are limited to broad categories. “There are things that we cannot ask for,” Doney said.
To compensate for this limitation, DIA released a classified version of NeedipeDIA on May 23. The first classified need is now under consideration, and there will be many more coming in the near future, Doney said.
Although DIA must keep sensitive requests out of the hands of adversaries, it still must find a way to communicate those requirements to non-traditional service providers who do not have a security clearance. That capability has been earmarked for the next phase of NeedipeDIA, which Doney said should be available this fall.
So far, NeedipeDIA has helped DIA engage 250 private sector innovators. According to Doney, nearly 80 percent of the companies DIA has interacted with are non-traditional vendors. “We’re reaching out to folks we haven’t done business with before,” he said. And more than 20 percent of the initial white paper submissions go to the next phase of review, including interviews and demonstrations.
But DIA hasn’t always delivered on its promise of a 45-day response time to white papers, Doney acknowledged. “We will not let this continue,” he said. “We’re looking to automate the process. We will not let any ideas slip through the cracks.”
NASA launches open data challenge with Amazon Web Services
NASA, with the help of Amazon Web Services, is opening up its OpenNEX platform for an Earth Science challenge. (Credit: NASA)

NASA, using Amazon Web Services’ cloud, is challenging the public to create ways to study the planet’s actual clouds.
NASA announced its earth sciences challenge Tuesday aimed at harnessing its OpenNEX platform, a collection of open climate and earth science data — hosted on Amazon Web Services — through which NASA scientists share modeling, scientific results and analysis codes.
The challenge will take place in two stages. The first is an “ideation” stage in July, during which groups will pitch novel uses of OpenNEX’s data, earning up to $10,000 for their ideas. The “builder” portion of the challenge, starting in August, will award groups between $30,000 and $50,000 for an app or algorithm that promotes climate resilience. The winner will be chosen in December.
Earlier this year, NASA hosted a similar two-day challenge geared toward building open-source solutions for space exploration. After four of the projects in NASA’s Space Apps Challenge won awards, project manager Tsengdar Lee said NASA wanted to “do something more grand” with this challenge.
“There’s only so much you can do in two days,” Lee said in an interview. “What we are looking for is more involved.”
Lee talked about how the cloud allows for challenges like this, which in turn gives the public a chance to quickly innovate.
“Think about our current model,” Lee said. “If you want to use our tools, our datasets, you go to a NASA-hosted facility. You want to apply for an account? There’s a very, very involved process. They will check your credentials, your nationality. For some students and citizens out there, it would take months. Going to Amazon, there’s no such issue. It’s open dataset, open platform.”
NASA satellite data incorporated into OpenNEX shows global views of drought conditions. Data from the OpenNEX platform will be used in NASA’s Earth Science challenge. (Credit: NASA Earth Exchange)
Harnessing AWS’s capabilities is something Lee highlighted Tuesday during a symposium at the Washington Convention Center. After hearing about the National Institutes of Health’s 1000 Genomes Project two years ago, he “got excited” about the possibility of what AWS could do with NASA’s data, including turning it into “science as a service.”
AWS has allowed NASA to release a wide range of data from its NEX platform, including climate projections spanning back to 1950, North American forest disturbances and drought monitoring in California.

But Lee said he recognized what AWS was truly capable of when NASA used it to study droughts in, coincidentally, the actual Amazon.
“Back in 2005, when [scientists] did the study, it took them 24 months,” Lee said. “We captured that entire workflow. When we reproduced the result, we figured out there was a stat that wasn’t done correctly.” Using the captured workflow, they reproduced the study using AWS in six months.
“It’s no longer cost prohibitive to do the big things,” said Ariel Gold, a program manager for Amazon’s global public sector. “It’s no longer technically prohibitive to do the big things.”
Beyond technology, Amazon is providing education and research grants for the challenge to help researchers overcome barriers to accessing or using AWS. NASA has also released a number of online lectures and lab modules ahead of the challenge.
“I want science throughput,” Lee said. “The goal is to get information out of our datasets and get decision-enabled information, build up the knowledge and enable decision making.”
Those interested in registering for the challenge can sign up via NASA’s website.
FedWire: NASA’s earth science challenges, social media for educators and NTIA promotes open Internet
FedWire is FedScoop’s afternoon roundup of news and notes from the federal IT community. Send your links and videos to tips@fedscoop.com.
NASA opens data for earth science challenges.
Education Department releases social media tip sheet for educators.
NTIA on promoting an open Internet.
FDA issues guidance for nanotechnology.
NSF to showcase autonomous car in D.C.
Big data initiative shines in surprising places
The Obama administration probably didn’t envision local hunting, fishing and boating commissions as examples of how the government can harness big data.
Neither did Waldo Jaquith, director of the U.S. Open Data Institute, when the president asked the Office of Science and Technology Policy to study the power of big data in January. Yet, at a workshop held by OSTP last week, Jaquith spoke of hunting and fishing regulations as a shining example of where big data is headed.
Jaquith was one of many public and private sector officials June 19 who attended a workshop at Georgetown University to discuss the opportunities and challenges big data presents for the government. The workshop was one of six meetings arranged by the OSTP to discuss big data and the first since the White House released its big data and privacy report on May 1.
During a panel discussion, Jaquith told an audience the hunting and fishing data serves as a good example of how his organization tries to release datasets that have clear value to the public, but have not previously been made available.
“A Venn diagram of people who care about open data and people who hunt and fish is just two circles,” he said. “This is a problem. What we’re trying to do is identify unintentional biases…and trying to find those new types of data and create these new open data ecosystems where we can identify data that’s useful.”
Several useful applications of big data across government were on display during the panel, which featured a number of agency experts highlighting how big data has helped them be more effective.
Rajive Mathur, director of online services for the Internal Revenue Service, said the IRS now looks at data from a customer service perspective.
“It’s all about: How do we allow the taxpayer to take the information that’s theirs and do what they need to do in order to meet their tax obligations?” Mathur said. “This is your information, and it needs to be secure, and you need to have access to it.”
Mathur spoke about how the IRS has changed its model for data under the Get Transcript program, which allows taxpayers to securely access tax information in an instant, as opposed to waiting up to a week for the same information by mail.
Mathur said Get Transcript allows tax info that’s needed elsewhere, like for federal student aid forms or a mortgage application, to be ported directly into those forms, instead of having users type it in themselves.
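The porting Mathur describes is essentially a field-mapping problem: retrieve the taxpayer's transcript once, then translate its fields into whatever a downstream form expects. A minimal sketch of that idea follows; all field and form names here are hypothetical stand-ins, not the IRS's actual Get Transcript data model or API.

```python
# Illustrative only: field names below are invented for the sketch,
# not taken from the IRS Get Transcript service.

TRANSCRIPT = {  # stand-in for data a taxpayer pulls via Get Transcript
    "agi": 52000,
    "taxes_paid": 4300,
    "filing_status": "single",
}

# Each downstream form declares which transcript fields it needs
# and what it calls them locally.
FORM_FIELD_MAPS = {
    "student_aid": {
        "adjusted_gross_income": "agi",
        "income_tax_paid": "taxes_paid",
    },
    "mortgage_application": {
        "annual_income": "agi",
    },
}

def port_transcript(form_name, transcript):
    """Fill a form's fields directly from the transcript,
    instead of making the user re-key the values."""
    mapping = FORM_FIELD_MAPS[form_name]
    return {field: transcript[src] for field, src in mapping.items()}

student_aid = port_transcript("student_aid", TRANSCRIPT)
```

The design point is that the user's data is entered (and validated) once at the source, and every consuming form reuses it through an explicit mapping rather than free-form copying.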
At the Department of Veterans Affairs, data is being used on a more granular level for the department to meet its strategic goals.
Rosye Cloud, a senior advisor for veteran employment at the VA, said in years past, data was increasingly scattered, causing “imperfect gaps” in data moving between agencies. Now, things have become much more streamlined, which has allowed the VA to create new strategies for addressing veteran unemployment.
“By partnering with the Department of Labor and the Department of Defense, we started understanding more about not only those who are unemployed, but we also are starting to ask deeper questions,” Cloud said.
Cloud also touted public-private partnerships that helped VA harness data, including one with the National Student Clearinghouse and Student Veterans of America that examined persistence rates among vets.
Zach Goldstein, acting CIO for the National Oceanic and Atmospheric Administration, spoke about how his agency plans to build a public-private partnership around the 20 terabytes of information the agency produces every day. While only 10 percent of that data is available to the public, Goldstein envisions the rest of it being used in conjunction with other data in the private sector to create predictive analysis tools.
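Goldstein's figures imply a large untapped reserve. A quick back-of-the-envelope calculation, taking the article's 20 TB/day and 10 percent figures at face value, gives a sense of the scale:

```python
# Rough arithmetic on the figures Goldstein cited: NOAA produces
# about 20 TB of data a day, of which only ~10 percent is public.

DAILY_OUTPUT_TB = 20
PUBLIC_SHARE = 0.10

public_tb_per_day = DAILY_OUTPUT_TB * PUBLIC_SHARE          # 2 TB/day public
withheld_tb_per_day = DAILY_OUTPUT_TB - public_tb_per_day   # 18 TB/day not public
withheld_tb_per_year = withheld_tb_per_day * 365            # ~6,570 TB/year
```

By this arithmetic, roughly 6.5 petabytes a year go unreleased, which is the pool Goldstein envisions partners mining for predictive tools.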
“If I know that when this kind of weather pattern occurs and this [other] kind of disease pattern occurs, then I can figure out that I need this kind of medical intervention to prepare for that,” Goldstein said. “There are all sorts of possibilities with the technology that’s available.”
Yet as there is this rush to harness all the data that is already or will become available, other federal officials are concerned that privacy and discriminatory challenges remain.
“I worry about the potential of discrimination by algorithm,” said Julie Brill, a commissioner for the Federal Trade Commission. “There is a whole lot of information flowing that could be discriminatory. What is a company going to do if they are going to make these differentiations…if that has a discriminatory impact?”
Carole Miaskoff, acting associate legal counsel for the Equal Employment Opportunity Commission, spoke about how the EEOC is wrestling with the challenges big data presents to those looking for work.
“How do we take legal principles and societal consensus of ‘thou shall not discriminate’ and make them meaningful in an era of analysis from these huge databases, and infer rules and criteria that you can select people for jobs? How do you apply it in that new context?” Miaskoff asked.
Even as agencies follow the Obama administration’s open data directive, Jaquith knows that any positive or negative derived from big data is moot without the push for better technology.
“It doesn’t treat the humans in government as rational actors when we try to browbeat them into publishing data,” he said. “You can pass all the laws you want and all the policies you want demanding that data be released, but if the agency relies on terrible proprietary software, there’s just nothing to be done.”
Big data: What distinguishes government’s high-achievers
A new survey of federal agencies suggests that some are maturing much faster than others in the harnessing of big data and that the core elements of success go well beyond technology and the availability of data scientists.
“What surprised me most is that there’s such a big and distinct difference between the high achievers and low achievers” among big data users at federal agencies, said Adelaide O’Brien, research director for IDC Government Insights, which conducted the survey. The study is among the first to establish a benchmark of maturity in the use of big data and analytic tools by federal agencies.
O’Brien, speaking June 20 at the Federal Big Data Summit, described five stages of maturity among federal agencies, beginning with the ad hoc use of big data tools and progressing through opportunistic, repeatable, managed and optimized phases of big data use. The latter phases reflect more integrated, automated and measured approaches in using big data, with the most mature agencies able to prove the value of big data within their organizations.
IDC measured the maturity of agencies with at least 5,000 employees along five dimensions: the level of intent to harness big data, technology deployed, data analyzed, commitment to staffing and process development tied to big data use.
Overall, the adoption of big data among federal agencies reflected a typical bell curve: nearly two-thirds (63 percent) of respondents said their agencies were at the midpoint of the maturity continuum, generating “repeatable” results, with 17 percent at the more mature “managed” phase and 19 percent at the less mature “opportunistic” stage. Only 1 percent apiece put their agency at either extreme, the “ad hoc” or “optimized” phase.
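The five-stage scale and the survey's reported shares can be sketched as a simple ordered distribution; this is just a restatement of the numbers in the article (which sum to 101 due to rounding), not IDC's methodology:

```python
# IDC's five maturity stages, ordered from least to most mature,
# with the survey's published percentage shares.

MATURITY_STAGES = ["ad hoc", "opportunistic", "repeatable",
                   "managed", "optimized"]

reported_share = {
    "ad hoc": 1,
    "opportunistic": 19,
    "repeatable": 63,
    "managed": 17,
    "optimized": 1,
}

# The distribution peaks at the middle stage -- the "bell curve"
# shape O'Brien described.
most_common_stage = max(reported_share, key=reported_share.get)
total_percent = sum(reported_share.values())  # 101, from survey rounding
```

Plotting the shares against the ordered stages makes the bell-curve claim concrete: the mass sits at "repeatable," tapering off symmetrically toward both extremes.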
O’Brien noted that those agencies considered to be high achievers, or more mature, in their use of big data, tended to:
- successfully recruit, train and reward not only data scientists and statisticians, but also business and program analysts who were connected to agency end goals;
- actively collaborate and communicate with other agencies or work groups on big data and analytics initiatives;
- have senior executive involvement that contributed to resourcing big data projects;
- use pilot projects, continuous process improvement and quantitative feedback;
- use advanced predictive analytics tools and a high level of automation that helped guide agency decision making.
Agencies that tended to have more siloed strategies, lacked intra-organization coordination on data projects and lacked top level sponsorship typically fell further behind on the maturity curve, O’Brien said.
While budget constraints remain an overarching factor in the evolution of big data analytics in government, O’Brien said the survey revealed that some agencies have managed to figure out how to overcome those barriers in developing their big data capabilities.
The study indicated that agencies are demonstrating stronger intent and developing more strategies to harness big data than was the case a year ago. At the same time, O’Brien spotted an imbalance in results, with agencies having the technology to analyze big data sets, but not the management infrastructure to share the results.
The processes to harness big data and to make better decisions still remain relatively immature. “A lot of the technology is being used for individual projects, but the line-of-business folks aren’t involved in these projects,” she said.
FedWire: New NIST cloud groups, NIH’s bionic man and the White House on immigration reform
FedWire is FedScoop’s afternoon roundup of news and notes from the federal IT community. Send your links and videos to tips@fedscoop.com.
NIST announced three new cloud computing groups.
NIH’s bionic man project.
More White House push for immigration reform and its effects on STEM.
Visual toolkit to analyze NSF investments.
Energy Department launches National Incubator Initiative for Clean Energy.
NASA technology at risk of unauthorized use by foreign nationals.
Joan Mooney, VA’s liaison to Congress, retires
Joan Mooney (center), VA’s Assistant Secretary for Congressional and Legislative Affairs, testifies May 28 before the House Veterans Affairs Committee.

The senior executive responsible for managing the Department of Veterans Affairs’ contentious relationship with Congress and ensuring the agency’s cooperation with independent audits by the Government Accountability Office retired June 20, FedScoop has learned.
Assistant Secretary for Congressional and Legislative Affairs Joan Mooney has left the government after more than 20 years of service, the last five of which she spent overseeing VA’s responses to tens of thousands of congressional inquiries on everything from major privacy breaches and hacking incidents to the recent scandal involving informal wait lists.
A VA spokesperson confirmed Mooney’s departure in an email to FedScoop. “VA appreciates her service and commitment to Veterans,” the spokesperson said.
Mooney’s retirement comes less than a month after she testified before the House Veterans Affairs Committee about the latest VA inspector general report that concluded the agency had falsified waiting list records to conceal excessive wait times. The VA’s deliberate efforts to hide treatment delays and systemic problems with its nationwide electronic scheduling system led to the deaths of several veterans who were waiting for treatment. Mooney responded with anger, her eyes at times filling with tears, as lawmakers criticized her performance and asked if she would resign and accept the blame for the agency’s cover-up.
Joan Mooney, Assistant Secretary for Congressional and Legislative Affairs at the Department of Veterans Affairs, retired June 20. (Photo: VA)

“We know that the facts of that report are utterly reprehensible,” Mooney said during the hearing. “That is what we know. And we owe a debt to all our veterans who served.”
Chairman Rep. Jeff Miller, R-Fla., reacted cautiously to news of Mooney’s departure and what it might mean for greater transparency at the agency.
“President Obama’s Department of Veterans Affairs is a case study in how to stonewall the press, the public and Congress,” Miller said in an email to FedScoop. “And as VA’s despicable delays in care crisis made painfully clear, the department’s extreme secrecy has resulted in deadly consequences. The only way leadership changes at VA’s Office of Congressional and Legislative Affairs can be viewed as a positive development is if President Obama nominates a leader who understands that taxpayer funded organizations such as VA have a responsibility to provide information to Congress and the public rather than stonewalling them.”
Miller has been an outspoken critic of VA’s alleged lack of transparency, going so far as to establish a “Trials in Transparency” website, where the committee documents the difficulties it has experienced trying to get answers to questions dating back more than a year.
As of June 13, the committee is waiting on answers to 112 requests for information from VA, according to the site.
In one heated exchange during the May 28 hearing, Miller asked Mooney if she could say anything without referring to her notes, adding it takes “repeated requests and threats of compulsion” to get VA to appear before the committee to answer questions and provide documents.
“Until VA understands that we’re deadly serious, you can expect us to be over your shoulder every single day,” Miller said.
When Miller asked Mooney directly why she had not yet provided the committee with information on whether or not VA employees were disciplined for the deaths of nine veterans in Georgia and South Carolina, Mooney attempted to answer with statistics of how many requests her office had handled during her tenure. It did not sit well with Miller, who became noticeably angry and raised his voice in response.
“Ma’am, veterans died. Get us the answers please,” Miller said.
Mooney has served since 2009 as the principal advisor to the Secretary of Veterans Affairs on the department’s legislative agenda. As such, she created and managed VA policies, plans and operations related to all congressional and legislative matters affecting the department, according to her official bio. During her tenure, the Office of Congressional and Legislative Affairs led VA in hundreds of congressional hearings before a dozen committees, prepared the department for each GAO audit, presented thousands of briefings and responded to more than 100,000 congressional requests for information. She also served on VA’s Executive Leadership Board and other governance bodies; the Department of Defense-VA Joint Executive Committee; and the interagency Senior Oversight Committee (DoD-VA-Health and Human Services-Department of Labor), which addressed issues associated with care and services for returning service members.
National Park Service issues temporary ban on drones
The next time you’re taking family photos at the Grand Canyon, you won’t have to worry about an unmanned aircraft photo-bombing you – at least not for now.
In a policy memorandum released today, Jonathan Jarvis, the director of the National Park Service, directed park superintendents to “prohibit launching, landing or operating unmanned aircraft, commonly referred to as drones, on lands and waters administered by the National Park Service.”
“We embrace many activities in national parks because they enhance visitor experiences with the iconic natural, historic and cultural landscapes in our care,” Jarvis said in the release. “However, we have serious concerns about the negative impact that flying unmanned aircraft is having in parks, so we are prohibiting their use until we can determine the most appropriate policy that will protect park resources and provide all visitors with a rich experience.”
The ban is temporary, though, according to the NPS. Next, Jarvis will propose NPS-wide regulations for unmanned aircraft systems. The process, Jarvis said, is not a quick one, and the amount of time it will take NPS to develop those rules will depend on the complexity of the issues involved. The public will also have an opportunity to comment on the proposed regulations.
Under the NPS’s regulations in the Code of Federal Regulations, operating or using unapproved aircraft on land or water is prohibited. The regulations also prohibit the use of hovercraft in NPS-controlled areas.
Jarvis called on park superintendents to use their authority, established in the Code of Federal Regulations, to institute this prohibition of unmanned aerial systems. The director also called on superintendents to include the ban in each park’s individual regulations.
According to the release, any permits previously issued for UAS are suspended as of the memo’s issuance. Those permits can be reviewed and approved by the NPS’s associate director for visitor and resource protection, who will also be responsible for authorizing the use of any previously unapproved UAS.
The new prohibition will not affect the authorized use of model aircraft for hobbyist or recreational use, according to the release.
The memorandum also leaves open the option for the NPS to use UAS for its own purposes – such as search and rescue, fire operations and scientific studies; however, these uses must go through the same approval process with the associate director for visitor and resource protection.
“We strongly encourage all UAS users to operate the technology responsibly and to follow all applicable laws,” Melanie Hinton, senior communications manager at the Association for Unmanned Vehicle Systems International, or AUVSI, said in an email statement. “UAS could have a variety of beneficial uses from assisting search and rescue missions, monitoring wildfires and surveying wildlife. The FAA is currently working on regulations for the operation of UAS, and, until they are complete, individuals and organizations should follow the National Park Service’s rules and other applicable laws in order to ensure the safety of the airspace and help to protect our national parks.”
The NPS is not the first agency to take a stance on UAS. The Federal Aviation Administration this month authorized BP to fly drones over pipelines in Alaska and is currently considering a request by the Motion Picture Association of America to use drones for filming.
The memorandum comes after drones have made appearances in several prominent national parks. According to NPS, in April an unmanned aircraft crashed in the Grand Canyon, disrupting what the NPS called a “quiet sunset.” Drones have also reportedly disturbed wildlife in Zion National Park in Utah.
HHS looking for CEO and CTO to run health exchange marketplace
The Department of Health and Human Services is looking to streamline management overseeing the federal health insurance marketplace, appointing a new deputy administrator while announcing that it’s searching for two chief officers to head Healthcare.gov.
HHS Secretary Sylvia Burwell announced Friday that Andy Slavitt will become the Centers for Medicare and Medicaid Services’ principal deputy administrator, primarily responsible for oversight of all day-to-day agency operations. Slavitt, most recently the group executive vice president for Optum, had been overseeing work needed to fix problems associated with Healthcare.gov’s botched launch.
Healthcare.gov, which launched in October, was marred by months of usability problems. Minnesota-based Optum was brought in to fix the site at the end of October. By spring, more than 8 million people had signed up through the federal exchange.
“Andy’s breadth of experience throughout the healthcare sector makes him the right person for this role, and I am excited for our partnership across all of the CMS programs,” CMS Administrator Marilyn Tavenner said in a release.
In addition, CMS is also actively recruiting a chief executive and chief technology officer to run the exchange. These two roles will be “accountable for policy development and technical operations” while also working with states on implementing new features.
“These actions will bolster our team and further instill ongoing accountability for reaching milestones, measuring results and delivering results for the American people,” said Burwell in a release. “Under this new structure, we bring additional operational and technological fire power and have a clear single point of contact in the Marketplace CEO to streamline decision-making.”
HHS also announced that Kurt DelBene will step down from his temporary role as senior advisor on HealthCare.gov. DelBene, a former Microsoft executive, had been appointed by the White House last year to oversee technical work for HHS. DelBene will stay on through the end of the month.
DHS releases quadrennial homeland security review
The Homeland Security Department Thursday released its strategic vision and priorities for the next four years as part of a process required by Congress known as the Quadrennial Homeland Security Review.
The release of the 103-page document, however, met stiff resistance in the House. Members of the House Homeland Security Committee held a hearing Friday criticizing the department for delivering the report six months late — well after it could have helped guide the president’s 2015 budget request — and for failing to address significant management deficiencies that many argue have prevented DHS from becoming a more integrated and agile agency capable of keeping up with a rapidly changing threat landscape.
The five “enduring” homeland security mission areas identified by the Department of Homeland Security. (Source: DHS)

“Year after year, DHS has ranked at or near the bottom of federal agencies and many public sector agency performance rankings,” Rep. Jeff Duncan, R-S.C., chairman of the Subcommittee on Oversight and Management Efficiency, said during a hearing to examine the DHS strategy document. “There seems to be a lack of aligning resources with strategic priorities. While the QHSR briefly mentions budget drivers, in general it does not link specific mission areas to the actual budget.”
House Homeland Security Committee chairman Rep. Michael McCaul, R-Texas, said the latest DHS strategy guidance is “more important than ever” given the increasing threats from resurgent terrorist groups overseas, continued border security weaknesses and the constant barrage of major cyber attacks against critical infrastructures.
“While I am encouraged by the emphasis of public-private partnerships and a risk-based approach to homeland security outlined in the QHSR, I am concerned that once again the Department has failed to make a link between their strategy and the resources necessary to implement,” McCaul said in a statement. “In addition, the QHSR’s lack of focus on management initiatives within the Department is troubling. To strengthen the Department, workforce challenges must also be addressed,” he said.
Stewart Baker, a partner at the law firm Steptoe & Johnson LLP and the first Assistant Secretary for Policy at DHS, said the quality and depth of the QHSR has improved significantly since the first review was issued in 2010, but there are areas that lack proper attention.
“With respect to cyber security, the 2014 QHSR has little new to say about the need to recruit and develop a skilled cyber security workforce,” Baker said in his testimony. “It also does not appropriately prioritize the importance of protecting critical U.S. infrastructure from espionage. To be sure, there are parts of the QHSR that need work. Nonetheless, on balance the report is an improvement over its predecessor.”
But at a more fundamental and tactical level, DHS needs to be able to respond quicker to emerging threats, said Frank Cilluffo, director of the Homeland Security Policy Institute at The George Washington University and a former special assistant to President George W. Bush on homeland security issues. “We can’t wait four years for strategies,” Cilluffo said. “You need a department that’s agile. I recommended an Office of Net Assessment along the lines of what the [Defense Department] has. That has played a significant role in protecting our country from a defense perspective. I think DHS would be well served if it had something that was nimble, agile and doesn’t have to wait four years to put together a strategy when the world changes so dramatically overnight.”

