Significant security flaws in Treasury Department, FDIC computer systems
The Government Accountability Office released two reports last week that detailed security weaknesses in two federal agencies responsible for large chunks of the country’s financial information.
A GAO report released Thursday found that the Federal Deposit Insurance Corporation has weaknesses in its information security controls that “place the confidentiality, integrity, and availability of financial systems and information at unnecessary risk.” The GAO released a similar report on Friday saying the Treasury Department’s Bureau of the Fiscal Service — which is responsible for oversight of the federal debt — has a “significant deficiency” in internal controls related to financial reporting.
In the FDIC report, GAO assessed a number of security recommendations the office made as part of its yearly audit for 2012. Among the recommendations the FDIC had not fully implemented in 2013 were controls for identifying and authenticating users’ identities, restricting access to or encrypting sensitive systems and data, completing background reinvestigations for employees and auditing system access.
The report on the Bureau of the Fiscal Service found 14 new information system control deficiencies, half of them related to access controls, which govern user passwords and the limits placed on what files or resources users are allowed to access. The GAO said a number of these deficiencies have gone unresolved since its 2012 audit.
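The access-control category the auditors flagged covers basics like authenticating a user and restricting which resources an account may touch. As a generic illustration only (the users, roles and permissions below are hypothetical, not drawn from any agency system), a minimal role-based access check looks something like this:

```python
# Minimal sketch of a role-based access-control (RBAC) check.
# All names here are hypothetical examples for illustration.

ROLE_PERMISSIONS = {
    "auditor": {"read_reports"},
    "analyst": {"read_reports", "read_financial_data"},
    "admin": {"read_reports", "read_financial_data", "modify_financial_data"},
}

USER_ROLES = {
    "alice": "analyst",
    "bob": "auditor",
}

def is_allowed(user: str, action: str) -> bool:
    """Deny by default: unknown users and unknown roles get no access."""
    role = USER_ROLES.get(user)
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("alice", "read_financial_data"))   # True
print(is_allowed("bob", "modify_financial_data"))   # False
print(is_allowed("mallory", "read_reports"))        # False: unknown user
```

The deny-by-default pattern at the end is the point of an audit finding like GAO's: a weakness is usually not a missing check but a check that fails open.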
Both reports were qualified, with the GAO saying the shortcomings did not amount to a “material weakness” in either agency’s systems. However, GAO said both agencies are exposed to unnecessary risk or abuse by not fixing the problems in a timely manner.
Both agencies concurred with the GAO findings; the commissioner of the Bureau of the Fiscal Service was made aware of the weaknesses in a separately issued, official-use-only report.
The GAO will follow up on the vulnerabilities in each agency with its 2014 audit.
OSTP releases plan for future of civil Earth observations
From checking the weather app on your smartphone to looking online for the water levels in a nearby stream, civil Earth observations — data pulled from Earth-observing systems — have become an integral part of daily life. And in its National Plan for Civil Earth Observations, the White House Office of Science and Technology Policy looks to set the course for the role the federal government will continue to play in these observations.
The plan, announced by Timothy Stryker, the director of the OSTP’s U.S. Group on Earth Observations program, in a blog post on the OSTP website, attempts to maximize the value of observations collected by federal agencies and advance observation systems.
“Americans and people around the world benefit from Earth-observations data every day,” Stryker said. “Have you ever used your smartphone to get a weather forecast? Turned on the TV to check beach conditions? Read a newspaper or magazine article describing the relationship of extreme weather events to climate change? These services are driven by Earth-observations collected by the federal government, which are made routinely available to app-developers, news and weather organizations, mapping services, the scientific community and the general public.”
Improved coordination of these Earth observations will ensure the data derived from them can be used more widely and efficiently across the federal government and put to better use serving citizens, according to the report.
But efficiency is not the only potential benefit of improving the way Earth observations are handled at the federal level; the plan said there could be an economic benefit as well.
“The U.S. government is the largest provider of environmental and Earth system data in the world,” the plan states. “Conservative estimates indicate that federal Earth-observation activities could add $30 billion to the U.S. economy each year.”
Structured around “a balanced portfolio of Earth observations and observing systems,” the plan sorts observations into two categories – sustained and experimental. Sustained observations will be taken routinely for an extended period of time, while experimental observations will be taken only for a short period.
The plan establishes five priorities for the program. The continuity of sustained observations for public services and Earth-system research are the top two, while the third priority is to continue to invest in experimental observations. The priorities also call for improvements to the observation networks and a continuous assessment and prioritization process.
In addition to establishing priorities, the plan also calls on agencies and the OSTP to coordinate and integrate observations from multiple platforms, improve access to data and increase efficiency and cost-savings.
While the plan said improvement in the sampling of the data was necessary, this improvement would only be conducted where appropriate and cost-effective. OSTP also encouraged continued maintenance for observation systems infrastructure, exploration of commercial solutions for Earth observations, international collaboration and data innovation.
In February 2011, OSTP created the National Earth Observations Task Force, which went on to establish the National Strategy for Civil Earth Observations and a framework designed to “improve discovery, access and use of Earth observations.” The task force also performed the first governmentwide Earth Observation Assessment in 2012, the results of which were used to develop this plan.
“The plan is a blueprint for future federal investments in and strategic partnerships to advance Earth observing systems that help protect life and property, stimulate economic growth, maintain homeland security, and advance scientific research and public understanding,” Stryker said in the blog post.
Cybercom event explores agency roles in cyber incident response
Cybersecurity and incident response are practices ingrained in almost every 21st-century federal agency. But when it comes to a massive cyberattack requiring the aid of multiple, partnering groups, which agency does what? Last week, U.S. Cyber Command demonstrated a specific framework for how several critical agencies can play complementary roles in the national cyber incident response process.
With Cybercom in the lead, the FBI hosted the two-week Cyber Guard 14-1 event — a series of cyber incident prevention, mitigation and recovery exercises — at its Quantico, Virginia, facility, bringing together members of the National Guard, the National Security Agency and the reserves to test operations and coordination in support of the Department of Homeland Security’s response to national cyberattacks.
While DHS has the lead during a domestic cyber threat, Cybercom emphasized a cooperative structure for the federal agencies supporting it. For instance, the FBI and Justice Department will follow DHS’ lead in prevention and response by investigating, attributing, disrupting and prosecuting cyber threats, as well as handling any domestic threat intelligence, according to a Defense Department release. DOD components are thereafter charged with defending the nation from further attack and collecting, analyzing and distributing any foreign threat intelligence, as well as backing DHS in its core roles. At the state level, the National Guard assists governments in recovering from the cyber incident, leaving DHS to focus on the federal effort.
“Practicing as an interagency team is essential to ensure national response to cyber events produce results that are effective and efficient,” said Greg Touhill, deputy assistant secretary of cybersecurity operations and programs at DHS, in a statement. “Exercises like Cyber Guard help us develop and refine key information sharing and coordination processes, understand each other’s capabilities and authorities, and operate in a manner that keeps us in the right formation to present the best national response.”
Cybercom describes Cyber Guard as a “whole-of-nation” effort, one that involves not only the critical defense and intelligence agencies, but also members of academia, industry and state government. Now in its third iteration, the interagency exercise brought in 550 participants, double the number at last year’s Cyber Guard.
“We talk all the time about physical networks connecting computers and communications,” Robert Anderson, executive assistant director of FBI’s criminal, cyber response and services office, said to participants. “But we must remember that on both ends of that computer network, there is a network of people working toward a common goal: to defeat our adversaries. Cyber Guard helps us get better at using the network of warriors on the front lines — like you — to achieve our goal.”
Attendees, like Coast Guard Rear Adm. Kevin Lunday, Cybercom’s director of training, noted how the event continues to simulate more realistic and intense cyber attacks, which strengthens the units responding in the exercises. The result is a stronger cyber incident response should a major domestic attack occur.
“What you’re doing here is critically important to how we will respond on behalf of our nation to a major cyberattack,” Lunday said to the crowd. “The more we know and share about the adversary and the better-defined our processes are, the better we can defend the nation.”
NASA, FAA team up to improve air traffic control

The Federal Aviation Administration is getting some help from NASA to make air traffic management more efficient.
Terminal sequencing and spacing technology, a new computer software tool for air traffic controllers, manages the spacing between aircraft as they fly in and out of airports. The tool, created by NASA, is designed to help planes save fuel by making the air traffic control process more seamless and efficient.

Source: NASA
To improve efficiency, the software employs performance-based navigation procedures, which, according to a NASA release, result in less controller-pilot communication and fewer flight path changes.
“With TSS, NASA’s aeronautics innovators have delivered to the FAA another valuable tool that will soon benefit our environment, our economy and every individual traveler,” Jaiwon Shin, NASA’s associate administrator for aeronautics research, said in the release.
TSS, according to the FAA, is an airport-centric technology. The software will utilize figurative “corner posts,” which are navigational points in the sky approximately 40 miles from the airport.
“Terminal Sequence and Spacing helps controllers manage aircraft from the four corner posts down to the runway,” an FAA release said.
The new TSS software is an addition to the Next Generation Air Transportation System, a multi-agency collaboration to update and improve the way air traffic control works.
According to Leighton Quon, project manager of NextGen, the program is developed by NASA but implemented by the FAA.
Development of TSS began in 2009, piloted by the agency’s Airspace Systems Program, which is a part of the Aeronautics Research Mission Directorate. The first prototype for TSS was developed in 2011, and since then, the prototypes have been tested in more than a dozen “high-fidelity simulations involving controllers and pilots.”
The FAA will reportedly deploy the tool sometime before 2018, though it has yet to secure funding or a test location. The FAA said in the release it expects to make a full investment decision about the TSS project by the end of the year. Funding would be allocated through FAA’s Joint Resources Council.
Senate defense spending bill slashes IT by half-billion, uplifts cyber

The Senate Appropriations Committee passed a defense spending bill Thursday that would cut defense IT funding by a half-billion dollars and reduce President Obama’s military budget requests by $1.4 billion overall.
Listed as a way to increase efficiency in the department and reduce spending, the committee recommended a $500 million reduction in appropriations for non-cyber IT programs departmentwide as part of the fiscal year 2015 Defense Appropriations bill. “Trimming IT funding will help prioritize and better target non-cybersecurity IT investments in an era of fiscal constraint,” according to a summary of the bill from committee Chairwoman Sen. Barbara Mikulski, D-Md.
While the Senate Appropriations Committee recognized DOD’s efforts to improve IT efficiency and reduce waste, it said in a report accompanying the bill it “found a number of discrepancies where the resources reflected in the IT budget did not correlate to the operation and maintenance budget justification.” When it comes to enterprise IT in the department, the report states, the goal is to “reduce the cost associated with the Department’s overall information technology infrastructure by simplifying, centralizing, and automating infrastructure at the enterprise level.”
Though the Senate committee aims to condense IT spending, that doesn’t mean it’s losing sight of preserving and advancing DOD’s technological edge. The department’s primary technological concern is to become “more effective and more secure against cyber threats and vulnerabilities,” the report said. And that’s why, despite the greater cuts, cyber got some boosts from the Senate version of the defense budget.
The committee awarded the National Security Agency an additional $7.5 million for its National Centers of Academic Excellence in Information Assurance Education and Information Assurance Research, a program that allows NSA’s Information Systems Security Program to conduct classified research with approved universities.
The bill also grants DOD an additional $10 million for insider threat detection. The report commends the department for its progress in reducing internal threats with practices such as end-user auditing, but it also recognized that these programs aren’t used robustly. Therefore, it allocated the additional money to make the tools more widespread.
The bill tacks additional funding onto several other cyber initiatives as well, including an added $12.5 million for general cyber force training and resiliency and $4.1 million for Cyber RED team training.
Additionally, the committee granted 5 percent increases to innovative medical research, a $789 million bump, and to basic research across all four military branches and the Defense Advanced Research Projects Agency, an additional $257 million.
DARPA wants help answering ‘trillion-dollar questions’ in the smallest way possible
Bill Chappell and the Defense Advanced Research Projects Agency’s Microsystems Technology Office have some great ideas. But they’re looking for help.
DARPA’s MTO held an expo Friday presenting the new areas in which it’s concentrating its efforts to facilitate collaboration between academics, industry and government that will continue to push the boundaries of microsystem technology.
Chappell has organized the office’s focus into four main areas, or what DARPA calls “thrusts”: electromagnetic spectrum, decentralization, information microsystems and globalization.
“Today is about saying DARPA is the place where you do the big inventions, the capstones of your career that will hopefully change the world in a very literal sense,” Chappell said Friday, adding that the thrusts are a “re-evaluation of why we are doing some of the microsystems we are doing.”
Some concerns in each thrust mirror issues the federal government and global tech community have been wrangling with in the past few years.
The information microsystems thrust is focused on processing the massive amounts of data the world continuously creates, even as engineers begin to reach the end of Moore’s Law, the observation that the number of transistors on a chip doubles roughly every two years.
Joseph Cross, an MTO program manager, used an apt analogy to sum up the current state of processing systems: Imagine you are rich and employ a very smart butler. A year later, your accountant fires the butler and replaces him with two butlers who cost less but aren’t as smart. Over the next few years, your accountant keeps adding cheaper-but-dumber butlers to save money.
“Until one day, you open your balcony doors and out on your veranda are 1,000 drooling butlers,” Cross said. “The fact that they are dumb is fine if you want them to paint a fence. It isn’t fine if you want them to make a hollandaise sauce for your breakfast.”
The “hollandaise” Cross is referring to is the vast amount of data the military produces that needs to be utilized. To give perspective to the amount of stress the modern military puts on data processors: One Global Hawk drone uses 500 percent of the bandwidth the entire U.S. military used during the Gulf War.
“The number of transistors has been going up, that’s good, but the ways to use them efficiently have either hit a wall or [are] projected to hit a wall from an engineering perspective,” Chappell said.
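The tradeoff Cross and Chappell describe, more processors but diminishing returns, is captured by a classic result in parallel computing, Amdahl's law (my illustration, not a formula cited at the expo): if only a fraction p of a job can be parallelized, n processors give a speedup of 1/((1-p) + p/n), which plateaus no matter how many "butlers" you add.

```python
# Amdahl's law: overall speedup from n processors when only a
# fraction p of the work is parallelizable. Illustrative only.

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of the work parallelizable, speedup caps just
# below 10x no matter how many processors are added.
for n in (1, 10, 100, 1000):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

With p = 0.9, a thousand processors deliver less than a 10x speedup: the serial 10 percent of the job, the "hollandaise sauce," dominates.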
Another growing concern for the military is component counterfeiting, which DARPA is looking to tackle through the globalization thrust.
Daniel Green, another program manager, presented slides of transistors and semiconductors that had been forged or tampered with to look like components normally sold by Intel.
“In the past, we’ve sort of buried our head in the sand,” Chappell said. “One of DARPA’s roles is to be the leading indicator of what’s going to happen and to avoid strategic surprise.”
While studying how counterfeit components end up in global supply chains is more a business focus, Chappell said it will ultimately be beneficial for the military.
“We have to embrace the global supply chain, but we have to do it in a way so you trust the components that show up on shore,” Chappell said. “If you only upgrade your technology every 20 years, with the fact that we are getting new electronics on a yearly basis, that doesn’t really work for us. We have to show [the military] how to get there through advanced research.”
While the MTO’s focus has been making things as small as possible, the office also wants those looking to get involved to figure out how to make things cheaper.
“Cost is now killing us,” DARPA Director Arati Prabhakar said. “It’s a self-inflicted wound, but it’s a very serious issue.”
But Chuck Wolf, deputy director of DARPA’s Adaptive Execution Office, said anyone who steps up will have eager “beta testers” ready and willing in the military.
“It doesn’t need to be earth-shattering,” Wolf said. “These improvements can be incremental.”
“We’ve always done advanced computing,” Chappell said. “When that stops, who is going to figure out these trillion-dollar questions?”
IN FOCUS: VA technology and security
FedScoop has been at the forefront of investigative reporting into the Department of Veterans Affairs and its efforts to deploy leading-edge information technology to improve its ability to serve millions of veterans and secure their private information.
In Focus: VA Technology and Security provides a one-stop shop for our ongoing coverage of the people, policies and technologies behind VA’s cyber struggles and the efforts to reform an agency under siege. Our coverage is posted chronologically; this page will be updated as new stories are published.
Miller calls on VA to answer for cybersecurity shortfalls
Thursday, July 17, 2014 · 7:00 am House Veterans Affairs Committee Chairman Rep. Jeff Miller, R-Fla., has called on five senior VA officials, including acting Secretary Sloan Gibson, to testify at next week’s scheduled hearing on “longstanding information security weaknesses” that have enabled “data manipulation” throughout the agency. Documents obtained exclusively by FedScoop show that in addition to Gibson, the committee plans to question Assistant Secretary for Information Security Stan Lowe, Executive Director for Enterprise Risk Management Tina Burnette and …
Anti-social: Feds wonder why social media companies drag feet on accessibility issues
The Federal Communications Commission hosted a panel of experts Thursday to talk about the challenges and ongoing need to make social media platforms more accessible to those with disabilities. But there was one group of representatives notably absent from the proceedings: the social media companies themselves.
Of the major social media networks discussed Thursday, only representatives from LinkedIn and open source content management system Drupal attended the FCC’s “Accessing Social Media” event.
Justin Herman, who leads federal social media efforts at the General Services Administration’s Office of Citizen Services and Innovative Technologies, said one top social media firm declined an invitation because the company knew its “accessibility was terrible.”
“You’ll find this a lot, that platforms and applications are quite familiar with the fact that they are not very accessible and they can do better,” Herman said.
Another panel member, Janice Lehrer-Stein, chair of the Access and Integration Committee for the National Council on Disability, said in the past she has reached out to Facebook as a private citizen in order to start a dialogue about accessibility. When Lehrer-Stein — who is blind — first delved into Facebook, she “was very challenged,” but said the company was very open and “discussed issues that were barriers” in her use of the platform.

Those discussions helped inspire meetings NCD held with a number of federal agencies, including the FCC, the Federal Emergency Management Agency and the Labor Department, on how the government can make social media accessible. They also led to a number of online dialogues hosted on ePolicyWorks that created benchmarks for departments to focus on, including captioning online videos. Last week, the FCC voted to require captions on online videos starting in 2016.
“One of the primary focuses is trying to determine the means for opening up social media to people with print and hearing disabilities, along with mobility and intellectual disabilities,” Lehrer-Stein said. “There are tremendous benefits to making social media accessible.”
Mike Reardon, a policy supervisor with the Labor Department’s Office of Disability Employment Policy, said making social media accessible to the disabled only goes so far: The services have to work well or frustration is going to quickly set in. “For a lot of people, they don’t realize there is a difference between accessibility and usability,” Reardon said. “A service can be 508 compliant, but barely usable. Usability is a much more important standard for us than accessibility.”
Section 508 of the Rehabilitation Act of 1973 requires that federal agencies’ electronic and information technology is accessible to people with disabilities.
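Parts of a Section 508 review can be automated. As one hedged sketch (a single heuristic among many; a real 508 audit also covers captions, contrast, keyboard navigation and much more), here is a scan for images that lack the alt text screen readers depend on, using only Python's standard library:

```python
# Sketch of one automated accessibility heuristic: flag <img> tags
# with no alt attribute. Illustrative only; a real Section 508
# audit covers far more than alt text.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []  # src values of images lacking alt text

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            self.missing_alt.append(attr_map.get("src", "<no src>"))

page = '<img src="chart.png"><img src="logo.png" alt="Agency logo">'
checker = MissingAltChecker()
checker.feed(page)
print(checker.missing_alt)  # ['chart.png']
```

As Reardon's distinction suggests, passing a check like this makes a page accessible on paper; whether the alt text is actually useful is a usability question no scanner can answer.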
Reardon said creating a dialogue with platform developers is crucial, otherwise nothing will ever progress. He cited his discussions with HackPad after people complained about accessibility issues for SocialGov’s Social Media Accessibility toolkit. “We can’t demand perfection, nothing is perfect out of the box,” Reardon said. “We do need to make sure, however, that there is a feedback loop to the developers.”
FCC Commissioner Ajit Pai spoke about some of the efforts to create accessibility to social networks, highlighting EasyChirp, a web-accessible Twitter alternative that has been optimized for the disabled. “Think about what a great thing that is,” Pai said. “There’s so many millions of people thanks to EasyChirp that are able to access the same platform and participate in the public square.”
While workarounds do help, Herman was dismayed by the lack of attendance from the social media companies. “We’re talking about emergency management information that will save lives and there’s empty seats here today,” Herman said. “I think that should really make people upset.”
GSA rolls out dashboard for Connections II purchases
Accessing non-classified government purchasing data just got a little easier with the launch of the General Services Administration’s Connections II dashboard.
Announced in a July 14 blog post by Mary Davie, GSA’s assistant commissioner in the Office of Integrated Technology Services, the dashboard creates a single point of access for all purchasing activity awarded under GSA’s Connections II, a contract vehicle for telecommunications equipment, labor, and building and campus infrastructure solutions.
“Data turned into actionable information will allow government to buy smarter, help agencies make better buying decisions, and lead to smoother bid and proposal processes,” Davie wrote in the post. “More information can help agencies better understand purchasing trends, conduct better market research, and be better negotiators. Ultimately, government buying decisions based on consistent, shared information deliver dollar savings to U.S. taxpayers.”

Source: GSA Connections II Dashboard
On the dashboard, users can view panels of data for total non-classified purchases under Connections II, purchases by federal agency and purchases made with an industry partner. Each panel contains data from 2012 to 2014 and has the capability to display the information in a bar or line graph.
“Users have quick and easy access to the dashboard,” Davie said. “Search results display in easily understood lists, graphs, and charts. The real-time dashboard gives meaningful and timely program information, whether to industry or government, at any time. Users can search for specific items, sort data, and create and download custom reports.”

The launch of the dashboard is one part of the agency’s larger transparency efforts. In the fall of 2012, GSA launched a similar dashboard for its Governmentwide Acquisition Contracts. And last month, Davie highlighted Networx, the agency’s telecommunications program, which collects data about what federal agencies are purchasing and how they are paying for it.
Davie said more than 136 federal agencies have used Networx to purchase more than $760 million worth of network and telecommunications services. In addition, GSA’s Federal Acquisition Service is working on a common acquisition platform, according to the post.
This platform will contain more tools, capabilities and governmentwide data on acquisition vehicles, intelligence and prices paid, according to Davie.
“We collectively are opening up data, sharing it, and working together to find additional value. At GSA, we look for ways to make government purchasing data more open, transparent, and accessible, and the Connections II dashboard is one way to do this,” Davie said. “We try to give customers and industry partners real-time data whenever they need it – at both the agency and order level so government can increase data quality and spend analysis, and make better business decisions, and so industry partners can tailor their offerings.”
OPM launches interactive dashboard to boost workforce culture and engagement
Furthering her pursuit of a more engaged, higher-performing federal workforce, Office of Personnel Management Director Katherine Archuleta announced Tuesday a new federal dashboard rooted in Federal Employee Viewpoint Survey data to give agencies a deeper understanding of their employees.
Introduced as UnlockTalent.gov, the portal is an interactive tool for agency leaders to dig deeper into viewpoint survey and demographic data and use other human resources tools, like Enterprise Human Resources Integration data, for workforce insights. Archuleta wrote in an OPM blog post Tuesday that the dashboard will help leaders “better understand the data and it will give them the extra support they need to create the most effective engagement programs for their employees.”
Jonathan Foley, a director of planning and policy analysis for OPM, said before this tool, agencies would have to access multiple datasets on their own. But with OPM’s new dashboard, analyzing employee sentiment and managing the workforce becomes exceptionally easier.
“The value proposition for UnlockTalent.gov is that it: 1) combines multiple data sources in one easy to use tool; 2) is available online and on a variety of devices; 3) can be customized to include more data sources over time; and 4) uses the most current data visualization techniques to bring the data to life,” Foley wrote in an email to FedScoop. “This allows agencies a much clearer and more detailed look into their workforce which will allow them to design effective strategies to increase employee engagement and effectiveness.”
The first iteration of the dashboard was launched July 3 to agency deputy secretaries, chief human capital officers and performance improvement officers, though Foley said those officials have authority to distribute access as they see necessary. OPM is “encouraging agencies to provide access broadly to employees who would find this tool valuable in completing their agency missions as well as in cultivating a culture of engagement at their agency,” he said. But those outside the government or agency employees not granted access will have no luck typing “UnlockTalent.gov” into a browser.
To make UnlockTalent.gov a valuable asset in the modern, digital federal government, OPM uses “the latest data visualization tools to highlight key areas for action,” Foley said. To achieve that standard, the office built the dashboard in collaboration with representatives from 14 other agencies “who provided key advice on design of the system and helped test its features,” he said.
For those with access to the dashboard, the site is an iterative work in progress and may change from time to time. Foley said OPM is making “sure that [it’s] responsive to user needs and that the tool can change over time to meet those needs.” OPM is also working with agencies on future iterations that will incorporate more functionality and data sources, including the ability for agencies to upload their own data, he said.
The UnlockTalent.gov dashboard serves as another OPM tool to satisfy the President’s Management Agenda priority of “People and Culture,” an innovative workforce initiative for which the office recently released a plan of action. Like the IT tools mentioned in that progress report, Foley said UnlockTalent.gov “is designed to be a resource that can assist agencies in making data-driven decisions to maximize and drive employee engagement and performance.”