NextGen dominates FAA reauthorization debate

The 113th Congress has yet to adjourn, but House Transportation and Infrastructure Committee Chairman Bill Shuster, R-Pa., convened a hearing to start looking at the upcoming 2015 reauthorization bill for the Federal Aviation Administration.
One component of that reauthorization, however, dominated the conversation — the Next Generation Air Transportation System, or NextGen.
“Aviation. We invented it,” Shuster said. “We’ve been the leader in aviation for the last 80 years, but we are now starting to lose our edge. If we don’t do something now, we are going to continue to lose our lead in the world when it comes to aviation. On my watch, I don’t want that to happen, and I’m going to continue to work to be able to craft something.”

The FAA is currently rolling out NextGen; however, the planned overhaul of the nation’s air traffic and transportation system remains significantly behind schedule and over budget. The agency’s official target date is 2020. If put in place according to plan, the program would bring runway optimization, enhanced satellite-based communication and performance-based navigation, making routes more efficient, less costly and safer for pilots and passengers.
“FAA is in the midst of a multibillion-dollar effort to improve the efficiency of the nation’s air traffic control system through NextGen,” FAA Inspector General Calvin Scovel said at the hearing. “FAA’s acquisition reforms have fallen short in improving the delivery of new technologies and new capabilities.”
With new technology as a focal point of the NextGen initiative, Rep. Peter DeFazio, D-Ore., said the FAA needed to ensure that its procurement and acquisition processes can support the purchases it needs to make, and that it can enforce and justify the mandate it made for airlines to be equipped with the updated technology.
“[The FAA is] worse than the Pentagon on procurement,” DeFazio said.
According to Scovel, more than 220,000 aircraft are subject to the FAA’s mandate requiring they be equipped with ADS-B Out technology, which determines an aircraft’s position in real time based on satellite information.
“That will move up and down, but we believe that between now and 2020, those numbers will hold generally firm in that range,” Scovel said.
Although equipping planes with ADS-B technology means they can be monitored in real time, National Air Traffic Controllers Association President Paul Rinaldi said the planes will still need traditional radar monitoring because the new systems can be turned off with the flip of a switch.
“ADS-B shows a tremendous amount of value, but we have to have the necessary redundancy [for safety reasons],” Rinaldi said.
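For readers unfamiliar with the technology, the sketch below is a hypothetical illustration — not FAA or avionics code — of the two points Scovel and Rinaldi raise: an ADS-B Out unit broadcasts a position derived from the aircraft’s own satellite receiver, and a ground system still needs a radar fallback for the moment those broadcasts stop. All field names and thresholds are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AdsbPositionReport:
    """Simplified stand-in for an ADS-B Out broadcast (illustrative only)."""
    icao_address: str     # 24-bit aircraft identifier, as a hex string
    latitude: float       # degrees, from the aircraft's GNSS receiver
    longitude: float      # degrees
    altitude_ft: int      # reported altitude in feet
    timestamp: datetime   # when the position was broadcast

def surveillance_source(last_report: Optional[AdsbPositionReport],
                        max_age_seconds: float = 10.0) -> str:
    """Fall back to radar when ADS-B reports stop arriving,
    e.g., because the transponder has been switched off."""
    if last_report is None:
        return "radar"
    age = (datetime.now(timezone.utc) - last_report.timestamp).total_seconds()
    return "ads-b" if age <= max_age_seconds else "radar"
```

The fallback check mirrors Rinaldi’s redundancy argument: the satellite-derived feed is only useful while the aircraft keeps broadcasting.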
Despite the necessity for airlines and the FAA to invest in new technology to make NextGen a reality, Rep. Elizabeth Esty, D-Conn., said finances weren’t the only issue at stake.
“It seems to me that there are two different issues: One is the funding, one is the timing,” Esty said. “We need to find a way to get this done.”
Given the number of benchmark items that need to be achieved by 2020, Scovel said it was “a tall order” to suggest that NextGen would be completed on time.
“What happens between now and then is anyone’s game,” Scovel said.
Although the conversation centered on NextGen, members and witnesses also noted the need to plan for the integration of unmanned aircraft systems into the national airspace, which the FAA is required to address by the end of 2015. The inspector general, however, has reported that the agency is behind schedule.
Shuster, the committee chair, won re-election earlier this month in western Pennsylvania’s ninth district with nearly 64 percent of the vote. The Republican Steering Committee for the 114th Congress has recommended that he retain the chairmanship of the House Transportation Committee.
Although Shuster remained optimistic about the efforts of the committee, the FAA and industry to improve the NextGen rollout, he said the funding issue was one that not even Congress could solve completely.
“The process doesn’t work as it should,” Shuster said. “The funding’s not there, and if you think Congress is going to be able to fix this, we’re not going to be able to, so we need to look at something different from the funding standpoint, and we have to do it together.”
GSA engages industry on special cloud acquisition category

Cloud computing has emerged as a high-demand IT solution, but the path to procuring it has been far from clear. Now that the technology appears to be here to stay, the General Services Administration wants to create a cloud category in the federal government’s largest acquisition vehicle to make it easier to find and buy.
GSA hosted an onsite industry day Tuesday soliciting feedback from vendors on its proposed addition of a special item number (SIN) for cloud computing to its IT Schedule 70 acquisition vehicle.
The SIN, still a work in progress, was proposed in a July request for information to “improve the way that GSA offers cloud computing services through IT Schedule 70, increase visibility and access of cloud computing services to customer agencies, and to provide industry partners the opportunity to differentiate their cloud computing services from other IT related products and services,” according to that document.
Those familiar with federal IT acquisition would be right to point out that cloud procurement is already possible under Schedule 70.
But, as Mark Day, deputy assistant commissioner of GSA’s Integrated Technology Service, pointed out to the audience of vendor representatives, those are mostly managed services “with some cloud sprinkled on top.”
The faux cloud options don’t end there. “You’ve also seen many contracting metrics used that try to approximate the cloud business model,” Day said. “And I think that’s a subtle fact that most of us believe, that it is an approximation of the cloud model that is coming out of most of our contracts. It’s not really the pure cloud model [found] in the commercial space.”
Why is that? Day explained that not only do federal agencies face stricter security scrutiny, but also the “fundamental, basic model of how we buy things is not particularly oriented to the cloud business model,” he said.
That has led to difficulty for all parties involved — GSA, vendors and agency customers. The new SIN, though, is meant to alleviate the current pain points in cloud procurement through greater flexibility and control for customers. All services that meet the five essential characteristics of the National Institute of Standards and Technology definition of cloud computing — on-demand self-service, broad network access, resource pooling, rapid elasticity and measured service — will be eligible for the cloud special item number.
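As an illustration only — this is not GSA’s evaluation logic, and the field names are hypothetical — screening an offering against those five NIST characteristics could be as simple as the following sketch:

```python
# Hypothetical screen of a vendor offering against the five essential
# characteristics in the NIST definition of cloud computing (SP 800-145).
NIST_CHARACTERISTICS = [
    "on_demand_self_service",
    "broad_network_access",
    "resource_pooling",
    "rapid_elasticity",
    "measured_service",
]

def eligible_for_cloud_sin(offering: dict) -> bool:
    """Return True only if the offering claims all five characteristics."""
    return all(offering.get(c, False) for c in NIST_CHARACTERISTICS)

# A managed service "with some cloud sprinkled on top" would not qualify:
managed_service = {"broad_network_access": True, "resource_pooling": True}
print(eligible_for_cloud_sin(managed_service))  # False
```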
Some of GSA’s past customers have tried to use preferred contracting methods to buy cloud, such as using their own blanket purchase agreement under GSA’s BPA. But for technical reasons, Day said, they can’t do that. “We actually stopped our customer from using their preferred contract model by using it ourselves,” he said. “But if we put it under a cloud SIN, we don’t foreclose on their ability to use their preferred contracting model.”
At the same time, Day said this is an opportunity to drive more value into the schedules program and introduce new technology faster than other acquisition vehicles.
“If a SIN works better than a BPA, if only because we can inject things faster, maybe that’s a reason to do something like this,” he said.
It’s also an opportunity to leverage the agency’s subject matter experts for this broad topic — in the past, Day said, they were used only for specific procurement areas.
“We haven’t done a very good job using our subject matter experts,” he said. In the past with IT Schedule 70, “the customer had a very large procurement, a complex procurement and didn’t get much help associating how the schedules work to solve [a] problem. So we didn’t capture some of the really big opportunities that we should have that probably would’ve helped our customers. And our customers were left without any kind of help they might have wanted. Here’s another opportunity where we’re rethinking that.”
While GSA has a broad idea of where it wants to go with the cloud SIN, when the final product will arrive and what it will look like are still up in the air. The timeline for a final cloud SIN proposal is still taking shape and depends heavily on feedback from industry.
Chuck McGann to leave USPS for private sector

Chuck McGann, the chief information security officer for the United States Postal Service, is leaving government for an undisclosed position in the private sector, the USPS confirmed to FedScoop Tuesday.
The agency did not announce McGann’s departure date.
“After 27 years of dedicated service, Chuck McGann is retiring from the Postal Service,” USPS spokesman Dave Partenheimer said in a statement to FedScoop.
According to Partenheimer, McGann was responsible for overseeing the information security of one of the largest technology networks maintained by any organization in the world.
McGann’s departure comes just eight days after USPS announced a major cyber intrusion that potentially compromised the data of approximately 800,000 current and former employees. His departure, however, was not related to the breach and had been planned, according to a senior USPS official.
“[McGann] was a key player in the Postal Service’s successful response to the recent cyber intrusion,” Partenheimer said. “He will be greatly missed by the organization.”
After joining the Postal Service in 1987, McGann became the CISO in 2009. At USPS, McGann reported to Chief Information Officer Jim Cochrane and was responsible for cyber and physical computing security at the Postal Service. McGann also worked closely with the agency’s chief privacy officer, Matthew Connolly, to report and notify the agency and others of breaches or data loss.
The departure also comes days after the announcement of the retirement of Postmaster General Patrick Donahoe.
McGann is also a FedScoop FedMentor and has spoken at FedScoop events in the past.
NOAA’s new toolkit a ‘cognitive bridge’ for climate change action plans

When President Barack Obama addressed the United Nations in September, he outlined a number of ways the country has been affected by climate change: flooding at high tide in Miami, a prolonged wildfire season in the western U.S. and tumultuous rainfall patterns that disturbed crop cycles in the heartland.
“We cannot condemn our children, and their children, to a future that is beyond their capacity to repair,” Obama said. “Not when we have the means — the technological innovation and the scientific imagination — to begin the work of repairing it right now.”
A key piece of that innovation and imagination was revealed Monday as the National Oceanic and Atmospheric Administration released the first iteration of its Climate Resilience Toolkit. The toolkit offers data repositories, mapping tools and case studies that allow all levels of government to plan, understand and address issues related to the climate.
NOAA’s David Herring, who led the development of the toolkit, told FedScoop he wants the site to not only serve as a place where people can collect data but also as a “cognitive bridge” that allows leaders to put weather information in the appropriate context.
“Our focus has been to think about the types of questions and issues that decision makers have, get into their space cognitively and work backwards to the federal-sized domain,” Herring said. “Our thought was ‘What do we have across the federal government writ large that would be low-hanging fruit, as it were, in terms of information, tools and data that [leaders] can leverage and utilize in their resilience planning?’”
Some of the features initially rolled out in the toolkit include Climate Explorer, which allows people to visualize various climate and weather patterns based on daily observation and long-term climate models. Developed with the help of the University of North Carolina at Asheville’s National Environmental Modeling and Analysis Center, the tool allows projections based on historical data from hundreds of different weather stations across the country.
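FedScoop has not reviewed Climate Explorer’s internals; as a rough, hypothetical illustration of the underlying idea, a projection from a station’s historical observations can be as simple as fitting a trend line to annual values and extending it forward. The station data below is made up.

```python
def linear_trend(years, values):
    """Ordinary least-squares slope and intercept for annual observations."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
             / sum((x - mean_x) ** 2 for x in years))
    return slope, mean_y - slope * mean_x

# Invented annual mean temperatures (degrees F) for a single station:
years = [1980, 1990, 2000, 2010]
temps = [52.1, 52.6, 53.0, 53.7]
slope, intercept = linear_trend(years, temps)
print(round(slope * 2050 + intercept, 1))  # naive straight-line projection for 2050
```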

Another portion of the website hosts more than 20 case studies on how state, local and tribal governments are responding to climate-related challenges, a feature created when people from different areas around the country reached out to Herring about the project.
“It hadn’t initially dawned on me that they also want to see their assets and information represented through the toolkit and the obvious answer is ‘Sure, why not?'” Herring said.
In total, the full toolkit will focus on seven different areas when it’s completed in March. Yet even after that completion date, Herring said, an outside evaluation will be done so features can be added as users call for them.
“If we need to improve, if we need to bridge gaps, if we need to include more types of data, we can take our cues from the users and focus on the areas where we need to do more development,” he said.
In the meantime, for users who do not find useful material in the toolkit, the team has built a semantic search engine that has crawled all public-facing government websites to help pinpoint the missing information.
“We were hoping that [the toolkit] would be useful to people, but we were also imagining that some folks might come to the site for a specific topic or with a specific location in mind, and if they don’t find it, they might think the toolkit doesn’t work for them,” Herring said. “If you are not finding something that you are interested in and you want to cast a wider net across the whole U.S. government, that search capability will allow you to do that.”
While the president’s executive action is focused on climate change, Herring said his toolkit can still be a resource for governments or businesses that need an action plan for weather-related disasters. According to NOAA’s National Climatic Data Center, 170 severe weather events have cost the U.S. more than $1 trillion in damages since 1980, $23 billion of that coming just last year.
“This site is about the bigger umbrella of climate,” Herring said. “No one can dispute that El Niños happen. We know heat waves happen. People are going to experience extremes. We wanted to construct a toolkit that’s useful and beneficial. We’re here to save money, we’re here to save lives, and we’re here to be robust in the face of all climate-related changes.”
How the Energy Department uses asset management to drive cybersecurity

An IT asset management system for a government agency that employs more than 100,000 people should probably have more than three people working on it and should probably take longer than nine months to build.
Rick Lauderdale has proven otherwise.
Lauderdale, the Energy Department’s chief architect, has been spending the better part of 2014 refining a system that gives agency leadership an agile way to use the agency’s existing technology and phase out end-of-life software. While the primary focus of most IT management systems is lowering cost, Lauderdale’s system operates from a cybersecurity standpoint. As his team worked to integrate the program across the agency, they found that the platform’s security aspect set the table for all the other components of IT decision-making.
“No one has ever tied IT asset management to cybersecurity,” Lauderdale said during an interview with FedScoop. “I can tell you that our CFO office loves it, because now it’s allowing the stakeholders to look forward and try to predict what’s going to happen with the software and the hardware.”
The security focus stems from last year’s Energy Department hack, which compromised the Social Security numbers and birth dates of 53,000 former and current DOE employees. After Lauderdale determined the vulnerability was due to an out-of-date version of Adobe’s ColdFusion, he set out to create a system that both mapped DOE’s assets and managed their life cycles.
Using a combination of enterprise portfolio management tool Troux and IT information repository Technopedia, Lauderdale has created a way for agency executives and the IT office to generate on-demand reports about potential vulnerabilities, redundant applications and measures that need to be taken to phase out end-of-life technology.
The system Lauderdale and his team set up allows stakeholders to create very granular data visualizations across the enterprise; managers can filter queries based on hardware manufacturers, phases, products and versions, then run them against Technopedia to determine safety, cost, further integration or a host of other factors. Earlier this year, the platform was key in helping DOE understand which Microsoft products would need to be replaced as the company ended support for Windows XP.
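The article does not detail how the DOE platform is built, but the general pattern it describes — joining an asset inventory against a product-lifecycle catalog such as Technopedia to flag software past its support date — might look roughly like this sketch. Every name, date and record here is hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Asset:
    host: str
    vendor: str
    product: str
    version: str

# Hypothetical lifecycle catalog keyed by (vendor, product, version),
# standing in for a commercial repository like Technopedia.
END_OF_SUPPORT = {
    ("Microsoft", "Windows", "XP"): date(2014, 4, 8),
    ("Adobe", "ColdFusion", "8"): date(2012, 7, 31),
}

def flag_end_of_life(inventory, today):
    """Return assets running software past its end-of-support date."""
    flagged = []
    for asset in inventory:
        eol = END_OF_SUPPORT.get((asset.vendor, asset.product, asset.version))
        if eol is not None and eol <= today:
            flagged.append(asset)
    return flagged

inventory = [Asset("hq-ws-042", "Microsoft", "Windows", "XP"),
             Asset("lab-srv-07", "Microsoft", "Windows", "8.1")]
print([a.host for a in flag_end_of_life(inventory, date(2014, 11, 18))])  # ['hq-ws-042']
```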
Lauderdale said the system has allowed DOE to move away from being reactive about vulnerabilities and instead take a more predictive approach to asset management.
“It’s going to allow us to predict what is going to occur and then move forward aggressively to prevent any kind of vulnerabilities to the network,” he said.
While Lauderdale said this system “does not solve all the cybersecurity issues that are out there,” the size of the team that created the tool and the time frame within which it was built have caught the attention of enterprises inside and outside the public sector. Lauderdale told FedScoop eight different federal agencies, as well as private companies both foreign and domestic, have contacted him about the platform. It was also part of a case study recently published by market research firm IDC.
“[Asset management] has a huge gap that is both in industry and government, and that’s going to help them close it,” Lauderdale said.
Yet even with the provided agility and amount of information that can now be uncovered, Lauderdale said it’s still up to people to make sense of the data the platform can unearth.
“You’ve got to be smarter than the data,” Lauderdale said. “The data is telling you something, but if you’re not smarter than the data, you’re not going to be able to read the tea leaves, whether it’s right, wrong or indifferent.”
Public and private research supports UAS commercialization

As the Federal Aviation Administration struggles to stay on schedule to meet Congress’ 2015 deadline to integrate small unmanned aircraft systems into the national airspace system, a range of other public and private organizations are moving on new UAS research programs with an eye toward commercialization.
When Michael Clemens, the assistant chief for Maryland’s Montgomery County Fire & Rescue Service, bought the first UAS for the department last year, he didn’t know the FAA prohibited flying it for nonrecreational use. In fact, it wasn’t until the device had been tested in open air several times that someone told him what he was doing might be illegal.
“We had the cart ahead of the horse here,” Clemens said Monday, speaking at the UAS Commercialization Industry Conference in Washington, D.C.
Clemens intended to use drone technology to provide an “eye in the sky” view of a burning building in order to determine where firefighters should concentrate their efforts. Those areas, or hotspots, could be identified through thermal imaging cameras mounted on a small quadcopter UAS.
“We didn’t know where to put our streams, so [without a drone] we had to get into a [neighboring] high-rise building to look down at the fire,” Clemens said. “Most of our information that we see comes from the ground level. We don’t really get to see above it. I want to know how big the fire is, where it’s going and what I need to do to mitigate it.”
Though Clemens wanted to continue using drone technology to fight fires, he said he understood the need for the FAA to go through the process of ensuring that the devices could be safely integrated.
“I think the FAA has a job to do,” Clemens said. “We really respect what they’re doing. The test flight centers are really important. We think there’s so much opportunity, and we haven’t even scratched the surface.”
Now, Clemens and his department are in the process of applying for a certificate of authorization, or COA, from the FAA, which would allow them to operate drones under specific conditions.
The COA would be granted under section 333 of the 2012 FAA Modernization and Reform Act, which allows the agency to grant exemptions to the non-recreational drone prohibition. Earlier this year, the FAA granted an exemption for BP to operate UAS to check pipelines in Alaska for leaks and other faults.
According to Marty Rogers, the director of the Alaska Center for Unmanned Aircraft Systems Integration at the University of Alaska Fairbanks, BP has operated some of its missions through the center.
Since the COA for BP was granted in early June, the company has used drone technology to inspect its infrastructure and create geologic models on Alaska’s North Slope. However, according to Rogers, the company’s aim is not to set the standard for UAS operation in the commercial space but rather to gather the information it needs in a cost-effective, safe way.
“[BP is] absolutely, without a doubt, system agnostic,” Rogers said. “This is absolutely about the right data at the right time to support their missions.”
In addition to helping BP fly its missions out of ACUASI, Rogers leads the FAA-approved test site based at the University of Alaska Fairbanks.
Through the test site, Rogers and his team use their fleet of more than 100 aircraft to fly UAS missions approximately 150 days out of the year. The missions focus mostly on research, including work with the National Oceanic and Atmospheric Administration to study polar bears and other endangered animals.
With UAS technology, Rogers said, the center can perform the “dull, dirty and dangerous” without putting lives on the line and for a relatively low cost.
“Because we’re able to fly so low, so quietly, we’re really able to get some great data” about wildlife and other difficult-to-reach places, Rogers said.
Science, research and commercial opportunities
In May, Rogers and his team traveled to Soldotna, Alaska, to help survey the Funny River Wildfire. After a few days of bureaucratic difficulties, the team flew several UAS missions over the fire in order to determine where the hottest areas were so that firefighters could concentrate efforts there.
“This was a commercial opportunity, but we were doing it as a science and research mission,” Rogers said. “We were really trying to refine our processes and get some actionable data.”
But academic research and pipeline observation are not the only applications for this technology. For example, NASA’s UAS-National Airspace project is looking at the development and testing of technologies that can make airspace integration a little more convenient for the FAA and drone operators, said Laurie Grindle, the program manager for the project.
Through a two-phase process scheduled to end in fiscal year 2016, the agency has started to evaluate various technologies that would be critical to integration, including sense-and-avoid systems that would allow drones and manned planes to detect each other and automatically redirect out of each other’s flight paths.
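To make the concept concrete — this is not NASA’s algorithm, just a hypothetical sketch with arbitrary numbers — a minimal sense-and-avoid check can project two tracks forward in time and flag a conflict when predicted separation falls below a threshold:

```python
import math

def predicted_separation_nm(pos_a, vel_a, pos_b, vel_b, minutes_ahead):
    """Straight-line projection: distance in nautical miles between two
    aircraft minutes_ahead from now, given positions (x, y) in NM and
    velocities (vx, vy) in NM per minute."""
    ax, ay = pos_a[0] + vel_a[0] * minutes_ahead, pos_a[1] + vel_a[1] * minutes_ahead
    bx, by = pos_b[0] + vel_b[0] * minutes_ahead, pos_b[1] + vel_b[1] * minutes_ahead
    return math.hypot(ax - bx, ay - by)

def conflict_predicted(pos_a, vel_a, pos_b, vel_b,
                       horizon_minutes=5, min_separation_nm=1.0):
    """Flag a conflict if separation drops below the threshold within the horizon."""
    return any(
        predicted_separation_nm(pos_a, vel_a, pos_b, vel_b, t) < min_separation_nm
        for t in range(horizon_minutes + 1)
    )

# A drone flying east and a small plane flying west toward it:
print(conflict_predicted((0, 0), (2, 0), (8, 0), (-2, 0)))  # True
```

A real system would also plan the avoidance maneuver; this sketch only covers the “sense” half.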
In the first phase of the project, Grindle said most of the testing was done in a completely virtual environment that shared content between the different NASA flight centers. Now during phase two, and continuing through the end of the project, the agency will do real-life flight tests to evaluate the quality of the integrated technology.
Yet from NASA to Montgomery County, Maryland, to Alaska, the drone operators said they had one thing in mind: commercialization. Although unmanned aircraft systems now are used for research purposes, the commercial possibilities of the technology are endless, Rogers said.
“We’re sort of going in some places that really haven’t been touched yet,” Rogers said. “We’re trying to help the commercial side. We don’t want to inhibit it at all.”
Patent office holds first cybersecurity partnership meeting
At a meeting between federal officials and members of the tech community, U.S. Patent and Trademark Office Deputy Director Michelle Lee emphasized the part her office can play in fighting a “multibillion-dollar crime wave” sweeping the globe.
“We at the USPTO embrace our role in helping cybersecurity suppliers and providers bring their products and services to the market quickly and efficiently,” said Lee, a former Google Inc. executive, in a televised message at the meeting.
She made the remarks Friday at what was the patent office’s first cybersecurity partnership meeting. Held in Silicon Valley, the meeting brought together patent office staffers and members of the tech industry to discuss protecting cybersecurity intellectual property and encouraging the use of cybersecurity tools.
The goal of the partnership, Lee said, is to ensure the office issues timely, high-quality patents. She also noted that some patented technologies could help protect the U.S. and other countries from cybercrime.
Among the topics on the event agenda were how Alice Corp. v. CLS Bank International, a recent Supreme Court case on what is patent eligible, applies to cybersecurity; deciding between seeking patent protection for cyber technology or keeping innovations as a trade secret; and computer/network security patent applications.
Presenters also discussed the voluntary Framework for Improving Critical Infrastructure Cybersecurity, which was released by the National Institute of Standards and Technology nine months ago as a result of an executive order. The guidelines were created with collaboration from private industry.
Like Lee, presenter Nestor Ramirez, director of USPTO’s Patent Technology Center 2400, noted that cyber weaknesses pose a major risk.
“The national and economic security of our nation relies on the function of our critical infrastructure,” Ramirez said. “Cybersecurity threats have the potential to destabilize our critical infrastructure.”
But he said the patent office was eager to examine strategies to better address the threat. Looking ahead, Ramirez told attendees that at the beginning of next year the patent office plans to host a roundtable for cybersecurity startups. The agency is also evaluating patent examiner training opportunities specific to emerging cybersecurity technologies and standards.
“We want to ensure that our examiners are kept up to date and abreast on the current state of the field so they have the knowledge and tools that they need to make sound patentability decisions,” he said.
Foreign hackers may still have access to VA networks, IG says
The Department of Veterans Affairs is still “actively monitoring” its networks for traces of foreign hackers who successfully infiltrated its computer systems in 2010, and officials acknowledge that “certain threat groups may still have access to VA systems using unauthorized user accounts,” according to the agency’s inspector general.
The attack, which made headlines in 2013 and has been attributed to state-sponsored hackers from at least eight different countries, led to an agencywide security effort that lasted more than a year, according to written testimony to be delivered Tuesday to the House Committee on Veterans Affairs by Sondra McCauley, the VA’s deputy assistant inspector general for audits and evaluations. FedScoop obtained a copy of the testimony in advance of Tuesday’s hearing.
Concerns about the continued presence of foreign hackers on VA networks come on the heels of a Government Accountability Office report released Monday that shows VA cybersecurity officials did not retain forensic evidence related to known network intrusions, including the 2010 nation-state-sponsored attack, and allowed critical vulnerabilities in two key Web applications to go uncorrected for as long as 18 months.
The GAO report and the IG’s testimony will be the basis of Tuesday’s scheduled hearing of the House Committee on Veterans’ Affairs. Lawmakers plan to grill VA Chief Information Officer Stephen Warren and VA Chief Information Security Officer Stan Lowe on the department’s longstanding cybersecurity gaps. Warren acknowledged to reporters Friday that VA has been notified by the IG that the agency’s IT security controls remain a material weakness for the 16th consecutive year. McCauley’s testimony, however, provides the first detailed glimpse of the issues that contributed to VA’s failing evaluation.
“The financial management system uses an unsupported database with several known critical vulnerabilities that cannot be updated with security patches,” McCauley’s testimony states. In addition to software patches not being deployed in a timely manner, the IG also discovered several VA organizations were sharing the same networks and data centers with organizations that were not under VA’s central control and “often had critical or high-level vulnerabilities that weakened the overall security posture of the VA sites.”
“We continue to identify significant technical weaknesses in databases, servers, and network devices that support transmitting sensitive information among VA Medical Centers, Data Centers, and VA Central Office,” McCauley’s written testimony states. “For FY 2014 we once again found deficiencies where control activities were not appropriately designed or operating effectively. It is particularly disconcerting that a significant number of vulnerabilities we identified at VA data centers are more than 5 years old.”
McCauley also plans to tell the committee Tuesday that VA faces new, emerging security challenges that the IG has not identified in previous audits, including the movement to cloud computing and the increasing threat posed by foreign nation-state hackers. According to the IG, VA entered into a contract last year to move more than 600,000 email users to a private cloud service. But the contract did not include a clause allowing the IG to access VA systems and data, effectively blocking the IG from conducting legal oversight and investigations.
The IG is also investigating multiple whistleblower reports to its hotline, including accusations that VA was hosting medical devices containing sensitive patient information “that are not effectively protected from unauthorized access,” as required by VA’s Medical Device Isolation Architecture. The office is separately looking into claims that VA misrepresented information in preparation for the fiscal year 2014 security audit.

Sources on Capitol Hill told FedScoop that lawmakers are running out of patience with VA’s inept handling of critical security incidents that are known to have compromised veterans’ data, including a “significant” attack that occurred in 2012 and involved government-backed hackers in China and possibly Russia. According to the GAO study, although the VA security operations center documented the actions it had taken to eradicate the foreign hacker threat, VA cybersecurity officials could not locate the forensics analysis report or other materials related to the incident.
“Officials explained that digital evidence was only maintained for 30 days due to storage space constraints. As a result, we could not determine the effectiveness of actions taken to address this incident,” the GAO report states. “In addition, VA has not yet addressed an underlying vulnerability that contributed to the intrusion,” GAO said. Although VA had planned to deploy a solution in February that would have corrected the weakness, it had not yet done so at the time of the GAO’s review. Auditors concluded VA’s networks remain vulnerable to similar incidents.
Meanwhile, a VA official who spoke to FedScoop on background said shortly after news broke of the nation-state hack into VA’s active directory domain controller, VA contracted with Mandiant to conduct a security audit. Mandiant, the company known for a 2013 report that documented the existence and activities of a massive Chinese government cyber espionage campaign, delivered a preliminary report to VA on Friday. The VA official said the report verifies the steps VA took in response to the attack and concludes the domain controller is no longer compromised.
As of May 2014, the 10 most prevalent critical security vulnerabilities at VA involved software patches that had not been applied, according to GAO. In some cases, these patches had been available for almost three years before being deployed. And due to multiple occurrences of each of the 10 missing patches, the total number of vulnerable systems ranged from 9,200 to 286,700, GAO said.
“At the end of our audit, VA officials told us they had implemented compensating controls, but did not provide sufficient detail for us to evaluate their effectiveness,” the GAO report stated. “Without applying patches or developing compensating controls, VA increases the risk that known vulnerabilities could be exploited, potentially exposing veterans’ information to unauthorized modification, disclosure, or loss.”
In a statement emailed to FedScoop, Warren said: “Veterans’ information is well protected because we put mitigating controls in places where we can best simultaneously protect Veterans’ information and not impede our ability to provide timely health care that they have earned and deserve.” Warren also said VA, like other large agencies, records a significant volume of threats, but VA’s “security posture is successfully keeping Veteran information safe, and as we believe that IT security is an evolving process, we’re always striving to improve.”

HHS launches new cohort of Entrepreneurs-In-Residence
The Department of Health and Human Services’ IDEA Lab is signing a one-year lease on a handful of talented entrepreneurs to help solve some of health care’s biggest issues.
IDEA Lab announced a new cohort for its Entrepreneurs-In-Residence program, which matches entrepreneurs from outside government with HHS employees to innovate on “high-risk, high-reward projects” crowdsourced from within the agency, according to a blog post. This latest group, four entrepreneurs with varying backgrounds in private sector innovation, was selected from the most talented applicant pool yet, the blog said.
During the 12-month period, the entrepreneurs will use agile and lean methodologies to address problems that might require unconventional thinking or lack the typical resources needed for a solution.
The third HHS Entrepreneurs-In-Residence class is:
- Danny Boice, the co-founder and CTO of conference call startup Speek, will help the Administration for Community Living explore how the elderly and disabled use technology and media to access services.
- Mark Scrimshire, co-founder of consumer-focused health care company HealthCa.mp, will team with the Centers for Medicare and Medicaid Services to help redesign its Blue Button initiative as a Data-as-a-Service platform for third-party applications.
- Paula Braun is a data scientist with Elder Research Inc. Braun, who started her career as a Presidential Management Fellow, will re-enter federal government for a year to help the Centers for Disease Control and Prevention create a next-generation Electronic Death Registration System.
- David Portnoy, co-founder and CTO of Symbiosis Health, will help the HHS Office of the Chief Information Officer and the HHS IDEA Lab build public-facing research database applications for the department’s massive amount of data.
Bryan Sivak, CTO at HHS and leader of the IDEA Lab, spoke highly of the EIR program recently at FedScoop’s annual FedTalks event. Though there are several innovative programs under the IDEA Lab umbrella, the CTO focused much of his keynote on the anecdotal success of HHS’ Entrepreneurs-In-Residence.
Limiting the entrepreneurs’ term to 12 months, he said, is vital to the program’s ability to bring rapid solutions to major problems.
“The 12-month thing is really important because with 12 months, by definition if you only have 12 months to solve the problem, you almost by definition have to do things differently,” Sivak said. “You cannot follow standard bureaucratic procedure.”
Sivak keyed in on an entrepreneur from the program’s first class, David Cartier, a 25-year veteran of UPS tasked with helping create an electronic tracking system for organ donations while at HHS’ Health Resources and Services Administration.
“They brought David in because he had all this experience with UPS with [radio-frequency identification] tagging and tracking and could help them do this,” Sivak said. But Cartier’s experience was mostly with “innovation practices and design thinking and human-centered design,” not medical procedure. So, Sivak said, he spent quite a bit of time in operating rooms and realized technicians, nurses and doctors found it cumbersome to handwrite between 30 and 70 labels for the organs transplanted each night. It not only leaves room for human error, like poor transcriptions, but it also takes quite a bit of time, Sivak said. “When you’re working with organs, every minute counts.”
His solution? A mobile printing and barcode system, as Sivak called it, to be used in operating rooms for the rapid tagging and dissemination of the organs.
When Cartier’s time was done and the solution’s pilot wrapped up, the systems were taken out of the operating rooms because the effort was meant to be no more than a test phase. The operating room users, though, demanded they be brought back, Sivak said. “And to me, that’s as good as you can get, right? The users of the system want to continue using it.”
OPM director defends preference in hiring vets
Responding to a question live-tweeted to her during a digital town hall Friday, Office of Personnel Management Director Katherine Archuleta didn’t miss a beat in defending the federal government’s push to hire veterans.
Celebrating her first year in the director’s role, Archuleta invited the public to ask her questions about the future of the federal workforce in an open forum hosted via Google Hangout. About halfway through, she received the tweeted question on veteran hiring:
#AmericasWorkforce – why is it so hard for civilians to get jobs in Fed Govt nowadays? Vets block all the positions civilians need jobs too
— The Devine MrsM (@DvineMrsM) November 14, 2014
But that’s a misconception, she said, before going on to defend veteran hiring.
“First of all, I’m going to say that I am a very, very strong proponent of veterans preference,” Archuleta said. “I believe that the men and women who serve in our military and come home need to have an opportunity to continue their service.”
Archuleta further vouched for the returning soldiers, saying hiring them gives the federal government a chance to leverage “those skills, that experience, how do you organize, how do you schedule, how do you develop the strategies for implementing,” which she said are “skills we need in federal government.”
But the woman who tweeted the question might have some ground to stand on, thanks to a study released in August by the U.S. Merit Systems Protection Board. MSPB, an independent executive branch agency that serves as a guardian of the federal merit system, found that there might be some undue preference for hiring veterans in the federal government, or at least that’s how some of the federal workforce perceives it.
“In an MSPB survey, 6.5 percent of respondents indicated that they had observed inappropriate favoritism towards veterans while 4.5 percent reported observing a knowing violation of veterans’ preference rights,” MSPB reported. “The survey data showed that employees are less likely to be engaged and more likely to want to leave their agencies if they report having observed either of these two types of conduct.”
OPM has several vet-friendly hiring initiatives, and recently the office announced a new STEM focus under its Vets to Feds Career Development Program. This addition will be the program’s fourth since 2011.
Later in the discussion, Archuleta continued her stand for veterans and specifically the women who leave the military after serving. She said this is one of her major focuses as vice chairwoman of the Veterans Employment Council, an honor she shared with Department of Veterans Affairs Secretary Bob McDonald.
“Together, we’re focused in on how we can bring more women veterans into the federal workforce,” she said. “The skills that women veterans can bring to us and to the federal service is really important, and I want to bring more.”
The director addressed another group underrepresented in federal government, millennials, during her town hall. While the future of the federal workforce won’t get the preferential treatment veterans do, Archuleta said OPM is at work making sure those young minds consider the federal workforce.
“We’re going where they’re at,” she said. “We’re using social [media] in a way that we’ve never used it before.” OPM is using social applications like Twitter and LinkedIn, as well as plain language, humor and graphics, hoping to reach them.
Additionally, Archuleta said there’s a working plan to revamp USAJobs.gov. Right now, the agency is using Lean Six Sigma to audit the application process, make it more meaningful and reduce the time it takes to hire new talent.