Turner Roth: 18F hears IG’s concerns but isn’t losing long-term vision
The General Services Administration digital services team 18F will take the advice of a recent inspector general report to re-evaluate its balance of billable and non-billable work, GSA Administrator Denise Turner Roth told FedScoop Thursday. But that doesn’t mean 18F will lose focus on its long-term mission to transform the way the federal government buys and builds technology, she said.
“18F in its short existence has showed how it can help the federal government and federal agencies, and how important its mission is,” Turner Roth said. “But certainly, like other startups, it’s not perfect. It has room to grow and areas to work on.”
The GSA IG issued a report — sparked by the concerns of a senior GSA official — in late October that called 18F out for continually failing to recover its costs, losing $31.7 million since it was founded in 2014.
[Read more: GSA IG: 18F’s financial projections need a rework]
The report specifically pointed at three factors contributing to that: “18F management’s established pattern of overestimating revenue projections, increased staffing levels, and staff time spent on non-billable activities.”
Even before the report came out, GSA had been trying to address some of those issues, and these findings from the watchdog only reinforce those efforts.
“As we were looking to create the new Technology Transformation Service and develop some of the infrastructure for 18F to be a longer term effort that was going to be in place, we needed to look at all aspects of it,” Turner Roth said. “Some of these things we had started to evaluate, and then some we will go back and make sure as we get the input from IG that we are accounting for it.”
But since the report hit, there’s been a lot of focus particularly on the split between billable and non-billable work; the IG found the team spent about 52 percent of its time on non-billable activities.
“It’s an important discussion and it’s absolutely the space where we’re going to spend time, and where we have been spending time to focus on what is the right mix,” she said. “I think the report fairly placed out that, because of the amount of time some staff has spent on non-billable work, that that has impacted the bottom line.”
In many ways, that discussion feeds into a larger one about the greater purpose of 18F — on the surface the organization appears to exist to help agencies build one-off software projects and make sure revenues are in the black at the end of the year.
But 18F is pursuing something bigger, Turner Roth said, and that’s where the non-billable work fits in: developing tools that other agencies can use to modernize their services, a model that isn’t immediately cost-recoverable or measurable.
“Non-billable work is where we do the development of cloud.gov; it’s where we create the design standards for how we go to the web,” she said.
That non-billable work is important, the GSA administrator told FedScoop, and just because 18F is millions in the red doesn’t mean that work is going away.
“There is a mixture that is the right one, but I think that the overall point is that we need this organization to be cost-recoverable, but we do have a demand that will get us there,” she said, noting 18F’s 69 percent revenue increase between fiscal years 2015 and 2016, even as the gap between revenue and expenses grew faster over the same period. “We just need to get the right mix of billable and non-billable in the focus of our work and attention.”
Through this IG report, and another two that may follow, 18F is learning where it fits as an innovative startup in the federal government.
The reports “are feeding into how we look at the transition and growth of the next iteration of 18F” — one that comes under the direction of new Technology Transformation Service Commissioner Rob Cook, who Turner Roth said has “worked with a sort of startup organization in a larger organization — Pixar within Disney — and he’s produced a financially viable effort.”
Even startups worth billions in the private sector — think Twitter, Snapchat, Amazon, Box and many more on a long list — have lost hundreds of millions of dollars in a single quarter. But the federal government isn’t a private business, and 18F has the taxpaying public, and its overseers on Capitol Hill, to answer to.
“I think what’s important to recognize is that we have a very true commitment in the work of 18F to break even,” Turner Roth said. “There certainly is an investment in the front-end of projects, and it certainly is true here as well, to have a start up to it in terms of its initial up-front costs to develop the organization, the entity itself, and then to get it to a full cost-recoverable position. So we’re going through that cycle.”
“It may just be because 18F is a unique aspect to government,” she said. “Until now, we haven’t had something that looks like this entity and the type of work that’s being delivered to modernize the technology footprint.”
But as the IG report notes anecdotally, many of GSA’s programs that draw from the same fund as 18F (the Acquisition Services Fund) spend years in the red before they become sustainable.
Turner Roth and her deputy, Adam Neufeld, told inspectors that “many ASF-funded GSA programs are not projected to break even in FY 2017 – with some not projected to break even in any of the next five years – and that GSA is in a market with many uncertainties, which are magnified in new investments like 18F,” according to the report.
Faced with such financial challenges, though, 18F remains steady in its pursuit of its bigger-picture purpose, Turner Roth said.
“The long-term value of 18F overall is its role in the modernization of technology in the federal government,” she said. “We’re going to be the interface between how we are introducing technology into the federal government, we’re going to be the interface of how we’re considering policy and how we’re making a path for private sector to federal government to bring forth these new technologies.”
“What I go back to is ‘Is this a usable, helpful, necessary mission?’” Turner Roth continued. “And I think the answer to that is fully yes.”
OMB launches Code.gov repository for open source projects
The Obama administration on Thursday launched Code.gov, a new repository for government open source code that debuts with nearly 50 open source projects from more than 10 agencies.
Coders can expect to see more projects on the site in the coming months as agencies implement the recently released Federal Source Code Policy, U.S. CIO Tony Scott said in a blog post announcing the launch.
The Federal Source Code Policy seeks to get agencies to release more of their custom-developed software. The policy notably establishes a pilot program requiring agencies to release at least 20 percent of new custom-developed code as open source software.
Code.gov is not just a repository, however, but also a resource for agencies to use when implementing the policy, Scott wrote. For example, Code.gov features a metadata schema for agencies to use when building out their enterprise code inventories and includes information on how to build successful open source projects.
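For agencies, that inventory work boils down to publishing a structured JSON file. The Python sketch below shows roughly what one inventory entry might look like; it is loosely modeled on the code.gov metadata schema, and the project name, contact and repository shown are hypothetical placeholders.
```python
import json

# Rough sketch of a code inventory entry, loosely modeled on the code.gov
# metadata schema. The project details below are hypothetical; agencies
# should follow the schema published on Code.gov itself.
inventory = {
    "agency": "GSA",
    "version": "1.0.1",
    "projects": [
        {
            "name": "example-widget",  # hypothetical project name
            "description": "A reusable form widget for agency websites.",
            "license": "https://creativecommons.org/publicdomain/zero/1.0/",
            "openSourceProject": 1,    # 1 = released as open source
            "governmentWideReuseProject": 1,
            "tags": ["web", "forms"],
            "contact": {"email": "code@example.gov"},
            "repository": "https://github.com/example/example-widget",
        }
    ],
}

# Agencies typically publish the result at agency.gov/code.json so the
# Code.gov harvester can pick it up.
print(json.dumps(inventory, indent=2))
```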
[Read more: OMB finalizes policy on open source code from agencies]
“We’re excited about today’s launch, and envision Code.gov becoming yet another creative platform that gives citizens the ability to participate in making government services more effective, accessible, and transparent,” Scott wrote in the post. “We also envision it becoming a useful resource for state and local governments and developers looking to tap into the Government’s code to build similar services, foster new connections with their users and help us continue to realize the President’s vision for a 21st Century digital government.”
Earlier this week, Department of Homeland Security CTO Michael Hermus noted that government will need to work on building community around its open source work.
“Without that community you can’t just stick it out there and hope magic happens — you have to govern it and manage and harness that community out there,” he said.
The policy is great for promoting open source development, Hermus said Wednesday at the 2016 Red Hat Government Symposium, noting “this is a sign of the fact that the current administration and the current community is really pushing forward in this direction.”
But he also acknowledged, “we still have a lot of work in implementing and adopting this and figuring out how it all works.”
[Read more: Open source policy holds promise, but agencies must do more — experts]
Scott in his blog post cites several new open source digital tools that have sprung up during the Obama administration, including Vote.gov, Vets.gov, the citizen petition site We the People, the White House Facebook bot and Data.gov.
“The code for these platforms is, after all, the People’s Code – and today we’re excited to announce that it’ll be accessible from one place, Code.gov, for the American people to explore, improve, and innovate,” he wrote.
International dark web FBI sting launched ‘to send a message’
A new global police operation against dark web marketplace users identified thousands of individuals last month, signaling unprecedented international cooperation between dozens of major law enforcement agencies across three continents. The operation was designed to spotlight law enforcement’s long arm extending into the digital underground.
“This is to send a message,” an FBI spokesperson told CyberScoop.
At the same time, new alternatives to bitcoin are launching and quickly gaining steam by promising untraceable transactions that can open up both a new tier of financial privacy and novel challenges for law enforcement and intelligence agencies across borders. One glance at the major dark web markets shows that bitcoin remains king. But ascendant and powerful cryptographic technology foreshadows an imminent new chapter to this fight.
Beginning late last month, a global law enforcement operation dubbed Operation Hyperion identified thousands of dark web market users across three continents. In the U.S., the FBI announced it made contact with over 150 U.S. residents suspected of buying illegal items on dark web markets. In New Zealand, over 160 people were identified and contacted by police “with more police visits to come,” according to the country’s authorities. Sweden claimed 3,000 identifications.
Hyperion is an ongoing global operation, according to the FBI, with more information to come.
Police in the Netherlands announced impending criminal prosecutions and even launched a slick website announcing police attention was focused on the markets. The website listed active, identified and arrested individuals.
The new sweep of international identifications and arrests, and the creation of a flashy website on the dark web, were meant to grab attention and make headlines this week. The FBI in particular hoped to echo and amplify the message sent by Director James Comey in 2015 when he told Congress that criminals who use Tor to hide from the FBI are “kidding themselves.”
Instead, repeated American election controversies involving the FBI stole the spotlight so that almost nothing else out of the bureau has received any media attention in recent days. That’s how a press-savvy international police operation involving hundreds of investigations went largely unnoticed when it was first made public earlier this week.
Three Operation Hyperion arrests were announced this week. One man was arrested on Oct. 10 for allegedly selling illegal goods on dark web markets. Two more men were arrested during the week, accused of laundering money to aid illegal transactions on dark web markets. Canada announced at least one arrest for narcotics distribution.
However, it remains almost completely unclear how many of the thousands of previously anonymous and now apparently identified individuals will face criminal charges.
This appears to be the biggest police operation yet directed at dark web market customers. The usual targets of such major operations are the vendors and market owners who sit enthroned atop the digital underground.
It’s not clear yet exactly how many arrests took place internationally as a result of this operation, nor is the full scope of the operation known. The FBI identified numerous customers who then confessed to ordering illegal substances, such as heroin, cocaine, morphine and ketamine, on dark net marketplaces.
But without specifics, it’s worth treating vague claims of a vast operation with at least some skepticism. Law enforcement has a history of eye-catching theatrics and exaggerated successes in dark net operations.
There’s no information regarding exactly how law enforcement identified targets. In the past, police have employed a wide range of tactics including mail interception, hacking and tracking bitcoin transactions. It’s that last investigative method that’s fueling a privacy arms race.
Bitcoin, the currency du jour of the dark web, is transparent by design: every transaction shows up on the blockchain, a decentralized public ledger. That transparency can be tricky for any individual or business; for an illegal drug operation, the blockchain is even more treacherous.
For the last several years, “tumblers” and “mixers” have been employed to launder currency and obfuscate bitcoin transactions. The goal is to make it more difficult for law enforcement and intelligence agencies to trace coins, and thereby identify users and transactions.
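To see why the public ledger is so dangerous for illicit commerce, and why mixing is the countermeasure, consider a toy model. The Python sketch below is an illustration only, not Bitcoin’s actual transaction format: it shows that anyone can follow the flow of funds from address to address using nothing but public data, even through a naive mixer.
```python
from collections import defaultdict

# Toy public ledger: (sender, receiver, amount). In this sketch a buyer
# routes coins through a mixer before paying a vendor.
ledger = [
    ("addr_buyer", "addr_mixer", 1.0),
    ("addr_mixer", "addr_a", 0.4),   # mixer fans payments out...
    ("addr_mixer", "addr_b", 0.6),   # ...to unrelated-looking addresses
    ("addr_a", "addr_vendor", 0.4),
    ("addr_b", "addr_vendor", 0.6),
]

def downstream(start):
    """Every address reachable from `start` by following public payments."""
    edges = defaultdict(list)
    for src, dst, _ in ledger:
        edges[src].append(dst)
    seen, stack = set(), [start]
    while stack:
        for nxt in edges[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# The path from buyer to vendor is recoverable from public data alone,
# which is why naive mixing often fails against determined investigators.
print(downstream("addr_buyer"))
```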
The tumblers often don’t work as well as advertised, according to Kathryn Haun, an assistant U.S. attorney in San Francisco. The services “were not that great, did not tumble or mix as well as advertised,” she said in a recent Forbes interview, and “some of the time” allowed agents to “unscramble” transactions.
Monero, a two-year-old privacy-focused currency with automatic mixing features, boasts adoption by AlphaBay, the largest dark web market currently in existence, and a $63 million market cap. Monero’s price and trade volume spiked when AlphaBay adopted it in August, but both metrics have precipitously declined since then.
While Monero has the head start in actual adoption, another rival cryptocurrency is grabbing even more attention lately. Zcash, a currency backed by a parade of high-profile scientists, engineers and investors, touted a major technical leap forward last week. Using a technique known as a zero-knowledge proof, the currency allows the sender, receiver and details of every transaction to remain secret while maintaining trustworthiness for everyone in the network.
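Zcash’s zk-SNARKs are far too involved for a short example, but the core idea of a zero-knowledge proof can be illustrated with the much older Schnorr protocol, in which a prover convinces a verifier it knows a secret without disclosing it. The Python below is a toy sketch with deliberately small, insecure parameters; it is not how Zcash works, only a taste of the underlying concept.
```python
import secrets

# Toy Schnorr proof of knowledge: prove you know x with y = g^x mod p,
# without revealing x. Parameters are tiny demo values, not secure choices.
p = 2**61 - 1   # a Mersenne prime, far too small for real use
g = 3

x = secrets.randbelow(p - 1)   # the prover's secret
y = pow(g, x, p)               # the public value

# Prover commits to a random nonce.
r = secrets.randbelow(p - 1)
t = pow(g, r, p)

# Verifier issues a random challenge.
c = secrets.randbelow(p - 1)

# Prover responds; s reveals nothing about x because r is uniformly random.
s = (r + c * x) % (p - 1)

# Verifier checks g^s == t * y^c (mod p) and learns only that the prover
# knows x, never x itself.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted without revealing the secret")
```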
“If you want private transactions, it’s so much better than any bitcoin offering,” former Bitcoin Foundation head and security researcher Peter Vessenes told CyberScoop. “You want real cryptographic privacy, Zcash is just really excellent. Monero had slight innovations on bitcoin. Zcash is really a giant leap forward.”
Chris Bing contributed to this report.
DOD, watchdog spar over Joint Information Environment assessment
Defense Department CIO Terry Halvorsen pushed back Tuesday on a recent congressional watchdog assessment of department efforts to create a Joint Information Environment, claiming the report doesn’t fully grasp what the effort is trying to accomplish.
But a Government Accountability Office official told FedScoop its recommendations to develop better program management controls for JIE still stand in spite of Halvorsen’s qualms.
A GAO report published in July and reissued last week recommends the department better define the JIE’s scope and develop a reliable JIE cost estimate and baseline, schedule, and schedule management plan.
The department has tried to tell GAO the JIE is a concept, not a program, and should be evaluated by its components rather than as a whole, Halvorsen told reporters, noting he could “very clearly demonstrate to them and measure the components like [Joint Regional Security Stacks].”
“This is some education and I’ll take the hit,” he said at a media roundtable at the Pentagon. “Obviously we did not do a good enough job in educating GAO about the difference of what JIE was as a concept.”
Michael Holland, assistant director for information technology acquisition management issues at the Government Accountability Office, told FedScoop the GAO “had a lot of discussions with the DOD CIO and others about what JIE is and isn’t.”
“We understand that DOD does not consider JIE to be a program. DOD has very specific definitions of what a program is and what a program of record is, and we’re not questioning their decision, for example, to not consider JIE a program of record,” Holland said. “But in our assessment, the JIE does meet the standard definition of a program as described by the Project Management Institute.”
He added that implementing the GAO’s recommendations would improve JIE “management and accountability.”
Halvorsen said Tuesday, “I think the piece that is really hard for, if you’re not in this business, to understand — if you read the GAO report it really talked a lot in acquisition like well why isn’t JIE a single program? It can’t be a single program. And frankly if you made it a single program it would fail.”
Still, Holland said there should be better management practices around the effort.
“You know managing JIE consistent with these program management practices doesn’t suggest that JIE is a single monolithic entity,” he said. “Programs can include a variety of related projects and activities that can each include definite beginning and end dates that can all be managed under that larger program.”
The report also cites recommendations made by DOD’s own Office of the Director, Operational Test and Evaluation to improve the way leadership oversees JIE cost and schedule, Holland noted.
“The Director of Operational Test and Evaluation has also made similar recommendations in the area of cost and schedule management for JIE,” he said. “So we’re not the only ones saying this.”
When JIE was first launched, the department was not clear enough that it is not a program, Halvorsen maintained. “It’s a concept that says I want to get to this environment.”
“If you tried to do JIE as one of our standard programs my own suspicion is what you’d end up with is something that was out of date the moment you fielded it,” he said. “So you have to do, in my mind, and this is what industry tells me to do too so this isn’t just me… what we’ve done with JRSS and [Mission Partner Environment] is field it using industry best practices.”
The report also provides recommendations specific to the JRSS, Holland pointed out. For instance, it says officials should develop a strategy and schedule to transition JRSS to the Risk Management Framework, and develop a reliable cost estimate and baseline for the effort. The JRSS seeks to replace about 1,000 legacy network security stacks with 48 standardized stacks at 25 locations, according to the GAO.
Halvorsen said the department is still testing and evaluating JRSS.
“I think they were looking for a different acquisition answer that doesn’t exist,” he said.
GPO gets new acting CIO
The Government Publishing Office named Tracee Boxley as the agency’s new acting CIO Thursday.
“Tracee’s institutional knowledge of GPO and IT systems makes her a natural asset in overseeing the agency’s IT operations,” GPO Director Davita Vance-Cooks said. “I look forward to working with her as GPO continues to transform and expand our IT technologies and services.”
The agency’s last CIO, Chuck Riddle, has moved to the Securities and Exchange Commission, according to the announcement.
The GPO said in an Instagram post that Riddle will be the SEC’s new chief technology officer.
Boxley came to the GPO in 2006 and was promoted to deputy CIO in 2012. In her new role, Boxley will develop and maintain the agency’s operating budgets for network IT systems, and manage IT policy and standards “relating to the acquisition, implementation, and operation of telecommunications and IT systems throughout the agency,” according to the announcement.
Boxley has decades of federal IT experience, according to the announcement, including working as the chief of the American Housing Survey Division at the Census Bureau, where she administered the program for the software data center for the 2000 Census. She also served as the deputy CIO and chief of the Technical Services Division at the Agriculture Department’s Food Nutrition Service. Her IT career began in the Office of the Secretary of the Army, where she served as director of IT operations.
“She also worked with the U.S. Air Force CIO in developing a portal for employees’ personnel and career data,” according to the announcement.
Obama administration launches electric vehicle charging network
The Obama administration designated 48 interstates Thursday as electric vehicle charging corridors, laying the foundation for a vehicle-charging infrastructure that will span nearly 25,000 miles across 35 states.
The electric vehicle corridors will be one part of a 55-route network for many alternative fuel sources covering 85,000 miles, according to the Transportation Department. As part of the White House’s and Transportation Department’s announcements, 24 state and local governments also said they would work with the federal government to increase procurement of electric vehicles for their fleets.
And “28 states, utilities, vehicle manufacturers, and charging organizations are committing to accelerate the deployment of electric vehicle charging infrastructure on the DOT’s corridors,” the announcement reads.
The announcements follow the administration’s recent pushes to embrace emerging automotive technologies. In September, for example, the administration released policy on autonomous vehicles.
[Read more: Administration asserts role in regulating autonomous vehicles]
“The Obama Administration is committed to taking responsible steps to combat climate change, increase access to clean energy technologies, and reduce our dependence on oil,” the announcement reads. “Already, in the past eight years the number of plug-in electric vehicle models has increased from one to more than 20, battery costs have decreased 70 percent, and we have increased the number of electric vehicle charging stations from less than 500 in 2008 to more than 16,000 today – a 40 fold increase.”
“But there is more work to do,” it notes.
The procurement commitments include the addition of more than 2,500 new electric vehicles in 2017 alone and “help pave a path for a sustained level of purchases into the future,” according to the announcement. It also notes that if government at all levels works together it can aggregate demand to lower purchase costs and expand the national electric vehicle infrastructure.
The more than $1 million in commitments, the administration says, could potentially save more than 1.2 million gallons of fuel annually.
The announcement also notes, “38 new businesses, non-profits, universities, and utilities are signing on to [The Department of Energy’s] Workplace Charging Challenge and committing to provide EV charging access for their workforce.”
The electric vehicle charging corridors total almost 25,000 miles, and drivers can expect existing or planned stations every 50 miles.
To make it easier for drivers to find the stations, the DOT has a “sign ready” system where states can be authorized to use Federal Highway Administration-developed signs.
“Alternative fuels and electric vehicles will play an integral part in the future of America’s transportation system,” Transportation Secretary Anthony Foxx said in a statement. “We have a duty to help drivers identify routes that will help them refuel and recharge those vehicles and designating these corridors on our highways is a first step.”
Federal Highway Administrator Gregory Nadeau said “identifying where alternative fueling stations can be found will help the public in many ways.”
In July Foxx asked states to nominate national electric vehicle charging and hydrogen, propane and natural gas alternative fueling corridors along major highways.
But this first round is just the beginning, Nadeau noted.
“This initial designation sets the stage for the next round of nominations early next year and begins a conversation with stakeholders about developing and implementing a vision to enable coast to coast travel using alternative fuels,” he said.
In the future, the corridors will be expanded to accommodate alternative fuels, including more electric vehicle charging stations, according to the DOT announcement.
The announcement also notes the Energy Department “is conducting two studies to evaluate the optimal national electric vehicle charging deployment scenarios, including along DOT’s designated fueling corridors.”
NIST out with cyber workforce ‘dictionary’
Federal scientists Wednesday published a draft “dictionary” aiming to help businesses figure out whom they should hire, with the guide describing every kind of cybersecurity job and cataloging the knowledge and skills needed to do them well.
The draft framework says it aims to provide American companies with a “common, consistent lexicon to categorize and describe cybersecurity work.” Additionally, it will be used by every federal agency to catalog the U.S. government’s own cyber workforce by the end of next year.
The framework is a project of the National Institute of Standards and Technology-led National Initiative for Cybersecurity Education and was rolled out at NICE’s annual conference in Kansas this week.
At the heart of the framework are 51 “work roles” — or more succinctly, jobs.
In a statement, NIST said that in a “nascent and rapidly developing field … job titles and role descriptions vary from organization to organization and sector to sector.”
From SP-RM-001, “authorizing official,” to IN-FO-002, “cyber defense forensics analyst,” and including SP-ARC-002 “security architect” and OV-ED-001 “cyber instructional curriculum developer,” the work roles cover every conceivable aspect of cybersecurity.
Each job comprises a number of tasks — nearly 1,000 are catalogued in the framework, from T0010, “Analyze organization’s cyber defense policies and configurations and evaluate compliance with regulations and organizational directives,” to T0928, “Collaborate with key stakeholders to establish a cybersecurity risk management program.”
Each work-role/job also requires a set of knowledge, skills and abilities — and the framework catalogues nearly 100 abilities, more than 300 skills and almost 600 items of knowledge.
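The framework’s building blocks compose naturally: a work role is a bundle of task IDs and knowledge, skill and ability (KSA) IDs. The Python sketch below illustrates that shape using identifiers quoted from the draft (SP-RM-001, T0010); the class layout and the role-to-task pairing shown are illustrative, not a structure NIST publishes.
```python
from dataclasses import dataclass, field

# Illustrative model of the framework's shape: work roles bundle task IDs
# and KSA IDs. The identifiers come from the draft framework, but the
# pairing of this role with this task is shown only as an example.
@dataclass
class WorkRole:
    role_id: str
    title: str
    task_ids: list = field(default_factory=list)  # T-numbered tasks
    ksa_ids: list = field(default_factory=list)   # knowledge/skill/ability IDs

tasks = {
    "T0010": "Analyze organization's cyber defense policies and "
             "configurations and evaluate compliance with regulations "
             "and organizational directives.",
}

role = WorkRole(
    role_id="SP-RM-001",
    title="Authorizing Official",
    task_ids=["T0010"],
    ksa_ids=["K0001", "S0001", "A0001"],  # hypothetical KSA identifiers
)

for tid in role.task_ids:
    print(f"{role.role_id} ({role.title}): {tid} -> {tasks[tid]}")
```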
The framework lays out seven high-level categories of activity, each breaking down into multiple “specialty areas,” which total more than 30.
The seven categories are:
- Securely Provision — Designing and building secure IT systems
- Operate and Maintain — Providing the support, administration, and maintenance of IT systems
- Oversee and Govern — Providing “leadership, management, direction, or development and advocacy” for cybersecurity
- Protect and Defend — Identifying, analyzing, and mitigating threats
- Analyze — Reviewing and evaluating incoming cybersecurity information
- Collect and Operate — Conducting denial and deception operations and collecting cybersecurity information
- Investigate — Investigating cybersecurity events or crimes and collecting digital evidence
The framework, which NIST says “is the culmination of many years of collaboration between industry, government and academia,” comes the day after the launch of a workforce mapping tool.
The Pentagon and the Department of Homeland Security “were significant contributors,” NIST said.
The guide is open for public comment until Jan. 6.
Open source exploding, generating benefits in new areas
The federal government has experienced an explosion in adoption of open source applications and systems in recent years, and the benefits are extending beyond the obvious efficiencies and savings to areas like public trust and IT security, officials and experts said Wednesday at the 2016 Red Hat Government Symposium.
“I remember the day 20 years ago when something open source came across our desk, we thought aliens had landed from Mars,” Mark Bohannon, vice president of corporate affairs and public policy for Red Hat, said during a panel at the symposium. “I think we’re way, obviously, beyond that now.”
Ten years after that, Bohannon explained, “we were trying to explain it’s OK to use it. Today, I think it’s about how to use it. How can it help you, how are we implementing it? I think we’re in a much different chapter these days.”
More recently, that’s been propelled by the administration’s Federal Source Code Policy, published in August, which requires agencies to explore existing solutions used by agency partners or other commercial off-the-shelf solutions before procuring custom software code. It also launched a pilot requiring agencies to release at least 20 percent of their custom-developed code as open source over the next three years.
[Read more: OMB finalizes policy on open source code from agencies]
“This policy seeks to address these challenges by ensuring that new custom-developed Federal source code be made broadly available for reuse across the Federal Government,” the policy says. “This is consistent with the Digital Government Strategy’s ‘Shared Platform’ approach, which enables Federal employees to work together—both within and across agencies—to reduce costs, streamline development, apply uniform standards, and ensure consistency in creating and delivering information. Enhanced reuse of custom-developed code across the Federal Government can have significant benefits for American taxpayers, including decreasing duplicative costs for the same code and reducing Federal vendor lock-in.”
David Bray, CIO of the Federal Communications Commission, an early open source adopter, said the federal government shouldn’t be in the business of coding, for the most part.
“For most things that we do, we should not be coding,” Bray said. “We are not in competition. I understand that code is secret sauce, and that makes sense.”
Customizing code, he said, “is all good and well until five or six years from now, and then you’ve got to go back and you have to change what you’ve made, maybe something seems broken, maybe a new patch has come out and it broke something — that’s currently the state we’re in right now.”
Rather than purely coding, Bray said, agency IT should be using APIs to customize open source or commercial code that’s already available — “stitch together pieces of quilt as opposed to build pieces of quilt yourself.”
Bray also pointed to added trust that agencies can gain from users by using open source code.
The FCC was wildly successful in the launch of its broadband speed test app, he argued, because its open source code showed the app didn’t collect unnecessary user information, so users weren’t worried about privacy issues.
“By making it open source, those who’d go on the GitHub who wanted to could see that by design we weren’t capturing your IP address, and by design we didn’t know who you were within a 5-mile radius,” Bray said. “And as a result, we got public trust and it was the fourth-most-downloaded app, right behind Google Chrome.”
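The actual speed test code is public on GitHub, but the technique Bray describes, collecting measurements without pinpointing users, can be sketched generically. The Python below shows one common approach, snapping coordinates to a coarse grid before they are stored; it is an illustration of the idea, not the FCC app’s implementation.
```python
# Illustrative privacy-by-design step: snap reported coordinates to a coarse
# grid so no measurement can place a user more precisely than a few miles.
# This is a generic sketch, not the FCC speed test app's actual code.

GRID_DEGREES = 0.1  # about 7 miles of latitude per grid step

def coarsen(lat: float, lon: float) -> tuple:
    """Round coordinates down to the corner of their grid cell."""
    def snap(v: float) -> float:
        return round(v - (v % GRID_DEGREES), 4)
    return snap(lat), snap(lon)

# Two users about a mile apart land in the same cell, so the stored data
# cannot tell them apart.
print(coarsen(38.8977, -77.0365))  # (38.8, -77.1)
print(coarsen(38.8895, -77.0353))  # (38.8, -77.1)
```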
There’s already a huge support base for open source in the federal government, he continued. “The bigger conversation is how you can use open source to actually get trust, because you’re now exposing what your code or algorithm is doing, what is being done with the data.”
“How many of you would be willing to share data on air quality, water quality, transportation quality if it would make your community safer if you knew that the data was kept private and anonymous?” he proposed to the crowd at the symposium, produced by FedScoop. “A way you can do that is by making open source what’s being done with the data and the algorithm, and I think that’s the real value of open source we’re just beginning to scratch the surface on.”
Meanwhile, many decry open source code as insecure because it’s open to the eyes of anyone. But the opposite has proven true as more organizations embrace open source.
“For a long time people thought negatively about open source, kind of like it was Wikipedia … because anybody could edit it and you didn’t know what people may have done,” said Curtis Yanko, director of partner enablement at Sonatype.
However, the more eyeballs that are on that code, the more secure it is, Department of Homeland Security CTO Michael Hermus argued.
Paul Smith, Red Hat senior vice president and general manager, agreed, calling open source “the foundation for choice and security.”
Open source policy holds promise, but agencies must do more — experts
The federal IT community by and large recognizes the value of open source code, but there is still work to be done to implement the practice in government and realize its potential, experts said Wednesday.
“I don’t have to stand up on stage anymore and tout the benefits of open source as a development model. It’s a given,” Paul Smith, senior vice president and general manager for Red Hat North America Public Sector, said at the 2016 Red Hat Government Symposium, produced by FedScoop.
This increasing acknowledgement of the value of open source can be seen in new Office of Management and Budget policy finalized this year, Department of Homeland Security CTO Michael Hermus said. The policy establishes a pilot program where agencies must release at least 20 percent of their newly developed custom code as open source.
[Read more: OMB finalizes policy on open source code from agencies]
The policy is great, Hermus said, noting “this is a sign of the fact that the current administration and the current community is really pushing forward in this direction.”
But he also acknowledged, “we still have a lot of work in implementing and adopting this and figuring out how it all works.”
In particular, Hermus noted, “one of the things I think we’re concerned about is saying ‘just open source 20 percent of your code’ is not a good idea right? It really needs to be much more thoughtful than that.”
“Percentage of code” is not really a good metric, he said, recommending “people… take a look at systems or even modules or libraries of usable functionality that add value.”
He said government is also going to have to work on building the community around its open source work.
“Without that community you can’t just stick it out there and hope magic happens — you have to govern it and manage and harness that community out there,” he said.
Tim Yeaton, senior vice president of Red Hat’s infrastructure business group, also noted earlier in the conference that open source communities are crucial to driving innovation.
“The innovation that happens in software today now is almost entirely born out of these open collaborative upstream communities,” he said.
Yeaton said agencies or companies trying to adopt agile methodology in particular benefit from using open source code others have shared before them. There are more than a million open source projects in the world today, Yeaton said.
“If you’re trying to prove out a concept in six weeks and you’re inventing all of that code from scratch — it’s kind of hard to do that,” he said. “What you find a lot of customers doing is they’re actually using some of those million open source components to actually do that prototyping. So open source isn’t just driving innovation… it’s changing the development model.”
Similarly, Hermus said one of the things his agency’s mission needs from open source is “improved time to market of those mission capabilities.”
“We need those guys out there on the frontlines to get the things they need to do their job as quickly as possible, not to get hung up in four years of planning and then procurement,” Hermus said.
He also noted that needs are going to change over time, so technology has to be agile enough to pivot when necessary.
“The open source ecosystem really provides a foundation there,” he said.
DHS will report on mobile threats to federal IT
The Department of Homeland Security will present its report to Congress on cybersecurity threats to mobile devices Dec. 16, the official in charge said Wednesday.
Vincent Sritapan, from the DHS Science & Technology Directorate, said at the Red Hat Government Symposium that the report would be “a pretty big game-changer in the federal space.”
The report, mandated by last year’s cybersecurity law, was produced with input from government scientists at the National Institute of Standards and Technology.
“We partnered with NIST and used their draft mobile threats catalogue,” Sritapan told FedScoop after his session.
“We had one-on-one meetings with all the large telecoms” and issued a request for comments from the vendor community, he added, to make sure the report could cover “what are the standards, what are the best practices, what are the mitigations for some of these threats.”
He said the report “looked at all the threats out there to the mobile ecosystem, the defenses and the gaps there may be, legal authorities the department may or may not have when a mobile incident occurs and then next steps — what do we recommend in terms of mobile security for the federal government?”
Sritapan said he expected the report to guide federal R&D efforts in the mobile arena for the next few years and provide a framework for federal officials and even state and local governments “as we implement mobility.”
The report will help federal officials as they “develop mobile strategy and app development,” added Javier Perez, director of product management for Red Hat.
“The key [to mobile security] is execution,” he explained after the session. “We make it easier to build those apps, integrate with backend systems and use the latest technology.”
Other observers agreed that the report would be a big deal for federal mobility.
“They put a lot of work into it, and it will be very useful,” said Tom Suder, president and founder of the Advanced Technology Academic Research Center.