For agencies making data center migrations, hybrid cloud is the only cloud
If cloud migration is one of the hottest topics in federal IT right now, then lift-and-shift could quickly be becoming its greatest profanity.
The practice of migrating all operations from a legacy system to a single cloud infrastructure can ultimately cost agencies the operational efficiencies their leaders desire, a panel of industry and government experts said Thursday at the Red Hat Government Symposium produced by FedScoop.
Instead, they said, federal executives should try to bridge the gap from legacy systems to a sprawl of services across both private and public cloud networks.
“Lifting-and-shifting does not get you anywhere,” said Jane Circle, senior manager of Red Hat’s Global Cloud Provider and Cloud Access Program. “You really have to look at how you are going to re-architect and take advantage of all of the services you have available to you on Microsoft Azure, on Amazon Web Services, on Hewlett Packard Enterprise — and by the way, they’re all different.
“You really have to take a look at how you can take advantage of every one of those platforms, but not lock yourself in,” she said.
For agency leaders trying to formulate the right mix of a private cloud data center versus a commercially accessible one, Jay Huie, the General Services Administration’s secure cloud portfolio director, said the focus should be on what applications each cloud service provider offers and having the choice of what to deploy.
“If you define a static split [of cloud programs], I guarantee you are wrong,” he said. “The whole point of cloud — what is it — is to expand the elastic capacity. So, by definition, everything you do should be a hybrid cloud. Because if you are locking stuff inside your walls, you are not elastic, you’re not secure and you are not stable. If you are locked in to a commercial provider, you’re locked in.”
So the question then becomes not simply which cloud service to use, but how those services work together and with existing legacy systems until they can be decommissioned.
“This homogeneous world that we live in… There is not going to be one, single cloud, period,” said Susie Adams, Microsoft chief technology officer for federal. “What we all recognize is that interoperability is key and standardization is key. That’s why you are seeing folks support things like open source, DevOps and agile compute, and being able to run almost anything on any enterprise.”
Ultimately, the panel said CIOs searching for both security and efficiency will best find them in multiple, integrated solutions with a strong focus on user access control.
“It’s about granularity,” Huie said. “I want this cloud of clouds. Every app should be in its own cloud. You go rob banks because that’s where the money is. I’d rather have a dollar in 100 banks than $100 in one bank, from a security perspective.”
Anthony Robbins joins NVIDIA
Anthony Robbins, a longtime government IT industry executive, joined NVIDIA as vice president of the artificial intelligence-focused company’s public sector practice.
In his new role, Robbins will lead NVIDIA’s federal and defense businesses in the U.S. and Canada, and oversee its higher education and research businesses. Ultimately, with NVIDIA he’ll be helping agencies make the leap into adoption of artificial intelligence “for a variety of applications, including analytics, geospatial intelligence, healthcare, defense and video analytics,” according to a release.
Prior to this, Robbins served as vice president of global defense at AT&T and in various roles with Brocade, Oracle, Sun Microsystems and Silicon Graphics.
He was a FedScoop 50 winner of the Industry Leadership award this year.
Future DATA Act data submissions could be more complete, GAO says
The Digital Accountability and Transparency Act of 2014 has already made an impact on government spending transparency, but there remain some kinks to be worked out to ensure that data reported to USAspending.gov is complete and accurate.
The DATA Act requires that federal agencies submit “accessible, consistent, reliable, and searchable data” for publication and use by policy makers and the public. It also requires that the Government Accountability Office independently assess the “timeliness, completeness, accuracy, and quality” of that data in a report to Congress.
So that’s precisely what the watchdog did — and while the data submitted was broadly found to be timely, there were some weaknesses in its completeness and accuracy.
For example, the investigation found that, within the data submitted for the second quarter of fiscal year 2017, 160 financial assistance programs (an estimated $80 billion in spending) were not included in the data. Additionally, while budgetary data was found to be reasonably accurate, less than 1 percent of award data was accurate when comparing reported data with other “authoritative agency sources.”
GAO also found that while the Office of Management and Budget provides guidelines for agencies on how to define different data elements, agencies can and do interpret these guidelines differently. Further specification is needed, GAO said, to ensure “data consistency and comparability.”
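To make the consistency problem concrete, here is a minimal, purely illustrative sketch (the field names and rules are hypothetical, not the actual DATA Act schema) of how a shared, machine-readable definition of data elements can catch the kind of divergent interpretations GAO describes before a submission goes out:

```python
# Hypothetical data-element definitions; a real schema would be far richer.
DATA_ELEMENTS = {
    "award_id": {"type": str, "required": True},
    "obligation_amount": {"type": float, "required": True},
    "awardee_name": {"type": str, "required": True},
    "action_date": {"type": str, "required": False},
}

def validate_record(record):
    """Return a list of problems found in a single submission record."""
    problems = []
    for name, rules in DATA_ELEMENTS.items():
        if name not in record:
            if rules["required"]:
                problems.append(f"missing required element: {name}")
            continue
        if not isinstance(record[name], rules["type"]):
            problems.append(f"wrong type for {name}: {type(record[name]).__name__}")
    return problems

# A record that mistypes one required element and omits another:
bad = {"award_id": "ABC-123", "obligation_amount": "80000"}
print(validate_record(bad))
```

The point of the sketch is that when the definition lives in one shared artifact rather than in each agency's reading of written guidance, every submitter fails or passes the same checks.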
Finally, the report argues that the Department of the Treasury website where all this spending data is displayed, Beta.USAspending.gov (which is currently under development but is supposed to become the sole source for this data this fall), “does not sufficiently disclose known limitations affecting data quality.”
“Without the transparent disclosure of known limitations, users may view, download, or use data made available on the site without full knowledge of the extent to which the data are timely, complete, or accurate, and therefore, could inadvertently draw inaccurate conclusions from the data,” the report states. “Disclosing data limitations does not detract from the value of the data reported under the DATA Act. Instead, it enhances its value by providing users with the information that they need to interpret and use the data appropriately to inform future decision making.”
So while the report acknowledges the “significant strides” that OMB, Treasury and the rest of the government have made in implementing the DATA Act, it concludes that there is still work ahead.
“Our audit of the initial data submitted by agencies and made available to the public on Treasury’s Beta.USAspending.gov website shows that much more needs to be done if the DATA Act’s promise of improving the accuracy and transparency of federal spending data is to be fully realized,” the report states.
The report concludes with six recommendations — two for OMB on clarifying agency guidelines, and four for Treasury on improved oversight and display of the data.
As another requirement of the law, agency inspectors general recently released their mandated audits of each agency’s DATA Act compliance. As the contents of the GAO report might suggest, the results are mixed. The data submitted by the Consumer Financial Protection Bureau, for example, was found to be “complete, timely, accurate, and of good quality.” Meanwhile, over at the Department of Housing and Urban Development, the IG “found widespread errors, inconsistencies, omissions and false values.”
Agency IT architects seeking API ‘developer nirvana’
As the federal government plows forward on innovating the way it provides citizen services, at least one agency is looking to provide fertile ground for API developers to design new solutions.
To make that happen, Rob Brown, division chief of enterprise infrastructure in the U.S. Citizenship and Immigration Services’ Office of Information Technology, is tapping a mix of standardization and customization within his agency’s walls with a concept known as inner-sourcing.
Speaking at the Red Hat Government Symposium produced by FedScoop, Brown outlined the practice, which uses open source code and a centralized platform as its bedrock, encouraging developers to build customized tool sets for individual agency components, but on a grounded code that shares a commonality throughout the agency.
“As we move forward and we are aggregating all of these disparate teams, we start to leverage one version control system for configuration management, and one of our thoughts moving forward was promoting the use of code reuse,” he said. “Having a common sort of tool set where we could start to collaborate on issues across different portfolios, as well as sections of a division. Ultimately, across all of the OIT.”
To move USCIS in a direction that could capitalize on the benefits of inner-sourcing, the agency utilized GitHub as its software development platform and encouraged innovation from there.
But to avoid the disparate, non-integrating solutions that have plagued federal information technology, Brown added that agencies must set the boundaries within which developers operate by laying down good governance.
“Because this is a highly regulated, compliance-driven environment, ensuring that we have some level of governance, we are trying to minimize a lot of the rogue actors, the rogue systems,” he said. “So the goal was let’s do a complete value stream, impact analysis, let’s look at what we can to really build a dev factory.”
The result is a system that can encourage innovation, but within the structure of a shared platform. To make it work across the enterprise — achieving what Red Hat chief architect Adam Clater called a “developer nirvana” — Brown said the DevOpsSec collaboration of developers, operations and security professionals has to take place.
“As we move forward with a lot of these platforms, we actually made sure we partnered with security and security engineers at the outset,” he said. “So when we had these ideas, we went through looking at the right contracts, looking at the right tooling, they were riding shotgun and sometimes driving.”
While the inner-sourcing process continues to develop at USCIS, Brown said it affords the agency both the collaboration of component teams and the environment for developers to experiment with.
“I think that the digital asset of tomorrow, or even today, is API and not [user interface],” he said. “Moving forward from a developer nirvana perspective, ensuring those developers have the right tools in place, again with a little governance, so they could essentially have a portal they could all work in, that there’s a marketplace to promote that kind of DRY principle.”
18F founder takes new private sector digital government role
Greg Godbout, one of a handful of founding 18F members, has joined TechFlow, Inc. as chief digital officer, he told FedScoop.
Godbout noted in an email the importance of the work companies like TechFlow do helping government agencies move to modern technologies and agile software development, calling the transformation “both impressive and daunting.”
“I will be working with the TechFlow team to continue my efforts to help the US Government with its digital transformation,” he wrote to FedScoop. “I have had the privilege of working with TechFlow over the last 6 months and I have found their team to be deeply knowledgeable, efficiently practical, and highly innovative.”
After founding and leading the General Services Administration’s 18F digital team, Godbout moved to the Environmental Protection Agency as CTO. He left government service in 2016, but his work with government didn’t end. He joined Danish tech company cBrain, serving on the company’s international team, primarily working to make sure its F2 software platform fits the needs of U.S. government customers.
In joining TechFlow, Godbout won’t completely separate from cBrain, he said. He’ll remain as the chair of the company’s board.
“I look forward to continued advisement of cBrain North America as they continue to expand their efforts in the US and Canada,” he said. “There is much we can learn from the Danish Model and their successes with Digital Government.”
A ‘blueprint’ for government IT reform
Reforming the way government manages its information technology has spanned several administrations, but the Partnership for Public Service thinks it has a plan on how to finally get it done.
The good-governance nonprofit partnered with Accenture to unveil a blueprint Wednesday on how federal agencies could overhaul their IT efforts.
“I think the message here is the world is changing, agencies’ missions are changing and the technology that agencies are using to accomplish the mission are probably changing faster than all of this stuff,” Eric Keller, senior manager for research and evaluation at the partnership, said at an event focused on the blueprint’s release.
“One of the theories we came into this work with was that new technology sometimes requires a new approach to leadership,” he said.
The research centered on five strategies to help smooth agencies’ paths to IT modernization:
- Linking tech initiatives directly to results, based on agency mission
- Transforming culture and how organizations do business
- Focusing strategies on people and managing many different stakeholders
- Encouraging organizations to move quickly and take on more risk
- Managing, rather than reacting to, changes resulting from new technology
To illustrate how the blueprint can work, a panel of CIOs who have pursued digital transformation in their offices highlighted some of the ways they put the plans in action.
Make it all about the end user
Acting Department of Veterans Affairs CIO Scott Blackburn said that after the agency had to regain veterans’ trust in the wake of its waitlist scandal, it centered its innovation efforts on designing systems to serve both veterans and frontline employees more easily.
“When we are doing this, we really need to ground ourselves in the end user and getting the entire management team focused on that goal,” he said. “Our transformation, like all others, it revolves around people, revolves around processes and revolves around technology. And all three of those play an incredible part, and I think people is the most important.”
Leading with one voice
When tasked with applying the Federal Information Technology Acquisition Reform Act to the Department of Interior, CIO Sylvia Burns said the agency involved its executive leadership by forming a FITARA implementation committee that allowed stakeholders to understand their expectations and how they could collaborate on their governance efforts.
The result, she said, was a unified approach that developed from tying in leadership and stakeholders to get a full view of the agency and how to manage the project from the top down.
“Quite honestly, through that process, I feel like we got strong,” she said. “It started with [the belief] that we have to be together in the department first. If we are going to face the bureaus and ask them to do this, we can’t not have our act together.”
Keep working the basics
Risk is a key component of innovation, but Washington, D.C., while innovation-hungry, is the proverbial poster child of risk-aversion.
To overcome this dissonance, Department of Justice CIO Joseph Klimavicz said that agencies have to harken back to the mission when evaluating new IT projects and make sure to keep the trains running while working on innovation.
“My thinking is that the missions really don’t change when we change administrations,” he said. “As we move through, priorities shift and you need to be sensitive to that. If you want to get funding for your projects, you need to know where the priorities are. But the mission doesn’t really change.
“It’s great to be a change agent, but if you don’t keep the lights on or your core customers happy, you don’t get to be a change agent. You have to be really good at the basics to be able to focus on change,” Klimavicz said.
MGT Act moves to NDAA floor vote, but sequester challenge remains
The House and Senate have agreed on a $700 billion fiscal 2018 National Defense Authorization Act, and the Modernizing Government Technology Act made the cut as an amendment.
A spokeswoman from the Senate Armed Services Committee confirmed to FedScoop that the conference-approved proposal includes the MGT Act as an amendment as it appeared in the Senate’s version of the defense authorization bill.
The NDAA must now pass the floors of the House and Senate, and receive the signature of President Donald Trump to become law.
The MGT Act — first introduced in the House by Rep. Will Hurd, R-Texas, followed by a corresponding version in the Senate from Sens. Jerry Moran, R-Kansas, and Tom Udall, D-N.M. — again proposes to allow agencies to put money saved through IT efficiencies into working capital funds, which can be accessed for up to three years, to fund efforts to modernize their technology. It also would create a centralized fund agencies can tap into for modernization.
While its inclusion in the post-conference NDAA appears to move the information technology funding legislation to the brink of passage, there is a law that may provide yet another hurdle to the bill: the Budget Control Act.
The 2011 law caps defense spending at $603 billion in fiscal 2018 and requires that the cap not be raised without also raising the cap on non-defense spending. That means that while the House and Senate may have emerged from conference with an agreement on the NDAA, they still have to pass it and find a way to meet the requirements of the BCA.
Congress has until Dec. 8 to come to a budget resolution to fund the government until Sept. 30 of next year, which now appears to coincide with the proposed passage of the NDAA.
HUD not DATA Act compliant, underreported billions of dollars, report says
The Department of Housing and Urban Development has fallen short of the Digital Accountability and Transparency Act’s required reporting deadline, an inspector general found.
IG officials said the agency’s chief financial officer failed to implement the data standards required by the Office of Management and Budget and the Department of the Treasury, causing HUD to underreport billions in obligations and outlays, and submit incomplete and inaccurate data in its second quarter spending reports.
“Our review of HUD’s seven required files supporting the second quarter of fiscal year 2017 found widespread errors, inconsistencies, omissions and false values, which were reported to USASpending.gov,” the report said.
The DATA Act required federal agencies to submit standardized spending information by May 2017 in an effort to improve transparency. OMB and Treasury developed 57 data definition standards to assist agencies in standardizing spending data.
But investigators found that HUD didn’t allocate enough funding toward DATA Act implementation efforts, including carrying out necessary information system upgrades to ensure that spending information from HUD, the Federal Housing Administration and the Government National Mortgage Association — also known as Ginnie Mae — fit the DATA Act Information Model Schema.
“To subsequently allocate limited funding to system upgrades, HUD leveraged resources from a preexisting agreement with an independent contractor, which were insufficient to complete implementation,” the report said. “The agency continued to remain dependent on financial systems with differing technologies and data elements, which contributed to the issues noted.”
The report also notes the CFO provided limited staff and resources to implementation efforts, which further delayed HUD’s DATA Act transition.
But despite the Treasury Department providing a DATA Act Playbook on how to conduct implementation and the IG offering HUD eight recommendations on how to meet the May deadline, agency officials disregarded the recommendations and “inaccurately represented” their progress to the House of Representatives in a December 2016 hearing.
A lack of agency guidance on implementation and weak internal controls on DATA Act reporting further complicated efforts, leading to information inconsistencies.
“FHA contributed to a total absolute value of $17.3 billion in obligations incurred and $16.6 billion in outlays, and Ginnie Mae contributed to a total of $558.3 million in obligations incurred and $215.8 million in outlays, which were excluded from DATA Act reporting and not reported on USASpending.gov,” the report said. “Additionally, $4.2 billion in apportionments was not reported to USASpending.gov.”
The IG offered five new recommendations on how HUD could achieve DATA Act compliance:
- Designate additional HUD personnel and establish an internal reporting structure to complete DATA Act implementation, while sustaining reliable DATA Act reporting for later periods.
- Validate, certify and submit all reportable FHA and Ginnie Mae data through the DATA Act broker and report the data on USASpending.gov.
- Complete data quality and error resolution for HUD’s loan programs to ensure inclusion in HUD’s subsequent submissions.
- Allocate the financial resources to ensure that reconciliations are performed in the consolidation of source system data to the DATA Act submission files.
- Establish and implement internal control policies and procedures to ensure that the consolidation and reconciliation of data from HUD, Ginnie Mae and FHA source systems are documented and include a governance structure covering roles, responsibilities and the personnel completing DATA Act reporting procedures.
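The reconciliation step the IG recommends can be sketched in a few lines. This is purely illustrative, not HUD's actual process: the idea is that totals drawn from each component's source system are compared against what reached the consolidated DATA Act submission, so excluded activity like FHA's and Ginnie Mae's surfaces before publication. The figures below echo the report's examples, but the "HUD_core" entry is a made-up placeholder.

```python
# Compare per-component source-system totals against submission-file totals.
def reconcile(source_totals, submission_totals):
    """Return {component: unreported amount} for any component whose
    source-system total exceeds what reached the submission file."""
    gaps = {}
    for component, amount in source_totals.items():
        reported = submission_totals.get(component, 0.0)
        if amount - reported > 0.005:  # tolerance for rounding differences
            gaps[component] = round(amount - reported, 2)
    return gaps

# Obligations by component, in billions. Per the IG's findings, FHA and
# Ginnie Mae activity never reached the submission at all.
source = {"FHA": 17.3, "GinnieMae": 0.5583, "HUD_core": 42.0}
submitted = {"HUD_core": 42.0}
print(reconcile(source, submitted))
```

Run routinely before each quarterly submission, a check like this would have flagged the billions in excluded obligations as a gap rather than letting them silently fall out of USASpending.gov.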
HUD officials offered responses to nine comments made within the report, but did not comment on the additional recommendations.
Agencies hit road bumps with incremental software development
Federal agencies still have some work ahead in properly implementing incremental IT development practices, a new Government Accountability Office report finds.
Incremental development has a number of acknowledged benefits, mostly surrounding how it allows agencies to incorporate user feedback, keep a project on schedule and on budget, and abandon or pivot if necessary without too many sunk costs. Conversely, GAO argues, waterfall development practices “too often result in failed projects that incur cost overruns and schedule slippages, while contributing little to mission-related outcomes.”
By way of example, the report cites the Farm Service Agency’s Modernize and Innovate the Delivery of Agricultural Systems program, which was ended in July 2014 “after investing about 10 years and at least $423 million, while only delivering about 20 percent of the functionality that was originally planned.”
More incremental development is needed, GAO says, to avoid such embarrassing cost overruns.
But agencies still face some challenges getting the ball rolling — challenges associated with “inefficient governance processes; procurement delays; and organizational changes associated with transitioning from a traditional software methodology that takes years to deliver a product, to incremental development, which delivers products in shorter time frames,” the report states.
On top of this, there is the issue of chief information officer certification of “adequate” incremental development practices.
Only four of 24 federal agencies have a clear policy that the CIO can use to certify that a given IT investment adequately uses incremental development. Of the remaining 20, 11 have vague policies and nine don’t have any policy at all.
Oversight of incremental development is an important piece of the 2014 Federal Information Technology Acquisition Reform Act (FITARA), which builds on Office of Management and Budget guidance from 2000 requiring that agencies endeavor to make IT investments in iterative, incremental pieces (as opposed to traditional waterfall development) as a way to avoid costly project failures. According to the GAO report, FITARA requires that agencies “develop policies and processes which ensure CIO certification” that incremental development is being used and “report the status of CIO certification.”
Accordingly, GAO found that at the 24 agencies investigated, 62 percent of major IT investments made in fiscal year 2017 were certified by the CIO as utilizing proper incremental development. The remaining investments were not certified, agencies said, for any number of reasons. In some cases this was an error, while in other cases agencies said that the required certification was “not applicable” to the given IT investment.
However, GAO found that according to OMB’s guidance on the subject, several of these “not applicable” responses were incorrect. That is, agencies should have responded with a “yes” or “no” answer to whether the CIO had certified the investment project at hand as one that utilizes incremental development.
The issue, GAO argues, is that many agencies lack clear policies on CIO certification of incremental development. The data on use of incremental development across the federal government is valuable, GAO says, and so it’s important that it is reported correctly. “It is critical that agencies take action to put in place appropriate incremental certification policies to ensure CIOs exercise the proper authority and oversight over major IT investments,” the report states.
The GAO report offers 19 recommendations to 17 agencies, requesting that executive leadership make sure the office of the CIO implements a clear certification policy.
“Agency CIO certification of the use of adequate incremental development for major IT investments is critical to ensuring that agencies are making the best effort possible to create IT systems that add value while reducing the risks associated with low-value and wasteful investments,” the report concludes.
Lawmakers blast Trump’s proposed cuts to DHS tech directorate
Despite the Trump administration’s advocacy for more innovation and technology advances in government, proposed budget cuts to the Department of Homeland Security’s Science and Technology Directorate have the potential to hamstring those efforts, stakeholders say.
Members of the House Homeland Security Committee are none too happy about it, lambasting the administration’s proposed cuts to biodefense, R&D, acquisition and university research programs in a subcommittee hearing Tuesday.
The proposed cuts threatened the existence of a number of homeland defense programs, including the National Urban Security Technology Laboratory, an evaluation and testing program for emerging technology for first responders.
“I was very concerned that the president’s fiscal year 2018 budget request proposed its closure, in addition to the closure of two other DHS labs that focus on chemical and biological threats,” said Rep. Dan Donovan, R-N.Y., chair of the Emergency Preparedness, Response and Communications Subcommittee. “Now is not the time to be cutting federal resources to counter chemical and biological threats and support for our first responders.”
Donovan noted that funding for NUSTL and two other labs was restored during the House appropriations process, but S&T still requires more support in both funding and leadership, two areas in which the administration has shown deficiencies of late. The cuts, however, could be reintroduced as part of a budget resolution, which Congress must reach before Dec. 8.
Former DHS Under Secretary of Science and Technology Reginald Brothers — who ran the S&T Directorate from 2014 until January — testified that inconsistent funding, coupled with fiscal bureaucracy, has negatively impacted an office that provides next-generation technology development at DHS.
“From personal experience, I know that one of the most disruptive forces for technology and innovation organizations is uncertain and unstable funding,” he said. “This challenge is magnified at DHS because the threat environment can change on a frequent basis, which can call for rapid change across our R&D investment portfolio to meet an immediate or near-term threat.”
Part of the challenge stems from financial reporting structures, which Brothers said inhibit the S&T director from shifting funding to counter an emerging threat or from achieving agility similar to that of the Defense Advanced Research Projects Agency, or DARPA.
“S&T has to report very specifically in terms of the kinds of spends it does,” he said. “One of the challenges, having served at DARPA and DOD, is with the way that S&T has to report early commitments and obligations of funding — it makes it difficult when things happen.”
The panel also testified that the administration’s lack of an appointed leader at S&T hampers its ability to pursue innovation strategies for fostering technology development.
“There’s outstanding professionals there in the department that are keeping things moving, but, again, it’s the uncertainty,” said Gerald Parker, associate dean for Global One Health at Texas A&M University. Parker testified on the effectiveness of the directorate’s biological threat research.
“I think Dr. Brothers did a whole lot to steady the ship, so to speak, at S&T, and culture is greatly improved. People are happy to come to work, are working hard and we don’t want to lose that momentum,” he said.
The committee concurred, with members promising more support for the office.
“We have to get the administration, regardless of what administration it is, to take this seriously and put in place a budget that is consistent and would allow S&T to do the work and types of things it needs to do,” said ranking member Rep. Donald Payne, D-N.J.
Officials at the Office of Management and Budget were unavailable for comment at press time.