HHS sets voluntary cybersecurity guidelines for health industry
The Department of Health and Human Services closed 2018 — a year plagued by health care breaches globally — by issuing voluntary cybersecurity guidelines for health care professionals.
Published Dec. 28, HHS’s guidance, developed in partnership with industry experts from the Health Sector Coordinating Council, emphasizes the financial and health impacts of security incidents and outlines steps practitioners can take to better secure their systems.
“Cybersecurity is everyone’s responsibility. It is the responsibility of every organization working in healthcare and public health. In all of our efforts, we must recognize and leverage the value of partnerships among government and industry stakeholders to tackle the shared problems collaboratively,” Janet Vogel, HHS acting chief information security officer, said in a statement.
The guidelines were mandated by Section 405(d) of the Cybersecurity Act of 2015 “to develop practical cybersecurity guidelines to cost-effectively reduce cybersecurity risks for the healthcare industry,” according to a release. More than 150 industry partners gathered over the past two years to develop the new document.
“The healthcare industry is truly a varied digital ecosystem. We heard loud and clear through this process that providers need actionable and practical advice, tailored to their needs, to manage modern cyber threats. That is exactly what this resource delivers; recommendations stratified by the size of the organization, written for both the clinician as well as the IT subject matter expert,” said Erik Decker, the guidelines’ industry co-lead from the University of Chicago Medicine.
Read more about the new guidelines on sister publication CyberScoop.
CIOs need to drive the AI debate in their organizations, says IBM global government chief
The deepening presence of artificial intelligence in the workplace — and its looming impact on societies around the world — is forcing government and industry leaders to grapple with new and difficult questions about a technology-driven future that’s arriving faster than many are prepared for. That’s particularly apparent to Sreeram Visvanathan, who, as global managing director for all of IBM’s government business in more than 160 countries, now sees AI dominating discussions among the world’s public sector CIOs.
Visvanathan sees all sides of government: from defense and intelligence to public safety and policing, judiciary and social services, and the CIOs responsible for national infrastructures, smarter cities and education. While his background is in technology and engineering, his passion since joining IBM 16 years ago, he says, has been challenging the status quo and leveraging innovation and modern technologies to fundamentally transform industries.

Sreeram Visvanathan, in his Dubai office, oversees IBM’s government business in more than 160 countries. (FedScoop)
FedScoop met with Visvanathan in his offices in Dubai in late December to discuss how he sees AI unfolding in governments across the globe — and why he believes government CIOs are uniquely positioned to drive the AI debate within their organizations.
Editor’s note: The transcript has been edited for clarity and length.
FedScoop: You’ve commented that a lot of government leaders you meet around the world aren’t prepared for how quickly AI is coming and that many may not be asking the right questions. What are you observing? And how might government leaders get better prepared?
Sreeram Visvanathan: If I look at the last 12 months, the dialogue and debate are no longer if, it’s when, and that’s changed around the world. I notice it in any discussion I have. But many people are still cautious: experimenting, doing proofs of concept, maybe playing around with some chatbots, or automation around a stand-alone part of a process rather than an end-to-end process.
If you look at other industries, they are looking at AI and saying, “How can I reimagine the entire process?” That’s because you have a profit and loss account that you’re trying to manage and…you’ve got competition from everywhere.
In the government sector, people are still debating should we, should we not? Do we need to cleanse the data first? What is the single version of the truth that I can have before I can teach a machine how to understand the data? The thinking is sequential at the moment. Once the first movers start making some leaps, though, and the benefits become obvious, then others are going to take leaps.
We see several forcing functions around the world that are going to make this not an option anymore. The first is an aging workforce. If you look at the government workforce in many of the countries, especially in Europe, it’s aging rapidly with, in the next five years, 20-to-30 percent retiring. Take Germany, I think it’s closer to 40 percent. Imagine the impact, the loss of policy-making capabilities, the loss of customer service and citizen service. How do you replace that?
Second, the technology is just so much more advanced, so much more proven and less expensive. Barriers of entry and to experimentation are coming down all the time. Yet the one thing I hear from [government] CIOs is, “My budgets are getting cut. I can’t find enough people. I’ve got systems that I’ve patched for the last 30 years that I’m afraid to touch because nobody knows how the hell it works, but it actually works. Right?”
I think we as an industry can start addressing some of those workflows…by sharing examples. Intelligence and public safety are wonderful areas [using] AI for video analytics, structured and unstructured data, and patterns, drawing from volumes of data that a human being can never go through and where there’s a compelling need to stop bad things from happening. They’ve embraced it. But in civilian government, it is still slow. Our view is AI is going to take off. The first movers are going to take some risks, but they’re going to see benefits.
FS: You’ve said policy makers may not be engaged in the right debate about AI. How might you reframe the discussion?
SV: I think that the debate that needs to be had is, “Is AI really going to kill jobs? Or is it going to be more replacement of one set of jobs with another set of jobs?” The jury is out on this. You get two different views. One says, for every job that you lose, you’re going to replace it with a new type of job and a new set of skills. So, all we need to do is train people on a new set of skills. The other school of thought says, there’ll be a big part of services-related jobs that a machine can learn.
If you look at many of our mature markets, over the last 30 years they’ve gone away from manufacturing and design to becoming much more service oriented. Those are exactly the jobs that can get replaced or significantly reduced by machines. So, what do people do? What do universities need to teach them? How will they work?
Then you add another dimension to this: Most of our children and grandchildren will live beyond a hundred. So, they’re going to have multiple careers and need to become lifelong learners. But what are they going to learn that makes them more valuable than a machine? These are existential issues that need healthy debate and we don’t see enough debate anywhere in the world. It is happening in pockets but not at the scale that needs to happen.
FS: Which governments do you see emerging as early AI adopters and what are the consequences if you’re a late adopter?
SV: I don’t think there’s an easy answer to the consequences, but let me tell you what I’m observing. As I travel around, AI provides a great leap-frogging mechanism. Take Dubai as an example. It has mandated that all agencies use blockchain technology. They are looking at end-to-end processes. No other government around the world has done that.
In Abu Dhabi, for instance, they’ve defined some 80-plus customer journeys, for everything a new expatriate needs coming to live in Abu Dhabi. They need a work permit, flights to come into Abu Dhabi, a place to live, a car, schools for their children and so on. All those services are being integrated into one package of services no matter what the backing agency is.
Now, that’s a very interesting way of looking at [government services] because you’re breaking through the cycles of “this is my turf, this is my data, this is my customer.” Our view is governments are going to compete for resources more and more. You see it now the way Amazon reversed [its search for new headquarters cities], asking, “Who wants my business?” That’s going to happen between countries and between states in countries. Does that mean that Dubai or Abu Dhabi is going to compete with the U.S.? Of course not. But they’re going to be more competitive than neighboring countries to attract investment.
In the U.S., you have states that are moving faster than others. Take Delaware. We just did a blockchain piece of work that allows you to get a business license much faster. Ease of doing business is going to be one of the advantages. The point I’m trying to make is, smarter markets or states that put some distance between them and others in terms of breaking down the silos between agencies are going to get a distinct advantage.
The European Union Council discussion around the role of AI is very interesting because they have a critical mass in terms of countries, in terms of workforce, in terms of governance. It’s the most robust thinking that we have seen. The issue that I have with this is that it [offers more as] a leadership paper but not practical implementation for day-to-day operations.
For me, the debate needs to be led by CIOs with the CEOs of agencies to say, what is the implication, what could be the benefits and how can we serve our constituents better through the use of AI?
FS: How would you suggest CIOs advance the discussion?
SV: I think the CIOs [need to go] on the offensive and say, “Here is the potential of AI. If I understand the business of our agency, what data we have, what data we could use, and here are the implications, the possibilities, and engage in a debate with the business side of the agency,” I think CIOs will, one, get buy-in, but second, get more budget because it will be seen as transformational. The CEO has to buy into it, but it has to be a partnership, and I think the CIOs have a wonderful chance to lead this debate.
FS: What other trends are you seeing around the world that U.S. CIOs should keep their eye on?
SV: There are a few things that I’m observing: One is a focus on design and experience. A lot of the time we have spent as IT professionals has been on the engineering side of things, not on the experience side of things. The experience side of things is what drives production and endorsement.
I’ll give you an example. One of my clients disperses all [of their country’s] social services benefits and retirement pensions, worth billions of [dollars]. The previous mentality was: I’m the provider, you’re the recipient, you need to claim from me. Now they’ve turned that into a design where they’re in the service of the person who needs the service.
They have saved literally billions of [dollars] in the way that they’ve orchestrated the consumption of the service…where the validation happens behind the scenes. AI tools, for instance, see as you’re filling in the application if you claim to be living in a separate house, but social media shows that you’re living with your parents. The AI can come back and say, are you sure this is your address because we found this other address? The fraud that currently happens in social services quickly drops. So, it’s design and workflow being thought through both from a customer service angle but also to address your core issues, which is what the CIOs need to do.
A second thing is talent and employee experience. We all know it’s a brutal war for talent and it’s going to get worse, especially talent that understands AI and cloud. And in the AI industry, you can’t exactly outsource everything to a third country, so you’ve got to build talent locally. Yes, people who have this notion of service to their country are going to come and join you, but you still have to create a work environment and learning environment that is conducive to attracting talent.
I see the best CIOs thinking about employee experience, not just about their depth of technology. How do you create the right workspace, the right collaboration alignment? I see CIOs rotating people in and out of laboratories that they create, where new tech is tested. That gets a lot of people energized and excited. Some of the best CIOs actually take some of the older members of their team who are not up to date with the latest technology, mix them up with young upcoming tech geeks, put them into a lab and then take them back into their day-to-day business and suddenly you start seeing a pattern change in how they think and implement solutions and engage in business.
Lastly, I’ve seen the best CIOs try and frame the “exam question” differently — they spend a lot more effort not on the downstream government work but on the upstream arguments about what the problem is they’re trying to fix. And that changes the downstream outcomes. So those are the things I would say to CIOs.
Read: New survey shows federal agencies are already achieving demonstrable value in AI
NSF needs tighter controls on agency-owned iPhones
The National Science Foundation could stand to be stricter in enforcing rules governing the appropriate use of agency-owned mobile phones and tablets, a recent report by the agency’s inspector general finds.
The independent federal agency provides iPhones and iPads to some employees on the basis of business need. All in all, the IG found, NSF owned 321 iPhones and 337 iPads as of July 2018.
These devices are provided for work purposes, so various agency rules govern appropriate use. Accessing pornographic content is prohibited, for example, as is using the device for gambling. Each device is also supposed to be used only for work purposes and only by the employee to whom it is assigned.
NSF uses mobile device management software, currently called “Intelligent Hub,” to monitor devices and the apps installed on those devices. This software has the ability to detect when prohibited apps are installed, but the IG says NSF could do a better job utilizing this software.
“We identified 102 NSF-owned iPhones and iPads that were either not enrolled or enrolled incorrectly in mobile device management software,” the IG writes.
In some cases, NSF-owned phones were incorrectly identified as personal devices, for example, and vice-versa. Part of the issue, the IG suggests, may be that employees themselves are in charge of managing this enrollment. “NSF allows mobile device users to enroll in [Intelligent Hub] themselves, as opposed to requiring enrollment by a central point of service, such as IT Help Central,” the document explains. “NSF does not have a mechanism to ensure NSF staff complete the enrollment process or enroll in [Intelligent Hub] correctly.”
The IG also found various apps installed on NSF-owned devices that seem to violate NSF policy. These include, for example, children’s entertainment apps and games. The IG notes that while Intelligent Hub has the ability to flag apps that violate agency policy, NSF hasn’t really used this capability. “By limiting its use of [Intelligent Hub], NSF may have missed opportunities to detect and deter inappropriate use of NSF-owned mobile devices, which could discredit NSF or damage its public reputation,” the report states.
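The app-flagging capability the IG describes amounts to comparing each device's installed-app inventory against an agency denylist. A minimal illustrative sketch of that kind of compliance check, using hypothetical device IDs and bundle identifiers (this is not the Intelligent Hub API):

```python
# Illustrative compliance check: flag devices whose installed apps
# appear on an agency denylist. All names here are hypothetical.

PROHIBITED_APPS = {"com.example.casino", "com.example.kidsgame"}

def flag_noncompliant(devices):
    """Given {device_id: [installed app bundle IDs]}, return
    {device_id: sorted list of prohibited apps found on it}."""
    findings = {}
    for device_id, installed in devices.items():
        hits = sorted(PROHIBITED_APPS & set(installed))
        if hits:
            findings[device_id] = hits
    return findings

inventory = {
    "iphone-014": ["com.example.mail", "com.example.kidsgame"],
    "ipad-201": ["com.example.mail"],
}
print(flag_noncompliant(inventory))  # {'iphone-014': ['com.example.kidsgame']}
```

In practice an MDM product would pull the installed-app inventory automatically and run a check like this on a schedule; the IG's point is that NSF had the capability but wasn't using it.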
The IG report gives seven recommendations, including that NSF clarify rules around enrollment in the agency’s mobile device management software, clarify which apps should not be downloaded to agency-owned devices and develop a policy for app review.
The NSF concurred with all recommendations and is developing an “action plan” to address them.
GAO: Agencies aren’t tapping venture capital-owned companies for innovation
Though federal agencies are hungry for new technology solutions, a report from the Government Accountability Office says they may be missing out on a key avenue of innovation.
The Dec. 21 report found that in the past three fiscal years, less than 3 percent of grant funding from the Small Business Innovation Research (SBIR) program went to companies that are majority-owned by multiple venture capital companies, hedge funds or private equity firms, even though agencies have the authority to work with such businesses.
The SBIR program awards contracts to small businesses to promote research, development and innovation efforts at 11 federal agencies, including the departments of Agriculture, Commerce, Defense, Homeland Security and others.
The program is overseen by the Small Business Administration, but each agency determines the R&D areas it wants to focus on and the grants it will award. The SBIR’s phases of award are:
- Phase I awards contracts of at most $150,000 to test scientific and technical merit and feasibility of research ideas for six months to a year.
- Phase II awards up to $1 million for two years of continued R&D for programs that advanced past Phase I.
- Phase III tests the commercial viability of the programs and solicits funding from the private sector and the agency that offered the initial SBIR awards.
Since 2011, agencies in the SBIR program have been able to award a select portion of their grant budget to small businesses owned by a group of venture capital companies, hedge funds or private equity firms, which often invest in companies developing new technologies and products.
Per the SBIR Reauthorization Act of 2011, the Department of Energy, National Science Foundation and National Institutes of Health can award no more than 25 percent of their SBIR funding to venture capital-owned businesses, while the remaining SBIR agencies are capped at 15 percent of their funds.
But the GAO report found that between fiscal 2015 and fiscal 2018, agencies awarded between 0.1 and 2.7 percent of their total SBIR obligations to venture capital-owned businesses.
Only NIH, DOE’s Advanced Research Projects Agency-Energy (ARPA-E) and the Department of Education’s Institute for Education Sciences made SBIR awards to venture capital-owned companies during that span, totaling 62 awards worth $43.6 million.
The rules governing SBIR grant awards seem to have impacted at least some of the agencies’ approach to venture capital-owned businesses, the report found.
Officials at three agencies said that the caps on grants that could be awarded to venture capital-owned businesses limit the amount of funding they could receive, not allowing many to advance past the early phases of the program.
Other agency officials requested information on the success of SBIR grants given to venture capital-owned businesses, but NIH and ARPA-E officials told the GAO “it was too early for them to evaluate the impact of the authority.”
And finally, some agency officials claimed that because the venture capital-owned businesses possess enough funding and have technologies in later-stage development, they didn’t believe that the SBIR program would attract those businesses.
GAO had previously called for policy guidance changes in 2015, after noting that only 12 such awards were made in fiscal 2013 and 2014, to make the requirements for awarding SBIR grants to venture capital-owned businesses easier for agencies to understand, in the hope of bolstering use of the authority.
While awards to venture capital-owned businesses did increase following the policy changes, the agencies that made them said they did so because their technology needs often required significant private investment.
“One reason that NIH decided to allow small businesses majority-owned by multiple investment companies and funds to participate in its program is that it would increase the chances of promising technologies reaching the marketplace,” the report said. “The written determination states that very few small businesses are capable of commercializing their technologies without private investment funding because of the significant cost needed to take a biomedical product from idea to market.”
The report also noted that the Department of Defense had explored awarding SBIR grants to venture capital-owned businesses for satellite imagery projects related to the Defense Advanced Research Projects Agency (DARPA) in fiscal 2016. DOD officials said because of the costs of launching satellites, some small businesses raised capital by selling equity shares to venture capital interests and hedge funds.
In its written determination of the authority, DOD officials said “providing such small businesses with incentives to gear their research toward DoD-relevant problems would meet a demonstrated need and substantially contribute to DARPA’s mission.”
GAO offered no recommendations and none of the 11 SBIR agencies provided comments on the report.
Here’s how technology vendors can navigate the legislative branch
Congress can be a difficult place for technology vendors to do business.
The legislative branch’s “unique, fragmented and opaque rules” set a barrier to entry that can keep even vendors with experience in other areas of government out of the loop. But a new white paper from Future Congress aims to lay out the rules of the road for vendors and civic hackers who’d like to help Congress function better.
The paper gives a little information on everything from the governance structure of IT in the House and Senate to the acquisition rules and practices that govern the $288 million in IT spending Congress does each year.
In both chambers, acquisitions generally fall into the categories of “formal procurement,” authorized acquisition or unauthorized acquisition. “Formal procurement” is the most organized and structured of the three — tech acquired here generally serves the institution as a whole.
But individual member offices in the House and Senate also have the ability to acquire necessary software — ideally from vendors approved by the chamber’s respective IT governance overseer. In the House this is the Office of the Chief Administrative Officer (CAO), and in the Senate it is the Office of the Senate Sergeant at Arms (SAA). The paper notes, however, that approval is “not strictly enforced” by either office, so member offices often end up using software services that are noncompliant with the body’s IT governance rules. Examples of these “unauthorized acquisitions” include digital productivity software services like Slack, Dropbox and Evernote.
The paper also gives recommendations for how Congress can “facilitate a better and more effective IT landscape.” These include the creation of a Congressional Digital Services (CDS) and an increase in the ceiling on congressional staff pay to attract more tech-savvy talent. Additionally, it states, the CAO and SAA should attempt to become more approachable, perhaps by creating dedicated help desks for tech vendors and civic hackers.
Future Congress, which was created by a group of about 20 bipartisan advocacy organizations in October 2018, is a “resource hub” dedicated to “efforts to improve science and technology expertise in the legislative branch.” The cohort now includes some academic centers, companies and a list of individuals as well.
2018 in review: CDM gets some big-dollar upgrades
The Department of Homeland Security’s signature cybersecurity program underwent a significant series of developments in 2018, ranging from new network dashboards to a spate of multi-billion-dollar contracts.
The continuous diagnostics and mitigation (CDM) program advanced progress on its Phase 3 plans this year, standing up a collection of data dashboards to monitor network traffic throughout federal agencies.
Officials also aimed to provide agency CISOs more flexibility to acquire the program’s cyber tools in a massive recompete of CDM contracts that were awarded over the summer and fall.
The combination of new visibility for agencies and new cybersecurity tools was intended to bolster the cyber defenses across the federal enterprise, hopefully allowing CDM to provide greater protection of government networks.
But while the acquisitions came fast, transitions to the CDM dashboards took a measured pace. Congress and the Trump administration also signaled the intent to reshape policies behind the program to reflect more efficiency and to solidify agencies’ stature in the cybersecurity sphere.
Here’s a look back at CDM’s highlights in 2018:
February: CDM awarded a six-year, $621 million contract to Booz Allen Hamilton as part of its Dynamic and Evolving Federal Enterprise Network Defense (DEFEND) program for Group B. The DEFEND contracts offer agencies updated cyber tools through a series of task orders that will be awarded throughout the year.
Group B includes the departments of Agriculture, Energy, Interior, Transportation and Veterans Affairs, plus the Executive Office of the President and the Office of Personnel Management.
March: CDM program manager Kevin Cox said that portions of the CFO Act agencies had begun reporting their data to individual dashboards that would allow the program to monitor their network traffic.
DHS officials also received an authority to operate for a shared services network dashboard that would monitor small agency networks, plus provide a host of potential cyber capabilities.
April: Following DHS’s decision to consolidate its 16 security operations centers (SOCs), Cox said that the CDM program was “exploring” providing SOC-as-a-service capabilities through its newly created shared services dashboard.
June: CACI secured a $407 million contract to provide new cyber tools for CDM DEFEND’s Group A, which includes DHS and its components. Cox said then that he expected $1 billion worth of DEFEND contracts would be awarded over the course of the summer.
July: Rep. John Ratcliffe, R-Texas, introduced the Advancing Cybersecurity Diagnostics and Mitigation Act, which aims to make the CDM program a systemic requirement for DHS and calls on its secretary to deliver a comprehensive CDM strategy to Congress.
CGI Federal secured the $530 million Group C contract to service the departments of Commerce, Justice, Labor, State and the U.S. Agency for International Development.
August: Booz Allen Hamilton obtained its second DEFEND award, a $1.03 billion contract for Group D, which includes the General Services Administration, Department of Health and Human Services, NASA, Social Security Administration, Department of the Treasury and the U.S. Postal Service.
Cox also said later that month that CDM was working on addressing the challenges of monitoring mobile devices on federal networks by leveraging the request for service (RFS) functions built into the $530 million DEFEND Group C contract.
September: The House passed the Advancing Cybersecurity Diagnostics and Mitigation Act in a voice vote, while ManTech obtained the $668 million DEFEND Group E contract.
The contract serves the departments of Education and Housing and Urban Development, the HUD Office of Inspector General, the Environmental Protection Agency, the Federal Deposit Insurance Corporation, Nuclear Regulatory Commission, National Science Foundation, Securities and Exchange Commission and Small Business Administration.
October: The Office of Management and Budget issued new FISMA guidance that requires agencies to purchase continuous monitoring tools from the CDM contract vehicle unless they can provide a valid reason not to.
2018 in review: The year of the Centers of Excellence
2018 was the year of the IT Modernization Centers of Excellence.
It was the year that Phase I projects kicked off at the initiative’s first agency, the U.S. Department of Agriculture, and it was the year that new contractors were chosen to participate in Phase II. It was the year that the General Services Administration, which stewards the program, announced that the Department of Housing and Urban Development is up next. It was even the year the whole project got a snazzy-looking website.
The CoE concept, which aims to “accelerate” IT modernization across government by creating central repositories for “best practices” that can move from agency to agency, was formally introduced in the fall of 2017. “The ultimate objective of the CoEs is to build change management capacity for enterprise-level change in the federal government,” White House special assistant Matt Lira told FedScoop in a recent conversation.
GSA issued a request for information centered on four “key technology initiatives” in October 2017. In December, the White House announced that it had chosen USDA as the first host agency for the CoEs, citing “top-level commitment” from Secretary Sonny Perdue and CIO Gary Washington as the agency’s key recommendations for this honor.
But it wasn’t until 2018 that things really got off the ground. In March 2018, GSA announced the contract winners for Phase I of the project, and work began in early April.
During the exploratory Phase I at USDA, five teams formed around five different areas of modernization — cloud adoption, IT infrastructure optimization, customer experience, contact center and service delivery analytics. Each team was made up of USDA IT employees, GSA employees and contractors. All five teams worked “deeply embedded” within the office of the CIO at USDA, CoE Executive Director Bob DeLuca told FedScoop during a tour. They focused on determining the agency’s modernization status — both where the agency stands and where it needs to go.
“It helps tremendously to have somebody with an objective point of view come in and help you,” USDA CIO Gary Washington told FedScoop in June. “When you bring in a third party with an objective point of view that’s been there done that, it really helps change the thinking, you know, which helps change the culture.”
But Phase I was, ultimately, just the beginning. In October GSA announced the new contractors that will lead each of the five expertise area teams into Phase II — the operationalizing phase.
Similarly, USDA is just the genesis as well — in September the CoE team within the Technology Transformation Service at GSA said it plans to bring at least some of the five centers of excellence to HUD next, starting with a “discovery sprint” to uncover the agency’s needs.
“I’m thrilled HUD is teaming up with GSA to transform this agency into a more effective and efficient servant on behalf of the American people,” Secretary Ben Carson said in a statement at the time. “This is an important moment for HUD as we embark upon a campaign to modernize our aging technology and bring true financial integrity to everything we do.”
In August, TTS Director Joanne Collins Smee, who oversaw the creation of the CoEs from the GSA side, left government after just one year. Kelly Olson filled in briefly, and GSA finally settled on a permanent replacement in early December.
All these leadership shuffles notwithstanding, the brains behind the initiative at the White House Office of American Innovation are feeling confident in what’s been done thus far.
“Frankly [the CoE initiative] started as an experiment,” Lira said at an event in October. “Is it possible to drive enterprise-level change at the agency level? Well, I’m pleased to report that at least about a year into that project, the early signs are enormously positive.”
“Thus far the CoEs are demonstrating their ability to apply this wholesale change,” he added more recently — change and modernization on everything from back-office systems to user experience and design.
Moving into 2019, however, transitioning the “best practices” learned with USDA to the work at HUD will be a big test for this still-nascent concept. “We’re still sort of in that experimentation phase,” Lira said, projecting that 2019 will be all about figuring out the appropriate “playbook.”
TTS’ new director, former SunTrust Banks CIO Anil Cheriyan, has his work cut out for him.
2018 in review: Technology modernization meets the federal workforce
Though the White House announced plans to both modernize the federal government’s IT infrastructure and reform its workforce the year prior, 2018 was when the Trump administration laid the foundation to merge the two goals ahead of the anticipated changing nature of work.
With growing interest in the potential of emerging technologies such as artificial intelligence, machine learning and automation, early rumblings of the administration’s plan to retrain the federal workforce emerged in February when the president requested a $50 million budget to reskill current employees to fill critical IT skills gaps.
The Office of Management and Budget’s plan hinged on the increased adoption of AI and automation technologies to meet the growing demand for federal services. As the technologies would carry out more of the repetitive, labor-intensive work of daily operations, federal employees would need new skills both to work with the new tools and to focus on more “high-value” tasks.
“Employees who perform transactional work that is phased out can shift to working more directly with customers or on more complex and strategic issues,” the budget request said. “Current employees can shift from legacy positions into emerging fields in which the government faces shortages, including data analysis, cybersecurity and other IT disciplines.”
Though federal agencies were already crafting workforce restructuring plans at the time, the budget request previewed the administration’s plans to fill some critical cyber and IT skills gaps by retraining current employees.
Tyson Meadors, the National Security Council’s director of cybersecurity policy, echoed those thoughts in March when he called for an aptitude assessment to determine which federal employees might be able to convert to cybersecurity jobs in an increasingly competitive marketplace.
Roughly two weeks later, the Trump administration unveiled its Rosetta Stone for reshaping the federal government in the President’s Management Agenda. The three-fold plan provided a series of interdependent milestones focused on workforce reform, IT modernization and data management goals.
The PMA provided the playbook for all the White House’s workforce and technology efforts, laying out quarterly goals to make both reskilling and modernization targets more reachable.
The Office of Personnel Management discussed plans to make it easier for agencies to hire cyber and IT talent in May when officials signaled their plans to offer direct hire authority to fill critical technology skills gaps.
OPM debuted the first round of direct hire authority in October, when OMB Deputy Director for Management Margaret Weichert became acting director of the human resources agency. The move allowed agencies to directly recruit for IT, cybersecurity and STEM jobs and to offer compensation structures outside the General Schedule pay scale to attract new talent.
Federal CIO Suzette Kent formalized the administration’s plans for workforce reskilling last month, announcing the creation of the Federal Cybersecurity Reskilling Academy, a three-month pilot program housed in the Department of Education that will offer training to federal employees outside of the IT and cyber fields.
The academy will select an initial class of students in February, followed by future classes at a later date.
Kent said Dec. 14 that the academy will be one of four reskilling initiatives that the White House will launch in 2019, including education programs to provide current IT professionals with additional training to help address skills gaps, a program offering instruction on robotics process automation and a leadership development course to help build a pipeline of senior technology talent.
So while 2018 marked the beginning of the White House’s plans to address the changing nature of work, its initial investments will continue to develop over the coming year.
2018 in review: The year of JEDI
When 2018 started, we knew very little about the Department of Defense’s landmark plan to acquire enterprisewide commercial cloud services — but as the year ends, the $10 billion, single-award cloud contract known as JEDI has become the biggest turf war in the federal IT contracting community, one that may forever change the government cloud landscape.
In March, after months of anticipation, the Pentagon debuted its acquisition strategy for the Joint Enterprise Defense Infrastructure cloud, which will stretch across the entire expanse of DOD, focused primarily on commercial platform-as-a-service (PaaS) and infrastructure-as-a-service (IaaS) offerings.
“We want to bend the Department of Defense around the commercial cloud,” Chris Lynch, director of the Defense Digital Service team leading the procurement, said during an industry day, meaning the department wants to adapt to embrace the existing strengths of the expansive commercial cloud and not limit that with excessive customization. “I can’t make that point enough. We want it here, and we want it out in those austere environments. We want to bring this to the warfighter.”
That was all good — it was the single award part that got (most) cloud contractors in an uproar.
In particular, IBM, Oracle and a few others made clear right out of the gate that they weren’t happy with one award for DOD’s $10 billion foray into commercial cloud. “The Pentagon would never limit the Air Force to flying only cargo planes for every mission. Locking the entire U.S. military into a single, restrictive cloud environment would be equally flawed,” said Sam Gordy, general manager of IBM U.S. Federal.
Others — like Amazon Web Services, which is thought to be the best suited to win the JEDI contract — were more bullish on the single award.
“We believe for them at this point in time, a single cloud is a good thing — a single award with one cloud for now,” AWS’s Teresa Carlson said in an interview in June.
Meanwhile, within the Pentagon, there was the shuffling of exactly who was in charge of developing this behemoth contract. Former DOD Chief Management Officer Jay Gibson led the effort through its infancy. But ultimately, CIO Dana Deasy was put in charge of the acquisition in June. Gibson has since left the department.
Then came the protests, before the bidding process was even complete — first Oracle, followed by IBM. The Government Accountability Office denied Oracle’s protest, saying JEDI’s single-award approach “is consistent with applicable statutes (and regulations) because the agency reasonably determined that a single-award approach is in the government’s best interests for various reasons, including national security concerns, as the statute allows.” GAO then dismissed IBM’s because Oracle took its protest to the Court of Federal Claims — “the matter involved is currently pending before a court of competent jurisdiction,” GAO said.
Google, on the other hand, removed itself from the competition, citing ethical concerns. The company didn’t want its technology used in lethal operations. Similarly, after protests from its employees, Google refused to continue work with the Air Force this year on Project Maven, which uses artificial intelligence to help analyze full-motion video surveillance, freeing up some of the workload for human analysts.
Microsoft employees, too, protested working with DOD on JEDI over similar ethical concerns. The company ultimately bid on the contract anyway.
That leaves us here, today. The battle for JEDI trudges along in the Court of Federal Claims. AWS has intervened on Oracle’s protest, mostly to protect its own name and proprietary information. And it’s still many months before the Pentagon will choose a winner.
Even then, the JEDI contract process is likely far from over. Whoever loses will likely protest, meaning any work on the contract will be delayed until that is sorted out. It could be fiscal 2020 before work under the contract kicks off.
For whoever wins, there are major stakes involved. Though DOD has said JEDI will account for only about 20 percent of its cloud spend, that’s still a big chunk for a single vendor. And DOD intends to move much of its application portfolio to the cloud through JEDI.
But most importantly, the contract, one of the largest cloud services contracts in the federal government to date, sets a precedent for incumbency. Whoever wins it will emerge as the biggest player in the federal IT space and continue to win work — particularly contracts that deal with classified or highly sensitive, mission-critical information — with other agencies ready to move to the cloud.
Indeed, 2018 was the year of JEDI, but really, we’ve only scratched the surface.
Should DHS be hacked?
Does the Department of Homeland Security (DHS) need to be hacked? Legislators seem to think so.
Recently, the SECURE Technology Act was passed in Congress. The bill, introduced by Rep. Will Hurd, R-Texas, who serves on the House Homeland Security and Intelligence committees, would require DHS to establish a bug bounty program and security vulnerability reporting process. This is the third round of legislation proposed in 2018 that would invite hackers to report security weaknesses directly to DHS to detect where they are most vulnerable.
Earlier this month, Sens. Rob Portman, R-Ohio, and Maggie Hassan, D-N.H., introduced the bipartisan Public-Private Cybersecurity Cooperation Act. This is a companion bill to the House version (H.R. 6735) that passed earlier this year. The bill directs DHS to establish a vulnerability disclosure policy (VDP) for DHS’s websites. A VDP authorizes individuals to look for vulnerabilities in specified assets.
“At a time when cyber threats are on the rise, the United States government must protect itself. Doing so involves drawing upon the vast expertise of hackers and security experts in our country to identify vulnerabilities and report them to the people in a position to fix those flaws in our systems,” said Senator Portman in a statement.
This statement gets it right: With no clear solution to the increasing threats to cybersecurity, the United States must enlist the help of hackers. Hackers have unique insights. They have diverse skill sets and expertise. And there are a lot of them ready to help. They are good at building things, but importantly, they also enjoy breaking them. Unlike the bad guys, they want to break things in order to have them fixed. To fully leverage the power of hackers, it’s important that they aren’t penalized for the work they do, and that the vulnerabilities they find are acknowledged, analyzed and eventually fixed.
How DHS should work with hackers
The Public-Private Cybersecurity Cooperation Act wisely gives guidance about what should be in a vulnerability disclosure policy — dubbed the “see something say something” of the internet for reporting security weaknesses. A VDP should state not only what a hacker can do, but what a hacker cannot. What testing techniques can be used, and importantly, which ones shouldn’t? What assets are covered and which ones are not? What types of vulnerabilities does DHS want the hackers to look for, and which ones are off limits?
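As an illustration of how a disclosure program is typically surfaced to researchers, many organizations also publish a small machine-readable pointer to their policy using the security.txt convention. The file below is a hypothetical sketch, not an actual DHS file — the contact address and policy URL are invented for illustration:

```text
# Hypothetical example served at https://example.dhs.gov/.well-known/security.txt
Contact: mailto:vulnerability-reports@example.dhs.gov
Policy: https://example.dhs.gov/vulnerability-disclosure-policy
# The policy page linked above is where the VDP itself would spell out
# scope (which assets are covered), permitted testing techniques,
# off-limits systems and vulnerability classes, and safe-harbor language.
```

The pointer file stays deliberately thin; the substantive answers to the questions above belong in the linked policy document.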
This type of information is included for clarity, but also to provide a safe harbor from liability. The DHS VDP should eventually be written similarly to the Department of Defense’s VDP, which clearly states that it will not initiate or recommend any law enforcement action or civil lawsuit related to activities permitted under the policy, and that in the event of any law enforcement or civil action brought by anyone other than DOD, it will take steps to make known that the hacker’s activities were conducted pursuant to and in compliance with its policy.
The more mature version of the VDP is a bug bounty program, where an organization pays hackers monetary rewards for the security flaws they report. The Senate passed a bill to establish a bug bounty pilot program within DHS (S.1281) earlier this year called the “Hack DHS Act.” Incentives add a layer of complexity: what to pay, how much, and for which findings all need to be thought through, and cash compensation brings taxes and forms with it. Most importantly, once hackers start hacking, bugs come in very quickly. Given the speed and volume of the findings, DHS has to be ready; and given its knowledge base and security expertise, I’m sure it will be. However, other agencies may not have as many resources or as developed a security posture as DHS. In those cases, a VDP is the better first step. Make sure that you’re ready to manage and remediate the vulnerabilities before moving on to a bug bounty.
The Department of Defense’s work with hackers has been extremely successful, paving the way for other agencies. Since 2016, DOD has resolved more than 5,000 security vulnerabilities as a result. As we saw with Hack the Pentagon’s bug bounty program, the first submission was reported within 13 minutes of the launch. By the end of the month, over 130 valid bugs were resolved in the Pentagon’s systems and tens of thousands of dollars paid to hackers for their efforts. This is why the bills require DHS to consult with DOD. Information sharing among the agencies is important. Less time reinventing the wheel means more time improving security.
We know that not everything can be public, but requiring the number of unique vulnerabilities reported, who found them, and how long it took to remediate them is a good first step. The entire point of finding vulnerabilities is to remediate them. I’m hopeful that one or all of these bills pass by the end of the year because by leveraging the hacking community worldwide, they bring security to the forefront in a positive way. The message is: we’re all in it together.
Deborah Chang joined HackerOne in 2018 as Vice President of Business Development and Policy. She started her career as an attorney at Wilson Sonsini Goodrich and Rosati in Palo Alto, working on IPOs, venture financings, M&A, and advising directors and officers. She has worked at many Silicon Valley companies, including Applied Materials, and most recently in senior business development roles at Massdrop and Shutterfly.