Former national security leaders want DOD to work more collaboratively on AI testing

The Pentagon’s former No. 3 official, Michèle Flournoy, and other national security experts want the Department of Defense to establish a more collaborative approach to testing and evaluating artificial intelligence.

In a new report, Flournoy and her co-authors stress the need for robust methods of testing, evaluation, validation and verification of AI that can move through the development pipeline fast enough to achieve the department’s ambitious AI goals. And Flournoy’s recommendations carry special weight, as she has been floated by some as a frontrunner for the top job in the Pentagon if Joe Biden is elected president in November.

The report lays out new team structures for testing, ways Congress can help fund the DOD’s purchase of AI testing tools and other new approaches to speed up testing in a way that improves reliability. Avril Haines, a former White House deputy national security adviser in the Obama administration, and Gabrielle Chefitz, a senior associate at Flournoy’s strategic advisory firm WestExec Advisors, co-authored the report.

Without trust in the AI systems that DOD builds, they won’t be useful, Flournoy said during a virtual event promoting the report hosted by the Center for Security and Emerging Technology. “This is going to require much greater coordination across the entire [testing, evaluation, validation and verification] ecosystem,” she said.

Flournoy, who served as undersecretary of defense for policy in the Obama administration, and her co-authors suggest creating a cross-functional team that would report to the Office of the Deputy Secretary of Defense, pulling members from across the services and secretariats for a common purpose and allowing for greater flexibility and collaboration. The majority of the team’s work would center on testing and evaluation research, but it would also assess specific models.

“You really can’t have a one-size-fits-all approach in this area,” Flournoy said.

The report also floats the idea that Congress should grant new budgetary authorities to the DOD for buying AI testing and evaluation tools, studies and other needed elements from the private sector. Current regulations do not allow for the needed flexibility, according to the report.

The Joint AI Center in the Pentagon has been working on new solutions to solve many of the challenges highlighted in the report, and some of its officials participated in interviews for its production.

Inside the DOD there is a fear that testing could be a “bottleneck” to AI progress, Jane Pinelis, the JAIC’s head of testing and evaluation, said Tuesday. Flournoy and others echoed the concern, saying that if the DOD doesn’t balance the need for thorough testing with the need to move fast to field AI, it risks losing its technical advantage to adversaries.

IRS under investigation for use of citizens’ phone location data

The inspector general of the IRS intends to investigate its Criminal Investigation unit’s subscription to a commercial database containing the phone location data of millions of citizens.

The Treasury Inspector General for Tax Administration agreed to a Sept. 24 request from Sens. Ron Wyden, D-Ore., and Elizabeth Warren, D-Mass., to review the IRS’s warrantless use of the Venntel database between 2017 and 2018, in a letter first obtained by Motherboard.

While this isn’t the first instance of federal law enforcement seeking access to citizens’ phone location data, it is the first for the nation’s tax collector, which has recently come under fire for targeting poorer Americans with audits.

“We are going to conduct a review of this matter, and we are in the process of contacting the CI division about this review,” reads Inspector General J. Russell George’s Sept. 30 letter to the senators. “Upon completion, to the extent allowable under the law, we will advise you of the results.”

At issue is the IRS’s lack of any court order when using the Venntel database, which the senators say violates the Supreme Court’s 2018 ruling in Carpenter v. United States that collecting significant quantities of historical phone location data constitutes a search requiring a warrant.

The IRS ignored “multiple follow-up requests” for documentation of its legal analysis after IRS officials revealed the practice on a June oversight call. Wyden and Warren asked not only that the analysis be examined to determine whether an “obvious violation” of privacy rights was approved, but also that IRS-CI’s use of other databases containing citizens’ information be investigated.

“The IRS is not above the law and the agency’s lawyers should never provide IRS-CI investigators with permission to bypass the courts and engage in warrantless surveillance of Americans,” reads the senators’ earlier request.

DHS advances AI project improving use of contractor past performance data

The Department of Homeland Security has entered the second phase of its project leveraging artificial intelligence to help agencies better use data on contractor past performance.

DHS’s Procurement Innovation Lab recruited 10 agencies to contribute data and funds to further evaluate seven AI solutions aimed at improving contracting officials’ ability to find Contractor Performance Assessment Reporting System (CPARS) data quickly.

The $50,000 phase 2 awards will allow a subset of the nine original companies that made AI prototypes to refine their solutions in areas like security accreditation for software-as-a-service (SaaS).

“This is not about using artificial intelligence to replace human intelligence but recognizing the challenge that some can face if it takes a long time to go into the system and figure out what records are relevant,” said Matthew Blum, associate administrator of the Office of Federal Procurement Policy, during an ACT-IAC event Tuesday. “So if we can use AI to shorten that timeframe and get this information much more rapidly, it saves the workforce a tremendous amount of time, but you still get all the value.”

Part of phase 2 will involve allowing agencies to weight past performance criteria depending on their needs in areas like cost control or business relations, before the AI gathers such information.

One of the companies chosen to advance is CORMAC, which created a federally accredited SaaS solution called the CORMAC Envisioning and Prediction Enhancing System (CREPES). The tool uses machine learning and natural language processing to rapidly determine how relevant past projects were to current requirements when pulling information.
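CORMAC has not published how CREPES works under the hood, but the core idea of scoring textual relevance can be illustrated with standard NLP tooling. The sketch below is purely hypothetical — not the company’s implementation — and ranks past-performance write-ups against a new requirement using TF-IDF vectors and cosine similarity, a common baseline for this kind of matching.

```python
# Hypothetical illustration only -- not CORMAC's actual implementation.
# Ranks past-performance records by textual relevance to a new requirement
# using TF-IDF vectors and cosine similarity (a common NLP baseline).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

requirement = "cloud migration and help desk support for a civilian agency"
past_records = [
    "provided tier 1-3 help desk services for a DHS component",
    "migrated on-premises workloads to a FedRAMP-authorized cloud",
    "construction management for a field office renovation",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([requirement] + past_records)

# Compare the requirement (row 0) against every past record (rows 1..n).
scores = cosine_similarity(matrix[0:1], matrix[1:]).flatten()
for score, record in sorted(zip(scores, past_records), reverse=True):
    print(f"{score:.2f}  {record}")
```

In practice, an agency could also weight scores by the past performance areas it cares about most, such as cost control or business relations, before ranking results.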

CORMAC has the opportunity to turn CREPES into a governmentwide solution, with phase 3 of DHS’s project consisting of a full pilot culminating in a multiple-award, governmentwide acquisition contract. DHS envisions a commercial, multi-vendor marketplace where the AI services are sold.

Because the AI solutions are only as good as the agency data provided, some have suggested requiring contractors to provide better information.

“Industry is very excited to hear about the AI pilots and their ability to pull relevant CPARS information more quickly from the existing data,” said Mike Smith, executive vice president of GovConRx. “But there’s a thought that if agencies would utilize the assistance of the contractor to provide relevant data, then it would be easier to put relevant data into the system — garbage in, garbage out.”

Contracting officials might not initially accept the idea of contractor past performance information coming to them via an AI filter, so training to familiarize them with the tools may be needed, said Melissa Starinsky, director of the Office of Acquisition & Grants Management within the Centers for Medicare & Medicaid Services.

“I think that’s a change we’ll have to get the workforce ready for,” Starinsky said.

OFPP Administrator Michael Wooten included past performance in the President’s Management Agenda Cross-Agency Priority Goal of frictionless acquisition.

The administration wants to use technology and data more effectively in acquisition, and CPARS is now a key part of that, Blum said.

“We want to make sure that our system is as responsive as possible,” Blum said.

State Department is looking for tools to manage its global supply chain risk

The State Department is faced with a pressing challenge: It wants to better understand its supply chain of IT vendors and be able to rapidly discover or anticipate risks to its networks.

Essentially, the department wants to maximize its confidence that its supply chain is safe from the bad guys, according to a new request for information published Monday. It calls for existing market capabilities that can help the department better monitor its supply chain and react to threats.

“Through access to and analysis of available sources, the information provided by the industry SCRM (Supply Chain Risk Management) tool to the Department of State (DOS) customer will provide DOS with breaking or anticipatory information, regarding specifically identified subject matters, situations, and geographic areas around the world,” says the RFI. It also suggests a solution would need to “easily integrate proprietary internal Government data.”

Currently, State monitors the IT supply chain it depends on to support its mission of American diplomacy across the globe. But with a new tool, the department hopes to “obtain, maintain, and retain total situational awareness of global supply chain related events before, during, and after they unfold” and “quickly verify or validate the credibility of a source, author, and online information” as quickly and easily as possible.

Supply chain security has remained top of mind for the State Department. In July 2019, the department moved forward with a $2 billion global supply chain security contract awarded to General Dynamics Information Technology.

How a small group of Marines is advancing modernization with coding

A small, informal group of Marines has banded together to bring greater digital understanding and modernization to the force, which has been late to adopt the digital trends embraced by other services.

The Marine Coders, as they call themselves, are on a mission to connect code-savvy Marines and educate others through open-source software projects and training. Their efforts are tied to advancing the guidance recently put out by the corps’ top officials to better take advantage of Marines’ time by automating away rote, time-intensive tasks.

The Marine Corps is the smallest military service but faces the same technology modernization challenges as the others. So, with fewer service members to take on that heavy lifting, corps leadership wants to use automation to offload some mundane tasks — a mantle the founders of Marine Coders have picked up as part of their mission. And the “few and the proud,” as the saying goes, are becoming fewer, with the Marine Corps calling for a thinning of the total force, driving an even greater need for computer-based automation.

“We should empower everyone to automate the stuff in their lives that is literally no value add,” Capt. Collin Chew, one of the founders of Marine Coders, told FedScoop in an interview.

Chew and Capt. Andrew Hutcheon have organically grown the organization, evolving it into a community of a handful of coders. The founders say they anticipate growth in the near future based on initial interest.

“We haven’t done an amazing job as a service enabling” tech-savvy Marines, Chew said.

So far, Marine Coders has taken on small challenges, Chew said: formatting Word documents, building out listservs and finding ways to get basic software training to Marines. But as the group grows and works to become an officially recognized organization, its founders anticipate taking on bigger problems.

Marines and airmen, hand-in-hand

The group was modeled after a similar organization in the Air Force, Airmen Coders. The Marine Corps’ iteration was founded after Chew and Hutcheon spent time with the Air Force’s Platform One DevSecOps team. Inspired by what technology could do to empower the lives of service members, they sought out a way to bring more Marines together in the fight for digital modernization.

“What we needed to do was establish something like [Airmen Coders] in the Marine Corps,” Hutcheon said.

The coronavirus pandemic added even more fuel to their fire. At the outset of the maximum work-from-home posture, Chew and Hutcheon were part of the team that helped build out the Air Force’s “Mattermost” secure chat function. Spending hours on calls with each other and the founder of Airmen Coders, Capt. Christian Brechbuhl, gave them time to dream up ways for Marines to get more involved in software. They pulled the basic foundation of the group together this summer and fully launched in September with a first hackathon.

The partnership between the Air Force and Marines has allowed each to share best practices. One critical piece of their success is making “asynchronous” material, which can be viewed at any time rather than during live video calls, to accommodate the military’s global presence.

“They took some best practices from us, made them better, and we are pulling [the practices] back,” Brechbuhl said in an interview.

Through Airmen Coders, members have already been able to publish applications on internal Air Force networks and on government-furnished devices. It’s a short-term goal Marine Coders wants to replicate. Chew and Hutcheon both said they don’t foresee a future where they direct all the projects coders work on, but instead allow Marines to take on the challenges they face in their own day-to-day work.

“We don’t want to restrict things that people can work on,” Chew said. “People want to solve things they are passionate about.”

Brechbuhl reports that senior leaders are supportive of his initiative. Chew and Hutcheon say they have already seen support from senior uniformed leaders for “non-traditional” means of attacking problems.

CIO Council issues guide for sizing up TBM progress

A federal working group released a tool for agencies to measure their progress in implementing the Technology Business Management (TBM) framework pushed by the White House.

The IT Spending Transparency Maturity Model is an optional method for agencies to evaluate where they’re headed in six areas essential to effective TBM: engagement, taxonomy, data, automation, reporting and value.

In 2019, the Office of Management and Budget began tracking agencies’ use of the private-sector framework for understanding the cost, quality and value of IT, and the model helps guide TBM adoption.

“More specifically, the new maturity model turns qualitative activities into quantitative metrics, helping agencies measure progress and keep teams and leadership aligned on what matters most for their mission,” reads the accompanying white paper. “Most importantly, this model helps U.S. government agencies be more effective, efficient, and accountable to the taxpayers they serve through transparency and continuous improvement.”

The CIO Council’s Federal Technology Investment Management Community of Practice developed the model in collaboration with the ACT-IAC IT Management and Modernization Community of Interest.

The two groups of government and industry stakeholders submitted a draft model for comments in August, and the release of the revised final product on Sept. 30 completed a milestone within Action 9 of the Federal Data Strategy 2020 Action Plan, which is aimed at improving financial management data standards.

Under TBM, agencies collect and map data to the framework’s taxonomy, a common language for generating metrics and reports. Automation is needed early on to keep the requisite data updated and error-free.
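As a rough illustration of what that mapping step looks like in practice, the hypothetical sketch below rolls raw spending line items up into simplified taxonomy categories. The category names are stand-ins, not the official TBM taxonomy.

```python
# Hypothetical sketch of the TBM mapping step: roll raw spending line items
# up into shared taxonomy categories so reports are comparable across agencies.
# Category names here are simplified stand-ins, not the official TBM taxonomy.
from collections import defaultdict

TAXONOMY_MAP = {
    "aws hosting": "Cloud",
    "azure hosting": "Cloud",
    "help desk contract": "End User Services",
    "database licenses": "Software",
}

spend_records = [
    ("aws hosting", 120_000),
    ("help desk contract", 45_000),
    ("database licenses", 80_000),
    ("azure hosting", 60_000),
    ("misc hardware", 15_000),
]

totals = defaultdict(int)
for line_item, cost in spend_records:
    totals[TAXONOMY_MAP.get(line_item, "Uncategorized")] += cost

for category, cost in sorted(totals.items()):
    print(f"{category}: ${cost:,}")
```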

Agencies with lower maturity according to the model tend to be more reactive and lack the metrics and comprehensive processes for TBM.

“At the highest levels of maturity, you gain tightly controlled governance, integrated systems, and optimized services,” reads the white paper. “Your agency can leverage the IT spending transparency maturity model to support the implementation of TBM as well as the data collection activities in the Federal Data Strategy Action Plan.”

What, exactly, is a U.S. CTO?

With the advent of the U.S. Chief Technology Officer position in 2009, the concept of digital technology got some prime real estate within the White House and, more broadly, the federal government. But what it means to be the CTO of a country like the United States wasn’t immediately clear.

From the moment in 2007 that presidential candidate Barack Obama announced his intention, if elected, to appoint the first U.S. CTO, analysts and commentators began speculating as to the wide range of authorities this role could potentially involve.

Would the CTO oversee technology within the White House? Help the president set policy? Bring “innovation” to executive branch agencies? What is a chief technology officer, anyway?

Thirteen years, two administrations and four U.S. CTOs later, a look through the projects pursued by this office reveals overlapping themes but malleable focal points. If there’s a general statement to be made about the job, it’s that a CTO’s agenda depends on the timing of their arrival, the political landscape and the individual’s fundamental philosophy about the role of technology in society. And while the title is borrowed from the corporate world, the actual job description is a lot more government-native — and it remains adjustable according to the priorities of a given administration at a given time.

Of course, the role, like all presidential appointments, may soon be taken over by a new administration and a new individual. Former Democratic candidate Andrew Yang, for example, has been suggested as a possible Biden administration CTO. What this will look like remains to be seen. But, perhaps a look back at the history of the role will give it some context.

Aneesh Chopra sets the scene (2009-2012)

Aneesh Chopra didn’t think he was lining up to become the first U.S. CTO when he joined the Obama transition team and helped to develop a job description for the role.

“I assumed it was going to go to a Silicon Valley luminary,” Chopra told FedScoop in an interview last fall. “I was hopeful I could join the administration as a CTO role for the Department of Health and Human Services.”

Still, he says, it was a “pleasant surprise” to get the offer.

While on the campaign trail, Obama made the pitch for a U.S. CTO role as part of his pro-innovation agenda. “In the 21st century, our economic success will depend not only on economic analysis but also on technological sophistication and direct experience in this powerful engine of our economy,” campaign material from the time reads. Obama also positioned the CTO as a government transparency driver — an administration official who would use technology to solicit feedback from citizens and use it to improve government functions.

It was Chopra, though, who operationalized this campaign concept and developed the three key responsibilities he says characterize the role. First, the U.S. CTO should think about how to apply technology, data and innovation to solve for economic growth. Second, the CTO should apply this same set of tools to the other “national priorities” of the administration. And third, the CTO should advocate for and support opening up government data for use by the private sector.

One big definitional element to Chopra’s tenure involved setting out the responsibilities of the new CTO vis-à-vis the federal chief information officer. The CTO is an assistant to the president embedded in the White House Office of Science and Technology Policy, while the CIO is a position within the Office of Management and Budget. To Chopra, it was very obvious that the CTO should complement but not compete with the Federal CIO. So while Obama’s initial pitch for the CTO position made more allusions to the kind of inside-government work that is done by the CIO’s office, Chopra decided to decouple the jobs.

“On a simple basis, you know, inside-outside,” Chopra said, describing the turf of each role. “The role of technology inside the government would be led by the CIO, the role of technology outside the government would be led by the CTO, and then a lot of collaboration at the interface between the inside and the outside around things like open data, open APIs and so forth.”

Chopra gives this example of how the roles and responsibilities were divided: It was Federal CIO Vivek Kundra who was in charge of creating data.gov in 2009, but then once created, Chopra says, his office “nurtured” the community of stakeholders surrounding the platform.

This close collaboration and division of responsibilities, he says, was an important feature of his time on the job.

Todd Park and the talent legacy (2012-2014)

Todd Park, the second U.S. CTO, also came to the role with data bona fides. He’d been at HHS working on open data initiatives, and he brought this enthusiasm for the power of open government data to create innovation and jobs with him.

Park continued this work in the White House and then, after the disastrous rollout of Healthcare.gov in 2013, spent a good deal of his attention reviving this signature administration initiative.

But the most distinctive feature of Park’s tenure, and arguably his legacy as CTO, would turn out to be something else: attracting new talent.

Park was a key figure in the 2012 launch of the Presidential Innovation Fellows program, which brings tech-savvy developers, designers and entrepreneurs into the federal government for a limited-term “tour of duty.”

“We’re bringing 15 of the most badass innovators on the planet to come into the government and work on five game-changing projects with the goal of delivering significant results within six months,” he told Fast Company in an interview in 2012.

The program, which has survived into the Trump presidency, in turn helped to create other government tech talent initiatives like the General Services Administration’s 18F team and the U.S. Digital Service. And the impact goes beyond short-term appointments — many former PIFs have gone on to continue their careers in government technology in more permanent capacities.

Park also seems to have viewed the talent recruitment job as a long-term feature of the role. “You know, someday in the year 2040, the U.S. CTO is going to attend TechCrunch Disrupt to recruit the next generation of public servants,” he said. “And the people there are going to say, ‘Well, of course, the government is here. That’s not surprising at all.’”

After leaving the CTO role in 2014, Park returned to Silicon Valley (he’s quipped that his wife threatened to divorce him if he stayed in D.C. any longer), but continued to help recruit tech talent for the federal government.

Megan Smith and a dash of data science (2014-2017)

Megan Smith, who left her job as a vice president at Google to become the third U.S. CTO, continued in much the same vein as her Obama-era predecessors.

“As described, [the job is] how do you help the president and the team harness the power of data, innovation and technology on behalf of the American people,” she told FedScoop in an interview in March. She describes the job as a “very plus-one role,” where the CTO is present to offer suggestions for ways in which technology might support other policy efforts, such as those to create more economic opportunity or a higher quality of life for American citizens.

For example, Smith’s team was responsible for creating and releasing the first national strategy on artificial intelligence in 2016, a report that explored the opportunities, challenges and impacts arising from this particular technology, as well as what federal research and development priorities in this area should be. Smith’s office demonstrated a commitment to collaboration during this process by seeking active public engagement in the creation of the report, including through a series of national AI town-halls.

In addition to this policy work, she continued to “incubate” government tech talent in her office. Smith scaled the Presidential Innovation Fellowship program and set it up for permanence while also advocating for civic tech talent across the federal government. Working with Chief of Staff Denis McDonough, Smith and Deputy CTO Alexander MacGillivray created the Tech Policy Task Force, an internal coalition chaired by Smith that brought technical talent from various White House teams together to deliberate about and share input on topics such as broadband, data privacy, patent reform, diversity in tech and federal open source policy.

Smith additionally supported the growth of a tech-in-government ecosystem regionally in the U.S., and internationally through groups such as the Open Government Partnership.

And Smith expanded the purview of her own role a little too.

“We added a lot of data science,” she said. For example, it was during Smith’s tenure that the chief data scientist role, filled by DJ Patil, was created at OSTP. And Smith’s office helped kick off The Opportunity Project in 2016 — an ongoing initiative that encourages private sector companies, nonprofit and academic organizations to tap into available federal open data to create new technology tools. Her team also worked on issues like precision medicine, big data and building data science communities of practice within the federal government.


Michael Kratsios focuses in on national technology policy (2019-present)

By his own admission, Michael Kratsios has taken a “very different track” compared to those who previously held his job.

Fresh from a job as chief of staff at Peter Thiel’s venture firm Thiel Capital, Kratsios was named deputy CTO in 2017. For a while, he was the de facto leader at OSTP, acting as both director of OSTP and presidential science adviser. Dr. Kelvin Droegemeier eventually joined the administration to fill those roles in an official capacity in early 2019. Officially, the CTO role remained empty until Kratsios was promoted and then Senate-confirmed to fill it in August 2019, making him the fourth person to hold the title and the first under President Trump.

The job is still fundamentally a policy advisory role, but one much more focused on policy that pertains to technology itself.

“What we’ve focused on, which I think is new and is also a reflection of the times we live in — this office has been, since inauguration, focused on driving national technology policy,” Kratsios told FedScoop in an interview. “We work to ensure American leadership in emerging technologies.”

This approach was evident very early on — a White House “tech week” in July 2017 focused on drones and the creation of 5G infrastructure. Artificial intelligence, too, has continued to be a key priority.

“Everything from the changing face of artificial intelligence to quantum computing to 5G, even issues like internet privacy, encryption … these are all questions that are critical to our nation’s security and our nation’s future, and it’s important to have a team that’s dedicated to driving policy outcomes in those areas,” Kratsios said.

“Broadly, across all tech issues, we focus on research and development, workforce issues and regulatory issues, and we drive and develop national strategies on our most important technology domains,” Kratsios said. He also highlighted the role he’s taken on the international stage, representing the White House at various intergovernmental group meetings.

This transition of focus at the office of the CTO was solidified by the creation of the Office of American Innovation, run by presidential son-in-law and advisor Jared Kushner. This new office, both Kratsios and Chopra noted, has taken the lead on some of the internal government tech modernization that previously might have been the purview of the CTO. It was OAI that originated the idea of the IT modernization Centers of Excellence now housed at GSA, for example — arguably one of the Trump administration’s flagship government IT projects.

Still, some priorities of the role have remained very similar over time, including the expansion of access to government data. And some mechanics of the role are similar too — Kratsios says that the “inside-outside” model created by Chopra still largely defines how his office interacts with that of the federal CIO. On the American AI Initiative, for example, one of Kratsios’ office’s flagship projects, OSTP collaborates closely with the Office of the CIO on the task of further opening data collected and held by federal agencies.

To Kratsios, the role as it is currently crafted reveals the priorities of the administration he works for. “For us to be able to have senior leadership here driving a singular issue, that being technology, is an indicator of how seriously this administration is taking tech policy,” he said.

And arguably there’s a certain conceptual similarity here too: Throughout the history of this role, revealing priorities has always been what it does best.

JAIC gets new director after Senate confirmation

Lt. Gen. Michael Groen has been confirmed by the Senate to be the new director of the Joint Artificial Intelligence Center, the Department of Defense’s hub for fielding AI technologies.

The general’s confirmation comes with a promotion to lieutenant general as he assumes a DOD position with important oversight of emerging technologies. Groen comes to the JAIC at a critical time, as the center expands some of its key work around warfighting initiatives and transitions away from some of its earlier, lower-risk projects.

“The @DoDJAIC welcomed our new Director, Lieutenant General Michael Groen, USMC, to the team today. We look forward to his leadership!” the JAIC tweeted Oct. 1. Groen was confirmed by voice vote at the end of September.

Groen has a long history of working with military technology and intelligence. Before his appointment he served as the chief of Marine Corps intelligence and on the Joint Staff in senior positions. The general most recently came from the National Security Agency, where he led computer network operations.

Groen is the second confirmed director of the JAIC, following in the footsteps of Lt. Gen. Jack Shanahan, who retired from service in the Air Force this summer after standing up the JAIC.

Air Force adds more tech companies to Skyborg program

The Air Force expanded its list of companies vying for task orders on its futuristic $400 million Skyborg program that links piloted aircraft with autonomous planes serving as a “loyal wingman.”

The service added nine companies to the indefinite-delivery, indefinite-quantity contract, including several non-traditional companies and technology firms, a move Air Force leadership has said is important as it seeks to field emerging technology.

This second round of announced companies includes AeroVironment Inc., Autonodyne LLC, BAE Systems Controls Inc., Blue Force Technologies Inc., Fregata System Inc., Lockheed Martin Aeronautics Company, NextGen Aeronautics Inc., Sierra Technical Services and Wichita State University. The additions raise the total number of companies on the contract to 13.

Skyborg will be powered by artificial intelligence to help control the autonomous aircraft that will follow and assist piloted planes. The program wants those autonomous aircraft to be “attritable,” meaning the drones should be cheap enough to be expendable in a high-risk operation.

“This second phase of awards establishes a diverse and competitive vendor pool by adding several non-traditional and traditional contractors we saw as important additions to the effort,” Brig. Gen. Dale White, program executive officer for fighters and advanced aircraft, said in a statement. “The diversity of approaches and backgrounds allows us to access the best industry has to offer.”

White leads the Skyborg program along with Brig. Gen. Heather Pringle, commander of the Air Force Research Laboratory. Both leaders have previously spoken about the need for a diverse vendor pool to execute the multifaceted program successfully. The initial companies named on the contract include big defense names like Boeing, Kratos Defense and Northrop Grumman.

CISA selects EnDyna for vulnerability disclosure platform shared service

The Cybersecurity and Infrastructure Security Agency awarded EnDyna Inc. a $13.5 million contract Friday to support its governmentwide vulnerability disclosure policy (VDP) shared service for agencies looking to work with researchers to find security flaws.

Based in McLean, Virginia, the consulting firm plans to begin providing the centrally managed system in early 2021 for processing reports from researchers as they find vulnerabilities in agencies’ externally facing IT systems.

The VDP platform is the first of three initial shared services CISA will offer agencies as an officially designated quality services management office (QSMO).

“CISA, designated by the White House as the Cybersecurity Quality Services Management Office in April, will build on its current cybersecurity offerings to provide a marketplace of services to agencies to protect and defend systems and operations and deliver cybersecurity solutions that continuously leverage industry innovation, in alignment with the National Cybersecurity Strategy,” said Bryan Ware, assistant director for cybersecurity, in the announcement Friday.

The first of the four original QSMOs made official, CISA will eventually manage a marketplace of cloud-based systems and services, offered by federal shared service providers, for agencies to choose from — rather than finding or developing their own solutions.

CISA partnered with the General Services Administration to acquire the VDP platform on Sept. 25, so both the services and the acquisition vehicle will be available to agencies through the marketplace.

The second marketplace offering is a security operations center-as-a-service (SOCaaS) the Department of Justice will provide to small agencies, though commercial providers will also be identified.

And the final marketplace offering will be a protective Domain Name System (DNS) service for blocking access to malicious websites when translating their human-friendly domain names into the numerical Internet Protocol addresses computers use. That award is expected to go to a commercial vendor next fiscal year.
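Conceptually, a protective DNS service sits in the resolution path and refuses to answer queries for known-bad domains. The toy sketch below, built around a hypothetical blocklist, shows only the basic idea; a real offering would filter at the resolver and draw on continuously updated threat intelligence, and nothing here reflects CISA’s planned service.

```python
# Toy illustration of the protective DNS concept -- not CISA's planned service.
# A real protective DNS offering filters queries at the resolver; here we
# simply refuse to resolve any domain on a hypothetical blocklist.
import socket

BLOCKLIST = {"malicious-example.test"}  # stand-in for a threat-intel feed

def resolve(domain: str) -> str:
    """Translate a domain name into an IP address, unless it is blocked."""
    if domain in BLOCKLIST:
        raise PermissionError(f"{domain} is blocked by protective DNS policy")
    return socket.gethostbyname(domain)  # normal DNS resolution

print(resolve("example.com"))          # resolves normally
# resolve("malicious-example.test")    # would raise PermissionError
```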