Congress plans to keep a close eye on bloat in Space Force

The Space Force should stay as lean as possible while ensuring it onboards tech talent, key members of Congress warned in the past week.

While the White House has yet to issue a full defense budget request, the Space Force is already getting warnings not to add bureaucratic bloat to its spending, Rep. Anthony Brown, D-Md., who sits on the House Armed Services Committee, said Wednesday.

“What we don’t want to see in the Space Force is a burgeoning headquarters and a Fourth Estate,” he said during a virtual Center for Strategic and International Studies event.

The Fourth Estate comprises the Department of Defense’s support agencies that are not part of the military services, like the Defense Information Systems Agency and the Defense Contract Management Agency. Critics often point to these agencies as an easy target for defense budget cuts, and Brown gave the Space Force a preemptive warning not to start growing its own.

Brown did, however, extend his general support for the Space Force, given the advancement in space technology and the potential for conflicts in the domain.

Brown’s comments follow those from the top Democrat on the House Appropriations Defense Subcommittee, who told Space and Air Force leaders Friday they need to hurry up in filling top acquisition and tech roles. The Department of the Air Force houses the Space Force.

“[W]hile progress has been made on the operations side, progress in addressing long-standing acquisitions issues has been disappointing so far,” Rep. Betty McCollum, D-Minn., said during a subcommittee hearing. “Too often over the past two decades, the space acquisitions programs have been delivered late, over budget, and sometimes billions of dollars over budget.”

That disappointment extends to issues that predate the Space Force. The Government Accountability Office found a space-based missile warning satellite program running nine years behind schedule and $15 billion over budget, an example of what McCollum said she wants to avoid in the future.

“GAO also found in March 2019 that key software-intensive space programs often did not effectively engage users to understand requirements and obtain feedback,” she said.

McCollum described the efforts she has seen so far as only “minor tweaks around the edges” and not the wholesale, ground-up reform the Space Force has promised. She added that senior civilian leadership focused on space acquisition is a must, which Chief of Space Operations Gen. John Raymond and acting Secretary of the Air Force John Roth both agreed with.

“We have got to go faster in modernizing our space capabilities and delivering capabilities and putting them in the hands of the warfighter,” said Raymond during the hearing.

The force recently published a strategy to become a “digital service,” where it would leverage technology in all of its operations. The strategy even pitched the idea of allowing guardians, as Space Force service members are called, to be “digital nomads” working remotely instead of being chained to a desk.

Army looking for tech to help segment data

There are two types of data in the Army: data that needs to move at the speed of milliseconds, and data that doesn’t.

Figuring out which category a given piece of data falls into, and how best to partition a network to meet that data’s speed demands, is a challenge the Army is asking private industry to help solve. The No. 2 official in the Joint Staff’s J-6 command, control and communications office said the Army wants technologies that can segment different types of data across networks to save precious bandwidth in conflict zones.

“We are also concerned about overwhelming a limited network, particularly in a denied, degraded, intermittent or limited environment,” Army Brig. Gen. Rob Parker, deputy director of Joint Staff J-6, said during an AFCEA DC virtual event. “We are looking to industry to help us find technological solutions to work through that.”

An example of the type of data that needs to move fast is targeting information. The military has been trying to find tech to help ensure bombs and other weapons hit their targets.

The Army is working to build a “data fabric,” a system of systems that data can flow across. It’s a multi-pronged challenge that the service hopes will enable faster, multi-domain operations by letting machines in the air exchange data directly with machines on the ground, at sea, in space or in cyberspace.

The tech backbone of the Army’s multi-domain operations strategy is Project Convergence, the connect-everything-to-everything, sensor-to-shooter networking project that is itself part of the larger military-wide Joint All Domain Command and Control (JADC2) concept of operations. It’s a nesting doll of technical systems and acronyms, but the essential challenge is to find ways to connect and use more data in operations.

The specific challenge Parker spoke about is ensuring that once platforms are connected through a data fabric, bandwidth and other limited resources are properly used.

“We need to figure out how we are sending the right data and only the necessary data,” Parker said.
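The challenge Parker describes can be pictured as a priority queue on a constrained link, where each message is tagged with a latency tier before it is sent. The sketch below is purely illustrative; the tier names and queue design are assumptions for the example, not anything the Army has specified.

```python
from dataclasses import dataclass, field
from enum import Enum
import heapq

class Tier(Enum):
    REAL_TIME = 0   # e.g., targeting data that must move in milliseconds
    DEFERRABLE = 1  # e.g., routine reports that can wait

@dataclass(order=True)
class Message:
    tier: int
    payload: str = field(compare=False)  # payload is not used for ordering

class PriorityLink:
    """Drains real-time traffic before deferrable traffic on a limited link."""
    def __init__(self):
        self._queue = []

    def enqueue(self, msg: Message):
        heapq.heappush(self._queue, msg)

    def drain(self):
        while self._queue:
            yield heapq.heappop(self._queue)

link = PriorityLink()
link.enqueue(Message(Tier.DEFERRABLE.value, "supply report"))
link.enqueue(Message(Tier.REAL_TIME.value, "target track update"))
order = [m.payload for m in link.drain()]
print(order)  # real-time traffic drains first
```

Even this toy version shows the core trade-off: deferrable traffic waits whenever the link is saturated, which is why deciding what counts as "the right data and only the necessary data" matters before anything is queued.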

To help answer that question, the military is drawing up new cloud architectures and data management policies. The Army’s “nirvana” for cloud is a “poly-cloud environment” that provides continuous enterprise support and tactical-edge capabilities.

It’s part of the unified network plan to integrate cloud and networking capabilities championed by the Army’s top uniformed IT official, Lt. Gen. John Morrison, deputy chief of staff for the G-6. The architecture design of that network will be coming in the summer, Morrison said.

DOD Deputy Secretary Kathleen Hicks also recently released new “Data Decrees” that aim to implement the data strategy. Parker said those decrees and future policies from the deputy secretary and chief data officer will inform future standards and technical directives the J-6 and other Joint Staff offices will be putting out.

“In those, we will see reflections of those data decrees,” he said.

The pandemic accelerated digital transformation in Washington — what’s next?

For federal agencies, the COVID-19 pandemic served as a springboard opportunity to jump-start digital transformation — either scaling the transformative efforts they already had underway or rapidly pivoting during the crisis to innovate, catch up and maintain operations.

Now, as the world returns to some semblance of normalcy and agencies look to operate beyond the pandemic, the key will be to sustain that innovation and transformation at scale by creating a culture that fully embraces the shifts that occurred over the past year, said Carl De Groote, area vice president of U.S. federal for Cisco.

Ahead of Cisco’s FedFWD Summit on Thursday, De Groote spoke with FedScoop about what federal agencies should be focusing on in the post-pandemic time ahead and how they can double down and emerge from this tumultuous period as truly digital organizations.

Cisco is a big believer in the power of the platform to drive that transformation.

“Whether it be collaboration to bring expertise together with a citizen that has questions or bring the military together to collaborate around their high-value problem sets to protect our nation, whether it’s helping an analyst and national security find that needle in a haystack, platforms allow technologists and CIOs to extend those capabilities into workflows and processes, to take advantage of quicker access to data, converting it to information,” De Groote said.

Platforms are all about being “extensible, rapidly deployable, rapidly usable, and able to produce rapid results as it relates to work,” he said.

Cisco’s Webex is one of those platforms that came to the rescue during the pandemic and will likely play a key role across the federal government as agencies continue to work in a hybrid format split between in-office activity and personnel working remotely.

For Congress, De Groote said, Webex allowed the House and Senate to continue conducting hearings and writing legislation when it became unsafe to do so in-person.

“It extended the capabilities for the green and red light to be able to pass time back and forth between colleagues, to bring together all of our lawmakers, congressmen and women and senators to be able to be present, to take on topics, to learn, to vote, being able to poll and conduct things that they used to in a physical state rely on, to bring that experience to an online platform and mechanism,” De Groote said.

That’s key for the military as well, he said. “If you think about command and control in a secure environment, how do they do that successfully where they can rely on the fact that collaboration is secure? That the content they’re sharing remains protected? So, now it’s really looking at the different problem sets, the different processes and keep on extending and driving that innovation.”

And the security of the platform can’t be thought about after the fact, De Groote said. “It’s got to be integrated, it’s got to be built into the design so that as our customers experience our platforms and conduct their work, they have the confidence that it’s going to be secure all the time.”

De Groote hopes that attendees of Thursday’s FedFWD Summit will come away with “the art of what’s possible,” he said. “How can technology serve the government and its mission, and understand that it’s not only an opportunity to deploy technology and platforms, it’s also the opportunity to reshape policy so that we can administer and deploy new capabilities and embrace change.”

DOD takes automation a step further with machine learning

Automating tasks has long been a goal of large workforces, and none is larger than the Department of Defense.

With financial management systems that process a budget of more than $700 billion annually, a helping (digital) hand can reduce wasted labor hours and costly mistakes. But simple automation sometimes is not enough to solve more complex challenges, like pairing unmatched transactions in databases.

That’s why the Defense Innovation Unit teamed up with the Joint Artificial Intelligence Center to inject a boost of machine learning so that robotic process automation (RPA) can approach more complex tasks, like finding mislabeled and unmatched transactions.

“The DOD has been using RPAs for several years to help fix [unmatched transactions], but RPAs are based on simple ‘if-then-else’ cases where most of the [unmatched transactions] require more sophisticated analysis, which up to now have required manual intervention,” said Eric Dorsey, a DIU program manager in the AI portfolio.

The specific problem of unmatched funds wastes millions of labor hours each year, according to the DOD. When dollars are misclassified or unmatched, humans need to go into the spreadsheets to correct the record, often a roughly two-hour job per unmatched data point. With the Army’s two million unmatched transactions per year, that’s millions of hours spent on tasks DIU is proving robots can easily do.
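The scale follows directly from those figures. The back-of-the-envelope math below simply restates the article's estimates in code; the numbers are the cited approximations, not official totals.

```python
# Rough estimate using the figures cited above
transactions_per_year = 2_000_000  # Army unmatched transactions annually
hours_per_fix = 2                  # roughly two hours of manual correction each

total_hours = transactions_per_year * hours_per_fix
print(total_hours)  # labor hours per year spent on manual matching
```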

Most RPA bots are limited to small, repetitive tasks where operations are explicitly written into software, limiting the program’s flexibility in uncertain scenarios. But using the massive amount of financial data from the DOD’s systems, machine learning is now being used to help advance RPA’s flexibility to solve problems, saving time and money, according to the department.

“After the machine learning front end creates candidate correction, it is routed to an RPA for the final database update,” Dorsey said in an email to FedScoop.

Machine learning finds knowledge within data, learning from massive data sets to find patterns and turn them into predictable algorithms. The data generated through financial systems provide ample information for machine learning to work through and improve RPA’s ability to close the loop on certain tasks, like matching unmatched funds.
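Dorsey’s description of the pipeline, an ML front end proposing a candidate correction and an RPA applying it, can be sketched roughly as follows. This is a toy illustration, not the DOD system: the ledger entries are invented, and a simple string-similarity score stands in for the trained model.

```python
# Toy sketch of the ML-then-RPA pattern: a "front end" scores candidate
# matches for an unmatched transaction, and an RPA-style if-then-else rule
# applies the correction only when confidence clears a threshold.
from difflib import SequenceMatcher

# Invented ledger entries for illustration
ledger = [
    "Travel voucher FY21-0042",
    "Fuel purchase FY21-0107",
    "Parts order FY21-0777",
]

def propose_candidate(unmatched: str):
    """Stand-in for the ML front end: score every ledger entry."""
    scored = [
        (SequenceMatcher(None, unmatched, entry).ratio(), entry)
        for entry in ledger
    ]
    return max(scored)  # (confidence, best candidate)

def rpa_apply(unmatched: str, threshold: float = 0.8):
    """RPA-style rule: apply only high-confidence corrections."""
    confidence, candidate = propose_candidate(unmatched)
    if confidence >= threshold:
        return ("matched", candidate)
    return ("manual_review", None)

# A source record with a typo that simple exact matching would miss
status, match = rpa_apply("Travel vocher FY21-0042")
print(status, match)
```

The division of labor mirrors what Dorsey describes: the scoring step handles the fuzzy judgment that fixed “if-then-else” rules cannot, while the final database update remains a deterministic, auditable rule.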

DIU paired contractors DataRobot with the Army and Summit2Sea with DOD’s Office of the Comptroller. Both are developing their machine learning platforms within Advana, the government’s secure data management cloud, according to DIU.

Time and money saved

A solicitation for the problem was initially posted in May 2020 with the request to find a machine learning platform that “will identify and suggest corrections to business processes that are not limited to previously well-defined business logic methods.”

The two pilots do not yet cover 100 percent of the unmatched transactions, but they currently handle a “high volume” of use cases.

“We are currently achieving a high level of accuracy with these use cases,” Dorsey said.

It’s also happening quickly. Because DIU is able to work with commercial and nontraditional vendors by navigating around the traditional contracting morass the department usually goes through, there haven’t been the typical lengthy requests for proposals filled with stringent requirements and government legalese — just problem statements and requests for solutions.

Working with the JAIC, DIU helped save time and effort in building out the machine learning algorithms.

“The collective team identified the unmatched transaction use case to be highly suitable for automation while delivering significant mission impact,” Bryan Lane, chief of business and health transformation with the JAIC, said.

GSA leads rise in automation projects governmentwide

The General Services Administration has saved about 50,000 labor hours in 2021 alone by automating work.

On top of that, a dozen machine learning and artificial intelligence projects are in the pilot or developmental phase, while four more are fully operational, according to an agency spokesperson.

The projects are part of GSA’s “eliminate, optimize or automate” push over the last two years, an effort that’s only speeding up, the spokesperson said.

“We expect that the velocity of AI/ML adoption will accelerate similar to our [robotic process automation] program over the next few years,” the spokesperson said. “The various pilots and projects are on different deployment timeframes but cover all our primary mission areas including Public Buildings Service, Federal Acquisition Service, finance, IT and HR.”

One such project is the Solicitation Review, which uses supervised ML to predict whether federal IT solicitations posted to beta.SAM.gov are compliant with Section 508 of the Rehabilitation Act, which lays out IT accessibility requirements. The tool helps GSA employees more efficiently review solicitations and reduces the risk of noncompliance.
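Conceptually, a supervised model like this learns from labeled examples of compliant and noncompliant solicitation language. The sketch below uses a hand-rolled naive Bayes word-count classifier on invented snippets; GSA’s actual model, features, and training data are not described in detail here, so everything in the example is an assumption for illustration.

```python
# Toy supervised text classifier (not GSA's model): label 1 means the
# solicitation language suggests Section 508 compliance, 0 means it doesn't.
from collections import Counter
import math

# Invented, hand-labeled training snippets
train = [
    ("vendor shall conform to section 508 accessibility standards", 1),
    ("deliverables must meet wcag accessibility requirements", 1),
    ("contractor provides software licenses and support", 0),
    ("vendor supplies hardware maintenance services", 0),
]

def fit(examples):
    """Count word occurrences per label."""
    counts = {0: Counter(), 1: Counter()}
    for text, label in examples:
        counts[label].update(text.split())
    return counts

def predict(counts, text):
    """Naive Bayes with add-one smoothing; returns the higher-scoring label."""
    vocab = len(set(counts[0]) | set(counts[1]))
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        scores[label] = sum(
            math.log((c[w] + 1) / (total + vocab)) for w in text.split()
        )
    return max(scores, key=scores.get)

model = fit(train)
print(predict(model, "offeror must address section 508 accessibility"))  # 1
```

A production system would use far richer features and far more data, but the workflow is the same: the model flags likely problems so human reviewers can spend their time on the solicitations that need attention.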

A second automation project is a virtual assistant that provides employees with IT self-help capabilities.

GSA doesn’t rely on one procurement method for AI services, instead using a number of contracts to encourage competition and equity among small and disadvantaged businesses. Such contracts are made available to other agencies as well.

Automation spans a number of technologies including ML, natural language processing, chatbots and RPA — the last of which is often the lowest-hanging fruit for agencies. The State Department, Social Security Administration, U.S. Patent and Trademark Office, Department of Labor, Army, Air Force and Navy are among the agencies that have RPA programs.

A big reason automation projects are on the rise governmentwide is the Federal RPA Community of Practice (CoP) and its voluntary leadership team within GSA, said Jim Walker, chief technology officer at UiPath, during a recent ACT-IAC event.

The government-only user group launched in 2019 and has grown to 69 member agencies and about 1,200 attendees on monthly calls.

In November, the CoP issued a State of Federal RPA report — the first detailed review of the technology across government. The report found a 110% increase in deployed automations between fiscal 2019 and 2020.

Additionally, the report found a 195% increase in capacity hours created.

The CoP also created a maturity model for agencies to gauge their RPA progress and saw the number of Level 4 projects grow from zero to five between fiscal 2019 and 2020.

While only 23 agencies participated in the first report, that number should grow as enthusiasm for RPA spreads.

GSA Chief Financial Officer Gerard Badorrek recently oversaw a 100-day, industry-wide challenge to create RPA solutions that improved the experience for agencies submitting budget justifications. A total of 10 RPA solutions came out of the event, and 12 employees were trained in the technology.

DOD No. 2 issues new ‘Data Decrees’ for military, pushing for better data management

Deputy Secretary of Defense Kathleen Hicks issued a new memo introducing “Data Decrees” that give instructions to the Department of Defense on how to better use the military’s data for everything from back-office operations to battlefield decision-making.

The memo pushes forward the DOD’s data strategy, echoing many of the same themes from the document published in October. The memo gives specific advice on data management for the whole department and calls on senior leaders to use the DOD’s Advana platform as a central place for decision-making data analysis.

In addition to the Data Decrees, the memo requests assessments on structural changes to the department to improve data use. By July 1, a final report is due on the separation of the chief data officer role from the Office of the Chief Information Officer to make the CDO a more independent and central source for data management efforts.

“Data is a strategic asset. Transforming the Department of Defense (DoD) to a data-centric organization is critical to improving performance and creating decision advantage at all echelons from the battlespace to the board room, ensuring U.S. competitive advantage,” Hicks states in the May 5 memo.

The Data Decrees, which were teased by CDO Dave Spirk at the FedScoop IT Modernization Summit in April, include five action items for the department.

Hicks has been a staunch supporter of the DOD’s data efforts since before her confirmation as the department’s No. 2. At appearances, she has said that turning DOD into a data-centric organization is a top priority.

“As we move into an era of data, the department needs to move there too,” she told the Senate Armed Services Committee during her confirmation hearing in February.

Spirk, who helped craft the decrees, said they are a critical part of implementing the strategy. He heaped praise on Hicks’ backing of the data efforts, saying she has offered not only her personal support but also that of powerful groups like the Deputy’s Management Action Group to help push initiatives.

“This is a key step in our transition to a data-centric organization and will help empower our warfighters with the data-enabled capabilities they urgently need,” Spirk said in an email to FedScoop.

The audience for the decrees goes beyond just the department, also including the industrial base and allied militaries, which Spirk says will be a critical part of the data ecosystem. The idea is to not have a “common data standard” but to have a federated approach where leaders across organizations can have tailored data sets that still remain interoperable across the department.

“Every leader is a producer, consumer, and steward of data in the missions they perform. For decades, the Department has pursued elusive common data formats and standards to no avail,” Spirk said.

USPS considering 30 edge AI applications to automate mail processing

The U.S. Postal Service is considering about 30 artificial intelligence applications for the Edge Computing Infrastructure Program (ECIP) developed in 2020.

Apps using optical character recognition (OCR) to streamline imaging workflow, automatically checking if packages have the right postage and deciphering damaged barcodes could all launch before summer.

In three weeks, USPS Senior Data Scientist Ryan Simpson and six NVIDIA architects designed deep-learning models capable of analyzing the billions of images generated by processing centers equipped with edge AI servers. Not only can the distributed edge AI system’s seven algorithms process 231 packages a second, but one is even capable of reverse image searching the 100 million packages USPS sees daily.

“Missing packages that used to take eight or 10 people several days can now be tracked down by one person in a couple of hours,” an NVIDIA spokesperson told FedScoop. “ECIP also enables rapid application deployment; new capabilities, that previously would have taken months, can be deployed in as little as two weeks.”

ECIP runs on the NVIDIA EGX platform across 195 USPS sites; most of the necessary hardware was in place by August, and the program has already added a second computer vision app. NVIDIA’s Triton Inference Server functions like an automated digital mail carrier, delivering AI and machine learning models when and how each site needs them.

The forthcoming OCR use case will live as a deep-learning model in an ECIP container managed by Kubernetes and served by Triton, rather than requiring standalone IT infrastructure or a public cloud service.

“The models we have deployed so far help manage the mail and the Postal Service — it helps us maintain our mission,” said Todd Schimmel, manager of letter mail technology at USPS, in a statement.

But various USPS components, ranging from enterprise analytics to finance and marketing, have proposed the roughly 30 ECIP apps currently under consideration.

These AI apps require the kind of real-time computing on large amounts of data that ECIP affords.

“Because edge computing processes data locally, instead of in the cloud or a data center, it minimizes latency and bandwidth needs allowing for real-time feedback and decision-making,” said the spokesperson. “Edge AI solutions also provide greater security protections.”

ECIP and the apps being developed for the system play into USPS’s broader effort to make better use of the data it collects to both improve its efficiency and save taxpayer dollars.

Despite the progress USPS has made, edge AI remains a nascent technology.

“Every day, people in our organization are thinking of new ways to apply machine learning to new facets of robotics, data processing and image handling,” Schimmel said.

Agencies sharing AI use cases after December executive order

The executive order on trustworthy artificial intelligence issued by President Trump in December has encouraged agencies like the Department of Veterans Affairs to share best practices.

The VA’s National AI Institute is working with other VA components, like the Data Governance Council and Veteran Engagement Board, as well as outside agencies to create an AI use case catalog, said Gil Alterovitz, director of AI at VA.

The trustworthy AI executive order set the process in motion by requiring the retirement of AI applications that didn’t meet a set of minimum standards and setting deadlines for inventorying and sharing agency use cases.

“It’s really enabled agencies to learn from each other,” Alterovitz said during the SNG Live: Enhancing AI in Government event presented by FedScoop. “In interacting with other agencies through different councils we’ve been able to learn about and share different AI use cases.”

VA is further looking to pilot a set of modules that can be added to an internal review board on AI, Alterovitz said.

Before researchers build AI models, they’ll go through a voluntary checklist for planning purposes. The checklist builds on the work of VA’s National Center for Ethics in Health Care and the Food and Drug Administration, and it will encourage safeguarding research participants and veterans’ data, as well as vetting training data to eliminate bias.

VA developed an initial AI module to assist its hundreds of medical centers nationally with COVID-19 individual risk prediction by analyzing morbidity and mortality data over time. Explainable AI was leveraged to help patients understand their risk of illness.

New, post-hospitalization data is being fed into those statistical models for additional insights to inform treatment decisions at a dozen rural and urban pilot sites, Alterovitz said.

Space Force wants to become the first true ‘digital service’

The Space Force, the newest branch of the military, wants to take advantage of its institutional youth and become the first “digital service” in the military.

The goal is to have its members — dubbed guardians — be digitally fluent and have their space operations revolve around being “an interconnected, innovative, digitally dominant force,” according to its recently published “Vision for a Digital Service” document.

Chief of Space Operations Gen. John Raymond said it’s a necessity that the Space Force be digital, as conflicts in space will involve operating high-tech satellites, not the physical combat typical in other military services. Force leaders have expressed this notion often in the past, but this is the first time they’ve put it into doctrine.

“Becoming a Digital Service is more than just a generational opportunity; it is a warfighting imperative,” Raymond wrote in the document’s introduction.

The force’s goal is to use digitally native technology and modern data management practices to ensure that the military can keep up with emerging threats in space. The document highlights the importance of initiatives already underway, like training guardians in digital technology, as well as new approaches to data management that will allow the force to share more and leverage new technologies like artificial intelligence.

“We need to leverage information and data to accelerate our ability to develop, field, and operate joint space capabilities with unparalleled speed and ruthless proficiency,” the document states. “We must exploit digital solutions to thrive and adapt within a hostile, complex, and dynamic environment that is inherently more bound to—and driven by—technology than any other defense domain or mission set.”

The force has already taken steps to beef up its data management and security practices. It inked a deal with Palantir to fuse its space data in April 2020 and another in September with Xage Security to protect it. It’s a trend likely to continue.

The service is also staying lean, introducing only three major components within its organization chart for personnel, strategy and operations.

“While some could regard the relative leanness of our Service as an obstacle to success, it may in fact be one of our greatest strengths,” the document states. It’s a point that congressional overseers want to hear, as many have suggested that the Space Force should not take on new bureaucratic bloat as it stands up and integrates with the rest of the force.

Another digital-native workforce feature of the Space Force is allowing guardians to be “digital nomads,” employees who can work from anywhere, enabled by collaboration technology. While it’s unlikely guardians will start buying vans and donning their uniforms in the morning to work on a laptop next to a national park, like the nomads portrayed in cinema, the force believes a culture that uses digital tools to the max will support its vision of becoming a digital force.

“Furthermore, the USSF must support a world in which we are no longer bound to a single physical location. This can give the USSF the flexibility to have Guardians operate virtually as ‘digital nomads,’ seamlessly supporting a variety of missions from a range of locations as part of an intrinsically mobile force,” the document states.

New report: Align open data, open source and cloud policies for maximum value

For the last few years, federal data leadership has been moving in an “open” direction. Federal policies and legislation have directed agencies to make their data more open, to use open source solutions, and to use the cloud to manage and publish data more efficiently.

But the potential of these new initiatives has not yet been realized. We need more resources, leadership, and policy alignment to make government information more available across agencies and to the public; to ensure that government decisions are driven by evidence; and to increase the efficiency and lower the costs of government operations.

A new report from the IBM Center for the Business of Government, in collaboration with the Center for Open Data Enterprise (CODE), analyzes “open” government policies and shows how they can be made more effective. The report describes current government policies, shows how they complement each other, highlights implementation challenges, and recommends improvements. It focuses on four policies: the Foundations for Evidence-Based Policymaking Act (Evidence Act), the Federal Data Strategy, the Federal Cloud Computing Strategy, and the Federal Source Code Policy. For each, it describes the resources, leadership, and alignment needed for effective implementation.

Resources

Funding and other resources are critical to modernize systems, establish data governance, build technical infrastructure, and implement projects like technical upgrades and data sharing and migration. Since agencies differ in their needs and capacities, resources should be allocated strategically so that all agencies can modernize in ways that make the most sense for their particular needs. The report offers specific recommendations for the Federal government in this area.

Leadership

Changing culture at Federal agencies requires strong, consistent leadership over time. Leadership at the Office of Management and Budget, the Chief Information Officer (CIO) and Chief Data Officer (CDO) Councils, and agency CIOs and CDOs should focus on aligning efforts and finding synergies across the domains of open data, open source, and cloud. CIOs and CDOs should play complementary roles in this effort; for example, CIOs can focus on systems and technology while CDOs work on data governance. The report likewise recommends leadership steps for the Federal government.

Policy Alignment

Beyond efforts to align the source code policy with open data policies, Federal policies governing open data, open source, and cloud adoption have generally proceeded on separate tracks. Currently, CIOs are deeply engaged in Evidence Act and Federal Data Strategy implementation — which provides useful leverage to increase data sharing at agencies — but they should also consider how those two imperatives can help advance cloud and open source goals. A unified approach is needed to treat open data, open source software, and cloud adoption as interrelated pieces of a larger puzzle. The report recommends alignment steps the Federal government should take.

High-level Recommendations

In addition to these specific recommendations, the report includes five high-level recommendations for Federal leadership and agencies that cut across all areas of implementation.

As agencies continue on their technology modernization journeys and work to implement laws like the Evidence Act, leaders will have many decisions to make about moving to the cloud, using open source software, and embracing open data. They will discuss these topics virtually and — hopefully soon — in person at events like Think Gov, IBM’s upcoming look at how technology is helping the government adapt, respond, and achieve its mission in a rapidly changing world. By focusing on resources, leadership, and alignment, they can help ensure that their efforts will have the desired result.

Joel Gurin is President and Matt Rumsey is Research and Communications Manager of the Center for Open Data Enterprise (CODE), a nonprofit organization based in Washington, DC. They can be reached at joel@odenterprise.org and matthew@odenterprise.org. The IBM Center for the Business of Government published this report with support from Red Hat.