NGA chief technologist Mark Munsell to retire
Mark Munsell, CTO of the National Geospatial-Intelligence Agency, will retire with 20 years of federal service early next month.
Since February 2019, Munsell has led NGA’s forward-looking technology portfolio and development as the CTO, helping in particular to build out the agency’s geospatial technology ecosystem at its secondary headquarters in St. Louis. He is credited with publishing NGA’s first Technology Strategy.
He will step down from the role Nov. 9.
Munsell has been a big advocate for attracting talented technologists — both inside and outside of government — to work with NGA.
“What does it take to meet future mission with future technology? It requires technologists who understand the mission and how GEOINT is evolving to meet the mission. Driving future mission requires that technologists are leading the GEOINT mission,” Munsell wrote in his technology strategy. “NGA is the world’s leader in geospatial intelligence because of the passion held by our people to deliver timely and relevant information to members of the national security apparatus, policy makers, and international partners.”
During his time at NGA, Munsell also served as deputy director of the agency’s CIO and IT Services Directorate. His tenure began in 1996, when NGA was known as the Defense Mapping Agency. He spent a few years in the early 2000s as a flight and maritime planning contractor before returning to NGA in 2006.
NGA’s CTO role is an evolving one, created in 2018 to “focus on increasing the agency’s ability to instill next-generation technology, tradecraft and innovative approaches to the geospatial intelligence community, enterprise and mission.” The agency said it will announce Munsell’s replacement in the coming months.
NOAA exploring artificial intelligence pilots with Google Cloud
The National Oceanic and Atmospheric Administration plans to improve weather forecasting by more effectively using satellite and environmental data in a series of pilots with Google Cloud.
Google entered into a three-year other transaction authority (OTA) agreement with the National Environmental Satellite, Data, and Information Service (NESDIS) to explore machine learning and artificial intelligence applications not only for weather forecasting but also for environmental monitoring, climate research and technical innovation, it announced Tuesday.
Together NESDIS and Google will develop small-scale ML and AI systems and use results from those pilots to build full-scale prototypes for operationalization across NOAA.
“Strengthening NOAA’s data processing through the use of big data, artificial intelligence, machine learning, and other advanced analytical approaches is critical for maintaining and enhancing the performance of our systems in support of public safety and the economy,” said Neil Jacobs, acting NOAA administrator, in the announcement. “I am excited to utilize new authorities granted to NOAA to pursue cutting-edge technologies that will enhance our mission and better protect lives and property.”
Like most agencies, NOAA has experienced rapid growth in the volume of its datasets, in this case environmental data. ML and AI systems could help better predict extreme weather events like tornadoes and hurricanes.
The number of pilots is to be determined, but they’ll offer NOAA employees hands-on training to improve ML and AI skills. NOAA released an AI strategy in February that emphasized applying the emerging technology to its mission priorities.
Google engineers and data scientists have already explored weather prediction, including hyperlocal precipitation forecasting, flood forecasting in India and Bangladesh, and related computational methods.
Air Force seeks companies that are shooting for the sky, not necessarily the moon
The Air Force is looking to open another pot of money up to companies that are ready to think big, but not too big.
The department wants to shift some of its “deep tech” acquisition money to what its top procurement official is calling “skyshots” — ideas short of a full moonshot. Funding will come through the AFWERX development program.
Will Roper, assistant secretary for acquisition, technology and logistics, says the Air Force is interested in ideas that could be game-changers but take years to mature. In this case, the goal is not to acquire specific tech, he said during an online “Ask Me Anything” event hosted by AFWERX, the Air Force’s tech incubator based out of Austin and Washington, D.C.
“What that means is you have a direct relationship with us,” Roper said, speaking directly to interested companies. “You don’t have to master defense contracting and procurement … you simply have to master your technology.”
AFWERX has not settled on any specific areas for the skyshot funding, Roper said. He posted a poll on Twitter seeking input from the public on what topic to potentially choose. Possibilities include new ways to network sensors in battle, future energy sources and anything related to artificial intelligence. Ultimately, the broader goal is to have companies bring the Air Force ideas it is not even thinking about yet, he said.
“A white paper where you explain your technology is enough to start the relationship with us,” he said.
The program will be similar to other emerging technology programs the Air Force runs, like AFVentures and Agility Prime, its flying-car initiative.
Roper also stressed that the Air Force will not try to own the intellectual property of companies that choose to work with it. One of the goals is to “de-risk” the work of building prototypes that fit the Air Force’s needs, he said.
Roper had jokes, too. He implored companies to “swipe right” on working with the Air Force, saying that he hopes the department’s tech acquisition initiatives are “finally creating a good dating app for us.”
How agencies can modernize their data warehouse infrastructure
Federal agencies are faced with several daunting challenges, exacerbated by the COVID-19 pandemic, that are causing them to take a second look at their IT investments and data strategies.
Among them, revamping data warehouse projects could be a good way to modernize and free up agency budget dollars to reallocate to more pressing needs.
According to a recent Qlik report, a modern approach to data warehouse projects can address several key challenges, including improving cloud interoperability, replacing brittle legacy systems and end-of-life technology, and addressing staffing constraints within the IT department.
“This is a good opportunity for your teams to review technologies and choose a more modern, responsive approach,” says the report, “Data Warehousing Accelerated.”
There is often a reluctance to adopt new techniques within organizations that are invested in traditional data warehouses. However, modern solutions can bring automation to the processes and replace manually coded, time-consuming and error-prone tasks, according to the report.
Today, more than ever, government agencies need to adopt greater agility to respond to mission needs. Data warehouse infrastructure is still critical for some agencies, and it will need to keep up with changing data requirements so agencies can reduce silos of information.
The Qlik Compose solution can bring modern capabilities across the data warehouse lifecycle, according to the report. That includes:
- Faster requirements gathering, warehouse design and creation with the ability to interpret existing logical data models and automatically generate physical schema in the relational database. The interface allows IT teams to adopt any or all of the different data warehouse styles as they design their warehouse.
- Automation of repetitive, manual tasks of creating, maintaining, testing and debugging data warehouse environments and code generation so incidences of typos and programming errors are dramatically reduced.
- Easier migration from development, test and user acceptance environments to production environments. The solution supports managing global variables for each environment, integrates with version control systems and offers both graphical and command-line interfaces to facilitate moving between environments.
- Change management capabilities so teams can adjust data relationships and transformations rapidly with automation.
This article was produced by FedScoop for, and sponsored by, Qlik.
CISA’s Krebs: The US will win 5G race because ‘we own the cloud space’
Many projections place China ahead of the U.S. in the development of 5G wireless networks. But according to Chris Krebs, director of the Cybersecurity and Infrastructure Security Agency, the U.S. actually has the advantage in this global race because of existing cloud-based infrastructure.
“Really, when you think about the advantage of 5G, it’s all about moving data. It’s all about massive communication, which really translates to cloud,” Krebs said Tuesday during a panel at ACT-IAC’s annual ELC conference. “And who does cloud better than the United States of America?”
The development of 5G will be critical for federal agencies — particularly the Department of Defense and the military — as they look to move more IT and computing resources to remote, edge environments. The Pentagon has been experimenting with testbeds on military bases to support different 5G use cases. Elsewhere, federal agencies are beginning to build the technology into next-generation IT contracting vehicles.
Krebs continued: “We really own the cloud space. So we should be able to have that next generation of technology that is going to be just built on top of cloud to get us the diversity in an open, competitive global market of trustworthy, dependable componentry.”
Amid the battle with China for 5G dominance, the U.S. government has banned use of any 5G technology developed by Huawei. And now many other Western nations are following suit, worried that the Chinese government can use Huawei devices to spy on their sensitive networks.
Because the U.S. got an early start putting its foot down against Huawei and other technology firms thought to be linked to the Chinese government, the federal government hasn’t had to suddenly “rip and replace” its networks like many European countries, Krebs said.
Krebs also touted America’s cloud leadership as the reason why the U.S., in both the public and private sectors, has been so resilient during the COVID-19 pandemic. He said cloud providers have long been “preparing us for this moment, whether you knew it or not, because of fiber,” referring to the infrastructure they’ve been developing over the past decade and before.
“When you look at some of our European counterparts, they still have a whole bunch of copper throughout their networks. They were not able to expand as well,” Krebs said. “And most importantly, the agencies and the organizations you saw that were truly cloud-ready, those were the ones that made the more seamless transition into the work-from-home model.”
5G will require Air Force to rethink its networks, chief of staff says
While the Air Force continues to play a role in developing 5G wireless technology for the military and economy in general, it will need to rethink its own network configurations to prepare for the related upgrades in speed and capability, according to the force’s top uniformed officer.
Chief of Staff Gen. Charles Q. Brown said IT offices will “most definitely” need to rework networks to be more enterprise-focused and field the enhanced bandwidth and lower latency that 5G promises to bring.
The technology will enhance base operations and also be a warfighting tool to help the force send data through the network-of-networks system called Joint All Domain Command and Control (JADC2), Brown said Tuesday during a virtual National Defense Industry Association event.
“We have a hodgepodge of a network we are going to actually have to reconfigure and redo to make sure we have the digital backbone” for JADC2, Brown said.
Several Air Force bases currently host 5G “test beds” where private industry is developing network prototypes on bases where regulations are more lax. Beyond just 5G, Brown’s goal is to have networks that can withstand technological change for years to come.
The most critical part of using 5G and other new network capabilities is having a broader enterprise-driven architecture that will allow for the interoperability of systems and data across the force. Brown said he recently met with top IT officials to prioritize building a more cohesive enterprise network that will be ready for 5G and emerging technologies in the future.
“It is going to require us to be a bit more enterprise,” he said.
IT enabling battlefield ops
Brown said that a continued push for an enterprise-IT-as-a-service model for its back-end tech will help grow the force’s future warfighting systems, like JADC2. The digital backbone also will be foundational to artificial intelligence-driven concepts by allowing the force to easily share data across the domains of war: air, land, sea, space and cyber. None of the AI-based or other emerging-technology warfighting systems can flourish without foundational IT supporting their development with interoperability and data sharing, Brown said.
Another key part of the network Brown wants to emphasize is the Air Force’s approach to cybersecurity. Other tech leaders have stressed the importance of zero-trust architecture, something that Brown alluded to but did not directly mention.
“Who are we protecting our network from? Me — because I can’t operate in it — or our adversary?” he said, repeating a joke he says he tells his staff when he gets locked out of the network or slowed down by security protocols.
An aspect of security Brown wants the department to focus on is migration to secure cloud-based systems and storage. While the Air Force is not alone in this push, the department so far has been at the leading edge with its own cloud services and software development platforms, such as Platform One, as well as DevSecOps initiatives.
State Department deploying SD-WAN with $711M EIS contract to MetTel
The State Department plans to deploy a software-defined wide area network (SD-WAN) supported by a $711 million task order awarded to MetTel.
A New York City-based small business, MetTel will provide a fully meshed and managed layer-3 multi-protocol label switching (MPLS) network using a virtual private network (VPN), it announced Tuesday.
The 13-year task order was awarded under the $50 billion Enterprise Infrastructure Solutions (EIS) contract as the State Department looks to modernize its IT and telecommunications infrastructure ahead of the Networx contract expiring March 31, 2023.
“MetTel has a demonstrated track record of successfully transforming legacy networks with a fully-managed service to achieve increased performance and access next-generation capabilities,” said Robert Dapkiewicz, senior vice president and general manager of MetTel Federal.
The State Department’s Telecommunications, Wireless and Data (TWD) Services Division manages equipment and maintenance of unclassified voice and data telecom. A diverse wide-area network will support the division’s work by integrating network connectivity across the department’s locations in the U.S., Virgin Islands, Puerto Rico, Hawaii, and Canada.
Additional foreign sites may be added as the department transforms its network into a contractor-managed, hybrid SD-WAN overlay.
MetTel will also provide data, ethernet transport and Internet Protocol services, which State Department officials can monitor via its EIS portal.
The company has now secured a dozen EIS task orders in 2020 from six Cabinet-level agencies and six smaller agencies and Native American tribes exceeding $1.3 billion. Other agencies MetTel is working with include the Department of Defense, Department of Homeland Security, General Services Administration, and Social Security Administration.
CBP’s supply chain efforts are screaming for AI
U.S. Customs and Border Protection wants to apply artificial intelligence to the ingestion and analysis of increasing amounts of data coming out of its efforts to secure the U.S. supply chain.
CBP needs to analyze data earlier along the supply chain because currently it gets involved too late, after products have been manufactured and have already begun international transit, Vincent Annunziato, director of the agency’s Business Transformation and Innovation Division, said during ACT-IAC’s Reimagine Nation ELC 2020 on Monday. The agency is responsible for ensuring importer and exporter compliance with laws and regulations that prevent harmful or counterfeit products from entering or exiting the U.S.
The volume of supply chain data CBP is dealing with has skyrocketed since the agency started piloting blockchain to secure various industries like steel and oil, and only AI and machine learning can make sense of it all.
“All of this now is starting to play into that AI and machine learning arena because, one, we’re getting data that we’ve never seen before,” Annunziato said. “Two … the government is going to look into designing a system that’s flexible for the data that’s coming in so that, even if you don’t have all the appropriate data at the time that you submit it, you can update it as you go along.”
CBP began piloting blockchain to validate mill certificates, which are documents detailing the chemical breakdown and grade of steel, as well as whether open-market oil is USMCA-certified. More recently the agency announced blockchain pilots that will enhance its visibility into the food, e-commerce shipment and natural gas supply chains while making validation paperless.
AI could help with the categorization of products based on photos or text descriptions while avoiding errors that currently require CBP personnel to delete entries and start from scratch when they’re made, Annunziato said.
Longer-term, smart AI devices could alert CBP to, say, a refrigerated container that has been tampered with.
“They’re starting to come online,” Annunziato said. “But they’re not mature enough yet.”
After months of delays, VA launches modernized Cerner EHR at first medical center
The Department of Veterans Affairs reached a major milestone this weekend, launching its modernized electronic health record system at a VA medical center in Spokane, Washington.
The VA brought the system online at its Mann-Grandstaff Medical Center with dozens of modernized interfaces and new tools for medical workers to use, along with a new IT infrastructure that will eventually support the system’s broad interoperability with the Department of Defense’s health records.
The VA’s EHR modernization program is a 10-year endeavor to update the aging Veterans Health Information System Technology Architecture (VistA) and migrate VA and DOD data to a common Cerner-built cloud system.
“I couldn’t be more proud of the team and the way we continue to meet our objective,” John Windom, the executive director of the program, told FedScoop in an interview. “The stars are aligned to support this mission.”
The program’s go-live comes after two delays in 2020 — the first caused by a lack of training and testing and the second due to the pandemic. Also, officials said the facility was impacted by wildfires in the Pacific Northwest late in the summer and a recent snowstorm that brought power outages.
With those delays came many lessons learned, said Laura Kroupa, chief medical officer for the VA’s office of EHRM. “We have learned the value of contingency planning.”
But, the delays also allowed the VA to increase the number of new interfaces and tools over what was planned for the initial launch date in March and allowed for more time to train medical staff on the new system.
Windom and other program leaders who were at Mann-Grandstaff for the launch reported that users expressed excitement for the new program.
“We are getting positive comments,” Kroupa said. She added there are some “usual jitters” among the staff about using the new technology, but overall, feedback on the system has been positive.
A congressional aide who spoke with FedScoop shared minor concerns about the user interface, saying there are some issues that could “annoy” users, but nothing that risks patient safety. The legacy VistA system has contributed to long patient wait times that deteriorated safety and patient care, inspector general reports found.
John Short, the chief technical officer for the program, said the VA’s large investments to upgrade the underlying IT are supporting the new EHR system well. After years of underinvesting in IT, VA Secretary Robert Wilkie previously told Congress the department needed to rip-and-replace much of its wiring and basic IT infrastructure to support the new system.
Short said that so far, the only endpoint that didn’t work properly was a printer that needed a quick update. “All interfaces worked as designed,” he added.
As the VA celebrates this first go-live, it is turning to the next set of medical centers where the new system will be launched.
First up is a VA center in Walla Walla, Washington. The center is much smaller, but in a much worse technical state, the congressional aide told FedScoop.
“It will take months and months to get ready for the Walla Walla go-live,” the aide said, pinning the VA’s preparation for the next launch at 5% complete.
Windom, however, remains optimistic about the work ahead. Training continues on virtual platforms and feedback from users in Spokane will inform the work being done at other sites.
“Our people are smart, they are going to get this,” Windom said. “We even expect the [next go-lives] to be even more efficient because of what we are doing here at Mann-Grandstaff.”
Agencies starting DevSecOps can access new ATARC code repository
The nonprofit Advanced Technology Academic Research Center plans to help federal agencies start DevSecOps practices with a source code repository announced Monday.
GitLab agreed to provide the ATARC DevOps Working Group access to its platform so teams can collaborate using source code management.
The working group’s DevSecOps Project Team will create a continuous integration, continuous delivery (CI/CD) software pattern — which leverages automation during development, testing and deployment — that agencies can use as they begin DevSecOps.
“The end-state of this code repository will hold one, two or more working code snippets for each CI/CD DevOps pattern,” said William Schwartz, a senior DevOps engineer with the Internal Revenue Service, in the announcement. The code examples will enable agencies to “implement their own instance of the standard CI/CD pipeline template,” he said.
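For illustration only, a CI/CD pattern of the kind the project team describes might look like the following minimal GitLab pipeline definition. The stage and job names here are hypothetical placeholders, not snippets from the ATARC repository; the point is that build, test, security and deployment steps are all automated in one pipeline rather than handled manually.

```yaml
# Hypothetical .gitlab-ci.yml sketch of a CI/CD pattern.
# Each stage runs automatically on every push, so security scanning
# is part of the pipeline rather than tacked on at the end.
stages:
  - build
  - test
  - security-scan
  - deploy

build-job:
  stage: build
  script:
    - echo "Compile or package the application"

unit-tests:
  stage: test
  script:
    - echo "Run automated unit and integration tests"

static-analysis:
  stage: security-scan
  script:
    - echo "Run static analysis and dependency scanning"

deploy-job:
  stage: deploy
  script:
    - echo "Deploy the tested artifact to the target environment"
  environment: production
```

An agency team could copy a template like this into its own repository and swap the `echo` placeholders for its real build, test and scan commands, which is the “implement their own instance” workflow Schwartz describes.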
Experts say most federal agencies remain in a “waterfall” mindset, where security is tacked onto the end of software development, rather than fully integrated in the process. With that in mind, the National Institute of Standards and Technology wants to develop a DevSecOps framework for government.
Areas within the ATARC repository include:
- Stages of the CI/CD pipeline development.
- Managerial processes and theories.
- Technical tools and applications.
The ATARC Software Factories initiative began preliminary work in April 2019 on the CI/CD pipeline included in the repository.
Tools and apps in the repository will include those used by agencies and industry for software development and delivery.
The DevOps Working Group uses an industry-standard branching strategy called GitFlow that allows the team to maintain a “production-worthy” codebase while providing branches for development, testing and debugging work.
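As a rough sketch of how a GitFlow-style strategy works in practice (the repository and branch names below are generic examples, not ATARC’s actual layout): a long-lived integration branch collects ongoing work from short-lived branches, while the default branch stays production-worthy.

```shell
#!/bin/sh
# GitFlow-style branching sketch in a throwaway repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.name demo
git config user.email demo@example.com
git commit -q --allow-empty -m "initial commit"   # default branch stays production-worthy

git branch develop                       # long-lived integration branch
git checkout -q develop
git checkout -qb feature/pipeline-stage  # short-lived branch for one piece of work
git commit -q --allow-empty -m "feature work"

git checkout -q develop                  # merge the finished feature back
git merge -q --no-ff -m "merge feature" feature/pipeline-stage
git branch -q -d feature/pipeline-stage  # delete the short-lived branch after merge

git branch --list                        # develop plus the default branch remain
```

The `--no-ff` merge preserves a visible merge commit on `develop`, which is what lets a team audit when each piece of development, testing or debugging work landed.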