Air Force wants cyber experts to ‘make a living’ off hacking its tech
The Air Force plans to offer more of its systems as fodder for freelance cybersecurity researchers.
Will Roper, assistant secretary of the Air Force for acquisition, technology and logistics, said Thursday he wants to have enough bug bounty programs for civilian hackers to “make a living” off finding flaws in the service’s technology.
The department will be supporting the “Aerospace Village” at the upcoming DEF CON conference — held online this year, instead of in Las Vegas — where satellites will be up for grabs for white-hat hackers.
Last year’s hacking conference gave Roper inspiration on how the military can work differently with hackers, he said during a virtual press conference Thursday. During DEF CON, the Air Force will host a “Hack-a-Sat” event in partnership with the Defense Digital Service, the civilian “SWAT team of nerds” in the Pentagon. The event will offer up to $50,000 for the grand prize, with other smaller prizes, and it will be one of many, Roper said.
“It’s an asset that our nation has that we have not leveraged in a smart way,” he said during a virtual press conference. The military has, however, shown a consistent interest in the concept.
Previously, the Air Force has served up its public-facing websites for “Hack the Air Force” events, giving out more than $100,000 to 30 researchers in 2018. Other “Hack the Pentagon” events also used .mil websites for testing.
In 2019, the Air Force added a Fast Track Authorization to Operate (ATO) process that lets cybersecurity firms do deeper penetration testing on networks. The ATO gave the department the ability to use white-hat hackers on more than just websites and to test systems continuously, rather than only during hackathon events every few months.
Going beyond bug bounty programs, the service also used the new authorities to sign a blanket purchase agreement in February to work with outside firms to penetration-test its IT networks.
Roper wants more than just new contracts and authorities, he said; he wants the Air Force and the new Space Force to be places hackers want to contribute their skills and collaborate on the design process from the start.
He said the force needs to “shift our posture” and “flip the script” on the old ways of thinking that hackers couldn’t be trusted or used for military systems.
“We are trying to first be a valuable member of the community,” Roper said. To do that, the department will be putting “meaningful activity on the table.”
The department has its own in-house cyber office, the Cyber Resiliency Office for Weapons Systems (CROWS), but the service doesn’t have enough officers to cover all the technology being acquired and developed. As the force develops emerging technology, like artificial intelligence and the network-of-networks Joint All Domain Command and Control (JADC2), hackers will be drawn into the design process itself.
“I think there is huge potential for this,” Roper said.
As DCSA surpasses background investigation goal, is Trusted Workforce 2.5 likely?
The Defense Counterintelligence and Security Agency could be ready to take another step in reforming the government’s personnel vetting process, a top national security official said this week, now that the agency has surpassed its target for reducing the backlog in background investigations.
An “aggressive” effort has led to a nearly 17% increase in processing of new security clearances, year-over-year, said Bill Evanina, director of the National Counterintelligence and Security Center. That progress has led DCSA to consider what else it can do to maintain momentum, he said.
DCSA set a goal in December to reach a “steady target state” of 200,000 cases pending — a number that would represent a significant improvement in hiring speed for workers — like many information technology specialists — who need security clearances. The agency is ahead of that goal.
“We are now approaching 180,000 [cases] in the inventory backlog right now, which is unheard of — probably haven’t seen those numbers for a decade,” Evanina said during an Intelligence & National Security Alliance conversation Wednesday.
That number is down from about 725,000 cases in April 2018.
Evanina credited the “almost seamless” transition of the Office of Personnel Management’s National Background Investigations Bureau into DCSA within the Department of Defense for the improvement.
Both DCSA and the Office of the Director of National Intelligence manage continuous evaluation systems (CES) that automate record checks. ODNI’s CES automatically flags records in seven federally required data categories, and the office had enrolled about 300,000 employees and contractors across 26 agencies as of January.
The CES technology is the cornerstone of the Trusted Workforce 2.0 vetting reform effort, the first IT-fueled overhaul of the process.
Trusted Workforce 2.0 aims not only to reduce the background investigation backlog but also to introduce new investigative standards and adjudicative guidelines, expected later this year. A position designation tool and electronic applications are also in the works, while DCSA develops artificial intelligence to further expedite the clearance process. Some officials see a future where the process reaches unheard-of speeds: Security clearances should be issued in three days, said Rep. Will Hurd, R-Texas, at the Dell Technologies Forum in September.
The Trusted Workforce 2.0 executive steering group meets again Friday.
“Now might be the time to enhance 2.0 and say, ‘What would 2.5 and 3.0 look like?’” Evanina said.
DIU seeks one form of automation (ML) that can help another (RPA)
Think of it as machines helping machines: The Defense Innovation Unit wants a machine learning platform that can boost the Pentagon’s existing uses of robotic process automation (RPA) for business tasks.
The goal of the Silicon Valley-based agency’s solicitation is to help nudge Department of Defense RPAs into more complex problem-solving territory by providing pattern recognition and instructions on how to adjust automation to fit changing scenarios.
“The ML platform will identify and suggest corrections to business processes that are not limited to previously well-defined business logic methods,” according to the solicitation.
The DOD has sought to expand its use of RPAs to reduce some of the tedious work many employees are still required to do manually. Current use cases are limited to narrow, “well-defined” tasks, the department says, but it wants machine learning to help automate “less-defined” problems like finding abuse or fraud in financial systems, according to the solicitation. The platform will integrate with current RPA technology and be used for data management and algorithmic training.
Machine learning, a type of artificial intelligence that trains computers to make inferences from large data sets, can help by identifying corrections and fixes for automation that gets stumped on less-defined tasks. The advantage of machine learning is that computers can detect subtle changes that would escape the human eye.
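To make the idea concrete, here is a minimal, hypothetical sketch — not DIU’s actual platform or any vendor’s API — of the kind of “less-defined” check an ML layer could hand back to an RPA bot: flagging payments whose amounts deviate sharply from the historical pattern, using a robust median-based score so a single extreme outlier does not mask itself.

```python
from statistics import median

def flag_anomalies(amounts, threshold=3.5):
    """Return indices of values that look anomalous.

    Uses the modified z-score (deviation from the median, scaled by the
    median absolute deviation), which stays reliable even when the outlier
    itself inflates the spread -- a known weakness of plain z-scores.
    """
    med = median(amounts)
    deviations = [abs(a - med) for a in amounts]
    mad = median(deviations)
    if mad == 0:  # all values identical: nothing to flag
        return []
    return [i for i, d in enumerate(deviations)
            if 0.6745 * d / mad > threshold]

# Illustrative data: routine payments plus one suspicious transaction
payments = [120.0, 118.5, 121.0, 119.75, 120.5, 9800.0]
print(flag_anomalies(payments))  # → [5] (the $9,800 payment)
```

A flagged index would then be routed back to the RPA workflow, or to a human reviewer, rather than processed automatically — the “suggest corrections” loop the solicitation describes.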
The appeal of RPA is simple, DOD officials say.
“We all generally have more work than we have time to do,” Rachael Martin, the Joint Artificial Intelligence Center‘s mission chief for intelligent business automation, augmentation and analytics, said during an April webinar.
Martin said the JAIC is working to help DOD components adopt RPAs. The center is helping coordinate policy and technical solutions for parts of the military to use in its own problem sets, she said in April. Many of the current use cases are being tested in DOD support agencies, she said.
DIU is not looking for a cloud service provider or new RPAs, just a platform that will simplify data flows and use open architecture to leverage machine learning, according to the solicitation.
How agencies can master their data to deliver better services
Deadlines to meet the Federal Data Strategy 2020 Action Plan are approaching quickly. Federal agencies must prioritize actions to establish processes, build capacity and align efforts that leverage their data to better serve the public. The first step of the action plan requires federal leaders to identify the agency’s data needs by September 2020.
While many large agencies started down this path before the release of the mandate, small- to mid-sized agencies are struggling to meet the same requirements with less budget, talent and tooling.

Sherry Bennett, Ph.D., Chief Data Scientist, DLT, and Michael Anderson, Chief Strategist, Public Sector, Informatica
As we talk to agency leaders and chief data officers (CDOs), it’s not that data management technology isn’t readily available. The problem CDOs and program leaders face is putting all the pieces in place to develop a coherent data strategy and having the means to fully assess the state of their data resources.
First steps for holistic data programs
Perhaps the most important action agencies can take right now is integrating their CDOs into the lines of business. This will ensure the CDO understands what data is essential to the organization’s core functions and operational strategies — and can respond more effectively when tackling step one of the action plan: identifying what data the agency has in all its systems.
Putting CDOs at the table with program leaders will afford them a stronger position to implement the federal strategy over the next 10 years. Agencies are required to spell out specific strategies, governance and policies to manage their data more successfully. They also need to add iterative tasks to their processes, including inventorying and cataloguing their data.
This means answering questions like: Where is the information located? Is it in systems that program leaders aren’t aware of, or don’t have access to? Most organizations we talk with face the same challenge — data is replicated across multiple systems, so it is difficult to know which is the authoritative source.
Many CDOs and program leaders may not know where to start. Initiating a very specific, high-value project that supports the agency’s strategic priorities lets them start small and demonstrate the value of data to the rest of the operation. The value of these projects will compound over time and secure more buy-in from leaders across the enterprise.
Imagine how much better leaders can perform when they have solid evidence based on clean data, collected to the extent they need — whether on policy, activities, actions, regulations or operations. And with so many new varieties of data — text, images, video and more — properly managed data can be leveraged to enhance decision-making.
Take the example of an agency considering a digital transformation project on its financial system. If the CDO and program manager collaborate early on the data needed from the financial system, and complementary information can be integrated from other sources, leaders will have a clearer view of operations to realize cost-efficiencies and optimize resources across the agency.
Getting to the bigger picture
Assessing the data is only part of the task. The quickest way to deliver specific, high-value data projects is to help project owners demonstrate how they are meeting these critical core functions with data that can back them up. So, ensuring processes are in place to keep data clean and properly cataloged has never been more important.
A good place to start on an initial data project is the governance structure — making sure rules, workflows and a common lexicon are in place to manage data over its lifecycle. Even as projects begin to deliver improved results, a lot of tuning will still be required to refine guidelines for ETL (extract, transform, load) tools and decisions on how to integrate data from different systems. Governance takes on added importance in defining automation rules, establishing metadata standards and integrating the components of an enterprisewide data management system.
Governance will be a pivotal tool as agencies develop artificial intelligence capabilities. AI needs quality data at massive scale. Without a solid governance framework and data at scale, AI won’t deliver the desired outcomes — or, potentially worse, it will deliver faulty ones.
A fully capable data management system should allow agencies a platform to master their data. With a centralized solution, program leaders will have the ability to manage governance and cataloging rules, maintain data quality and effectively extract, integrate and analyze information across all systems. A master data program also brings data together with a single source of truth, allowing a trusted 360-degree view of a person, place or thing regardless of the number of sources and types of data.
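As an illustration of the “single source of truth” idea, the sketch below shows one simple way duplicate records from different systems might be merged into a golden record. It is purely hypothetical — the field names, dates and merge rule (prefer the freshest non-empty value) are illustrative assumptions, not any vendor’s product behavior.

```python
def build_golden_record(records):
    """Merge duplicate records for one entity into a single golden record.

    `records` is a list of dicts, each with an 'updated' ISO-format date
    plus data fields. Each field is taken from the most recent record that
    actually populated it, so stale or empty values never win.
    """
    golden = {}
    # Walk oldest to newest so fresher non-empty values overwrite older ones
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field != "updated" and value:
                golden[field] = value
    return golden

# Illustrative: the same person appears in two agency systems
hr_system = {"updated": "2020-01-10", "name": "J. Smith",
             "email": "", "office": "HQ"}
grants_system = {"updated": "2020-04-02", "name": "Jane Smith",
                 "email": "jsmith@agency.example"}

record = build_golden_record([hr_system, grants_system])
print(record)  # freshest name and email, plus the office only HR knew
```

Real master data management tools layer matching rules, stewardship workflows and survivorship policies on top of this basic pattern, but the core idea — one trusted record assembled from many partial sources — is the same.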
Being able to master data gives agencies the potential to empower better decision-making across the agency, improve services for citizens and operate more efficiently at scale.
Michael Anderson is the Chief Strategist, Public Sector at Informatica. He has over 27 years of experience in executive leadership and strategy for federal government organizations.
Sherry Bennett is the Chief Data Scientist for DLT. She has more than 25 years of data science and IT experience, working with public sector organizations on emerging innovations, data analytics, AI and machine learning strategic initiatives.
Learn more about data architecture and management to deliver better insights into the mission.
Northern Command calls upon Palantir, Apple and others to bring new tech to coronavirus fight
The military’s Northern Command says it is using new apps developed with Silicon Valley’s help to allow hundreds of personnel deployed at health care facilities across the country to communicate directly with commanding Gen. Terrence O’Shaughnessy.
The technology represents a completely new way for military personnel in the field to provide information directly to the top, O’Shaughnessy says, and the process is putting new command and control ideas into practice. The companies working with NORTHCOM on the project include Palantir and Esri, with help from Apple. Another firm with expertise on highly secure apps — the Northern Virginia-based Monkton — is also contributing.
About 700 U.S. soldiers, airmen and sailors under O’Shaughnessy’s command are responding to the coronavirus pandemic alongside civilian health care workers. Commanders can get twice-daily updates on the physical and mental health of the personnel, O’Shaughnessy said during a webinar in early May.
“I was concerned about how we were going to control them, how we were going to keep in touch with them,” the general said, “so we went back to our technology efforts.”
Two of the apps, Esri’s “Survey123” and Palantir’s “Palantir Mobile,” allow service members to log in and fill out forms on how they are feeling physically and mentally, while the others are for logistics and medical reporting.
The new apps are part of a technology modernization revolution O’Shaughnessy said the pandemic has brought to NORTHCOM and the North American Aerospace Defense Command (NORAD), which O’Shaughnessy also leads. Both commands are also using their own tech developments in the fight against COVID-19.
Technology being developed as part of Joint All Domain Command and Control (JADC2) — the military’s network-of-networks Internet of Things — is being used to oversee the use of the apps. Using the apps in a JADC2 architecture has enabled both “enhanced communications and Common Operating Pictures,” allowing commanders to integrate information into easy-to-access dashboards, according to Northern Command spokesman Bill Lewis. The added advantage of using the JADC2 model instead of typical command and control networks is real-time access to data that can inform decisions much faster than other methods.
“Ultimately, the command tried to field a capability that increases the Commander’s understanding of the fight against COVID, the real-time status of our deployed forces, and our ability to communicate with those forces,” Lewis told FedScoop in an email.
Using JADC2 also helped feed predictive analytics to monitor and project COVID-19 cases. The command used a new platform developed with the help of Google, called “AIsmartONE,” to apply artificial intelligence initially designed for JADC2 to predicting the spread of coronavirus, Lewis said. The platform analyzed data on hospital capacity, reproductive rates and other metrics that directly informed decisions on where the general deployed personnel across the country — decisions he said have helped save lives.
“That’s exactly what we want to do on the homeland defense mission set and frankly that’s what we want to do as a Department of Defense,” he said.
GSA Cloud Information Center gets design, platform upgrades
The General Services Administration says it has addressed design flaws in its Cloud Information Center that limited its ability to keep agencies informed of new cloud solutions.
Using human-centered design, GSA upgraded the platform its cloud team uses to publish information while also making the CIC accessible to people with disabilities, more uniform in presentation and more secure, the agency said Wednesday.
GSA launched the CIC this time last year to be a central repository connecting agencies with cloud service providers, but the focus back then was on consolidating resources and not user experience.
“Modernization is not static, and neither is the CIC,” said Bill Zielinski, the outgoing assistant commissioner of Information Technology Category at GSA, in the announcement. “The government’s one-stop-shop for all things cloud now offers an enhanced digital experience that organizes complex information from a variety of authoritative sources into a format that is more accessible and digestible.”
The CIC includes a market research-as-a-service tool, training documents, and acquisition guidance and templates, now on a cleaner interface with more intuitive navigation. Content is reorganized to correspond with the cloud adoption lifecycle.
GSA’s cloud team intends to update the CIC regularly as federal policies, technologies and best practices change. The CIC reflects the Office of Management and Budget‘s Cloud Smart Strategy for migration that meets agencies’ needs.
CIC joins the Centers of Excellence and Federal Risk and Authorization Management Program in using 18F‘s Federalist platform and the U.S. Web Design System for human-centered design.
“The next iteration of the CIC puts people at the center of technology,” said John Radziszewski, cloud program manager, in a statement.
Investing in the future now: Building digital infrastructure for the next generation
As government agencies at all levels continue to deal with the COVID-19 pandemic and its devastating repercussions, we should remember the warning in a 2013 commentary by technology thought leader Jon Dittmer:
“Organizations that fail to modernize will become unresponsive to customer and constituent needs, and they will ultimately not be competitive in the marketplace. Perhaps most important, the gap between where they are and where they need to be will only widen, which causes a much more painful, expensive and scary future.”
The struggles federal, state and local government agencies have faced over the past few months have borne out Dittmer’s concerns. From collapsing unemployment benefit systems across dozens of states to performance and cybersecurity issues at agencies on the frontlines of the coronavirus response, the ramifications of attempting to serve citizens with outdated information technology are being graphically illustrated in real time. And while the nation and the various states work through the painfully gradual decline of the initial wave, followed by the real possibility of a second wave, we ask ourselves: Will we be ready? What will the future bring?
Today’s war, and lessons from previous battles
In ways both large and small, COVID-19 has affected us on the scale of World War II. It took a massive amount of American sacrifice, money and skill to overcome the incredible adversaries we faced in the 1940s. Coming out of World War II, the United States invested massive resources and effort to modernize and upgrade physical infrastructure. The outcome of that investment was the greatest prosperity the world has seen over the following seven decades.
America has a long history of successful investments: the construction of the Panama Canal from 1904-1914, the GI Bill from 1944-1956, the creation and execution of the Marshall Plan from 1948-1951, and the important work of DARPA from 1958 to today all come to mind. Oh, and our greatest infrastructure investment in American enterprise — the Interstate Highway System — from 1956 to 1992. Underwriting those projects built American commerce, influence and prestige that enabled us as a country to lead the 20th century.
We are now called on to make that level of investment in our digital infrastructure. The Alliance for Digital Innovation (ADI), and many other like-minded organizations, have proposed several key principles to guide the government’s efforts to build our digital infrastructure as part of the COVID-19 response effort. It is imperative that more targeted funds be made available to agency IT modernization to deal with the current scale of the response efforts.
• State and local governments need specific funds dedicated to critical IT modernization and cybersecurity enhancements as they deliver vital benefits to those affected by COVID-19.
• The Technology Modernization Fund desperately needs support at an appropriations level that would allow for meaningful investment in cross-agency IT modernization initiatives.
• Additional funds provided by Congress need to be focused for innovative commercial capabilities, enhanced workforce training, digital service delivery improvements, and robust platform and infrastructure investments that will strengthen cybersecurity and protect citizen data.
Today’s American leadership building tomorrow’s American Dream
Modern digital infrastructure enables and leverages the other innovations we seek to develop and embrace. Without the leading global digital infrastructure, other technologies such as clean energy, bio-medical breakthroughs and advanced manufacturing capabilities will fall short. Unfortunately, this important effort will not come cheap. But not all spending, and debt, is bad. Running up your personal credit card on short-term goods, such as nights out on the town or vacations, is not a wise investment. But taking on a student loan to graduate with a degree is debt that pays back many times over. Modernizing our digital infrastructure is like taking on that student loan.
Investing in a modern digital backbone now serves as a powerful force multiplier for other dollars spent in the future. This is not advocating for spending just to throw money at current problems. Having a solid digital backbone for our nation leverages every other dollar spent, enabling America to support the industries being reborn by COVID-19 and new companies that will rise in the aftermath. To make this vision a reality will require a bold commitment by the government.
Significant investment would result in both a short-term improvement to the immediate technology needs we have during the COVID-19 crisis, and would provide the key building blocks needed for a long-term effort to drive our technological future. So let’s address this challenge head on. Now, more than ever, Congress and the Administration have a chance to truly build the modern, secure digital infrastructure that will enable the “American dream for the next generation” to become a reality.
Aaron Newman is the Executive Chairman and Founder of CloudCheckr. He is also a member of the board for the Alliance for Digital Innovation.
Coast Guard extends mobile connectivity to distressed mariners with ‘i911’
The U.S. Coast Guard is taking advantage of phone-tracking services to locate distressed mariners up to 20 nautical miles offshore through a mobile app that acts like 911 emergency service.
The “i911” app is currently being used in the Pacific Northwest and has the clearance to be implemented at Coast Guard districts across the country. While the guard describes the software as “groundbreaking” in a news release, it behaves like similar technology already used by law enforcement and consumers. Its adoption by the Coast Guard, however, represents a step in a more modern direction for an agency whose legacy IT systems are on the “brink of catastrophic failure,” according to its top officer, Adm. Karl Schultz.
The i911 application allows Coast Guard fleets to access phone location data through a web-based interface for mariners in need of a rescue, according to the release. Boaters don’t need to download anything as long as they can click a link texted to them from rescuers and enable the sharing of their location data.
“While VHF radio remains the most reliable form of distress communication, this tool gives the Coast Guard another avenue to rapidly locate mariners in distress utilizing smart phone technology,” said Lt. Cmdr. Colin Boyle, the command center chief of the 13th district in Washington that is the first to adopt the technology.
In February, Schultz called for a “tech revolution” to increase off-shore connectivity and modernize legacy systems across the service. Much of the guard’s plan revolves around cloud migration and upgrading hardware on cutters, mid-sized ships that make up the bulk of the service’s fleet. These initiatives are designed to improve services and better take advantage of applications like i911.
The tech is a free service developed by Callyo Incorporated, a mobile technology company that caters to the law enforcement market. The application is also used by other law enforcement agencies across the country, according to the company’s website. The application can only access location data during rescue operations with the permission of the distressed mariner.
“In addition, the location sharing feature is only utilized during an active search and rescue case and can be turned off by the mariner at any time,” Boyle said in a news release.
Procurement documents show the Coast Guard had a “Cooperative Research and Development Agreement” with Callyo in 2019. That year, the guard ran pilot programs for the app’s use, which led to several successful rescues, the guard said. While the 13th district in the Pacific Northwest is the first part of the guard to fully implement the technology, the rest of the service has the authority to adopt it.
This type of technology has been prevalent for years and is widely used in many other industries. Phone-based location data powers services like Apple’s “Find My” app, which allows authorized viewing of device locations, and many companies use large-scale aggregation of smartphone location data.
Why open source solutions are playing a more powerful role in enterprise IT
One of the biggest IT challenges government organizations face in their efforts to modernize their systems is how to keep their options open for the future — and avoid the trappings that can come from committing to one primary vendor or technology platform.
Given the shift from big, monolithic IT environments to more agile cloud services, it would seem that agencies have more choices and flexibility today than in years past. Yet all too often, agencies find themselves trading one set of proprietary on-premises technologies for another set of proprietary services in the cloud.

Melissa Di Donato, CEO, SUSE
That’s in part because, while the technologies may have changed, the economic forces have not. Agencies want the best value, as they should, but that often requires them to concentrate IT investments with a handful of contractors and their preferred suppliers. Vendors, meanwhile, live or die by ROI; so the more of a customer’s stack they can “own,” the greater their chances for survival.
It’s no wonder that the resulting technical debt, and its corrosive effect on modernization efforts, are driving CIOs and those they support to seek a better approach.
Open source solutions, of course, have long been a welcome alternative. But now they are an increasingly powerful way to future-proof your IT operations, especially as organizations continue to transition into the new technology era, built on portable services, APIs and agile development. Open source solutions:
- Modernize aging IT systems while enabling cloud and mobility for mission-critical initiatives and time-sensitive decisions.
- Ensure security perimeters are in place when accessing cloud applications from anywhere on any device, at any time.
- Facilitate rapid and continuous updates — keeping your systems safer and more secure.
- Provide the tools to stay compliant with pertinent regulations and guidelines for infrastructure and services across all environments.
But the power of open source solutions, and the vast community behind them, perhaps can best be illustrated by the remarkable speed with which the open source community has helped develop open source projects to tackle issues related to the COVID-19 pandemic.
These projects — from tools to predict hospital enrollment from COVID-19 to tracking systems to develop the disease’s “family history” — remind us that it’s not the technology we have, but how we apply it to solve complex, real world problems.
Having spent my entire career working for large enterprise software companies, I come to the open source world as a user and a consumer of these technologies, giving me a unique perspective into their utility. But now, having met with hundreds of customers around the world as CEO of SUSE, the largest independent open source company in the world, I’ve also seen how open source solutions are helping enterprises achieve their digital transformation goals while positively impacting the world around them.
From a Federal perspective, we’re seeing open source solutions play an expanding role in helping civilian and defense agencies, for instance, develop real-time cyber defense mechanisms, using machine learning and AI algorithms.
We’re also seeing agencies adapting open source solutions for use in high-performance supercomputers, to analyze vast volumes of criminal and terrorist data to gain greater insights and develop responses faster.
Not to mention that open source solutions are paving the way toward a common IT layer that functions across a variety of on-premises and cloud environments.
From our perspective, open source is not only an increasingly powerful way for agencies to modernize their IT systems faster; we also see the importance of helping customers capitalize on open source solutions based on what success means to them. With open source, you are not locked into a vendor; you have the ultimate flexibility to build your IT based on your needs. This is the value of open source, and with its powerful community of innovators — the power of many — open source innovations can help agencies start from where they are and get to where they need to be.
In the end, it all comes down to serving our customers. It shouldn’t be about being beholden to our suppliers.
Melissa Di Donato spent 25 years in top leadership roles at leading enterprise software companies, including SAP, Salesforce and Oracle, before taking the helm in July of last year as CEO at SUSE, now the world’s largest independent open source company.
You can learn more about SUSE Federal’s offerings by downloading this white paper or visiting our website.
Maria Roat officially named deputy Federal CIO
The White House named Maria Roat deputy federal chief information officer at the Office of Management and Budget on Tuesday, a move slated since January.
Several people interviewed for the job before OMB finalized the selection.
Roat takes over for Margie Graves, who retired from government at the start of the year.
“We’re thrilled to have Maria join OFCIO and the OMB team,” said Federal CIO Suzette Kent in the announcement. “The experience and leadership Maria brings to the role of deputy federal chief information officer will be an asset to efforts to shape a secure, modern, and data driven government.”
Roat had served as CIO of the Small Business Administration since October 2016 and took on broader leadership in interagency federal IT functions like the CIO Council, where she co-chairs the Innovation Committee. She also sits on the board of the Technology Modernization Fund.
Roat comes to the White House with first-hand experience of the pressures the coronavirus pandemic has put on federal IT. SBA’s E-Tran loan system timed out multiple times during the second round of Paycheck Protection Program (PPP) applications in April. Earlier that month, personally identifiable information from about 8,000 Economic Injury Disaster Loan applicants was potentially exposed, a fact SBA made public.
Roat’s successor at SBA is Guy Cavallo, who has been deputy CIO since late 2016.