Former DHS secretaries to propose major reforms streamlining resources for cybersecurity, coronavirus threats

A bipartisan group including former Department of Homeland Security secretaries plans to recommend major reforms in July to improve the department’s response to emerging threats, including cyberthreats.

When President Trump ordered incoming flights from Europe to be screened for COVID-19 starting March 14, DHS couldn’t access the doctors, supplies or facilities needed to do that efficiently, Tom Warrick, the department’s first deputy assistant secretary for counterterrorism policy, told FedScoop.

The result was seven-hour delays at major airports, even though DHS “knows precisely how many people are coming into the country each day through advanced passenger information data technology,” Warrick said.

Now a senior fellow at the Atlantic Council‘s Scowcroft Center for Strategy and Security, Warrick will co-lead six study groups aimed, in part, at aligning resources with federal policies DHS must enforce.

Under the banner of the Future of DHS Project, national security experts will tackle not only the coronavirus pandemic but “threats to democracy” — cyber-related issues ranging from election security to social media disinformation campaigns to the sabotage of critical infrastructure — intended to divide Americans.

“In the tech area, we’ve seen some great work being done by the Cybersecurity and Infrastructure Security Agency, but it’s equally clear that a lot of what [Director] Chris Krebs is doing there isn’t getting anywhere near the attention or the support that it needs,” Warrick said. “And one of the things that we’re going to look at is the fact that DHS has added missions over the years, but the level of resources given to the department has not kept up.”

Ransomware plaguing cities like Atlanta, which sows chaos while hackers in Iran raise money, is another threat that needs greater attention, Warrick said.

Increased support for DHS must further trickle down to states, localities and individual businesses in the form of intelligence currently deemed too sensitive to share, he added.

The Trump administration issued a “very good” 5G wireless strategy, but there’s no money in the pipeline to implement it, Warrick said.

“A part of this is the fractured nature of the way congressional oversight works,” Warrick said. “There are more than 90 committees and subcommittees in Congress that have some role and responsibility for overseeing parts of DHS.”

At least 11 major think tanks have recommended streamlining the process, but none of them boasted a senior advisory board like the Future of DHS Project’s, which includes former Homeland Security secretaries Michael Chertoff, Jeh Johnson and Janet Napolitano.

Delivering recommendations by the end of July ensures both political parties’ presidential transition teams and congressional leadership will have time to consider the reforms before the 2020 election.

Former White House officials, congressional staffers, industry stakeholders and think tank members will meet as part of six study groups beginning in a couple of weeks and running through mid-June, before a final report is issued in July. While the groups may eventually meet in person, for now, their meetings will be held virtually due to coronavirus concerns.

The first group will focus on what DHS’s mission should be and adjusting agency responsibilities where there’s overlap. Events at the southwest border in 2019 highlighted the problems that can arise when the public fails to support DHS’s mission, Warrick said.

“This is particularly true because we’ve started to see these organized efforts by foreign adversaries to target American democracy through how we handle elections, through social media, through disinformation campaigns,” he said. “And there needs to be somebody in the federal government that everyone looks to to say, ‘Defending us from this kind of threat is your job.'”

The second group will consider how DHS handles public-private partnerships, a model that hasn’t kept up with advances in technology over the past decade, Warrick said.

Group No. 3 will address how DHS aligns policymaking, a top-down interagency process led by the White House, with resources that are generally allocated from the bottom up, from components to headquarters.

The fourth group will explore how DHS strengthens the capabilities of foreign partners, currently a “dual-key system” between that department and the State Department that can create confusion, Warrick said.

DHS employees strongly support their department’s mission but have the lowest morale of the 17 large Cabinet departments, according to the 2019 Federal Employee Viewpoint Survey. For that reason, the fifth group will handle workforce issues and employee morale.

The sixth and final group will meet on streamlining legislative oversight.

Scowcroft launched the project in partnership with tech company SAIC and consulting firm Accenture.

“DHS sits at the center of national events, whether it is a pandemic, a natural disaster, or an attack on critical infrastructure. They are indispensable in preventing, preparing for, and responding to these threats,” said Amy Rall, vice president of homeland and justice programs at SAIC, in the announcement. “It is essential that they have the right policies, technologies, and key talent — government and industry partners — to succeed.”

The Future of DHS Project draws its name from the Future of Iraq project Warrick headed up while at the State Department in 2002, which grew into a $5 million, two-year postwar planning effort.

Not since 2004, when the Center for Strategic and International Studies and the Heritage Foundation evaluated DHS, has a think tank seriously done so. One of the results was DHS’s creation of the Office of Policy.

“Since then, there really haven’t been any serious efforts to try to understand the department,” Warrick said.

Federal R&D spending on AI should be doubled, then doubled again, commission says

Congress should double research and development spending on artificial intelligence in fiscal 2021 and then double it once again the following year, according to new recommendations from an independent commission on America’s development of AI.

The National Security Commission on Artificial Intelligence issued a quarterly report Wednesday with 43 recommendations to the legislative and executive branches that are “in the most need of immediate attention, ripe for action, or foundational to AI and national security issues.”

The commission was mandated by the 2018 National Defense Authorization Act to “consider the methods and means necessary to advance the development of artificial intelligence, machine learning, and associated technologies to comprehensively address the national security and defense needs of the United States.” It issued its first interim report to Congress last November.

At the top of that list, the commission calls for an immediate doubling of funding for non-defense R&D around AI starting next fiscal year. Though many agencies in that space don’t directly serve national security missions, the commission argues that it’s a national security risk for the nation to fall behind in its development of AI capabilities, even in non-defense areas.

“We believe that we are probably in the lead in global R&D. But we’re being pressed very hard by a number of competitors. And we need to increase our spending,” Bob Work, former deputy secretary of Defense and the vice chairman of the NSCAI, told reporters Wednesday. “You will see in our recommendations that we are focused primarily on non-defense R&D in this first quarter. And we will turn our attention to defense R&D in the coming quarters.”

The recommendation focuses on increasing funding levels at a specific set of agencies: the National Science Foundation, the Department of Energy, the National Institute of Standards and Technology (NIST), the National Institutes of Health, and NASA.

Because the funding in fiscal 2021 is already “rather set,” achieving that is probably “going to require some reprogramming,” Work said. Despite that, the recommendation says: “This funding should increase agency topline levels, not repurpose funds from within existing agency budgets, and be used by agencies to fund new research and initiatives, not to support re-labeled existing efforts.”

The recommendation is more ambitious than the White House’s goal of doubling AI R&D by 2022. Instead, the commission calls for a doubling in 2021, and then doubling that number again the following year “until we have the right level of spending to maintain our lead in R&D,” Work said.

Eric Schmidt, former Google CEO and the chairman of the commission, said the importance of this increased funding is more apparent than ever as the nation struggles to fend off the coronavirus pandemic.

“I think our recommendation of doubling AI and doubling it again is probably even more important” now, Schmidt said of how the commission is thinking of the pandemic. “And the reason is that virtually all of the interesting medical approaches that I’m familiar with, are using AI techniques to look at essentially the targets of the virus. So I think you know, the way the biology works is that they run large numbers of assays and we try different things. And these new algorithms, they’ll give them a better way of targeting where those assays go.”

Schmidt said this report doesn’t cover the pandemic, but such an increase would have an “indirect benefit” on fighting COVID-19. The commission will turn its focus to the virus over the next few months for its next quarterly report, he said.

In addition to boosting non-defense AI spending, the commission’s new report calls for having the DOD’s Joint Artificial Intelligence Center report directly to the secretary, expanding the cyber excepted service, creating mandatory AI training at some agencies and enhancing the technological infrastructure across the federal government and the nation.

On the release of the new report, Schmidt said: “We wanted to time this so that the congressional staff and the people who care about this would have our recommendations to base their thinking on. These are just recommendations. We don’t have the force of law, but we have a lot of reasons to think that people will follow them or at least come very close to our recommendations.”

NIST will help create CMMC standards for third-party assessors

The National Institute of Standards and Technology will play a “core” role in setting standards for third-party assessors to participate in the Defense Department‘s new Cybersecurity Maturity Model Certification (CMMC).

While NIST will be critical in creating those standards that organizations will have to meet to become a third-party assessor for the cybersecurity certification process, the governing CMMC accreditation board will have the ability to “modify” them, Katie Arrington, the CISO for DOD acquisition and sustainment and the top CMMC official, said Wednesday.

The board will oversee the greater training and credentialing of the third-party assessors. Once certified, those organizations will be charged with testing defense contractors to ensure they meet one of CMMC’s five levels of standards.

Having training, assessing and credentialing all housed under one board has triggered questions about a potential conflict of interest. But NIST’s role will help mitigate conflicts, and “the executive board has very stringent ethical rules,” Arrington said during an FCW webinar.

The DOD doesn’t want to create a “self-licking ice cream cone,” she added. Any member of the CMMC accreditation board who works on standards will not be able to participate in the training or certifying of the assessors, Arrington said.

NIST did not return a request for comment before publication.

The board is made up of several committees — including separate training and credentialing committees — and many working groups focused on specific parts of CMMC implementation. Several new working groups were recently opened for defense industry members to enroll in, according to the board.

Arrington said the first cohort of assessors to be trained by the board is expected to include roughly 25 to 30 companies. Training will start by the end of April or early May and take place online due to social-distancing requirements during the coronavirus pandemic, she added.

After the first cohort, the CMMC board will look to scale up training to ensure there will be enough certified assessors to inspect the more than 300,000 contractors that make up the defense industrial base.

CMMC guidelines will start appearing in contract requests for information this summer with requirements being implemented in contract bids by October, Arrington said.

The scheduled rollout of CMMC hasn’t changed, despite the larger disruption across the DOD supply chain caused by the COVID-19 response. “We have to have continuity of care, the mission is important,” Arrington said of keeping on track.

Federal agencies collaborate on developing 3D-printed masks

Three federal agencies have banded together to develop 3D printing models for masks as hospital systems run low on personal protective equipment during the coronavirus pandemic.

The Food and Drug Administration, National Institutes of Health and Department of Veterans Affairs signed a memorandum of understanding to share lessons learned, data and technical information on producing masks for health care workers with 3D printers. The federal effort also coincides with a public-private partnership with the nonprofit America Makes, through which the agencies can connect with medical facilities that are in need of the masks.

Broadly, the goal is to “jointly develop a response to COVID-19 that will ensure veterans and civilians have access to the most innovative medical solutions and technologies to support their care,” according to the MOU.

As part of the partnership, the three agencies are connecting hospitals with 3D printing manufacturers and helping to develop models that medical facilities can 3D print.

Representing the VA in the agreement is the Veterans Health Administration’s Innovation Ecosystem, a group of innovators who work on emerging technology in the VHA. The ecosystem has previously worked with the VA to develop 3D printing technologies for telehealth. The VA will create a website for 3D printing experts to participate and provide engineering support.

The FDA will also provide engineering support and maintain an email address to answer questions the public may have about the masks. NIH will leverage its health expertise in the process, providing infectious disease experts to inspect the protective quality of materials used in the additive manufacturing processes.

Civilian agencies are not the only ones chipping in to fill the shortage of medical supplies. Universities are working to make face shields, and stories of airmen prototyping 3D-printed masks have circulated on LinkedIn.

GSA’s first CDO, Kris Rowley, has left for the private sector

The General Services Administration’s first chief data officer, Kris Rowley, has left the agency for the private sector and been replaced by Deputy Chief Information Officer Beth Killoran in the interim.

Killoran has led the CDO and chief technology officer teams since Rowley’s departure on March 13 to become CDO of the Conference of State Bank Supervisors.

Rowley headed GSA‘s data management and analytics program, as well as the creation of the agency’s D2D platform that collects and analyzes data to inform decision-making.

“Kris has a passion to understand the data culture and took the lead in cultivating and discussing ways to improve information management and data access across the agency,” wrote David Shive, GSA CIO, in an internal note. “He also established and matured GSA’s data governance structure and developed a data science talent management strategy.”

Rowley had been with GSA since 2013.

In her time as deputy CIO, Killoran has focused on piloting new technologies like robotic process automation, machine learning and network enhancements.

“I have no doubt that we will continue to improve innovation and customer satisfaction for employees across the organization,” Shive wrote.

Audit finds SBA’s information security program ‘not effective’ despite cyber improvements

The Small Business Administration’s information security program remains “not effective,” despite improvement in cybersecurity oversight of incident response, risk management and contingency planning.

An SBA Office of Inspector General audit found persistent weaknesses when independent accounting firm KPMG tested 10 of the agency’s systems against Federal Information Security Modernization Act requirements.

Of the eight areas evaluated, SBA only achieved a “managed and measurable” level — denoting effective security — in incident response. The agency reached a “consistently implemented” level in three other areas: risk management, data protection and privacy, and contingency planning.

Configuration management, identity and access management, security training, and information security continuous monitoring were found at a “defined” level.

“To continue to improve its FISMA effectiveness, SBA needs to proactively update and implement security operating procedures and address the new vulnerabilities identified in this report,” reads the OIG audit released Monday.

SBA agreed to meet 11 OIG recommendations across three areas: risk management, configuration management, and identity and access management.

“We are encouraged that the inspector general ‘observed improvement in cybersecurity oversight’ in our continued delivery of resilient and cost-effective enterprise security services throughout the organization,” wrote Maria Roat, chief information officer of SBA, in a March 12 response to the OIG’s recommendations. “The Office of the Chief Information Officer will diligently pursue robust and adaptive cybersecurity visibility, defense, detection, and response capabilities across the enterprise.”

Lessons learned on building a common operating picture across networks

For years, government agencies have poured a great deal of money and resources into finding an efficient way to view activity across their networks.

I saw that first-hand during my tenure in government and my work over the years helping to shape national cybersecurity policy. Despite efforts to readily identify cyber risks, achieving an effective common operating picture continued to prove difficult at many agencies, and having a consolidated view of vulnerabilities and threat activity moving within and between networks was a major operational challenge. When I retired from government at the end of 2018, our Security Operations Centers still tended to coordinate cyber incident response and tracking of malicious activity through conference calls rather than synchronizing awareness and action in an automated fashion. This is a serious impediment when you are trying to counter criminal and nation-state actors moving at machine speed!

Jim Richberg, Chief Information Security Officer, Fortinet Federal

I recognized that part of the problem was due to the complexity and age of agency systems, as well as the challenges of the federal budget cycle, but I also felt that the cybersecurity technology to enable integrated situational awareness and automated response simply must not yet exist. My view has changed, however, since leaving government and discovering that, in fact, the private sector had already come up with a solution — and is deploying it commercially.

From an agency’s point of view, I understand that IT leaders have to balance the risks of security threats with the day-to-day demands of keeping networks operating despite limited financial resources and staff. That balance has gotten shakier, however, as agencies expand into the cloud, begin to embrace wireless/Internet of Things technology and adopt software-defined networks that lack perimeters. In short, the attack surface of government networks will continue to grow exponentially.

There are several factors that have made it harder to attain a centralized view of an agency’s network environment. First is the growing volume of devices and applications accessing government networks every day. Another is the sheer number of cybersecurity solutions and tools agencies have acquired over the years that don’t communicate well with each other. We typically have followed the approach in cybersecurity of seeing a problem, building (or in the case of government, buying) a solution and deploying it.

The result is that after a number of years, the typical large organization may have 50 or more security products or services, each addressing a separate problem. Data sharing between these tools is rudimentary, and “connecting the dots” of threat activity to create the “big picture” of cybersecurity health too often falls on the overworked security analyst or network administrator. Finally, restrictions on sharing sensitive data across departmental IT systems – as well as cultural silos – continue to hinder information sharing, even on basic issues such as the existence of unpatched vulnerabilities.

In light of this situation, it’s easy to see why agency leaders might conclude that establishing a common operating picture and an integrated defense of federal networks will always remain an elusive goal.

Industry’s unified platform approach and its benefits

On the contrary, the private sector has largely figured out how to achieve a common operating picture — and how to counter threats in real time at the point of attack while pre-emptively inoculating other potential victims.

This approach relies on a platform of devices — both physical and virtual — that can instrument the perimeter and the core of a network, along with the access points for wireless and IoT devices and hybrid cloud operating environments. Each of the major security technology providers has a different name for its platform; Fortinet’s is called a fabric approach. These platforms vary in maturity and in their ability to integrate with products from other vendors, but collectively they signal an impending revolution in cybersecurity capabilities.

We often say one of the reasons cybersecurity remains hard is that the attack or vulnerability surface is growing exponentially, along with the variety and sophistication of threats. However, if we instrument this growing surface and collect key data, we gain the ability to discern potentially harmful or clearly malicious activity from abnormal but benign network traffic.

The key to doing this is big data analytics, in particular artificial intelligence and machine learning (AI/ML), which enables both discovery of and response to malicious activity in real-time and in an automated fashion. But what’s also needed is the ability to implement security improvements incrementally as IT departments upgrade and refresh their technology rather than initiating a wholesale rip-and-replace overhaul before they can achieve any significant improvement in security.

The effectiveness of the fabric approach was demonstrated in August 2019, when NSS Labs — the cybersecurity industry’s leading independent testing organization — conducted a breach prevention system test to assess the effectiveness of the unified platform technology approach. Using both real-world threat data and advanced threat models, NSS’s tests showed that, while the fabric solutions varied in effectiveness and total cost of ownership, as a class, they were both markedly more effective and more affordable than non-integrated point solutions.

AI/ML as potential game changers for cybersecurity

Artificial intelligence and machine learning technology are the key to making a unified platform work. Absent mature big data analytic capabilities, if you instrument the breadth of your IT operating surface, you will drown in data.  But if you can instrument your network and if the devices you use are capable of taking action — as well as generating reports — you can potentially turn the tables on a would-be intruder.

The reality is that malicious cyber-activity is seldom invisible; when we audit the firewall logs during a breach investigation, we can usually see when and how the intruder penetrated the network — and the failed attempts that preceded their eventual success. Network defenders get countless alerts every day, many of which are false alarms. Even when the security operations team recognizes there is a real threat, they may not know how to respond or at least how to respond as quickly as the attacker is moving to capitalize on their success. If automation of threat detection and response can be driven by AI and ML, we can take these advantages of relative secrecy and speed away from an attacker.
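To make that concrete, here is a minimal, vendor-neutral sketch of the detect-and-respond loop described above. Everything in it is invented for illustration (the host addresses, the attempt counts, and the `flag_anomalies` and `auto_respond` helpers), and a production system would consume streaming telemetry rather than a static dictionary:

```python
from statistics import median

# Toy firewall-log summary: connection attempts per source host.
# (Invented data; a real SOC would stream these from log collectors.)
attempts = {
    "10.0.0.4": 3,
    "10.0.0.7": 5,
    "10.0.0.9": 4,
    "203.0.113.50": 180,  # anomalous burst
}

def flag_anomalies(counts, threshold=3.5):
    """Flag hosts whose attempt count is an outlier by modified z-score
    (median absolute deviation), which stays robust on small samples."""
    values = list(counts.values())
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:
        return set()
    return {host for host, n in counts.items()
            if 0.6745 * abs(n - med) / mad > threshold}

def auto_respond(counts, blocklist):
    """Automated response: add flagged hosts to a blocklist instead of
    waiting for an analyst on a conference call."""
    flagged = flag_anomalies(counts)
    blocklist |= flagged
    return flagged

blocklist = set()
print(auto_respond(attempts, blocklist))  # flags only the bursty host
```

The point is not the statistics but the shape of the loop: detection feeds response directly, with no human hand-off in between, which is what taking the speed advantage away from an attacker requires.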

AI and ML take time to implement successfully; time spent building and training the models and curating the data. I’ve learned that if you try to rush into implementation of an AI/ML solution too quickly, odds are that your project will fail. Fortinet, for instance, has been using AI/ML technology in threat analytics for nearly 10 years and is on its sixth generation of AI/ML used in its global Security Operations Center and Threat Intelligence Unit (FortiGuard Labs).

Recently Fortinet made the functionality of this cloud-based capability available in software that can be deployed by customers in air-gapped or stand-alone networks. This gives government users who do not want to share threat data externally the ability to deploy a neural network-based threat-discovery and mitigation tool that is both effective out of the box and gets smarter as it learns in the local operating environment.

Commercial AI and ML solutions can also provide organizations another benefit: serving as a force multiplier to support workforce needs. Globally, various reports point to a 3 million-person shortage in cybersecurity personnel. Using automation to tackle the basics of cyberthreat analysis (tier 1 problems) lets IT staff focus on the tasks that require human judgment.

Take compliance reporting, for example. I don’t know anybody who finds generating compliance reports satisfying — it’s just necessary tedium. Automating this work increases efficiency and has a positive effect on employee morale by freeing people to use their skills and training on hard problems rather than rote reporting.
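As a hedged illustration of what that automation can look like, the sketch below checks observed system settings against a required baseline and emits a pass/fail line per control. The control IDs loosely echo NIST 800-53 naming, but the mappings and settings here are invented for the example, not drawn from any real compliance baseline:

```python
# Hypothetical baseline of required security settings (invented mappings).
REQUIRED = {
    "AC-2": {"mfa_enabled": True},
    "SC-8": {"tls_min_version": "1.2"},
    "SI-2": {"auto_patching": True},
}

def compliance_report(observed):
    """Compare observed system settings against the baseline and emit
    a pass/fail line per control -- the rote part an analyst would
    otherwise compile by hand."""
    lines = []
    for control, expected in sorted(REQUIRED.items()):
        actual = {key: observed.get(key) for key in expected}
        status = "PASS" if actual == expected else "FAIL"
        lines.append(f"{control}: {status} (expected {expected}, got {actual})")
    return "\n".join(lines)

# Example: one setting drifts out of compliance and is caught automatically.
observed = {"mfa_enabled": True, "tls_min_version": "1.0", "auto_patching": True}
print(compliance_report(observed))
```

A scheduled job running a check like this turns a quarterly paperwork exercise into a continuously refreshed artifact, which is where the morale benefit described above comes from.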

Cyber risk and the need to act

But automating tasks goes only so far. It’s more important than ever to establish a reliable, common operating picture of threats and agency vulnerabilities in order to keep government resources secure.

Government agencies — and the public sector at large — were subjected to more security incidents and more breaches than any other sector last year, according to the Verizon Data Breach Investigations Report. A common operating picture helps IT departments attain wider and more granular network visibility. That in turn makes it easier to eliminate the easily exploitable vulnerabilities that criminal actors routinely look for.

State of the art platforms such as Fortinet’s Fabric provide visibility both onsite and across multi-cloud environments. Fortinet’s technology, for instance, can monitor encrypted traffic moving through firewalls without compromising performance. This matters since the bulk of traffic on a network is now encrypted, and being able to read it without adding significant latency is important. This technology can also quickly distinguish between an external threat and an internal user mistake (such as misconfiguration of a web application), and can respond to both.

And because Fortinet’s platform is built on open rather than proprietary technologies, agencies can remain more flexible. Part of Fortinet’s success with enterprises worldwide is based on working in an ecosystem of established partnerships with other technology providers, so customers can build a security approach that couples their existing infrastructure investment with the latest capabilities available in the areas of cybersecurity they are upgrading. And because new technology solutions often encompass functions performed by multiple older devices, total cost of ownership drops while performance accelerates.

Cybercriminals certainly aren’t slowing down their attacks. But the sooner federal agencies establish an accurate common operating picture of their networks, coupled with an automated capability to respond to threat activity in real time, the sooner they can start to tilt the security odds back in their favor. The products and services to do so are already being deployed in the private sector, and the government networks and services that support the American people should leverage these same cybersecurity capabilities.

Learn more about how Fortinet’s Security Fabric platform can provide true integration and automation across your agency’s infrastructure.

Jim Richberg is chief information security officer at Fortinet Federal. Richberg formerly served as the National Intelligence Manager for Cyber in the Office of the Director of National Intelligence, where he set national cyber intelligence priorities. Before that, he monitored and coordinated implementation of the whole-of-government Comprehensive National Cybersecurity Initiative for Presidents George W. Bush and Barack Obama.

Army wants help with virtual critical-care services in COVID-19 field hospitals

The Army is looking for help in creating “virtual critical care wards” in the new field hospitals that the military and aid groups are building around the country in response to the novel coronavirus pandemic.

The goal is to scale up existing critical-care telemedicine technology in hospitals so it can be linked to the field hospitals, according to a presolicitation released Monday. The new National Emergency Telecritical Care Network (NETCCN) will be “a cloud-based, low-resource, stand-alone health information management system,” according to the Army.

The Army will have between $30 million and $37 million to offer up to six awardees, according to the document.

Most of the technology is already available, but the Army wants to be able to rapidly scale it to help coronavirus patients “wherever there is need,” according to the notice, which was issued by the Medical Technology Enterprise Consortium (MTEC), an organization that fields emerging medical technology for the Army.

“These high acuity, virtual wards would bring high-quality critical care capability to nearly every bedside, be it healthcare facility, field hospital, or gymnasium,” in cities and rural medical facilities, the announcement states.

MTEC is using an “enhanced White Paper” method to quickly screen ideas and get funding to workable solutions as fast as possible.

“Enhanced white papers should specifically address providing EXISTING technologies available for other use cases that can be rapidly adapted to establishing a National Emergency Tele Critical Care Network (NETCCN),” the Army says.

Areas of interest include mobile communications networks, clinician mobile web portals, real-time data collection and cloud-based information storage.

Marine Corps’ new education strategy focuses on tech-driven ‘continuous’ learning

The Marine Corps wants to use technology to support “continuous learning” of its force, according to a new learning doctrine document released Monday.

The doctrine focuses on the broad principles of learning that will guide the education of Marines as the U.S. military faces new technological challenges in an era of great power competition.

The goal of “continuous learning” is to leverage technology to bring knowledge to Marines dispersed across the globe, whether on ships in the middle of the ocean or in classrooms across the nation and the world.

The doctrine is the first major update to the Marines’ learning philosophy in more than 20 years, the corps’ top military official, Gen. David Berger, wrote.

“Projected future challenges for the Marine Corps include the potential for adversaries to achieve technological equivalence or superiority with the United States,” the document states. “That possibility, coupled with Marines’ expeditionary nature, means that the Marine Corps must be a more lethal, thinking force that fosters continuous personal and organizational learning based upon enduring principles.”

The Marine Corps’ education doctrine comes after the release of a new education strategy from the Department of the Navy, which houses the Marine Corps. The Navy’s strategy will in part implement the corps’ doctrine by expanding education opportunities through technology like online courses and by standing up the Navy Community College. The Navy previously said that its first cohort of community college graduates will focus on IT and cyber skills. The community college itself will not be a physical school but a central place to accumulate college credit from online and other courses taken at participating schools.

“Marines leverage the art and science of learning, technologies, and learning environments that reflect the changing operational environment to tailor learning and provide each other with constructive feedback,” the Marines’ document states.

Understanding and using — but not overly relying on — technology is critical to the Marines’ new educational focus.

“Marines must continuously improve their knowledge and skills by leveraging technology—but never depend upon technology alone as the solution,” the document states. The document also cautions leaders to teach about technology that will be useful on the battlefield. Network failures are a reality of war that Marines need to be ready for, it says. The ultimate goal is to develop highly skilled Marines who have the cognitive prowess to face any challenge in war, according to the document.

“Marines cannot always rely on technologies or on being able to digitally search for information during combat due to many reasons, such as time constraints, lack of network access, or the need to minimize electronic signatures,” the document states.

A test case for tech training

Two tech officers offered their idea for how to bring more tech wizardry into the Marine Corps through education. In an article for the Marine Corps Gazette, Master Gunnery Sgt. Samuel Carter and Master Sgt. Conor Mahoney outlined a strategy to “gamify” tech training in the corps.

Carter and Mahoney posit that continuous learning needs to shift away from a “more reps” mentality for skills like coding, cybersecurity and emerging technology development to being more incremental and iterative.

“[A]n alternate solution for code and mathematical related knowledge, skills, and attitudes may be a continuous micro-learning model that ‘gamifies’ training through incremental rewards,” Carter and Mahoney wrote.

To gamify and offer short-term rewards for short-term success, the authors suggest re-thinking the traditional reward system. Currently, badges are awarded once a Marine has achieved success in a skill like marksmanship.

“[W]e propose incentivizing our information warriors to innovate and pursue new and more advanced skills through a more modern approach to positive reinforcement,” the Marines wrote.

White House asks hospitals for daily coronavirus testing data to help track spread

The Trump administration asked hospitals in a letter Sunday to begin reporting daily COVID-19 testing data to the Department of Health and Human Services, as the government looks to improve its surveillance of the coronavirus.

The approximately 4,700 hospitals nationally were also asked to report bed capacity and supply to the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network’s COVID-19 patient impact and hospital capacity module to help identify what needs exist.

The letter was sent by the Centers for Medicare & Medicaid Services, which oversees Medicare-participating health care providers like hospitals, on behalf of Vice President Mike Pence, who heads the White House Coronavirus Task Force.

“This data will help us better understand disease patterns and develop policies for prevention and control of health problems related to COVID-19,” Pence wrote. “All data provided by hospitals will be maintained in accordance with the Privacy Act of 1974, as amended, 5 U.S.C. § 552a.”

Hospitals are expected to report the data without including personally identifiable information on patients.

Public health laboratories and private lab companies already provide their data to the task force. But the CDC and the Federal Emergency Management Agency also need data from academic and hospital “in-house” labs to assist states and localities with additional resources; some states have already requested that data themselves.

The task force supplied hospitals with a spreadsheet to be completed no later than 5 p.m. EDT daily.

“America’s hospitals are demonstrating incredible resilience in this unprecedented situation, and we look forward to partnering further with them going forward,” said Seema Verma, CMS administrator, in a statement.