Revealing the promise of digital forms and documents
DeeDee Kato is vice president of corporate marketing for Foxit. She has over 20 years’ experience in product management and product marketing for large technology companies, focused on solutions supporting ERP systems and large-scale commercial off-the-shelf software installations in both the public sector and private industry.

DeeDee Kato, Vice President, Corporate Marketing, Foxit
Despite years of effort to move away from paper-based systems, federal agencies continue to face a mountain of paperwork. In 2018, Congress passed the 21st Century Integrated Digital Experience Act (IDEA Act) to require agencies to digitize their services, but by 2020 congressional reports found that some were still far from meeting these requirements.
For example, the Department of Health and Human Services and its component agencies use over 5,000 unique paper forms. Within the Centers for Disease Control alone, that equates to about 1.3 billion paper forms processed annually. The Centers for Medicare and Medicaid Services processes roughly 1 billion paper forms. All that paper across these agencies adds up in cost and time to process.
And HHS is hardly alone. The judiciary, tax agencies, human resources departments, acquisition organizations and finance teams are just some of the areas where government manages a high volume of paper in its workflows.
The IDEA Act was put in place to push more services and forms online and bring savings to government. However, digitizing documents also requires new infrastructure investments to modernize websites, integrate applications and streamline the user experience. Prior to the pandemic, there was little momentum or urgency to invest in these upgrades.
The pandemic, though, has increased the need to digitize documents and forms. The remote work environment puts accessibility and security of digital information at the top of most CIOs’ priorities. What managers have yet to fully appreciate is that there are alternatives to commonly used PDF tools that offer competitive pricing and an improved user experience.
Canada makes a federal-wide investment
When the Canadian government established Shared Services Canada (SSC) in 2011, it gave the new agency a mission to transform how the government manages its IT infrastructure and streamlines its processes. That included supporting more than 200,000 federal employees with modern collaboration tools.
As at many government agencies, the investment choices made in the years leading up to the pandemic have proven to be big wins during the shift to remote work.
When Canada chose to work with Foxit, its executives were looking to lay the foundation for the government’s shift to becoming more agile, open and user-focused. Digitally enabling public servants was key to everything the government does, including the services it delivers, the policies it develops and its transformation initiatives. To succeed, the government needed standardized tools that empowered employees to be more responsive to Canadian citizens. That work focused on several areas:
- PDF viewing, editing and collaboration that make documents readable and easy to open. Reducing document file size and standardizing file formats are just two examples of how PDF productivity software streamlines the user experience.
- Handling large document files. By packaging files together, agencies can create PDF portfolios for work like contracts. Our software allows multiple file types to be kept together in their original formats using a PDF wrapper, and signatures applied within a PDF portfolio remain valid.
- User training to increase adoption of the software. Foxit provides virtual customer support for end users, which, in the case of the SSC, ultimately took the agency out of the equation and allowed Foxit to help individual users and departments across the agencies the SSC serves.
The SSC decided to make Foxit’s PDF Editor a standard on employee workstations, along with other office collaboration tools, to streamline how employees work. Additionally, the agency was ultimately able to save on procurement, licensing and support costs by standardizing the services it offers across the board.
Foxit is a best-of-breed PDF solution
Adobe created the PDF format, but it was released as an open standard in 2008, and there are many vendors that provide PDF editing capabilities. Over the last 20 years Foxit has given enterprises and government agencies an effective alternative to create, edit and manage their digital documents.
As an open standard, PDF is governed by an ISO committee and the PDF Association, which guide the industry standards for the technology, and Foxit sits on the board as a key leader on the specifications.
We understand the kinds of features government agencies are looking for: software that handles complex scenarios like editing, bookmarking or redacting huge documents and case records; a tool that can quickly scan documents and make them text-searchable. There is a wide range of use cases that a tool for editing and signing documents digitally can address.
Ours integrates seamlessly with FedRAMP-approved, cloud-based e-signature solutions such as DocuSign, which is critical for complying with the signature requirements of the IDEA Act.
Besides the full feature capabilities of an enterprise product, cost is a huge concern among our partners who need a large number of licenses for an effective enterprise-wide solution. Though the industry is leaning towards subscription-based licensing, this can end up being a more expensive solution.
Foxit offers a licensing choice. We offer a perpetual, one-time-cost license, which saves up to one-third of the cost on average, as well as a subscription-based license for those who need it. Now large agencies can make the solution available to more users within the scope of their budget.
Ultimately, our goal is to work with our partners on a roadmap to digitize their documents and forms.
Learn more about how Foxit can help your organization take the next steps toward modern PDF editing tools.
DOD is actually doing a good job in keeping some of its weapons secure
The Department of Defense — which has struggled to secure its IT, especially in its supply chain — has met most of the cybersecurity best practices for keeping critical weapons systems secure, according to an inspector general report.
The report, published Friday, served as a checkup on the cybersecurity of weapons later in their lifecycles. Each branch of the military and U.S. Special Operations Command demonstrated timely analysis and reaction to cyberthreats, which earned them a pat on the back from the IG.
“We identified best practices employed by program officials that ensured that information gathered and analysis performed was sufficient to identify and mitigate potential malicious activity, cyber vulnerabilities, and threats; and assess the effectiveness of protection measures within the weapon system for data and cyber resiliency,” the report stated.
The report looked specifically at the operations and support phase of the acquisition lifecycle, which is the last phase in which weapons are used and sustained through repairs and updates. Operations and support lasts for years, making it a ripe time for damaging cyberattacks as systems age and are in use. Leaders have warned that weapons systems remain vulnerable and must be constantly monitored since even slight variations in control could undermine trust and accuracy of their deployment.
The report dove into five systems, including the B-2 Spirit bomber and guided missiles used by the Navy. Program offices for the many other weapons systems not inspected in this report should learn from the best practices implemented on the five systems the IG audited, the report noted.
“We did not identify any internal control weaknesses related to developing and updating cybersecurity requirements based on risk for the programs we assessed,” it said.
One of the critical best practices was the timely sharing of information, a difficult function for siloed government agencies. By sharing new cyberthreats between offices for different systems, patches and risk mitigations could be deployed more quickly. The IG pointed to working groups that different agencies and program offices formed to boost collaboration on threat assessment and risk mitigation as a best practice others should adopt.
“Program officials for all weapon systems should consider the best practices described in this report when developing plans and procedures for reducing cybersecurity risks within their programs,” the report stated.
Unusable and unused vaccine systems hinder Biden’s COVID-19 response
The federal government must build working IT systems to manage COVID-19 vaccines and share real-time data on supply and immunizations, say technology and health experts.
Under-resourced states and localities didn’t receive the tech they needed from the Trump administration to support vaccine rollout, according to current and former federal and state officials. States expected the federal government to provide at least one functional system for distribution, scheduling and reporting they could use. But the officials say there’s still time to develop such a system, given the uneven vaccine rollout nationwide.
Federal agencies also withheld “a lot” of data on the pandemic prior to the 2020 presidential election, said Ryan Panchadsaram, a board member for COVID Act Now, a nonprofit formed to make more pandemic data public. Most federal data products were developed for internal use by agencies as they sorted out their response to the pandemic, said Panchadsaram, who was a U.S. deputy chief technology officer under President Obama.
COVID Act Now and other organizations like it are helping to fill the data gap the government is still working to address. The nonprofit tracks cases, case count confidence, infection growth rate and vaccine data. Others, like The Atlantic’s COVID Tracking Project, focus on data like hospitalizations — which the Centers for Disease Control and Prevention didn’t begin releasing on its own until December.
States started receiving vaccines in December. But the Trump administration neglected to prepare a process for vaccine scheduling that relied on multiple systems, even though companies like ZocDoc had been offering their services since April. So, it’s up to the Biden administration to task agencies and industry with developing interoperable solutions for vaccine scheduling and reporting to ensure people who want to be immunized get the information they need.
“The real questions right now are: Where can I get vaccinated and when?” Panchadsaram told FedScoop. “Am I eligible?”
The Centers for Disease Control and Prevention awarded consulting firm Deloitte a no-bid contract back in May to build a $44 million system capable of providing the answers. But many states found its Vaccine Administration Management System (VAMS) — to which they were given free access to handle distribution, scheduling and reporting — unusable and have since purchased alternatives. VAMS has been plagued by bugs that canceled appointments, told people they were registered when they were not and locked staff out of the reporting section.
The numbers are simply too great for the data management systems to be fragmented.
Assuming 75% of the U.S. population — about 250 million people — gets vaccinated with two doses each means half a billion injections. That requires cooperation among government at all levels, pharmacies and industry partners, but also a federally supplied “single source of truth” on vaccination numbers, among what groups, efficacy and health monitoring, Casey Coleman, former chief information officer of the General Services Administration, told FedScoop.
States ill-equipped
Scheduling alternatives to VAMS include the Maryland Partnership for Prevention’s commercial system PrepMod, adopted by other states, and even the events platform Eventbrite, which some Florida counties continue to use despite the state finally launching its own system. While far from perfect, these more established systems have generally given states less trouble scheduling vaccinations.
Alaska’s state government uses PrepMod, which has proven the easiest way to book vaccine appointments, according to officials. Despite that decision, the state has had a “bumpy” vaccine rollout because PrepMod updates have taken a backseat to work on other software for reporting data to the federal and state governments, Brendan Babb, chief innovation officer of Anchorage, said during a recent New America event.
The city’s innovation team had to write code to scrape the PrepMod site in order to show people what appointments are available, because the system lacks an application programming interface (API), Babb said.
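Anchorage’s workaround reflects a common pattern when a vendor system exposes no API: fetch the public scheduling page and parse its HTML for open slots. The sketch below is purely illustrative; the URL and CSS class names are hypothetical placeholders, not PrepMod’s actual markup.

    # Illustrative sketch only: scraping a public scheduling page that lacks an API.
    # The URL and CSS selectors are hypothetical placeholders, not PrepMod's real markup.
    import requests
    from bs4 import BeautifulSoup

    SCHEDULE_URL = "https://example.org/clinic/appointments"  # hypothetical page

    def fetch_open_appointments(url=SCHEDULE_URL):
        """Download the scheduling page and return open slots as simple dicts."""
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        soup = BeautifulSoup(response.text, "html.parser")

        slots = []
        # Assume each open slot is rendered as an element with class "appointment-slot".
        for card in soup.select(".appointment-slot"):
            clinic = card.select_one(".clinic-name")
            time_slot = card.select_one(".slot-time")
            if clinic and time_slot:
                slots.append({
                    "clinic": clinic.get_text(strip=True),
                    "time": time_slot.get_text(strip=True),
                })
        return slots

    if __name__ == "__main__":
        for slot in fetch_open_appointments():
            print(slot["clinic"], "-", slot["time"])

In practice a team doing this would also cache results and throttle requests so the scraper doesn’t burden the scheduling site, which is part of why a proper API is the better long-term fix.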
Seniors applying for the vaccine using PrepMod have to fill out seven screens of information, and difficulties have forced officials to lean on phone sign-ups. Signing up 15 seniors by phone takes two hours, and they’re being asked pregnancy questions, Babb said.
“This is an issue that we saw coming as soon as we knew that the federal government was only handling the logistics portion and that the last-mile delivery would be handled by the states,” said Hana Schank, director of strategy for public interest technology at New America, during the same event. “We know from our work that the states are really not equipped to handle that kind of work; they’re not equipped from a staffing perspective, and they’re not equipped from a technology perspective.”
The application CDC ‘forgot’
To make matters worse, the CDC is sitting on unused software that states struggling with last-mile delivery could use as a fallback.
The application, CDC Assist, was developed as part of the pandemic response plan after the H1N1 flu to distribute vaccines, Andrew Needleman, CEO of DoseSpot, told FedScoop. The United Way involved Needleman in the CDC Assist program when it was still known as Flu on Call in 2013, and he’s spent the last two months trying to raise awareness on Capitol Hill and in the White House.
The web and phone app qualifies patients, directs them to pharmacies and other distribution centers that have the vaccine in stock, and notifies them where to get their shot when eligible — eliminating lines. Rides can even be arranged for people who can’t afford them.
CDC Assist’s last-mile delivery would complement scheduling systems, but the agency hasn’t told Needleman why it sits dormant. The CDC did not respond to multiple requests for comment.
“It’s one of those solutions where it’s as simple as basically just turning it on because we actually have been doing testing every year in order to make sure everything is ready to go,” Needleman said. “Sometimes people forget about things like this.”
Needleman has spoken to Biden administration officials and said they’re interested in using the program now and in the event of future outbreaks. But it’s unclear when they’ll give the green light and who’s responsible for doing so.
Launching CDC Assist would take about a week because, while the program is mostly automated, a call center consisting of a small staff needs to be stood up for people who aren’t tech-savvy, Needleman said.
The app qualifies people for the vaccine based on their state’s criteria and sends secure messages through pharmacies’ existing e-prescribing infrastructure. That’s important because some pharmacies are currently writing and faxing fake prescriptions to themselves just to get people who show up for vaccination into their systems.
CDC Assist can also connect to less traditional systems that distribution centers are using for vaccine scheduling like Eventbrite.
The program isn’t the only one sitting on the shelf either. CDC has another tool capable of checking vaccine supply at locations that works in tandem with CDC Assist. The agency intends to flip the switch but hasn’t said when, Needleman said.
Troubled transition
The Trump administration established systems like VAMS quickly, but the Biden administration is still assessing their usefulness.
Under President Trump, Operation Warp Speed saw the creation of Tiberius, a Palantir-developed software platform designed to analyze vaccine administration data the CDC receives from states and localities and then visualize national uptake.
While Tiberius is being used by the Biden administration, just how good a picture it’s painting of the U.S. vaccination effort remains unclear.
White House Press Secretary Jen Psaki couldn’t provide specifics on the vaccine stockpile during a Jan. 26 press briefing, saying only that the number is monitored daily through Tiberius and other systems.
“It sounds like a magical creature a little bit, to me. But it provides vaccine information — publicly available information on vaccine supply that’s gone to states and what’s been used,” Psaki said. “It doesn’t mean it’s perfect. Oftentimes it isn’t.”
The fact that the Palantir-developed system is named after “Star Trek” Capt. James T. Kirk’s middle name aside, neither state and local officials nor the public are informed of Tiberius’ insights.
Psaki further noted the Biden administration had only been in control six days. But it only took the Department of Health and Human Services nine days in April to create HHS Protect, the health insight platform informing the Trump administration on COVID-19‘s spread.
Asked about the government’s progress developing a clearinghouse for vaccine information on Feb. 8, Psaki had no updates.
“There is a great deal of confusion,” Psaki said. “And one of the focuses we have had is trying to alleviate that confusion.”
Operation Warp Speed was about funding the development and manufacturing of vaccines, but with the delivery phase must come a change in approach designed to eliminate bottlenecks, Coleman said.
Coleman now works for Salesforce, which recently launched a cloud service aimed at helping communities, including Lake County, Illinois, manage the scale and monitoring of vaccine delivery.
At the end of the day, such systems still need to connect back to a single, federal source of truth for vaccine reporting, experts say.
“This is the top priority of the Biden administration, so we’re seeing a lot of focus on connecting those dots — on the CDC playing a key role in terms of clear guidance,” Coleman said.
President Biden‘s COVID-19 strategy and executive order calling for a data-driven response offer promise that federal coordination on vaccine scheduling, data collection, information sharing and the required systems is coming. But a multidisciplinary team consisting of data, tech and health policy experts needs to quickly pick up where Operation Warp Speed left off to ensure equitable last-mile vaccine delivery and immunization tracking in real-time. While the Trump operation has not been officially declared over, its deadline for delivering vaccines was January.
“There are rumors that the U.S. Digital Service might take this on or that they might build a federal tech team together to work on this,” Schank said. “So there is hope.”
USDS hasn’t been directly tapped for vaccine data consolidation yet but is discussing how it can help with COVID-19 response, according to a spokesperson.
The team was responsible for the rollout of HealthCare.gov and served as the government’s fix-it team during the Obama administration. Tapping USDS to right the ship on vaccination data gathering could signal a return to its roots and resurgence under Biden, but that would require rapid hiring of the necessary talent. The team posted a hiring announcement to its Twitter page on Jan. 20.
In the meantime, Biden appointed Cyrus Shahpar COVID-19 data director, and under him, the Data Strategy and Execution Workgroup has ensured daily releases on healthdata.gov.
Taking more than ‘one swing’
States forgoing federal vaccine systems face their own struggles given the wide-ranging IT they buy or, in some cases, fail to buy.
CIOs supporting health departments have had to rely on two types of vaccine scheduling solutions, both originally designed for other purposes.
Enterprise, end-to-end scheduling applications can track vaccination eligibility, send reminders and integrate with state immunization systems. But they’re hard to adapt to states’ specific planning, technology, budgetary and population needs, said Raphael Lee, administrator of U.S. Digital Response’s Health Data Program, during the New America event.
Meanwhile, standalone scheduling solutions may not be health-related, like Eventbrite, or be repurposed after helping with COVID-19 testing, like Solv. The burden of selecting the best vendors, carrying out procurements and ensuring data is interoperable falls to states, Lee said.
While you can see “some foresight” in funding VAMS, Panchadsaram said, the Trump administration likely awarded the contract to the most convenient vendor: Deloitte. The same was true when HHS fast-tracked Palantir, which had been working on similar projects, to develop HHS Protect, citing “unusual and compelling urgency.”
HHS did not respond to multiple requests for comment.
The CDC has attributed some of states’ issues with VAMS to user error. But that raises the question of whether Deloitte spent enough time with states refining the system’s scope and usability, Panchadsaram said.
“The CDC’s VAMS is one element in the complex vaccine ecosystem and has performed strongly, with consistent stability and availability since it launched in mid-December,” said a Deloitte spokesperson. “The overwhelming demand for vaccinations far exceeds the current vaccine supply and the availability of appointments. While we understand the frustrations many people are experiencing, they are unrelated to VAMS’ technology.”
At least nine states, one territory and one hospital system use VAMS.
Meanwhile, the CDC’s VTrckS vaccine tracking system is largely being used to report shots in arms. But interoperability with state systems has been a problem, leading to data transfer delays as state officials personally facilitate the exchange.
“A lot of these systems aren’t communicating well with each other,” Barr said. “So people are manually having to type in information.”
Investing in systems to improve their interoperability, automate data collection from electronic health records and other streams, and update legacy immunization registries won’t come cheap, but it will be harder to justify after the pandemic. And leaving it to counties to build or buy the right systems clearly isn’t working, Panchadsaram said.
Having one contractor attempt to build the perfect system for federal, state and local governments keeps failing, so it would be better to introduce competition, he said.
“What we need to do is find a way to really encourage groups within the government — like 18F and USDS on the federal level, contractors on the outside that can work in agile, iterative ways — to really try to build multiple solutions,” Panchadsaram said. “We tend to fail on technical projects a lot because we just take one swing at it.”
Former Air Force tech leader Will Roper joins Pallas Advisors
Former Air Force tech and acquisition leader Will Roper has joined strategic advisory firm Pallas Advisors, the group announced Tuesday.
Roper, who left his role as the assistant secretary for acquisitions, technology and logistics in late January, was one of the military’s biggest boosters of emerging technology, especially artificial intelligence. As a senior counselor, he will continue to work on issues similar to those he led while in government — though his position is only part-time, according to the firm.
“Dr. Roper is widely viewed as one of the Department’s most innovative leaders in recent years and we are excited to welcome him to Pallas Advisors,” Tony DeMartino, a founding partner of the firm, said in a release. “He is well-versed in the challenges our clients face and will add immense value to our mission.”
Pallas Advisors has a slew of former high-ranking officials who serve on its board and as consultants, including Tony Thomas, a former commander of U.S. Special Operations Command, and Gary Cohn, who was director of the U.S. National Economic Council under President Trump. Along with DeMartino, the firm was co-founded by Sally Donnelly who served as a senior adviser to then-Secretary of Defense Jim Mattis. In that capacity, she played a role in building the Pentagon’s relationships in Silicon Valley.
Before he left government, Roper expressed interest in taking a new role in the Biden administration, but as a political appointee he departed at the end of the Trump administration. He said his passions lie in pursuing stronger innovation policies that could deter America’s adversaries, especially China.
“Fully unlocking private sector innovation for the military was my passion while serving in government,” Dr. Roper said in a statement. “I am excited to work with the Pallas team to continue this important pursuit.”
Correction: Feb. 17, 2021. An earlier version of this story incorrectly stated that Sally Donnelly worked directly on the Pentagon’s JEDI procurement.
CIO who oversaw the first ‘online census’ leaves Census Bureau
The Federal Housing Finance Agency named Kevin Smith its chief information officer after he left the Census Bureau.
He took over for acting CIO Tom Leach, who remains the agency’s chief technology officer, on Jan. 3.
Smith is no stranger to the federal CIO role, having spent the last four years as the Census Bureau‘s CIO. His last day there was Dec. 31, 2020.
As the bureau’s CIO, Smith led its IT program and oversaw the launch of enterprise services essential to the first primarily online census in 2020. That included emerging technologies for data collection, analytics, processing and dissemination.
Still, the 2020 census faced its fair share of challenges, namely the COVID-19 pandemic. Initial census results were delayed from Dec. 31 to April 30 due to the interruption in the collection of responses.
The delay gives data scientists extra time to resolve any processing issues that may arise during the apportionment process, where state population counts are used to determine the number of seats they get in the House. IT systems doing the processing faced an increased risk of defects because testing was fast-tracked to end in October, rather than January, according to the Government Accountability Office.
Initial census results will include total population counts for the nation; states; Washington, D.C. and Puerto Rico but not demographic breakdowns by race, sex and ethnicity. Those will come in a later release, according to the bureau.
“If this were a typical decade, we would be on the verge of delivering the first round of redistricting data from the 2020 Census,” wrote James Whitehorne, chief of the Redistricting and Voting Rights Data Office, in a blog on Friday. “However, COVID-19 delayed census operations significantly.”
The bureau’s focus on apportionment means redistricting counts won’t be delivered to states until Sept. 30, he added.
Census Deputy CIO Skip Bailey is performing the bureau’s CIO duties until a replacement is hired through an executive search process.
Space Force starts transitioning cybersecurity professionals into its ranks
The Space Force started receiving its first cybersecurity personnel from other military services at the beginning of February, the chief of space operations said recently.
Most of those cyber personnel transitioning into the new force come from within the Department of the Air Force, which oversees the Space Force. In total, the force has brought in 2,400 of the 6,400 active-duty cyber personnel it’s planning for, Gen. John Raymond, chief of space operations, told reporters during a Defense Writers Group media call.
These Cyber Guardians — what members of the Space Force are called — will be protecting satellites and other space-based assets from hacking. While Space Force leaders often repeat they want to keep the newest branch of the military “lean,” cyber personnel is one category they are actively bringing onboard.
“There’s a spectrum of threats that are out there. Everything from reversible jamming of satellites and GPS satellites, communication satellites, GPS satellites,” Raymond said. “And there’s cyber threats.”
Civilian leadership in the Department of the Air Force has put greater emphasis on satellite security. At the DEF CON 2020 conference, the Air Force and Space Force partnered with ethical hackers to find better ways to harden their cyberdefenses. Working with outside experts helped the department to better identify vulnerabilities.
But now the force wants its own cyber personnel to boost its cyber expertise.
“They will be part of our crew force; they’ll understand the cyber terrain of space and will help us protect this critical domain from that threat,” Raymond said of the new cyber operators in the Space Force.
Space Force acquisition professionals have also been at work to increase cybersecurity by inking new deals with private security companies. One recent deal with Xage Security will build a zero trust-style security system to protect space assets.
CDO Council issues first report to Congress
The Federal Chief Data Officers Council plans to begin member-developed projects advancing innovative data practices this year, according to its first report to Congress.
Projects include developing a framework for sharing decision-support dashboards across agencies, creating a data skills training program playbook, finding new ways to analyze public comments, and using data to better manage wildfire fuels.
The council began considering projects in May, and the Office of Management and Budget made final funding decisions in October.
“By delivering data and analytics solutions to our leaders and field employees, we can have a major impact on how federal agencies more efficiently and effectively serve the public,” Ted Kaouk, who chairs the council and serves as CDO of the U.S. Department of Agriculture, wrote in the report. “By implementing data governance, data workforce strategies, and data management best practices, we can enable access to high quality, timely data that will improve evidence-based policymaking.”
The report’s release corresponded with the public unveiling of the council’s new cdo.gov website for sharing updates on priorities, programs and events, as well as engaging on how to improve access and use of federal data.
Monthly meetings will continue in 2021 to support CDOs, still relatively new to their positions, and improve ties with other interagency councils. The council held 11 monthly meetings last year.
“The Council’s first year has been focused on setting up its governance structure, building a CDO community and relationships with other intergovernmental councils and groups, sharing best practices/lessons learned, strategic planning, and supporting CDOs in their implementation of the Federal Data Strategy (FDS) Action Plans,” reads the report.
There’s been no word on when the 2021 Action Plan will be released, the 2020 Action Plan having dropped in December of 2019.
The council’s short-term goals included developing a learning community, demonstrating its strategic value, establishing an operating model and creating an FDS roadmap. The body will continue to encourage data-sharing agreements between agencies and strong privacy protections moving forward.
The report further lays out the council’s tiered structure, membership and six initial working groups:
- Operations,
- COVID-19 Data Coordination,
- Data Skills,
- Data Sharing,
- Small Agency Committee, and
- Chief Financial Officers Act Agency Committee.
The council, which released the report in accordance with the Foundations for Evidence-Based Policymaking Act, is set to sunset in 2025, two years after the Government Accountability Office evaluates it, barring renewal.
CMMC model tweaks coming after industry feedback
The foundation of the Cybersecurity Maturity Model Certification (CMMC) — the Department of Defense’s new cyber requirements for contractors — will see some changes, its leaders recently said.
The DOD will make alterations to the highest level of the five-tier security model after receiving public comments on the recently issued CMMC Defense Federal Acquisition Regulation Supplement (DFARS) rule.
The department issued an “interim final” rule in September instead of first issuing a proposed rule, which meant the rule took effect upon publication. But there was still a 60-day comment period for industry to weigh in. The Office of Management and Budget, which hosts the council overseeing acquisition rules, allowed for this because of “the threat to national security” embedded in supply chain vulnerabilities, Jessica Maxwell, a DOD spokeswoman, said in a statement.
“We did not plan to make changes to the DFAR rule,” Maxwell said. She added: “We also recognize that as the threat is not static nor should our model not be static, we are always evaluating the best standards to implement to address relevant threats.”
The DOD is also looking to update its CMMC assessment guides as a part of the comment adjudication process. DOD’s authority to create the assessment guides, which will be used by CMMC assessors, was outlined in a recently released statement of work in a contract between DOD and the CMMC Accreditation Body (CMMC-AB), which is the organization charged with implementing the program and overseeing the assessors and CMMC ecosystem.
CMMC was designed to close the many cybersecurity gaps in DOD contractors’ networks through third-party verification. But the new rule won’t be widely adopted in contracts until fiscal 2025.
The biggest change under CMMC is that now contractors will need to get a third-party assessment for their networks. No longer can they perform a self-check to ensure they are meeting standards. Instead, they will need to hire an assessor to verify it.
DOD received comments from contractors and trade groups, many advocating for clear guidance on the reciprocity between the CMMC controls and other federal IT compliance programs, like the Federal Risk and Authorization Management Program (FedRAMP).
“As the Department moves forward with the CMMC, we believe that it is important to get its implementation right by developing and implementing those cybersecurity protocols that are necessary, while simultaneously guarding against actions and regulations that do not add security and result in harm to industry’s ability to innovate and partner with DoD,” trade group ITI wrote in its comments to DOD. ITI also recommended clearer guidance on how subcontractors will be handled under flow-down requirements.
It’s unclear exactly what changes DOD plans to make, but the announcement also comes after the publication of new protective guidance from the National Institute of Standards and Technology, SP 800-172. Maxwell said the process for adjudicating the comments is not related to the new publication, but Stacy Bostjanick, the acting director of supply risk management at the DOD, told InsideCybersecurity, which first reported the changes to the rule, that the department is also trying to “sync” CMMC levels four and five with NIST’s new guidance. Very few companies will need to meet those levels, DOD said previously.
GAO tells VA to stop rollout of $16B EHR program, but the VA ‘doesn’t plan to’
Editor’s Note: This story has been updated with comment from the VA.
The Government Accountability Office has recommended that the Department of Veterans Affairs stop work on its new electronic health record (EHR) modernization program to conduct “critical” tests before launching at any more medical centers.
The EHR system has faced critical shortfalls and the VA hasn’t completed tests that could result in the failure of the system at the heart of the 10-year, $16 billion modernization program, the GAO states in a report released Thursday.
The system is currently live at a Spokane, Washington medical center with no major reported issues. But as the program continues to be rolled out, the VA’s new health care IT system could falter, the GAO report cautions. Thus, the GAO recommends that the VA “postpone deployment of its new EHR system at planned locations until any resulting critical and high severity test findings are appropriately addressed.”
While the VA responded to the report’s findings by “concurring in principle,” it told FedScoop that it doesn’t plan to stop the rollout.
“The Department of Veterans Affairs (VA) does not plan to stop the launch of VA’s new electronic health record system,” the VA said in a statement. “VA appreciates the opportunity to review the recent Government Accountability Office (GAO) report regarding the progress of VA’s Electronic Health Record Modernization (EHRM) program and the disposition of test findings in relation to subsequent deployments.”
The GAO had a stark warning for VA if it doesn’t properly test and evaluate the system.
“If VA does not close or appropriately address all critical and high severity test findings prior to deploying at future locations, the system may not perform as intended,” the report warned.
However, the VA said that its current rate of testing and risk mitigation strategy will suffice.
While the GAO doesn’t have the grounds to force the VA to do anything, its reports are used by lawmakers who can censure the department leadership in congressional hearings and require them to take action.
The VA apparently disagreed with some of the specifics in the GAO recommendations. After reviewing a draft version of the report, the VA asked the GAO to change some of the report’s language to soften its negative tone.
“Specifically, in the title [Office of Electronic Health Records Modernization] recommends changing the phrase ‘…but Subsequent Test Findings Will Need to Be Addressed’ to read ‘..and Test Findings are being Addressed,'” the report states.
GAO declined, saying its “recommendations are appropriate, as presented.”
At the time of the first “go-live” in October, congressional aides had concerns with the launch. One told FedScoop at the time the VA was only “5 percent” ready. The next leg of the rollout is scheduled to be at the Puget Sound Health Care System in the fourth quarter of fiscal 2021.
Despite congressional and GAO concerns, the VA remains confident in the systems it developed and its processes for dealing with future bumps in the road.
“VA has made significant progress over the last few months and we are well-positioned to continue moving forward while minimizing impact to providers and Veterans,” the VA told FedScoop. “VA is taking every precaution to deliver a safe and effective system for our clinicians and users and remains committed to getting this right for our Veterans.”
The EHR system has faced previous delays due to inconsistencies in system performance and a need for more testing. VA staff leading the EHR program told FedScoop in October at the launch “we are getting positive comments,” despite some “usual jitters” among the staff about using the new technology.
The Department of Defense is also migrating its electronic health records system to the same Cerner Millennium cloud-based platform so that it can be linked to the VA system for the seamless transfer of records when service members retire.
NASA sends AI to space with first commercial edge computing system
When you need computing power at the edge, often that means buying extra hardware for far-flung offices or maybe loading a system on to a truck. But for some agencies, getting compute to the edge means going to infinity, and beyond.
Thursday, NASA and Hewlett Packard Enterprise announced that they will test the limits of the term “edge computing” with a new computer designed to deliver artificial intelligence in space. Later this month, the new Spaceborne Computer-2 will become the first high-performance commercial computer to operate in space on the International Space Station.
HPE says Spaceborne Computer-2 will allow astronauts to process data in minutes that used to take months. Once the system is launched and assembled in space, NASA will use it for at least the next two years, giving astronauts the power to use AI and other advanced computing capabilities that were once out of reach in space.
Bringing this type of computing capability to space “is just the first step in NASA’s goals for supporting human space travel to the Moon, Mars and beyond where reliable communications is a mission critical need,” HPE said in its release.
“The most important benefit to delivering reliable in-space computing with Spaceborne Computer-2 is making real-time insights a reality. Space explorers can now transform how they conduct research based on readily available data and improve decision-making,” said Dr. Mark Fernandez, HPE’s principal investigator for Spaceborne Computer-2.
Getting and using computers in space is no easy task. First, just putting the hardware into orbit involves shooting it on a rocket — rattling, shaking and jolting through the atmosphere for minutes on end. Once in space, if the computer’s complex circuits still work, the zero-gravity environment and constant exposure to the sun’s radiation present further challenges. However, Spaceborne Computer-2 was built off a prototype launched into orbit in 2017. And HPE specially designed it to sustain operations in space, along with software coded for space-based work.
Astronauts will use the computer to process data from the space station, satellites, cameras and other sensors. Loaded with the necessary graphics processing units (GPUs), Spaceborne Computer-2 will be ready to process everything from photos of polar ice caps to medical images of the astronauts’ health, according to the news release. The GPUs’ processing power will be enough to fuel AI and machine learning capabilities, eliminating the need to send data back to Earth for ground-based processing.
“Edge computing provides core capabilities for unique sites that have limited or no connectivity, giving them the power to process and analyze data locally and make critical decisions quickly,” said Shelly Anello, general manager of converged edge systems at HPE.
HPE partnered with Microsoft Azure to provide additional compute resources through its Azure Space cloud capability recently launched to support NASA, Space Force and other partners.