Energy is using cyber risk assessments to make cloud decisions
The Department of Energy has started factoring quantitative cybersecurity risk into its internal budget decisions.
DOE adopted the Factor Analysis of Information Risk (FAIR) management framework and has begun initial, daily risk assessments at interested national laboratories, Emery Csulak, the department’s chief information security officer, told FedScoop.
This fall, DOE plans to onboard even more agencies.
“Our goal is to find the army of the willing — get that buy-in early in the process — so that we’re not sitting there spending all of our time fighting with the naysayers,” Csulak said.
So far, FAIR has been employed to weigh the pros and cons of adopting a particular cloud service, or of migrating a certain product into the cloud versus keeping it deployed locally, he added.
The goal is to use FAIR to make business cases during the fiscal 2021 budget process with the Office of Management and Budget, Csulak said.
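FAIR expresses cyber risk in dollars, roughly as loss event frequency multiplied by loss magnitude, which is what makes a cloud-versus-on-premises comparison possible in budget terms. Below is a minimal sketch of that kind of comparison, using a simple Monte Carlo simulation; the frequency and magnitude ranges are invented for illustration and are not DOE figures, and the uniform draws stand in for the calibrated distributions a real FAIR analysis would use.

```python
import random

def simulate_annual_loss(freq_range, magnitude_range, trials=10_000):
    """Monte Carlo estimate of annualized loss exposure, FAIR-style:
    loss event frequency (events/year) x loss magnitude ($/event)."""
    losses = []
    for _ in range(trials):
        events = random.uniform(*freq_range)                # loss events per year
        loss_per_event = random.uniform(*magnitude_range)   # dollars per event
        losses.append(events * loss_per_event)
    return sum(losses) / trials

# Hypothetical ranges for illustration only -- not DOE's numbers.
on_prem = simulate_annual_loss(freq_range=(0.5, 3.0), magnitude_range=(50_000, 400_000))
cloud = simulate_annual_loss(freq_range=(0.2, 1.5), magnitude_range=(75_000, 500_000))

print(f"Estimated annualized loss exposure, on-prem: ${on_prem:,.0f}")
print(f"Estimated annualized loss exposure, cloud:   ${cloud:,.0f}")
```

Comparing the two estimates, alongside the cost of each option, is the kind of quantified business case the department aims to bring to OMB.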
Recently the National Institute of Standards and Technology opted to formally reference FAIR within its Cybersecurity Framework.
“That’s a major coup,” Nick Sanna, president of the FAIR Institute, said Tuesday at FAIRCON 2019.
DOE was one of the first agencies to use FAIR because its offices are “highly federated,” Csulak said.
“They need to be able to make honest risk decisions at the level where they affect their operational capabilities — whether or not it’s with science or nuclear protection,” he said.
Rather than wait for the “perfect set of metrics,” DOE started talking to vendors about their quantified risk management approaches and launching risk analysis projects to establish standard operating procedures.
The DOE Office of Inspector General and the Government Accountability Office have already been impressed with the department’s use of the FAIR model, Csulak said.
“When you can demonstrate that you have put forward a thoughtful means of looking at risk, they’re very receptive to that,” he said.
NIST defines zero trust architecture, releases use cases
The National Institute of Standards and Technology wants feedback on its definition of zero trust security architecture and potential deployments — outlined in a draft special publication released Monday.
Zero trust refers to the narrowing of cyberdefenses from wide network perimeters to micro-perimeters around individual resources or small groups of resources, NIST says in the new guidance.
No implicit trust is given to systems based on their location, and user and device authentication is required prior to establishing a connection. This is particularly important as more employees work remotely and data is migrated to the cloud.
While zero trust architecture (ZTA) isn’t a foreign concept to agencies, more research and standardization is needed to improve their overall security posture, according to NIST.
“[M]any organizations already have elements of a ZTA in their enterprise infrastructure today,” reads the document. “Organizations should seek to incrementally implement zero trust principles, process changes, and technology solutions that protect its data assets and business functions.”
In addition to providing a ZTA roadmap, the document highlights a number of use cases including agencies with satellite facilities, multi-cloud environments and contracted services.
ZTAs still face unique cyberthreats, among them a compromised policy engine or policy administrator — the components that approve connections between resources — denial-of-service attacks or network disruptions targeting those components, and insider threats.
The Federal Information Security Management Act, Trusted Internet Connection 3.0, and Continuous Diagnostics and Mitigation programs all play into zero trust because they restrict data and service access to authorized parties, the end goal being to eliminate all unauthorized access. Access control enforcement should be as granular as possible, according to NIST.
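As a rough illustration of that per-request, per-resource enforcement, the sketch below shows a toy policy decision function that grants access only after checking user authentication, device posture, and an explicit resource-level policy, while deliberately ignoring network location. The roles, resources, and attributes are hypothetical and are not drawn from the NIST draft.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool      # e.g., MFA completed for this session
    device_compliant: bool        # device posture check passed
    user_role: str
    resource: str
    network_location: str         # intentionally ignored below

# Per-resource policy: which roles may reach which resource (hypothetical).
RESOURCE_POLICY = {
    "payroll-db": {"hr-analyst"},
    "source-repo": {"developer", "release-engineer"},
}

def authorize(req: AccessRequest) -> bool:
    """Zero-trust-style check: no implicit trust from network location;
    every request is evaluated against identity, device, and resource policy."""
    if not req.user_authenticated or not req.device_compliant:
        return False
    allowed_roles = RESOURCE_POLICY.get(req.resource, set())
    return req.user_role in allowed_roles

# A request from inside the corporate network is evaluated the same way
# as one from a coffee shop -- location grants nothing on its own.
print(authorize(AccessRequest(True, True, "developer", "source-repo", "corp-lan")))   # True
print(authorize(AccessRequest(True, False, "developer", "source-repo", "corp-lan")))  # False
```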
Most agencies will operate within a hybrid architecture as legacy information technology is modernized, NIST adds. While it’s possible to build a pure ZTA using a ground-up, greenfield approach, large agencies will require multiple tech refresh cycles, migrating one business process at a time.
“After enough confidence is gained in the workflow policy set, the enterprise enters the steady operational phase,” reads the report. “The network and systems are still monitored, and traffic is logged, but responses and policy modifications are done at a lower tempo as they should not be severe.”
Public comments on the document are due by Nov. 22.
Report: Why health data privacy needs more than HIPAA
Everyone who has an interest in healthcare data, including researchers, regulators, patients, and healthcare providers, now faces a conundrum. The increasing availability of health data from medical records, mobile sensors and apps, and public health sources is transforming the health sector. It’s becoming possible to predict medical vulnerabilities and tailor prevention and treatment to the individual more accurately than ever before. But the new promise of personalized medicine also carries significant risks. As the use of personal health data increases, so does the risk of privacy violations — and the harms that can result to both individuals and communities.
The nonprofit Center for Open Data Enterprise (CODE) has just released a report on Balancing Privacy With Health Data Access — the findings of CODE’s research and a recent roundtable on the topic co-hosted with the Office of the Chief Technology Officer (CTO) at the U.S. Department of Health and Human Services (HHS). The roundtable and the report include not only the perspectives of policymakers, but input from patients, patient advocates, and a broad range of stakeholders with an interest in health data privacy.
“The roundtable described in this report addressed the challenge: How should we balance the need for privacy with access to health data that can make groundbreaking insights possible?” says HHS Chief Data Officer Mona Siddiqui. “That discussion is part of a national effort to address data privacy at different levels of government. At HHS, we are at the center of this national conversation. Those of us who believe in technology’s potential for good must lean into this conversation and embrace that it will be messy, incremental, and iterative. But at the end of the day, the voice of the consumer and the voice of the patient needs to be loudest.”
The report finds that the privacy protections in the Health Insurance Portability and Accountability Act (HIPAA), passed in 1996, are no longer sufficient to ensure the data security that consumers need. For example, HIPAA does not cover many of the companies that gather health data from fitness trackers, genetic analyses, or other commercial processes and devices that are widely used. On another level, public data on social determinants of health (SDOH), including income, education, housing, and other factors, is increasingly used to predict health risks for individuals and communities – a trend that promises great opportunities to improve prevention and treatment, but also raises the risk of discrimination in health care coverage.
The CODE report provides a roadmap for healthcare professionals, policymakers, and patients and their advocates to understand the state of health data privacy, shortcomings in the current system, and possible remedies. It concludes with recommendations for improving health data privacy and use in several ways:
- Improve individual access to health data. HIPAA gives patients the right to access their health data, but many people don’t know it. By educating the public and enforcing this right, HHS can help give individuals more power over their data and how it is used.
- Hold health-related “business associates” accountable. HIPAA’s rules cover only certain kinds of businesses and organizations, including healthcare plans, providers, and clearinghouses. The rules are also supposed to cover “business associates” of those covered entities, but HHS does not monitor or enforce its rules directly for these business associates. The report recommends that HHS work with privacy experts to close this potential loophole.
- Help startups comply with HIPAA’s requirements. The cost of meeting HIPAA’s requirements can be prohibitive for data-driven health startups if they plan to download and analyze sensitive health data. However, HHS can provide “data containers” that enable these startups to work with sensitive data that is anonymized and held by HHS, avoiding the need for them to download the data and secure it. The Centers for Medicare and Medicaid Services (CMS) has developed a Virtual Research Data Center that can be expanded and used as a model for other, similar solutions.
- Create industry-wide ethical guidelines for consumer-generated health data. “Consumer-generated” health data, including data from fitness trackers, genomic analyses, and social media, is now virtually unregulated. HHS could convene industry leaders to develop ethical guidelines for data collection and use and communicate those to their customers.
- Increase access to data on social determinants of health – with legal protections. HHS and other government agencies, researchers, philanthropies, healthcare providers, insurers, and others all have a stake in using SDOH data to improve individual health. As the use of SDOH data grows, stakeholders should determine the best ways to increase access to this data while preventing its misuse.
- Use technology to improve patient consent for data sharing. Current methods of informed consent for the research use of data – a standard requirement of research protocols – are not sophisticated enough to allow for new uses of data as research opportunities evolve. New technology platforms can enable patients to provide “dynamic consent,” allowing them to choose more precisely how they do or don’t want their data shared and used in the future (a simplified sketch of such a consent record follows this list).
- Create patient-centered outreach and engagement programs. HHS and its partners can help clear up the current confusion around health data privacy. The next step could be to undertake a comprehensive outreach strategy to educate the public about patients’ data privacy rights and current regulations.
- Adopt legislation to broaden data privacy rights. Many roundtable participants believed that new legislation – not just enhancements to HIPAA – is necessary to protect the privacy of new kinds of health data, particularly the consumer-generated data that is now unregulated.
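To make the “dynamic consent” idea above concrete, here is a minimal sketch of a machine-readable consent record with per-purpose permissions that a patient could update over time. The field names and purposes are hypothetical and are not drawn from the CODE report; they simply show the kind of granularity the report calls for.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ConsentRecord:
    """A patient's revocable, per-purpose data-sharing preferences."""
    patient_id: str
    permissions: dict = field(default_factory=dict)  # purpose -> allowed?
    updated: date = field(default_factory=date.today)

    def grant(self, purpose: str):
        self.permissions[purpose] = True
        self.updated = date.today()

    def revoke(self, purpose: str):
        self.permissions[purpose] = False
        self.updated = date.today()

    def allows(self, purpose: str) -> bool:
        # Default deny: an unknown purpose requires a fresh consent request.
        return self.permissions.get(purpose, False)

# A patient opts into addiction research but not commercial analytics.
consent = ConsentRecord(patient_id="anon-1234")
consent.grant("academic-research/opioid-addiction")
consent.revoke("commercial-analytics")
print(consent.allows("academic-research/opioid-addiction"))  # True
print(consent.allows("marketing"))                           # False (default deny)
```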
Congress and HHS are now proposing regulatory changes that can improve the balance of privacy protection and health data access. Senators Amy Klobuchar and Lisa Murkowski have introduced the Protecting Personal Health Data Act, which would create a comprehensive set of policies to regulate the use and sharing of consumer-generated health data. And last month, in a move to increase access to data whose use is overly constrained, HHS proposed revising the privacy rule known as 42 CFR Part 2 to facilitate research on opioid addiction. Participants at a roundtable that CODE co-hosted with HHS in July 2018, which focused on data sharing to address the opioid crisis, specifically identified this rule as a major roadblock to research that restricts access to sensitive data without providing significantly more patient protection than HIPAA does.
HHS has also proposed additional measures to make data appropriately usable while protecting privacy, and has released a Request for Information on possible ways that HIPAA should be improved. The need for new approaches to health data privacy is clear – and HHS is moving forward to address it.
Joel Gurin is President, and Paul Kuhne is Roundtables Program Manager, at the Center for Open Data Enterprise.
Secret Service testing counter-drone tech at U.N. General Assembly
The Secret Service is piloting technology to counter drones at the United Nations General Assembly in New York City this week.
Counter-unmanned aircraft systems, or C-UAS, disrupt or disable drones, and while the Secret Service has used them before, it’s interested in testing them out in urban operations.
The Department of Homeland Security designates the U.N. General Assembly a “national special security event,” meaning the Secret Service comes up with the operational security plan for protecting heads of state.
A temporary flight restriction is in effect covering a two-mile radius around U.N. Headquarters, surrounding roads, airspace and the adjacent East River.
The Secret Service’s C-UAS tools include radio frequency (RF) detection and radar imagery, coupled with electro-optical/infrared (EO/IR) cameras. But the agency said in a privacy impact assessment the pilot is not designed to collect personally identifiable information from the public.
“All these technologies will be used to scan for and correlate detected and/or observed flying objects to accurately determine the probability that they are UAS, rather than non-threat items like flying debris or birds,” reads the report. “These technologies are able to intercept and access radio frequency signals used by UAS; however, the functions of storing or capturing the signals are not part of the pilot and will not be used.”
Some of the systems can identify the location of a UAS signal, which could in turn identify the operator. If an unauthorized drone is detected inside the restricted radius — whether it’s there by accident or on purpose — local police will attempt to approach the operator and discuss the threat “when time and circumstances permit,” according to the report. Otherwise, the Secret Service or its Coast Guard partner will use C-UAS supplied by the DHS Science and Technology Directorate.
EO/IR cameras are pointed at the sky or horizon and activate only when sensors alert them, limiting the chance that they incidentally capture personally identifiable information, according to the report. All imagery is viewed in real time and is not stored.
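A loose sketch of the correlate-then-activate pattern the assessment describes appears below: multiple sensor observations are fused into a UAS-likelihood score, and only a sufficiently high score triggers the EO/IR camera, with frames handled in memory and never written to disk. The sensors, thresholds, and scoring weights are all invented for illustration and do not reflect the actual C-UAS systems in use.

```python
def uas_probability(rf_signal_strength, radar_speed_mps, radar_size_m):
    """Very rough heuristic score (0-1) that a detected object is a UAS
    rather than a bird or debris. Weights and thresholds are illustrative only."""
    score = 0.0
    if rf_signal_strength > 0.3:        # controller/telemetry RF detected nearby
        score += 0.5
    if 2.0 < radar_speed_mps < 30.0:    # typical small-UAS speed band
        score += 0.3
    if 0.2 < radar_size_m < 2.0:        # typical small-UAS size band
        score += 0.2
    return score

def handle_detection(rf, speed, size, activate_camera):
    """Activate the EO/IR camera only on a high-confidence UAS track;
    imagery is viewed in real time and never persisted."""
    if uas_probability(rf, speed, size) >= 0.7:
        frame = activate_camera()       # returns a live frame, processed in memory
        return "operator alerted", frame is not None
    return "ignored (likely bird or debris)", False

# Example: strong RF plus a plausible radar track trips the alert.
print(handle_detection(0.8, 12.0, 0.5, activate_camera=lambda: object()))
```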
Additionally, Federal Aviation Administration registration numbers on drones won’t be traced by the Secret Service but, instead, shared with the police for investigation, according to the report.
The Secret Service placed signage at local drone parks warning of the temporary flight restriction, and police will occasionally patrol those areas to enforce the ban.
The Centers of Excellence initiative’s next marks: CPSC and JAIC?
Editor’s Note: This story has been updated with information from GSA’s formal announcement of the GSA CoE-DOD JAIC partnership.
The General Services Administration’s Centers of Excellence team has its eyes set on its next customer — and the one after that, too.
The CoE team is in the early “discovery” stages of working with the Consumer Product Safety Commission, a CPSC spokesperson confirmed to FedScoop. Technology Transformation Service Director Anil Cheriyan first shared the news of the partnership at the White House AI summit earlier this month. “We just got started,” he said during his remarks.
“We’re working with the GSA COE on the development of an enterprise data strategy and implementation plan to support ongoing work in support of the CPSC’s commitment to evidence-based decision making and the importance of data and analytics in agency operations,” the CPSC spokesperson told FedScoop.
The CPSC — a small agency with a prolific and very weird Twitter account — is the CoE team’s fourth engagement.
The team recently acknowledged that it will be leaving the U.S. Department of Agriculture — its “lighthouse” agency — after about 18 months. A USDA spokesperson called the IT modernization partnership “a tremendous success.”
Centers of Excellence are also in operation at the Department of Housing and Urban Development and the Office of Personnel Management.
DOD’s JAIC on deck?
Cheriyan didn’t stop with this one announcement. He implied during his summit remarks, and then confirmed to FedScoop, that the CoEs are in “active discussions” with the Joint Artificial Intelligence Center (JAIC) at the Department of Defense. Lt. Gen. Jack Shanahan, who serves as JAIC’s director, alluded to the potential partnership as well.
When asked for further confirmation, a JAIC spokesperson told FedScoop that the conversations are in very early stages and it would be “premature” to comment further.
One day after the initial publication of this story, however, on Sept. 25, GSA formally announced the JAIC and CoE partnership in a press release.
“The new GSA-DoD partnership reflects the ongoing success of the Center of Excellence initiative,” Chris Liddell, White House Deputy Chief of Staff for Policy Coordination, said in a statement. “In alignment with the Administration’s strategy for ensuring American leadership in the industries of the future, the AI CoE program will build the capacity to deliver AI solutions throughout the federal government.”
Cheriyan additionally stated that TTS and CoE leadership are looking into creating a new center of excellence focused on artificial intelligence. The approach to this is a multi-phased one — starting with creating a community of practice around AI, then engaging agencies (such as JAIC) that may be interested in this capacity and finally launching a new CoE.
If this happens, it will be the first time the White House Office of American Innovation-initiated project has decided to grow its own jurisdiction. The five current CoEs — on IT Infrastructure Optimization, Cloud Adoption, Customer Experience, Data Analytics and Contact Center — were all launched at the same time.
After protest, open source software company Chef will let ICE contract expire
Open source software company Chef announced Monday it will not renew its contracts with Customs and Border Protection or Immigration and Customs Enforcement after protests that included one former employee deleting code on GitHub.
The backlash surrounds the company’s contracted work with the two Department of Homeland Security agencies and the recent family separation policy at the nation’s southern border.
“We began our work with the U.S. Government in earnest in 2014 and 2015,” CEO Barry Crist wrote in a public note to employees. “This included DHS and its various departments under a different set of circumstances than exists today. The overarching goal was to help them modernize their computing infrastructure and create a cooperative community of IT professionals inside the government that could share practices and approaches in a similar way to many open source communities. Policies such as family separation and detention did not yet exist.”
Given the government’s changing policies, and some “deep introspection” within the company, Crist went on, “we will not renew our current contracts with ICE and CBP when they expire over the next year.”
“Chef, as well as other companies, can take stronger positions against these policies that violate basic human rights,” Crist wrote. He added that the company is committed to donating money “equivalent to our 2019 revenues from these two contracts” to charities that support people impacted by family separation and detention.
All this came up after word of Chef’s $95,000 software development tools contract with ICE surfaced on Twitter. Shortly thereafter, former Chef employee Seth Vargo removed some of his code from GitHub, causing outages at Chef.
“As software engineers, we have to abide by some sort of moral compass,” Vargo told The Verge. “When I learned that my code was being used for purposes that I personally perceive as evil, I felt an obligation to prevent that.”
In the direct aftermath of this event, Chef stood by the contract. “I do not believe that it is appropriate, practical, or within our mission to examine specific government projects with the purpose of selecting which U.S. agencies we should or should not do business,” Crist wrote to employees Sept. 19.
Now, he’s changing his tune. The ICE contract at the center of this controversy expires in August 2020.
It’s a tension that has similarly gripped many in the federal contracting community. Where does supporting the mission of government (regardless of administration and politics) end and the practical enablement of unethical policy begin?
The trajectory of Chef’s position is reminiscent of when Google announced that it would not seek to renew its contract with Department of Defense AI initiative Project Maven. The decision came in 2018 after “thousands” of employees protested the contract via an internal letter, some resigned and many media outlets reported on the issue. The company also decided it would not bid on DOD’s $10 billion JEDI cloud contract for similar ethical reasons. Google employees have also protested the company’s work with CBP.
Will Congress finally require agencies to make data centers more energy efficient?
A bipartisan bill that would require federal agencies to make their data centers more energy-efficient passed the House and is scheduled for consideration by the Senate’s energy committee.
The House overwhelmingly passed the Energy Efficient Government Technology Act Sept. 9. Under the bill, each agency would be expected to work with the Office of Management and Budget, Department of Energy and Environmental Protection Agency on a strategy for procuring and maintaining energy-saving information technology at data centers.
The Senate Energy and Natural Resources Committee is expected to approve the bill during a markup on Wednesday.
“The government operates over 2,000 data centers to store everything from Social Security tax records to e-books at the Library of Congress, and the Department of Energy estimates that their energy usage could be slashed in half simply by implementing best practices and existing technologies,” Rep. Anna Eshoo, D-Calif., the bill’s sponsor, said in the announcement.
Best practices include advanced metering infrastructure, building-energy management and secure telework and travel substitution tools.
The legislation would see DOE create a program certifying energy practitioners to evaluate data center energy use and efficiency opportunities.
Information on data center energy use would also be made publicly available to encourage consolidation and optimization.
Previous versions of Eshoo’s bill passed the House in 2014, 2016 and 2017 but never made it through the Senate.
“The importance of data centers in the everyday lives of Americans often goes unnoticed, but the federal government certainly depends on these energy-consuming servers as use continues to grow,” Rep. Adam Kinzinger, R-Ill., a cosponsor of the bill, said in a statement.
The administration’s update to the Data Center Optimization Initiative policy this year includes some guidance for how agencies should “consider opportunities for investments that may yield long-term savings through energy efficiency.” However, it stops short of requiring advanced energy metering for agency data centers — such tools are “expected,” the policy says, but they can be costly and “it is not useful for agencies to install these tools in a facility they are planning on closing,” which is ultimately the policy’s goal.
Navy extends NGEN again for $657M
The Navy has once again extended the contract for its premier IT hardware and services program, the Next Generation Enterprise Network.
The latest extension adds at least four months to incumbent vendor Perspecta‘s current contract for the network services portion of the $3.5 billion NGEN, moving that chunk’s expiration date from June 30, 2020, to Sept. 30, 2020. There are also three additional option months that the Navy can add on to extend that part of the contract through the end of calendar year 2020.
In total, the contract extension could be worth up to $657 million, according to a Pentagon contract award announcement.
“Through NGEN, the Navy has long established itself as a technology leader among government agencies,” Mac Curtis, president and CEO of Perspecta, said in a statement. “We are proud of the innovative partnership we’ve built with them and look forward to putting bold new ideas to work in preparation for the next phase of the program.”
According to a notice on FedBizOpps this summer, the Navy also intended to extend the hardware portion of NGEN six months, from October 2019 to March 2020. It’s unclear if that’s lumped into this extension. In its own announcement, Perspecta described only the four-to-seven-month extension of IT services.
Under NGEN, Perspecta manages and provides services for what it describes as the world’s largest intranet — the Navy Marine Corps Intranet. NGEN was set to expire in September 2018, but the Navy issued an initial extension to Perspecta worth $787 million.
This comes as the Navy sets the course for the next generation of the massive contract, called NGEN-Recompete, or NGEN-R. That contract has been slow to materialize, however, hence the multiple extensions. This latest extension, according to the Navy, “will provide continuation of current services, including transition services, for the Navy and Marine Corps while the Next Generation Enterprise Network Re-compete (NGEN-R) family-of-contracts are procured.”
The Navy said in its summer notice that the two NGEN-R awards were estimated to be awarded in the fourth quarter of fiscal 2019 (which is only a handful of days away) and the second quarter of fiscal 2020, respectively.
Some USDA Centers of Excellence set to expire
Editor’s Note Oct. 4, 2019 — This story was updated with amended statements from USDA and GSA on the status of the CoEs.
While some of the Centers of Excellence at the U.S. Department of Agriculture are ready to close up shop, not all, it turns out, are quite ready to hang up their hats.
USDA and the General Services Administration, the agency that houses the CoE initiative, told FedScoop in a joint statement that some “workstreams” have been completed, while others will continue into the new fiscal year.
“The workstreams under the CoE will continue as we move forward to deliver enhanced customer service in this next phase of work,” the spokespersons said in the emailed statement. “The Centers of Excellence-USDA partnership remains strong; there is more work to do and we look forward to continuing to deliver results together for the American people.”
This statement amends USDA’s previous statement to FedScoop that the CoE teams would be departing USDA “between Sept. 30 and Oct. 15.” Now, USDA and GSA say that “three of the CoE workstreams were completed by the end of FY 2019.”
For the past 18 months, five distinct centers of excellence have been operating at USDA — one each focused on IT Infrastructure Optimization, Cloud Adoption, Customer Experience, Data Analytics and Contact Center.
The workstreams focused on data analytics, infrastructure optimization and cloud adoption have all wrapped up, according to the two agencies. The work focused on “implementing the ASK USDA Contact Center and the creation of ‘one front door’ for USDA customers,” meanwhile, will continue.
A USDA spokesperson previously told FedScoop that the agency would not be exercising the option years attached to its Phase II contracts. “The ‘option contracts,’ which as the title suggests are ‘optional,’ were assessed against our goals, and given the GSA and USDA teams collectively achieved or exceeded nearly everything it set out to do, the ‘option’ contracts weren’t deemed necessary to continue at this time,” the spokesperson told FedScoop. Again, this apparently holds for some of the contracts but not all.
Still, the completion of some of the work is a big milestone for the CoE initiative, which began its journey at USDA after being born out of the White House Office of American Innovation as a way to “drive enterprise-level change at the agency level.” In the months since, the CoEs have been invited into the Department of Housing and Urban Development and the Office of Personnel Management.
USDA’s Phase I work kicked off in April 2018. During this “exploratory” phase, contractor teams from four companies worked alongside GSA and USDA detailees on five teams organized around the effort’s five focus areas.
“It helps tremendously to have somebody with an objective point of view come in and help you,” USDA CIO Gary Washington told FedScoop in a conversation about the initiative in June 2018. “When you bring in a third party with an objective point of view that’s been there done that, it really helps change the thinking, you know, which helps change the culture.”
Then, in October 2018, GSA picked contractors for Phase II, the implementation phase. Again, contractors worked on blended teams with USDA staff and GSA staff to address each of the five “excellence” areas.
RPA is helping with FDA’s drug evaluation and research
The Center for Drug Evaluation and Research has seven robotic process automation projects in development as it works to free up staff for its core science mission.
An agency within the Food and Drug Administration, CDER ensures drugs on the market are safe and effective — regulating them throughout their lifecycle.
Many CDER employees have pharmaceutical science or medical degrees but find themselves performing repetitive, manual administrative tasks like arranging meetings.
“Some of the activity is done by staff, with very advanced degrees, that would rather not do these kinds of tasks,” Ranjit Thomas, CDER Informatics program management lead, told FedScoop.
The FDA is recognized in the RPA space for automating drug intake forms, as well as work within its chief financial officer’s office. But CDER has quietly put several RPA use cases into production enterprisewide.
In order to market drugs in the U.S. or make changes to that strategy, pharma sponsors must submit their plans to CDER, which receives thousands of applications a week, Thomas said.
Bots now ensure applications are complete before transcribing the information in them from PDFs into CDER’s system. Then the bots determine where to route new drug, investigation and master file submissions among hundreds of workflows so reviewers and project managers have their assignments.
When a pharmaceutical company wants a unique identifier, bots assign those numbers.
“This used to be a process done entirely by staff,” Thomas said.
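A simplified sketch of what such an intake-and-routing bot might do appears below: check a submission for required fields, then route a complete one to the appropriate review workflow. The field names, submission types, and queue names are hypothetical and do not reflect CDER’s actual systems.

```python
REQUIRED_FIELDS = {"sponsor", "submission_type", "product_name"}

# Hypothetical mapping of submission types to review workflows.
WORKFLOW_QUEUES = {
    "new-drug-application": "nda-review",
    "investigational": "ind-review",
    "master-file": "dmf-processing",
}

def triage(submission: dict) -> str:
    """Return the workflow queue for a complete submission,
    or flag it as incomplete if required fields are missing."""
    missing = REQUIRED_FIELDS - submission.keys()
    if missing:
        return f"returned-incomplete (missing: {', '.join(sorted(missing))})"
    return WORKFLOW_QUEUES.get(submission["submission_type"], "manual-review")

print(triage({"sponsor": "Acme Pharma", "submission_type": "investigational",
              "product_name": "AC-42"}))   # ind-review
print(triage({"sponsor": "Acme Pharma"}))  # returned-incomplete (...)
```

At the volumes Thomas describes — thousands of applications a week — even this kind of simple completeness check and routing decision adds up to significant staff time returned to scientific work.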
CDER estimates the seven new RPA projects in development will save 24,000 work hours annually, including those where bots schedule meetings and assign letters.
The agency has used RPA for a year with plans to apply bots to machine learning and natural language processing for applications in regulatory review, Thomas said. That would see bots data mining for insights and assisting with resource allocation and risk assessment frameworks.
In high-volume situations, bots would run the first triage of analytics on applications, Thomas said.
RPA does have its challenges because bots need authorities to operate (ATOs) within agency networks, which some agencies have gotten around by credentialing them like humans. That doesn’t sit well with every official.
Thomas said CDER is selective with its RPA projects to avoid such situations.
“There are certain use cases we come across where we’ve looked at bots but decided against using bots,” he said.
Archiving documents is a manual task, but CDER tracks who archives what in every instance. Bots “would not accurately represent the person archiving a document,” Thomas said.
In other words, a bot couldn’t be held accountable, so the process wasn’t automated.
Informatics program management works with FDA’s information technology security staff to handle ATOs, and the RPA off-the-shelf solution CDER uses was evaluated and granted an ATO, Thomas said.