TSA releases updated cloud strategy
The Transportation Security Administration wants to better manage data using cloud solutions, ultimately to improve passenger screening.
TSA released its Cloud Strategy 2.0 on FedBizOpps this week, adopting a “Cloud First” approach for all new information technology services and a “Cloud Smart” approach for existing applications.
The updated strategy outlines a hybrid cloud architecture in which sensitive data will be stored in a data center and transactional data will live in applications hosted in private and public clouds.
“TSA’s plans to implement Advanced Passenger Screening capabilities are dependent on the ability to collect and analyze large amounts of data,” reads the strategy. “Therefore, elasticity of storage and computing capability available through cloud solutions is essential to success.”
Having formed a Digital Services Team in 2018, TSA will next create a permanent Cloud Team to guide cloud migration and operations as a “center of excellence,” according to the strategy. The team will analyze apps and data to determine which ones should be retired, refactored, or re-hosted in the cloud.
The agency will first consider software-as-a-service (SaaS) solutions and then infrastructure- and platform-as-a-service alternatives. SaaS will be used for support services like email, with Salesforce the first solution to be implemented.
IaaS will be deployed for mission-unique apps used for vetting and intelligence and hosted in private clouds. Microsoft Azure solutions will be the first cloud solutions used with IaaS apps.
A combination of IaaS and PaaS will be employed for custom apps hosted in data centers and refactored or re-hosted in the government cloud.
TSA estimates migrating apps to the public cloud will “take considerable time and effort.”
The agency wants to “rapidly prototype” capabilities using its data, per the strategy. And only Federal Risk and Authorization Management Program-certified solutions that comply with the TSA Cloud Security Handbook and Enterprise Architecture Service and Cost models will be considered.
Are agencies like Energy funding quantum technology wisely?
Some technologists are questioning the government’s priorities as agencies like the Department of Energy invest tens of millions of dollars in quantum computing.
The National Quantum Initiative Act of 2018 allocated $1.2 billion to three areas: research at the National Science Foundation, standardization at the National Institute of Standards and Technology, and critical infrastructure protection at DOE.
But of the $30 million DOE’s Office of Cybersecurity, Energy Security, and Emergency Response awarded during the last research call, only $3 million went toward quantum key distribution (QKD) — a controversial, 30-year-old technology rooted in the laws of physics.
John Prisco, CEO of QKD company Quantum Xchange, would like to see more of those funds put toward a QKD proof of concept at DOE.
“You could protect critical infrastructure and their [supervisory control and data acquisition] systems using quantum keys,” Prisco told FedScoop.
QKD is arguably the most mature quantum technology to date, sharing secret keys made of light — photon by photon — across optical fiber networks. Each photon is encoded with a one or a zero and, taken as a bit sequence, used in cryptographic protocols.
More importantly from a security perspective, if a hacker attempts to intercept the key, they will disrupt its quantum state. This introduces errors, rendering the key useless and revealing the intrusion.
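That error-revealing property can be illustrated with a toy simulation of a BB84-style exchange (a hypothetical sketch for intuition, not Quantum Xchange's actual protocol): an eavesdropper who measures photons in randomly chosen bases and re-sends them corrupts roughly a quarter of the sifted key bits, while an undisturbed channel shows none.

```python
import random

def bb84_sample(n_bits=2000, eavesdrop=False, seed=42):
    """Toy BB84 sketch: Alice encodes bits in random bases, Bob measures in
    random bases, and they keep only the rounds where their bases matched."""
    rng = random.Random(seed)
    sifted_alice, sifted_bob = [], []
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        basis_a = rng.randint(0, 1)

        # An eavesdropper must measure to learn the bit; a wrong basis
        # guess randomizes the photon's state before she re-sends it.
        if eavesdrop:
            basis_e = rng.randint(0, 1)
            if basis_e != basis_a:
                bit_in_flight, basis_in_flight = rng.randint(0, 1), basis_e
            else:
                bit_in_flight, basis_in_flight = bit, basis_a
        else:
            bit_in_flight, basis_in_flight = bit, basis_a

        # Bob measures; mismatched bases again randomize the result.
        basis_b = rng.randint(0, 1)
        result = bit_in_flight if basis_b == basis_in_flight else rng.randint(0, 1)

        # Sifting: keep only rounds where Alice's and Bob's bases matched.
        if basis_a == basis_b:
            sifted_alice.append(bit)
            sifted_bob.append(result)

    errors = sum(a != b for a, b in zip(sifted_alice, sifted_bob))
    return errors / len(sifted_alice)

print(f"error rate, no eavesdropper:  {bb84_sample():.2%}")
print(f"error rate, with eavesdropper: {bb84_sample(eavesdrop=True):.2%}")
```

Comparing a sample of the sifted key over an authenticated channel exposes the elevated error rate, which is how interception "renders the key useless and reveals the intrusion."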
If quantum computers are an offensive weapon against cryptography, then quantum keys are a defensive weapon and currently “the only actionable system available” to protect communications channels like the one belonging to the FBI that Russia breached in 2010, Prisco said.
Post-quantum standards
Meanwhile, NIST continues to evaluate 26 candidate algorithms in the second round of its post-quantum cryptography standardization project.
Post-quantum cryptography — the creation of algorithms designed to resist cracking by quantum computing — was hailed as a “more practical and cost-effective” way to secure communications systems from quantum computer attacks in a 2016 white paper from the U.K.’s National Cyber Security Centre.
“QKD might be appropriate for certain high-security applications, but it is very unlikely to ever be widely deployed,” said Dustin Moody, a mathematician at NIST. “The type of crypto that will be deployed is exactly the field of post-quantum cryptography, which is what NIST is focusing on.”
For one thing, modern services typically rely on authentication mechanisms, like digital signatures, that QKD can’t replace. And QKD systems are short-range, point-to-point protocols that don’t integrate easily with the internet or mobile technologies, according to the paper.
NCSC also found QKD hardware “relatively expensive” to obtain and maintain.
“A number of attacks have been proposed and demonstrated on deployed QKD systems that subvert one or more of these hardware components, enabling the secret shared key to be recovered without triggering an alarm,” reads the paper. “Denial of service attacks that interfere with the paths carrying the QKD transmissions also seem potentially easier with QKD than with contemporary internet or mobile network technologies.”
Quantum Xchange boasts the first commercial QKD system, a 1,000-kilometer fiber network from Washington, D.C., to Boston and New York City into New Jersey, activated earlier this year. Clients range from large financial service providers and telecom companies to federal agencies, civilian and intelligence, who are in pilots, Prisco said.
Dark fiber — fiber laid during earlier projects but left unused, which cuts acquisition costs — has actually been easy to obtain, and Quantum Xchange has patents pending for technology that would reamplify the keys across unlimited distances, he added.
NIST plans to release draft post-quantum cryptography standards in 2022, and no later than 2024, identifying algorithms that will provide security against quantum computers for at least several decades, Moody said.
“We think there needs to be something done right now,” Prisco said.
A classical encryption key is the product of two large prime numbers. To crack it, you have to factor a large number into those two primes. The largest key ever cracked was 768 bits long, and 2,048-bit keys are standard now — which would take today’s computers a billion years to break, Prisco said. Quantum computers, like the two the U.S. is building, will speed that process up once they’re powerful enough.
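Scaled down to toy numbers, the factoring attack looks like the sketch below (an illustrative example with a textbook-sized RSA modulus; the point is that the `factor` step is the one whose cost explodes with key size on classical hardware, and the one Shor's algorithm would collapse on a large quantum computer):

```python
from math import isqrt

# Toy RSA key: in practice p and q are each hundreds of digits long.
p, q = 61, 53
n, e = p * q, 17  # public key: modulus n = 3233 and exponent e

def factor(n):
    """Trial division: recovers the primes, but only for tiny moduli."""
    for candidate in range(2, isqrt(n) + 1):
        if n % candidate == 0:
            return candidate, n // candidate
    raise ValueError("no nontrivial factor found")

p_found, q_found = factor(n)
phi = (p_found - 1) * (q_found - 1)
d = pow(e, -1, phi)  # private exponent rebuilt from the recovered primes

# With d in hand, an attacker can decrypt: c = m^e mod n, m = c^d mod n.
m = 42
c = pow(m, e, n)
assert pow(c, d, n) == m
print(f"factored {n} = {p_found} * {q_found}; recovered private exponent d = {d}")
```

The modular-inverse form `pow(e, -1, phi)` requires Python 3.8 or later; everything else is ordinary integer arithmetic.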
That’s why NIST is basing its forthcoming standards on the estimated availability of large-scale quantum computers, so if data needs to be kept confidential for five years, it will be protected with post-quantum cryptography five years before those computers are available.
Assuming the data China exfiltrated in the 2015 Office of Personnel Management breach was encrypted with public-key crypto, those files will be vulnerable to quantum computer attacks once the nation-state has the technology, Moody said.
“I don’t think they’ve read those yet because they were encrypted, and I don’t think they’ve broken the key yet,” Prisco said.
Instead, nefarious actors steal encrypted data and store it until they can crack it, he added.
But public-key crypto is usually only used to establish a shared key, after which the data is protected with a symmetric-key based algorithm using that key — which quantum computers will have a tougher time with, Moody said.
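That hybrid pattern, public-key math to establish a shared key and symmetric crypto for the bulk data, can be sketched with deliberately toy parameters (an illustration only, not a vetted implementation; real systems use standardized groups and authenticated ciphers like AES-GCM, and all names here are hypothetical):

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters: a small prime and generator, for
# illustration only. Shor's algorithm threatens this key-establishment
# step; the symmetric step below is only weakened, not broken, since
# Grover's search merely halves the effective key length.
P = 0xFFFFFFFFFFFFFFC5  # 2**64 - 59, the largest 64-bit prime
G = 5

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def keystream(shared_secret, length):
    """Derive a symmetric keystream from the shared secret (a hash-counter
    sketch standing in for a real KDF plus cipher)."""
    out, counter = b"", 0
    while len(out) < length:
        block = shared_secret.to_bytes(8, "big") + counter.to_bytes(4, "big")
        out += hashlib.sha256(block).digest()
        counter += 1
    return out[:length]

def xor(data, ks):
    return bytes(a ^ b for a, b in zip(data, ks))

# Public-key step: each side combines its private value with the
# other's public value and arrives at the same shared secret.
a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
secret_a = pow(b_pub, a_priv, P)
secret_b = pow(a_pub, b_priv, P)
assert secret_a == secret_b

# Symmetric step: the bulk traffic is protected with the derived key.
msg = b"bulk traffic protected symmetrically"
ct = xor(msg, keystream(secret_a, len(msg)))
assert xor(ct, keystream(secret_b, len(ct))) == msg
```

Breaking the recorded traffic later means recovering the shared secret from the public values, which is exactly the step a large quantum computer would make cheap.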
With China set to outspend the U.S. on quantum technologies, Prisco envisions quantum keys future-proofing data with a long shelf life and post-quantum cryptography’s software-based algorithms protecting classical computers.
“I would say the U.S. is ahead of China in the area of quantum computers. China is definitely ahead of the U.S. in quantum keys,” Prisco said. “This is a lot like the space race in the 1960s. The U.S. cannot afford to come in second on quantum keys.”
Pentagon testing waters for accreditor of contractor cybersecurity assessments
The Department of Defense will soon be on the hunt for a third-party accrediting organization to make sure contractors have met newly proposed cybersecurity standards.
DOD issued a request for information Thursday asking organizations interested in serving as the accreditation body to submit feedback on the “long-term implementation, functioning, sustainment, and growth” of the process. The program will be known as the Cybersecurity Maturity Model Certification (CMMC).
The department issued version 0.4 of the CMMC last month, giving contractors a glimpse into the sort of cybersecurity standards they must meet if they want to work on projects that handle controlled but unclassified information. Ultimately, CMMC is an effort to secure DOD’s extremely complicated and spiderwebbed IT supply chain from the largest contractors to the smallest.
The DOD estimates that 300,000 organizations will need to obtain the cybersecurity certification. The accreditation body will not directly perform those assessments; it will manage the third-party organizations that do that work.
The accrediting body must be a nonprofit that uses “revenue generated through dues, fees, partner relationships, conferences, etc.” to fund its work. There won’t be any other funding from the DOD, the RFI says. The relationship between DOD and the accreditor will be governed by a memorandum of understanding.
Interested parties have until Oct. 21 to submit feedback.
DOD plans to issue the final framework for the CMMC in January. Then, beginning in June 2020, all DOD requests for information will include the standards as a “go/no go” requirement, followed by inclusion in all requests for proposals in the fall that year.
Air Force bases embrace 5G to keep up with their planes
The Air Force has enabled fifth-generation wireless technology at 10 bases, with plans for 16 or 17 more next fiscal year, in part to keep pace with increasingly intelligent planes.
Planes now generate data from every possible feature, making it hard to transfer information off of them fast enough to run analytics while they’re in flight.
5G will allow the Air Force to quickly extract data off its planes and, using network slicing, route information for analysis or predictive maintenance, said Frank Konieczny, the branch’s chief technology officer.
“We want to get to the point where we can predict if a plane is going to have a problem and make sure that a part arrives, before the plane actually lands, to fix the problem,” said Konieczny at the General Services Administration 5G Symposium on Thursday.
GSA used the symposium to launch its 5G customer experience campaign, while the Advanced Technology Academic Research Center simultaneously started a new working group for the technology.
Ultimately, the Air Force wants to enable 5G at all 300-plus installations it manages, Konieczny said.
The Air Force already uses augmented reality in pilot training, and virtual reality will make it so pilots can avoid using training aircraft that are “getting older, older and older,” Konieczny said.
5G will allow VR wargames across multiple bases at once, he added.
The Air Force isn’t alone in its need for 5G.
“With government mobile data traffic expected to be five times higher by the end of 2024, we know that agencies have a business need to start testing how this next-generation wireless technology will help them meet their mission,” said Bill Zielinski, assistant commissioner for the IT category at GSA.
5G’s speeds will be a boon for expansive agencies like the Department of Agriculture, which has 4,500 remote offices, said Tom Suder, president of ATARC.
But those agencies will need to rethink network infrastructure and security as they enable 5G.
“There’s going to have to be a lot of interior building work done, but networks aren’t going to look the same,” Suder said.
New network architectures present new challenges, given the older technologies still in use at agencies and the Air Force.
One unsolved problem, according to Konieczny: “As we move to 5G, we have a lot of legacy LTE and everything else sitting out there. So how is the security of 5G going to be transferred through the LTE network that currently exists?”
Here’s who’s joining the 2020 class of Presidential Innovation Fellows
The new Presidential Innovation Fellows are here.
The General Services Administration, which houses the tech leadership-focused program, announced Thursday that 20 new fellows (PIFs, as they’re colloquially called) have officially started their journeys. Over the next year (or maybe longer), the group will work with nine agencies on 13 projects, ranging from helping agencies better leverage data to advising on customer experience and human-centered design practices.
Executive Director for the PIF Program Josh DiFrances called the incoming class “some of the nation’s best technologists.”
“Over the last seven years, fellows have worked to accelerate the adoption of novel technology within the federal government and save taxpayer dollars,” he said in a statement. “Our new cohort of fellows will continue to build upon this work to help agencies create better products, services, and experiences for the American public.”
Whole ecosystems are beginning to form around past PIFs, too, as they go on to hold traditional leadership positions within the federal government. For example, Charles Worthington, CTO at the Department of Veterans Affairs, is a former PIF; three fellows from this new cohort will be joining his office. Gil Alterovitz, the VA’s inaugural AI director, was also a PIF.
The new fellows and their projects, roughly grouped by agency, are:
- Irtaza Barlas — U.S. Department of Agriculture’s Farmers.gov
- Melissa Keene — Farmers.gov
- Devin Brande — U.S. Navy Digital Warfare Office
- Ken Kato — Navy Digital Warfare Office
- Minh H. Chau — Millennium Challenge Corporation
- George Chewning — VA Research AI Center
- Christopher Corpuel — VA Veterans Experience Office
- Scott Weiss — VA Veterans Experience Office
- C.C. Gong — VA Office of the Chief Technology Officer
- Kaeli Yuen — VA Office of the CTO
- Wanmei Ou — VA Office of the CTO
- Joshua Farrar — VA Office of Resolution Management
- Dennis Chornenky — U.S. Department of Transportation
- Ariele Faber — Centers for Medicare and Medicaid Services
- Michelle Holko — National Institutes of Health “All of Us” research program
- Gina Valo — U.S. Food and Drug Administration
- Nina Walia — FDA
- Angelo Frigo — GSA’s Office of Product and Programs
- Johnny Martin — GSA’s Login.gov
- Likhitha Patha — GSA’s Login.gov
The fellows will serve as “entrepreneurs-in-residence” at their respective agencies. The PIF program began in 2012 under President Obama — this is its seventh cohort.
Architect of the Capitol has weak physical access controls at data center
The Architect of the Capitol IT team needs to ensure that it has complete oversight over who is accessing its data centers.
This is the advice and directive of a recently published audit by the inspector general of the AOC — the agency under Congress responsible for the maintenance, operation, development and preservation of the buildings and land that make up Capitol Hill. The publicly released report details the IG’s work assessing how the Information Technology Division (ITD) controls the “physical integrity” of a data center located at a redacted location.
The IG concludes that while ITD has good policies and practices in place to deal with things like environmental control and system back-up, control over physical access to the data center was less clear.
For example, the IG found that of the 35 people who accessed the data center during the audit period, only 10 were approved by and assigned to ITD. The other 25, it turned out, were mechanics, U.S. Capitol Police officers, and others. The IG found “no identified concerns” about these individuals.
Still, the IG argues, ITD should know who these additional people are, as it has formal responsibility for the data center. “The ITD should have a process in place for proper authorization and/or coordination with [the Capitol Police] and other AOC jurisdictions to control physical access to the Data Center,” the report states.
“Without proper physical access controls… ITD’s sensitive network computer equipment and technology may be at risk for unauthorized access, theft, or tampering.”
In response to the suggestions made by the IG, the Architect of the Capitol has implemented new coordination procedures for non-ITD staff who may need to access the data center facility. These procedures were implemented last month and, as a result, the IG considers both of its recommendations “closed.”
Army Futures Command wants help growing commercial cloud data environment
Army Futures Command is interested in bringing in an industry partner to further develop its cloud infrastructure for enabling better data-driven decision-making across the service.
The command — launched in 2018 with the focus of modernizing the Army in six priority areas — issued a request for information this week to conduct market research on potential vendors to support its Unified Data Environment (UDE).
Futures Command describes the UDE as “an existing cloud based information technology infrastructure to host mission systems, applications, services, and data.” The commercial cloud environment “exists as an enterprise enabling platform that enables Army leaders at every echelon to make fully informed, data driven decisions, based on authoritative and/or production data sources,” the RFI says.
Army Futures Command wants an eventual vendor partner to do five things:
- Enable the migration of subordinate elements to a cloud environment by providing the necessary shared commons services.
- Support the migration of enterprise and mission applications currently hosted on Nonclassified Internet Protocol Router Network (NIPRNet), Secure Internet Protocol Router Network (SIPRNet), Defense Research and Engineering Network (DREN), and Secure Defense Research and Engineering Network (SDREN).
- Establish a centralized data warehouse containing curated, rationalized data.
- Establish an enterprise DevSecOps capability to support development and deployment of future applications drawing data from, and populating data to, the central warehouse.
- Establish an enterprise Data Science capability to support innovative analysis.
Interested vendors with a secret-level clearance can respond until Oct. 21.
Army Futures Command, based in Austin, Texas, will be a testbed of sorts for the service’s modernization. The Army will pilot part of its enterprise IT-as-a-service program at the command’s headquarters.
CIA’s Juliane Gallina details progress in first months as CIO
A new CTO. Wi-Fi in CIA buildings. A program to replace fax machines. And an acceleration of the intelligence community’s adoption of cloud.
These are some of the items new CIA CIO Juliane Gallina says she has already looked to address in her first few months on the job.
Speaking at the VMware Public Sector Innovation Summit, produced by FedScoop, Gallina detailed what she’s done in her first 140 days as chief information officer, a role she’s never held before. She said there are many “basic things we need to improve.”
Industry outreach and Gray Magic
As a longtime industry partner to the government before taking over as CIA CIO, Gallina said industry outreach “is really important” to the agency’s success. It’s particularly important for agents “to speak candidly and clearly to industry counterparts,” she said. “I think our mode of communication, the way we interface between industry and government, is absolutely critical to the success of innovation.”
To better facilitate that, the CIA has launched a program called Gray Magic, which Gallina said will replace its use of the fax machine. Yes, the CIA still uses fax machines.
“It’s a new network, it’s secure, and it’s designed specifically to allow industry partners to have their own secure, direct communications and collaboration with government to help us facilitate acquisitions and procurement. So please pay attention to this,” she said. “I’m really hoping this is going to help us facilitate the quality and the content of our communications with industry. We’re really dedicated to making it work.”
Full throttle to the cloud
While Gallina said the intelligence community is recognized as the government leader in cloud adoption, the CIA must continue to accelerate its adoption.
“We have to accelerate our cloud journey because it’s not good enough to be infrastructure-as-a-service. We have to embrace platform-as-a-service to really be able to bring in the technologies that are just absolutely standard across all of your companies,” she said, speaking directly to the members of industry attending the event. “So we are really focused … on a cloud journey that has a future that’s multi-cloud, hybrid cloud, multi-security-domain cloud and to give us the geographic presence all the way from cloud out through fog, edge, mist. We have to dominate in all those aspects of cloud, and our strategy reflects that, and our investment portfolio does as well.”
Langley gets Wi-Fi?
Gallina said the CIA is on the verge of bringing Wi-Fi to its buildings.
“The agency is going to use the 20-year-old standard — the 802.11 Wi-Fi protocol in our buildings. This is huge news,” she said, poking fun. “In all fairness to our agency, we do have some really unique security constraints that make it that slow going that we’re not going to just walk in with our cellphones, obviously.”
While it might not seem a major undertaking to introduce Wi-Fi at government buildings in 2019, Gallina said the CIA’s struggle to do so demonstrates the difficulty agencies can have moving things from pilots to production and across “the valley of death.”
“It may surprise some of you, we’ve had laptop pilots in some of our buildings for many years that have been extremely successful,” she said. “It may surprise you that some officers receive some disseminated intelligence on tablets every day. There is mobility in our buildings to some extent, but it’s always been essentially pilots.”
CIA has hired a CTO
A final tidbit of news, Gallina said she has hired a chief technology officer to join her office.
The new position will “essentially be my CIO whisperer,” she said. While Gallina didn’t name the person, she said “I’m planning on that individual helping us focus on what the disruptive future is going to be” at CIA. They will be “free to think about what things look like beyond and outside our normal investment portfolio.”
A roadmap for automating FedRAMP is coming
Federal officials are considering ways of automating risk assessments and security authorizations for cloud products and services, now that a recent call for public feedback has ended.
The Federal Risk and Authorization Management Program (FedRAMP) Management Office launched an Ideation Challenge in July seeking input from industry, academia and agencies on how to improve procedures.
In 2011, the Office of Management and Budget established FedRAMP to authorize and continuously monitor cloud service offerings across agencies. While some agencies have streamlined authorizations, chief information officers and cloud providers continue to complain the process takes months, if not years, and should be automated.
“With the resources that it takes to bring a product to market, that particular certification, it has to create an innovation gap for agencies,” Ranil Dassanayaka, a senior director at VMware, said Wednesday at the software company’s Public Sector Innovation Summit, produced by FedScoop. “[T]hey may have new tech they really want to use but may not have the appropriate certifications available at the time it’s important.”
The FedRAMP program management office is “very aware” of the authorization timelines and closed the Ideation Challenge within the last month — having received more than 60 responses, said Ashley Mahan, the program’s acting director.
Now the PMO is reviewing the results and evaluating which parts of its process — preparing providers for authorization, ensuring security requirements are met and continuous monitoring — can be made more efficient.
“We’re looking to see where we can make things simpler, where can we provide clearer guidance and where can we automate within those three stages,” Mahan said.
Mahan told FedScoop her office plans to create a fiscal 2020 roadmap starting with “quick wins” on the path to automation, though those wins aren’t being publicized yet.
“And we’re also naming more strategic line items, as well, that we can work toward in the future,” she said.
Meanwhile, agencies like the U.S. Marshals Service continue to perfect their procedures as well.
Christine Finnelle, chief technology officer at the law enforcement agency, said it creates its own five-year roadmaps by following National Institute of Standards and Technology guidelines and working with industry to identify emerging tech.
“Because it’s not just the new technology, but it’s also sometimes new ways that the users are consuming that technology that drives the difference in how you need to approach security,” Finnelle said.
Federal Acquisition Service commissioner resigns
Federal Acquisition Service Commissioner Alan Thomas announced Wednesday he is leaving the General Services Administration for the private sector, two days after the agency consolidated 24 acquisition schedules into one.
Thomas bid FAS employees goodbye in an email obtained by FedScoop. He touted the office’s accomplishments during his two-year tenure. FAS was operating at a $100 million loss in fiscal 2017 and will end fiscal 2019 “in the black” with $70 million in revenue, Thomas wrote.
“When is the best time to leave a job that is fun, rewarding, and where you get to work with great colleagues every day in service of the American people?” he said. “That’s a tough one to answer, but I am proud of our shared accomplishments and believe now is the right time for both FAS and me.”
On Monday, GSA completed the first phase of Multiple Award Schedules Consolidation, which aims to modernize acquisition for federal, state and local governments. MAS Consolidation is one of the Federal Marketplace Strategy’s four pillars, along with the Commercial Platforms initiative — the agency released a solicitation for e-marketplace portal providers Wednesday.
The latter endeavor will be taken up by Thomas’ successor, Senior Advisor Julie Dunne, along with improving management of catalog offerings and releasing a request for proposals for contract-writing capabilities in Q1 fiscal 2020. Dunne will start as acting commissioner of FAS on Oct. 15.
“FAS is well positioned to continue its good work in FY 2020 and I look forward to another successful year for the acquisition team and GSA,” said Emily Murphy, GSA administrator, in a second email obtained by FedScoop.