Watchdog to probe Thrift Savings Plan transition to new website
The Government Accountability Office is set to investigate the Thrift Savings Plan’s recent move to a new online platform following calls from lawmakers to conduct a probe.
In an Aug. 8 letter to lawmakers, Government Accountability Office (GAO) congressional relations leader A. Nicole Clowers said the agency would review an array of matters, including the award of the contract to revamp the user portal for the Thrift Savings Plan (TSP).
“GAO accepts your request as work that is within the scope of its authority. At the current time we anticipate that staff with the required skills will be available to initiate an engagement in about three months,” Clowers wrote. According to the watchdog, the investigation will be led by Nicholas H. Marinos, GAO’s managing director for information technology and cybersecurity.
News of the probe comes after House Democrats called for an investigation following a deluge of complaints from former federal employee constituents.
Following the website launch at the start of June, lawmakers received complaints from constituents experiencing difficulties accessing funds held within the TSP. Federal retirees at the time reported trouble accessing their accounts and said beneficiary information had not been fully transferred to the new digital platform.
As of Aug. 10, TSP has recorded 1,965,831 successful set-ups of the new MyAccount login system for the online platform, TSP spokesperson Kim Weaver told FedScoop. TSP has so far registered a total of 10,165,717 logins using the new system.
“The transition was definitely more difficult for some participants than we anticipated, however, we have been transacting business for millions of participants,” Weaver added.
Accenture Federal Services (AFS) was awarded a contract, known as Converge, in November 2020 to “reimagine retirement services for the digital age” and improve the customer experience for users of the TSP.
The TSP serves 6.1 million federal employees and members of the uniformed services, managing more than $644 billion in assets. The AFS contract has an 18-month transition period and four three-year option periods.
Following the June rollout, users reported a number of issues in relation to the online system, including being unable to access their accounts and being unable to reach the TSP telephone helpdesk, which is known as ThriftLine.
Some users had to wait weeks for a security code to be delivered by mail in order to access the new platform. They also reported only being able to track the historical growth of their portfolios back a few months, where previously they could go back multiple years.
A Hill source, speaking to FedScoop on the condition of anonymity, said the issues were exacerbated by wait times of up to nine hours for the ThriftLine helpdesk and by multiple reports of helpdesk calls disconnecting.
One former federal government employee, describing his experience to FedScoop on the condition of anonymity, said: “I realized there’d be a certain amount of chaos due to the new online procedures but after a month when I started calling to clarify the transfer procedures and the phone lines were still totally clogged I was like a bull flashed with a red cape.”
He added: “After being on hold for the better part of an hour the call was dropped. One call was dropped after the agent put me through the third degree to verify myself. The Thrift Plan is not the only one that does this but the constant interruption with useless information were almost unbearable.”
Weaver added that TSP deliberately did not transfer the beneficiary information of roughly 150,000 participants because of data quality issues, but said it had retained those participants’ prior beneficiary details and would have paid out funds under those designations if needed.
She added that the lack of historical information for users on the new system was not an error. “Participants have access in their ‘My Account’ to daily balances beginning on June 1, the date of transfer to the new recordkeeper. They also have access to year end balances back to 2010,” she added.
Weaver also said that TSP informed investors of the upcoming changes by email in April.
An AFS spokesperson referred FedScoop to the Federal Retirement Thrift Investment Board for comment.
Weaver discussed the TSP website issues on the Daily Scoop Podcast earlier this month.
NOAA looks to improve satellite data availability, manage services with industry
The National Oceanic and Atmospheric Administration seeks information on commercial space-based data relay capabilities that can improve satellite data availability and resilience.
NOAA’s National Environmental Satellite, Data and Information Service (NESDIS) needs a more agile, scalable information system to process data, and its Commercial Services Integrated Product Team identified space-based data relays (SBDRs) as an emerging technology that might fill that need.
The service manages the data gathered by the National Weather Service. Much like NASA, NESDIS plans to transition away from government-owned and -operated satellite services to those managed in partnership with industry, so it needs cost-effective commercial technologies like SBDR that will improve space-ground communications and mission operations.
“NESDIS is formulating plans for transition to a common services-based enterprise ground architecture that is both supportive of NOAA’s next-generation satellite observing systems and responsive to new and evolving threats and opportunities,” reads NOAA’s request for information (RFI).
The service wants SBDR capabilities that are either deployable within five years on satellite platforms now in development or that can provide out-of-the-box communications to on-orbit satellites in S-, X- or Ka-band. Those capabilities must be able to relay critical telemetry and commanding data, as well as higher-rate science mission data, to and from NOAA satellites.
NOAA uses geostationary (GEO) orbit satellites for earth-facing persistent observations, low-earth orbit (LEO) satellites for earth-facing global observations and both for space weather observations.
RFI responses are due by noon ET on Sept. 8, 2022, and respondents may request virtual one-on-one discussions in September or October.
Air Force develops new model for battle management to underpin requirements for ABMS
The Air Force has developed a model for how battle management should be conducted in the future — an initiative that officials see as a requirement for the service’s forthcoming Advanced Battle Management System architecture.
ABMS is the Air Force’s contribution to the Pentagon’s Joint All-Domain Command and Control (JADC2) concept, which seeks to connect sensors and shooters, and provide battlefield commanders with the right information to make faster — and better — decisions.
While industry has developed sophisticated technical designs and solutions for what it believes the Air Force needs for ABMS, officials told FedScoop that demonstrations of these capabilities relied on either flawed or legacy models of how to conduct battle management, which could lead to commanders and troops making bad decisions, albeit on a faster timeline.
Instead, to get to the right technical solution, officials with the JADC2 cross-functional team on the Air Force Futures staff had to essentially reimagine what battle management should be, creating the foundation for technical solutions that use advanced algorithms.
“If we do not understand the process of which to make a decision, there is no technology that you’re going to develop that’s going to suddenly turn this tide,” Brig. Gen. Jeff Valenzia, JADC2 cross-functional team lead for Air Force Futures, told FedScoop in a recent interview.
Col. Jon “Beep” Zall, a member of Valenzia’s team who was at the forefront of developing the model, pointed to the need to introduce innovative technologies and do things differently. He quoted automobile pioneer Henry Ford, who said: “If I had asked people what they wanted, they would have said faster horses.”
“The idea that if all we [in the Air Force] do is just incremental improvements to the ‘as is’ we won’t necessarily meet our goals and objectives for advanced battle management or for JADC2 writ large,” Zall said.
Personnel involved in modeling and simulation discovered that when they observed command and control at major commands, such as Pacific Air Forces and Air Forces in Europe–Air Forces Africa, they could characterize the process but couldn’t determine whether it was good.
“On acquisitions, when they’re thinking about the ABMS digital infrastructure, which will undergird many of the capabilities … as they develop the ‘to be,’ how do they know what the ‘to be’ ought to be?” Zall said. “What we realized through the demand for rigor from leadership [was] the need to understand the ‘as is’ from analysis and the ‘to be’ for acquisitions. What we postulate is there’s this third thing, I’ll just call it the ‘must do.’ That’s what our model represents.”
Using systems engineering, the team sought to define what command and control should look like going forward.
Everyone has a different view of battle management, which is to be expected, but the model will hopefully bring some uniformity and a common lexicon to the fore.
“What this model will hopefully do is it will converge our mental models of what battle management is, and then we have a common point of departure as we develop things,” Zall said. “We think one of the things that we hope to add value to the joint and mission partner conversation is a common mental model. That’s what we’re trying to achieve.”
Officials told FedScoop that, in their view, joint command and control doesn’t really exist now. This new model would be the first instantiation and provide the building blocks to disaggregate the functions — allowing for a precise understanding of how to do it and how to implement technical solutions to aid it.
“What this [model] allows us to do is to understand what it means to battle manage, because ultimately, what it delivers is requirements that are going to drive [tactics, techniques and procedures] modernization, the non-materiel modernization, and the technical modernization,” Valenzia said.
Through their work, the team devised 13 sub-functions for battle management that units can organize and place in certain organizations as best suits their missions.
The sub-functions include parsing orders and plans, facilitating coordination and collaboration, and improving situational understanding, among others.
“Today, what we do is we don’t understand these 13 [sub-functions]. It’s a black box. We park them all over the place and we don’t even know how the black boxes interact with each other,” Valenzia said. “This is what we’re trying to remove that veil and put that level of precision on.”
He said this model introduces a new way to generate requirements that translate into a digital exchange requirement design.
It also holds organizations and industry more accountable, Zall said. Instead of an organization interpreting how it wants to do battle management and developing a solution for it, there are now specific information exchange requirements for each function that spell out exactly how that process should be done.
The officials noted that this model hasn’t been formally adopted yet. Using the systems engineering model, they plan to model and simulate against it. They’ll also send it to industry to kick the tires and come back with ideas.
“Industry has been asking us questions that we haven’t been able to answer. This model lets us now answer those questions,” Valenzia said. “The goal is we create one disciplined approach that uses a common set of lexicon.”
Additionally, the model will be headed to U.S. Indo-Pacific Command soon to put it through its paces.
“We have to prove it still,” Valenzia said. “Our hypothesis is that what this model has become is the seed to create what we think will be the first instantiation of joint C2 … This is why we are partnered with the Navy initially, because the Indo-Pacom was our pacing theater” where Pentagon officials see China as the United States’ top military rival.
Air Force officials said they need buy-in from the other services and international partners. The UK, Canada, Australia, Japan and Germany have already said they’re interested in partnering on the model, the officials noted.
The plan is to brief what the team found to four-star officers at the Pentagon in January.
“We’re going to start writing an operational concept that then uses this model,” Valenzia said.
The vision is that this new model should be able to stand the test of time and be flexible with changes in doctrine.
“This model … also describes the battle management that people who aren’t even lieutenants yet will be executing in 2032 and beyond,” Zall said. “That’s our hypothesis.”
State Department acting CIO Glenn Miller to leave agency at end of 2022
The State Department’s acting chief information officer, Glenn Miller, will leave the agency at the end of the calendar year, according to two people familiar with the matter.
Miller, who became acting CIO in May after Keith Jones left the State Department, is a career Senior Foreign Service official who took on the role of principal deputy chief information officer in 2021 and previously served as deputy chief information officer for both operations and foreign operations.
As principal deputy CIO, Miller managed all IT operations for the agency, including cyber operations, business management and planning, operations and foreign operations.
The State Department’s IT budget in fiscal 2022 is $2.8 billion, with 41 major investments.
During his three decades with the State Department, Miller served in senior IT roles in Kabul, Belize City, Frankfurt, Moscow, Seoul, and Warsaw. Domestically he has served as office director and division chief in multiple Bureau of Information Resource Management offices.
Prior to joining the Foreign Service, Miller was an emergency dispatcher in California and is a U.S. Army Signal Corps veteran.
The State Department did not immediately respond to a request for comment.
State Department head of cloud programs Brian Merrick to leave post
Brian Merrick is set to leave his post as director of cloud programs at the State Department, FedScoop understands.
The IT leader steps down after three years in the role, but according to sources will remain in government service.
He has served at the State Department since 2008, when he joined the Bureau of Information Resource Management as deputy manager for the IT Cost Center Working Capital Fund.
Merrick’s previous roles at State include a period as director of the office of digital in the Bureau of International Information Programs, and as director of the Office of Innovative Infrastructure.
Before working at State, Merrick for a period was a financial management consultant and IT project manager at PwC, and before that served for nearly a decade as a commissioned officer in the Army.
During his tenure at State, the IT leader helped to oversee the agency’s transition to the cloud, including the initial implementation of a three-year data strategy launched in September last year.
Speaking at a FedScoop-hosted event earlier this year, Merrick said the speed of recent events has helped to make the case for wholesale organizational change and closer collaboration between sub-agencies at State.
“In today’s day and age we are facing unprecedented expectations: speed of delivery, speed of information, the need for data analytics to drive decisions at very senior levels. The need to operationalize activities very quickly,” he said.
No further details of Merrick’s next destination were immediately available.
The State Department did not immediately respond to a request for comment.
FDIC prioritizing internal modernization, says acting chief innovation officer
The Federal Deposit Insurance Corp. is taking an inward turn and will focus on how it can be better prepared for major technological changes in the financial sector, according to the agency’s acting chief innovation officer.
Speaking to FedScoop, Brian Whittaker said the Federal Deposit Insurance Corp. (FDIC) will reorient itself to become better prepared internally to supervise fintech companies, and other new tech entities in the financial services sector. The comments from the recently installed technology leader come as the agency moves ahead with the relaunch of its innovation lab.
“We now have less of an outward focus on policy and instead more about how do we modernize the FDIC to be prepared to receive cryptocurrency and adopt new technologies. We want to be familiar with the technologies so we’re not caught on the back foot when crypto and others hit in a bigger fashion,” Whittaker said during a Nava Public Benefit Corporation event on Thursday.
“Now our focus is on how do we improve the CIO’s capacity to deliver for business units. How do we test out blockchain ledger technology. How do we make sure FDIC is prepared for the direction that fintechs are going in financial services?” Whittaker added.
Whittaker, the former acting executive director of 18F, the digital services consulting group within the General Services Administration, took over as FDIC’s acting chief innovation officer in March.
Whittaker’s predecessor at the agency, Sultan Meghji, resigned from the post earlier this year and dismissed the FDIC as “hesitant and hostile” to technological change in a blistering op-ed published by Bloomberg News in February.
According to Meghji, he met resistance from staff in response to basic modernization efforts such as ending the use of fax machines and physical mail. In the op-ed, Meghji also criticized the knowledge and open-mindedness of staff.
Whittaker added that he plans to take his time getting to know and gaining the trust of staff within the agency before pushing for change in areas like robotic process automation (RPA), which would automate manual processes to free up capacity.
National Center for Health Statistics targeting fall launch of virtual data enclave
The National Center for Health Statistics is testing a virtual data enclave internally to make its sensitive data available to more researchers, with plans to onboard select pilot projects in the fall, according to its Research Data Center director.
Speaking at the Joint Statistical Meetings in Washington, Neil Russell said researchers will be able to use the virtual data enclave (VDE) from wherever they are to find and request data from NCHS.
The launch of the enclave represents a culture shift for a “fairly conservative” federal statistical agency, in response to the Foundations for Evidence-Based Policymaking Act of 2018 encouraging increased data releases, Russell said. NCHS — the Centers for Disease Control and Prevention center that tracks vital statistics on births, deaths, diseases and conditions to inform public health decisions — recognized that requiring researchers to travel to one of four secure research data centers (RDCs) or 32 federal statistical RDCs (FSRDCs) nationwide to access its restricted-use data was impractical.
“There is a definite financial hurdle to accessing restricted-use data through the physical data enclave model,” Russell said. “And we’re hopeful that a whole new group, or cohort, of researchers may be motivated to access the restricted-use data through a virtual data enclave.”
A researcher in New Mexico, which lacks any RDCs or FSRDCs, will no longer need to travel to Texas, Colorado or Utah to obtain the restricted-use data they need for their work. And NCHS, which sponsored the VDE, will not require background investigations of researchers.
RDCs closed at the height of the COVID-19 pandemic, but the VDE can operate 24/7 in theory.
The VDE is 99% built and Windows-based, with familiar software — namely SAS, so researchers can write code to generate outputs they then request from NCHS — to be customer friendly, Russell said.
NCHS’s sister agency, the National Institute for Occupational Safety and Health (NIOSH), already had an operational VDE, so NCHS didn’t require a contract. Instead, NCHS in September sent NIOSH its enclave requirements, designed for data scientists, along with payment that came out of CDC Data Modernization Initiative funds.
NIOSH had no way of handling non-CDC employees logging into the VDE, so the General Services Administration’s Login.gov service was used. Outside researchers must show their driver’s license to create an account, and NCHS conducts virtual inspections of their offsite locations.
NCHS further had NIOSH build a tracking system to create an audit trail for all data released.
NIOSH’s VDE already had an authority to operate at the Federal Information Security Management Act moderate level; encrypted researchers’ connections; required two-factor authentication; and prevented downloading, copy-pasting, printing and emailing of data.
To address the remaining risk of data exfiltration, NCHS requires researchers and, in some cases, their employers to sign data-use agreements specifying where they will access the data from via a secure server.
While NCHS can’t prevent violations of that agreement, such as a researcher taking a photo of their output prior to submitting it to NCHS for review, violators can be caught.
“I’ve seen journal articles produced through restricted use data that we didn’t know where they got it from; we know it happens,” Russell said. “Your access to the data will be terminated and your employer notified.”
Researchers still must pay a data access fee, and NCHS hasn’t calculated the true operational cost of the VDE just yet.
If more researchers seek VDE access than NCHS can handle, which seems likely, Russell will have to ask the CDC for additional funding to scale the environment.
“It is possible that the demand for this mode of access will outstrip our supply,” Russell said. “Currently I only have approval to stand up 10 virtual machines, which seems ridiculous.”
Air Force Research Lab seeks new algorithms to enhance space situational awareness
The Air Force Research Laboratory is looking for new machine learning and high-performance computing capabilities that could improve the U.S. military’s situational awareness of potential threats in the space domain.
AFRL, under a Broad Agency Announcement updated on Thursday, is soliciting white papers for the effort.
The focus of the BAA is to “research, develop, demonstrate, integrate, test and deliver innovative technologies associated with tasking, collection, processing, exploitation, analysis and dissemination of data and information in support of Space Situation Awareness (SSA), characterization, and assessment of space related events.”
Additionally, “the BAA will develop techniques that provide avenues to leverage new sensor technology, High-Performance Computational SSA, expertise and applications from the ground, orbital and cyber intelligence assessment perspectives to attain an integrated, predictive SSA perspective,” according to the announcement.
More specifically, AFRL is seeking new algorithms and applications for several technical areas including automated pattern learning and reasoning; anomaly detection and characterization and assessment of space events; multi-source data exploitation, analysis and fusion for “timely, accurate and complete characterization” of space objects; and high-fidelity tools for satellite modeling, classification and vulnerability assessment, among others.
Potential use cases envisioned for astrodynamics algorithms include tracking and data association; advanced orbit estimation and prediction; observation error characterization; track initiation with all space surveillance data types; satellite identification and recognition; and data and analysis required for modeling and simulation.
The lab is also looking for “state-of-the-art algorithms and capabilities for 3D modeling and training set development for AI inference engines and ML algorithms within [deep neural network] architectures,” according to the announcement.
The BAA also notes an interest in “state-of-the-art accelerators” for a high-performance computing (HPC) system that could inform upcoming mission needs or requirements for AI, machine learning and machine inference (MI) applications; radio frequency and synthetic aperture radar applications; and an enterprise class data storage architecture or capability to support an HPC system.
“Proposed technologies should address key gaps and shortfalls as identified by AFRL and other Department of Defense technology studies including capabilities for threat awareness, the ability to gather and fuse intelligence data with current and archived intelligence information, provide intelligence analysis tools and exploit space and terrestrial environment information,” the announcement said.
Total funding for this BAA is just under $100 million. About $23.5 million is expected to be obligated in fiscal 2023.
Procurement contracts, grants, cooperative agreements or Other Transaction agreements may be awarded through a competitive process, according to the announcement. Multiple awards are anticipated.
Notably, successful prototype projects that result from an Other Transaction for Prototype agreement could transition to a follow-on production contract or transaction.
The BAA is open until Sept. 30, 2023. However, to align with projected funding for fiscal 2023, the announcement recommends that respondents submit their white papers by Jan. 2, 2023.
White House CX adviser: Technologists need a voice at highest levels of federal agency decision-making
Technologists must be included at the highest level of federal agency decision-making to ensure American citizens get the best possible customer service, according to a White House official.
Noreen Hecmanczuk, digital experience adviser to the federal chief information officer, said Thursday that the Office of the Federal CIO is working hard to ensure staff with technology expertise have a seat at the table during C-suite strategy discussions.
“We’re trying to get technologists included in meetings that happen upstream because technology supports mission delivery,” she said. “[I]f you just have employees who don’t necessarily have that background, [who] are cracking ahead with a procurement or acquisition and the technologist is missing … that’s a missed opportunity to deliver a service that will delight the customer.”
Hecmanczuk explained that the federal CIO’s office, housed in the Office of Management and Budget, is “trying to get technologists in the room earlier and often. We want to focus on the C-suite — they also have a huge responsibility to protect the safety and security of the systems that we manage on behalf of the American people, and so technologists are hugely important to them.”
Speaking at a discussion panel hosted by Nava Public Benefit Corporation, Hecmanczuk cited a helpdesk automation project within the retirement service division at the Office of Personnel Management as an example of a best practice that resulted from close collaboration between technologists and senior management.
“We brought our technologists to the table with our business partners, so we worked together, carried out a six-week sprint, and in that amount of time realized that out of the 2.6 million customers, two-thirds of those customers could be served with a self-service tool,” she said. “We brought the right folks together at the table, and what we launched was OPM.gov/support.”
NIST post-quantum algorithm candidate’s future uncertain, with second attack proposed
Uncertainty surrounds a cracked post-quantum cryptography algorithm being considered by the National Institute of Standards and Technology, now that researchers have potentially discovered a second attack method.
NIST won’t make a final decision on whether to standardize any of its four Round 4 candidates, including the Supersingular Isogeny Key Encapsulation (SIKE), for 18 to 24 months, giving the team behind it time to find fixes if it chooses.
Researchers from the University of Bristol on Monday proposed an attack they say “significantly reduces” SIKE’s security by exploiting the torsion-point information included in its public keys.
The agency has already selected four quantum-resistant cryptographic algorithms for standardization to address concerns that foreign adversaries like China are developing quantum computers capable of breaking the public-key cryptography securing most federal systems. SIKE represents an alternative approach to general encryption, should the others prove vulnerable to quantum computers, but it was recently cracked by two researchers with a single-core computer in about an hour — albeit using complex math.
“NIST selected SIKE as a fourth-round candidate because it seemed promising, but it needed further study before we would have sufficient confidence to select it as a standard,” Lily Chen, Cryptographic Technology Group manager, told FedScoop. “The attack on SIKE, while not good news for SIKE, is a positive sign that the research community is taking this challenge seriously.”
Still, NIST will only accept “minor changes, not substantial redesigns” to SIKE, and for that reason it has already rejected algorithm changes proposed for previous candidates cracked in “major attacks,” Chen said.
NIST continues to study isogeny-based cryptography more generally and to work with the research community to analyze the weaknesses in SIKE revealed by the attack, which exploited the fact that its public key and ciphertext are based on an elliptic curve with publicly known properties and contain auxiliary information not always given by similar cryptosystems.
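In rough terms, a public key in the scheme underlying SIKE can be sketched as follows (a simplified illustration using standard notation from the public research literature, not drawn from NIST’s own documentation); the torsion-point images in the last line are the “auxiliary information” the attack leans on:

\[
p = 2^{e_A} 3^{e_B} - 1, \qquad E_0/\mathbb{F}_{p^2} \text{ supersingular}, \qquad E_0[2^{e_A}] = \langle P_A, Q_A \rangle, \quad E_0[3^{e_B}] = \langle P_B, Q_B \rangle
\]
\[
\text{secret key } s_A \;\Rightarrow\; \phi_A : E_0 \to E_A \text{ with } \ker \phi_A = \langle P_A + [s_A]\, Q_A \rangle
\]
\[
\text{public key: } \bigl( E_A,\ \phi_A(P_B),\ \phi_A(Q_B) \bigr)
\]

Because the starting curve and those torsion-point images are all public, an attacker learns more about the secret isogeny than comparable key-encapsulation schemes typically reveal about their private keys.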
The attack might be thwarted by a modification to the Supersingular Isogeny Diffie-Hellman (SIDH) protocol that would generate the elliptic curve while hiding its sensitive properties, Chen said.
Whether that’s a useful fix remains unclear, because two more researchers on Monday proposed an attack algorithm that they say would succeed on a regular computer without requiring publicly known curve properties, David Jao, SIKE’s principal submitter, told FedScoop.
“If the research community remains unsettled with no consensus, then that in and of itself would indicate that SIKE is not ready to proceed to standardization,” Jao said. “There is nothing wrong with this outcome, and in fact it may even be the most exciting scenario from a research standpoint. But standardization requires stability.”
The SIKE team has only exchanged “one round of emails” with NIST on an initial fix, he added.
NIST’s confidence in its algorithm candidates depends on the cryptography research community taking its post-quantum cryptography (PQC) challenge seriously, Chen said.
“The NIST PQC standardization process depends on this type of community involvement, and we have expected cryptanalysis results such as this during the process,” she said. “This is not the only such result released so far, and we will handle it in the same way as we have before.”