Former Democratic staffer and cyber policy expert lands in DOD leadership
Mieke Eoyang, a former senior congressional aide, will be a top cyber policy leader at the Department of Defense.
Eoyang tweeted Monday that she will be the deputy assistant secretary of Defense (DASD) for cyber policy, a high-ranking civilian position responsible for setting security policy for DOD networks; strategizing, implementing and budgeting for the cyber needs of the military; and working with international partners on legal agreements.
The DASD position falls under the undersecretary of Defense for policy, a position former Obama administration senior adviser Colin Kahl has been nominated for. One of the biggest DOD leadership positions with a cybersecurity portfolio — the chief information officer — remains unfilled for now.
“Delighted to be joining @DeptofDefense! … Use strong passwords and multi-factor authentication everyone!” Eoyang tweeted Monday.
She worked for several Democratic lawmakers, including as a defense policy adviser to Sen. Edward Kennedy of Massachusetts as well as Rep. Anna Eshoo, who represents key parts of Silicon Valley. She also served as a senior staffer on congressional committees and has been a commentator on MSNBC.
Since leaving Capitol Hill, she has worked at the “center-left” think tank Third Way on defense and cybersecurity matters.
Department of Labor gets a chief innovation officer
Chike Aguh is the new chief innovation officer of the Department of Labor, a position left vacant during the Trump administration.
Aguh took the job late last month, reporting directly to the deputy secretary. The office sets the department’s research and development agenda for open government, digital products and the introduction of new technologies intended to improve the workforce system, agency customer service and data sharing.
Aguh previously served as the inaugural head of economic mobility pathways at the nonprofit Education Design Lab, where he launched a multimillion-dollar effort to make community colleges avenues to high-growth fields for thousands of students.
“He is a class act, is incredibly well respected and is full of phenomenal ideas that will help propel labor to the next level,” wrote Xavier Hughes, the department’s chief innovation officer under President Barack Obama, on LinkedIn. “I can’t wait to see what you and the new team at Labor will accomplish.”
Aguh also focused on the future of work and racial equity as a technology and human rights fellow at the Harvard Carr Center for Human Rights Policy, and on workforce technologies at New Markets Venture Partners. He was a member of the Council on Foreign Relations’ Future of Work Taskforce.
Aguh holds degrees from Tufts University, the Harvard Graduate School of Education, the Harvard Kennedy School of Government, and the University of Pennsylvania’s Wharton School.
“Honored and humbled to be sworn in today as Chief Innovation Officer at the US Dept. of Labor,” Aguh tweeted the day of his appointment. “Eager to get to work on the most important task we have: creating a future of work that includes and dignifies all of us.”
For another top technology job, the department is sticking with Gundeep Ahluwalia as chief information officer — a role he’s held since October 2016.
VA piloting 5G network across Seattle health care system
The Department of Veterans Affairs is piloting a new 5G wireless network across a Seattle-area health care system in partnership with AT&T, the company announced Tuesday.
The network is powered by a new indoor distributed antenna system (DAS) and multi-access edge computing (MEC) that the VA hopes will boost speeds and decrease latency.
The Seattle-area pilot builds on other “5G hospital” experiments, like one at a Silicon Valley veterans hospital, to help enable the use of augmented reality (AR) and virtual reality (VR) tools. This pilot differs, though, as the first deployment of 5G across a larger VA hospital system, with multiple facilities linked by the enhanced network.
“This 5G system allows for increased carrier speeds and provides the next generation of cellular and mobility technology for both veterans and staff,” Daniel Mesimer, director of WAN/LAN Infrastructure Engineering and Provisioning Solution Delivery at VA, said. “It sets the groundwork for future capabilities of mobility networks for VA users and applications.”
Under this new pilot, the VA wants to further test the ability of the new technology to one day help connect rural hospitals to major hubs via 5G networks. The hospital system in Seattle includes critical care and training facilities, along with a host of other centers like an eye clinic, mental health center and a pharmacy.
“This public-private partnership to test our 5G and MEC capabilities is distinguished by the scope of implementation and innovation it allows, going well beyond a single-room care environment to encompass an entire medical care and training campus,” said Chris Smith, vice president of civilian and shared services at AT&T public sector.
The VA also chose Washington as the state in which to launch its new electronic health record modernization program.
JAIC inks deal for prototype of new AI acquisition system
The Department of Defense’s artificial intelligence hub recently signed its first prototype contract to build an acquisition portal to more easily work with private companies.
The Joint AI Center signed an other transaction agreement (OTA) on Jan. 20 with the nonprofit Indiana Innovation Institute to set up and manage a prototype of an agile acquisition portal it’s calling “Tradewind.”
Once up and running, the portal will allow the JAIC to communicate directly with prospective vendors and stay in touch with those already on contract, something the center currently cannot easily do. The JAIC will host Tradewind directly on its AI.mil website, with the hope that it grows into a project management suite, like a government Asana for AI projects.
“Tradewind will provide a user-friendly framework for our private sector partners to work more efficiently with the DoD to scale and implement AI for the warfighter and consumers across the military,” William Roberts, JAIC chief of acquisition, said in a release Monday. “We want to learn from this initiative to improve the way DoD works with all types of private sector and academic partners, and inject the much needed speed and agility necessary to scale artificial intelligence and transform the Department.”
Tradewind was announced during the JAIC’s AI symposium in September. The concept came together based on feedback from industry and a desire to have a novel acquisition and communication approach that allows for the quick iteration and agile development necessary for AI. Separate from Tradewind, the DOD is also working on contract vehicles specific to AI development.
“One of the key aspects of it is transparency,” Nand Mulchandani, JAIC CTO and former acting director, said of the portal during the symposium. “As a part of Tradewind, we will be building an online portal that will allow industry partners to create self-services and to interact with the DOD and the JAIC.”
While the initial OTA covers only a prototype, the JAIC plans to add more capabilities to the online portal in the coming months.
DHS wants help with automating electronic invoicing
The Department of Homeland Security is looking for ways to automate electronic invoicing so it can process the more than 200,000 submissions it handles annually in accordance with federal requirements.
DHS wants information technology support that will help its Financial Systems Modernization (FSM) Joint Program Management Office (JPMO) implement products that properly receive electronic submissions and manage data, according to a sources sought synopsis issued Friday.
The current Kofax MarkView platform doesn’t rely on machine learning or artificial intelligence technologies that the Financial Crimes Enforcement Network (FINCEN) bureau within the Treasury Department mandates. FINCEN sets the rules governmentwide for safeguarding financial systems from fraud and abuse, as well as sharing data for law enforcement purposes.
“The objective of this requirement is to update and comply with federal law and regulations to deliver high quality financial services at the lowest price, and automation is playing an increasing role in the future of FINCEN operations,” reads the synopsis from DHS. “Utilizing current technologies such as machine learning, artificial intelligence, as well as other technologies will enable processing volumes transactions in a more efficient manner.”
The invoices in question are all of the paper and electronic submissions, from vendors and from within government, across all DHS headquarters programs and agencies.
JPMO coordinates all efforts supporting the FSM program, which DHS launched in 2011 to reduce the number of IT systems its agencies use.
While the type of solicitation hasn’t been decided, DHS is planning a one-year contract with a one-year option covering software engineering, testing, business process re-engineering, change management and IT portfolio management.
Contract support will be expected to complete five tasks (a rough sketch of how the first three might fit together follows the list):
- Data capture and optical character recognition for scanned invoices.
- Automated workflow and robotic process automation for routing invoices, sending approval and rejection notifications, masking sensitive data like banking information, and optimizing based on learned human behavior.
- Process analytics and a dashboard for monitoring business process and invoice statuses, as well as finding anomalies.
- Test and development environments.
- Training of initial user groups.
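To make the first three tasks concrete, below is a minimal, hypothetical Python sketch of that kind of invoice pipeline. It is not DHS’s planned system: the capture step is stubbed where a real deployment would call a commercial OCR or RPA product, and the account-number pattern and anomaly rule are illustrative assumptions.

```python
# Illustrative sketch (not DHS's actual system): a minimal invoice pipeline
# covering pieces of the first three synopsis tasks -- data capture, masking
# of sensitive banking data, and simple anomaly flagging. All names here are
# hypothetical.
import re
import statistics
from dataclasses import dataclass

@dataclass
class Invoice:
    vendor: str
    text: str      # raw text as returned by a capture/OCR step (stubbed here)
    amount: float

# Task 1 (stub): a real system would invoke an OCR engine on the scanned image.
def capture(raw_scan: bytes) -> str:
    return raw_scan.decode("utf-8", errors="ignore")

# Task 2: mask banking details before routing the invoice for approval.
ACCOUNT_RE = re.compile(r"\b\d{8,17}\b")  # crude account-number pattern

def mask_sensitive(text: str) -> str:
    return ACCOUNT_RE.sub(lambda m: "*" * len(m.group()), text)

# Task 3: flag invoices whose amounts deviate sharply from a vendor's history.
def is_anomalous(history: list[float], invoice: Invoice, z: float = 3.0) -> bool:
    if len(history) < 2:
        return False
    mean, stdev = statistics.mean(history), statistics.stdev(history)
    return stdev > 0 and abs(invoice.amount - mean) / stdev > z

inv = Invoice("Acme", capture(b"Pay to account 123456789012"), 48_000.0)
print(mask_sensitive(inv.text))                       # Pay to account ************
print(is_anomalous([1_200.0, 1_150.0, 1_300.0], inv)) # True: far outside history
```

Even a toy version like this shows why the synopsis pairs workflow automation with analytics: masking and anomaly detection both depend on structured data coming out of the capture step.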
Businesses that believe they can deliver the solution sought have until Feb. 8 to respond to the synopsis.
AWS cloud now available on milCloud 2.0
General Dynamics Information Technology has added Amazon Web Services cloud services to the Department of Defense’s milCloud 2.0.
Now, AWS is the only other commercial cloud service provided under the GDIT-operated milCloud 2.0 — a suite of hybrid cloud services housed under the Defense Information Systems Agency.
GDIT added AWS to provide defense customers “leading cloud services” and to boost the availability of emerging technologies, like artificial intelligence and machine learning, for use on DOD data.
MilCloud is a DOD-specific cloud service that DISA has run for the military services and Fourth Estate agencies since 2015. With AWS now onboard, the partnership expands AWS’s footprint in the defense market and gives agencies a new option for migrating to the public cloud under milCloud 2.0’s pay-as-you-go model. Prior to this, milCloud 2.0 was largely an on-premises, private cloud offering.
“Through this collaboration between AWS and GDIT, DoD customers can access leading cloud services from AWS in areas such as compute, storage, database, networking, analytics, machine learning, migration, security, and more,” Dave Levy, vice president of U.S. government, nonprofit and healthcare at AWS, said in a statement.
MilCloud 2.0 is one of the DOD’s many cloud vehicles that agencies are using to migrate away from on-premises computing and data storage. AWS has been an aggressive pursuer of government cloud contracts, especially the DOD’s tactical edge-focused Joint Enterprise Defense Infrastructure (JEDI) cloud contract, which it lost to Microsoft but is protesting.
The milCloud partnership between AWS and GDIT gives defense agencies “another mechanism” to move highly sensitive data to the cloud securely, backed by Amazon’s industry-leading security accreditations.
“This aligns precisely with the DoD Hybrid Cloud Strategy and actually simplifies acquisition and choice for DoD and DISA mission partners,” said Leigh Palmer, senior vice president of GDIT’s defense division.
Census Bureau makes substantial updates to COVID-19 data hub
The Census Bureau has made substantial updates to its interactive COVID-19 data hub to help better represent the evolving pandemic’s effects on communities and businesses.
Version 2.1 launched this week with new data from the bureau’s County Business Patterns (CBP) data series on payrolls and employment, as well as its latest Non-Employer Statistics, which cover businesses that have no employees.
Those sets and others are geared toward improving how the COVID-19 hub tracks vaccine distribution, said Andrew Hait, an economist with the Census Bureau, during an event Thursday. Employment data is an important piece of the puzzle, he said.
“We know that some additional statistics might be useful to help understand the distribution of vaccines across the nation,” Hait said. “For example, we know already that … occupation is likely going to play a role in sending out vaccines to Americans.”
Various Census Bureau surveys already gather data related to the spread of COVID-19. Like other federal agencies interested in the response to the pandemic, the bureau is entering a new phase of data collection and analysis as vaccine distribution ramps up nationwide.
The bureau recognized at the pandemic’s outset that it needed new programs to measure the coronavirus’ impact and began by reorienting its Business Formation Statistics (BFS), launched in 2018, around pandemic data. New business formations declined at the start of the pandemic but began to rebound in the second half of 2020, Hait said.
BFS data updates are moving from weekly to monthly in January, as the bureau prioritizes more pressing pandemic-related statistics.
The second and third programs with hub dashboards started by the bureau were the weekly Small Business and Household “pulse” surveys gauging the pandemic’s impact on various economic sectors and families, respectively. The former naturally validated the large, negative impact the pandemic has had on the accommodations and food services sector, Hait said.
The Small Business Pulse Survey continues to be revised as the bureau changes its questions to reflect a new stage in pandemic response.
“We started adding new questions that made more sense as the pandemic was evolving,” Hait said. “That will continue to happen as we move forward, so users can make sure that we’re getting data out to you that can help you understand how businesses are not only responding to but hopefully recovering from the pandemic.”
The fourth new bureau program was the Community Resilience Estimates (CRE), which use existing American Community Survey (ACS) data as a baseline to see how states and counties currently fare across 11 risk factors. The fewer risk factors a state or county has, the higher its potential resilience.
CRE is the program most likely to continue after the pandemic ends to estimate resilience with respect to other disasters like hurricanes, Hait said.
The bureau continues to update its hub with the latest data from those four programs, and with its mid-December release of Version 2.0 added data from two new ones: Monthly State Sales Tax Collections and Monthly Retail Sales. Both aspects of the economy have taken a hit during the pandemic and represent new avenues for evaluating its impact.
As for what’s coming, ACS 2019 data will soon be added to the hub, presenting an opportunity for even more variables tied to how different industries and occupations are recovering — provided the pandemic is under control by then, Hait said.
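For readers who want to explore the underlying series themselves, the bureau publishes much of this data through its public API at api.census.gov. As a hedged illustration, the Python sketch below pulls ACS 2019 5-year population estimates for Washington counties; the endpoint and the variable B01003_001E (total population) follow the API’s standard conventions and are not part of the COVID-19 hub itself.

```python
# Illustrative sketch: fetching ACS 2019 5-year estimates from the Census
# Bureau's public data API (api.census.gov). This queries the standard ACS
# dataset, not the COVID-19 hub's own interface.
import requests

URL = "https://api.census.gov/data/2019/acs/acs5"
params = {
    "get": "NAME,B01003_001E",   # county name and total population
    "for": "county:*",           # every county ...
    "in": "state:53",            # ... in Washington state (FIPS 53)
}

rows = requests.get(URL, params=params, timeout=30).json()
header, data = rows[0], rows[1:]  # first row is the column header
# Each data row is [name, population, state FIPS, county FIPS]
for name, pop, state, county in sorted(data, key=lambda r: -int(r[1]))[:5]:
    print(f"{name}: {int(pop):,}")
```

The same query pattern applies to other ACS tables, such as industry and occupation variables, once the 2019 data is added to the hub.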
Navy sails into DevSecOps with new program, task force
The Navy is working to adopt new secure coding tools and practices following the lead of other services in using DevSecOps.
To focus its efforts on DevSecOps — an agile coding methodology that aims to bake security into software early on in development — the Navy launched a new platform called Black Pearl and recently established a task force for DevSecOps implementation.
After reorganizing its top-ranking cybersecurity and IT officials, the Navy is now trying to inject security into the base level of its digital operations with DevSecOps. The Navy has struggled with cybersecurity, both in its own ranks and with its contractors, and DevSecOps is one route it has taken to try to address those persistent challenges.
Black Pearl hosts a set of products and software practices that give Navy coders both the ability and the resources to build secure products, according to its website. The program is similar to the Air Force’s Platform One and even uses some of its products.
Similarly, the implementation task force was stood up to ensure that, as the Navy adopts new software practices, it does not redundantly or inefficiently repeat what other services have already accomplished, according to its establishing memorandum.
Black Pearl had a “soft launch” in the fall but was not widely publicized. One of its founders, Ken Kato, spoke about the program in a closed webinar that was later published online on Jan. 25. Kato, a presidential innovation fellow, was also instrumental in launching the Air Force’s coding factory Kessel Run.
The idea behind the secure platform is to house software development products and repositories for open-source code development, and to train the digital workforce in DevSecOps. By putting all code through a common security process, software achieves a continuous Authority To Operate (ATO), removing the need for time-consuming compliance checks on each application individually.
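As a rough illustration of that continuous-ATO idea (a hypothetical sketch, not Black Pearl’s actual pipeline), the Python below routes every application through one shared set of automated security gates, so passing the pipeline is itself the compliance check:

```python
# Minimal sketch of the continuous-ATO idea (hypothetical, not Black Pearl's
# actual pipeline): every application flows through one shared set of
# automated security gates defined by the platform, not by each project.
from typing import Callable

# Each gate is a named check returning True on pass. Real pipelines would
# invoke static analysis, dependency audits and container image scans here.
Gate = Callable[[str], bool]

def no_hardcoded_secrets(source: str) -> bool:
    # toy rule standing in for a real secret scanner
    return "password=" not in source.lower()

def pinned_dependencies(requirements: str) -> bool:
    # toy rule: require exact version pins in the requirements text
    return all("==" in line for line in requirements.splitlines() if line.strip())

SECURITY_GATES: dict[str, Gate] = {
    "secret-scan": no_hardcoded_secrets,
    "dependency-pinning": pinned_dependencies,
}

def run_pipeline(app_source: str, requirements: str) -> bool:
    """Apply the same gates to every application; fail fast on any miss."""
    inputs = {"secret-scan": app_source, "dependency-pinning": requirements}
    for name, gate in SECURITY_GATES.items():
        if not gate(inputs[name]):
            print(f"FAILED gate: {name}")
            return False
    print("All gates passed: build inherits the platform's continuous ATO")
    return True

run_pipeline("print('hello fleet')", "requests==2.25.1\ncryptography==3.3.1")
```

The design point is that the gates live in the platform rather than in each project, which is the property that lets a platform grant a continuous ATO instead of re-authorizing every application separately.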
Some of Black Pearl’s offerings Kato noted in the webinar are “Party Barge,” a shared development environment; “Lighthouse,” a platform-as-a-service baseline; and the “Software Practice,” which is the training hub for coders to become familiar with the new security coding methods.
“We are here to help educate your team so your team can grow,” Kato said during the webinar with other Navy members.
Black Pearl partners with Platform One on products like Iron Bank, a repository of software container images, and Repo One, a place for source code.
Nicolas Chaillan, the Air Force’s chief software officer and leader of Platform One, commended the Navy’s new platform during the webinar.
The DevSecOps task force, led by Navy CTO Jane Rathbun, is meant to create an overarching framework for how the Navy will approach the practice and where it will be implemented. While the memo doesn’t mention Black Pearl directly, the platform will likely play a large part in achieving the task force’s goals.
Unlocking the power of the cloud
Kristie Grinnell is GDIT’s global chief information officer and vice president for supply chain. She leads the company’s enterprise IT strategy and works with federal agencies to support their digital modernization.

Challenges over the past year have forced federal agencies to reconsider their strategic IT road maps. But there’s a larger reason for adjusting course: Agencies are reaching an inflection point at which it’s become critical for federal officials to focus more fully on their hybrid cloud journey.
Those who’ve already made the leap to the cloud have learned, perhaps the hard way, that each of the major cloud service providers has its own unique strengths and operating requirements. But to fully leverage the power and capabilities of the cloud, it’s crucial that agencies and their components take a step back to make sure they truly understand their data strategies first — and then determine which cloud is best suited to their needs, and why.
It’s equally important that agencies understand what they’re going to be doing in the cloud once they get there — and, longer term, how they plan to integrate the outcomes back into their operations. If you don’t make those decisions properly up front, chances are, your data may very well end up in the wrong cloud.
As the global CIO at GDIT — a company with a long history of working with government agencies — I’ve experienced those lessons firsthand as we went through our own cloud journey. And I’ve come to appreciate that they’re particularly important for federal agencies.
Today, GDIT uses a combination of cloud service providers as well as our own, significantly consolidated, on-prem data centers to meet our infrastructure and platform needs. There are several reasons we opted to work with multiple cloud providers.
One basic reason was to prevent vendor lock-in. But more importantly, it was to make sure we could manage our data in the most appropriate environments. And of course, having the right level of security has always been a paramount concern for our agency customers, as is how our thousands of employees and engineers interact with agencies and that data every day.
That journey began with a playbook for systematically reviewing our data requirements and determining which cloud services were best suited to our needs and our customers’ needs, with a heavy emphasis on security assurances. We then tested direct connections to the respective cloud providers and developed containers to see what worked best. Finally, we moved full speed ahead to take advantage of all that cloud has to offer, including AI and ML, automation, scalability and greater analytic capabilities.
Starting with the right questions
What we learned along the way, though, is that the technical hurdles of moving to different clouds are only part of the challenge. A critical question becomes: Do you have sufficient talent to get you there, and the right mix of people who understand what’s involved, how different clouds work and the gaps in between?
One reason federal agencies continue to turn to GDIT, for instance, is the breadth and depth of experience among our 14,000-plus cleared professionals — including more than 3,000 cloud specialists supporting federal, state and local government. Our teams have helped set up more than 60,000 cloud instances across multiple cloud service providers, including AWS, Microsoft Azure, Google Cloud, IBM and Oracle, and we currently work with more than 50 cloud partners. With every engagement, we begin by asking: “What type of data do you have? How sensitive is it? What do you want to do with it when you get there? Who needs access to it?”
The other big challenge is the organizational change required — helping people to understand the rationale for change and their role in the future state. We know because we’ve got scars on our back from our own cloud journey — as well as through our experience partnering with agencies as they went through similar transitions.
Building on experience
Collectively, we’ve learned the more scale you have, the more automation you need — whether you’re modernizing on-prem or in the cloud — and that can shift the roles and workloads throughout an organization. That’s where the benefits of experience can help.
At the Centers for Medicare and Medicaid Services (CMS), which operates one of the largest public clouds in the federal government, we have provided cloud services for the past five years and recently won a new contract for up to four additional years. GDIT is standardizing multi-cloud environments for CMS and will implement a range of cloud services, training, tools and software while simplifying the financial management of cloud services.
In another example, GDIT helped the Defense Department establish a fit-for-purpose, self-service commercial cloud service within DOD’s network and facilities. Because it provides critical security controls, including IL-5 security standards (IL-6 coming soon), plus the ability to procure cloud services within hours rather than the traditional 16-to-18-month acquisition cycle, DOD mission partners can fast-track the use of AI and ML applications and other emerging solutions.
Our cloud specialists also have a unique understanding of what’s involved in rolling out cloud services at scale. That’s one reason GDIT was awarded a contract to help improve information sharing and collaboration across the DOD with the largest-ever deployment of Microsoft Office 365, to more than 3.2 million users.
What those and countless other projects have taught us is the fundamental importance of constantly reskilling and upskilling your workforce in order to take advantage of all the changes and services that the big cloud providers keep rolling out.
Our extensive relationships with all of the major cloud providers and their many partners, along with our agnostic approach to finding the right solutions, are just some of the reasons federal agencies continue to engage GDIT’s experts to help them capitalize on the cloud faster and more effectively.
Five years from now, chances are, we won’t be talking much anymore about transitioning to the cloud. Instead, we’ll be talking about all the ways government agencies are leveraging their data using the power of the cloud. But that vision depends on getting your data and cloud strategies properly mapped out today — and having the right talent in place to help you.
Learn more about how GDIT can help your agency with its cloud and digital modernization efforts.
Pentagon: JEDI could be in jeopardy if court doesn’t dismiss political bias allegations
Editor’s Note: This story has been updated with comment from the acting Department of Defense CIO, John Sherman.
The Pentagon is worried that if a federal claims court doesn’t dismiss allegations of Donald Trump’s political influence in the award of the Joint Enterprise Defense Infrastructure (JEDI) cloud contract to Microsoft in 2019, it could “bring the future of the JEDI Cloud procurement into question,” forcing the department to consider a different approach.
The Department of Defense sent an “information paper” to Congress on Thursday evening, explaining the potential impacts of the Court of Federal Claims’ upcoming decision on the department’s motion to dismiss Amazon‘s allegations of “improper influence at the highest levels of Government” in its larger bid protest of the JEDI contract. In the paper, DOD says the court should make “a significant ruling within the coming weeks.”
On the one hand, if the court rules in DOD’s favor and dismisses the single allegation, the department believes it would still likely take four to five months for a ruling on Amazon’s other points in the case, which include the allegation that DOD made “numerous and compounding prejudicial errors” in its evaluation of proposals. The litigation has been ongoing for more than a year, and it has been nearly three years since DOD first solicited proposals for the JEDI contract.
But, if the court finds Amazon’s accusation of political influence to be valid, it will “need to be substantively litigated,” an endeavor that could go on much longer and mean the end of the JEDI contract as we know it.
Any sort of motion for discovery, for example, would likely “include requests for depositions of senior officials at the White House and DoD, including former DoD and White House Senior Officials.”
“These motions will be complex and elongate the timeline significantly,” the paper says. “The prospect of such a lengthy litigation process might bring the future of the JEDI Cloud procurement into question. Under this scenario, the DoD CIO would reassess the strategy going forward.”
Acting DOD CIO John Sherman told FedScoop in a statement: “Regardless of the JEDI Cloud litigation outcome, the Department continues to have an urgent, unmet requirement for enterprise-wide, commercial cloud services for all three classification levels that also works at the tactical edge, on scale. We remain fully committed to meeting this requirement—we hope through JEDI—but this requirement transcends any one procurement, and we will be prepared to ensure it is met one way or another.”
The DOD does have one thing in its favor: Its inspector general found last year, after a long and detailed investigation, that it does not believe the White House or other top officials influenced the procurement decision.
However, the court has in the past leaned toward siding with Amazon, saying last March that the company was “likely to succeed” in showing that the DOD erred, at least in part, in how it evaluated bids for JEDI. That finding prompted the department to take corrective action on the contract. Even after doing so, though, it re-affirmed its original award to Microsoft.