Cybersecurity a rising concern in protecting nuclear stockpile

Nuclear security is no longer solely the province of armed guards and anti-aircraft cannons.

Instead, experts say, protecting nuclear assets will depend increasingly on the skills of programmers and cybersecurity experts as the world moves into the digital age.

This was the conclusion of last week’s International Conference on Computer Security in a Nuclear World, where an amalgam of computer experts, system engineers, scientists and policymakers gathered in Vienna as part of a global effort to bolster preparedness for technological threats in the nuclear era.

Hosted by the International Atomic Energy Agency, an organization dedicated to promoting safe nuclear practices and regulation, the conference drew more than 700 representatives from 92 member states and nearly 20 organizations. The largest delegation belonged to the National Nuclear Security Administration, the branch of the Department of Energy responsible for maintaining and safeguarding the nation’s nuclear warheads.

“The enduring conference theme is that computer security is a necessary component in an effective and robust nuclear security regime,” said Jazi Eko Istiyanto, who presided over the conference. “Computer security and nuclear security must be a continual and holistic process.”

Cyber threats have increasingly come into the international spotlight in recent years, particularly as America continues to wage war in the Middle East and terrorist organizations such as the Islamic State, also known as ISIS, demonstrate technical savvy that eluded their predecessors.

In 2010, Iran’s nuclear facility at Natanz was attacked by “Stuxnet,” a computer worm that targeted the software controlling the plant’s mechanical operations. In the aftermath, the facility’s centrifuges, critical to the uranium enrichment process, saw a 30 percent decrease in efficiency.

Another incident occurred last December, when hackers penetrated the network defenses of Korea Hydro & Nuclear Power Co. and stole sensitive data. They used a simple strategy known as phishing, which involves sending volleys of spam emails containing malicious software to employees. The attackers later attempted to ransom the information, claiming that several terrorist organizations and governments had already offered to purchase it.

Although fingers have been pointed — some have asserted that the U.S. was behind the Stuxnet attack — the perpetrators have yet to be identified in either case.

In the wake of these incidents and as the threat of terrorism looms, the NNSA has taken steps to ensure that U.S. nuclear assets are not susceptible to cyber assault in the future.

“Whether it is the protection of critical infrastructure, special nuclear material or nuclear reactors, we all want to protect against cyber attacks, which could have significant impacts, not only at the point of origin but across the world,” said Wayne Jones, NNSA’s chief information officer.

Among the NNSA’s preventive measures is the International Training Course, or ITC, on the Physical Protection of Nuclear Materials and Nuclear Facilities, a joint NNSA and IAEA initiative to train the next generation of nuclear innovators. The program celebrated its 25th anniversary in May, when 43 students from 36 countries graduated from the class.

Despite its name, the ITC takes a well-rounded approach to the strategic defense of nuclear assets, including hands-on exercises, extensive classroom training, and courses that build expertise in technology and cybersecurity.

The importance of this sort of training became evident at the opening of last week’s conference. The first demonstration consisted of a hypothetical security exercise in which a terrorist organization performs a dual cyber-physical assault and manages to bypass security, swiftly gaining access to nuclear material.

“The demonstration of a blended cyber-physical attack on the opening day of the conference provided a dramatic example of what we have to defend against,” Jones said. “In today’s technology-centric world, so much of physical security has elements in an IT system. However, to have good cyber security, we have to also be able to physically protect our cyber systems.”

Denis Flory, the head of the IAEA Department of Nuclear Safety and Security, agreed.

“When all 164 member states have finally trained their experts to the level offered by the ITC, the IAEA’s task will be greatly facilitated through the existence of a common basis for the further strengthening of nuclear security.”

Not all 164 member states have trained their experts to this extent, however. Even as the NNSA has prepared to protect U.S. assets, there is growing concern that countries like India and Pakistan are not taking sufficient measures to follow suit.

According to the Nuclear Threat Initiative’s 2014 security report, which ranked the 25 countries that hold weapons-grade fissile material from most to least secure, India and Pakistan took 23rd and 22nd place, respectively.

These numbers, which indicate increased potential for breaches in security, are of major concern to other nuclear powers, who fear that terrorists might seize upon the opportunity to steal a nuclear device or the material required to construct one.

India and Pakistan possess warheads capable of wreaking havoc on an unprecedented scale. Experts estimate that India’s most powerful nuclear weapons could produce explosive yields of up to 500 kilotons, more than 30 times the power of the bomb that killed 80,000 people in Hiroshima at the end of World War II.

When asked whether the NNSA planned to influence the two nations to expedite their security programs, Jones expressed a desire to foster cooperation rather than exert pressure.

“We are not involved in ‘influencing’ [other countries] as much as promoting best practices and sharing lessons learned,” he said. “Certainly, we are willing partners in assisting [India and Pakistan] in any way appropriate in their cybersecurity efforts.”

In a 2015 report, the Bulletin of the Atomic Scientists moved its so-called Doomsday Clock, a measure of impending nuclear disaster, forward two minutes. According to the organization, we are now only three minutes to midnight.

“Global nuclear weapons modernizations and outsized nuclear weapons arsenals pose extraordinary and undeniable threats to the continued existence of humanity, and world leaders have failed to act with the speed or on the scale required to protect citizens from potential catastrophe,” the update claimed. “These failures of political leadership endanger every person on Earth.”

The delicate, deliberate delivery of the DATA Act

Rolling out the Digital Accountability and Transparency Act won’t be easy, federal officials cautioned Wednesday.

Karen Lee, branch chief in the Office of Management and Budget’s Office of Financial Management, said the DATA Act will shift how the majority of agencies conduct business.

“Everyone in the federal government knows that data is important,” Lee said during an event held by the Data Transparency Coalition. “There are levels of data that are used to drive programs and their decision-making, but what we’re talking about here is institutionalizing that culture so that it’s replicated over and over again.”

In May, OMB and Treasury unveiled 57 data standards, guidance and an abbreviated DATA Act Playbook to help agencies adhere to the DATA Act. The legislation requires agencies to make their financial, budget, payment, grant and contract data interoperable when published to USASpending.gov, the federal government’s hub of publicly available financial data, by May 9, 2017.

OMB and Treasury, along with the General Services Administration and the White House’s Office of Science and Technology Policy, are setting benchmarks to help agencies hit that deadline.

Lee and Christina Ho, the Treasury Department’s deputy assistant secretary for financial transparency, said the feedback from the public and federal agencies has shaped what was unveiled in May — as well as plans for future updates.

“To make sure that those standards work as our economies evolve, we are going to have to set up a mechanism to maintain, improve, add to and take away data elements as our use of that data changes,” Lee said.

OMB has released 15 data standards on the DATA Act GitHub page, with plans to open up 30 more for public comment in the coming months. Ho said they are also working with agencies on how to put the baseline schema into effect at every agency.
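
To see what putting such a schema into effect might look like, here is a minimal sketch, assuming illustrative element names: a check that an award record carries a few required, consistently typed data elements before it is published. The fields below are hypothetical stand-ins, not the actual standards posted on the DATA Act GitHub page.

```python
# Hypothetical sketch: validate an award record against a handful of
# standardized data elements before publication to USASpending.gov.
# Element names and types are illustrative, not the official schema.

REQUIRED_ELEMENTS = {
    "awarding_agency_name": str,
    "award_id": str,
    "federal_action_obligation": float,  # dollars obligated
    "action_date": str,                  # e.g., "2017-05-09"
}

def validate_record(record: dict) -> list:
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    for element, expected_type in REQUIRED_ELEMENTS.items():
        if element not in record:
            problems.append(f"missing element: {element}")
        elif not isinstance(record[element], expected_type):
            problems.append(f"{element} should be {expected_type.__name__}")
    return problems

sample = {
    "awarding_agency_name": "Department of Health and Human Services",
    "award_id": "HHS-2017-0001",
    "federal_action_obligation": 250000.0,
    "action_date": "2017-05-09",
}
print(validate_record(sample) or "record conforms")
```

The value of a shared schema is precisely that every agency can run the same kind of check against the same element definitions, which is what makes the published data interoperable.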

“We really want agencies to embrace [the] vision and the value of this approach,” Ho said.

Much of that vision is coming from a two-year pilot at the Department of Health and Human Services. Lee said the government is already gaining insights, including how to make it easier to report information tied to federal grant money.

“We are great at setting standards, but the work does not stop there,” Lee said. “We know there are lots of ways burden happens. We ask recipients too many times for the same information. We ask recipients and applicants for information in slightly different ways depending on who they talk to. There are a ton of opportunities for eliminating redundancies and reducing burden.”

It’s opportunities like this, Lee said, that OMB and Treasury are working to identify in search of solutions with a “beneficial, significant and immediate impact,” an effort she compared to building a healthy diet or workout regimen.

“Intellectually, just like eating broccoli or working out, we know that data is good,” Lee said. “But how do we make this part of our work across sectors? Everything in this playbook is marching us toward that goal.”

Navy gets new CIO

Rob Foster, the former deputy chief information officer at the Health and Human Services Department, has been named the new CIO of the Navy, FedScoop has confirmed.

Foster replaces John Zangardi, the deputy assistant secretary of the Navy for Command, Control, Communications, Computers, Intelligence, Information Operations, and Space, who assumed the acting CIO role in May 2014 after Terry Halvorsen became the Defense Department CIO.


Robert Foster takes over as Navy CIO on June 15, sources confirmed. (HHS)

Foster’s first day on the job will be June 15, a source confirmed.

Foster is a retired Navy officer. Before HHS, he served as the deputy CIO for Immigration and Customs Enforcement at the Department of Homeland Security, and prior to DHS, he was the program manager for the Product Data Management Initiative at the Defense Logistics Agency.

Commissioned in 1984, Foster holds a master’s degree in information technology management from the Naval Postgraduate School. His other IT assignments included stints as director of software process improvement and Navy/Marine Corps Intranet project officer at the Fleet Material Support Office, and as implementation program manager for N/MCI at the Naval Supply Systems Command.

Editor’s Note: An earlier version of this story used an incorrect photo of Foster. The story has been corrected.

Nominee for VA CIO passes first hurdle in Senate


The Senate Veterans Affairs Committee approved the nomination Tuesday of LaVerne H. Council to become the next chief information officer at the Department of Veterans Affairs. The nomination now moves to the full Senate for final confirmation.

“The VA’s information technology program has often experienced project failures, cost overruns and security mismanagement,” said committee chairman Sen. Johnny Isakson, R-Ga., in a statement. “If confirmed, I look forward to working with … Ms. Council to address these issues and ensure that the VA is well-equipped to deliver the best possible care and services to our veterans.”

If confirmed by the full Senate, Council will assume the role of assistant secretary of Veterans Affairs for information and technology. She will replace Steph Warren as CIO and take control of the VA’s $4.2 billion IT budget.

Sources close to Warren said that upon Council’s confirmation, he will revert to his previous position as the principal deputy assistant secretary for information and technology. He’s expected to stay on at VA at least until he is eligible for retirement in April 2016, sources said.

Council’s nomination and likely confirmation come at a critical time for VA. The department remains under constant attack from cyber criminals, has suffered a high rate of personnel turnover, is in the middle of a major acquisition of a new commercial scheduling system and is trying to coordinate health record information sharing with a Defense Department that is bent on pursuing a separate $11 billion initiative.

But Council brings significant private sector experience to the VA, like Secretary Bob McDonald, who came to VA from the CEO position at Procter & Gamble. Council is the former CIO for Johnson & Johnson.

Before joining Johnson & Johnson, Council served as the global vice president for information technology at Dell.

During her first appearance before the committee in May, Council pledged to develop a technology roadmap to improve data interoperability with the Defense Department. Ranking member Sen. Richard Blumenthal, D-Conn., requested that Council produce a detailed plan and timetable for improving data interoperability with the Defense Department as part of her final confirmation process.

Council said one of the reasons she was chosen for the position was her reputation for doing what she says she is going to do. “I will assure you that you will have a roadmap that will lay out, with a scorecard, that information about how it should be done,” she said. “This is surmountable,” she added, referring to the data standardization shortfalls between VA and DOD. “And I will make it my duty to make it right.”

Trying to archive that tweet? One startup has a way

From panoramic views on the Interior Department’s Instagram account to a YouTube broadcast of a State Department briefing, the government’s presence on social media has exploded.

But as agencies increasingly take to outlets like Facebook and Twitter, the question arises: Are these posts federal records? And if so, how can agencies save them? It can be tricky and time-consuming to save tweets and chats — some agencies copy and paste them into Word documents for posterity. Other agencies just don’t archive them at all.

“I realized this was a real pain that industries like government were facing — how in the world do you keep records of these electronic communications?” said Anil Chawla, a former IBM engineer.

That realization led him to found North Carolina-based startup Archive Social, which has emerged as an option for agencies looking to adhere to records rules while still maintaining an active social media presence.

Archive Social acts sort of like a Facebook app — it plugs into your social media accounts and automatically pulls posts and comments, and all their metadata, into an Amazon Web Services cloud. Agencies can then use Archive Social’s portal to search and replay the content. The National Archives and a few other federal offices are using the startup to save their posts, said Chawla, the company’s founder and CEO.
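
While Archive Social’s internals are not public, the capture pattern described above can be sketched in a few lines: poll a platform’s API for new posts, keep each full record with its metadata, and write it to cloud storage. In this rough Python sketch, the API endpoint, response shape and bucket name are all hypothetical stand-ins.

```python
# Hypothetical sketch of API-based social media capture: pull new posts
# and store each raw record, metadata included, in an S3 bucket.
import json
from typing import Optional

import boto3
import requests

API_URL = "https://api.example-social.com/v1/posts"  # hypothetical endpoint
BUCKET = "agency-social-archive"                     # hypothetical bucket

def archive_new_posts(account_id: str, since_id: Optional[str] = None) -> int:
    s3 = boto3.client("s3")
    resp = requests.get(API_URL, params={"account": account_id, "since": since_id})
    resp.raise_for_status()
    posts = resp.json().get("posts", [])
    for post in posts:
        # Store the raw JSON so comments, timestamps and other metadata
        # survive alongside the post text.
        s3.put_object(
            Bucket=BUCKET,
            Key=f"{account_id}/{post['id']}.json",
            Body=json.dumps(post).encode("utf-8"),
        )
    return len(posts)
```

A production archiver would also have to handle pagination, rate limits, deleted posts and retention schedules; the sketch only shows the shape of the pipeline.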

The service, which costs about $400 a month for a “good sized” federal agency, allows agencies to easily access documents for Freedom of Information Act requests and could serve as a resource if there’s a legal dispute, he said.

Recently, there’s been increased focus within the federal government on how agencies handle digital records, particularly following news that Democratic presidential hopeful Hillary Clinton used a private email server to conduct official business while she was secretary of State.

Two years ago, the National Archives said in a bulletin to federal agencies that social media content “is likely a Federal record.” The bulletin listed methods for capturing such content, including application programming interfaces, aggregators and Web crawling software, but stopped short of recommending any one of them.

“It is not feasible for NARA to provide platform-specific guidance because it is difficult to predict which tools will be available and preferred in the future,” it said.

The National Archives and Records Administration wouldn’t speak about its own work with Archive Social. However, several local and state governments across the country use the startup’s services, including North Carolina, which put out a searchable portal for the social media activity of many of its agencies through Archive Social.

“We were trying to give citizens and users access to that material without having to go through an agency,” said Kelly Eubank, head of the digital service section at North Carolina’s Department of Cultural Resources.

Are agencies archiving social media now?

As it stands, many federal agencies don’t seem to be archiving their social media.

According to a survey from the Association for Information and Image Management two years ago, 18 percent of agencies were storing internal social business records, like Yammer chats. Meanwhile, 14 percent were storing external social conversations. That could be anything from copy-pasting text into a Word document to using a backup service to something more sophisticated.

“The use [of social media] is certainly growing,” said Peggy Winton, chief marketing officer for AIIM. “But in terms of capturing that, I believe that most [agencies] either have never started or just abandoned it.”

Winton said ideally agencies would do something similar to what the National Archives is trying to encourage under its Capstone program for archiving emails: set up an automated system that gathers relevant content, and create algorithms to determine what to save.
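
A toy version of that Capstone-style automation might look like the following: a job gathers candidate posts, and simple agency-defined rules decide which ones to keep as records. The rules here are invented purely for illustration.

```python
# Hypothetical rules-based filter deciding which gathered posts to archive.
from dataclasses import dataclass

@dataclass
class Post:
    account_type: str  # e.g., "official" or "personal"
    text: str

def is_record(post: Post) -> bool:
    # Rule 1: keep anything published from an official account.
    if post.account_type == "official":
        return True
    # Rule 2: keep staff posts that mention agency business.
    keywords = ("recall", "outage", "emergency", "policy")
    return any(word in post.text.lower() for word in keywords)

posts = [
    Post("official", "Flood warning issued for coastal counties."),
    Post("personal", "Great lunch spot near the office!"),
]
kept = [p for p in posts if is_record(p)]
print(f"Flagged {len(kept)} of {len(posts)} posts for archiving.")
```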

Expecting, say, a social media manager to gather the records by hand is extremely cumbersome, she said.

“If you leave it in the hands of individuals, it makes it really, really difficult” to archive, Winton said.

But Chawla said agencies are gaining a greater appreciation for saving their social media records, particularly as they increasingly use social media to disseminate critical information on everything from food recalls to disaster response.

“The conversation on social media is incredibly important, and therefore important to retain it and maintain it for the long term,” he said.

Why 18F concentrates on changing culture

An “innovation specialist” at what’s considered one of the most pioneering tech agencies in government made a perhaps surprising remark during a talk on citizen services Tuesday.

“None of this is actually innovative.”

Raphael Majma of the General Services Administration’s 18F digital services team said he and his colleagues are focused on solving problems rooted in culture rather than putting out new code.

“The problems we face are problems every agency faces,” Majma said at a technology conference in Washington, D.C. “How do we get a site on a public server that a citizen can actually click on? How can we make sure we go through a proper process in which we can collect and assess information from citizens when appropriate?”

Majma and two other government IT experts said Tuesday that the real innovation in government is coming from cultural changes, not a new technology platform. While 18F has helped the government transform its digital presence, Majma said the success comes from instilling new ways of thinking. That thinking, combined with the technology the government has already bought, can deliver what was intended in the first place — government services that people can easily use.

“This is something that the American people already paid for,” Majma said. “This is something that we want to give back to them so that they can use it however they see fit.”

An early example of this new way of thinking actually pre-dates 18F. Peter Levin, formerly the chief technology officer of the Department of Veterans Affairs, said Blue Button — the online tool that makes patient medical records easy to download and share — was not born out of a new piece of technology but from legacy systems cobbled together on top of data the department was already using. The agency saw high returns on an investment that cost it nothing, Levin said.

Damon Davis, director of the Health Data Initiative for the Department of Health and Human Services, said one of 18F’s key goals is helping agencies realize the potential in opening data. While 18F pushes agencies to release data, it also provides agencies a way to share best practices across the federal landscape.

“I want to know what it is that the Department of Transportation is doing with their data, because we might have a similar issue,” Davis said. “This open communication is incredibly valuable as the entire government moves forward in open data and increased personal access into our own data and information.”

Meanwhile, 18F has been using more open source code in its own new projects, with the hope that other agencies will follow suit.

“Whenever we write any policy for ourselves or work with an agency to write a new policy, we ask them to do it as publicly as possible,” Majma said. “We found that when we do that, we get a lot of responses from people. We get folks coming in who wouldn’t have ever weighed in on something and tell us what we need to do better.”

18F is gathering feedback through GitHub repositories instead of the Federal Register to allow people to weigh in on how projects come together.

“One, we show our work,” Majma said. “Two, we show that if we stand it up, there is no reason others shouldn’t be able to. Three, it also allows us to gain acceptance and feedback from people outside the government.”

Davis said this push for openness allows the government to give people the power to create their own novel processes, such as using data to determine how much they should be paying for utilities or prescriptions.

“If data is available to help people find the social services that will help them alleviate their light bill or heat bill issue, then they can spend their time thinking about their health in a better fashion,” Davis said. “That’s innovation and discovery that has never happened before.”

Is the government reversing course on FedRAMP?

Last week was disconcerting for those who provide cloud computing services to the federal government: It now appears the government is reversing course on all the work done to date on the Federal Risk and Authorization Management Program, known as FedRAMP.

Despite an Office of Management and Budget directive that requires agencies to use FedRAMP-compliant vendors for their cloud computing needs, and despite the tens of millions of dollars taxpayers and cloud service providers have invested to create a program to meet those requirements, an official at GSA stated that while FedRAMP should be an evaluation criterion, it should not be used to screen eligible vendors from the start. His explanation for this seeming change of direction was that using FedRAMP as an eligibility requirement could limit competition if vendors had not already achieved FedRAMP compliance — referred to as an authority to operate, or ATO — in time to bid. Instead, the official said, agencies should simply require that vendors obtain an ATO before the contract becomes operational.

Unfortunately, the GSA official’s statement upends the clear security imperatives the government had established for vendors and potentially negates the significant investment of time and money that the government and industry have put into this requirement. Security was a primary consideration when the FedRAMP program was created, but it now seems to have become a secondary concern. Compounding the problem, the agency responsible for administering FedRAMP’s requirements is the same one making these contradictory statements, which makes the situation all the more difficult to address.

Additional concerns stem from a new draft of revisions to OMB circular A-130, titled “Management of Federal Information Resources.” The proposed revisions seem to give agency privacy officers independent approval authority over the FedRAMP ATO process. The draft also offers an option to create two separate processes.

Editor’s Note: FedScoop first reported the news of the proposed changes to OMB circular A-130.

Overall, these events have raised significant concerns. Industry has worked as a stakeholder in this process to contribute to its success: dozens of companies have achieved an ATO, more are in the pipeline, and untold numbers are preparing to start the process. Each of these companies has spent millions of dollars to enter and complete the process simply to be able to bid on a solicitation. But now there may be a separate — and possibly overlapping — process.

Industry wants one authority to determine which providers have established and maintained compliance with FedRAMP’s set of security and technical standards and requirements. It would be acceptable to add privacy requirements and another seat at the table, but we do not need another approval authority when we already have governmentwide investment in the existing authority.

It is also important that the eligibility requirements established in 2011 and promoted by OMB, the Defense and Homeland Security departments, and the General Services Administration be sustained. If not, we will have negated the millions of dollars taxpayers and industry invested to establish security as a precondition for cloud computing investments in the public sector.

All of the companies that have achieved or are in the process of achieving an ATO fully support security and privacy as technical starting points for the goods or services they offer. Neither security nor privacy should be an afterthought when it comes to any information system, much less those operated by the government. It is important that OMB straighten these issues out before we erode both essential aspects of cloud offerings in the federal marketplace.

Trey Hodgkins is the senior vice president, public sector at the Information Technology Alliance for Public Sector, or ITAPS, a division of the Information Technology Industry Council.

Buyers Clubs in the works for all major agencies

The Office of Federal Procurement Policy hopes to replicate the early success of the Department of Health and Human Services Buyers Club in major agencies across government.

OFPP Administrator Anne Rung, who first announced plans to create procurement innovation teams around government earlier this year, said Monday at the Professional Services Council’s ACQTECH Conference that her office will issue guidance on opening Buyers Club-like labs later this summer.

“We’d like all the CFO Act agencies to have these running in the next few years,” Rung said. “Essentially, we want a safe space for acquisition officers to try new approaches, even though they’re allowed in the [Federal Acquisition Regulation].”

The first of its kind in government, the HHS Buyers Club acts as a kind of test bed for the agency to look for ways to improve how it acquires products and services. Mark Naggar, who leads the club essentially as a one-man shop, piloted the concept within the HHS IDEA Lab under the guidance of former Chief Technology Officer Bryan Sivak about a year ago. The club aims to apply innovative acquisition principles, like human-centered design, departmentwide collaboration and rapid iteration.

Rung said she wants to offer agencies a framework similar to what Naggar has deployed at HHS, but she realizes every agency has different needs.

“The idea is that we’re going to outline some principles that we would hope they would incorporate into these and then allow them the flexibility to set it up in a way that is attuned to their own agency,” she said.

She hopes the new labs will emphasize collaboration “from beginning to end,” Rung said. “That’s how, I think, Mark Naggar and HHS were so successful: He brought together the program team, the legal team, the financial team at the front end and worked with them throughout, which doesn’t happen today.”

Anne Altman, general manager of federal government and industries for IBM, who joined Rung on stage, agreed that element would be pivotal for improving federal acquisition.

“This notion of bringing different cohorts together — procurement, mission, IT together — to think about what it is they’re trying to accomplish … that to me would be game-changing to have cultural shift within all of the agencies around how together they’re going to be more creative and they’re going to learn each other’s language and they’re going to think differently about the speed to outcome,” Altman said.

Before Rung’s earlier guidance, another department began creating its own safe space for innovative acquisition. The Department of Homeland Security recently launched a concept that it calls the Procurement Innovation Lab. DHS Chief Procurement Officer Soraya Correa said last week that the lab’s genesis was part of a larger effort at the department, called “Acquisition Innovations in Motion,” to take advantage of the many things the FAR allows but that are often overlooked.

“The Procurement Innovation Lab is about opening the door to a community that is procurement, but even the broader community that is acquisition … bring me whatever idea you have — and I’m going to open it up to industry as well — that you think I can implement to improve the procurement process,” Correa said. “Let’s see how we can … get things done a little quicker, a little cheaper and a little smarter.”

If these types of innovative acquisition labs can show success, the movement will spread even further, Rung said.

“I think when people start seeing results, though, that’s how you sort of build this effort,” she said. “But people have to see the results.”

The new ‘I’ in CIO


Illustration from 2015 Federal CIO and CISO Survey. (Professional Services Council and Grant Thornton)

Federal chief information officers have no shortage of challenges, but one of the most vexing is how to meet the growing demands to help agencies innovate while also delivering new enterprise IT services more cheaply — and more rapidly.

That was broadly confirmed in a new survey of government CIOs and chief information security officers released Monday by the Professional Services Council and Grant Thornton LLP. The study, the latest in a 25-year run of surveys assessing the state of federal CIOs, provides the usual Rorschach test of federal IT priorities. But it also suggests that the role of the federal CIO is evolving into new — and welcome — territory.

Despite the intent of the Clinger-Cohen Act, enacted nearly 20 years ago, to give federal CIOs greater executive authority, the job of CIO has continued to mean different things at different agencies. In reality, the “I” in CIO has often stood as much for infrastructure or IT operations as it has for information.

This latest survey, however, gives new evidence to the recognition that federal CIOs are increasingly being cast in the role of chief innovation officer, not unlike their commercial sector counterparts, albeit in an environment that remains relatively hostile to innovative practices.

The renewed — if not exactly new — focus on innovation stems from several factors.

One is the continuing mandate to cut costs. The U.S. Census Bureau, for instance, is “committed to taking $5 billion out of the cost of the 2020 census,” according to Commerce Department CIO Steve Cooper. That would be a significant achievement, given that the 2010 census cost around $13 billion. Agencies across the federal government are facing similar mandates. “The only way we can do that is through the innovative use of technology,” Cooper said at a PSC conference Monday, which coincided with the release of the survey.

The Census Bureau, one of a dozen bureaus Cooper supports, is looking seriously into ways to get households to complete census surveys via the Internet. “If we could do that, it would eliminate a significant cost for enumerators. That’s an example of … thinking differently,” he said.

Another factor is the Federal Information Technology Acquisition Reform Act, enacted last December. FITARA is routinely described as the most significant federal IT reform since the Clinger-Cohen Act, although many CIOs contend it does more to reaffirm existing authorities than to grant CIOs new ones.

It does, however, give agencies a new mandate to deliver “world-class digital services” and improve the value of federal IT investments. To do that, agencies will need their CIOs to function more as boardroom strategists, not just IT champions.

David DeVries, principal deputy CIO at the Defense Department, is among those who are seeing the evolving role of the CIO. “Our authorities haven’t changed,” he said. “But our value as CIOs and what we bring to the table is now better appreciated and understood by leadership” in a new and more fundamental way, he explained during the conference.

A third factor is the dramatic evolution of new enterprise capabilities coming from the commercial marketplace. These innovations are taking shape on multiple fronts, CIOs say, including the rise of more modern and agile application development, advances in the adoption of cloud technologies and new tools for gathering and analyzing data.

A final and less obvious factor, though, is the fundamental shift in the way technology is filtering up, instead of down, through organizations.

Intel CIO Kim Stevenson put it succinctly in an interview this week with the Wall Street Journal when she said: “IT isn’t transforming the workforce, what’s happening is the workforce is transforming IT.”

We’ve already seen smartphones, personal computing devices and a new generation of applications steadily invading the workplace. The ease of procuring cloud computing and other IT services promises to give the workforce — especially younger workers who grew up in a mobile, Web-based world — even greater impetus to bring IT innovations to work.

The successful CIOs in government going forward will be those who understand and harness those IT innovations, even as they are forced to work within the rigid lanes imposed upon them by politics and regulations. All told, however, the ability to make innovation, not just information and infrastructure, part of a CIO’s portfolio is an exciting prospect.

Whether every federal CIO is up to the task of serving as their agency’s chief innovation officer, however, is a fair question.

Clearly, “the CIO’s role is evolving,” said Dave Wennergren, senior vice president of technology policy at PSC and a former CIO in the Navy and deputy CIO at the Department of Defense. “The landscape is changing. Expectations of CIOs have grown. Recognize you’re a key player. So you have a seat at the table, but is your opinion valued?”

He cited the comment of one CIO interviewed for the survey who suggested, “The CIO role as it is defined today is not likely to exist [in the future].”

“I’m not sure what they would be replaced with,” he told FedScoop. “There’s clearly a need for a chief strategy officer” who understands how technology supports “mission results.”

In the meantime, federal CIOs still “need some clarity” from government and agency leaders on what they are expected to do, if they are to support innovation, Wennergren said. And they need to know top agency executives are “going to give them the tools [they need] and have their back.”

There’s no question that government agencies need innovation-minded CIOs more than ever. How they attract, retain and support them is arguably one of the most important questions agency leaders should be asking themselves.

Halvorsen promotes exchange program in Silicon Valley


The Information Technology Exchange Program, known as ITEP, allows DOD staffers to work temporarily for private industry — and vice versa. (DOD)

Defense Department Chief Information Officer Terry Halvorsen began a five-day tour Monday through Silicon Valley, where he will lead a delegation of high-level Pentagon officials on visits to leading technology companies to discuss ways to improve collaboration.

The trip is in support of the department’s Information Technology Exchange Program, known as ITEP, which allows a limited number of personnel exchanges between the Defense Department and private companies for temporary assignments lasting from three months to one year.

“The DoD cyber strategy recognizes the critical role industry plays in securing the nation’s infrastructure,” Halvorsen said in a statement. “Expanding our industry exchange programs, like ITEP, is part of this effort. Our goal is to create a strong, sustainable, collaborative IT exchange program that fosters partnership and innovation with industry.”

Halvorsen’s visit comes six weeks after Secretary of Defense Ash Carter announced the Pentagon’s new cyber strategy at Stanford University in what was the first visit to Silicon Valley by a sitting Defense secretary in nearly 20 years. In addition to detailing the new Pentagon cyber strategy and announcing the formation of a Silicon Valley-based Pentagon branch of the U.S. Digital Service, Carter also paid a visit to Facebook’s Menlo Park headquarters, where he met with Sheryl Sandberg, the company’s chief operating officer, to discuss managing digital talent.


Defense Department CIO Terry Halvorsen (FedScoop)

To participate, a federal civilian employee must work in the IT field, be considered an exceptional employee, be expected to assume greater future responsibilities, and be a GS-11 or higher. Industry participants must be U.S. citizens, and they may be required to hold a security clearance. To date, the department has focused on IT workers with expertise in commercial cloud services, mobility, cybersecurity, big data and data analytics, enterprise architecture, and network services.

ITEP was established in 2010 by Congress, which extended it until 2018 as part of the 2014 National Defense Authorization Act. By law, the program is limited to 10 participants at any time.

“We want to be able to engage with technical subject-matter experts from the best cybersecurity and IT companies in the country,” said Gary Evans, the team lead for information management in the office of the DOD CIO.