Jonathan Alboum named USDA CIO

The Department of Agriculture just named Jonathan Alboum its new chief information officer.

Previously, Alboum worked as CIO of USDA’s Food and Nutrition Service and held several jobs at the General Services Administration. He’s also served as a management consultant for PricewaterhouseCoopers and Ventera Corp.

According to his LinkedIn profile, Alboum’s most recent post was as a USDA program executive responsible for MIDAS, an IT program that was halted last year amid delays. The Government Accountability Office recently examined what caused the program to fail, pointing to poor performance and uncertainty about future plans. Alboum took that job a few months before development of the program stopped in July.

He earned a master’s degree in information technology management and a bachelor’s in systems engineering from the University of Virginia.

While Alboum was not available to comment on his new position before deadline, he spoke with FedScoop in 2011 about his first government job as deputy chief information officer at the Food and Nutrition Service.

“What I learned in that first job is how different it was to work in the government as a government employee versus as a contractor,” he said in a FedMentors interview. He added, “Everything that I did every day needed to be with a focus on helping to advance the agency’s mission.”

And in 2012, Alboum told FedScoop he was particularly proud of his work creating a website for MyPlate, an update to the food guide pyramid that first lady Michelle Obama and USDA Secretary Tom Vilsack unveiled in 2011.

Alboum takes the CIO title after Cheryl Cook retired from USDA this spring to take a position with the Pennsylvania Department of Agriculture, news originally reported by FedScoop. Joyce Hunter has been serving in the acting role.

In a release, Vilsack said Alboum’s “vast experience in formulating IT strategies, policies and initiatives will help USDA continue to modernize its tools and resources, innovate our digital services, improve the way we deliver results and technology to our many customers, and protect USDA’s IT assets and information.”

Hunter told FedScoop in April that the department was looking into starting its own digital services shop in-house. The department also has been working to open its data. In April, it co-hosted a hackathon and released an application programming interface, or API, on its public lands data with the Interior Department.

Alboum will start his new job June 29.

Federal smart cards about to get smarter

Federal personal identity verification cards — better known as PIV cards — are about to get a technical facelift.

The National Institute of Standards and Technology, or NIST, has released updated technical specifications and guidance for the next generation of “smart” identity cards used by federal government employees and contractors to gain access to government facilities and computers.

The next-generation PIV cards will enable federal employees to connect securely to government computer networks from smartphones and other mobile devices, and will provide enhanced security features to verify the identity of federal workers.

The PIV cards in use today contain a microchip that stores digital credentials, including an employee’s photo, fingerprint information, a PIN code and other details, but require card readers that must be attached to computers and mobile devices to complete the verification process.

The new specifications add protections to wireless communications between the PIV card and a mobile device.

“We specified a secure communication mechanism so that the next generation PIV Card can be used with mobile devices, enabling federal employees to connect securely to government computer networks, encrypt or sign email from such devices,” said NIST computer scientist Hildegard Ferraiolo, co-author of the publications.

The new specifications also provide additional ways to prove, or authenticate, the cardholder’s identity. One method, called on-card biometric comparison, helps preserve a cardholder’s privacy using a technique that eliminates the need for an individual’s fingerprint data to ever leave the card. Another new security feature prevents a cardholder from changing the PIN to one that is too short.
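To make the idea concrete, here is a minimal sketch in Python of the two features described above. All names and thresholds are invented for illustration; the real logic runs inside the card’s secure element as defined in NIST’s specifications.

```python
# Toy model of next-generation PIV features. Invented names and thresholds;
# actual PIV behavior is specified in NIST SP 800-73 and runs on the card.

MIN_PIN_LENGTH = 6  # assumed policy floor, not the spec's actual value


class PivCard:
    def __init__(self, fingerprint_template: bytes, pin: str):
        # Both secrets stay inside the card; there is no getter for them.
        self._template = fingerprint_template
        self._pin = pin

    def change_pin(self, old_pin: str, new_pin: str) -> bool:
        """Reject a PIN that is too short, as the updated spec requires."""
        if old_pin != self._pin or len(new_pin) < MIN_PIN_LENGTH:
            return False
        self._pin = new_pin
        return True

    def match_fingerprint(self, candidate: bytes) -> bool:
        """On-card biometric comparison: only a yes/no result leaves the card."""
        return candidate == self._template  # real cards use fuzzy matching


card = PivCard(fingerprint_template=b"enrolled-template", pin="123456")
print(card.change_pin("123456", "12"))               # False: new PIN too short
print(card.match_fingerprint(b"enrolled-template"))  # True; template never exported
```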

“It’s encouraging to see NIST continue to improve the capabilities and security associated with the government’s PIV card,” said Dave Wennergren, senior vice president at the Professional Services Council. A decade ago, as Navy CIO, Wennergren chaired a Defense Department working group responsible for deploying the Common Access Card, which helped launch the use of digitally encoded identification cards for government employees and contractors.

“These enhancements should continue to increase the value of the card and we should applaud NIST’s work. That said though, we must also face the fact that it takes time to implement a new version of a smart card, particularly for a large agency,” he said. “Even after the preliminary work to buy cards and prepare for issuance, new cards will slowly replace expiring cards over a period of several years.”

Wennergren also cautioned that more than a decade after Homeland Security Presidential Directive 12, “there are still far too many government agencies not using the card’s capabilities for cryptographic log-on to networks, digital signatures and physical access. If it’s only being used as a ‘flash pass,’ the new features are wasted,” he said.

The updated NIST specifications are contained in two documents, one dealing with interfaces for personal identity verification and the other detailing cryptographic algorithms needed to maintain the security of the PIV cards. The publications are intended for U.S. government agencies to upgrade their PIV cards, or for vendors that make the cards or develop hardware and software to work with the cards.

Agencies using hybrid clouds need orchestration tools

Agencies that aren’t planning now for a single, open source platform to manage their hybrid IT environments are painting themselves into a virtual corner, a cloud technologist argued at the Brocade Federal Forum Wednesday.

The advent of private and public cloud computing services — and the Obama administration’s four-year-old cloud-first mandate — have led agencies to migrate a variety of computing work to the cloud. But that has resulted in a complex mix of traditional IT along with private and public cloud systems, each with their own management tools and service level agreements.

“What’s critical to understand is that multiple cloud providers and deployment models require things to be managed separately,” said HP Cloud Chief Technologist James Bond during a forum breakout session. “Every provider requires separate portals, SLAs, and [operating systems]. Go to [Microsoft] Azure, AWS or Dropbox, and you have yet another management console. There’s no incentive for vendors to make their console work with other providers,” he said.

Without a hybrid cloud management platform, the proverbial single pane of glass, it will become increasingly difficult for IT administrators to orchestrate the provisioning and management of computing, storage and application development across these systems, he said.

“You can put a hybrid cloud management platform in later, but better to do it now,” he said.

One key reason, Bond said, is the time and investment involved in migrating and managing applications as agencies expand the number of private and public cloud systems they use — or opt to move those applications from one system to another.

“You don’t want to be rewriting those applications [for new systems] two years from now. Make sure you write your applications so they’re agnostic to the cloud providers,” he advised.
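One common way to follow that advice is to program against a thin, provider-neutral interface rather than any one cloud’s SDK, so only a small adapter changes when an application moves. A minimal sketch, with invented class names rather than any real provider API:

```python
# Provider-agnostic storage sketch: the application codes against
# ObjectStore, so moving clouds means writing one new adapter,
# not rewriting the application. Names here are illustrative.
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStore(ObjectStore):
    """Stand-in adapter; a real one would wrap S3, Azure Blob, or Swift."""

    def __init__(self):
        self._objects = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]


def archive_report(store: ObjectStore, report_id: str, body: bytes) -> None:
    # Application logic sees only the interface, never a provider SDK.
    store.put(f"reports/{report_id}", body)


store = InMemoryStore()
archive_report(store, "2015-06", b"quarterly numbers")
print(store.get("reports/2015-06"))
```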

Bond also made the case for using open standards — and open source cloud orchestration tools, such as OpenStack. (There are other tools available, such as vCloud Director and CloudStack.) OpenStack, developed by NASA and Rackspace in 2010, simplifies the provisioning, management and orchestration of cloud computing services. It now has support from many OEMs, hundreds of companies and thousands of developers worldwide, Bond said.

“If you want to have an application that’s capable of moving from one cloud to another, you need industry standards,” he said.
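For illustration, here is roughly what provisioning a server looks like with the OpenStack Python SDK; because the API is the same across OpenStack deployments, the code does not change when the underlying cloud does. The cloud profile, image, flavor and network names below are placeholders:

```python
# Illustrative use of the OpenStack SDK (pip install openstacksdk).
# The cloud profile, image, flavor, and network names are placeholders.
import openstack

# Reads credentials for a cloud named "agency-private" from clouds.yaml.
conn = openstack.connect(cloud="agency-private")

image = conn.compute.find_image("ubuntu-14.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("internal")

# The same orchestration call works against any OpenStack deployment,
# which is the portability argument Bond is making.
server = conn.compute.create_server(
    name="demo-app-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)
```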

Open source platforms have their drawbacks. With so many voices contributing to their development, users often don’t get the specific features they need, or don’t get them as soon as they’d like. That’s given companies such as HP, IBM and Red Hat the opportunity to build and offer features on top of the open standards, making it easier for enterprises to provision and operate cloud services.

But longer term, OpenStack will also make it easier for users to automate that work, he said. “The industry is not there yet, but that’s the ultimate goal,” said Bond.

He also cautioned that OpenStack isn’t so much a cloud operating system as it is an “orchestration engine” that should make brokering computing work to and from different cloud providers easier.

“It’s not the world’s best engine, but it’s getting there,” he said, adding that OpenStack is still “maybe two years away from having the ability to automate moving applications” from one cloud to another.

But more fundamentally, it is enabling developers to speed up the development of what he called “cloud-native” applications for enterprise users.


HP Cloud Chief Technologist James Bond used this slide to highlight how open source platforms will foster expanded development of “cloud-native” applications.

 

Additional reporting from the Federal Forum 2015:

Government needs agile networks, federal CIOs say: At Brocade’s Federal Forum, produced by FedScoop, IT leaders emphasized the need for flexible computing networks as the U.S. faces mounting cyber threats.

Federal IT isn’t keeping up with new technology – Brocade CEO: Lloyd Carney argues the federal government must modernize its legacy IT systems to create stronger defenses against countries using newer technologies.

How the ‘New IP’ can help federal agencies: Brocade said federal agencies that use software-defined networks will get the fast and flexible systems that they have wanted for years.

CIOs turn focus to business outcomes and mission: As they update their IT infrastructure, federal CIOs are beginning to focus on how technology can impact business and mission.

Tony Scott’s plan for restoring confidence in federal cybersecurity: The new U.S. chief information officer outlined his strategy for improving the government’s cybersecurity posture — faster, newer, better.

Federal, industry leaders recognized for ‘Breaking the Status Quo’: At Brocade’s 2015 Federal Forum, four federal and industry leaders were recognized for their innovative work in government information technology.

Park Service to unveil easy-to-use mapping tool for workers

National Park Service workers now can make updates to the agency’s digital maps — even if they don’t know how to code.

In the next week or two, the Park Service officially plans to unveil Places Editor, which will allow its more than 22,000 employees to redraw trails, reroute roads or add new buildings in a park. A few hundred workers are already using the tool after a soft launch earlier this year.

Nate Irwin, who leads the Park Service’s mapping team, said the tool would help the agency make quicker updates. Only 50 to 100 people in the Park Service are trained on GIS, or geographic information systems, and many of those workers only do mapping work part time, he said. Now, all workers can make some changes themselves.

“It’s better for us as an organization,” he said. “It makes us more efficient, more agile and nimble so we can address shrinking budgets and resources, and still improve our data processes dramatically.”

To create the tool, Irwin’s team took the existing Park Service GIS data sets and aggregated them into a database that was linked to high-resolution satellite imagery. From there, a Park Service biologist, summer intern or resource manager — or any other worker — can go into the online editor and update trail routes or landmarks on top of the satellite pictures. The edits update the Park Service’s maps almost immediately, Irwin said.
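Conceptually, each Places Editor change is a small edit to a GeoJSON-style feature in that shared database. The sketch below is a toy model of a trail edit: the feature structure is standard GeoJSON, while the in-memory “database” and helper function are invented for illustration:

```python
# Toy model of a crowd-edited map layer: features are standard GeoJSON;
# the "database" and update logic are invented for illustration.
from datetime import datetime, timezone

trails = {
    "trail-42": {
        "type": "Feature",
        "geometry": {
            "type": "LineString",
            "coordinates": [[-110.55, 44.42], [-110.54, 44.43]],
        },
        "properties": {"name": "Lakeshore Trail", "edited_by": None},
    }
}


def redraw_trail(feature_id: str, new_coords: list, editor: str) -> None:
    """Apply an edit immediately, the way Places Editor updates NPS maps."""
    feature = trails[feature_id]
    feature["geometry"]["coordinates"] = new_coords
    feature["properties"]["edited_by"] = editor
    feature["properties"]["edited_at"] = datetime.now(timezone.utc).isoformat()


# A ranger with on-the-ground knowledge reroutes the trail over the imagery.
redraw_trail(
    "trail-42",
    [[-110.55, 44.42], [-110.545, 44.425], [-110.54, 44.43]],
    "ranger-17",
)
print(trails["trail-42"]["properties"])
```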

He acknowledges that data from GIS novices may not be ideal. But he said it offers a good starting point. He hopes that thousands of workers will use the tool.

“We’re building a living dynamic product that is going to be constantly updated over time by a growing number of people who have really good on-the-ground knowledge,” he said.

Federal agencies have increasingly been looking to reap the benefits of opening up their data — internally and externally. Earlier this month, offices across all levels of government opened up data during the country’s National Day of Civic Hacking.

Irwin’s team has been looking at open data for two years, examining how those outside government could help improve the agency’s data. At a hackathon this spring hosted by the Interior and Agriculture departments, the team experimented with a prototype that would allow the public to update Park Service maps. He said preliminary results from the experiment look promising.

Looking ahead, the National Park Service is planning to unveil a new website in 2016 when the agency celebrates its centennial. Mapping will be an integral part of that site, he said.

“So we’re spending this year improving our digital products and next year is going to be a pretty massive roll out for us,” he said.

But in the meantime, he said, other agencies across government could use the Places Editor as a model.

“What we’re doing is really prototyping what we think is a system and a workflow that we think will work for most other federal agencies if they choose to go down that road,” he said.

U.S. CTO: Don’t let budgets slow innovation

Despite stiff budget cuts felt in IT shops around government, U.S. Chief Technology Officer Megan Smith believes talented federal IT workers can still find ways to get things done when times are tough.

Smith told federal IT officials Tuesday at the CIO Council’s IT Symposium a story about a CIO who gave up on fixing the agency’s website because it was too costly.

“We can’t talk like that,” the U.S. CTO said. “Let’s find the scrappy way to fix the website together.”

Agencies with innovation labs, like the Department of Health and Human Services and its IDEA Lab, do this. But ideas shouldn’t just come from technically inclined folks, she said.

“The best way to design that is not to have some special group of people way over there — the innovation group — and everybody else is over here,” she said. “The whole team can innovate. Everyone has ideas.”

Lisa Schlosser, the deputy administrator of the Office of E-Government and Information Technology in the Office of Management and Budget, who was also interviewing Smith, agreed.

“It’s also the little things,” she said, like using already-available tools and information to better serve customers. “Use that data to create innovation within your own organization … If you can take data to people who own the money and control the budget in your organization and show how you want to invest to do something better, that is the way to do it.”

A leader in OMB’s IT budgeting efforts, Schlosser is frequently faced with funding issues. Her advice to those with budgeting woes was to prove you can do it first, show evidence and then ask for money.

“When an agency has come up with an innovative plan as part of their budget where they can show that with specific milestones, very specific deliverables and a very strategic approach to transforming their organization from a business-and-mission standpoint using technology, they have been supported by OMB, and I’ve seen them supported throughout the process,” she said. Running a small pilot before requesting money can also make a huge difference, Schlosser said. “You don’t need new funding to run pilots, to show success of your broader plans.”

Smith echoed her advice, citing an idea her former boss, Google co-founder Sergey Brin, promoted during the recent economic downturn: “Scarcity breeds clarity.”

“Maybe you don’t need more money — maybe you need a tiny bit of money — but maybe instead of going for the larger thing, [figure out] what are the ways to move more agile-y, faster and then sort of as a step to show that path” to iterate toward the larger idea, she said.

Amazon CEO and founder Jeff Bezos didn’t start the company as an Internet marketplace that sells pretty much anything. Rather, he started with the small goal of becoming the go-to online bookstore, Smith said. Agencies with tight budgets, she said, should imitate that model.

“I think the freedom to iterate is really important,” Smith said. “Because in the iteration, maybe there’s failure … maybe the thing you have isn’t right, and instead of having built this whole huge thing, you have something” to evolve from.

What 40 years of working with data has taught Tony Scott

There was a time when an Apple II with 32 kilobytes of memory and a 160 KB floppy disk drive was all U.S. CIO Tony Scott needed to harness the power of big data.

In the late 1970s, Scott was working for Marriott’s theme park division, using data to study attendance patterns and create employee schedules. Scott said one computer “translated a small amount of information into tens of millions of [dollars in] profit, happy employees and happy guests.”

So as Scott watched big data grow over the past four decades to the mammoth mechanism it is today, he discovered what does and doesn’t work when it comes to harnessing data for an enterprise’s benefit.

Scott spoke about these lessons during a federal big data summit Thursday, dispensing advice to agencies looking to use the massive amounts of data they deal with daily.

Scott stressed the need to have what he called “executive buy-in,” making sure that top-level people are aware of a project’s goals and how the data can benefit the mission at hand.

“I’m surprised how many times it’s not thought over,” Scott said. “If you are going to pick [a project], you better shine a bright light on a problem, or issue, or challenge the agency has.”

Scott also stressed data’s ability to “digitize, not automate” when it comes to effectively carrying out an agency’s mission. He relayed a story from his time at General Motors, where programmers used OnStar, the connected communications service in cars, to collect data that factored into design tweaks before vehicles were mass produced. Before that, Scott said, multiple engineers spent weeks driving concept cars, writing down observations and passing their notebooks off to be digitized and re-entered into the system, and some of the information never reached designers by the time vehicles were put into production.

“If I look [at] the amount of money that has been spent in the tech space over the last 30 or 40 years, a very high percentage of that has been in automating, not digitizing,” Scott said. “The new world is how you can create and connect in a digital way.”

Perhaps the lesson that best resonates with government agencies is Scott’s call to look beyond one’s own organization for data that could help achieve a mission. Open data is becoming more important for agencies, whether that means pushing data to the public or reaching across agencies for data sets that can help government employees reach their own goals.

Scott saw the benefit in external data working alongside Steve Ballmer, the former CEO of Microsoft, who told Scott he very rarely depended on internal data when making business decisions. The remark helped Scott realize that data’s power doesn’t end at the borders of an enterprise.

“When we think about some of the work that we are doing with big data, you have to think like you are a supplier,” he said. “You are a part of the ecosystem that can bring great value to commerce, science and industry.”

Yet for all of the virtues Scott sees in big data, he is cognizant of the challenges data brings to the federal government. He stressed the need for policies that safeguard data, and that establish procedures for determining where data comes from, who owns data, and who is responsible if the data is corrupted or deemed untrustworthy.

“We shouldn’t stop … constantly thinking about unintended side effects and consequences,” he said.

As he’s moved from the days of crunching data on that old Apple II to watching over agencies that are fighting climate change and infectious diseases with GIS and Hadoop clusters, Scott said the mission behind big data has never changed.

“We’re trying to make life better,” he said. “We’re trying to understand our universe better, trying to predict what might happen in the future and trying to improve the lives of everyday citizens around the world.”

Trying to bridge digital divide, FCC votes to expand Lifeline

Millions of Americans stand to receive subsidies for broadband service after the Federal Communications Commission voted Thursday to expand the Lifeline program, which was launched under President Ronald Reagan to offer low-income consumers discounts on phone service.

The highly anticipated vote, which was 3-2, pitted the three Democratic and two Republican commissioners against each other, as the GOP members said they opposed the expansion because there is no spending cap.

Chairman Tom Wheeler said he was “befuddled” at how a program started under a Republican president “has suddenly become so partisan.”

“Let’s look at what the dissenting votes are voting against: recasting Lifeline for the broadband era at a time when America does business online,” Wheeler said in his remarks. “Why should we continue to spend ratepayer funds only on 20th century narrow band service?”

The expansion of the 30-year-old program, which supporters say could help poor families perform simple tasks like pay bills and do homework, would offer consumers a choice to use the monthly $9.25 subsidy for broadband service.

But commissioners Ajit Pai and Michael O’Rielly argued that there is no set budget for the expanded program, which could lead to more fraud, waste and abuse, problems the commission tried to root out during a major overhaul of the program in 2012.

“I am open to having a conversation about including broadband, but any change must go hand in hand with changes that are necessary to produce a fiscally responsible program,” Pai said. “And this proposal fails that basic test for several reasons.”

Pai said he and O’Rielly had proposed to keep the current $1.6 billion spending cap through 2018, with adjustments for inflation, but the majority rejected the idea. The commissioners have spent months behind closed doors trying to hammer out an agreement.

“It’s clear the majority wants to spend as much as they possibly can without a hint of restraint or possible change in administration,” said O’Rielly.

When asked why the spending cap was rejected, Wheeler told a FedScoop reporter that it would be “putting the cart before the horse” because details of what the modernized program will look like have not been finalized yet.

“Let’s figure out what we’re going to do, and then figure out how we’re going to pay for it,” he said.

Other details that still have to be nailed down include establishing a neutral third-party verifier to determine eligibility. Currently, the carriers and the FCC are charged with confirming who can participate.

The proposal also considers giving out the subsidies directly to consumers through vouchers, and the commissioners will have to decide whether carriers that currently participate in Lifeline should be required to offer broadband service as well as wired and wireless service.

An FCC spokesman said these issues will be discussed and worked out in the next few months before any changes are enacted.

Wheeler and the two other Democratic commissioners, Mignon Clyburn and Jessica Rosenworcel, said it’s a no-brainer that Lifeline should be expanded to include broadband.

“The sad reality is that millions of our citizens are foreclosed from opportunities, trapped in digital darkness and are stranded on the wrong side of the affordability divide,” Clyburn said. “We know broadband is the great equalizer of our time, but it can only be so if everyone has access.”

Rosenworcel focused on what she has coined the “homework gap,” saying an estimated 5 million children cannot do their homework if it involves Internet access — or they have to go to other places with Wi-Fi, like libraries or fast food restaurants, to finish their work.

She said broadband would help students in school and beyond.

“It’s a loss for our collective human capital and shared economic future that we need to address,” she said of the lack of access millions of families have to the Internet.

Advocates of the expansion, including the Leadership Conference on Civil and Human Rights, applauded the vote.

“The FCC took an important step in narrowing our country’s digital divide and ensuring that all Americans have access to the essential communications services they need to live, learn, and work in today’s digital age,” spokesman Scott Simpson said in a statement. “We thank Chairman Wheeler, and Commissioners Clyburn and Rosenworcel, for their leadership in moving forward with the critical and urgent task of building the bridge to connectivity for children, seniors, job seekers, low-income communities, and communities of color.”

Tony Scott’s plan for restoring confidence in federal cybersecurity


U.S. Chief Information Officer Tony Scott (FedScoop)

Americans’ confidence in the government’s ability to protect national secrets — including personally identifiable information — may be at an all-time low in the aftermath of the recent massive data breach at the Office of Personnel Management. But Tony Scott, the nation’s chief information officer, has a plan to restore that confidence and radically improve the nation’s cybersecurity.

“I don’t know about you, but my angst level on this has gone way up over the last couple of years,” Scott said, speaking to more than 1,100 attendees Wednesday at the Brocade Federal Forum in Washington, D.C. “I don’t think the problem is lack of ideas; I think the problem is lack of implementation. This is our most important mission today [and] we’ve got to do a lot more and we have to do it a lot sooner than what we’re currently on a trajectory to do.”

Although he did not mention or discuss the OPM hack, Scott said one of the biggest security challenges facing government is trying to protect old and outdated IT infrastructure and systems. He likened the process to trying to install air bags in a 1965 Ford Mustang — not impossible, but technically difficult to do correctly.

“Fundamentally, at the end of the day, what we have to do is just replace a lot of what we have with much more modern architecture, much more modern concepts of networking, storage and cloud,” Scott said. “So that has to be our most important agenda.”

Replacing the government’s outdated technology certainly won’t happen overnight and is a costly proposition — but it’s one that Scott and many other federal IT officials said is absolutely necessary. Scott said he expects the percentage of IT spending dedicated to security to increase steadily over the next couple of years. “I would expect that that [trend] line is going to take a curve up again as we make investments that have, frankly, been neglected over a 20 or 30 year period of time,” he said. “And we have to do it in a way that has the right set of analytics and data-driven decision making that is important for the success of any enterprise and just the right amount of oversight as well.”

________________________________________________________

Read: What 40 years of working with data has taught Tony Scott. From theme parks and cars to pharmaceuticals and government agencies, U.S. CIO Tony Scott has seen the ways big data can revolutionize an enterprise.

Improving security in the right way means enforcing some basic processes, according to Scott. And some of those processes and protections that agencies have neglected may have played important roles in some of the recent high-profile data breaches, including the latest at OPM.

“Everything we do should be two-factor enabled, from networks to applications to servers and so on. We need end-to-end security in anything that we do,” Scott said. “Things like two-factor authentication are really important. Things like patching, things like making sure we’re minimizing the number of system administrators and making sure that people with elevated access are also using two-factor [authentication] are some of the key things,” he said. “It’s really important each day we wake up and focus on making our nation’s cybersecurity better.”
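As a concrete illustration of the second factor Scott describes, here is a minimal time-based one-time password check using the pyotp library. Federal systems lean on PIV credentials rather than this scheme, and no real deployment would keep the secret inline; the sketch only shows the both-factors-must-pass logic:

```python
# Minimal two-factor illustration using TOTP (pip install pyotp). Real
# federal systems use PIV certificates, but the second-factor idea holds.
import pyotp

# Enrolled once per user and stored server-side; shown inline for brevity.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)


def login(password_ok: bool, submitted_code: str) -> bool:
    """Both factors must pass: something you know plus something you have."""
    return password_ok and totp.verify(submitted_code)


print(login(True, totp.now()))  # True: correct password and current code
print(login(True, "000000"))    # almost surely False: stale or wrong code
```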

Scott called on agencies to refocus on their risk management approach to cybersecurity. “A lot of the money that we’ve spent so far has been on technologies that try to prevent bad things from happening. All of that is necessary and needed but not sufficient,” he said. “Some of the things that I think we’ve got to get better at are things like detecting quickly when something has gone wrong [and] isolating, responding and remediating very quickly.”

The new cybersecurity landscape is one that values speed to market above all else, Scott acknowledged. “My measure of success is speed to market,” he said. “In today’s world, speed means everything.”

Scott’s first response to the uptick in cybersecurity threats targeting the federal government has been forming a 30-day sprint team to study existing security policies, resources and agency priorities. “It won’t be a panacea for everything, but my hope is it will dramatically accelerate our progress,” he said in an interview with FedScoop.


Department of Transportation CIO Richard McKinney (FedScoop)

Richard McKinney, CIO at the Transportation Department, told FedScoop there’s a growing awareness in the federal government that existing security efforts — smart authentication, trusted Internet connections, continuous monitoring, perimeter defenses — are not enough and more can be done. “I think you’re going to see a shift to understanding what our high value assets are and how to lock that stuff down. I think we’re going to work from the supposition that bad guys will get in. Let’s double-down on understanding our high value assets,” he said.

Scott agreed. “The reality is you’ve been hacked and you either know it or you don’t know it,” he said, echoing a growing trend in federal IT circles that more effort needs to be put into cybersecurity intelligence.

“I think our approach to this is rapidly maturing,” McKinney said. And encryption will play a fundamental role in the future. There is no reason government should not be encrypting data at rest, he said.

Scott, however, acknowledged the difficulties of deploying encryption on systems and applications that are decades old and, in many cases, highly customized. “The other sort of unspoken problem is that in the federal government, there was a trend for a long time to take operating systems and customize them very, very heavily to the point where you couldn’t take patches or upgrades, and couldn’t take advantage of some of the newer modern technologies,” Scott said. “In essence, you were frozen in time wherever you were, and then that locked you out of being able to do a lot of things.”

McKinney said he believes government will eventually get to the point where encryption is commonplace. “To break into a server that stores encrypted data is to have nothing,” he said. “I think the tools to remedy this are right at our fingertips. We just need to act.”
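McKinney’s point can be made concrete in a few lines with Python’s cryptography package; key management, the genuinely hard part, is deliberately left out of this sketch:

```python
# Encrypting data at rest with the cryptography package
# (pip install cryptography). Key storage and rotation are out of scope.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in production this lives in an HSM or KMS
cipher = Fernet(key)

record = b"SSN=123-45-6789"      # sensitive data before it hits disk
stored = cipher.encrypt(record)  # what an intruder would actually find

print(stored)                             # ciphertext: "to have nothing"
print(cipher.decrypt(stored) == record)   # True, but only with the key
```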

How did USDA’s MIDAS program lose its golden touch?

All that glistens is not gold when it comes to the Agriculture Department’s MIDAS information technology program.

A new Government Accountability Office report looks at why MIDAS, short for Modernize and Innovate the Delivery of Agricultural Systems, was shut down after USDA’s Farm Service Agency spent $423 million on the program.

Originally, the Farm Service Agency hoped MIDAS would provide a platform hosting the data, tools and applications for administering farm program benefits, linked to USDA’s financial, geospatial and data warehouse systems. However, the agency rolled out a system that lacked critical functions it had planned on, particularly acreage reporting tools and an online portal for farmers, and was never linked to USDA’s financial system and enterprise data warehouse.

In all, the agency delivered about 20 percent of what was planned, the report said. Meanwhile, cost estimates for MIDAS grew from $330 million to $659 million. After the launch of a second MIDAS software release a year ago, the secretary of Agriculture decided to stop the program.

In its report, GAO found that poor program performance and uncertainty regarding future plans were critical factors that led to the program’s demise. It said FSA did not establish “key program management disciplines” as it pursued MIDAS — particularly in requirements development and management, project planning and monitoring, system testing, and executive-level governance.

The report’s authors advised caution as the FSA starts planning new strategies to modernize its farm services program.

“[T]he agency has not yet established plans to improve its management capabilities. Until FSA establishes and implements such a plan, the agency will continue to lack the fundamental capacity to manage IT acquisitions,” according to the report.

GAO laid out five recommendations for the agency going forward.

In a response to the report, Farm Service Agency Administrator Val Dolcini acknowledged there were “opportunities” to improve IT program oversight.

“FSA has taken active steps to address the issues raised in the draft report,” he wrote in a letter to GAO, which was included in the final report. “Specifically, the agency has selected a new Chief Information Officer (CIO), who has initiated steps to acquire a third-party assessor to holistically evaluate the technology solution chosen for MIDAS, and to provide recommendations that can, and should, inform a coherent IT strategy and future IT service delivery model.”

CIOs turn focus to business outcomes and mission

Federal chief information officers are becoming less concerned with the specifics of their offices’ IT performance and more focused on achieving their agencies’ greater missions, a pair of CIOs said Wednesday.

Speaking on a panel at Brocade Federal Forum 2015, produced by FedScoop, Richard McKinney, CIO at the Transportation Department, and Steve Cooper, CIO of the Commerce Department, discussed how evolving IT systems — what Brocade referred to throughout the day as the “New IP” — could ease CIOs’ operations and maintenance burden, freeing them up to focus more on aligning IT with agencies’ business needs.

At DOT, McKinney said, IT has become so siloed and needlessly complex that his job is like that of a firefighter: “We’re constantly putting out fires because our legacy construct requires that,” he told the audience.

Since taking over the department CIO role in 2013, he’s been working to reimagine that.

“How can we get ourselves out of the IT business?” McKinney said. “The CIOs could sit down with the business units and begin to imagine what the future is like and quit being firefighters. We don’t do IT for IT’s sake — we do it for a business outcome.”

During his earlier keynote, federal CIO Tony Scott discussed the benefits of using metrics to hold agencies accountable. Commerce’s Cooper, though, wasn’t a fan of looking at IT metrics. Rather, he wants his office’s IT to be assessed for how it helps achieve the department’s overall missions.

“I really don’t care that much about all the normal metrics of IT performance,” he said. “Think of the Department of Commerce. Our entire missions are about enhancing American business and technology around the globe, so why shouldn’t I be held accountable for an increase in the number of jobs in America? It’s a business metric. I’m a CIO. I’m not the IT guy. I don’t care about IT metrics; hold me accountable to business metrics.”

McKinney said in some ways the inverse is becoming true as well — more business units are becoming responsible for the development and modernization of IT.

“Technology is now embedded in almost everything we do,” he said. “And each of us has a role to play in having a successful outcome.”

For instance, thanks to the recent passage of the Federal Information Technology Acquisition Reform Act, budgeting officials now have to work more closely with the CIO on IT efforts. Likewise, procurement officials are responsible for assisting CIO offices in acquiring what they need most effectively.

Soraya Correa, chief procurement officer at the Department of Homeland Security, said during the panel that her job is “to partner with the IT community and make sure that we’re delivering good solutions together.” Correa said she and DHS CIO Luke McCormack meet regularly to talk about what he needs or envisions and how she can help him obtain it. The end goal, she said, is “getting us to be smarter, better and faster at what we do and what we buy.”

Even chief human capital officers have a stake in agencies’ IT, the DOT CIO said.

“IT is a team sport, and who’s on your team and how they’re trained and how they’re compensated and how they’re retained and how they’re attracted is going to have an impact on the outcome” of IT, McKinney said. “We’re all looking in each other’s eyes and realizing the successful implementation of information technology in this department is a joint responsibility.”

McKinney said “there’s a perfect storm going on right now,” referring to the passage of FITARA, the recent cyber breaches, the proliferation of cloud and how that all places added emphasis on IT in agencies’ missions.

“If we bring all those things together and reimagine where we could be, we could drive down our [operational expenditures], drive up our modernization efforts, and we could begin to have a conversation with the business units about ‘Where are you going and how can we serve you?'” he said.

 

Additional reporting from the Federal Forum 2015:

Government needs agile networks, federal CIOs say: At Brocade’s Federal Forum, produced by FedScoop, IT leaders emphasized the need for flexible computing networks as the U.S. faces mounting cyber threats.

Federal IT isn’t keeping up with new technology – Brocade CEO: Lloyd Carney argues the federal government must modernize its legacy IT systems to create stronger defenses against countries using newer technologies.

How the ‘New IP’ can help federal agencies: Brocade said federal agencies that use software-defined networks will get the fast and flexible systems that they have wanted for years.

Tony Scott’s plan for restoring confidence in federal cybersecurity: The new U.S. chief information officer outlined his strategy for improving the government’s cybersecurity posture — faster, newer, better.

Agencies using hybrid clouds need orchestration tools: As agencies expand their IT into multiple clouds, the need for a single, open source orchestration platform is becoming more crucial, a cloud expert argues.

Federal, industry leaders recognized for ‘Breaking the Status Quo’: At Brocade’s 2015 Federal Forum, four federal and industry leaders were recognized for their innovative work in government information technology.