What 40 years of working with data has taught Tony Scott
There was a time when an Apple II with 32 kilobytes of memory and a 160 KB floppy disk drive was all U.S. CIO Tony Scott needed to harness the power of big data.
In the late 1970s, Scott was working for Marriott’s theme park division, using data to study attendance patterns and create employee schedules. Scott said one computer “translated a small amount of information into tens of millions of [dollars in] profit, happy employees and happy guests.”
So as Scott watched big data grow over the past four decades to the mammoth mechanism it is today, he discovered what does and doesn’t work when it comes to harnessing data for an enterprise’s benefit.
Scott spoke about these lessons during a federal big data summit Thursday, dispensing advice to agencies looking to use the massive amounts of data they deal with daily.
Scott stressed the need to have what he called “executive buy-in,” making sure that top-level people are aware of a project’s goals and how the data can benefit the mission at hand.
“I’m surprised how many times it’s not thought over,” Scott said. “If you are going to pick [a project], you better shine a bright light on a problem, or issue, or challenge the agency has.”
Scott also stressed data’s ability to “digitize, not automate” when it comes to effectively carrying out an agency’s mission. He relayed a story from his time at General Motors, where programmers used OnStar, the connected communications service in cars, to collect data that factored into design tweaks before vehicles were mass produced. Before that development, Scott said, engineers spent weeks driving concept cars and writing down observations, then passed their notebooks off to be digitized and re-entered into the system — and some of that information never reached designers by the time vehicles went into production.
“If I look [at] the amount of money that has been spent in the tech space over the last 30 or 40 years, a very high percentage of that has been in automating, not digitizing,” Scott said. “The new world is how you can create and connect in a digital way.”
Perhaps the lesson that best resonates with government agencies is Scott’s call to look beyond one’s own organization for data that could help achieve a mission. Open data is becoming more important for agencies, whether it’s pushing data to the public or reaching across agencies for data sets that can help government employees meet their own goals.
Scott saw the benefit of external data while working alongside Steve Ballmer, the former CEO of Microsoft, who told Scott he very rarely depended on internal data when making business decisions. The remark helped Scott realize that data’s power doesn’t end at the borders of an enterprise.
“When we think about some of the work that we are doing with big data, you have to think like you are a supplier,” he said. “You are a part of the ecosystem that can bring great value to commerce, science and industry.”
Yet for all of the virtues Scott sees in big data, he is cognizant of the challenges data brings to the federal government. He stressed the need for policies that safeguard data, and that establish procedures for determining where data comes from, who owns data, and who is responsible if the data is corrupted or deemed untrustworthy.
“We shouldn’t stop … constantly thinking about unintended side effects and consequences,” he said.
As he’s moved from the days of crunching data on that old Apple II to watching over agencies that are fighting climate change and infectious diseases with GIS and Hadoop clusters, Scott said the mission behind big data has never changed.
“We’re trying to make life better,” he said. “We’re trying to understand our universe better, trying to predict what might happen in the future and trying to improve the lives of everyday citizens around the world.”