The Obama administration probably didn’t envision local hunting, fishing and boating commissions as examples of how the government can harness big data.
Neither did Waldo Jaquith, director of the U.S. Open Data Institute, when the president asked the Office of Science and Technology Policy to study the power of big data in January. Yet, at a workshop held by OSTP last week, Jaquith spoke of hunting and fishing regulations as a shining example of where big data is headed.
Jaquith was one of many public- and private-sector officials who attended a June 19 workshop at Georgetown University to discuss the opportunities and challenges big data presents for the government. The workshop was one of six meetings OSTP arranged to discuss big data and the first since the White House released its big data and privacy report on May 1.
During a panel discussion, Jaquith told the audience that the hunting and fishing data serves as a good example of how his organization tries to release datasets that have clear value to the public but have not previously been made available.
“A Venn diagram of people who care about open data and people who hunt and fish are just two circles,” he said. “This is a problem. What we’re trying to do is identify unintentional biases…and trying to find those new types of data and create these new open data ecosystems where we can identify data that’s useful.”
Several useful applications of big data across government were on display during the panel, which featured a number of agency experts highlighting how big data has helped them be more effective.
Rajive Mathur, director of online services for the Internal Revenue Service, said the IRS now looks at data from a customer service perspective.
“It’s all about: How do we allow the taxpayer to take the information that’s theirs and do what they need to do in order to meet their tax obligations?” Mathur said. “This is your information, and it needs to be secure, and you need to have access to it.”
Mathur spoke about how the IRS has changed its model for handling data under the Get Transcript program, which allows taxpayers to securely access their tax information in an instant instead of waiting up to a week for the same information to arrive by mail.
Mathur said Get Transcript allows tax information that’s needed elsewhere, such as on federal student aid forms or a mortgage application, to be ported directly into those forms instead of requiring users to type it in themselves.
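As a rough illustration of that kind of data porting (the IRS does not publish Get Transcript as an open API, and every field name and value below is invented), the idea is simply to map a retrieved transcript record onto another form’s schema rather than asking the taxpayer to retype it:

```python
# Purely illustrative sketch: hypothetical transcript fields mapped onto a
# hypothetical student-aid form payload. None of these names come from a
# real IRS interface.

def port_transcript_to_aid_form(transcript: dict) -> dict:
    """Map invented Get Transcript-style fields onto an aid-form payload."""
    return {
        "applicant_agi": transcript["adjusted_gross_income"],
        "applicant_tax_paid": transcript["total_tax"],
        "filing_status": transcript["filing_status"],
    }

# Example record a taxpayer might retrieve online (invented values).
transcript = {
    "adjusted_gross_income": 48_500,
    "total_tax": 5_200,
    "filing_status": "single",
}

print(port_transcript_to_aid_form(transcript))
```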
At the Department of Veterans Affairs, data is being used at a more granular level to help the department meet its strategic goals.
Rosye Cloud, a senior advisor for veteran employment at the VA, said that in years past data was increasingly scattered, creating “imperfect gaps” as it moved between agencies. Now things have become much more streamlined, which has allowed the VA to create new strategies around veteran unemployment.
“By partnering with the Department of Labor and the Department of Defense, we started understanding more about not only those who are unemployed, but we also are starting to ask deeper questions,” Cloud said.
Cloud also touted public-private partnerships that helped VA harness data, including one with the National Student Clearinghouse and Student Veterans of America that examined persistence rates among vets.
Zach Goldstein, acting CIO of the National Oceanic and Atmospheric Administration, spoke about how his agency plans to build a public-private partnership around the 20 terabytes of information it produces every day. While only 10 percent of that data is available to the public, Goldstein envisions the rest being used in conjunction with other data in the private sector to create predictive analysis tools.
“If I know that when this kind of weather pattern occurs and this [other] kind of disease pattern occurs, then I can figure out that I need this kind of medical intervention to prepare for that,” Goldstein said. “There are all sorts of possibilities with the technology that’s available.”
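A minimal sketch of what such a cross-dataset predictive tool might look like, assuming hypothetical weekly weather and disease-incidence feeds (all column names, values and thresholds here are invented for illustration, not drawn from NOAA products):

```python
# Illustrative only: join invented weather and disease-incidence data by week,
# then fit a simple model to flag weeks likely to need extra medical resources.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical weekly weather observations.
weather = pd.DataFrame({
    "week": [1, 2, 3, 4, 5, 6],
    "avg_temp_c": [31, 33, 29, 35, 36, 28],
    "rainfall_mm": [120, 200, 80, 240, 260, 60],
})

# Hypothetical disease surveillance counts from a private-sector partner.
disease = pd.DataFrame({
    "week": [1, 2, 3, 4, 5, 6],
    "cases_per_100k": [12, 30, 9, 45, 52, 7],
})

# Combine the two datasets on a shared key.
combined = weather.merge(disease, on="week")

# Label weeks where case counts exceeded surge capacity (invented threshold).
combined["needs_intervention"] = (combined["cases_per_100k"] > 25).astype(int)

# Fit a simple classifier relating weather patterns to intervention needs.
model = LogisticRegression()
model.fit(combined[["avg_temp_c", "rainfall_mm"]], combined["needs_intervention"])

# Predict whether a hot, wet forecast week is likely to need preparation.
forecast = pd.DataFrame({"avg_temp_c": [34], "rainfall_mm": [230]})
print(model.predict(forecast))
```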
Yet amid the rush to harness all the data that is already available or soon will be, other federal officials are concerned that privacy and discrimination challenges remain.
“I worry about the potential of discrimination by algorithm,” said Julie Brill, a commissioner for the Federal Trade Commission. “There is a whole lot of information flowing that could be discriminatory. What is a company going to do if they are going to make these differentiations…if that has a discriminatory impact?”
Carole Miaskoff, acting associate legal counsel for the Equal Employment Opportunity Commission, spoke about how the EEOC is wrestling with the challenges big data presents to those looking for work.
“How do we take legal principles and the societal consensus of ‘thou shall not discriminate’ and make them meaningful in an era of analysis from these huge databases that infers rules and criteria by which you can select people for jobs? How do you apply it in that new context?” Miaskoff asked.
Even as agencies follow the Obama administration’s open data directive, Jaquith knows that any positive or negative outcome of big data is moot without a push for better technology.
“It doesn’t treat the humans in government as rational actors when we try to browbeat them into publishing data,” he said. “You can pass all the laws you want and all the policies you want demanding that data be released, but if the agency relies on terrible proprietary software, there’s just nothing to be done.”