Tepid response to 18F’s quietly released openFOIA site
Developers at the 18F digital services shop showcased a new Freedom of Information Act site to a group of open government advocates Wednesday — and the reaction has been lukewarm.
OpenFOIA, now in its alpha stage, aims to make it easier to submit FOIA requests to various agencies. Developed in conjunction with the Justice Department, the current version is essentially an update to FOIA.gov, which lets users search for information on how to file requests in the federal government.
“Hopefully it will eventually go well beyond this,” Nate Jones, director of the FOIA Project for the National Security Archive, told FedScoop in an email.
Some transparency advocates have hoped the government would create one place where users could submit all their requests — more like the FOIAonline portal, which allows users to file requests with various participating government agencies. Indeed, a FOIA reform bill working its way through Congress would require the government to set up such a site.
However, openFOIA does not quite reach that mark. Jones said the site’s underlying problem is that 18F is only able to rely on the FOIA websites and methods that agencies currently use — even if those systems are subpar — and not create new processes.
“18F is not able to fundamentally change the ways FOIAs are requested, processed or posted,” he said in an email. “In that sense, 18F’s claim that it can make ‘hard things possible’ has, for now, been prevented by the federal bureaucracy.”
Sean Moulton at the Center for Effective Government echoed Jones’ concerns. While the demo site is “better than anything that’s out there right now,” he said, “what we were hoping for was kind of this one-stop shop where someone could submit a FOIA to any agency.”
He said 18F had explained several technical challenges to releasing the portal advocates had originally envisioned. But he said he remains hopeful that it could be built as the group releases subsequent versions of the site.
Abby Paulson of OpenTheGovernment.org, which was also represented at the meeting yesterday, told FedScoop her group has not taken a stance on the project.
“In general, we’re pretty supportive of the open government partnership process,” she said.
Meanwhile, in a blog post published on Medium, Nick Sinai, a fellow at the Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy, and a former deputy chief technology officer of the U.S., was optimistic the site would be a jumping off point for further innovation.
“In the long-run, imagine this service expanding to include a single place to learn about FOIA, initiate and track a request (like package tracking on FedEx), have open conversations with a federal agency about improving a request, and find previously released FOIA materials,” wrote Sinai.
Last year, 18F published a blog post about its efforts to make the FOIA process easier. Authors said they wanted to build tools that would “improve the FOIA request submission experience; create a scalable infrastructure for making requests to federal agencies; and make it easier for requesters to find records and other information that have already been made available online.”
Ori Hoffer, an 18F spokesman, confirmed his group “soft launched” openFOIA and received “a lot of feedback from stakeholders.”
“What this first step attempts to do is make the FOIA process less confusing for users unfamiliar with the process,” he said in an email. “DOJ, GSA, and their partners are eager to learn about what works best, or what can be improved, based on users’ experience.”
Hoffer said the site would continue to be developed and improved upon. 18F encourages users to send their comments to 18f-foia@gsa.gov.
Retail, financial services divided on federal data standard bill
Executives from the financial services, retail and e-commerce industries were split Thursday on whether a federal data protection standard could better shield consumers and companies from having their information stolen.
The disagreements surfaced during a House Financial Services Committee hearing on the Data Security Act. Introduced last week, the bill would hold financial services firms and retailers to the standards set by the Gramm-Leach-Bliley Act, which requires financial institutions to safeguard sensitive data and explain their information-sharing practices to their customers.
Brian Dodge, an executive vice president with the Retail Industry Leaders Association, told the committee that Gramm-Leach-Bliley is too broad of a rule for the range of businesses in the retail sector.
“If Congress were to pursue legislation that shoehorned the Gramm-Leach-Bliley Act into the rest of the business community, it would go beyond the retail industry,” Dodge told the committee, adding that the Federal Trade Commission has sufficient ability to oversee the retail industry. “We don’t think you can regulate your way to security. We need to start with the baseline that is a strong standard and emboldens the FTC to enforce these standards.”
Tim Pawlenty, the CEO of the Financial Services Roundtable, was supportive of the bill, saying the standards in the Gramm-Leach-Bliley Act are flexible enough to cover all sizes of financial service institutions and can be scaled across other private sectors. He also added that the Data Security Act sets an important national standard in the face of data security laws that can vary from state to state.
“We’re only as strong as the weakest link in the chain,” Pawlenty said. “It doesn’t make a lot of sense to have 50 different standards and 50 different responses.”
Laura Moy, senior policy counsel with New America’s Open Technology Institute, expressed some reservations about a federal standard, saying that it needs to be flexible and serve as a “floor, not a ceiling” for states that have their own robust data standards.
Any data legislation “would need to provide an agile mechanism to match developing technology and new threats,” Moy said. “We can’t always forecast the next big threat years in advance, but we know that there will be one.”
One looming issue discussed during the hearing was how the move to EMV chip cards will affect which companies are liable in breaches or cases of identity fraud.
Pawlenty and Dodge agreed that more could be done but diverged on how their industries could enhance security practices. Dodge argued that card issuers should be forced to use both chip and PIN measures when card companies switch to EMV standards in October. Currently, card holders will not be required to use a PIN with their EMV cards.
“Retailers believe that American consumers deserve the best available card security and that deploying the two-factor authentication enabled through chip and PIN will prevent criminals from duplicating cards with ease, devaluing the data that retailers collect at the point of sale and ultimately reducing cyber-attacks on retailers,” Dodge wrote in his testimony.
Pawlenty, along with Stephen Orfei, general manager of the PCI Security Standards Council, said that other measures like biometrics, tokenization and end-to-end encryption are moving the industry beyond card-based protections alone. Orfei added that such technologies are weak unless the standards behind them are followed.
“Applying our standards is the best line of defense,” he said. “When bundled and implemented properly, the data is useless and there is no reason to break in.”
Rep. John Carney, D-Del., who co-sponsored the bill with fellow committee member Rep. Randy Neugebauer, R-Texas, said a solution needs to be found quickly, because the patchwork of state laws is failing to protect companies and consumers.
“We think consumers and the companies that handle their personal financial data should know the rules of the road when it comes to protecting this data,” Carney said. “The fact is that the White House, Congress and consumers agree that the status quo isn’t working.”
STEM in the sun: Kids flocking to tech summer camps
Most kids look forward to a break from school during the hot summer months, but many are deciding to swap alfresco activities for computer coding.
More schools, government agencies, nonprofits, startups and traditional camps are offering programs that appeal to kids’ inner technophiles. Lasting from a day to several weeks, the programs focus on coding and other STEM fields, which are attracting interest among kids at an increasingly early age.
According to a 2013 survey from the American Camp Association, which tracks the $15 billion industry, 54 percent of camp directors said they added a new activity or program in the last two years. STEM summer programs surged by 12 percent, along with fitness and wellness programs.
As the federal government continues to devote more resources to expanding programs, like President Barack Obama’s recent $240 million commitment to increase STEM awareness among minorities, girls and low-income children, more kids are getting into the game – literally. At a tech incubator in Crystal City, Virginia, an after-school coding program recently wrapped up a spring session, attracting about 15 kids who traveled from near and far to attend.
FedScoop has compiled a list of cutting-edge summer programs that are making waves in different cities, including some that are still accepting applications.
Washington, D.C.
Microsoft’s Innovation & Policy Center is putting on an event May 21 called DigiGirlz Day, a daylong technology camp for middle school girls who are interested in IT or entrepreneurial careers. They will be able to use products like the Surface tablet, build apps and play with Xbox games. Prominent women at the company will talk about possible career choices in STEM fields. The event, which is also taking place in several other cities on different dates, runs from 8:30 a.m. to 3:30 p.m. More information can be found here.
iD Tech gets girls involved in science, technology, math and engineering through its TechGirls program, run by the Department of State’s Bureau of Educational and Cultural Affairs. This summer, five American girls ages 7 to 17 will not only get a science-minded education, but will also participate in a cultural exchange program with their peers from countries like Algeria, Egypt, Jordan and Morocco. The three-week program “seeks to inspire international friendships and initiate a vital exchange of perspectives,” according to program officials. The deadline to apply was May 2, but iD Tech runs several other camps throughout the country.
NASA offers a host of summer programs for middle and high school students interested in everything from astronomy to weather trends, but many of the application deadlines come early in March. For those who still want to get a taste of space, the Virginia Space Flight Academy in Wallops Island – about a three-hour drive from downtown Washington – is still accepting applications for four different weeks through August. The co-ed residential camps, for kids ages 11 to 17, let campers build rockets, learn robotics and tour the nearby NASA Wallops Flight Facility. More information can be found at https://www.vaspaceflightacademy.org.
New York
Vidcode, a startup run by two friends who met at a Startup Weekend EDU in New York a couple of years ago, is offering coding classes for girls ages 11 to 14 at the 92nd Street Y. The summer intensive workshop, which costs $595 and runs July 20-24, teaches budding computer scientists the “language of the Web,” also known as JavaScript, according to the program website. Students will also learn how to make interactive music videos, stop-motion animations and documentary projects. For more information, visit http://www.92y.org/Class/Vidcode-11-14-yrs.aspx.
A new city pilot program called NYC Summer STEM 2015 will engage 1,200 kids in 2nd, 7th and 10th grades at sites across the city for four weeks, New York City Schools Chancellor Carmen Farina announced last week. The curriculum will be provided by the Polytechnic School of Engineering at New York University, and will focus on robotics, engineering, technology and the science behind urban development and sustainability for the older students. Elementary school students will receive hands-on learning with topics like infrastructure, energy and transportation. The $2 million project is being funded by Microsoft and the Fund for Public Schools. Students can register for the program by borough when the application is available. For more information, visit the NYC schools page.
California
“Cool Technology,” “Eco Explorations,” “Geology Camp” and “Lego Engineering” are courses still available for kids entering 4th and 5th grades through the Lawrence Hall of Science at the University of California, Berkeley. Groups of 16 campers each can learn how to build technological contraptions (kids will find out how to attach bananas to computers to make music), explore local aquatic and terrestrial environments, study rock crystals and learn the mechanics behind Legos. The workshops start at various times during the summer and last two to three weeks. Several sessions across all grade levels are still open and can be found here.
Kids who are clamoring for an authentic college experience can still apply for some open slots at Digital Media Academy’s technology summer camps at Stanford University, which run from June 22 to Aug. 21. Students ages 12 and up can take sessions on campus, eat in the dining halls and stay in a dorm overnight. Open programs include Adventures in 3D Game Design with Minecraft, Photography & Graphic Design, and Robotics & Programming. For more information, visit https://www.digitalmediaacademy.org/northern-california-tech-camps/stanford-university/.
Health Datapalooza: Data innovation five years later
It was nearly five years ago that an enlightened group of leaders first gathered at an obscure forum in Washington, D.C., hosted by the Institute of Medicine and the Department of Health and Human Services, to explore how big data could improve health care in America.
Propelled by an energetic new chief technology officer at HHS named Todd Park, the forum brought together innovators from all walks of the health care and IT communities. Their vision was to create a platform for harnessing health care data — analogous to how an entire industry arose around a steady government supply of National Weather Service data.
That initial forum, held June 2, 2010, rapidly snowballed into what today is better known as the Health Datapalooza, which is expected to bring more than 2,000 technology experts, entrepreneurs, policymakers and health care system leaders together in Washington at the end of the month.
It also gave rise to the Health Data Consortium, a public-private partnership which now manages the Health Datapalooza and works to promote the availability and innovative use of open, machine-readable health data.
And it later became a model for the White House — which recruited Park to serve as U.S. chief technology officer in 2012 — to showcase how freely available government data might fuel innovation and jobs in other fields including energy, public safety and education.
The challenge then, and now, for those working to unlock the value of all that data — especially government data — has been in trying to liberate oceans of data from the systems, applications and processes in which they were created, and to allow them to become the raw material for other uses. But there also remains a slew of legal and policy hurdles that must be addressed if the economic and social benefits of data innovation are to be fully realized in the U.S.
A new report released this week by the Center for Data Innovation puts those hurdles into clearer perspective. In it, authors Daniel Castro and Joshua New lay out a dozen bipartisan recommendations for ways Congress might accelerate data innovation in the U.S.
These recommendations offer a constructive snapshot of the issues facing those who have a stake in the data economy – which is pretty much everyone these days, but in particular those who produce the data on which much of the U.S. economy depends.
A good example, and one that is especially relevant to those gathering at this year’s Health Datapalooza, is the need to adopt universal patient identifiers for health care.
While hospitals and health care providers have generally embraced electronic health records, they have yet to fully benefit from them because there is still not an accurate and efficient way to match patients to their records.
As Castro and New make clear, “Without a reliable patient matching system, providers must spend time manually matching patients; patients can be erroneously matched to the wrong records, and some records belonging to a patient can be overlooked. Even a single organization with multiple computer systems may experience this problem where misidentification rates range from two to twenty percent.”
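The matching failure Castro and New describe is easy to reproduce. The sketch below is a toy illustration, with invented records and a deliberately naive match key rather than any real system's algorithm, of how the same patient can slip past record linkage:

```python
# Toy records, invented for illustration; not any real system's schema.
def normalize(record):
    """Crude match key: lowercased, de-hyphenated name plus birth date."""
    name = " ".join(record["name"].lower().replace("-", " ").split())
    return (name, record["dob"])

hospital_a = [{"name": "Maria Garcia-Lopez", "dob": "1980-03-14"}]
hospital_b = [
    {"name": "Maria Garcia Lopez", "dob": "1980-03-14"},  # same person
    {"name": "Maria Garcia", "dob": "1980-03-14"},        # also her, surname dropped
]

keys_a = {normalize(r) for r in hospital_a}
matches = [r for r in hospital_b if normalize(r) in keys_a]

# Normalization rescues the hyphen variant, but the record with a
# dropped surname is overlooked entirely, the failure mode the report
# describes. Without a universal identifier, every such rule trades
# missed matches against false ones.
print(len(matches))  # prints 1, even though both records are hers
```

Tightening the key catches more true matches but also starts merging different patients, which is why the authors argue a unique identifier, not cleverer matching, is the fix.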
The Department of Health and Human Services “recognized the ‘urgent and critical’ need to create a system of unique patient identifiers almost two decades ago, and this need has only grown more severe since then,” they noted.
While the original language of the Health Insurance Portability and Accountability Act, or HIPAA, identified the need to create a national universal patient identifier system, subsequent legislation blocked funding for enacting such a program. Congress can and should change that.
Similarly, educators, school administrators, researchers, families and prospective college students could gain enormous value from education data.
Unfortunately, many of these data sets are fragmented, vary by state, are often not publicly accessible, or aren’t interoperable. For instance, while 43 states link K-12 education data with early childhood data and 44 states link K-12 data with postsecondary data, only 19 states link K-12 data with workforce data and only 18 states link all four categories of data, according to Castro and New.
Once again, the creation of a reliable system to ensure that a “de-identified student record could be tracked over time” could provide much greater insight when “analyzing things like the impact of early-childhood education on the workforce,” they wrote.
Castro and New offer up a number of other recommendations — improving the management of geospatial data, clarifying financial regulatory data requirements, taking advantage of new data technologies to modernize supply chains, and letting consumers access their energy data from smart meters. Their recommendations are worth reading and advancing.
Innovators, of course, aren’t going to wait for Congress or federal policymakers, nor should they.
That was the genius behind the first Health Datapalooza. It served as an innovative way to circumvent the bureaucracy and inertia that have kept data captive, working as a catalyst that brought data users and producers together in the same room with business-minded developers to reimagine how data can be put to smarter use.
Keeping faith with America’s veterans
Putting the 1 percent of Americans who serve the nation in uniform back to work after a decade of war and sacrifice isn’t just a moral imperative; it also happens to be good for the country’s tech industry.
With more than 18 million science, technology, engineering and mathematics jobs open and few qualified candidates to fill them, America’s tech industry faces a talent crisis like no other time in its history. Likewise, more than 1 million veterans will make the difficult transition from military to civilian life during the next five years. For many, it will be a move from a life of dedication and purpose to one of uncertainty in a frail private economy known more for eating its young than for developing them.
Retired Marine Corps Maj. Gen. Chris Cortez sees an opportunity in the convergence of these two socioeconomic forces.
“Every day industry is pulling their hair out trying to find qualified people to fill those roles,” said Cortez, vice president of the newly created military affairs division at Microsoft. “And now you have this pool of talented service members who are leaving the military and who are going to want to find a job and nobody’s really going after them. But why not? Look at the experience they gain, look at the leadership skills and the tough situations they have to work their way through. This is a pool of talent that can really help America.”
Cortez’s commitment to his fellow veterans isn’t lost on Microsoft. The company has been an active participant in multiple programs designed to put veterans to work. And when Congress passed the VOW to Hire Veterans Act in 2011 — allowing service members to begin the private employment process while still on active duty — Microsoft stepped up and created the Microsoft Software and Systems Academy. The 16-week intensive course provides those who are accepted training in a multitude of IT disciplines and guarantees graduates an interview with the company.
“They don’t have to have a technology background, but they do have to have some aptitude. And we personally interview them,” Cortez said in an interview at FedScoop headquarters in Washington, D.C. “We select those we truly believe can make it through this. This is not easy. It’s not a give-away program.”
Under Cortez’s leadership, the MSSA program has become the crown jewel of Microsoft’s military affairs division, which he stood up last July. What started 18 months ago as a pilot project at Joint Base Lewis-McChord in Washington state has expanded to Fort Hood, Texas, and Camp Pendleton in California.
To date, about 220 veterans have graduated from the course. “The overwhelming majority of those 220 have been hired into an IT company,” Cortez said. “We don’t and we don’t pretend to be able to hire them all. We guarantee you an interview, but if we don’t hire you, we want you to get a job.”
So far, Microsoft has hired about a third of the graduates, a rate that tracks with industry interview-to-hire ratios. Microsoft has also enlisted its enterprise partner companies in the effort, many of which have agreed to review and interview candidates for open positions.
But even Apple — Microsoft’s historic competitor — has hired a graduate of the MSSA program. “And we’re happy about it. It’s about the veterans,” Cortez said.
Army Special Forces Sgt. Bernard Bergan remembers his last year in the military as a stressful time for him and his wife. A tour in Afghanistan and the lengthy, unpredictable deployments that come with life in special operations put a great strain on his family — although they would never say it. As he was preparing to finish his six years as an IT specialist with the Army’s 1st Special Forces Group, an unexpected deployment to Korea injected even more uncertainty about how his post-military life would play out.
That’s when, in 2013, he discovered the MSSA program. It was still new, and Bergan understood there was some risk in dedicating his final 16 weeks of service to a training course. But because Microsoft was behind it, he felt the risk was worth taking, and he signed up.
The program required 12 to 18 hours per day. “It was almost like I was deployed but coming home every night,” Bergan recalled during an interview with FedScoop. “They have 16 weeks to train you and prepare you for a big interview at Microsoft. It’s very intense. It’s not for the faint of heart.”
And while the interview for a job at Microsoft is guaranteed, nothing else is. “That guaranteed interview is like a light at the end of the tunnel, but it starts another tunnel,” Bergan said. “There’s no real preparation for that outside of just digging deep and knowing that as a military [veteran] you’ve been through so many tough things in service to your country that this is just another tough thing you have to go through if this is the goal you want to pursue.”
Veterans like Bergan bring skills that can only be learned through military service. “The military — start to finish — is a leadership program,” he said. “You learn to trust those around you but you also learn that they are completely dependent on what you bring to the table as well. When companies hire veterans they tap into people who are always willing to step up and always willing to lead.”
Cortez called Bergan “a perfect example” of what can be accomplished through the MSSA program, noting that Bergan has already been singled out at Microsoft as a software tester and has been approached by other divisions. “You give these young people a foundational skill set and you allow them to show what they’re capable of in the company and bingo!”
MSSA graduates go on to become system administrators, cloud administrators, database administrators, software developers and software testers, according to Cortez. And plans are to expand the program.
“We’re going to grow it. When we get to our full potential, we’ll probably be graduating over a thousand per year,” Cortez said. “Our goal is that every time we have a graduation that 100 percent get hired by us and other IT companies. Let’s not just get them a job. Let’s get them a good job.”
Cortez is quick to point out that it will take more than just Microsoft to make a significant dent in the converging crises of STEM worker shortages and veteran employment. “There’s no way we can do everything ourselves, but this is the right thing to do and it’s the right time to do it,” he said. “Somebody asked me, ‘What’s your greatest fear?’ There is none. It’s goodness.”
Federal data collection needs a purpose, federal leaders say
Despite the hype surrounding big data, federal data leaders offered this warning: Just because it’s there doesn’t mean you have to collect it.
Data gathering efforts need a purpose, they said during a FedInsider-hosted event Tuesday. Niall Brennan, chief data officer of the Centers for Medicare and Medicaid Services, told audience members that CMS has seen success recently improving the quality of patient care by establishing “tangible outcomes.”
The world of data science, Brennan said, has become flooded with meaningless buzzwords, almost creating a fad with “way too much trust placed in magical, out-of-the-box plug-in solutions.”
The Federal Communications Commission’s Tony Summerlin, who advises Chief Information Officer David Bray, takes an equally pragmatic, and perhaps more cynical, approach to federal big data. “Why are you collecting this in the first place?” Summerlin asked. “What you’re collecting it for is so important.”
He added, “I think sometimes we think that just putting all this stuff together somehow creates value for someone, and I’m not so sure about that.”
Viewing massive amounts of data, Summerlin reasoned, can be good. It can provide valuable insights if it’s analyzed correctly. But his worry is that viewing it with vague intentions could lead to poor results — “garbage in, garbage out.”
“I can correlate data and make anything true … and I think that’s what we need to be careful of,” he said.
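Summerlin's point about correlation is demonstrable with nothing but noise. The sketch below, a hedged illustration using pure random data and standard-library Python, searches 50 meaningless series for their best-correlated pair:

```python
import random

random.seed(1)

def pearson(x, y):
    """Plain Pearson correlation coefficient, standard library only."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# 50 completely random "metrics", 10 observations each: no real signal.
series = [[random.random() for _ in range(10)] for _ in range(50)]

# Scan all 1,225 pairs for the strongest correlation.
best = max(
    abs(pearson(series[i], series[j]))
    for i in range(len(series))
    for j in range(i + 1, len(series))
)

# Among this much noise, the winning pair usually looks impressively
# correlated, and it means nothing: correlate enough data and you can
# "make anything true."
print(round(best, 2))
```

The same trap applies to real data sets: with enough variables, some pair will always correlate strongly by chance, which is why collection needs a stated purpose up front.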
There’s also concern on the privacy side, in aimlessly collecting and publishing large volumes of data. Linda Powell, chief data officer for the Consumer Financial Protection Bureau, said many agencies want to rush into analyzing data without accounting for the fundamentals, like privacy and security.
“I spend a tremendous amount of time ensuring privacy, the maintenance of the privacy, for the data that we get,” Powell said. Scrubbing data of personally identifiable information, or PII, is a tedious and incomplete science, she said. “It’s not possible to completely scrub data so that it couldn’t ever be re-identified,” but it’s something she said agencies must work toward.
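A toy example of why scrubbing falls short: even after direct identifiers are dropped, the remaining quasi-identifiers can often be joined with outside data to re-identify a person. The field names below are invented for illustration, not taken from any agency's schema:

```python
# Invented field names and values, purely illustrative.
DIRECT_IDENTIFIERS = {"name", "ssn", "email"}

def scrub(record):
    """Drop direct identifiers; keep everything else for analysis."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {
    "name": "J. Doe",
    "ssn": "123-45-6789",
    "email": "jd@example.com",
    "zip": "20500",
    "birth_date": "1970-01-01",
    "sex": "F",
}

released = scrub(record)
print(sorted(released))  # prints ['birth_date', 'sex', 'zip']

# The released triple (ZIP code, birth date, sex) is unique for a large
# share of the population, so joining it with an outside data set, such
# as a voter roll, can re-identify the person. Real de-identification
# must also generalize or suppress these quasi-identifiers.
```

This is the gap between removing PII and preventing re-identification that Powell describes: the latter requires coarsening the very fields that make the data analytically useful.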
Summerlin was a lot less confident in the government’s ability to provide privacy.
“I believe in privacy protection and so forth, but the government is inherently really crappy at it,” he said. “I admire the government for trying their best to keep PII and information out of the realm. I hope we get better at it. But it’s kind of a bifurcation when part of the government is trying to make sure you have no anonymity and the other part of the government is trying to make sure you’re completely protected.”
De-identification makes extracting value from federal data sets even more difficult, since most of what would be considered immediately meaningful information is redacted. And it’s hard to do without making harmful mistakes, Summerlin said.
There is value in publishing data, “there’s no question about it,” he said. “But publishing, just giving people access to data that is not PII, that can’t hurt someone, is not easy.”
Nevertheless, Brennan said it can be surprising who might consider released data useful.
“All of the sudden somebody takes it, matches it with someone else and builds something cool,” Brennan said. “So you almost have to err on the side of openness.”
Looking forward, he said the billions of data points his agency collects will help it improve health care across the country.
“We view ourselves and our data as being a key accelerant in health care reform, health care transformation,” Brennan said. “Readmission rates are declining, hospital-acquired conditions are declining, incremental debts are declining. I always like to say, ‘We’re not declaring victory — we’re declaring progress.’ We know there’s a long way to go, but we equally know data is going to play a key role in that evolution.”
Megan Smith pushes STEM at Inventors Hall of Fame induction
U.S. Chief Technology Officer Megan Smith took the stage at the National Inventors Hall of Fame induction ceremony Tuesday night, where she highlighted the importance of encouraging kids’ interest in the sciences.
“I have high hopes for all of our children around the world — that we will help them come into the greatness that it is to feel what science and math and engineering really are, when … you get to have that amazing moment when you figure something out,” she said.
The annual gala honored the work of 14 inventors who were added this year to the National Inventors Hall of Fame, which is based at the U.S. Patent and Trademark Office’s Northern Virginia headquarters.
Communications staff for the patent office told FedScoop that Smith’s appearance at the event, held in the courtyard between the Smithsonian American Art Museum and the National Portrait Gallery, shows the White House’s strong support of the agency.
The gala was part of a three-day celebration of the award winners, which wrapped up Wednesday with a panel at Smithsonian’s National Museum of American History. Awardees included Nobel Prize winner Shuji Nakamura, who invented the blue light-emitting diode, which allowed researchers to produce energy-saving white LED light; Jaap Haartsen, who invented Bluetooth wireless technology; and George Alcorn, who invented the imaging X-ray spectrometer, which is used in space research.
During her brief address at the start of the program, Smith said she had several connections to the event: She attended school with patent office Director Michelle Lee and had encountered two of the honorees — Paul MacCready, known as “the father of human-powered flight,” and Stanford Ovshinsky, inventor of the nickel-metal hydride battery — while she competed in a solar car race across Australia as a student.
“It’s through these amazing inventors and the inspiration that they give us that we can see ourselves and the great possibilities,” Smith said.
Smith, a former Google exec and the first woman to serve as U.S. CTO, has been outspoken about the importance of encouraging children — particularly girls — to pursue science, technology, engineering and math, or STEM, careers.
She told event attendees that, earlier this year, President Barack Obama met with a group of kindergarten and first-grade girls who had competed in the FIRST Lego League robotics competition. When Obama asked how the kids came up with their idea, one girl said, “We had a brainstorming session … Have you ever had a brainstorm session yourself?”
“It’s wonderful,” Smith said. “Teaching those habits is really important.”
Pentagon refocuses on cyber basics, threat data analysis
The National Institute of Standards and Technology is close to finalizing a new set of guidelines governing the protection of controlled unclassified information, or CUI, that will form the basis of a new acquisition regulations clause that federal contractors will be required to follow.
The public comment period for NIST Special Publication 800-171, Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations, closed Tuesday, and a final document is expected to be released next month. But Richard Hale, the Defense Department’s deputy chief information officer for cybersecurity, said DOD plans to use the new guidelines to rewrite the safeguarding clause contained in the Defense Acquisition Regulations System.
“It’s where we tried to write down the guidebook for doing the cybersecurity basics for government and for industry,” Hale said. “And then what we’re going to start doing in the government is point to this thing in contracts. So we’re going to say, ‘Hey, if you handle controlled unclassified information, this is what you have to do.'”
The effort to define and streamline unclassified data under the CUI moniker — formerly referred to as Sensitive But Unclassified information — has been a work in progress for several years. CUI is meant to cover unclassified data that still requires protection from public disclosure, such as technical defense information, engineering data, specifications or personally identifiable information. The National Archives and Records Administration is the government’s executive agent for CUI and released a draft rule last week for standardizing the definition and treatment of CUI across government.
There are about 150 controlled unclassified systems throughout the federal government that will be placed under the new guidelines, according to Hale.
“This idea of having some baseline that we all have to do I think is going to get stronger and stronger over time,” Hale said. “It’s much better than the list of controls that we used in our last safeguarding clause.”

Department of Defense Deputy CIO for Cybersecurity Richard Hale speaking May 12 at the BDNA IT Security Crisis conference. (FedScoop)
Beyond the basics
For the Defense Department, “mission appropriate cybersecurity” remains the central aim of the department’s risk management framework. Officials are currently conducting a prioritization effort to determine how much and where the department invests its cybersecurity funding. Those decisions will be tracked by the Office of the Deputy Secretary of Defense, Hale said.
“We’re writing guidebooks for program managers on how the risk management framework fits into the acquisition process,” he said. Those guidebooks will cover not only government program offices and acquisition professionals but also contractors who are writing software for mission-critical weapons systems.
But on the operational front, the focus is on using big data to defeat cyber adversaries, Hale said.
“We’re trying to use rich data about bad guys to look at some of the infrastructure defenses and decide whether or not we’re spending our money in the right place,” he said. Agencies now have ample data available and the analytical tools that can tell them how their security systems are performing, and they can quickly map new attacks against that data to determine what can be blocked. “That’s a big change over how things worked even just a few years ago,” Hale said. “You actually now have bad guy capability data that you can use to make decisions about what to buy or how to structure your operation.”
According to Hale, DOD hasn’t always made good use of the data it collected on attackers. “We have ever-better data about bad guys and we have ever-better data that’s unclassified about bad guys. So this data can be very helpful in actually doing cybersecurity.” Hackers tend to get into networks in a particular way that can be profiled, he said. “They do a [domain name system] lookup in a particular way, they contact their command and control network in a particular way, they dump more malware in a particular way, they escalate privileges in a particular way and they move laterally in a particular way. So you can start to characterize all of these particular groups … and then you can kind of map your defenses against this ever-richer set of data about adversaries.”
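The mapping Hale describes — characterizing how each adversary group operates, then checking which of those behaviors current defenses actually cover — can be sketched in a few lines. This is a hypothetical illustration only; all group names, technique labels and defense names below are invented, not drawn from DOD data.

```python
# Hypothetical sketch of mapping defenses against adversary profiles.
# Each (fictional) group is characterized by the techniques it has been
# observed using; each deployed defense covers some set of techniques.

adversary_profiles = {
    "group_a": {"dns_tunneling", "c2_beaconing", "privilege_escalation"},
    "group_b": {"c2_beaconing", "lateral_movement", "malware_drop"},
}

defense_coverage = {
    "dns_monitor": {"dns_tunneling"},
    "egress_filter": {"c2_beaconing"},
}

def coverage_gaps(profiles, defenses):
    """Return observed adversary techniques no current defense addresses."""
    observed = set().union(*profiles.values())
    covered = set().union(*defenses.values())
    return observed - covered

print(sorted(coverage_gaps(adversary_profiles, defense_coverage)))
# → ['lateral_movement', 'malware_drop', 'privilege_escalation']
```

The gaps surfaced this way are the kind of evidence Hale says can drive decisions "about what to buy or how to structure your operation": spending goes to the uncovered techniques rather than to more of what is already blocked.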
The Defense Department analyzed all its attack data, as well as data from other organizations, and found that in almost every case, the attacker exploited a preventable security flaw. “The threat data has confirmed that even when we have a really sophisticated adversary and we may actually see very sophisticated malware, we’re not seeing attacks against sophisticated vulnerabilities,” Hale said. “People are still using the same old stuff.”
Why Congress should back a stopgap commercial drone bill
Warning: Due to a slow-moving regulatory process, the U.S. commercial drone industry is rapidly losing altitude while other countries soar ahead.
The Federal Aviation Administration has begun to develop the rules necessary to integrate small unmanned systems into U.S. airspace, but it has already conceded that it will miss its September 2015 deadline, and now the Government Accountability Office predicts that we cannot expect to completely integrate commercial drones into America’s skies until 2017 or later. In the meantime, the lack of concrete rules has put U.S. companies at a distinct disadvantage compared to their foreign competitors, so some have started moving their operations abroad.
That is bad news for U.S. leadership in an important field of cutting-edge technology. The Commercial UAS Modernization Act, sponsored by Sen. Cory Booker, D-N.J., would address this issue by creating a useful stopgap that would allow commercial drone operations until the FAA delivers a final rule.
A number of negative effects have resulted from the FAA’s slow and deliberate rulemaking pace. First and foremost is the loss of competitiveness in UAS technology to other countries with more permissive rules. It is telling that when compared to the relative size of its economy, Canada has 92 times as many commercial drone operators as the United States. Second, by limiting commercial UAS activity to special exemptions, the FAA has significantly limited the testing of this technology. For example, Amazon got approval to test its drones in Canada in three weeks, while the FAA took six months to approve an obsolete model. While the FAA quickly approved the updated model when faced with criticism, the current set of restrictions means companies cannot rapidly tweak and improve their drones, a process that is integral for development of the technology.
If enacted, the Commercial UAS Modernization Act would help mitigate these problems. The legislation would largely enact the rules the FAA laid out in its initial proposal, including prohibitions on operations beyond line of sight, operations above 500 feet, and operations at night. While these rules will hopefully change to allow for more flexibility, establishing safe rules now rather than later will allow the U.S. drone industry to grow.
In addition, the act contains an important research and development component. It requires the FAA to work with the William J. Hughes Technical Center — a U.S. aviation test facility — to run tests and collect data on a variety of different UAS operations that would not be allowed by the FAA’s proposed rules, such as detect-and-avoid systems, autonomous systems, and beyond-line-of-sight operations. This program will further expedite commercial drone integration of these prohibited activities by allowing the FAA to identify safety standards and develop milestones for these operations. The act would also require the FAA to partner with NASA to research and develop an air traffic control system for UAS flying under 1,200 feet.
By creating a safe approval system now and allowing for commercial UAS testing, Congress would spur the continued development of the commercial UAS industry while more flexible rules are being developed. Opening the door to commercial use of this technology will also allow drone makers to continue to innovate and test their products, preserving competitiveness in the U.S. drone industry during the interim. Furthermore, this forward-looking legislation would actually speed along the regulatory process by improving research and development for UAS operations where the FAA needs additional data to rule on their safety.
It is important to note that the FAA is open to changes, as it has publicly stated its commitment to rethinking its hardline stance on operations beyond line of sight, which were prohibited in the original proposed rules (and the Commercial UAS Modernization Act). The FAA has also signaled a shift in its stance towards currently prohibited operations, such as package delivery. But while the FAA continues to deliberate, Congress should pass the Commercial UAS Modernization Act and allow limited commercial use of drones to promote U.S. competitiveness and innovation in drone technology.
Alan McQuinn is a research assistant at the Information Technology and Innovation Foundation.
3 civic hacking projects that aim to open Congress
Not only can open government be a benefit to the public — it can also help legislators do their jobs better.
That was the idea behind #Hack4Congress, a three-city hackathon held over the past three months, during which people from all walks of life came together to create projects that could be used by people working for or with House lawmakers.
The hackathon was co-produced by the OpenGov Foundation and Harvard University’s Ash Center for Democratic Governance and Innovation. The winning project from each city was demoed in front of Reps. Darrell Issa, R-Calif.; Susan Brooks, R-Ind.; and Greg Walden, R-Ore., as well as a number of congressional staffers.
Seamus Kraft, the director of the OpenGov Foundation and a former staffer himself, called the projects “stunning by a number of measures.”
“I’ve been that staffer banging my head against the desk, saying, ‘No one knows what I’m doing,’ ‘No one knows what the process is,’ and ‘No one is there to help me,’ and it’s incredibly powerless,” Kraft told FedScoop. The projects “are directly relevant to real problems that add up over time which have made Congress less efficient, less effective and less user-friendly.”
Kraft wasn’t the only former staffer touting the benefits of technology Tuesday. Taylor Woods, who has held staff roles in various federal and state offices, put together a platform to facilitate more productive meetings between congressional members and citizens.
The platform, “CongressConnect,” was born out of Woods’ frustration with citizens who would come to meetings unprepared or uninformed. Through CongressConnect, constituents can not only schedule meetings but also access tutorials to prepare for them and connect with other people who share their advocacy goals.
“It captures the essence of democracy, helping to engage, educate and empower citizens to be more active in the legislative process,” said Jessie Landerman, a public policy student at Harvard’s Kennedy School who worked on the platform.
The other two demos focused on empowering lawmakers with readily available data. CDash, a data dashboard created by a team based in San Francisco, gives congressional staffers the ability to quickly comb public data sets, explore metrics that matter to their constituents and reduce dependence on lobbyists as a source of information that influences legislation.
“We want to help members of Congress go into those meetings and have their staff fully prepared to brief them, or make the same queries and come up with their own questions in order to have conversations about ‘what do these data mean?’ so they can pay attention to what matters,” CDash project manager Katie Wing said.
Another platform, dubbed CoalitionBuilder, allows members of Congress and their staff to better track and discover which members to reach out to for collaboration on future legislation. While platforms like Politico Pro and Bloomberg Government offer this ability, they often come with a high price point that drives some advocacy shops and nongovernmental organizations away.
CoalitionBuilder tracks voting records, co-sponsorships and committee assignments using free, open data sets. It also allows users to filter by party, caucus or committee, and it plans to add floor speeches and co-sponsorships to the filter list.
Prior to the demos, Issa called the event “critical” to helping lawmakers and the American public figure out what’s going on in the country and how they can better influence legislation.
“The difference between our founding fathers and today is there is so much more information and it’s so hard to even sort through the information the executive and legislative branches want you to see,” Issa said. “The goal is for the American people, for 300 million-plus people, to begin discovering, and then telling us what they see that was overlooked by the handful of people who run the government today.”
Kraft said the hackathons show that a notable transformation is happening in the House and is catching on in other parts of Capitol Hill.
“Four years ago when I was here, something like this never could have happened. To get to the point where we are now means we are making measurable progress,” Kraft told FedScoop. “The next big step is ‘porting the awesome’ that we’ve seen on the House side. The big interest has been from the Senate side, because they know they are next and they know they have a lot of work to do.”