Mr. FedRAMP: Understand your data before you move to cloud

One of the biggest challenges for federal agencies looking to move to the cloud is figuring out what types of data they have and where that data can reside, the head of the Federal Risk and Authorization Management Program for cloud computing services said Tuesday.

At the Red Hat Government Symposium, FedRAMP Director Matt Goodrich said even as agencies are mixing in cloud instances with their legacy systems, they are struggling with what data they have and where it’s appropriate to put that data.

“The reason why you would do hybrid cloud is because you have different types of data,” Goodrich said. “I think the government either over-classifies a lot of the data or doesn’t really take the time to actually look at the data and actually understand how to do hybrid environments. That’s not exactly been a best practice. We didn’t know how many data centers we had five years ago. If we don’t know what type of data centers we have, how do we know what data we have?”

Goodrich added that it’s important to get a handle on that data, because while each cloud provider offers essentially the same service, the small differences in how each company handles data could be extremely important to each agency’s individual mission.
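
That point about matching data types to environments can be made concrete. Below is a minimal sketch, in Python, of the kind of placement rule an agency might apply once it understands its data; the sensitivity labels and placement targets are hypothetical, chosen only to illustrate the idea, not drawn from any agency's actual policy.

```python
# Minimal sketch: route data to an environment based on its sensitivity.
# The labels and placement rules below are hypothetical, for illustration only.

from enum import Enum


class Sensitivity(Enum):
    PUBLIC = 1       # e.g., already-published datasets
    CONTROLLED = 2   # e.g., data a FedRAMP-authorized cloud may hold
    RESTRICTED = 3   # e.g., data the agency keeps on premises


def placement(label: Sensitivity) -> str:
    """Return where a dataset of the given sensitivity may reside."""
    if label is Sensitivity.PUBLIC:
        return "public-cloud"
    if label is Sensitivity.CONTROLLED:
        return "fedramp-authorized-cloud"
    return "on-premises"


if __name__ == "__main__":
    for label in Sensitivity:
        print(label.name, "->", placement(label))
```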

José Simonelli, Red Hat’s senior cloud domain architect, said it’s important to look past the data in order to make hybrid instances adaptable to emerging technologies.

“The first step is really to identify what you are up against, what are all these tools, what are the existing processes so that you can start taking it into chunks, so the same processes are happening on Amazon Web Services or VMware or Azure or wherever,” Simonelli said. “We don’t know what’s the next Azure, what’s the next OpenStack, and you have to keep that framework open so you can take those in and it just goes with your system.”
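
Simonelli’s point about keeping the framework open is essentially an argument for a provider-agnostic interface: define the process once and let each backend plug in behind it. A minimal Python sketch of that pattern follows; the class and method names are hypothetical, and in practice infrastructure-as-code tooling fills this role.

```python
# Minimal sketch of keeping deployment logic provider-agnostic, so the same
# process can target AWS, Azure, VMware, OpenStack, or whatever comes next.
# Class and method names are hypothetical, for illustration only.

from abc import ABC, abstractmethod


class CloudProvider(ABC):
    @abstractmethod
    def provision(self, cores: int, storage_gb: int) -> str:
        """Provision capacity and return an identifier for the environment."""


class AWSProvider(CloudProvider):
    def provision(self, cores: int, storage_gb: int) -> str:
        return f"aws-env({cores} cores, {storage_gb} GB)"


class OpenStackProvider(CloudProvider):
    def provision(self, cores: int, storage_gb: int) -> str:
        return f"openstack-env({cores} cores, {storage_gb} GB)"


def deploy(provider: CloudProvider, cores: int, storage_gb: int) -> str:
    # The deployment process stays the same regardless of the backend.
    return provider.provision(cores, storage_gb)


if __name__ == "__main__":
    print(deploy(AWSProvider(), 64, 500))
    print(deploy(OpenStackProvider(), 64, 500))
```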

At the event, some agency IT experts discussed how they’ve worked through these growing pains. John Quigley, a Linux systems engineer at the Energy Department’s Oak Ridge National Laboratory, said understanding how researchers use the lab’s data has helped his team procure what it needs over the course of a fiscal year.

“The more interesting reason we look at hybrid cloud is we have sort of a demand prediction problem,” he said. “Right after you buy the compute and storage, you will get a request for 1,000 cores. It’s hard to give [researchers] what they need when they need it. Hybrid cloud is like a force multiplier. If we can burst into AWS or another provider and give somebody cores and storage when they need it, that’s awesome. And that’s something, until this point, that’s been really hard to do.”
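
Quigley’s “force multiplier” is the classic cloud-bursting pattern: serve requests from the compute already purchased, and spill the overflow to a public provider only when demand exceeds it. The short sketch below illustrates that decision; the capacity figure and the AWS target are assumptions for illustration, not a description of Oak Ridge’s actual setup.

```python
# Minimal sketch of a cloud-burst decision: satisfy a request from local
# capacity when possible, otherwise spill the overflow into a public cloud.
# The capacity figure and provider choice are hypothetical.

LOCAL_CORES_FREE = 256  # assumed free on-premises capacity


def schedule(requested_cores: int) -> str:
    """Decide where a compute request should run."""
    if requested_cores <= LOCAL_CORES_FREE:
        return f"run {requested_cores} cores on-premises"
    # Demand exceeds what was purchased this fiscal year: burst to a provider.
    overflow = requested_cores - LOCAL_CORES_FREE
    return (f"run {LOCAL_CORES_FREE} cores on-premises, "
            f"burst {overflow} cores to AWS")


if __name__ == "__main__":
    print(schedule(100))    # fits locally
    print(schedule(1000))   # the kind of request that forces a burst
```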

Keith Trout, a senior tech adviser with the Social Security Administration’s Office of Infrastructure and Configuration, said another challenge has been pushing the agency past its risk aversion. Whether because it holds highly sensitive data or must justify the taxpayer money already spent on legacy data centers, Trout said it has been a tough row to hoe to get buy-in on hybrid cloud.

“We have sensitive information on every American born since 1936,” Trout said. “Up until a year ago, we would have said, ‘Never. We’re not putting anything out there.’ That is the number one goal — to safeguard all of the information that we have. We are finding ways to secure [the cloud] where we are very comfortable with it.”

But even with top-level buy-in, Trout said his work is just beginning.

“We’ve got work to do. I think everyone recognizes that across the board. In all agencies, there is work to be done,” he said.

Written by Greg Otto

Greg Otto is Editor-in-Chief of CyberScoop, overseeing all editorial content for the website. Greg has led cybersecurity coverage that has won various awards, including accolades from the Society of Professional Journalists and the American Society of Business Publication Editors. Prior to joining Scoop News Group, Greg worked for the Washington Business Journal, U.S. News & World Report and WTOP Radio. He has a degree in broadcast journalism from Temple University.
