Three strategies for streamlining the path toward data center consolidation
The federal government has been exploring data center consolidation for several years, but the results have been mixed. Some smaller agencies with limited numbers of applications, including the Small Business Administration, have had great success with their consolidation efforts, while larger agencies with more complex IT infrastructures have struggled.
Acknowledging these struggles, the General Services Administration (GSA) has issued a guide for data center consolidation. One of the steps – “Streamlining the Environment” – calls for agency CIOs and project management teams to “review and simplify the overall IT environment to facilitate the move.”
But how do federal administrators simplify and streamline networks that have grown enormously complex, especially in larger agencies? How can they ensure a continued and unerring focus on sound cybersecurity practices as they engage in their consolidation efforts?
Here are three strategies that federal IT professionals may want to consider as they address those questions.
Start with an application rationalization process
Any meaningful attempt at data center consolidation and cloud migration should start with a careful rationalization of legacy applications. Chances are, many of those applications can be replaced by lightweight commercial Software-as-a-Service (SaaS) apps that can effectively meet mission needs in a more efficient manner.
Email, collaboration and document sharing applications are natural places to start. For example, on-premises Microsoft Exchange email servers can be migrated to cloud-based SaaS solutions, eliminating unnecessary infrastructure. The Department of Defense (DoD) has taken this approach with the Air Force’s Microsoft Office 365 implementation, which has simplified the Air Force’s infrastructure and produced net savings.
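To make the rationalization step concrete, here is a minimal sketch of how such a pass might be scripted. The inventory fields and disposition rules are illustrative assumptions, not a prescribed GSA or DoD methodology.

APP_INVENTORY = [
    {"name": "Email", "commodity": True, "saas_available": True, "usage": "high"},
    {"name": "Case Management", "commodity": False, "saas_available": False, "usage": "high"},
    {"name": "Legacy Reporting", "commodity": False, "saas_available": False, "usage": "low"},
]

def disposition(app):
    """Map an application to a candidate disposition (illustrative rules only)."""
    if app["usage"] == "low":
        return "retire or consolidate"
    if app["commodity"] and app["saas_available"]:
        return "replace with SaaS"
    return "refactor for the cloud"

for app in APP_INVENTORY:
    print(f'{app["name"]}: {disposition(app)}')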
Existing applications should also be restructured (or “refactored”) to take advantage of the cloud’s ability to provide surge- and usage-based pricing. “Lifting and shifting” applications and buying enough CPU power to accommodate usage spikes is not cost-effective. Refactoring applications to consume compute capability based on actual user consumption will result in a lower total cost of ownership, since the agency will only be paying for actual application usage.
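As a back-of-the-envelope illustration with made-up rates and usage figures (not published cloud pricing), the gap between paying for peak capacity around the clock and paying for actual consumption might look like this:

# Illustrative comparison of "lift and shift" versus usage-based consumption.
HOURS_PER_MONTH = 730
PEAK_INSTANCES = 20          # capacity needed only during usage spikes
AVG_INSTANCES = 4            # average capacity actually consumed
RATE_PER_INSTANCE_HOUR = 0.10

# Lift and shift: pay for peak capacity 24/7.
lift_and_shift = PEAK_INSTANCES * HOURS_PER_MONTH * RATE_PER_INSTANCE_HOUR

# Refactored: capacity scales with demand, so cost tracks average usage.
usage_based = AVG_INSTANCES * HOURS_PER_MONTH * RATE_PER_INSTANCE_HOUR

print(f"Lift and shift: ${lift_and_shift:,.2f}/month")
print(f"Usage-based:    ${usage_based:,.2f}/month")
print(f"Savings:        {1 - usage_based / lift_and_shift:.0%}")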
Automate for better efficiency and cost savings
Multi-cloud environments can pose security and process challenges. Applications need to maintain uniform levels of security regardless of where they are deployed, and migration between environments should be easy to manage with minimal administrative burden.
Automation can address both of these issues. It allows workloads to move seamlessly among cloud, on-premises virtual and bare-metal environments while meeting an agency’s mission needs. Administrators can also automate the movement of security policies so those policies follow applications from one environment to another, ensuring applications maintain the necessary security levels regardless of where they reside.
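A minimal sketch of the idea follows, assuming a hypothetical policy schema and placeholder apply functions standing in for whatever orchestration or SDN tooling an agency actually runs.

# Illustrative only: the environments, policy fields and apply functions are
# hypothetical placeholders, not a specific product's API.
POLICY = {
    "app": "case-management",
    "allowed_ingress": [{"port": 443, "source": "agency-users"}],
    "encryption_at_rest": True,
}

def apply_on_prem(policy):
    # Stand-in for pushing ACLs/firewall rules to on-premises gear.
    print(f"[on-prem] enforcing {policy['allowed_ingress']} for {policy['app']}")

def apply_cloud(policy):
    # Stand-in for calling a cloud provider's security APIs.
    print(f"[cloud] enforcing {policy['allowed_ingress']} for {policy['app']}")

TARGETS = {"on-prem": apply_on_prem, "cloud": apply_cloud}

def migrate(app_policy, destination):
    """Move the workload, then re-apply the same policy in the new environment."""
    print(f"migrating {app_policy['app']} to {destination} ...")
    TARGETS[destination](app_policy)

migrate(POLICY, "cloud")
migrate(POLICY, "on-prem")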
Automation is also imperative for cost savings. Manual network and application management is time-consuming, inefficient and expensive, not to mention virtually impossible at the scale of larger agencies. Automating application migration, system and network deployment, and day-to-day administration can save agencies significant time and money and help them realize true value from their consolidation and modernization efforts.
Change established security policies for long-term benefits
While savings are important, no agency can afford to endanger national security just to save a few cents on compute costs. Fortunately, data center consolidation and application migration give agency teams a great opportunity to take advantage of the security benefits of the cloud. Handled well, they can end up with even more robust security than their on-premises solutions provide.
Cloud providers give agencies the ability to patch systems automatically and pull down security updates instantly. Virtual threat protection, firewalls and anti-malware solutions can be installed at the press of a button. Protracted procurement cycles disappear, and agencies can react more quickly to evolving and impending threats.
Software Defined Networking (SDN) can also significantly improve the security of the multi-cloud environment. SDN makes traditional network segmentation and filtering policies portable across bare-metal, virtualized and cloud environments. This ensures consistent security policy enforcement as applications are migrated between environments and yields significant operational savings.
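As a rough illustration, a vendor-neutral segmentation rule set might be defined once and rendered into environment-specific forms. The rule schema and output formats below are assumptions for the sketch, not any particular SDN controller’s API.

# Illustrative segmentation rules rendered for two environments.
SEGMENTATION_RULES = [
    {"from": "web-tier", "to": "app-tier", "port": 8443, "action": "allow"},
    {"from": "app-tier", "to": "db-tier", "port": 5432, "action": "allow"},
    {"from": "web-tier", "to": "db-tier", "port": "any", "action": "deny"},
]

def to_onprem_acl(rule):
    """Render a rule as a traditional ACL-style line."""
    return f"{rule['action']} {rule['from']} -> {rule['to']} port {rule['port']}"

def to_cloud_security_group(rule):
    """Render the same rule as a cloud security-group style entry."""
    return {"segment": rule["to"], "source": rule["from"],
            "port": rule["port"], "effect": rule["action"]}

for rule in SEGMENTATION_RULES:
    print(to_onprem_acl(rule))
    print(to_cloud_security_group(rule))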
This mindset goes against the grain of traditional security policies and requires agency teams to think differently about the ways they approach risk mitigation. If they are willing to do so, however, they will end up with a more responsive and adaptive security posture.
While changing minds is never easy, it is necessary if agencies are to enjoy the full breadth of benefits that an optimization and “cloud first” approach provides. Smaller agencies have already adapted; it is now time for larger agencies to follow their path, one that leads to greater economies of scale, simplified application and network management, and better security.
David Mihelcic is the federal chief technology and strategy officer for Juniper Networks, supporting the company’s federal sales, engineering and operations teams. David joined Juniper Networks in February 2017 following 18 years with the Defense Information Systems Agency, where he retired as CTO, a position he held for more than 12 years.