The General Services Administration has introduced a new category for its Federal Risk and Authorization Management Program to highlight cloud systems that have earned the FedRAMP Ready designation.
Shawn Kingsberry, chief information officer for the Recovery Accountability and Transparency Board, is leaving government for the private sector.
As the demand for data- and computing-intensive research grows, a new study found that the National Science Foundation is giving advanced computing less attention than it should and is falling behind in its ability to provide those resources to scientists.
Red Hat Software announced its Red Hat Cloud for Government platform, providing a host of tools to allow feds to use the cloud in a cost-efficient manner.
For Jeremy Wiltz, moving to the cloud has been all about trust. But the deputy assistant director for the FBI’s Information Services branch acknowledges the bureau will never be in a position where it can deploy the latest technologies and worry about working out the kinks later.
The United States Postal Service did not comply with its established standards on cloud computing, according to a report from the agency’s inspector general.
The author of “The Responsive City: Engaging Communities Through Data-Smart Governance” talks about government-technology partnerships.
Months after announcing his departure from the General Services Administration, Dave McClure is set to join cybersecurity provider Veris Group as chief strategist.
The projects, called Chameleon and CloudLab, are part of the CISE Research Infrastructure: Mid-Scale Infrastructure-NSFCloud program and are meant to be complementary to typically industry-driven cloud development, just like NSF’s involvement during the genesis of the Internet.
FCC Chief Information Officer David Bray recently sat down with FedScoop as part of the Cloud Innovation Heroes campaign, presented by Intel and Amazon Web Services, to talk about his agency’s adoption of cloud computing.
A new proof-of-concept funded by DARPA and led by scientists from AT&T, IBM and Applied Communication Sciences is turning terabit-scale cloud network interconnection into a sub-minute process by making that connection “elastic.”
Technocrat talks to Laura Ipsen, Microsoft’s corporate vice president of the worldwide public sector, about how the company is helping city governments tackle unique problems with tailored Microsoft CityNext services.
With complications performing forensics in the cloud in mind, the National Institute of Standards and Technology created a cloud computing forensic science working group to enumerate and explore the challenges distinct to the cloud. On Monday, NIST released a draft of the challenges discovered by the working group for public comment. While the draft briefly explores 65 issues NIST’s group found, working group co-chair Dr. Martin Herman, a senior advisor for forensics and IT at NIST, said the list is in no way exhaustive — just a first look at a very big problem.
Chris Niehaus, senior director of Microsoft’s Cloud Computing Program, sat down with FedScoop at the 2014 Microsoft Worldwide Partner Conference in Washington, D.C., to discuss current trends in public sector cloud computing.
Amir Capriles, the general manager of U.S. public sector for Microsoft Dynamics, joined FedScoop to discuss the launch of Microsoft Dynamics CRM for government. In the interview, Capriles talks about why Microsoft decided to bring its CRM platform to the public sector space and the impact it can have on the federal, state and local governments.
You should familiarize yourself with “fog computing,” “cloudlets” and “cyberforaging.” The future of the cloud is coming.
With the momentum of a few key new features, Microsoft’s Azure government cloud is a step closer to becoming a reality.
An RFI geared toward cloud vendors could eventually lead to the GSA being able to track cloud-buying trends across federal agencies.