GSA plans greenhouse gas disclosures, reduction targets for IT providers

The General Services Administration plans to make medium- and large-sized contractors on its huge new Alliant 2 governmentwide IT contract vehicle disclose their annual greenhouse gas emissions and set targets for reducing them, according to procurement documents posted Wednesday.
“It is GSA’s intent to require the … contractors to inventory and publicly disclose their operational [greenhouse gas, or] GHG emissions, set targets for reducing those emissions, and report progress toward meeting their targets. This will be an annual requirement,” the agency states.
In a proposed information collection requirement published in the Federal Register, the agency asks for feedback on the plan, saying that 40 percent of the vendors on the $50 billion contract already make such disclosures “in response to requests from their non-government customers, investors, insurers, and corporate sustainability policies.”
The requirement will apply to contractors on the unrestricted Alliant 2 contract, not the accompanying small business vehicle.
The agency says the move is in response to a presidential executive order signed in March last year, which required that the seven largest U.S. procurement agencies “take into consideration contractor GHG emissions and GHG management practices.”
“Public disclosure of GHG emissions and GHG reduction goals or targets has become standard practice in many industries,” the GSA notice states, adding “companies are increasingly asking their own suppliers about their GHG management practices … Performing a GHG inventory provides insight into operations and opportunities for energy and operational savings that can result in both environmental and financial benefits.”
The notice estimates the disclosure will take contractors 80 hours to prepare.
Big banks join UBS on blockchain plan
Three of the world’s largest banks have joined with UBS in its plan to use blockchain to document and secure interbank settlements, underlining the transformative potential of the new encrypted distributed ledger technology.
The new “Utility Settlement Coin,” or USC, uses the blockchain technology that underlies bitcoin, but USC is a digital cash equivalent of each of the major currencies used for interbank settlements like the dollar or euro — not a new decentralized digital currency.
The Financial Times reported Wednesday that Deutsche Bank, Santander and BNY Mellon, as well as the broker ICAP, have now signed on to UBS’ plans — first hatched last fall with London-based startup Clearmatics.
Blockchain is a technology that uses encryption and distributed computing power to create a constantly updated, cryptographically secure record of transactions — one distributed among all its participants.
Boosters see opportunities to cut red tape and back office operations by introducing a software-based (rather than human-operated) instantaneous and cryptographically guaranteed record of interbank payments and other money movements.
International payments are expensive and time-consuming to execute, Rob Morgan, vice president for emerging technologies at the American Bankers Association, told FedScoop. “The business case for any new technology is where there’s the greatest potential for a return on investment,” he said. Where there are multiple parties to a transaction, as in any international payments system, “Each party has their own record of the transaction” and it can take “days and days” to reach a final settlement.
With blockchain, all parties are essentially agreeing to a single record of the transaction in near-realtime.
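The shared-record mechanism can be illustrated with a toy hash chain in Python. This is a simplified sketch of the general blockchain idea — the bank names and payments are invented, and this is not USC’s or any bank’s actual implementation:

```python
import hashlib
import json

def block_hash(payment: dict, prev_hash: str) -> str:
    """Hash the payment record together with the previous block's hash,
    so altering any earlier entry invalidates every later one."""
    record = json.dumps(payment, sort_keys=True) + prev_hash
    return hashlib.sha256(record.encode()).hexdigest()

class Ledger:
    """A toy append-only ledger shared by all parties to a settlement."""
    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []  # list of (payment, hash) pairs

    def append(self, payment: dict) -> str:
        prev = self.entries[-1][1] if self.entries else self.GENESIS
        h = block_hash(payment, prev)
        self.entries.append((payment, h))
        return h

    def verify(self) -> bool:
        """Any party can independently recompute the whole chain."""
        prev = self.GENESIS
        for payment, h in self.entries:
            if block_hash(payment, prev) != h:
                return False
            prev = h
        return True

ledger = Ledger()
ledger.append({"from": "Bank A", "to": "Bank B", "amount": 100, "ccy": "USD"})
ledger.append({"from": "Bank B", "to": "Bank C", "amount": 250, "ccy": "EUR"})
assert ledger.verify()

# Tampering with an earlier payment breaks every later hash,
# which is why all parties can trust the single shared record.
ledger.entries[0][0]["amount"] = 999
assert not ledger.verify()
```

Because each entry’s hash depends on everything before it, the parties don’t reconcile separate records for days — they agree on one chain as it is written.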
The total cost to the finance industry of clearing and settling trades is estimated at as much as $80 billion a year, the FT states, citing figures last year compiled by Oliver Wyman.
“The conversation today is about how can we use blockchain to more efficiently complete payments within the existing system,” said Morgan.
“The current settlement machinery is incredibly antiquated, inefficient and balkanized … The longer the settlement process takes, the more risk there is,” said Alex Tapscott, consultant and co-author of the book “Blockchain Revolution.”
Tapscott noted that Santander had been “an early proponent” of the new technology, which he believes will “upend the financial services sector.”
He noted that Wednesday’s news was only the latest in a string of recent announcements about blockchain initiatives from the financial sector.
Most banks and financial institutions, like Santander, “view this [technology] opportunistically … as a way to reduce the costs of moving, storing and managing value.”
“They should be looking at it strategically,” he said.
“It’s all very well to say [blockchain] could save $20 billion from the costs of trading equities,” he said. “But what if banks are no longer involved in those trades” at all, because blockchain has replaced their role altogether?
The four banks will pitch the USC idea to central banks this fall, with a projected commercial launch in 2018, the FT reported.
Survey: Companies can’t cope with privileged access

Most companies continue to struggle with managing privileged-user access to their IT networks and few managers are satisfied with the degree of visibility and control they have over the privilege granting process, according to newly published survey data.
The problems have grown worse over the past five years and show few signs of improvement, according to the data, released Tuesday by the Ponemon Institute and Forcepoint, a joint venture of Raytheon and Vista Equity Partners.
“It’s very challenging with the interdependencies between today’s interconnected systems, for both business management and IT management, to understand exactly what access is required” for each employee, Forcepoint’s Senior Manager for Insider Threat Operations Dan Velez told FedScoop.
Around 700 IT professionals in the U.S. responded to surveys in 2011, 2014 and 2016, the report said. For the purposes of the study, Ponemon defines privileged users to include database administrators, network engineers, IT security practitioners and cloud custodians.
The issue is vital for cybersecurity because privileged users are the most valuable targets for hackers looking to steal credentials — and the most dangerous form of insider threat. A single pilfered credential from a database administrator, for instance, enabled the massive Anthem hack.
“Step one in cybersecurity risk management is understanding what the [organization’s] critical assets are and who has access to them,” said Velez.
But the survey shows that most companies cannot keep up with the administrative demands of managing privileged user accounts.
Over the past five years, the proportion of respondents who say they “cannot keep pace with the number of access change requests that come in on a regular basis” has risen from just over half (53 percent) to nearly two-thirds (61 percent).
The number saying that it “takes too long to deliver access to privileged users” has risen from a third (32 percent) to nearly half (47 percent).
Thirty-nine percent of respondents said they weren’t confident that they have the visibility they need into privileged user access to determine if users are complying with company policies. Only 18 percent are very confident that they have this visibility.
On the plus side, there was a very slight drop (from 35 to 32 percent) over the five years in the number saying their company lacks “a consistent approval process for access and a way to handle exceptions,” and a larger fall (from 52 to 41 percent) in the number saying it was “difficult to audit and validate privileged user access changes.”
“Privileged users are very challenging to manage,” said Velez, and the report states that they “often use their rights inappropriately and put their organizations’ sensitive information at risk.”
For example, three-quarters (74 percent) of respondents agreed with the statement that “privileged users believe they are empowered to access all the information they can view;” while two-thirds (66 percent) agreed that they access “sensitive or confidential data” out of “curiosity.”
This article has been corrected to accurately reflect the status of Forcepoint as a joint venture of Raytheon and Vista Equity Partners.
New approach needed to IT, says NIST’s top cyber scientist

NIST Senior Fellow Ron Ross (Source: FedScoop)
No amount of security software, firewalls or anomaly detection systems can protect an IT infrastructure that’s fundamentally insecure, and a new approach to computer architecture is required to deal with the looming cybersecurity crisis, the National Institute of Standards and Technology’s top computer security scientist told the president’s commission on long-term cybersecurity.
The “only way” to address the looming cybersecurity crisis is “to build more trustworthy secure components and systems,” Ron Ross told the Commission on Enhancing National Cybersecurity during a Tuesday meeting in Minneapolis.
The commission, established by presidential order, held the latest in a series of public meetings to hear testimony about how to secure U.S. IT systems for the next decade.
“As a nation,” Ross said, “we are spending more on cybersecurity today than at any time in our history, while simultaneously continuing to witness an increasing number of successful cyberattacks and breaches.”
In other words: the security we currently have in place isn’t working.
The reason: “You cannot protect that which you do not understand … Increased complexity translates to increased attack surface.”
The result is “limitless” — and growing — opportunities for hackers “to exploit vulnerabilities resulting from inherent weaknesses in the software, firmware, and hardware components of the underlying systems and networks,” Ross said.
As organizations and users struggle to find and patch known vulnerabilities, the number of unknowns keeps growing as systems grow more numerous and complex and continue to be built in ways that are insecure.
Current approaches “fail to address the fundamental weaknesses in system architecture and design,” he said.
Ross called for a new approach based on “build[ing] more trustworthy secure components and systems by applying well-defined security design principles in a life cycle-based systems engineering process.”
Security, he observed, “does not happen by accident.” Like safety and reliability, it needs to be engineered in from the beginning, he argued, comparing the process to the “disciplined and structured approach” used to design structurally sound bridges and safe aircraft.
“Those highly assured and trustworthy solutions may not be appropriate in every situation, but they should be available to those entities that are critical to the economic and national security interests of the U.S.,” like “the electric grid, manufacturing facilities, financial institutions, transportation vehicles, water treatment plants, and weapons systems.”
This new approach “will require a significant investment of resources and the involvement of essential partnerships including government, industry, and the academic community,” said Ross, comparing it to the moonshot of the 1960s.
“The clock is ticking and time is short,” he concluded, “We have an opportunity to do what is necessary to protect our national treasure and defend the country in the brave new world of cyberspace.”
Report: Cyber crimes will cause $6 trillion worth of damage by 2021
By 2021, cyber crime is expected to cause roughly $6 trillion in damages annually, according to a newly published report authored by Cybersecurity Ventures and backed by security consulting giant The Herjavec Group.
If the $6 trillion figure ultimately proves accurate, it would represent double the roughly $3 trillion in estimated damages reportedly experienced this year by governments, businesses and private citizens at the hands of cyber criminals. The World Economic Forum has also estimated the worldwide economic cost of cybercrime at approximately $3 trillion.
“Damages” are defined as any incident where hackers caused the destruction or theft of money, intellectual property, personal or financial data. Additional monetary losses experienced due to lost employee productivity, embezzlement, fraud, forensic investigations or other restoration efforts, are similarly included in the total. Who exactly qualifies as a “cyber criminal,” however, is less clear, based on the report, as criminal cyber syndicates have increasingly shown a willingness to work alongside nation-states.
The report by Cybersecurity Ventures is based on evidence collected from market intelligence reports, an analysis of current geopolitical conditions, data concerning a shortage in the cybersecurity workforce, relevant media clips and independently conducted interviews with prominent industry figures, among other things.
Central to the report’s claims of increased future damage, according to Cybersecurity Ventures’ chief Steve Morgan, is the introduction of first-time internet users and a growing adoption of Internet of Things devices by everyday consumers. The prevalence of IoT technology is considered an expanding “attack vector” for hackers, the report states.
“By 2020 the world will need to cyber-defend 50 times more data than it does today,” Cybersecurity Ventures’ report reads.
Experts, tech companies skeptical of CBP proposal to collect social media info
An association representing several internet giants and a number of advocacy groups are not enthused by the U.S. Customs and Border Protection’s proposal to ask people for their social media identifiers on some U.S. travel forms.
In comments sent to CBP, the Internet Association — a 40-member group that includes social media companies such as Facebook, Google, LinkedIn and Twitter — said the proposal would possibly “have a chilling effect on use of social media networks, online sharing and, ultimately, free speech online.”
And a different letter signed by more than 30 organizations opposed the proposal, saying “this program would invade individual privacy and imperil freedom of expression while being ineffective and prohibitively expensive to implement and maintain.”
CBP has been receiving feedback on the proposal to add a question to the Electronic System for Travel Authorization, known as ESTA, and I-94W forms: “Please enter information associated with your online presence — Provider/Platform — Social media identifier.” The comment period for the proposed addition, which would affect people traveling to the U.S. under a visa waiver program, closed Monday.
[Read more: CBP: Travelers’ social media could spotlight potential ‘nefarious activity’]
The question would “further enhance the security vetting process and support the adjudication of Visa Waiver Program ineligibility waivers,” a CBP spokesperson said in a statement.
The spokesperson added that it would also enhance “DHS’ ability to communicate with visa waiver applicants electronically.”
Whether DHS will be tweeting or poking applicants on Facebook remains unclear.
The CBP spokesperson said the question would be “clearly marked as ‘optional’” on the revised ESTA application.
“Providing this information is voluntary,” the spokesperson said. “Choosing not to provide this information will not result in a denial of an ESTA application.”
The spokesperson also noted that: “DHS will only have access to information publicly available on those platforms, consistent with the privacy settings of the platforms.”
The coalition letter described the collection of online identifiers as “highly invasive,” and voiced concerns that people would have little or no opportunity to explain their profiles or challenge denials.
One organization’s comments stood out from the “crowd,” so to speak, of technology advocacy organizations, companies and other interest groups.
The Center for Data Innovation said in its comments to CBP that it supports the exploration of using social media identifiers to enhance national security.
“DHS is unable to determine if it can effectively use social media data to screen travelers unless it first conducts a pilot program,” the center said. “It is therefore prudent for DHS to proceed with this data collection to study the effectiveness of such an effort, but it should refrain from using the data on a widespread basis until it can verify that it has produced a system that delivers useful results.”
Daniel Castro, director of the Center for Data Innovation, told FedScoop this is a chance for the government to collect data and try to use it to be more effective.
“We’re seeing social media data being used in lots of innovative ways,” he said. “There’s no reason the government shouldn’t try and do that as well.”
Castro said the conversation around the proposal is somewhat limited because the public doesn’t know how the government will use the data yet.
“It’s an un-ideal conversation that we’re having because they’re saying, ‘Well we want to collect this and then we’re going to kind of see how we can use it,’” he said.
But Castro said it can be “difficult to make progress,” when people assume government data collection is bad. He noted that information already publicly posted is available for anyone to use.
“We have to find a way to kind of balance this maybe-distrust of government with the need to recognize there’s a very positive and important role for government to play in collecting and using data so it can work better,” he said.
The Internet Association noted that it was concerned about the scope of the data requested and how it would be used, including from a cost perspective.
“While the Internet Association supports the national security objective underpinning the DHS proposal, it is unclear from the notice how DHS would seek to achieve this goal,” the IA said. “Analysis of all applicants’ social media ‘activity and connections’ would be costly and difficult.”
IA’s comments note that “this cost does not appear to be factored into DHS’ analysis.” The comments also noted that asking social media platforms to give extra information is a burden for them.
Castro said cost is not necessarily prohibitive, depending on how DHS plans to use the data.
“Collecting every piece of social media data and analyzing it — yeah, that’s probably not cost effective,” he said. “That’s probably not what they would do.”
He added: “Doing some targeted analytics in certain cases, doing some social network analysis in certain situations to develop risk profiles for people, you know that is something that could be considered.”
Castro noted that the center’s comments stress the need for involvement from the Office for Civil Rights and Civil Liberties to make sure the system protects civil liberties.
Another concern voiced in the comments FedScoop reviewed was with the accuracy of the data. The Internet Association’s comments note that someone’s “declaration” of a username should not necessarily be taken as fact.
The Center for Data Innovation also listed this as a concern, but said this, among many of the potential problems with using social media data, is not “necessarily insurmountable.”
“For example, CBP already must handle the problem of determining whether the information submitted by travelers in response to other questions is accurate and complete,” the document’s authors, including Castro, wrote.
FTC’s Ramirez: New tech’s complexity leaves privacy basics unchanged

FTC Chairwoman Edith Ramirez addresses a TPI luncheon in Aspen, Colorado (YouTube)
Online privacy issues have grown exponentially more complex in recent years with the growth of emerging technology and big data, but consumer control and consent need to stay top of mind, Federal Trade Commission Chairwoman Edith Ramirez said.
“I believe that consumer control remains paramount,” she told a luncheon audience at the Technology Policy Institute in Aspen, Colorado, Monday.
“Many consumer devices and appliances – from your Fitbit to your fridge to your thermostat – are silently talking to one another, collecting data, and transmitting that information to various third parties,” Ramirez said. These new technologies, she said, are threatening the traditional consent model of privacy and creating challenges for policymakers and consumers alike.
“In a world of connected devices, consumers often do not know which companies are doing what,” she explained, adding that a mobile device is made by a manufacturer, run by an OS, connected by a carrier and populated with apps.
“Given the multiplicity of actors involved,” she asked, “how can consumers possibly understand where their information is going and what it is going to be used for? And does this complex ecosystem allow companies to pass the buck to avoid accountability for privacy and security failings?”
As a result, she said, “we hear with increasing frequency the claim that technological innovation and big data have rendered certain fundamental tenets of privacy, particularly the idea of consumer consent, outdated and ill-suited for today’s digital world. I disagree.”
One way the FTC was seeking to meet the challenge was by broadening the definition of personally identifiable information, or PII. Because of the proliferation of data sources, even anonymous data could be de-anonymized by cross-checking it with other information sources.
“We now regard data as personally identifiable when it can be reasonably linked to a particular person, computer, or device,” she said. “In many cases, persistent identifiers, such as device identifiers, MAC addresses, static IP addresses, and retail loyalty card numbers meet this test.”
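The “reasonably linked” test can be illustrated with a hypothetical example — all of the datasets, device addresses and names below are invented. Two datasets that are each “anonymous” on their own can re-identify a person once joined on a persistent device identifier:

```python
# Hypothetical data: an ad network's browsing log (no names attached)
# and a retailer's loyalty-card records (names attached).
browsing_log = [
    {"mac": "a4:5e:60:c1:22:01", "site": "clinic-appointments.example"},
    {"mac": "b8:27:eb:17:9a:42", "site": "news.example"},
]
loyalty_records = [
    {"mac": "a4:5e:60:c1:22:01", "name": "Jane Doe", "card": "L-1042"},
]

# Joining the two datasets on the shared MAC address links
# "anonymous" browsing activity back to a named individual.
by_mac = {r["mac"]: r for r in loyalty_records}
linked = [
    {"name": by_mac[v["mac"]]["name"], "site": v["site"]}
    for v in browsing_log
    if v["mac"] in by_mac
]
print(linked)  # → [{'name': 'Jane Doe', 'site': 'clinic-appointments.example'}]
```

Because a single join like this suffices, a persistent identifier is treated as PII even when no name is stored alongside it.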
She said device manufacturers and other consumer-facing companies should improve customers’ “ability to manage and express their privacy preferences” through things like “set-up wizards and settings menus … as well as dashboards where consumers can revisit and modify their choices.”
She called on behind-the-scenes companies like ad networks and data brokers to respect consumer preferences and avoid using technologies and practices — like so-called “supercookies” — designed to get around or override them.
Technological innovations like data tagging are another way of meeting these new challenges, she said. “Under this approach, consumer preferences could attach to and travel with data. This could be particularly useful as data is passed from consumer-facing entities to data brokers, ad companies, and others.”
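As a rough sketch of the data-tagging idea — the class and field names here are illustrative, not any proposed standard — a consumer’s preference can be attached to the record itself and checked by each downstream recipient before use:

```python
from dataclasses import dataclass, field

@dataclass
class TaggedRecord:
    """A record bundled with the consumer's preference tag, so the
    preference travels with the data as it is passed downstream."""
    data: dict
    allowed_uses: set = field(default_factory=set)  # e.g. {"analytics"}

def share(record: TaggedRecord, recipient_use: str) -> dict:
    """Release the data only if the consumer's tag permits this use."""
    if recipient_use not in record.allowed_uses:
        raise PermissionError(
            f"use '{recipient_use}' not permitted by consumer tag")
    return record.data

rec = TaggedRecord({"device_id": "abc123", "temp_setting": 68},
                   allowed_uses={"analytics"})
share(rec, "analytics")           # permitted by the consumer's tag
try:
    share(rec, "ad_targeting")    # blocked: not in allowed_uses
except PermissionError as e:
    print(e)
```

In practice the tag would also travel with any onward transfer — from a consumer-facing entity to a data broker or ad company — so each hop repeats the same check.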
Finally, Ramirez urged the growth of what she called “privacy intermediaries … to facilitate communication of privacy preferences between consumers and businesses” through things like feedback loops and rating systems.
“Imagine a privacy app that allows me to create an account, specify what information I would like to share, and have companies compete for my business,” she said.
How agencies are meeting their aggressive cybersecurity goals
Greg Otto spoke with Government Matters host Francis Rose Sunday about the takeaways from the first series of agency reports mandated by the Cybersecurity Act of 2015.
He also joined the federal roundtable to discuss the fallout of National Security Agency-linked cyber weapons that were leaked on the internet.
Report: U.S. retailers aren’t investing in cybersecurity even as breaches persist
As high-profile hacks of Target, Home Depot and Eddie Bauer show, U.S.-based retail stores are especially susceptible to damages caused by hackers. A new survey out Tuesday shows how much that damage usually amounts to.
A data breach costs retail outlets, on average, roughly 19 percent of their customer base, according to a survey conducted by global audit, tax and advisory firm KPMG.
Nearly one-fifth of respondents told KPMG they would avoid a retailer that was the target of a successful cyber attack, regardless of how the company remediated damages caused by hackers.
Another 33 percent of people surveyed said they would wholly abandon an affected business for about three months, due to immediate fears concerning the exposure of personal and financial information typically stored by the retailer. Interestingly, these same customers also said they are least likely to return to a hacked store when its leadership fails to publicize a solid plan to prevent future cyber attacks.
“Make no mistake, there is a lot at stake here for retailers. Consumers are clearly demanding that their information be protected and they’re going to let their wallets do the talking,” Mark Larson, a KPMG executive who analyzes global retail market activity, said in a statement.
The typical American consumer has become more aware of cyber attacks, KPMG notes, as several high profile data breaches have recently occurred.
Though the survey’s findings may be considered worrisome for many retailers, 55 percent of senior cybersecurity executives serving the sector — separately surveyed by KPMG — said they had not invested in cybersecurity over the last 12 months.
“Quite frankly, many retailers are not doing enough to protect their businesses from cyber attacks or react to them when they occur, and the effects of their inaction will end up harming them in the long run,” said KPMG Principal and Retail Cybersecurity Leader Tony Buffomante in a statement.
Several agencies working on plans to collectively buy 55,000 laptops, desktops by year’s end
About 33 agencies have participated in workshops so far that are leading up to a General Services Administration-led buying venture for about 55,000 laptops and desktop computers, an official said Monday.
GSA has been working with the Office of Management and Budget to coordinate the fourth-quarter buying venture, which would help agencies comply with a memo released last year, Steve Krauss, director of category management strategic execution at GSA, said Monday at a panel during an Industry Day focused on shared services.
The OMB memo mandated that agencies use governmentwide acquisition contracts for purchasing laptops and desktop computers.
[Read more: OMB to feds: No new desktop or laptop contracts]
Panelists, among them Krauss, talked about recent government efforts to integrate category management principles into agency workflows so that agencies can share best practices, make more centralized buying decisions together and eliminate redundancy.
Panelist Stephanie Hrin of GSA’s Unified Shared Services Management, who hosted the event, said shared services is “something that makes sense to us taxpayers.”
“I think that the same thing can be said for category management,” she said. “They’re both about efficiencies, and using resources in the best way possible.”
The laptop and desktop project has been one of the recent efforts Krauss’ team has been tackling.
“We’re seeing significant savings quite frankly,” Krauss said: so far, an average discount rate of about 18 percent.
Plans are in the works for about 55,000 laptops and desktop computers to be purchased during the event, Krauss estimated.
“Our industry partners get out of that also increased reliability of demand and also a better understanding of where the government is going from a purchasing perspective,” he said.
A crucial thing to remember, he noted, is “it’s not really about unit cost.”
“The key to optimizing value with money the government is spending is to make sure that we’re really optimizing requirements,” Krauss said.
So the best strategy is to share things like best practices, and statements of work, Krauss said, to get the best value for the money spent.
Panelist Lesley Field, deputy administrator of the White House Office of Federal Procurement Policy, said $275 billion a year is spent on “common spend,” certain purchases that nearly every agency makes.
“We were looking at a tremendous amount of redundancy and inefficiency,” Field said.
In June, the Category Management Leadership Council approved the first version of strategic plans to look at ways to eliminate redundancies in the federal government’s buying decisions.
They are working on the second version, Field said, which will refine the initial plans and look further out into the future.