Background check bureau’s IT ‘backbone’ completion more than a year out
The National Background Investigation Bureau will take over as the federal government’s improved background investigations and security clearance entity Oct. 1, but its more secure and modernized IT infrastructure won’t be fully operational for more than a year.
Senior administration officials involved in the buildout of the Office of Personnel Management’s NBIB spoke with reporters Thursday ahead of the “semi-autonomous” entity’s launch over the weekend, describing the progress they’ve made since announcing in January the plan to replace OPM’s Federal Investigative Services, and the next steps toward making the federal background check process more efficient, effective and secure.
Since January, OPM, its NBIB Transition Team and other partners governmentwide have “worked diligently to establish the framework, vision and infrastructure necessary to stand up the NBIB,” OPM acting Director Beth Cobert said. “On Oct. 1, the transition…will begin, and the transformation process will have started from that point on.”
Cobert also announced Thursday that Charles Phalen, most recently the vice president of corporate security for Northrop Grumman Corp. and a former director of security for the CIA, will head the NBIB.
[Read more: Former CIA official Phalen to head background check bureau]
But really Oct. 1 is more of a starting point than a finish line, Cobert explained. “While we’ve done a tremendous amount of work up until this point, Oct. 1 is the beginning in many ways.”
Indeed, for the Defense Department-operated backend IT infrastructure supporting NBIB, there’s still “a year to 18 months” until the “new, more secure, effective, efficient, modern” support systems are up and running, DOD CIO Terry Halvorsen told reporters.
While DOD and the Defense Information Systems Agency have “started architecting, designing, building, securing, operating and maintaining an IT backbone for the new NBIB” — what it calls the National Background Investigation System — Halvorsen matter-of-factly stated that “no,” the “whole new NBIB IT infrastructure” is not ready.
In fact, he said, it wasn’t until Congress passed a continuing resolution Wednesday that NBIS had the $95 million funding and the authority it needed to start building and procuring new systems, “thanks to bipartisan support.”
“That money is there. We will begin executing it Monday, and begin to rapidly start fielding the new IT systems,” he said.
DISA released a request for information on the NBIS earlier this month. And even when the system is built as initially envisioned, the plan is to continue improving on it as an iterative work in progress.
That doesn’t mean, though, DOD has been sitting around idly.
“We’ve done all of the background work,” Halvorsen said. “In addition to building the new system, we are aggressively working with OPM, law enforcement, all of the other agencies to better secure the current systems at OPM…We’re not just letting the system currently operate the way it was.”
As a reminder, the creation of the NBIB was precipitated by a series of hacks on Office of Personnel Management personnel and security clearance systems in 2014 that compromised the information of more than 20 million Americans.
When the modernized NBIS is finally fully up and running, in addition to securing and protecting “the personal information of millions of Americans and their families using DOD’s cyber expertise,” OPM wants to work with law enforcement at all levels to automate background checks and build full, accurate and timely composites of the individuals it is investigating.
“In particular, starting with the work following the Navy Yard tragedy, we have focused on the importance of leveraging technology to ensure we have as complete and accurate access to records as possible. This has been a core focus with work within the Federal Investigative Services group, and it is work that we want to emphasize even more so going forward,” Cobert said.
NBIB’s launch establishes a Federal Investigative Records Enterprise office tasked with promoting “records automation and an increased focus on information sharing agreements with interagency partners, state and local entities, and commercial records providers” and “standardizing data exchanges for records information as a hallmark of the investigations enterprise, leveraging new and evolving data sources such as social media checks,” according to an OPM fact sheet.
Additionally, officials said, the continued evolution of the new background check system will help reduce the current backlog in processing clearances, which takes between 120 and 170 days depending on an applicant’s requested level of clearance.
The technology is perhaps the most critical aspect of the NBIB buildout. Amid the fallout of the OPM breaches, it became apparent just how important modern and secure IT systems are today to just about any federal agency’s mission.
“That’s why we have created this new role,” Cobert added. “It’s part of the recognition that as we move forward, the role of technology and digital information will be even more critical than it has been in the past, and you want to get clear leadership and part of your organization focused on that mission.”
So important is the technology, she said, that it will be what continues to drive the evolution of NBIB in the future so the information on millions of Americans isn’t once again made vulnerable to bad actors.
“Speaking about the space that we’re in and how it needs to continue to adapt to the changing environment that we’re in,” Cobert said. “So we have a model, we have a roadmap, we have plans in place, but we’ll continue to adapt those over time as we have new systems that give us new capabilities.”
GSA widens usability of DUNS data, opens future for alternatives
The General Services Administration recently renegotiated a key contract with the company that provides the federal government a proprietary identifier to track its spending, allowing agencies a wider use of the company’s data going forward.
The renegotiated contract with Dun & Bradstreet allows agencies to use the compiled spending data beyond “a narrowly defined ‘acquisition purpose,'” as it had been limited prior to this, “for other activities, like compiling research of historical procurement information and conducting trend analysis,” a GSA blog post explains.
With the renegotiation, the contract now extends through June 2018 — the original deal was struck in June 2010 — with a modification value of $26 million, a GSA spokesperson told FedScoop.
Since the 1970s, Dun & Bradstreet has collected data on federal spending with its proprietary Data Universal Numbering System, better known as a DUNS number. GSA uses DUNS numbers to track spending of the governmentwide contracts in its Integrated Award Environment, which it says supports “responsible award decisions using taxpayer dollars as well as provide[s] insights into federal government spending.”
“There are still some restrictions on the amount of data available for public consumption, but the government’s use of the information is no longer restrained,” Kevin Youel Page, GSA’s acting assistant commissioner for the IAE, wrote in the blog post. “We expect the expansion of these data rights to assist agencies in making better-informed, data-driven decisions as they strive to meet their missions.”
Now any federal agency can use the D&B data for analysis, under the renegotiated contract. Likewise, the contract opens some of the data to third parties for commercial use. It also removes a requirement that D&B data would need to be removed from government systems if another party were to take over its support position.
GSA, with NASA and the Defense Department, also published a regulatory rule Thursday removing proprietary references to Dun & Bradstreet in the FAR.
“By removing references to specific identifiers, the FAR change removes any policy requirements regarding who can provide key services to the federal government,” the blog post says.
“These two actions lay the foundation for the next steps in analyzing alternatives to support the continued integrity of the federal procurement process, increase transparency, and open competition,” Youel Page wrote.
Federal use of the DUNS number came under widespread criticism in recent years, particularly as the Treasury Department and the Office of Management and Budget have made it a central part of reporting spending in implementation of the Digital Accountability and Transparency Act, which requires the government to make financial data more transparent and accessible by next May.
[Read more: DATA Act standards come with some concerns]
In November, when the regulatory rule was first proposed, the Data Coalition called the move the government’s “first step away from proprietary data standards and its first step towards opening this important data.”
“By scrubbing the contracting rules of any reference to the DUNS Number, the government removes the presumption that the DUNS Number is the only option for agencies and systems that track spending,” the coalition said. “Even after the change, the DUNS Number will remain in use – but this action will make it legally possible to eventually dump DUNS.”
FedRAMP accelerated authorizes first provider in 15 weeks
Microsoft’s Customer Relationship Manager Online last week became the first cloud service provider to receive Federal Risk and Authorization Management Program authorization through its new “accelerated” program, doing so in just a fraction of the time the program used to take.
[Read more: Exclusive: FedRAMP embraces the need for speed]
FedRAMP Director Matt Goodrich said the cloud service provider received a provisional authority to operate on Sept. 22 after only 15 weeks. Before moving to the accelerated process, getting authorized took anywhere from nine months to two years, Goodrich told FedScoop.
The goal for the new accelerated process was to get companies authorized in less than six months, he said.
Getting authorized in less than four months is an “aggressive and fast” timeline, Goodrich said, given the number of security controls that need to be examined.
A big driver in the reduced timeline, Goodrich said, was moving from an initial documentation-based assessment before assessing capabilities to the program’s new FedRAMP readiness assessment that focuses initially on capabilities validated by a third-party assessment organization.
The last provider authorized before Microsoft took 40 weeks to move from documentation reviews to capability reviews, whereas it took Microsoft only 10 weeks, according to a blog post by Goodrich.
Goodrich also noted that moving from a waterfall approach — first looking at documentation, then testing and reviewing risks — to a more agile, iterative review process cut down on time to authorization.
Two other organizations are currently going through the accelerated process: Unisys with its Secure Private Cloud for Government and Edge for Government products, and 18F with its Cloud.gov service.
Goodrich told FedScoop he expects both to be authorized by the end of the calendar year.
Audit slams HHS’ cybersecurity oversight
The U.S. Department of Health and Human Services’ oversight of privacy and cybersecurity in the healthcare sector is deeply flawed, even as the agency is promoting the online digitization and storage of Americans’ health records, according to congressional investigators.
HHS regulates the security and privacy of healthcare records — generally considered highly valuable and private forms of personal data. The department also has an Office of Civil Rights charged with auditing healthcare insurers’ and providers’ compliance with those rules, as well as investigating security and privacy complaints. Nearly 18,000 of the latter were received in 2014.
But in some cases it investigated, OCR “provided technical assistance that was not pertinent to identified problems, and in other cases it did not always follow up to ensure that agreed-upon corrective actions were taken once investigative cases were closed,” reported the Government Accountability Office this week.
Of the 18,000 complaints it received in 2014, OCR closed 89 percent of them without investigation, GAO found. “Of the remaining 11 percent, four percent had no violation found after investigation and seven percent resulted in corrective action.”
Moreover, GAO auditors concluded, OCR had no metrics for assessing “the effectiveness of its audit program” which was not yet fully operational.
HHS investigations have revealed that healthcare providers and insurers “have struggled to select appropriate security and privacy controls,” auditors concluded, especially with regard to mandatory risk assessments. But the department’s cybersecurity guidance to regulated companies did not align with the NIST Cybersecurity Framework — considered the gold standard for critical infrastructure cybersecurity.
HHS officials, in comments on the report included in an appendix, agreed that their guidance should “more fully address the implementation of controls described in the NIST Cybersecurity Framework.” But they pointed out the “extreme diversity” of the healthcare organizations covered by their rules — from a single doctor’s practice, through small local clinics, to multi-national hospital networks and insurance companies.
Any new guidance, HHS said, must be “flexible and scalable …[and] technology neutral.” OCR would work on revising guidance “to the extent feasible, given … resource constraints and other priorities.”
And the department’s supporters agree that OCR isn’t really resourced for such a huge role.
“Given the rising incidence of cyberattacks on health organizations across the country, we need to make sure that HHS has the resources and support it needs to implement security tools that will protect personal information, whether it be held by patients, families, providers, or insurers,” Sen. Patty Murray, D-Wash., told FedScoop in an email. Murray is ranking member of the Senate Health, Education, Labor, and Pensions Committee.
NSA garners new partners in hunt for cyber pros
The U.S. Cyber Challenge, an effort to ramp up the country’s training for in-demand cybersecurity professionals, is joining forces with an NSA-sponsored program to help students map out a career in the field.
The new partnership offers users the chance to prove their expertise by taking part in competitions — considered essential by many in the field who are dubious that existing cybersecurity qualifications accurately measure hands-on ability.
The NSA’s Day of Cyber website is open to anyone who registers and provides users an “opportunity to live a day in the life of six NSA professionals,” Kim Paradise told FedScoop. Paradise is Vice President of Partnerships at LifeJourney USA, the company whose “career exploration” technology runs the Day of Cyber platform.
The program, which has been running for a year, provides approximately two hours of content and is aimed at students ages 13 and up.
“The platform enables students to run cyber challenges and generate their Cyber resume,” said Paradise.
The NSA gets basic aggregate demographic information on registrants, she said — the state they live in and their age — but no personal information.
“It’s not designed … as a direct recruitment tool,” she said, but instead aims “to get students interested in the cyber field in general.”
And the site is not aimed only at youngsters, said Karen Evans, national director of the U.S. Cyber Challenge.
“I could be a veteran coming back [from deployment], I could be unemployed” and aiming to acquire new skills, she said. “The idea is to put an array of tools out there” that can help users map out a path to a cybersecurity career — and help potential employers looking for recruits.
One such employer, Cyber Adapt, has already signed up to be a part of the deal, she said, and others are in talks about joining.
The U.S. Cyber Challenge brings to the table its tool, CyberCompEx, a social site which links aspiring cyber professionals with competitions and contests nationwide.
The competitions are key, explained Evans. “That’s the data point … That’s what the competitions do … It’s an outside validation of their actual performance” that potential employers can use.
Unlike the Day of Cyber, CyberCompEx does allow users to post their resumes and other details, so potential employers can find them.
“Industry is looking for people with specific skill sets,” she said. “Now [they] can say: Here’s this person, here’s their qualification, here’s how they’ve done in the competitions.”
Open data’s journey through an administration
In his first week in office, President Barack Obama signed the “Transparency and Open Government” memorandum, striving to promote an unprecedented level of openness and transparency in the federal government.
Almost eight years later, officials at the White House’s first Open Data Innovation Summit on Wednesday applauded the progress the Obama administration has made since then in opening government data — all the while recognizing there’s still work to be done.
“Since then we’ve made astounding progress,” said Shaun Donovan, director of the Office of Management and Budget. “Under President Obama’s leadership, federal agencies are making more data freely available to the public for use, unleashing nearly 200,000 data sets so far.”
In conjunction with the conference, the White House also released a fact sheet with examples of open data work and the initiatives’ outcomes in recent years.
Donovan noted that “it’s fundamentally part of our American values,” to have access to information so citizens can collaborate and “build the tools that create opportunity, a more vibrant democracy, and a brighter future for generations to come.”
While touting examples of open data that have benefited the public, such as the College Scorecard, Donovan acknowledged that there is still work to be done balancing privacy with the public good of opening up data. He also said there is more work to be done establishing better data governance structures in all agencies.
“We know that when done right open data and privacy complement, rather than conflict with, one another,” Donovan said.
U.S. CIO Tony Scott, who opened the White House’s portion of the Data Transparency 2016 conference with U.S. CTO Megan Smith, said moving away from a “legacy IT environment” toward “digital transformation” will not only make the federal government more secure and efficient, but also more open.
“I think this is a moment in time when we have a huge opportunity,” Scott said, noting that later, when people look back at this time, they will recognize this was the turning point.
“The data work that we’re doing here is just one example of that,” he said.
But one official at the conference said going forward, there is still a culture change to fight in showing agency officials that open data benefits them as well.
“While I work for the chief information officer, I spend probably less than 5 percent of my time on technology,” said Kris Rowley, chief data officer at the General Services Administration. “I spend 95 percent of my time communicating with people and earning their trust, and explaining to them how sharing data in a real-time way, even pushing it to an open data platform, benefits them. And sometimes they don’t see that benefit initially.”
Technology has evolved so quickly over the past few years, making migration of data from one place to another much easier “to the point that it scares people,” Rowley said.
“We have to do a better job, I think, of making sure we can talk about securing the data when it’s necessary and opening whenever possible,” he said.
Opening the conference Wednesday, Data Foundation interim President Hudson Hollister said the open data movement is really starting to shift to informing agencies that open data is useful for them, too.
“Open data might have begun as a way to bring transparency to the people; it might have started out motivated by a desire to allow citizens and taxpayers to access the information that they own. But that’s not where it’s ending up,” Hollister said. “Open data is valuable to citizens, but it’s in many cases so much more valuable to government leaders.”
He noted, contrary to conventional thought, the first users of open data are actually inside the government.
“We think that that shift from just being a transparency matter to being a management matter is one of the most important changes that open data is undergoing over the past couple of years,” Hollister said.
NIST, others band together to create IoT-enabled smart city framework
The first draft of a framework for Internet of Things-enabled smart cities is due later this fall, public sector and National Institute of Standards and Technology officials announced Wednesday.
Dubbed the IoT-Enabled Smart City Framework, or IES City Framework, the effort aims to strike the balance between standardization and interoperability, according to Martin Burns, who works in NIST’s Smart Grid and Cyber-Physical Systems Program Office.
“We understand that smart city technologies are being developed and deployed at a rapid pace, but many of them are customized,” Burns said. “Our goal is a reference framework for the development of incremental and comparable smart cities.”
To create the framework, NIST developed three working groups to evaluate where cities are now, identify opportunities and chart a course forward. The first draft of the framework is expected by the end of October with a final draft scheduled to be released in June 2017.
But instead of delivering hard mandates, Burns emphasized the importance of leaving room for flexibility specific to the needs of individual cities.
“If you standardize everything, you freeze out innovation,” he said. “If you standardize nothing, you get non-interoperable clusters that can’t be easily integrated.”
The goal, Burns said, is to create a model that is actionable, reduces barriers to interoperability and is not too complex for implementation. By creating standardized interfaces on the areas cities have in common, the group can open up opportunities for cities to focus on individual priorities.
“The IES City Framework is not trying to become the one ring that rules them all,” Burns said. “We want to reveal pivotal points of interoperability, not declare them.”
Participation in the working group creating the framework is free and open to anyone globally, and the framework and its associated documents will be posted online for free as they are released.
Dan Hoffman, the chief information officer of Montgomery County, Maryland, leads one of the working groups, which is gathering case studies from cities for the framework. He said he hopes the framework’s creation will bring together disparate groups working on smart city initiatives, and that, if successful, it could help CIOs with everything from technology to procurement.
“There are several different peer groups of city, county CIOs that have formed to help share information like this,” Hoffman said. “When does it help? When a CIO like me needs to go out and procure a solution.”
White House to public: ‘Lock down your login!’
The Obama administration and a bevy of nonprofit organizations, technology firms and financial services companies joined forces Wednesday in a public campaign to get Americans to stop relying on passwords and use stronger methods of identity authentication.
“Your usernames and passwords are not enough to keep your accounts secure,” states the campaign website, which went live Wednesday in the run-up to the 13th annual National Cybersecurity Awareness Month.
“Luckily,” the website continues, “there’s a simple and quick way to put you in control of your personal information and keep your key accounts like email, banking and social media safer — it’s called strong authentication.”
Strong authentication, also called multi-factor or second-factor authentication, has long been advocated by security experts as an alternative or addition to passwords.
Unfortunately, as the campaign factsheet notes, 72 percent of Americans believe their online accounts are secure with just a password and login — something that repeated breaches of password data, like the one revealed last week by Yahoo, have shown to be untrue.
[Read more: Will the Yahoo breach finally get us past the password?]
Under the slogan “Lock down your login” the campaign advocates one or more of three authentication technologies that can make online accounts more secure:
- A security key — like a USB keystick
- Biometrics — like an iris scan or facial recognition from the webcam on your laptop or smartphone; or a fingerprint from a special built-in sensor
- A one-time password or code — sent to your phone by SMS or app
“The bottom line is, passwords can’t be secure,” said Brett McDowell, executive director of the nonprofit FIDO Alliance, which advocates for strong online authentication. “It’s long past time that we replaced them with something that’s not vulnerable to phishing, social engineering or replay attacks.”
Phishing or social engineering involves tricking a user into giving up their password. Replay attacks rely on the fact that most users also ignore advice not to use the same password for multiple sites or accounts. That means if a hacker has the password for a user’s email account, they can try it on social media or even financial accounts, too.
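For readers curious how the “one-time code sent by app” option works under the hood, most authenticator apps implement the time-based one-time password (TOTP) algorithm standardized in RFC 6238: the server and the app share a secret, and both derive a short code from that secret plus the current 30-second time window, so an intercepted code is useless moments later. A minimal sketch in Python, using only the standard library (the secret shown is the RFC’s published test value, not any real account’s):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Counter = number of `step`-second windows since the Unix epoch.
    counter = int(at if at is not None else time.time()) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: the low nibble of the last byte picks an offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59s.
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(SECRET, at=59, digits=8))  # prints "94287082"
```

In practice a verifying server also accepts codes from one or two adjacent time windows to tolerate clock skew between the phone and the server.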
The FIDO Alliance is one of the organizations backing the new campaign, which is led by the National Cyber Security Alliance — organizer of National Cybersecurity Awareness Month.
Other partners include Bank of America, CompTIA, the Consumer Federation of America, ESET North America, Facebook, the Financial Services Roundtable, Google, Intel Corp., Mastercard, Microsoft, Mozilla, PayPal, Salesforce, Square, Symantec, Twitter Inc., Visa Inc., Wells Fargo & Company, and USAA.
Former U.S. intel leaders warn that legal action against Yahoo will hurt FBI cyber efforts
U.S. regulators would be wise to avoid legal action against U.S. companies that suffer data breaches because of the precedent such cases can set, former U.S. intelligence leaders said at a recent cybersecurity summit in Washington, D.C.
“[I do think] there is a whiff of activity here … our government is unable to punish the criminals but then spends a great deal of time beating up the victim when these sorts of things happen,” said former NSA Director Gen. Michael Hayden at a U.S. Chamber of Commerce event Tuesday.
Hayden’s comments come one day after Sen. Mark Warner, D-Va., released a statement calling on the Securities and Exchange Commission to “investigate whether Yahoo and its senior executives fulfilled their obligations to keep investors and the public informed.”
Warner claims that Yahoo did not comply with established federal laws that require companies to notify shareholders of “material events” within four days of an incident.
On Tuesday, a group of senators, including Patrick Leahy, D-Vt., Ron Wyden, D-Ore., and Elizabeth Warren, D-Mass., asked a number of questions related to the breach, including whether Yahoo had ever been warned by government officials of “possible hacking attempt[s] by state-sponsored hackers.”
Yahoo has attributed the loss of 500 million user records to a state-sponsored attack. It remains unclear whether the FBI or a private forensic team is responsible for Yahoo’s claim that a nation-state actor is behind the breach.
A former U.S. official with knowledge of the case tells FedScoop that the Department of Homeland Security did not reach out to Yahoo to inform them of a potential state-sponsored cyberattack.
Hayden, former Congressman Mike Rogers and former NSA Deputy Director Chris Inglis cautioned Tuesday that decisive legal action against victimized companies may send mixed signals, potentially discouraging other private firms from sharing threat intelligence and data breach information with law enforcement and the intelligence community.
“You get ripped off by a nation state, you report through the SEC, it becomes public, and you now have lost company value as a result. Who wins? I’ll tell you who wins. It’s the nation state who went and did it in the first place,” said Rogers, who previously served as Chairman of the Permanent Select Committee on Intelligence.
Currently a CNN national security commentator and radio show host, Rogers said he opposes when companies keep breach incidents secret from customers — especially when confidential information may have been stolen.
Though the line between purposeful nondisclosure and negligence may be blurry today, Inglis believes future reviews of data breach cases must draw it clearly.
“I don’t think there’s one lane here,” said Inglis, who is now a venture capitalist with D.C.-based Paladin Capital Group. “In the airline transportation industry, there’s this practice of if there’s an accident, whether it’s attributable to human error or material failure … You don’t try to find attributable fault, you just try to find out what happened and get it out to everyone so we can make the airplane safer or to stop that dangerous procedure. But that doesn’t stop the criminal or liability proceedings occurring in a different lane. I think we can bring something like that into this domain as well.”
Inglis’ offhand suggestion may hold water: breach disclosure regulation remains a nascent and convoluted process, with conflicting equities always at play, Chris Roberts, a former security consultant who is now chief security architect at Acalvio, told TheStreet.
Secretary of Commerce Penny Pritzker said at the same conference Tuesday that the government and private sector have a ways to go in the exchange of cyber threat information.
“Even as companies and agencies begin to speak the same language of cyber risk management, we are not yet having a truly candid, actionable conversation because we lack the legal support structure necessary to do so,” Pritzker said Tuesday. “The problem is that the relationships between regulators and the businesses they regulate are inherently adversarial.”
Standard breach disclosure policy varies greatly depending on the individual state laws that govern a victim company’s activities, said Ari Schwartz, a former senior director for cybersecurity on the National Security Council. Typically, an affected business will need to ask law enforcement if it can notify customers.
“Except in extreme circumstances, law enforcement usually says that it is okay to do so,” explained Schwartz, currently a managing director of cybersecurity services for D.C.-based law firm Venable LLP.
It can also be challenging for lawyers to accurately define what exactly constitutes lost or compromised data in such cases due to an absence of common policy and lack of technical understanding between the court and defendant.
There are cases, for example, in which private sector companies decline to report ransomware attacks because they believe that paying a hacker’s ransom is equivalent to having deflected the breach entirely, experts say.
“We have discovered that the majority of our private partners do not turn to law enforcement when they face an intrusion,” FBI Director James Comey said during a speech last month. “We know your primary concern is getting back to normal when you run any type of enterprise, especially a for-profit business. But we need to figure out who is behind that attack.”
Why the Commerce Department wants you to experiment with Amazon’s Alexa
An upcoming Commerce Department event at Amazon’s headquarters in Seattle will challenge people to connect government data to Amazon’s Alexa.
Alexa is voice-enabled virtual assistant software programmed to respond to certain natural language commands and perform skills, like playing music, setting alarms or telling the news. The Commerce Department thinks the Amazon technology can give some people more digestible access to its troves of data, Justin Antonipillai, counselor to Commerce Secretary Penny Pritzker, wrote in a Medium blog post.
“Our goal is to democratize access to information for citizens by enabling comfortable, familiar mediums for open, easy access,” Antonipillai said in a statement. “While power users and large consulting firms are comfortable getting data via an API, a larger audience simply wants to access our data seamlessly via their consumer devices in both text and voice searches.”
During the two-day event on Oct. 7 and 8, software developers and civic hackers will develop new Alexa skills with technical help from the department’s Commerce Data Service and some supporting bureaus, like the U.S. Census Bureau.
Antonipillai provided possible examples of scenarios that could come out of the challenge’s work, including planning a crop harvest using weather forecasts or a fishing trip using tide and marine conditions, and learning which cities generate the most patents.
To participate, developers must sign up for the event, create free Amazon developer and Amazon Web Services accounts, and download the sample templates.
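To give a sense of what participants will be building: an Alexa skill is typically an AWS Lambda function that receives a JSON request describing the user’s spoken intent and returns a JSON “speech” response. The sketch below is a hypothetical example, not anything from the event itself: the intent name “GetTideIntent” and the canned tide lookup are placeholders standing in for a real government data source such as a tides API.

```python
# A hypothetical placeholder for data that would come from a government API.
TIDES = {"seattle": "high tide at 6:42 a.m., low tide at 1:15 p.m."}

def build_response(text):
    """Wrap plain text in the Alexa Skills Kit response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

def handler(event, context=None):
    """AWS Lambda entry point for the skill."""
    request = event.get("request", {})
    if (request.get("type") == "IntentRequest"
            and request["intent"]["name"] == "GetTideIntent"):
        city = request["intent"]["slots"]["city"]["value"].lower()
        report = TIDES.get(city, "no tide data for that city")
        return build_response(f"In {city.title()}, {report}")
    # Launch requests and unrecognized intents fall through to a prompt.
    return build_response("Ask me about tide conditions for a city.")
```

A user saying “Alexa, ask tide checker about Seattle” would arrive as an `IntentRequest` with a `city` slot, and the returned `outputSpeech` text is what the device reads aloud.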