Sean M. Griffin

Published

July 10, 2019

As increasingly large data volumes become available to emergency managers, many are looking to leverage massive, heterogeneous data sets across lifeline sectors to enable better decision-making in all four phases of emergency management: mitigation, preparedness, response, and recovery.

After departing the National Security Council, I worked alongside leadership from the Federal Emergency Management Agency (FEMA), the U.S. Army Corps of Engineers, and the energy industry to support Puerto Rico and the Virgin Islands in restoring their power grids following Hurricanes Irma and Maria.

The team was continually hampered by an inability to establish meaningful ground truth, which compounded the problem of acquiring and delivering resources where they were needed. The data existed; in 2017, the methods and platforms to access, leverage, and analyze it meaningfully did not.

The FEMA 2017 Hurricane Season After Action Report (AAR) identified lessons learned and recommendations to build a culture of preparedness, reduce complexity, and improve the nation's readiness for catastrophic disasters, in alignment with the FEMA 2018-2022 strategic plan. Key reflections from the AAR include:

“The public and private sector are inextricably linked and must have shared situational awareness and the ability to synchronize their respective efforts to be successful,” and, “Public and private sector response and recovery efforts were too ‘stove-piped’ to share timely information, too slow to consult, and as a result, often too late to synchronize stabilization efforts.”

These reflections are painfully illustrative, as government and industry continue to work on independent, incompatible systems, without any meaningful bilateral data exchange.

Despite all the advances in technology, information is still transferred primarily through verbal or email communications, where the lack of systematic interchange prevents a common operational or situational view of disaster response, recovery, or mitigation activities. And even where systems do provide for cross-sector collaboration, the ability to produce real-time analysis leveraging big data in a collaborative environment has been non-existent.

After the 2017 hurricane season, the National Emergency Management Association (NEMA) submitted a paper titled “Recommendations for Advancing Public-Private Preparedness Integration,” which was approved by the NEMA Board and submitted to FEMA leadership as well as state emergency managers.

The paper stated:

“[Information sharing] efforts were heralded as successes although there were no standards or agreed upon methods of practice or proofs of concept, and in many cases, these efforts were purely mechanisms to share essential information rather than make operational or strategic decisions toward achieving specific goals and objectives. At times, information shared lacked fidelity or was restricted by official owners, for confidentiality or business process reasons” and,

“[there were not] any pre-defined process or set of community standards for sourcing, sorting, analyzing, providing, or mapping the information for the purposes of joint public private decision-making processes, including the media and the public.”

The NEMA recommendation paper, like the 2017 FEMA Hurricane Season AAR, recognized the gap in information sharing and the need for real-time situational awareness, anchored by data standards and governance models that enable cross-sector decision-making. Solutions require infrastructure asset owners, operators, and government agencies across all lifelines to have “widely-distributed communications and decision support systems (including hardware and software) to establish situational awareness in severely disrupted environments and help guide and prioritize response operations.”

In Search of a Solution

Seeking solutions to this widening gap in situational awareness, I sought out innovators in the space and, in October 2018, joined Disaster Intelligence as a founding partner.

Disaster Intelligence CTO Jonathan Flack had been working in partnership with leaders in the semiconductor and big data analytics sectors to resolve the core problem of analysis across many tens of billions of records, much of it high-velocity data with a high-frequency update cadence.

Breakthroughs in memory frameworks and GPU-accelerated analytics were just beginning to emerge in early 2017, and innovative companies including semiconductor giant NVIDIA and analytics startup OmniSci were clear leaders in this early period of innovation. By late 2018, Disaster Intelligence had joined Inception, NVIDIA's prestigious accelerator program designed to nurture startups revolutionizing industries with advancements in AI and data science.

The problem of aggregating many hundreds of high-velocity datasets proved enormously difficult. With data maintained in a multitude of formats, often proprietary geospatial formats ill-suited to real-time analytics, and delivered via sparsely documented APIs, the Disaster Intelligence founders turned to their Chief Scientist, Jack Poulson, for a solution.

A former Chief Research Scientist at Google and assistant professor at Stanford and Georgia Tech, Jack took the lead in tackling what has, until now, been an insurmountable data aggregation problem. Working with CTO Jonathan Flack, he devised a strategy that converts data to a common memory framework allowing zero-copy sharing across processes. This solution allows datasets of millions of records to be ingested into the analytics database in just a few milliseconds.
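
The article does not name the specific memory framework, but Apache Arrow is the de facto standard columnar format for zero-copy data sharing across processes; the sketch below, with hypothetical file and column names, illustrates the general pattern under that assumption: one process writes a record batch to a memory-mappable Arrow IPC file, and a second process maps and reads it without copying the underlying buffers.

import pyarrow as pa

# Producer process: write a snapshot of asset-status records to an Arrow IPC
# file on disk or tmpfs. The file name and columns are illustrative only.
table = pa.table({
    "asset_id": pa.array([101, 102, 103], type=pa.int64()),
    "status": pa.array(["online", "offline", "online"]),
})
with pa.OSFile("outage_snapshot.arrow", "wb") as sink:
    with pa.ipc.new_file(sink, table.schema) as writer:
        writer.write_table(table)

# Consumer process: memory-map the same file and read it with zero copies --
# the Arrow buffers are referenced directly from the mapped memory, so even
# millions of records can be made queryable in milliseconds.
source = pa.memory_map("outage_snapshot.arrow", "r")
shared_table = pa.ipc.open_file(source).read_all()
print(shared_table.to_pandas())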

This month, OmniSci and Disaster Intelligence jointly announced a key strategic partnership in which Disaster Intelligence is licensing OmniSci's analytics source code to dramatically accelerate traditional and geospatial analytics, giving the emergency management community access to a toolset capable of delivering millisecond-level performance when analyzing disaster data at massive scale.
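
As a rough illustration of what this looks like from the analyst's seat, the sketch below uses pymapd, OmniSci's open-source Python client; the connection settings and the infrastructure_assets table are hypothetical placeholders rather than Disaster Intelligence's actual deployment or schema.

import pymapd

# Connect to an OmniSci (formerly MapD) server; the credentials shown are the
# stock demo defaults, not production values.
con = pymapd.connect(
    user="admin",
    password="HyperInteractive",
    host="localhost",
    dbname="omnisci",
)

# Hypothetical table of lifeline assets updated during a response.
# select_ipc() returns the result set as a pandas DataFrame transported over
# Arrow shared memory when client and server share a host, keeping the path
# consistent with the zero-copy approach described above.
df = con.select_ipc(
    """
    SELECT county, COUNT(*) AS assets_offline
    FROM infrastructure_assets
    WHERE status = 'offline'
    GROUP BY county
    ORDER BY assets_offline DESC
    """
)
print(df.head())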

How Disaster Intelligence Is Leveraging GPU-Accelerated Analytics on NVIDIA GPUs with an Analytics Stack Built on OmniSci

Analytics at this scale presents unique technical challenges, and memory bandwidth is critical for big data analytics.

However, recent developments in GPU-based platforms make it possible, for the first time, to handle multiple massive data sets comprehensively and concurrently, allowing for real-time situational awareness and analytics at the speed of thought.

Along with significant memory performance gains, dramatic improvements have been made in GPU-based server architectures. NVIDIA's first-generation DGX-1™ server, released in April 2016, featured eight GPUs based initially on the Pascal architecture and later on Volta. Then, in March 2018, NVIDIA announced the next generation of the platform, the DGX-2™: an entirely new server architecture incorporating NVSwitch™, a non-blocking switch fabric connecting 16 Tesla V100 GPUs, each with 32GB of HBM2 memory.

The performance of these systems is simply beyond comprehension for most users: 2 petaFLOPS of compute power delivered in a single 10-rack-unit server weighing 350 pounds.
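
A quick back-of-envelope calculation, using NVIDIA's published Tesla V100 figures (roughly 900 GB/s of HBM2 bandwidth per GPU), shows why this class of hardware makes millisecond-scale scans of very large, GPU-resident datasets plausible:

# Back-of-envelope figures for a DGX-2 class system, assuming NVIDIA's
# published Tesla V100 32GB specifications (~900 GB/s of HBM2 bandwidth each).
gpus = 16
memory_per_gpu_gb = 32        # HBM2 capacity per V100
bandwidth_per_gpu_gb_s = 900  # approximate HBM2 bandwidth per V100

total_memory_gb = gpus * memory_per_gpu_gb                   # 512 GB GPU-resident
total_bandwidth_tb_s = gpus * bandwidth_per_gpu_gb_s / 1000  # ~14.4 TB/s aggregate

# Scanning all 512 GB of GPU memory at ~14.4 TB/s takes roughly 35 ms, which
# is what puts interactive, "speed of thought" queries within reach.
full_scan_seconds = total_memory_gb / (total_bandwidth_tb_s * 1000)
print(total_memory_gb, round(total_bandwidth_tb_s, 1), round(full_scan_seconds, 3))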

Transforming Disaster Analytics and Situational Awareness

Today, a rare opportunity to address long-standing systemic deficiencies in emergency management has emerged. Key breakthroughs in open-source software development, GPUs, and supporting architectures have converged, providing the capability to deliver highly collaborative analytics environments with real-time performance at massive scale.

The tools developed by Disaster Intelligence create a shared environment where public and private sector stakeholders can securely collaborate across sectors, share context, and dramatically improve situational awareness. The company continues to evolve these tools in concert with a diverse set of public and private partners including NVIDIA, OmniSci, University of Delaware Disaster Research Center, CoreLogic, DigitalGlobe, and the U.S. Department of Homeland Security.

By harnessing these technologies and public-private partnerships, together with sound policy and standards development, it is now possible to deliver revolutionary situational awareness and predictive analytic tools to visualize data, understand disaster risks, and inform time-critical mitigation, planning, and response decisions.

[Editor's Note: Sean M. Griffin, President and Chief Strategy Officer, Disaster Intelligence, Inc., will be speaking at the 8th Annual Building Resilience Through Private-Public Partnerships Conference. Learn more and register here.]
