Data: A Tool for Resilience Against Disasters

January 29, 2015

New England is digging out from one of the worst storms ever to hit the region. While New York City dodged the worst of the storm, Long Island, upstate New York, Massachusetts, and Connecticut saw several feet of snowfall. Fortunately, there were no widespread power outages (save Nantucket) or injuries, thanks in part to advance planning by state and city leaders.

When it comes to natural disasters, time is against the responders. The faster a community can make order from chaos, the better able it is to save lives, limit economic consequences, and rebuild what was lost. This capacity is called resilience, and both preparing for and recovering from a natural disaster demand data.

Over the last 60 years, the Federal Emergency Management Agency (FEMA) has declared more than 3,000 incidents as official disasters, and the agency has been tracking the declarations at the county level, which Washington Post reporter Christopher Ingraham graphically lays out in a recent article.

There are several takeaways from Ingraham’s analysis.

  • By the numbers, southern California and Oklahoma are the most disaster-prone areas in the United States. Yet the impact of a natural disaster is relative. As Ingraham notes, a severe freeze in Florida may be disastrous for that state’s agriculture, but the same weather in northern Michigan would not even merit a mention on the evening weather forecast. A disaster’s impact depends on the environment and industry of a particular area, making resilience planning specific to the needs of each community.
  • Natural disasters can have impacts that stretch across county and state lines. Floods, hurricanes, and other severe storms can affect an entire region, making interstate collaboration an important element in building resilience. Historical data can help decision makers evaluate and coordinate resilience efforts.
  • Data does not inherently tell a true and complete story; it must be paired with other information. As Ingraham notes in several places, the FEMA data reflects not just historical disaster information but also the politics of disaster recovery. For example, the Midwest sees many storms each year, yet Illinois appears to see fewer than its neighbors. This is not because tornadoes avoid Illinois. Rather, as Ingraham writes, “this may be due to Illinois governors' relative reluctance to request disaster aid for storms.” The lesson is that data cannot be assessed in isolation; it must be placed within a real-world context to reveal deeper insights and perspectives. This is also why data analytics cannot replace human thinking: data does nothing on its own, and it is how we analyze and apply it that makes data a powerful force for good.
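The county-level tallies behind Ingraham’s maps come down to simple aggregation: count declarations per state (or county) and compare. A minimal sketch in Python, using illustrative stand-in records rather than real FEMA data:

```python
# Minimal sketch of a declaration tally. The records below are
# illustrative stand-ins, NOT real FEMA declaration data.
from collections import Counter

declarations = [
    {"state": "OK", "county": "Cleveland", "type": "Tornado"},
    {"state": "OK", "county": "Moore", "type": "Tornado"},
    {"state": "CA", "county": "Los Angeles", "type": "Fire"},
    {"state": "IL", "county": "Cook", "type": "Severe Storm"},
]

# Count declarations by state; the same idea scales to county or incident type.
by_state = Counter(rec["state"] for rec in declarations)
print(by_state.most_common())  # → [('OK', 2), ('CA', 1), ('IL', 1)]
```

The caveat from the bullet above applies directly: a raw count like this captures declarations requested and granted, not storms that occurred, so the numbers must be read alongside the politics that produced them.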

Data’s role in resilience was championed late last year when Data.gov launched an open-data portal presenting natural disaster datasets and applications that arm first responders, businesses, volunteers, and even survivors with information following an event, fostering the capacity for a holistic, whole-of-community response. There are currently 114 Federal datasets, and this critical resource will continue to grow over time. Among them are watches, warnings, and advisories for severe weather events; the National Structures Dataset, which catalogs facilities relevant to disaster planning and emergency response; and the Next Generation Radar Locations Map of weather radar sites.
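For readers who want to explore the declaration data directly, FEMA also exposes it programmatically through its OpenFEMA API. The sketch below assumes the `DisasterDeclarationsSummaries` endpoint and its OData-style query parameters; both are assumptions worth verifying against FEMA’s current documentation before use.

```python
# Hedged sketch of querying FEMA's OpenFEMA API for disaster declarations.
# The endpoint path and parameter names are assumptions based on OpenFEMA's
# published OData-style interface; verify them on fema.gov before relying on this.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://www.fema.gov/api/open/v2/DisasterDeclarationsSummaries"

def build_query(state, limit=10):
    """Build an OData-style query URL filtering declarations by state."""
    return API + "?" + urlencode({"$filter": f"state eq '{state}'", "$top": limit})

def fetch_declarations(state, limit=10):
    """Fetch up to `limit` declaration records for one state (live request)."""
    with urlopen(build_query(state, limit)) as resp:
        payload = json.load(resp)
    return payload.get("DisasterDeclarationsSummaries", [])

# Example usage (performs a live HTTP request, so it is commented out here):
# for rec in fetch_declarations("MA", limit=5):
#     print(rec.get("declarationDate"), rec.get("incidentType"))
```

Separating URL construction from the network call keeps the query logic testable offline, which matters when the upstream schema is something you are only assuming.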

Like other parts of Data.gov, the disaster portal follows the philosophy of Open Data: the notion that sharing data freely, with few or no restrictions, can be a catalyst for innovation. For companies, organizations, and individuals, collecting and storing data carries real costs. Removing this barrier lets more entrepreneurs, innovators, businesses, and other thinkers and doers access this resource freely, boosting the capacity for data-driven innovation and resilience. The folks behind Data.gov understand this, and on the new portal they even pose a challenge to innovators about one of the deadliest natural events: floods. Data.gov asks, “How might we leverage real-time sensors, open data, social media, and other tools to help reduce the number of fatalities from flooding?”

To be sure, the new portal isn’t perfect (nothing new ever is), but it is a start. The people behind the site show a humility that is uncommon in the Federal government. Notes throughout the disaster portal and other portions of Data.gov ask for feedback, solicit requests for other datasets, and present an inclusive, participatory theme, which is the essence of Open Data. That is also the beating heart of effective disaster response: a willingness to look for additional help and insights, because no one person has all the answers.

The massive storm in New England brought feet of snow to several states, but it brought something else too—more data. Natural disasters are a fact of life, but more and better data arms us with the information we need to build resilience and limit the short- and long-term consequences of Mother Nature’s wrath. 
