Tornadoes and Terabytes: Big Data as Weatherman
Maybe you’ve heard these quips before:
- Who is it that everyone listens to but no one believes? The weatherman.
- What does the word “meteorologist” mean? It means “liar.”
- What did the honest weatherman forecast for today? Bright and sunny with an 80% chance of being wrong.
There are hundreds of these kinds of jokes. Meteorologists are some of the most picked-on professionals, second perhaps only to lawyers. In Monday-morning-quarterbacking the weather, it is easy to point out where the weatherman was wrong, but sharpshooting the local meteorologist belies the incredible feat they are trying to achieve—predicting the future.
This is something human beings have been trying to do since we were planting seeds in the Stone Age. Human understanding of the weather and seasons guides when we plant and when we harvest; when we hunt and when we store food for the winter; when we travel and when we stick close to home. It determines what we wear, the activities we choose, and how we live. On a larger scale, it impacts the economy, societal planning, and productivity. In short, the weather is some of the most important and long-pursued data in human history.
Despite the sometimes-incorrect meteorologists, our modern capacity to understand and anticipate the weather would look prophetic to our ancestors. Using satellites, sensors spread throughout the world, and advanced computers and algorithms, we can more accurately anticipate the weather today than ever before, and we are poised to get better, thanks in part to the tidal wave of Big Data that is in our grasp.
Data is a boon for meteorologists. Bryson Koehler, the Chief Information Officer for The Weather Company (parent company of The Weather Channel), says:
“Figuring out the weather once doesn’t matter. What does matter is that once you understand how people have reacted to weather in the past, you can extrapolate to predict how people will act in the future. This is fascinating to us, and maintaining our position as the world’s best weather forecasting organization is really an exercise of big data.”
Gobs of Weather Data
To capitalize on the wealth of data, The Weather Company has embarked on its Storage Utility Network (SUN) project, which consolidates the company’s data storage and records 2.25 billion weather data points 15 times an hour. This adds up to 10 to 20 terabytes a day—an enormous, constant flow of data that allows the company to continually update, refine and improve its forecasts. To store all this information, The Weather Company is relying on cloud providers like Amazon and Google, evidence that the massive potential in data is not limited to the insights gleaned from it. It drives businesses, relationships and the economy writ large. Weather data is big business.
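As a back-of-envelope sanity check on those figures, the daily point count and the implied storage per data point can be worked out directly (the bytes-per-point number is our inference, not something The Weather Company has published):

```python
# Back-of-envelope check on the SUN project's published figures.
POINTS_PER_CYCLE = 2.25e9   # weather data points recorded per cycle
CYCLES_PER_HOUR = 15
HOURS_PER_DAY = 24

points_per_day = POINTS_PER_CYCLE * CYCLES_PER_HOUR * HOURS_PER_DAY
print(f"{points_per_day:.2e} data points per day")   # 8.10e+11

# If that volume occupies 10-20 TB, each point averages only a few bytes.
for tb_per_day in (10, 20):
    bytes_per_day = tb_per_day * 1e12   # using decimal terabytes
    bytes_per_point = bytes_per_day / points_per_day
    print(f"{tb_per_day} TB/day -> {bytes_per_point:.1f} bytes per point")
```

Roughly 12 to 25 bytes per observation, which is about what a compact timestamped sensor reading would occupy, so the two published figures are at least mutually consistent.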
While the everyday citizen may only be concerned with whether they need an umbrella, small, medium, and large businesses rely on weather forecasts to make important financial and strategic decisions. Take the energy industry as an example. Forecasts for extreme weather are essential data for planning energy infrastructure. A location bombarded by hurricanes requires a different kind of engineering than an area that sees frigid temperatures nine months of the year. Weather forecasts also allow energy companies to anticipate demand, ensuring that enough electricity flows into hundreds of thousands of consumer homes to power their air conditioning through a sweltering summer day.
Given the importance to citizens, businesses, and government alike, weather forecasting today is ever-more focused on predictive analytics; that is, looking through data in an attempt to anticipate the future. IBM’s Deep Thunder, for example, feeds historical and near real-time data through supercomputers capable of forecasting extreme weather days in advance for targeted locations (less than a square mile). In tests of Deep Thunder in the New York City metro area, IBM ran calculations that accurately predicted a 2013 blizzard—not just how much snow would fall but when the storm would begin and end.
Integrating more robust predictive analytics into weather analysis is also driving valuable public-private partnerships that can yield better forecasts and business. The National Oceanic and Atmospheric Administration (NOAA) is a federal agency with a broad mandate to monitor ocean and atmospheric conditions. It operates the National Weather Service and collects troves of data from radar, buoys, sea-faring vessels, satellites, and indeed any sensor that records information about water and air conditions. It is a case of the data cup running over; there is so much data that NOAA has turned to the private sector for innovative ideas on how to store and use all this information.
"Quite simply, NOAA is the quintessential big-data agency," said NOAA CIO Joe Klimavicz. "Imagine the economic potential if more of these data could be released. Unleashing the power of NOAA's data will take creative and unconventional thinking, and it's a challenge we can't tackle alone."
Here then is yet another example of how data affords insights that can improve our way of life while also contributing to economic growth. It is, as has been said elsewhere on this blog, a case of “data for good” — good for us individually, for the nation’s businesses and for the country overall.
Yet, while there have been dramatic advancements in predictive analytics, meteorologists and other forecasting professionals and organizations can still get it wrong. NOAA has taken some flak for not measuring up against its European equivalent, the European Centre for Medium-Range Weather Forecasts. Time and again, the Europeans have nailed a forecast while NOAA has missed it. Why is this? It boils down to the amount of accurate data and the computing power to analyze it.
“They have ten times more computer power than we [Americans] do, which allows them to run their weather prediction model at a higher resolution than ours, roughly twice the resolution,” said University of Washington professor of atmospheric science Cliff Mass. “Imagine a three-dimensional grid covering the atmosphere. They are able to define things much better—structures in the atmosphere and physical processes—because they're using a finer grid.”
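Mass’s numbers line up with a standard scaling argument for grid-based forecast models: refining the grid multiplies the number of points, and numerical stability forces the time step to shrink by the same factor. A rough sketch (an illustrative textbook scaling, not an ECMWF or NOAA figure):

```python
# Why roughly tenfold computer power buys only about twice the
# resolution in an explicit grid-based forecast model.
def cost_multiplier(horizontal_refinement):
    """Relative compute cost when horizontal grid spacing is divided
    by `horizontal_refinement`.

    Grid points grow with the square of the refinement (two horizontal
    dimensions), and the CFL stability condition shrinks the time step
    by the same factor, contributing one more power.
    """
    return horizontal_refinement ** 3

print(cost_multiplier(2))   # doubling resolution -> 8x the computation
```

Add the extra vertical levels that usually accompany a finer horizontal grid, and the cost climbs toward the roughly tenfold figure Mass cites.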
Predicting Powerful Storms
Forecasts for the 2013 hurricane season were abysmally inaccurate, and no one is quite sure why. The Florida State University Center for Ocean-Atmospheric Prediction Studies (COAPS), which has a good reputation for accurately anticipating severe weather, said in 2013 that there was a 70% probability that the season would see “12 to 17 named storms with five to 10 of the storms developing into hurricanes.”
COAPS wasn’t alone. The Tropical Meteorology Project at Colorado State University (CSU) also predicted a bad hurricane season, as did other forecasting organizations. They were all wrong. Last year, just two storms reached the lowest hurricane strength; neither was considered a major hurricane. Now, researchers and scientists around the world are trying to figure out why their forecasts were off the mark. So far, Phil Klotzbach and William Gray of the CSU Atmospheric Sciences department have found that some datasets did not agree with one another. Yet, as we head into Hurricane Season 2014 (which runs from June 1st through November 30th), data scientists and meteorologists are still not sure why data-based forecasting could have been so wrong. This, however, has not stopped CSU forecasters from announcing what they expect will be a quiet hurricane season, with nine tropical storms, three of which will become hurricanes.
Will 2014 be a calm year for hurricanes or will the forecasters miss it again? Right or wrong, these are the growing pains of a world working its way through the potential (and sometimes peril) of data-driven insights and predictive analytics. Fortunately, the only way to go is up. As more data is collected and better computing power brought to bear, our ability to forecast the weather will continue to improve. It may even make all the weatherman jokes obsolete, though we will likely keep telling them anyway.