As smoky, apocalyptic-looking skies spread across western North America, researchers are scrambling to improve their predictions of wildfire risk. A changing climate means not just higher temperatures but also new patterns of rain and snow, which interact in complex ways to contribute to the risk of fire.
A paper in PNAS this week argues that the role of temperature has been exaggerated, while the importance of rainfall has been overlooked. Zachary Holden and a team of collaborators report that, out of a range of factors, longer dry spells were the best predictor of wildfire risk. Other researchers, however, argue that the paper has overlooked previous research and that its results aren’t definitive. Everyone agrees that wildfire risk is set to increase as a result of climate change, but the gritty details are harder to nail down.
Hot, dry, and fiery
This research sits in a strange place between the blindingly obvious and the intensely murky. The contributions of rising temperatures to summer wildfire risk are intuitive: less snow, earlier snowmelt, faster evaporation of water from the environment, hotter and drier vegetation. The role of drought in making things hotter and drier is also clear.
If every place were just getting steadily hotter and drier at exactly the same rate, things would be simple. The complication is that the projections of changes in rain, snow, and temperature differ across regions and even across elevations within those regions, so predicting risk means working out the relative contributions of all those factors. “Meaningful use of climate projections can only occur if we can accurately link the different climate elements to fire,” write Holden and colleagues.
To estimate those contributions to overall risk, the researchers looked at historical fire data from 1979 to 2016 in the Western US, comparing it to data on snow, temperature, and rainfall. They found evidence of less rain falling over time and also of a reduction in the number of days with a decent amount of rainfall (anything more than 2.5 mm). “We’ve been kinda quietly breaking records here for really long dry spells,” says Holden.
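The dry-spell metric described above is easy to state concretely. As a rough illustration (this is my own sketch, not the authors’ analysis code; only the 2.5 mm wet-day threshold comes from the article), counting wet days and the longest run of dry days from daily rainfall totals might look like:

```python
def dry_spell_stats(daily_rain_mm, wet_threshold=2.5):
    """Return (number of wet days, longest run of consecutive dry days).

    A "wet day" is any day with rainfall above wet_threshold (in mm),
    matching the 2.5 mm cutoff mentioned in the article.
    """
    wet_days = sum(1 for r in daily_rain_mm if r > wet_threshold)
    longest = run = 0
    for r in daily_rain_mm:
        # Reset the dry-spell counter on a wet day; otherwise extend it.
        run = 0 if r > wet_threshold else run + 1
        longest = max(longest, run)
    return wet_days, longest

# Ten days of hypothetical rainfall totals in mm:
rain = [0.0, 3.1, 0.2, 0.0, 0.0, 1.4, 0.0, 5.0, 0.0, 0.0]
print(dry_spell_stats(rain))  # (2, 5): two wet days, five-day dry spell
```

Note that days with trace rainfall (like the 0.2 mm and 1.4 mm days) still count as dry under this definition, which is why a single sub-threshold shower doesn’t break a record dry spell.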
When they compared the different factors, they found that the biggest contribution came from the dry spells: the more days without decent rainfall, the greater the chances of a wider area being covered by wildfire in the region. The researchers, many of whom are based at the US Forest Service, plan to use the data to improve predictions. “The primary focus of this work was to improve how we characterize wildfire danger so that fire managers can make better decisions,” Holden explains.
Temperature’s not out of the running yet
However, other researchers are not convinced by the results. LeRoy Westerling, whose work on wildfire risk has highlighted temperature as a primary factor, points to research of his own that also identified the decline in rainfall but did not find that it played such a large role in wildfires.
The two studies made different decisions in how they chose and analyzed their data. For instance, Holden and his colleagues explored three different ways to measure snow: the peak amount of snow in a given year, the number of snow-free days between March and September, and the amount of snow lingering on April 1. They picked peak snow amount for their analysis because it had the strongest correlation with wildfire. Westerling and his colleagues, on the other hand, used the last date of permanent snowpack. These metrics are all related but aren’t identical.
Westerling also points to the inclusion of 2016, a record-breaking drought year, in the data set. While the drought itself is obviously part of the changing climate, it was “followed by one of the wettest years,” he says. “We don’t really know whether we’ll have more drought or more wet years, too; the trend we’ve seen is towards greater variability.”
It’s useful to think of the different findings as boundaries staking out the limits of what’s likely to be true, says Park Williams, whose work has looked at the role of aridity in wildfire. His interpretation, he explains, is that his work “should be interpreted as an upper bound on the importance of atmospheric heat/aridity,” while the new research should be interpreted as a lower bound.
“Reality is somewhere in between,” he explains. “This paper will motivate future work to more thoroughly disentangle the intimately intertwined effects.”