RENO, Nev. – To model and map fire-vulnerable forest vegetation across millions of acres in California, scientists at the University of Nevada, Reno are combining a variety of new technologies with massive amounts of data and computational power. The research will help optimize fuel management to reduce fire risk, support carbon sequestration and improve water quality.

Ladder fuel loss from the 96,000-acre Ferguson Fire in the Sierra Nevada in 2018 is captured with LiDAR. Red indicates biomass that was consumed in the fire; blue indicates biomass that survived the fire.

The research team, led by Jonathan Greenberg and Erin Hanan in the University’s College of Agriculture, Biotechnology & Natural Resources, is working on a set of interrelated initiatives that are collectively called the “GigaFire Project.” Their overarching goal is to understand, using remote sensing technology and process-based models, how vegetation and fuels are changing over large landscapes.

Greenberg and Hanan are researchers with the College’s Experiment Station and Department of Natural Resources & Environmental Science. Their research will produce statewide and localized fuel maps that will help identify where fire risk is greatest. The maps will also inform modeling scenarios designed to predict how management can mitigate fire risk while also promoting carbon retention and water security.

With $570,000 from the California Air Resources Board and nearly $1.8 million from CAL FIRE, the researchers are mapping surface and canopy fuels across the state using:

  • multi-sensor remote sensing data from Landsat and airborne LiDAR (LiDAR stands for Light Detection and Ranging, a remote sensing method used to examine the three-dimensional structure of vegetation);
  • field-based sampling with terrestrial laser scanning and ground-based photogrammetry (the use of photography in surveying and mapping to measure distances between objects) to calibrate and validate changes over time;
  • machine learning; and
  • cloud and high-performance computing to map surface fuel model types, canopy base height, and canopy bulk density across the state.
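As a rough illustration of how machine learning can tie LiDAR-derived structure metrics to fuel properties such as canopy bulk density, the sketch below fits a random forest to synthetic plot data. The feature names, the target relationship and all values are illustrative assumptions, not the project’s actual pipeline:

```python
# Hedged sketch: predicting a canopy fuel metric (here, a stand-in for
# canopy bulk density) from LiDAR-derived structure metrics with a
# random forest. All data below are synthetic and illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic field plots: mean return height, canopy cover fraction,
# and return density stand in for real LiDAR metrics.
n = 500
mean_height = rng.uniform(2, 40, n)   # meters
cover = rng.uniform(0.1, 1.0, n)      # fraction of canopy cover
density = rng.uniform(1, 20, n)       # returns per square meter
X = np.column_stack([mean_height, cover, density])

# Fake field-measured fuel metric with a little measurement noise.
y = 0.01 * mean_height * cover + 0.002 * density + rng.normal(0, 0.01, n)

# Hold out plots to validate the model, as field data would be used
# to calibrate and validate the statewide maps.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
score = model.score(X_test, y_test)  # R^2 on held-out plots
print(f"held-out R^2: {score:.2f}")
```

In practice the trained model would then be applied pixel by pixel across the state, which is where the cloud and high-performance computing comes in.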

Lessening the severity of wildfires through better ground and resource management is where the GigaFire team is making a difference with its recently funded research and its collaboration with CAL FIRE and the California Air Resources Board. Part of the work focuses on quantifying the first 2 meters of the forest’s understory, the layer most crucial for predicting fire behavior. Looking toward the future, the team is working to project carbon gains and losses under varying forest treatment scenarios.
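One common way to quantify that understory layer is the fraction of LiDAR returns falling within the lowest couple of meters above ground. A minimal sketch, using synthetic data and assuming return heights have already been normalized to height above ground:

```python
# Hedged sketch: a simple ladder-fuel metric computed as the fraction
# of vegetation LiDAR returns in the first 2 m above ground. Heights
# are assumed to be pre-normalized; the point cloud is synthetic.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic height-above-ground values (meters) for vegetation returns.
heights = rng.uniform(0.0, 30.0, 10_000)

def ladder_fuel_fraction(h, low=0.0, high=2.0):
    """Fraction of returns in the [low, high) understory layer."""
    h = np.asarray(h)
    return float(np.mean((h >= low) & (h < high)))

frac = ladder_fuel_fraction(heights)
print(f"fraction of returns in first 2 m: {frac:.2f}")
```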

These data will be used by the California Air Resources Board to develop new standardized inputs for their program. The GigaFire team aims to prototype an open, transparent and automated scientific modeling framework that can be updated as new data and algorithms become available for improved fuels mapping throughout California.

“We’re using remote sensing and modeling to find all the fuels, especially ladder fuels,” Associate Professor Greenberg said. “It will be a system that is updated regularly and automatically. It will be for the entire state of California, and a few parts of Nevada.”

Other attempts at this kind of modeling have been made. Greenberg and Hanan are improving on them by applying big data and cloud computing to both present-day data and hindcast data reaching back to the 1980s for fuels management.

“Analyzing the amount and location of fuel accumulation allows us to understand the situations where you go from low-intensity ground fires, to high-intensity crown fires,” he said. “Crown fires are the real danger – those are the wildfires where things blow up. Our department contributes to the science behind fuels management. When a fire does break out, and they will break out, you want to have already managed the fuels to minimize the risk of catastrophic wildfires.”

Through their research, Greenberg and Hanan also work with land and resource managers who can target specific areas that need treatment, such as forest thinning, collection of material for pulp and controlled burns. Fuel treatments are often used to mitigate fire risk in forests where decades of suppression have increased fuel loading. However, forest density reductions can sometimes have unintended consequences for water quantity and quality, and such effects can be difficult to predict. Modeling work is aimed at understanding how fuels influence fire behavior and the effects of fire behavior on vegetation, soil and hydrological processes. 

“We are using simulation models to determine when, where and under what circumstances fuel treatments can mitigate the risk of severe crown fire, maintain stable forest carbon, and promote water security for millions of residents across the West,” said Assistant Professor Hanan, who leads the Fire & Dryland Ecosystems Lab and also leads the modeling portion of the GigaFire project.

“Models enable us to make predictions about complex responses to future climate and management scenarios that would not otherwise be possible with measurements alone,” she said. “However, to be valid and to advance our scientific understanding, models need to be continually confronted with field data. This is where Greenberg’s big data research is crucial.”

Greenberg runs the University’s Global Environmental Analysis and Remote Sensing Lab, known as GEARS, which is helping transform the understanding of forest ground coverage through its research with LiDAR technology.

Before LiDAR was used to map forests before and after fires, the only way to determine how much ground cover a given area held was to deploy teams into the field, an expensive and time-consuming endeavor. With LiDAR, the researchers can determine down to the branch what burned and what didn’t during a fire, helping them better understand how fires move and how best to reduce the chances of extremely severe forest fires.
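A minimal sketch of that before-and-after idea, comparing pre- and post-fire point clouds on a coarse voxel grid to flag consumed versus surviving biomass, much like the red and blue areas in the Ferguson Fire image. The data and the 1-meter voxel size are illustrative assumptions:

```python
# Hedged sketch: difference two LiDAR point clouds on a voxel grid.
# A voxel occupied before the fire but empty afterward is labeled
# "consumed"; one occupied in both scans is labeled "survived".
# Both point clouds below are synthetic.
import numpy as np

rng = np.random.default_rng(2)
pre = rng.uniform(0, 10, (5000, 3))        # pre-fire x, y, z (meters)
post = pre[rng.random(len(pre)) > 0.4]     # ~40% of returns lost to fire

def occupied_voxels(points, size=1.0):
    """Set of voxel indices containing at least one return."""
    return set(map(tuple, np.floor(points / size).astype(int)))

pre_vox = occupied_voxels(pre)
post_vox = occupied_voxels(post)
consumed = pre_vox - post_vox   # biomass present before, gone after
survived = pre_vox & post_vox   # biomass present in both scans
print(len(consumed), "voxels consumed,", len(survived), "voxels survived")
```

Real analyses work with far denser point clouds and finer grids, but the core operation, a per-cell comparison of pre- and post-fire structure, is the same.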

All of this research requires gathering, moving and storing massive amounts of data. Some of it is enabled by Pronghorn, the University’s high-performance computing system housed at Switch, the data center in northern Nevada. While the hardware is necessary for the success of the research, the critical piece that makes the difference is the human capital: the research-computing professionals who help the researchers scale their science by leveraging these technologies.

“Dr. Greenberg’s wildfire project is a great example of how the University’s research efforts are evolving with modern technologies in a very data-centric way,” Scotty Strachan, director of cyberinfrastructure in the University’s Office of Information Technology, said. “Being able to capture key data at scale, rapidly process and analyze it, and then distribute science-based information to decision-makers and the public requires a new way of thinking about networking, computing and data at the University.

“Our emerging research cyberinfrastructure team is facing this challenge head-on, and working with our scientists and campus leadership to evolve Nevada’s capabilities to bring real solutions to real problems in real time.”