CIAR project: deep learning to detect forest fires from satellites

France experienced very significant heat waves this year, including an unusually early one. With climate change, this phenomenon is likely to recur every year, increasing the risk of forest fires, as we unfortunately saw this summer. IRT Saint Exupéry is studying technological solutions based on AI and artificial neural networks to detect forest fires as early as possible from satellites in orbit. The institute presented the results of this work in a paper at CNIA 2022, “Model and dataset for the multi-spectral detection of forest fires on board satellites”, and has decided to make the model accessible to the scientific community to strengthen research on this subject.

This work was carried out as part of the “Chaîne Image Autonome et Réactive” (CIAR) project, which studies technologies for deploying AI for image processing on embedded systems (satellites, delivery drones, etc.). It is conducted at IRT Saint Exupéry in partnership with Thales Alenia Space, Activeeon, Avisto, ELSYS Design, MyDataModels, Geo4i, INRIA and LEAT/CNRS.

Following a call for applications, the project was selected to carry out in-orbit demonstrations as part of the OPS-SAT mission. This nano-satellite, launched on December 18, 2019, is only 30 cm high and is equipped with an Intel Cyclone V SoC (“System on Chip”) computer, as well as a small camera with a ground sampling distance of about 50 meters. The CIAR team contributed to the development of the AI solutions embedded in the satellite.

On March 22, 2021, the team achieved two space firsts:

  • Remote updating, from a ground station, of an artificial neural network on board a satellite.
  • The use of an FPGA (“Field-Programmable Gate Array”) to deploy and use this neural network in orbit.

Model and dataset for the multi-spectral detection of forest fires on board satellites

It is estimated that the number of forest fires could increase by 50% by 2100. Houssem Farhat, Lionel Daniel, Michaël Benguigui and Adrien Girard investigated on-board remote sensing of these fires in order to raise early warnings directly from space.

They trained a UNet-MobileNetV3 neural network to segment a set of 90 multispectral images from Sentinel-2, a series of European Space Agency Earth observation satellites developed under the EU-funded Copernicus program. They annotated these images semi-automatically and then checked them manually.

Sentinel-2 images were downloaded from Sentinel-Hub’s OGC WCS API, and their GSD (Ground Sampling Distance, in meters per pixel) ranges from 40 to 80 m. The images were then cut into 256×256 px patches and split into three datasets for training, validation and testing, yielding a model that reaches an IoU of 94%.
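The two operations mentioned here, cutting an image into fixed-size patches and scoring a predicted mask with IoU (Intersection over Union), can be sketched in a few lines of numpy. The function names are our own for illustration; the actual pipeline is in the published code:

```python
import numpy as np

def extract_patches(image: np.ndarray, size: int = 256) -> list:
    """Cut an (H, W, C) image into non-overlapping size x size patches,
    discarding incomplete patches at the right/bottom borders."""
    h, w = image.shape[:2]
    return [
        image[r:r + size, c:c + size]
        for r in range(0, h - size + 1, size)
        for c in range(0, w - size + 1, size)
    ]

def iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection over Union between two boolean masks."""
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return float(inter / union) if union else 1.0

# Example: a 512x512 3-band image yields four 256x256 patches.
img = np.zeros((512, 512, 3), dtype=np.uint8)
print(len(extract_patches(img)))  # 4
```

An IoU of 94% means that, averaged over the test set, the predicted fire pixels and the ground-truth fire pixels overlap almost completely relative to their union.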

Two examples of patches (256×256 pixels). On the left, in the RGB column, the images appear in natural colors. The “False Color” column represents the same scene, but the Red-Green-Blue channels of the display show the B12, B11 and B04 spectral bands of the Sentinel-2 satellites. Since bands B12 and B11 are sensitive to short-wave infrared, active fires appear in orange. The ground-truth masks are presented in the third column, indicating exactly where the active fires are located, based on a partially manual analysis. AI algorithms can find these same locations fully automatically, as shown in the right column, which displays the outputs of a neural network trained by the researchers.
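The false-color composite described above is a simple channel remapping: the display's red, green and blue channels are fed the B12, B11 and B04 bands respectively. A hedged sketch, assuming each band is available as a 2-D reflectance array keyed by its name (the `false_color` helper is our own illustration):

```python
import numpy as np

def false_color(bands: dict) -> np.ndarray:
    """Compose a false-color image: display R <- B12, G <- B11, B <- B04.
    Each band is a 2-D array; the output is scaled into [0, 1]."""
    stack = np.stack(
        [bands["B12"], bands["B11"], bands["B04"]], axis=-1
    ).astype(float)
    peak = stack.max()
    return stack / peak if peak > 0 else stack

# Hypothetical patch: three 256x256 bands of synthetic reflectances.
rng = np.random.default_rng(0)
patch = {name: rng.random((256, 256)) for name in ("B12", "B11", "B04")}
rgb = false_color(patch)
print(rgb.shape)  # (256, 256, 3)
```

Because B12 and B11 respond strongly to the thermal signature of active fires, fire pixels dominate the red and green display channels and therefore render as orange.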

The network was then deployed on an onboard GPU of a satellite in low Earth orbit.

Version 2 (constant GSD equal to 20m):

A second version of the dataset was produced after the paper presented at CNIA 2022. Sentinel-2 images were downloaded from the Copernicus service, at the maximum resolution and at the L1C processing level. The team recommends this second version for any future research, since all patches in this dataset have a constant GSD of 20 m.
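Sentinel-2 L1C bands come natively at 10, 20 or 60 m GSD depending on the band, so building a dataset with a constant 20 m GSD requires resampling. As a simplified sketch (the team's actual resampling method is not described here), a 10 m band can be brought to 20 m by block averaging; the coarser 60 m bands would instead need interpolation:

```python
import numpy as np

def downsample_to_gsd(band: np.ndarray, gsd_in: float,
                      gsd_out: float = 20.0) -> np.ndarray:
    """Block-average a single band from gsd_in metres/pixel to gsd_out
    metres/pixel. gsd_out must be an integer multiple of gsd_in;
    trailing rows/columns that do not fill a block are dropped."""
    factor = round(gsd_out / gsd_in)
    h, w = band.shape
    h, w = h - h % factor, w - w % factor
    blocks = band[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Example: a band at 10 m GSD reduced to 20 m halves each dimension.
b04 = np.random.rand(100, 100)  # hypothetical 10 m band
print(downsample_to_gsd(b04, gsd_in=10.0).shape)  # (50, 50)
```

Averaging 2×2 blocks preserves the mean radiometry of each 20 m footprint, which is why it is a common first approximation for this kind of resolution harmonisation.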

The IRT has also decided to publish all the datasets:

  • Download both versions of the dataset here
  • CNIA 2022 results (based on S2WDS Version 1) are reproducible via available code here
  • Similar code for S2WDS Version 2 (post CNIA 2022 paper) is available here

Source of the article: Model and dataset for the multi-spectral detection of forest fires on board satellites, Conférence Nationale en Intelligence Artificielle 2022 (CNIA 2022), June 2022, Saint-Étienne, France. hal-03866412.


  • H. Farhat, Saint-Exupéry Technological Research Institute, AViSTO
  • L. Daniel, Saint-Exupéry Technological Research Institute
  • M. Benguigui, Saint-Exupéry Technological Research Institute, Activeeon
  • A. Girard, Saint-Exupéry Technological Research Institute
