
BIOC@PT - Automatic Forest Biodiversity Sensors

For several decades, the rapid and alarming decline in biodiversity has made monitoring environmental change a crucial issue. Traditional methods of biodiversity monitoring are no longer adequate, and automating the collection of samples using images, videos, and sounds must be considered.

Photo of a wood-boring Agrilus beetle
© Bouget

Thanks to advances made over the last decade in the fields of genomics, computer science, and artificial intelligence, sensors can now be combined with environmental DNA barcoding, visual recognition, and autonomous sound recognition technologies to identify or count species.

Approaches

The Bioc@pt project explored new ways of acquiring biodiversity data that are faster and less costly than traditional morphological identification in the laboratory, by automating both field surveys (using sensors) and species identification (using artificial-intelligence techniques applied to photographic recognition). The work had a threefold focus: biomonitoring, biodiversity monitoring, and the study of spatiotemporal patterns of biodiversity. This exploratory approach involved two important and relatively poorly known forest taxonomic groups (insects and bryophytes) and could be adapted to other groups.

Bryological and mycological studies

A feasibility test of autonomous visual recognition using deep learning was conducted on photos of bryophyte spores, sampled in the forest by actively aspirating airborne particles with an automatic Cyclone Sampler sensor. We developed a CNN-type algorithm capable of discriminating 'spore'-type objects from other particles, and built a photo bank of diaspores from 54 bryophyte species, including species recorded by naturalist inventory at the aspiration sampling sites.
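The spore-discrimination step can be pictured as a small binary image classifier. The sketch below is a minimal illustrative stand-in, not the project's actual network: the layer sizes, input resolution, and the `SporeCNN` name are all assumptions.

```python
import torch
import torch.nn as nn

class SporeCNN(nn.Module):
    """Minimal binary classifier: 'spore' vs 'non-spore' particle crops.
    Illustrative architecture only; the real network is not published here."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),  # pool to one value per channel
            nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

# One forward pass on a dummy 64x64 RGB particle crop.
model = SporeCNN()
logits = model(torch.randn(1, 3, 64, 64))
```

Trained with a standard cross-entropy objective, such a network outputs one logit per class for each candidate particle image.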

Prototype photographic entomological sensor

A prototype automatic camera-trap insect sensor was developed and tested in the forest on a group of wood-boring beetle species (buprestids, Agrilus sp.). In collaboration with Cap2020, which handled the mechatronic engineering of the sensor (housing, backlit optical sensor, communication, data backup, energy autonomy), we transformed a green Lindgren trap, selective for buprestids, into a prototype photographic sensor that is selective, non-destructive, automatic, and connected.

Autonomous visual recognition of photos of Agrilus beetles

The medium-term goal is to develop an automatic camera trap with an embedded photo-processing algorithm. Within BIOC@PT, we worked on an automatic laboratory sorting tool capable of recognising beetles photographed in the laboratory (classification by deep learning), in order to automate the processing of samples from conventional traps.

Results

Bryological and mycological studies

With these training data, the overall recognition rate remained insufficient, ranging between 63% and 80%; average sensitivity and precision were low for all species, with frequent confusion between certain species.
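Per-species sensitivity (recall) and precision are read off the classifier's confusion matrix: sensitivity is the diagonal count divided by the row total, precision the diagonal count divided by the column total. A minimal computation, using an invented 3-species matrix purely for illustration:

```python
import numpy as np

def per_class_metrics(cm: np.ndarray):
    """Row i = true class i, column j = predicted class j.
    Sensitivity (recall) = TP / row sum; precision = TP / column sum."""
    tp = np.diag(cm).astype(float)
    sensitivity = tp / cm.sum(axis=1)
    precision = tp / cm.sum(axis=0)
    return sensitivity, precision

# Invented example: species 2 is frequently confused with species 1.
cm = np.array([[8, 1, 1],
               [2, 6, 2],
               [0, 4, 6]])
sens, prec = per_class_metrics(cm)
```

Off-diagonal mass concentrated in a few cells, as in column 1 here, is exactly the "frequent confusion between certain species" pattern described above.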

Prototype photographic entomological sensor

Remote updating of the script controlling the frequency of shots, and remote transmission of images, are both operational. The geometry of a new camera chamber with an anti-return collar, optimised after observing the behaviour of trapped insects, was integrated into version 2 of the Cap2020 sensor housing and successfully tested in the forest in summer 2024 (photo).

Non-destructive trap for capturing forest insects, equipped with automatic photography

Autonomous visual recognition of Agrilus beetle photos


A custom neural network was created and trained to distinguish between classes (species) in a training photo library of several thousand images covering the seven main regional species of the genus Agrilus and one invasive exotic Agrilus species; eight bark beetle species were then added. The neural network distinguishes these taxa well: with the expanded training photo library, the overall recognition rate reaches 90%.

The algorithm trained with the laboratory photo library proved capable of distinguishing between Agrilus species in photos taken with the field sensor, even though the latter is of lower optical quality than the digital laboratory microscope.

Participants

INRAE units involved

  • UR EFNO - Unité de recherche Écosystèmes forestiers
  • UMR SADAPT - Sciences pour l'action et le développement : activités, produits, territoires 
  • UR P2E - Laboratoire de Physiologie, Ecologie et Environnement - Université d'Orléans
  • USC Ecodiv-Rouen - Étude et compréhension de la biodiversité 
  • UMR Herbivore - VetAgro Sup

Partners

  • IRBI - Institut de Recherche sur la Biologie de l'Insecte - Université de Tours / CNRS

Contacts - coordination

See also

To find out more, see the scientific assessment and the main publications in the HAL Biosefair collection.