Artificial intelligence against deforestation
Identifying the link between specific commodities and areas at risk of deforestation can be extremely complex. At Barry Callebaut, we require our suppliers, such as palm and soy suppliers, to identify the forest areas that need protecting and those that can be developed for agriculture. However, conducting this type of assessment is difficult, time-consuming and often costly.
We needed a solution to support our suppliers that was more efficient, less expensive, and able to scale. Moreover, we knew that artificial intelligence was already being used in climate change strategies, for example to predict droughts, cloud cover, and greenhouse gas emissions.
The question we therefore asked ourselves was: how can we combine existing deforestation methodology with artificial intelligence?
To answer that question, we teamed up with EcoVision Lab, part of the Photogrammetry and Remote Sensing group at ETH Zurich, which has the capability to develop highly automated artificial intelligence solutions. The group has long-standing experience in combining machine learning (deep learning) with remote sensing to address ecological challenges. The team at ETH Zurich is utilizing data from a NASA laser scanner attached to the International Space Station and imagery from the European Space Agency (ESA). This allows large areas to be mapped with artificial intelligence, limiting on-the-ground measurements to only the most critical locations.
This collaboration led to the development of a publicly available, industry-first, indicative High Carbon Stock (HCS) map that identifies forests with high conservation value and areas where deforestation would cause the highest carbon emissions.
Building on best-in-class approaches
The HCS map complements the widely used field-data method for measuring the link between commodity cultivation and deforestation, the so-called High Carbon Stock Approach (HCSA).
HCSA is a widely recognized methodology that is increasingly being used by certification standards, such as the Roundtable on Sustainable Palm Oil (RSPO), and by companies that are committed to breaking the link between deforestation and land development in either their operations or supply chain. The reliance on ground and aerial imagery for HCSA is challenging, because manually measuring landscapes and evaluating vegetation classes is labor-intensive and difficult to roll out at scale, whilst using planes equipped with specialized laser scanners is an expensive option.
Combining HCSA with the predictive power of artificial intelligence
Implementing artificial intelligence solutions all starts with data quality. Deep learning is a fast-moving research field: new algorithms are improving quickly and are demonstrating the potential to revolutionize forest monitoring and carbon stock estimation based on satellite images. However, when relying on supervised learning, that is, learning from large reference datasets, the amount and quality of data is the key to success. Over the past four years, the ETH team has focused on utilizing the new satellite imagery and calibrating regional carbon biomass data. As a result, we have developed a tool that is highly automated, objective, and able to scale indicative HCS mapping up to entire world regions.
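The supervised-learning principle described above can be illustrated with a minimal sketch: a model is fitted to reference measurements paired with per-pixel satellite features, then used to predict values for unmeasured pixels. Everything below is hypothetical and for illustration only: the synthetic data, the three "spectral band" features, and the simple linear model stand in for the far richer lidar references and deep learning used by the ETH Zurich team.

```python
import numpy as np

# Illustrative sketch of supervised learning for biomass mapping.
# Synthetic per-pixel "satellite" features and reference above-ground
# biomass (AGB) values -- NOT real data or the ETH Zurich model.
rng = np.random.default_rng(42)

n_reference = 200  # pixels with reference measurements (e.g. field or lidar)
features = rng.uniform(0.0, 1.0, size=(n_reference, 3))  # e.g. spectral bands
true_weights = np.array([120.0, 80.0, 40.0])             # hypothetical t/ha per band
biomass = features @ true_weights + rng.normal(0.0, 5.0, n_reference)

# Fit a simple linear model by least squares on the reference data.
X = np.column_stack([features, np.ones(n_reference)])    # add intercept column
coef, *_ = np.linalg.lstsq(X, biomass, rcond=None)

# Predict biomass for a new, unmeasured pixel -- this is the step that
# lets a calibrated model scale far beyond the reference locations.
new_pixel = np.array([0.5, 0.5, 0.5, 1.0])
predicted_agb = new_pixel @ coef
print(f"Predicted AGB: {predicted_agb:.1f} t/ha")
```

The key point the sketch shows is that prediction quality is bounded by the reference data: with few or noisy reference pixels the fitted coefficients drift, which is why calibrating high-quality regional biomass data was central to the project.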
Southeast Asia and beyond
Today’s launch of this large-scale HCS map marks a pivotal moment in our exciting journey with ETH Zurich, one that could ripple well beyond our own chocolate and cocoa supply chain.
We are passionate about addressing the biggest challenges in our industry, and we can only do so by working together and by consistently moving the needle to find innovative solutions.
Interested in learning more about our large-scale HCS maps?
For more information, please contact:
Oliver von Hagen
Director Global Ingredients Sustainability, Barry Callebaut
[email protected]