This same year we received funding from the Australian Marine Mammal Centre (AMMC) for a one-year project in which we planned to work out how to replace observer surveys with drone surveys. It turns out this timeframe was a bit ambitious (it's a topic many are still working on)!
But it was the start of the journey towards replacing expensive, laborious, and sometimes genuinely life-threatening observer surveys with aerial imagery surveys that would be safer, provide better data, provide a permanent record of every animal sighting and the surrounding habitat, and (eventually) be cheaper.
It was at this early stage that we realised that conducting imagery surveys meant processing tens or hundreds of thousands of images to detect animals, and that without an automated system for this, imagery surveys were never going to be realistic. And so began the collaboration with the Queensland University of Technology (QUT) to achieve our aim of automating the detection of animals in the images. You can see our first publication on this topic, where we used non-AI algorithms, here.
In 2010, Amanda continued the development of this 'drone survey' idea under a Bill Dawbin Postdoctoral Research Award from AMMC. With some amazing in-kind support from Insitu Pacific Ltd, we conducted three sets of trial surveys of dugongs and humpback whales using a high-end military drone, the ScanEagle. Follow the links to our publications describing our dugong case study, assessment of detection probability, and direct comparison of observer and imagery surveys.
During this time, Frederic Maire (QUT) started using our labelled aerial images of dugongs to train a deep convolutional neural network (a form of machine learning, itself a branch of AI) to automate dugong detection. Dubbed the Dugong Detector, this was the beginning of WISDAMapp. You can see our original publications of this work here and here.
Besides spotting animals in the survey images, we also needed to plot them on a map. Eric Kniest, from the University of Newcastle, kindly adapted his VADAR (marine mammal location and tracking) software so that it would georeference the location of each animal labelled within our aerial imagery, as well as plot the image footprints so that we knew the exact area that had been surveyed.
Chris Cleguer joined the collaboration in 2017, leading the development of methods to use smaller, more accessible drones to conduct local-scale dugong surveys. In searching for someone to write standalone custom mapping software for our aerial imagery (VADAR had reached its limits), Chris found Martin Wieser, who developed OceanMapper.
In 2018, our Dugong Detector to Monitor Seagrass Health project was selected as a finalist in the Google.org Impact Challenge Australia 2018, securing us funding to further develop AI models and to develop a user interface that would integrate the detection (both manual and automated) and mapping of the dugongs.
That user interface is now WISDAM, which is designed to be applied to aerial imagery surveys of any fauna and which integrates a marine animal detector that goes beyond dugongs, having also been trained to detect whales, dolphins, turtles, sharks, and rays. We have many more ideas for the further development of WISDAM as a tool for wildlife imagery surveys, and we welcome your ideas and input.