A Duke University team is demonstrating that AI can be for the birds, using deep learning to train a computer to identify 200 species of birds from a photo.
The team trained their deep neural network by feeding it 11,788 photos of 200 bird species to learn from, ranging from swimming ducks to hovering hummingbirds.
Given a photo of a mystery bird, the network is able to pick out important patterns in the image and hazard a guess as to which bird it is by comparing those patterns to typical species traits it has seen before.
The team is making the system explainable, able to give reasons for its conclusions, for instance that it identified a hooded warbler on the basis of its masked head and yellow belly.
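The article describes the system's reasoning only in broad strokes: compare regions of the photo to prototypical species traits and report the best match as evidence. That idea can be illustrated with a toy sketch. Everything below, from the feature vectors to the prototype names and numbers, is an illustrative assumption, not the Duke team's actual model or data.

```python
# Toy sketch of prototype-based, explainable classification:
# score each species by how closely any image patch matches one of
# its prototypical parts, and report which part drove the decision.
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def classify_with_evidence(patch_features, prototypes):
    """Return the best-matching species plus a human-readable reason.

    patch_features: feature vectors extracted from regions of the photo.
    prototypes: {species: {part_name: prototype_vector}}.
    """
    best = None
    for species, parts in prototypes.items():
        for part_name, proto in parts.items():
            score = max(cosine(p, proto) for p in patch_features)
            if best is None or score > best[0]:
                best = (score, species, part_name)
    score, species, part = best
    return species, f"matched prototype '{part}' (similarity {score:.2f})"

# Hypothetical prototypes for two species and two patches from a photo.
prototypes = {
    "hooded warbler": {"masked head": [0.9, 0.1, 0.2],
                       "yellow belly": [0.1, 0.9, 0.1]},
    "mallard": {"green head": [0.2, 0.2, 0.9]},
}
patches = [[0.85, 0.15, 0.25], [0.05, 0.1, 0.95]]
species, reason = classify_with_evidence(patches, prototypes)
```

The point of the sketch is the return value: the prediction arrives together with the prototype part that justified it, which is the kind of reasoning the Duke system surfaces for users.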
Duke computer science Ph.D. student Chaofan Chen and undergraduate Oscar Li led the research, along with other members of the Prediction Analysis Lab directed by Duke professor Cynthia Rudin. They found their neural network identifies the correct species up to 84% of the time, on par with the best-performing alternative bird-recognition systems, which cannot explain how they reached their conclusions.
The project is more about visualizing what deep neural networks are really seeing when they look at an image than it is about naming birds, according to Rudin.
For their next project, the team is applying their algorithm to medical images, including mammograms. The system would look for lumps, calcifications and other potential signs of breast cancer, and it would show doctors which parts of a mammogram it is focusing on and how those regions resemble cancerous lesions from patients who have already been diagnosed.
The system is designed to mimic the way doctors make a diagnosis. “It’s case-based reasoning,” Rudin stated. “We’re hoping we can better explain to physicians or patients why their image was classified by the network as either malignant or benign.”
European researchers are conducting similar experiments in the use of AI to help identify individual birds. Published in the British Ecological Society journal Methods in Ecology and Evolution, the research demonstrated that AI could be trained to recognize individual birds from images.
“We show that computers can consistently recognize dozens of individual birds, even though we cannot ourselves tell these individuals apart. In doing so, our study provides the means of overcoming one of the greatest limitations in the study of wild birds – reliably recognizing individuals,” stated Dr. André Ferreira at the Center for Functional and Evolutionary Ecology (CEFE), France, and lead author of the study.
In the study, researchers from institutes in France, Germany, Portugal and South Africa describe the process for using AI to individually identify birds. This involves collecting thousands of labelled images of birds and then using this data to train and test AI models.
The researchers trained the AI models to recognize images of individual birds in wild populations of great tits and sociable weavers and a captive population of zebra finches, some of the most commonly studied birds in behavioral ecology. After training, the AI models were tested on new photos of those same individuals, images the models had not seen before, and achieved an accuracy of over 90% for the wild species and 87% for the captive zebra finches.
In animal behavior studies, individually identifying animals is one of the most expensive and time-consuming factors, limiting the scope of behaviors and the size of the populations that researchers can study. Current identification methods like attaching color bands to birds’ legs can also be stressful to the animals.
These issues could be solved with AI models, Dr. Ferreira stated: “The development of methods for automatic, non-invasive identification of animals completely unmarked and unmanipulated by researchers represents a major breakthrough in this research field. Ultimately, there is plenty of room to find new applications for this system and answer questions that seemed unreachable in the past.”
The researchers used an innovative system to capture the bird photos needed to train their algorithms. They built bird feeders equipped with camera traps and sensors. Many birds in the study populations carried a passive integrated transponder (PIT) tag, similar to the microchips implanted in pet cats and dogs. Antennae on the bird feeders identified each bird from its tag and triggered the cameras.
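The clever part of this setup is that each photo arrives already labelled with the bird's identity, so the training data needs no manual annotation. A minimal sketch of that capture loop, with hypothetical tag IDs and stand-in camera/antenna functions:

```python
# Hypothetical sketch of the automated labelling loop: the feeder's
# antenna reads a bird's PIT tag, the camera fires, and the photo is
# stored already labelled with the bird's identity.
def collect_labelled_photo(read_tag, take_photo, dataset):
    """If a tagged bird is at the feeder, capture and label a photo."""
    tag_id = read_tag()
    if tag_id is not None:  # None means no tagged bird detected
        dataset.append({"bird_id": tag_id, "photo": take_photo()})

# Toy simulation of three feeder visits (one by an untagged bird).
dataset = []
visits = iter(["GT-042", None, "GT-017"])  # made-up tag reads
for _ in range(3):
    collect_labelled_photo(
        read_tag=lambda: next(visits),
        take_photo=lambda: "jpeg-bytes",  # stand-in for a real capture
        dataset=dataset,
    )
```

In the real system the antenna read and camera trigger are hardware events; the sketch only shows why the resulting dataset comes out labelled for free.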
The European bird researchers used a type of deep-learning method known as convolutional neural networks, well suited to image-classification problems. The authors cautioned that their AI models can only re-identify birds for which they have reference images. “The model is able to identify birds from new pictures as long as the birds in those pictures are previously known to the models. This means that if new birds join the study population the computer will not be able to identify them,” stated Dr. Ferreira.
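The study does not detail the networks' architecture, but the operation that gives convolutional networks their name can be shown in a few lines: slide a small learned filter over the image and apply a nonlinearity, producing a feature map that responds to local patterns such as edges. The image and filter values below are toy assumptions.

```python
# A toy 2D convolution plus ReLU, the basic building block of the
# convolutional networks mentioned above. Real models stack many such
# layers with filters learned from data.
def conv2d(image, kernel):
    """Valid-mode 2D convolution (cross-correlation, as in most
    deep-learning libraries) of a 2D list by a 2D kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

def relu(fmap):
    """Zero out negative responses."""
    return [[max(0.0, v) for v in row] for row in fmap]

# A vertical-edge filter responds strongly where a dark left half
# meets a bright right half in this tiny 3x4 "image".
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
edge_filter = [[-1, 1],
               [-1, 1]]
feature_map = relu(conv2d(image, edge_filter))
```

The feature map peaks exactly at the column where the brightness changes, which is how stacked convolutions come to encode plumage patterns distinctive enough to tell individual birds apart.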
AI is also being used to listen to birds, according to a recent account in Nature. While many researchers collect audio recordings of bird calls, conservation biologist Marc Travers is interested in the noise produced when a bird collides with a power line. It sounds “very much like the laser sound from Star Wars,” he stated.
Travers wanted to know how many of these collisions were occurring on the Hawaiian island of Kauai. His team at the University of Hawaii’s Kauai Endangered Seabird Recovery Project in Hanapepe was concerned specifically about two species: Newell’s shearwaters (Puffinus newelli) and Hawaiian petrels (Pterodroma sandwichensis).
To investigate, the team sent the 600 hours of bird audio it had collected to Conservation Metrics, a firm in Santa Cruz, Calif., that uses AI to assist wildlife monitoring. The company’s software was able to detect the collisions automatically. Since beginning the work in 2011, the team has grown its bird audio data to about 75,000 hours.
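The article does not describe the company's detection method. One classic, minimal approach to flagging brief, loud events (like a collision's "laser" sound) in long recordings is short-term energy thresholding; the frame size, threshold, and audio samples below are all hypothetical, and real detectors are far more sophisticated.

```python
# Hypothetical sketch of automated event detection in audio: split the
# signal into frames and flag frames whose energy spikes well above
# the average background level.
def detect_events(samples, frame_size=4, threshold=3.0):
    """Return start indices of frames whose energy exceeds
    `threshold` times the mean frame energy."""
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples) - frame_size + 1, frame_size)]
    energies = [sum(s * s for s in f) for f in frames]
    mean_e = sum(energies) / len(energies)
    return [i * frame_size for i, e in enumerate(energies)
            if e > threshold * mean_e]

# Quiet background with one loud burst starting at sample 8.
audio = [0.1, -0.1, 0.1, -0.1, 0.1, -0.1, 0.1, -0.1,
         2.0, -2.0, 2.0, -2.0, 0.1, -0.1, 0.1, -0.1]
events = detect_events(audio)
```

Whatever the actual algorithm, the payoff is the same: a program can scan 75,000 hours of audio for candidate collisions, a task no human team could do by ear.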
Results suggested that bird deaths as a result of the animals striking power lines numbered in the high hundreds or low thousands, much higher than expected. “We know that immediate and large-scale action is required,” Travers stated. His team is working with the utility company to test whether shining lasers between power poles reduces collisions; it seems to be effective. The researchers are also pushing the company to lower wires in high-risk locations and attach blinking LED devices to lines.
The software may not be as accurate or as sensitive as humans at many conservation research tasks, and the amount of data needed to train an AI algorithm to recognize images and sounds can present hurdles. But early adopters in conservation science are enthusiastic. For Travers, AI enabled a massive boost in monitoring. “It’s a huge increase over any other method available,” he stated.