Technology

Apps that identify plants can be as little as 4 per cent accurate

There are many smartphone apps that aim to identify plants from photographs, but tests have found that most are not very accurate

By Matthew Sparkes

5 April 2023

Apps can help identify plants – but only up to a point

Marko Geber/Digital Vision/Getty Images

Smartphone apps that identify plants from photographs can be as little as 4 per cent accurate, which could put people foraging for food at risk and also lead to endangered plants being mislabelled as weeds and eradicated.

Julie Peacock at the University of Leeds, UK, and her colleagues evaluated six of the most popular apps: Google Lens, iNaturalist, Leaf Snap, Pl@ntNet, Plant Snap and Seek. They attempted to identify 38 species of plant in their natural habitat, at four locations in Ireland, with each app. The team found that some apps scored extremely poorly, while even the best fell short of 90 per cent accuracy.

“There are lots of reasons why it’s important that either the apps are accurate, or people are aware that these apps are a guide but definitely not perfect,” says Peacock. For example, people could misidentify important native species as invasive, and remove them from their gardens, or consume potentially dangerous wild plants, thinking they are a harmless variety.

Even so, Peacock isn’t discouraging people from using these apps, so long as they understand the limitations. “They have huge potential for people to start to engage more with plants,” she says.

The apps use artificial intelligence algorithms trained on vast numbers of captioned photographs of plants. During training, the AI learns not only to recognise the training photos, but also to spot similarities between them and new photographs, which is what allows it to identify plants.
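At a high level, this similarity-based approach can be sketched as matching a new photo's extracted features against labelled examples. The feature vectors, species names and nearest-neighbour rule below are purely illustrative assumptions, not the actual workings of any of the apps tested:

```python
import math

# Hypothetical feature vectors extracted from labelled training photos.
# Real apps learn far richer features with deep neural networks.
training_set = {
    "common daisy":    [0.9, 0.1, 0.3],
    "dandelion":       [0.8, 0.7, 0.2],
    "stinging nettle": [0.1, 0.9, 0.8],
}

def identify(photo_features):
    """Return the species whose training features are most similar
    (smallest Euclidean distance) to the new photo's features."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(training_set, key=lambda sp: distance(training_set[sp], photo_features))

print(identify([0.85, 0.15, 0.25]))  # closest to the daisy example
```

The weakness the researchers highlight follows directly from this design: if the labelled examples are wrong or unrepresentative, the closest match will be too.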

The apps were generally better at identifying flowers than leaves, which the researchers say is because flowers' greater variety of shape and colour gives the AI more clues. But this wasn't always the case: iNaturalist correctly identified just 3.6 per cent of flowers and 6.8 per cent of leaves, while Plant Snap identified 35.7 per cent of flowers correctly but only 17.1 per cent of leaves. The highest accuracy was achieved by Pl@ntNet, at 88.2 per cent.

Alexis Joly at Inria in Montpellier, France, one of the researchers behind the non-profit project Pl@ntNet, says the app's success is down to its data sets, which are sourced and categorised by botanists, scientists and informed amateurs, and to algorithms that counteract bias towards common species by ranking several likely candidates for each search.

“This is sometimes a thankless task because people prefer to see a single result with 100 per cent confidence, even if it’s not the right one, rather than three possible species at 33 per cent each, but which represents the reality with regard to the photo taken,” he says. “But it seems our strategy is paying off.”
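The ranking strategy Joly describes can be sketched as converting a model's raw scores into probabilities and reporting the top few candidates rather than a single over-confident answer. The species and scores here are made up for illustration:

```python
import math

def top_candidates(scores, k=3):
    """Convert raw model scores to probabilities (softmax) and
    return the k most likely species with their confidences."""
    exps = {sp: math.exp(s) for sp, s in scores.items()}
    total = sum(exps.values())
    ranked = sorted(exps.items(), key=lambda kv: kv[1], reverse=True)
    return [(sp, e / total) for sp, e in ranked[:k]]

# Three visually similar species score closely; a single "winner"
# would hide how uncertain the identification really is.
scores = {"ox-eye daisy": 1.1, "common daisy": 1.0, "chamomile": 0.9, "yarrow": -2.0}
for species, p in top_candidates(scores):
    print(f"{species}: {p:.0%}")
```

When the scores are close, each of the top candidates ends up with roughly a third of the probability mass, which is the honest-but-thankless output Joly describes.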

Stephen Harris at the University of Oxford says that Peacock's concerns are valid, and that he has also experienced problems with such apps, relying on a good reference book instead. The problem is that the apps depend on images uploaded to the internet, which are often incorrectly labelled, he says.

“People tend to take images of similar things. So you will get certain plants that are really obvious and everybody wants to take a picture of, whereas if you get some sort of really interesting plant but it happens to be a scrappy little thing that doesn’t have very attractive flowers or anything, you won’t get very many images of it,” says Harris. “It’s very unlikely that you’re going to have people scrambling around in ponds, hoiking out pond weeds and taking pictures of it.”

Google declined a request for an interview, while the other app creators didn’t respond.

Journal reference

PLoS One DOI: 10.1371/journal.pone.0283386
