Picture this: You are out walking in a forest when you stumble upon a beautiful collection of plants that you haven’t seen before. You wonder what they are, but you have no idea, and walk away. Once you are home, the best you can do is go online and search: “small purple flowers”. You spend time aimlessly scrolling and, unsurprisingly, nothing similar to what you saw shows up. It seems you will never know what plant it was.
Nowadays, there are solutions to this kind of problem! When you find these plants, you can simply take a picture with an app that uses visual search. The app processes the photo, compares it to the photos in its database in order to recognise the plant, and shows you all the details about it. Now you know everything about the plant you found!
But how does this work?
In order for visual search to work, machine learning algorithms have to go through a training phase and a testing phase to learn the differences between different species of plants (in this case). To do that, the machine has to process hundreds of thousands of images of plants, each with information attached to it, called annotations. In the training phase, pixels are analysed and the algorithm learns to differentiate the characteristics of the photographs in the training set.
Once training has completed, a model based on the learned characteristics is produced. This model can then be evaluated on a new testing set of images. These can be uploaded by users throughout the testing, so that with each added photo, the search engine becomes more and more powerful at recognising plants. This works because once the algorithm has been through the testing phase and has some understanding of the images and their characteristics, the data collected afterwards is annotated further. This helps the algorithm confirm certain characteristics of the images and data so that it can properly group them into the right category. In this case, it will be able to recognise when an image shows different plants and plant species.
Eventually, the algorithm is able to detect similarities and differences between the various species of plants to determine which plant it is, very quickly and accurately.
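The grouping idea described above can be sketched in miniature. Real visual search apps use deep neural networks trained on hundreds of thousands of annotated photos, but a tiny nearest-neighbour classifier over made-up feature vectors (all species labels and numbers below are purely illustrative, not real app data) shows the same principle: each annotated training image becomes a point, and a new photo is assigned the label of the closest point.

```python
import math

# Toy "annotated image" dataset: each image is reduced to a simple
# feature vector (e.g. average colour channels) plus a species label.
# All values here are invented purely for illustration.
training_set = [
    ([0.20, 0.70, 0.30], "prayer plant"),
    ([0.25, 0.65, 0.35], "prayer plant"),
    ([0.60, 0.50, 0.20], "chinese money plant"),
    ([0.55, 0.45, 0.25], "chinese money plant"),
]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(features, training_set):
    """Label a new photo with the species of its nearest training example."""
    nearest = min(training_set, key=lambda item: distance(features, item[0]))
    return nearest[1]

# A new user photo, reduced to the same kind of feature vector:
new_photo = [0.58, 0.48, 0.22]
print(classify(new_photo, training_set))  # -> chinese money plant
```

Every photo users upload during testing can be annotated and folded back into `training_set`, which is why these apps keep improving as more people use them.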
Plant Search: PictureThis vs. PlantSnap
With specialised apps such as PictureThis, an online plant encyclopedia and plant identifier, searching for the plant you are looking for is simple. Just take a photo of the plant, and the results will appear, showing you the name of the plant, a description, and tips for looking after it, among other interesting things!
We decided to put this app to the test:
First, we took and uploaded a photo of a Prayer Plant. You can see that on the first try, the app already detected that it was, indeed, a prayer plant, and gave all of the information about the plant that was available.
We then tested a few more plants to see how accurately the AI would recognise them. With the Chinese money plant, the AI recognised the plant immediately. Similarly, for the polka dot begonia and the Swiss cheese plant, the AI recognised the plants on its first guess.
We then wanted to test the same four photos on another app, to see how the algorithms compare. We chose the app “PlantSnap” to do so:
For the prayer plant and the Chinese money plant, the AI recognised the plants immediately. However, the app was not able to recognise the polka dot begonia at all, instead mistaking it for various cactus species, and it identified the monstera only on its second guess.
Based on these two apps, visual search for plants seems quite promising and mostly accurate, which makes it great for gardeners, or just those wanting to learn more about plants.
Recognising Diseases: PictureThis vs. Plantix
Another feature of visual search apps for plants is disease recognition.
What makes this feature so important is that in emerging countries there are many young farmers who do not yet have experience recognising plant diseases. When a disease is not identified quickly or accurately enough, it can spread and destroy an entire harvest.
Much of this problem can be avoided with apps such as PictureThis and Plantix, which allow these farmers to recognise a disease early on and get the right cultivation tips. These apps are not only useful for young, inexperienced farmers; other end users, such as hobby farmers or people with plants at home, can use them too.
The difference between the two apps is that PictureThis is used more commonly for regular (ornamental) house plants, whereas Plantix is used for crops.
We compared the same image of a tomato plant on both apps to see which one recognised the illness better. The images below show the results from PictureThis:
Whilst the right plant was recognised, the diagnosis was not quite accurate: the app suggested gray mold, which is easy to confuse with late blight (the correct illness) due to the irregular shapes of the lesions.
What is nice, however, is that PictureThis provides a full symptom analysis that helps users understand the illness, confirm the diagnosis, and become aware of the similarities and differences between what the app predicts and what the disease actually is.
Then we tested the same image on Plantix:
Plantix was able to recognise the correct disease on the tomato plant. It immediately shows the disease as it appears on other tomato plants to allow for comparison, and lists the symptoms of the disease so that users can verify the diagnosis with a short checklist.
Plantix also provides a more detailed description of the disease so that users can make a better-informed decision about which disease their plant or crop actually has.
Clearly, Plantix is more suited to crop plants and their diseases, and would be the better choice in the situation mentioned earlier: communities with less experience of crops and crop diseases.