Tea Bud Recognition Using the YOLOX Classification Model

In an article recently published in the journal Scientific Reports, researchers proposed the YOLOX classification model for accurate identification and classification of tea buds with similar characteristics.

The YOLOX classification model identifies tea shoots of different varieties. Study: https://www.nature.com/articles/s41598-024-53498-y

Background

Tea bud plucking is a critical step in the tea production process that determines dry tea quality. The number of tea drinkers with specific quality requirements is continuously increasing, which has made tea bud plucking increasingly important for China's tea industry as the key step in adapting to changing market demand.

Currently, mechanical plucking and manual plucking are the major tea bud harvesting methods. Although manual plucking ensures a high bud integrity rate, its low speed and high cost are key disadvantages. Conversely, a low bud integrity rate is the major limitation of mechanical tea picking.

Intelligent harvesting of tea shoots can be realized through accurate classification and identification. Computer vision technology can assist in accurately identifying tea shoots, thereby improving mechanical tea picking and facilitating the development of intelligent mechanical harvesting.

Limitations of computer vision technology

Computer vision approaches to shoot recognition can be divided into two stages: traditional image recognition and deep learning recognition. The traditional image recognition stage relies primarily on image segmentation, edge detection, and other classical image processing algorithms.

Tea shoot recognition models developed using deep learning (DL)-based target detection algorithms typically show higher accuracy and better robustness than models developed using traditional algorithms. However, several challenges remain in classifying and identifying tea shoots.

For instance, the accuracy of DL-based target detection algorithms in tea leaf shoot identification remains limited. Additionally, few studies have examined tea shoot classification despite its considerable importance for practical tea production and the development of the tea industry.

The study

In this study, researchers initially constructed tea shoot/bud recognition models based on seven mainstream algorithms and compared them to determine the best-performing model. Subsequently, the best-performing model was combined with the tea bud dataset to create a tea shoot classification model.

A total of 3728 tea shoots representing four shoot types with similar characteristics, namely Anji White Tea, Huangshan Seed, Longjing 43, and NongKang Early, were photographed to create the tea bud dataset (TBD). The pictures were taken on both sunny and cloudy days, under both good and poor lighting conditions.

Seven mainstream target detection algorithms, namely YOLOX, YOLOv4, YOLOv5, YOLOv7, EfficientDet, CenterNet, and Faster R-CNN, were used in shoot recognition comparison experiments to identify the best algorithm. The best-performing algorithm was then combined with the TBD dataset to construct a tea leaf shoot classification model covering the four tea shoot types.

Six evaluation metrics, namely frames per second (FPS), training time, mean average precision (mAP), F1 score, recall, and precision, were used to evaluate the shoot recognition models. An indicator weighting method was used to comprehensively assess the strengths and weaknesses of each tea shoot recognition model during evaluation.
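
These detection metrics can be computed directly from per-model detection counts, and indicator weights can then combine them into a single comparison score. The sketch below is illustrative only: the counts and the equal weights are assumptions for demonstration, not values from the study, and the study's exact weighting scheme is not reproduced here.

    # Minimal sketch (not the authors' code): compute the detection metrics
    # named above from detection counts, then combine them with an
    # indicator-weight sum. Counts and weights are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class DetectionCounts:
        tp: int  # true positives: shoots detected correctly
        fp: int  # false positives: background or wrong regions flagged as shoots
        fn: int  # false negatives: shoots missed by the detector

    def precision_recall_f1(c: DetectionCounts) -> tuple[float, float, float]:
        precision = c.tp / (c.tp + c.fp) if (c.tp + c.fp) else 0.0
        recall = c.tp / (c.tp + c.fn) if (c.tp + c.fn) else 0.0
        f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
        return precision, recall, f1

    def weighted_score(metrics: dict[str, float], weights: dict[str, float]) -> float:
        # Indicator-weight aggregation: a weighted sum of metrics already scaled
        # to [0, 1]. The equal weights below are an example, not the study's values.
        return sum(weights[name] * metrics[name] for name in weights)

    # Example usage with made-up counts for one model on a validation split.
    counts = DetectionCounts(tp=850, fp=101, fn=59)
    p, r, f1 = precision_recall_f1(counts)
    composite = weighted_score(
        {"precision": p, "recall": r, "f1": f1, "mAP": 0.95},
        {"precision": 0.25, "recall": 0.25, "f1": 0.25, "mAP": 0.25},
    )
    print(f"precision={p:.4f} recall={r:.4f} F1={f1:.4f} composite={composite:.4f}")

In a full comparison, speed-related indicators such as FPS and training time would also need to be normalized (and inverted where lower is better) before being folded into such a composite score.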

Significance of the study

Results of the shoot recognition comparison experiments demonstrated that the YOLOX algorithm performed best among all algorithms, achieving a precision of 89.34%, a recall of 93.56%, an F1 score of 0.91, and an mAP of 95.47%. Thus, this algorithm was selected and combined with the TBD dataset to build a tea shoot classification model covering the four kinds of tea shoots.
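
As an illustrative check, the reported F1 score follows from the reported precision and recall through the standard harmonic-mean relation:

    # Illustrative consistency check: F1 = 2PR / (P + R)
    precision, recall = 0.8934, 0.9356
    f1 = 2 * precision * recall / (precision + recall)
    print(round(f1, 2))  # prints 0.91, matching the reported F1 score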

The YOLOX classification model accurately identified the tea shoots and their varieties, and it could also identify and classify multi-target images containing multiple shoots. However, the model sometimes identified NongKang Early shoots as Longjing 43 shoots; these misclassifications were attributed to the similar color characteristics of the two varieties. The classification model demonstrated the highest accuracy when identifying the Huangshan Seed variety, owing to the significant color difference between its shoots and the background of old leaves.

Additionally, the model displayed the second-highest accuracy when identifying Longjing 43, followed by the NongKang Early and Anji White Tea varieties. The accuracy of the YOLOX model in identifying the NongKang Early, Longjing 43, and Anji White Tea varieties was close to the overall model accuracy.

The YOLOX classification model identified Anji White Tea shoots with 76.19% precision, Huangshan Seed shoots with 90.54% precision, Longjing 43 shoots with 80% precision, and NongKang Early shoots with 77.78% precision.

To summarize, the findings of this study demonstrated that the YOLOX classification model can accurately identify and classify the four tea shoot types, providing a robust theoretical foundation for intelligent mechanical tea picking in practical applications.

Journal reference:
The study is published in Scientific Reports: https://www.nature.com/articles/s41598-024-53498-y

Written by

Samudrapom Dam

Samudrapom Dam is a freelance scientific and business writer based in Kolkata, India. He has been writing articles related to business and scientific topics for more than one and a half years. He has extensive experience in writing about advanced technologies, information technology, machinery, metals and metal products, clean technologies, finance and banking, automotive, household products, and the aerospace industry. He is passionate about the latest developments in advanced technologies, the ways these developments can be implemented in a real-world situation, and how these developments can positively impact common people.

