Weed Classification in Precision Farming

In an article published in the journal Applied Sciences, researchers explored traditional feature-based computer vision methods for classifying weeds in precision farming. They compared these methods with convolutional neural network (CNN)-based deep learning, emphasizing the traditional methods' effectiveness on smaller datasets.

Study: Weed Classification in Precision Farming. Image Credit: Laky Art/Shutterstock

By testing various features and classifiers, the authors determined the minimum number of training images needed for reliable weed classification, achieving a 94.56% recall rate with 160 images per weed type in a four-class system.

Background

In contemporary agriculture, the declining interest in agricultural work, coupled with a growing global population exceeding 8.1 billion, necessitates innovative methods to boost productivity and ensure sufficient food supply. Precision farming increasingly relies on computer vision technologies, particularly in weeding and harvesting, to address this challenge.

Two primary methodologies dominate this field: traditional feature-based approaches and deep learning methods. Traditional methods extract meaningful features such as shapes, textures, and colors from images for classification. In contrast, deep learning, especially with CNNs such as You Only Look Once (YOLO), leverages large labeled datasets to train complex models for automatic feature extraction and classification.

Previous studies have employed both approaches with varying success. For example, the YOLO deep learning solution achieved high accuracy and recall rates in recognizing artificial plants but lower rates for real plants. Traditional methods, utilizing techniques like support vector machines (SVM) and random forests, have also demonstrated effectiveness but often require extensive datasets, which are scarce for agricultural applications.

This paper aimed to bridge these gaps by focusing on traditional feature-based methods to classify weeds using significantly smaller datasets. It evaluated various feature extraction techniques, such as shape, color, and texture features, and tested their effectiveness individually and in combination. By determining the minimum number of training images needed, the study provided a more resource-efficient alternative to deep learning models, achieving high recall rates with fewer images and diverse features.

Comprehensive Feature Extraction and Classification for Weed Detection

The researchers used a publicly available weed dataset featuring real-world images captured 30 centimeters (cm) above ground level. Six plant types were selected, totaling 3,000 images. The images were pre-processed to remove background clutter and noise using a 5 × 5 Gaussian filter and converted to the hue-saturation-value (HSV) color space to isolate green regions. The largest green area was identified, extracted, resized to 256 × 256 pixels, and further processed for contour and shape analysis.

Shape features such as area, hull area, and solidity were derived from the contours, while Hu moments provided invariant shape descriptors. Distance transformation was applied to compute the distance of each pixel from the nearest contour, identifying local maxima for object ranking.

Color features were extracted from red green blue (RGB) and HSV histograms, creating a total of 192 parameters. Texture features were analyzed using histogram of oriented gradients (HOG), gray-level co-occurrence matrix (GLCM), and local binary patterns (LBP), yielding comprehensive texture descriptors.

Feature combinations were constructed from these shape, color, and texture features, and six classifiers were tested to determine the best prediction outcomes. This comprehensive feature extraction and classification approach aimed to improve the accuracy and robustness of weed detection in real-world agricultural settings.
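In scikit-learn terms, the combination-and-comparison step amounts to concatenating the feature groups and fitting several classifiers on the same matrix. The synthetic feature vectors and this particular classifier subset are stand-ins for illustration; the study tested six classifiers, of which the text names SVM, random forest, and GBM.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for extracted feature vectors: 4 weed classes,
# 100 images each, 50 concatenated shape + color + texture features
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, size=(100, 50)) for c in range(4)])
y = np.repeat(np.arange(4), 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
classifiers = {
    "SVM": SVC(),
    "random forest": RandomForestClassifier(random_state=0),
    "GBM": GradientBoostingClassifier(random_state=0),
    "k-NN": KNeighborsClassifier(),
}
scores = {name: clf.fit(X_tr, y_tr).score(X_te, y_te)
          for name, clf in classifiers.items()}
print(scores)
```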

Evaluation of Feature Sets and Classifiers for Weed Classification

The authors aimed to evaluate the recognition rate of a weed classification system using various feature sets and classifiers, with training images ranging from 5 to 160 per plant.

  • Area-Based Features: Using weed area, hull area, and solidity, the highest recognition rate was 53.85% with 160 training images.
  • Hu Features: Hu moments, better suited for simpler shapes, resulted in recognition rates just above 50% with 160 training images.
  • Distance Transformation Features: Optimal results were achieved using the three largest distance maximum values, with a maximum accuracy of 66% using 160 images.
  • Shape and Distance Transformation Features: Combining these features improved recognition rates to 79.48% with 160 images.
  • Color Histogram: Achieved 92% accuracy with 160 images, indicating strong performance.
  • HOG Features: Performed poorly, reaching only 64% accuracy with 160 images.
  • GLCM and LBP Features: Both showed improvement with more images, achieving 82% and 78.2% accuracy, respectively, with 160 images.
  • Texture Features: Combined HOG, GLCM, and LBP features achieved 80.5% accuracy with 160 images.
  • All Features Combined: Achieved up to 93.49% accuracy with 160 images, but excluding HOG features slightly improved results to 94.56%.

Among the classifiers tested, random forest and gradient boosting machines (GBM) were the most effective. GBM achieved slightly higher accuracy with larger datasets, while random forest performed better with smaller datasets.
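The study's central experiment, sweeping the number of training images per class from 5 to 160, can be sketched as below. Gaussian blobs stand in for the extracted features, so the printed numbers are illustrative only; with balanced classes, the accuracy score equals the mean recall.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier

# Synthetic stand-in for extracted features: 4 classes, 200 images each
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=c, size=(200, 20)) for c in range(4)])
y = np.repeat(np.arange(4), 200)

def recall_at(n_train, clf):
    """Train on the first n_train images per class, test on the rest."""
    tr = np.concatenate([np.where(y == c)[0][:n_train] for c in range(4)])
    te = np.setdiff1d(np.arange(len(y)), tr)
    clf.fit(X[tr], y[tr])
    return clf.score(X[te], y[te])  # mean recall for balanced classes

for n in (5, 40, 160):
    rf = recall_at(n, RandomForestClassifier(random_state=0))
    gbm = recall_at(n, GradientBoostingClassifier(random_state=0))
    print(f"{n:>3} images/class  RF={rf:.2f}  GBM={gbm:.2f}")
```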

Discussion and Future Directions

The researchers extracted diverse features from plant images, including shape, distance transformation, color histograms, and textures, to improve weed classification. Using a dataset of 500 images per plant species, six classifiers were trained with varying numbers of images to assess performance. With just 10 training images, the detection rate was 83%, improving to 86-91% with 20-40 images.

Using 160 images per category and all features, the detection rate reached 94.56%. The method required significantly fewer images (50-200 per category) than CNN-based approaches (1,000-2,000 images), making it more practical for resource-constrained environments. Future work aims to enhance the feature set by incorporating additional shape and texture descriptors. This approach offered a faster and more efficient solution for weed detection, which is particularly valuable after rainfall, when weeds spread rapidly.

Conclusion

In conclusion, for effective weed classification in precision farming, the choice between traditional feature-based methods and CNN-based deep learning hinged on dataset size. CNNs excelled with thousands of labeled images per category, while feature-based methods, requiring far fewer images, offered a practical alternative in resource-constrained settings.

By leveraging diverse feature sets, the researchers achieved up to 94.56% recall with 160 images per weed type. Future research should focus on enhancing feature descriptors, ensuring efficient weed detection and management strategies in agriculture.

Journal reference:

Written by

Soham Nandi

Soham Nandi is a technical writer based in Memari, India. His academic background is in Computer Science Engineering, specializing in Artificial Intelligence and Machine learning. He has extensive experience in Data Analytics, Machine Learning, and Python. He has worked on group projects that required the implementation of Computer Vision, Image Classification, and App Development.

Citations

Please use the following format to cite this article in your essay, paper or report:

Nandi, Soham. (2024, June 12). Weed Classification in Precision Farming. AZoAi. Retrieved on November 14, 2024 from https://www.azoai.com/news/20240612/Weed-Classification-in-Precision-Farming.aspx.
