In a paper published in the journal Applied Sciences, researchers proposed a ruler-free method for automatically measuring plant height using monocular computer vision. They processed color images of peppers planted side by side with excess-green (ExG) grayscaling and the Otsu method, then converted the resulting binary images into horizontal one-dimensional data.
They then segmented the images into single plants, extracted pixel heights, and calculated actual plant heights. The technique demonstrated high precision within a 2–3-meter range under specific lighting conditions, achieving an absolute error (AE) of no more than ±10 mm and an absolute percentage error (APE) of no more than ±4%.
Background
Past work in plant phenotype research has established plant height as a key morphological parameter. Traditional plant height measurement methods often suffer from inefficiency and high error rates, motivating the development of more advanced high-throughput phenotyping (HTP) techniques. HTP leverages modern data-sampling technologies and machine learning to automate and improve the accuracy of phenotypic data collection.
Automated Plant Height Measurement
In this study, a monocular camera was used to capture plant images multiple times without any calibration objects, and MATLAB was used to process these images for plant segmentation and height measurement. The monocular vision method captures images at different camera heights, establishing a proportional relationship between the camera's displacement and the corresponding pixel displacement of the plant in the image.
This relationship converts pixel height into actual plant height without the need for a ruler. Image preprocessing involved converting the color images to grayscale using the excess-green (ExG) index and then binarizing them with the Otsu method to create a 0–1 matrix for further processing. The researchers achieved plant segmentation by analyzing the column-wise (longitudinal) pixel counts of the binarized images, identifying symmetrical regions, and applying mean filtering to smooth the data for accurate peak detection.
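The ExG-and-Otsu preprocessing step can be sketched in a few lines of Python. The authors worked in MATLAB, so the OpenCV-based function below (`binarize_plants`, an assumed name) is an illustrative approximation rather than the paper's code:

```python
import cv2
import numpy as np

def binarize_plants(bgr_image: np.ndarray) -> np.ndarray:
    """Excess-green (ExG) grayscale conversion followed by Otsu thresholding.

    Returns a 0/1 matrix in which foreground (plant) pixels are 1.
    """
    # Split channels and normalize to [0, 1]
    b, g, r = cv2.split(bgr_image.astype(np.float32) / 255.0)

    # ExG index: 2G - R - B emphasizes green vegetation
    exg = 2.0 * g - r - b

    # Rescale ExG to 0-255 so Otsu can operate on an 8-bit image
    exg_u8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

    # Otsu's method picks the threshold that maximizes between-class variance
    _, binary = cv2.threshold(exg_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    return (binary > 0).astype(np.uint8)  # 0-1 matrix for later profiling
```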
The peaks corresponded to individual plants, and the valleys between peaks marked the plant boundaries, so each plant could be isolated for measurement. To obtain heights, the team constructed a minimum bounding rectangle around each segmented plant to determine its pixel height and measured the pixel displacement of the plant between the two images taken at different camera heights, from which a scaling factor was derived.
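A minimal sketch of this column-profile segmentation and pixel-height extraction, assuming the 0–1 mask from the preprocessing step and illustrative smoothing and spacing parameters (`win`, `min_sep`), might look like this:

```python
import numpy as np
from scipy.signal import find_peaks

def segment_and_measure(binary: np.ndarray, win: int = 15, min_sep: int = 50):
    """Split a 0/1 plant mask into individual plants along the horizontal axis
    and return each plant's pixel height (row span of its bounding box)."""
    # Collapse the mask to a one-dimensional column profile
    profile = binary.sum(axis=0).astype(float)

    # Mean filtering smooths the profile so spurious local maxima are suppressed
    kernel = np.ones(win) / win
    smooth = np.convolve(profile, kernel, mode="same")

    # Peaks correspond to plant centers; require a minimum spacing between them
    peaks, _ = find_peaks(smooth, distance=min_sep)

    # Valleys (profile minima) between consecutive peaks mark plant boundaries
    bounds = [0]
    for left, right in zip(peaks[:-1], peaks[1:]):
        bounds.append(left + int(np.argmin(smooth[left:right])))
    bounds.append(binary.shape[1])

    # Pixel height of each plant = vertical extent of foreground in its slice
    heights = []
    for x0, x1 in zip(bounds[:-1], bounds[1:]):
        rows = np.where(binary[:, x0:x1].any(axis=1))[0]
        if rows.size:
            heights.append(int(rows[-1] - rows[0] + 1))
    return heights
```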
The plant height was obtained by substituting these values into the proportional relationship equation. The method demonstrated good accuracy, producing precise segmentation with minimal adhesion between adjacent plants, and offers a viable solution for non-contact, high-precision, automated plant height measurement.
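That proportional relationship can be sketched as follows; the function name and example numbers are illustrative assumptions, not values or code from the paper:

```python
def plant_height_mm(pixel_height: float, pixel_shift: float, camera_shift_mm: float) -> float:
    """Ruler-free height estimate: a known camera displacement (camera_shift_mm)
    appears as a pixel displacement (pixel_shift) in the image, which gives a
    millimeters-per-pixel scale for converting the plant's pixel height."""
    scale_mm_per_px = camera_shift_mm / pixel_shift
    return pixel_height * scale_mm_per_px

# Hypothetical example: a plant spanning 620 pixels that shifts by 250 pixels
# when the camera moves 100 mm is estimated at 620 * (100 / 250) = 248 mm.
print(plant_height_mm(620, 250, 100))  # 248.0
```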
Measurement Validation Results
Researchers validated the accuracy and feasibility of the proposed method by comparing manual measurement values with algorithmic results, calculating the AE, APE, and sum of squared errors (SSE). Images were collected under varying conditions, including different height displacement distances (Δh) of the camera, distances between the camera and the plant (d), and ambient lighting conditions.
Manual measurements, taken as the benchmark, were conducted with an accuracy of 1 mm, averaging three measurements per plant. The experiment was conducted in two phases. The first phase covered a range of conditions to gather data on various lighting environments and measurement distances. The second phase evaluated the impact of changing the reference distance between camera shots.
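As an illustration of how these metrics compare algorithmic estimates against the manual benchmark, a brief sketch follows; the numbers are hypothetical and not data from the study:

```python
import numpy as np

def validation_metrics(measured_mm: np.ndarray, manual_mm: np.ndarray):
    """Compare algorithmic height estimates against the manual benchmark."""
    ae = measured_mm - manual_mm           # absolute error (signed), in mm
    ape = 100.0 * ae / manual_mm           # absolute percentage error, in %
    sse = float(np.sum(ae ** 2))           # sum of squared errors, in mm^2
    return ae, ape, sse

# Hypothetical example values (not from the paper), checked against the
# reported tolerances of +/-10 mm AE and +/-4% APE.
measured = np.array([452.0, 610.0, 388.0])
manual = np.array([455.0, 604.0, 392.0])
ae, ape, sse = validation_metrics(measured, manual)
assert np.all(np.abs(ae) <= 10) and np.all(np.abs(ape) <= 4)
```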
The researchers compared algorithmic measurement results to manual measurements under normal lighting conditions (approximately 279 lx). They also recorded measurement results in darker conditions (approximately 42 lx) and under fill-light conditions (approximately 324 lx). These varying light levels helped assess the algorithm's robustness and accuracy in different environments.
Additionally, the study explored the impact of changing the reference distance (Δh) between camera shots on measurement accuracy. The camera was positioned 250 cm away from the plant under normal lighting conditions (279 lx). This investigation demonstrated the method's potential for accurate, non-contact plant height measurement under various conditions, showcasing its applicability in different agricultural settings.
Conclusion
To sum up, the monocular multi-plant height measurement method demonstrated strong performance and practical significance for current agricultural applications. The technique achieved accurate plant height measurements, with the AE not exceeding ±10 mm and the APE not exceeding ±4%, excluding anomalous results under the darker and fill-light conditions.
Experiments within a measurement distance range of 200 cm to 300 cm showed no significant impact of distance on the results. However, the method's performance under dark and fill-light conditions, along with its grayscaling and binarization schemes, still needs improvement, and its accuracy across varying lighting conditions and reference distances warrants further exploration.
Journal reference:
- Tian, H., et al. (2024). A Multi-Plant Height Detection Method Based on Ruler-Free Monocular Computer Vision. Applied Sciences, 14(15), 6469. DOI: 10.3390/app14156469, https://www.mdpi.com/2076-3417/14/15/6469