Image processing with UAVs in precision agriculture: current landscape, technological trends, and a systematic review

Authors

Mora Castro, G., Ayala Cruz, K. V., Valenzuela Hernández, J. de J., Romero Fitch, J. de H.

DOI:

https://doi.org/10.36825/RITI.13.32.002

Keywords:

Unmanned Aerial Vehicles, Precision Agriculture, Image Processing, Crop Segmentation, Deep Neural Networks, Disease Detection, Germination Analysis

Abstract

This systematic review analyzes the main technological and methodological trends in the use of unmanned aerial vehicles (UAVs) and image processing techniques applied to precision agriculture between 2018 and 2024. The research was based on a structured query of indexed databases including Scopus, IEEE Xplore, and ScienceDirect, from which 111 peer-reviewed articles were selected following rigorous inclusion criteria. Duplicates, non-agricultural studies, and publications lacking methodological clarity were excluded. The findings indicate sustained growth in scientific production related to UAV applications in agriculture, with a predominance of RGB sensors due to their affordability and accessibility, alongside increasing adoption of multispectral and thermal sensors for more advanced use cases. In terms of image processing techniques, significant advances were found in the application of deep learning models, particularly CNN-based architectures such as U-Net and YOLO, used for tasks such as crop segmentation, disease detection, germination monitoring, and species classification. Frequent technological combinations were identified between sensors, processing methods, and agricultural applications, establishing increasingly standardized methodological frameworks. However, several challenges persist, including limited validation under field conditions, lack of standardized evaluation metrics, and underrepresentation of studies in emerging agricultural regions. This review provides a strong foundation for future research and technological implementation, highlighting the potential of UAVs as strategic tools in achieving more efficient, precise, and sustainable agriculture.

Published

2025-11-05

How to Cite

Mora Castro, G., Ayala Cruz, K. V., Valenzuela Hernández, J. de J., & Romero Fitch, J. de H. (2025). Image processing with UAVs in precision agriculture: current landscape, technological trends, and a systematic review. Revista De Investigación En Tecnologías De La Información, 13(32 Especial), 4–16. https://doi.org/10.36825/RITI.13.32.002