• Watching agriculture from the sky — Real-time identification of crop losses using UAV imagery

  • 2019-10-24
This technology draws on thousands of UAV aerial imaging missions together with labeled rice lodging images for training. A rice lodging AI image recognition model, built on the deep learning architecture SegNet, is established with an accuracy of 90%. The pretrained rice lodging recognition model can be deployed on a microcomputer mounted on a UAV to implement edge computing. While the UAV is taking aerial shots, inference is performed simultaneously, producing the lodging area and crop damage level so as to reveal agricultural damage both spatially and temporally.
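The published description does not include code, but the final step — turning a per-pixel segmentation mask into a lodging area and a damage level — can be sketched as below. The function names, the ground sampling distance (GSD) value, and the damage-level thresholds are illustrative assumptions, not part of the published system:

```python
import numpy as np

def lodging_area_m2(mask: np.ndarray, gsd_m: float) -> float:
    """Ground area (m^2) of pixels labeled as lodged rice (class 1).

    mask  : 2-D array of per-pixel class labels from the segmentation model
    gsd_m : ground sampling distance in metres per pixel (assumed known
            from flight altitude and camera parameters)
    """
    lodged_pixels = int((mask == 1).sum())
    return lodged_pixels * gsd_m ** 2

def damage_level(mask: np.ndarray) -> str:
    """Coarse damage grade from the lodged fraction of the image tile.

    The 20% / 50% cut-offs are hypothetical placeholders; actual subsidy
    rules would define their own thresholds.
    """
    ratio = float((mask == 1).mean())
    if ratio < 0.2:
        return "light"
    if ratio < 0.5:
        return "moderate"
    return "severe"

# Example: a 100x100 tile where a 30-row strip is lodged, at 5 cm/pixel
mask = np.zeros((100, 100), dtype=np.uint8)
mask[:30, :] = 1
print(lodging_area_m2(mask, 0.05))  # 3000 pixels * 0.0025 m^2 = 7.5 m^2
print(damage_level(mask))           # 30% lodged -> "moderate"
```

Because this post-processing is only counting and scaling, it runs comfortably on the onboard microcomputer alongside the segmentation model itself.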

The technology for identifying crop losses using UAVs is at the development stage. It integrates image segmentation and edge computing to build an agricultural disaster image database and to implement real-time inference on a UAV, enabling surveying personnel to instantly grasp the damage distribution. This greatly simplifies time-consuming and labor-intensive surveying operations and increases the efficiency of agricultural loss subsidies. The outcome possesses both research value and industrial applicability.

This technology can accurately quantify agricultural losses, greatly saving the manpower and time required for loss investigation, and improves the efficiency of disaster area detection and subsidy allocation. It can serve agricultural practitioners such as UAV hardware and software developers, agricultural insurance companies, and pesticide and fertilizer companies. In the future, the research can be extended to large-scale rice field management and agricultural disaster detection in Southeast Asia as a scientific, low-cost tool for precision agriculture.

This technology was selected for the "Breakthrough in Future Technology Award" at the 2019 Future Tech Expo. Learn more: Watching agriculture from the sky — Real-time identification of crop losses using UAV imagery