AI-based optical-thermal video data fusion for near real-time non-contact blade damage detection without stopping wind turbine operation
Xiao Chen, Associate Professor, DTU
Abstract
Blade damage inspection without stopping the normal operation of wind turbines has significant economic value. This study proposes an AI-based non-contact method that accurately, robustly, and in near real time detects blade surface and subsurface structural damage from optical and thermal videos taken by drones in the field while turbines remain in normal operation. The method fuses the drone-captured optical and thermal videos and achieves near real-time blade segmentation by exploiting multimodal and temporal complementarity. It then cuts the segmented blade images into patches, models the global and local correlations of each patch with a vision transformer, and uses the difference between global and local correlations to detect damage. To train and test the method, we collected a large-scale dataset of optical-thermal videos of blades on operational wind turbines, comprising 100 video pairs and over 55,000 images. Experimental results show that: i) the proposed method achieves 0.996 and 0.981 MIoU on optical and thermal blade video segmentation, respectively, outperforming state-of-the-art methods, particularly in videos with complex backgrounds; and ii) a video demonstration confirms accurate, robust, and near real-time detection of blade damage in the field. This study provides an essential step towards automated detection of blade surface and subsurface structural damage using computer vision without stopping the normal operation of wind turbines.
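The MIoU figures quoted above are mean intersection-over-union scores between predicted and ground-truth segmentation masks. A minimal sketch of how this metric is typically computed (the function name, class layout, and the toy masks below are illustrative, not taken from the study):

```python
import numpy as np

def mean_iou(pred, target, num_classes=2):
    """Mean intersection-over-union between two integer label maps.

    pred, target: arrays of the same shape holding class labels
    (e.g. 0 = background, 1 = blade). Classes absent from both maps
    are skipped so they do not distort the mean.
    """
    ious = []
    for c in range(num_classes):
        p, t = (pred == c), (target == c)
        union = np.logical_or(p, t).sum()
        if union == 0:          # class absent from both maps
            continue
        inter = np.logical_and(p, t).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

# Toy example: 4x4 masks that agree on 14 of 16 pixels
pred = np.array([[0, 0, 1, 1]] * 4)
target = np.array([[0, 1, 1, 1]] * 4)
print(round(mean_iou(pred, target), 3))  # → 0.583
```

An MIoU of 0.996, as reported for the optical videos, therefore means the predicted blade masks overlap their ground truth almost pixel-perfectly on average.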
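The global-versus-local correlation idea can be illustrated with a simplified sketch. In the proposed method these correlations are modelled inside a vision transformer; here, pre-computed patch embeddings and plain cosine similarity stand in for that. The function name `damage_scores`, the 4-neighbourhood, and the use of the most-similar neighbour are all assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def damage_scores(patch_emb, grid_h, grid_w):
    """Per-patch anomaly score from global vs. local correlation.

    patch_emb: (grid_h * grid_w, d) array of patch embeddings, a
    stand-in for features a vision transformer would produce. A patch
    whose agreement with the blade as a whole (global correlation)
    diverges from its agreement with its own neighbourhood (local
    correlation) receives a high score.
    """
    # Normalise so dot products are cosine similarities
    emb = patch_emb / np.linalg.norm(patch_emb, axis=1, keepdims=True)

    # Global correlation: similarity to the mean embedding of all patches
    global_ref = emb.mean(axis=0)
    global_corr = emb @ (global_ref / np.linalg.norm(global_ref))

    # Local correlation: similarity to the most similar 4-neighbour
    local_corr = np.empty(len(emb))
    for i in range(len(emb)):
        r, c = divmod(i, grid_w)
        nbrs = [nr * grid_w + nc
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                if 0 <= nr < grid_h and 0 <= nc < grid_w]
        local_corr[i] = (emb[nbrs] @ emb[i]).max()

    return np.abs(global_corr - local_corr)

# Toy 3x3 grid: eight similar patches and one deviating centre patch
emb = np.tile([1.0, 0.0], (9, 1))
emb[4] = [0.0, 1.0]
print(damage_scores(emb, 3, 3).argmax())  # → 4 (the deviating patch)
```

Intact patches correlate both with their neighbours and with the blade overall, so the two signals agree; a damaged patch breaks that agreement, which is the scoring principle the abstract describes.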