Posters
Come meet the poster presenters to ask them questions and discuss their work.
We would like to invite you to come and see the posters at our upcoming conference. The posters showcase a diverse range of research topics and give delegates an opportunity to engage with the authors and learn more about their work. Whether you are a seasoned researcher or simply curious about the latest developments in your field, we believe the posters will offer something of interest to everyone. So please join us at the conference and take this opportunity to learn and engage with your peers in the academic community. We look forward to seeing you there!
PO305: AI-based optical-thermal video data fusion for near real-time non-contact blade damage detection without stopping wind turbine operation
Xiao Chen, Associate Professor, DTU
Abstract
Blade damage inspection without stopping the normal operation of wind turbines has significant economic value. This study proposes an AI-based non-contact method that accurately, robustly, and in near real time detects blade surface and subsurface structural damage from optical and thermal videos taken by drones in the field, without stopping the normal operation of the turbines. The proposed method fuses optical and thermal videos captured by drones from operating wind turbines and achieves near real-time blade segmentation by exploiting multimodal and temporal complementarity. Subsequently, the method cuts the segmented blade images into patches, models the global and local correlations of each patch with a vision transformer, and uses the difference between global and local correlations to detect damage. We collected a large-scale dataset of optical-thermal blade videos from operational wind turbines, comprising 100 video pairs and over 55,000 images, to train and test the method. Experimental results show that: i) the proposed method achieves 0.996 and 0.981 mIoU on optical and thermal blade video segmentation, respectively, outperforming state-of-the-art methods, particularly in videos with complex backgrounds; and ii) a video demonstration confirms that the method enables accurate, robust, and near real-time detection of blade damage in the field. This study provides an essential step towards automated detection of blade surface and subsurface structural damage using computer vision, without stopping the normal operation of wind turbines.
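The core detection idea in the abstract — splitting a segmented blade image into patches and flagging patches whose local statistics diverge from the global pattern — can be illustrated with a minimal sketch. This is not the authors' method: the real system uses a vision transformer to model global and local correlations, whereas here a simple patch-mean deviation stands in for that correlation difference, and all names (`extract_patches`, `anomaly_scores`) and the toy image are hypothetical.

```python
import numpy as np

def extract_patches(img, patch=4):
    """Split a 2-D image into non-overlapping patch x patch tiles,
    in row-major order."""
    h, w = img.shape
    return [img[i:i + patch, j:j + patch]
            for i in range(0, h - patch + 1, patch)
            for j in range(0, w - patch + 1, patch)]

def anomaly_scores(img, patch=4):
    """Score each patch by how far its local mean deviates from the
    global image mean -- a crude stand-in for the paper's difference
    between global and local correlations learned by a transformer."""
    patches = extract_patches(img, patch)
    global_mean = img.mean()
    return np.array([abs(p.mean() - global_mean) for p in patches])

# Toy 8x8 "blade" image: uniform surface plus one simulated hot spot,
# as might appear in a thermal frame over subsurface damage.
blade = np.zeros((8, 8))
blade[0:4, 4:8] = 1.0          # hypothetical damaged region
scores = anomaly_scores(blade, patch=4)
print(scores.argmax())          # index of the most anomalous patch
```

In the actual pipeline the per-patch features would come from fused optical-thermal frames and the correlation modelling would be learned, but the same principle applies: damage stands out where local structure disagrees with the blade's global appearance.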