Article

Image-driven structural steel damage condition assessment method using deep learning algorithm

Journal

Measurement
Volume 133, Pages 168-181

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.measurement.2018.09.081

Keywords

Computer vision; Fracture; Low cycle fatigue; Structural condition assessment; Machine learning; Steel structures; Structural health monitoring

Funding

  1. National Transportation Research Center at the University of Maryland [020469-01]

Abstract

This paper presents a deep learning based structural steel damage condition assessment method that uses images for post-hazard inspection of ultra-low cycle fatigue induced damage in structural steel fuse members. The deep learning model, a convolutional neural network (CNN), can be trained to represent high-dimensional features in large volumes of raw data that traditional mathematical models are unable to describe. A saliency-based visualization method is employed to visualize the feature-related patterns recognized by the deep learning model. To quantify the damage condition of the inspected structural fuse members, a micromechanical fracture index is defined as the damage index and used for labeling the images in the training data set. To provide a large training dataset, cumulative plastic strain contour plot images generated through finite element (FE) simulation are adopted as the training data. Parametric studies were performed to validate and optimize the deep learning based damage condition assessment method. The method and findings are further examined using real experimental images collected from cyclic testing of a steel notched plate specimen. The results suggest that the proposed method provides a promising automation tool for rapid inspection of structural steel fuse member damage condition. (C) 2018 Elsevier Ltd. All rights reserved.
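The abstract does not define the micromechanical fracture index used to label the training images. As a hedged illustration only, a Rice-Tracey-style void growth index, which is one common micromechanical measure for ultra-low cycle fatigue in structural steel, could be accumulated from a discretized stress-strain history and mapped to a normalized damage label like this (all function names, the triaxiality coefficient 1.5, and the calibration constant `vgi_critical` are assumptions, not the paper's actual formulation):

```python
import math

def void_growth_index(triaxiality, d_eps_p):
    """Accumulate a Rice-Tracey-style void growth index over a loading
    history. `triaxiality` and `d_eps_p` are equal-length sequences of
    stress triaxiality and plastic-strain increments per analysis step.
    (Illustrative sketch; the paper's actual index may differ.)"""
    return sum(math.exp(1.5 * t) * de for t, de in zip(triaxiality, d_eps_p))

def damage_label(vgi, vgi_critical=2.5):
    """Map the accumulated index to a normalized damage ratio in [0, 1].
    `vgi_critical` is a hypothetical calibration constant for the material."""
    return min(vgi / vgi_critical, 1.0)

# Example: constant triaxiality of 0.8 over ten plastic-strain increments of 0.02
history_T = [0.8] * 10
history_de = [0.02] * 10
vgi = void_growth_index(history_T, history_de)
print(round(vgi, 3), round(damage_label(vgi), 3))
```

A label computed this way per FE snapshot would let each strain contour image carry a scalar damage condition for supervised CNN training, which matches the labeling workflow the abstract describes.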

