Publication

Damage prediction for regular reinforced concrete buildings using the decision tree algorithm

Abstract

To overcome the problem of outlier data in regression analysis for numerically derived damage spectra, the C4.5 decision tree learning algorithm is used to predict damage to reinforced concrete (RC) buildings in future earthquake scenarios. RC buildings are modelled as single-degree-of-freedom systems, and a series of nonlinear time-history analyses is performed to create a dataset of damage indices. Two decision trees are then trained on the qualitative interpretations of those indices: the first determines whether damage occurs in an RC building, and, if it does, the second predicts the severity of the damage as repairable, beyond repair, or collapse.
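
The sketch below illustrates the two-stage classification scheme described in the abstract. It assumes scikit-learn's CART trees with the entropy criterion as a stand-in for C4.5, which scikit-learn does not implement; the feature names, thresholds, and synthetic data are illustrative placeholders, not the authors' dataset or damage-index definition.

```python
# Hypothetical two-stage damage classification: tree 1 decides whether damage
# occurs, tree 2 (trained only on damaged cases) assigns a severity class.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 1000
# Assumed SDOF / ground-motion features (placeholders, not from the paper).
X = np.column_stack([
    rng.uniform(0.1, 2.0, n),   # fundamental period T (s)
    rng.uniform(0.05, 1.0, n),  # peak ground acceleration (g)
    rng.uniform(1.0, 6.0, n),   # displacement ductility demand
])
# Placeholder damage index in [0, 1], then qualitative labels.
damage_index = np.clip(0.3 * X[:, 1] * X[:, 2] + rng.normal(0, 0.05, n), 0, 1)
severity = np.digitize(damage_index, [0.1, 0.4, 0.8])  # 0 none, 1 repairable, 2 beyond repair, 3 collapse

# Stage 1: does damage occur at all?
damaged = (severity > 0).astype(int)
tree_damage = DecisionTreeClassifier(criterion="entropy", max_depth=4).fit(X, damaged)

# Stage 2: severity, trained only on the damaged cases.
mask = severity > 0
tree_severity = DecisionTreeClassifier(criterion="entropy", max_depth=4).fit(X[mask], severity[mask])

def predict_damage_state(x):
    """Apply the two trees in sequence to a single sample x of shape (3,)."""
    x = np.asarray(x).reshape(1, -1)
    if tree_damage.predict(x)[0] == 0:
        return "no damage"
    return {1: "repairable", 2: "beyond repair", 3: "collapse"}[int(tree_severity.predict(x)[0])]

print(predict_damage_state([1.0, 0.6, 4.0]))
```

Splitting the problem into a damage/no-damage tree followed by a severity tree keeps each classifier small and lets the second tree focus only on cases where damage is known to occur.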

Related concepts (20)
Decision tree
A decision tree is a decision support hierarchical model that uses a tree-like model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility. It is one way to display an algorithm that only contains conditional control statements. Decision trees are commonly used in operations research, specifically in decision analysis, to help identify a strategy most likely to reach a goal, but are also a popular tool in machine learning.
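
As a toy illustration of the decision-analysis sense of a decision tree, the sketch below scores two hypothetical decisions by the expected utility of their chance outcomes; all probabilities and utilities are made up for illustration.

```python
# Toy decision-analysis tree: each decision leads to chance outcomes with
# probabilities and utilities; the best decision maximizes expected utility.
decisions = {
    "retrofit building": [(0.9, 100), (0.1, -500)],   # (probability, utility)
    "do nothing":        [(0.6, 150), (0.4, -2000)],
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

best = max(decisions, key=lambda d: expected_utility(decisions[d]))
for d, outcomes in decisions.items():
    print(f"{d}: expected utility = {expected_utility(outcomes):.1f}")
print("best decision:", best)
```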
Decision tree learning
Decision tree learning is a supervised learning approach used in statistics, data mining and machine learning. In this formalism, a classification or regression decision tree is used as a predictive model to draw conclusions about a set of observations. Tree models where the target variable can take a discrete set of values are called classification trees; in these tree structures, leaves represent class labels and branches represent conjunctions of features that lead to those class labels.
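
A minimal classification-tree example, assuming scikit-learn and the standard Iris dataset: the fitted tree's internal nodes hold feature-threshold splits and its leaves hold class labels, as described above.

```python
# Train a small classification tree and print its rules and test accuracy.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0)
clf.fit(X_train, y_train)

# Branches are conjunctions of feature thresholds; leaves are class labels.
print(export_text(clf, feature_names=load_iris().feature_names))
print("test accuracy:", clf.score(X_test, y_test))
```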
Decision tree pruning
Pruning is a data compression technique in machine learning and search algorithms that reduces the size of decision trees by removing sections of the tree that are non-critical and redundant to classify instances. Pruning reduces the complexity of the final classifier, and hence improves predictive accuracy by the reduction of overfitting. One of the questions that arises in a decision tree algorithm is the optimal size of the final tree. A tree that is too large risks overfitting the training data and poorly generalizing to new samples.
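
A sketch of post-pruning, assuming scikit-learn's minimal cost-complexity pruning (the ccp_alpha parameter) as a proxy for the pruning discussed above; C4.5's error-based pruning differs in detail but serves the same purpose of trimming branches that do not generalize.

```python
# Grow a full tree, then pick the pruning strength that maximizes held-out accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
path = full.cost_complexity_pruning_path(X_tr, y_tr)

best_alpha, best_acc = 0.0, 0.0
for alpha in path.ccp_alphas:
    pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha).fit(X_tr, y_tr)
    acc = pruned.score(X_te, y_te)
    if acc >= best_acc:
        best_alpha, best_acc = alpha, acc

print(f"unpruned nodes: {full.tree_.node_count}, "
      f"best alpha: {best_alpha:.5f}, held-out accuracy: {best_acc:.3f}")
```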
Related publications (32)

Distributional Regression and Autoregression via Optimal Transport

Laya Ghodrati

We present a framework for performing regression when both covariate and response are probability distributions on a compact and convex subset of ℝ^d. Our regression model is based on the theory of optimal transport and links the conditional Fréchet m ...
EPFL, 2023

Reinforcement bond performance in 3D concrete printing: Explainable ensemble learning augmented by deep generative adversarial networks

Xianlin Wang

Integrating various reinforcements into 3D concrete printing (3DCP) is an efficient method to satisfy critical requirements for structural applications. This paper explores an explainable ensemble machine learning (EML) method to predict the bond failure m ...
Amsterdam, 2023

Bayes-optimal Learning of Deep Random Networks of Extensive-width

Florent Gérard Krzakala, Lenka Zdeborová, Hugo Chao Cui

We consider the problem of learning a target function corresponding to a deep, extensive-width, non-linear neural network with random Gaussian weights. We consider the asymptotic limit where the number of samples, the input dimension and the network width ...
2023