Enhanced probabilistic neural network with data imputation capabilities for machine-fault classification

Document Type

Article

Publication Date

1-1-2009

Abstract

This paper presents an expectation-maximization (EM) variant of the probabilistic neural network (PNN) as a step toward an autonomous and deterministic PNN. In real-world operation, faulty sensor readings can occur and produce input vectors with missing features, yet such vectors should not be discarded. To address this, regularized EM is applied as a preprocessing step to impute the missing values. With random initialization, users must determine the number of clusters by trial and error, which makes the procedure stochastic. Global k-means is therefore used to find the number of clusters autonomously via a selection criterion and to provide that number deterministically for training the model. In addition, fast Global k-means is tested as an alternative to Global k-means to reduce computational time. Tests are conducted on both homoscedastic and heteroscedastic PNNs, using benchmark medical datasets as well as vibration data collected from a US Navy CH-46E helicopter aft gearbox, known as the Westland dataset. The test results fully support the use of fast Global k-means and regularized EM as preprocessing steps to aid the EM-trained PNN.
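
For readers who want a concrete picture of the pipeline the abstract describes, the Python sketch below is written for this page and is not the authors' implementation. It strings together the three stages: an EM-style imputation of missing sensor features, a per-class choice of cluster centres in the spirit of (fast) Global k-means with a BIC-like score standing in for the paper's selection criterion, and a homoscedastic PNN classifier. The synthetic data, the fixed kernel width sigma, and all helper names are illustrative assumptions.

import numpy as np
from sklearn.cluster import KMeans

def impute_em(X, n_iter=20):
    """Rough EM-style imputation: fill missing entries with the conditional
    mean of a single Gaussian fitted to the data (covariance regularized)."""
    X = X.copy()
    miss = np.isnan(X)
    col_mean = np.nanmean(X, axis=0)
    X[miss] = np.take(col_mean, np.where(miss)[1])      # start from column means
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        for i in np.where(miss.any(axis=1))[0]:
            m, o = miss[i], ~miss[i]
            # conditional mean of missing features given the observed ones
            X[i, m] = mu[m] + cov[np.ix_(m, o)] @ np.linalg.solve(
                cov[np.ix_(o, o)], X[i, o] - mu[o])
    return X

def pick_centres(Xc, max_k=5):
    """Grow the number of clusters for one class and keep the model with the
    lowest BIC-like score (a stand-in for the paper's selection criterion)."""
    best_score, best_centres = np.inf, None
    n, d = Xc.shape
    for k in range(1, max_k + 1):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(Xc)
        score = n * np.log(km.inertia_ / n + 1e-12) + k * d * np.log(n)
        if score < best_score:
            best_score, best_centres = score, km.cluster_centers_
    return best_centres

def pnn_predict(X, centres_by_class, sigma=0.5):
    """Homoscedastic PNN: one Gaussian kernel of width sigma on every centre,
    averaged per class; predict the class with the largest response."""
    scores = []
    for centres in centres_by_class:
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        scores.append(np.exp(-d2 / (2 * sigma ** 2)).mean(axis=1))
    return np.argmax(np.stack(scores, axis=1), axis=1)

# toy usage with synthetic two-class data and a few missing entries
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4))])
y = np.array([0] * 50 + [1] * 50)
X[rng.random(X.shape) < 0.05] = np.nan          # simulate faulty sensor readings
X = impute_em(X)                                 # regularized-EM-style imputation
centres = [pick_centres(X[y == c]) for c in (0, 1)]
print("training accuracy:", (pnn_predict(X, centres) == y).mean())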

Keywords

Neural computation, Application, Helicopter, Vibration, Benchmarks, Heteroscedasticity, Selection criterion, Data analysis, Error estimation, Initialization, Measurement sensor, Maximization, Cluster analysis (statistics), Fault, Neural network

Divisions

ai

Publication Title

Neural Computing and Applications

Volume

18

Issue

7

Publisher

Springer-Verlag (Germany)

Additional Information

Chang, Roy Kwang Yang; Loo, Chu Kiong; Rao, M. V. C.
