This book addresses the fault tolerance of RBF networks in which all hidden nodes have the same fault rate and their fault probabilities are independent. Assuming Gaussian-distributed noise in the output data, we derive an objective function for robustly training an RBF network based on the Kullback-Leibler divergence. We also find that, for a fault-tolerant regularizer, some eigenvalues of the regularization matrix should be negative. For Tipping's regularizer and the OLS regularizer, the regularization matrices are positive definite or positive semi-definite; hence, they cannot efficiently handle the multinode open fault.
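The claim about the eigenvalue spectrum can be illustrated numerically. The sketch below (an assumption-laden illustration, not the book's actual construction) builds a Gaussian RBF hidden-layer matrix for a toy one-dimensional problem and forms a Gram-type regularization matrix of the kind used by the OLS regularizer. Such a matrix is positive semi-definite by construction, so none of its eigenvalues can be negative, which is why it cannot supply the negative eigenvalues a fault-tolerant regularizer requires. The sizes, centers, and width below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical RBF hidden-layer output matrix: N samples x M hidden nodes.
N, M = 50, 10
X = rng.uniform(-1, 1, (N, 1))
centers = np.linspace(-1, 1, M)   # illustrative fixed RBF centers
width = 0.5                       # illustrative common RBF width
Phi = np.exp(-(X - centers) ** 2 / (2 * width ** 2))  # Gaussian activations

# A Gram-type regularization matrix, as in the OLS regularizer, is
# positive semi-definite by construction: w' (Phi' Phi) w = ||Phi w||^2 >= 0.
G = Phi.T @ Phi
eigvals = np.linalg.eigvalsh(G)
print(np.all(eigvals >= -1e-10))  # True: no eigenvalue is negative
```

Because every eigenvalue of such a matrix is nonnegative, the regularizer only ever shrinks weights; it cannot reshape the penalty in the directions that compensating for multinode open faults would require.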