Search Results

You are viewing 1–4 of 4 items for:

  • "backpropagation"
Free access

S. Mancuso and F.P. Nicese

Backpropagation neural networks (BPNNs) were used to distinguish among 10 olive (Olea europaea L.) cultivars originating throughout the Mediterranean basin. Identification was performed on the basis of 17 phyllometric parameters obtained from image analysis. Different BPNN architectures were tested, and the best performance was achieved with a 17 × 20 × 10 BPNN. The networks were evaluated with sets of phyllometric parameters not used in the training phase, and the results enabled all cultivars tested to be identified with certainty.
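The 17 × 20 × 10 architecture described in this abstract can be sketched as a minimal backpropagation network trained on a quadratic error criterion. This is a generic illustration, not the authors' implementation: only the layer sizes (17 inputs, 20 hidden units, 10 cultivar classes) come from the abstract, and the data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes follow the 17 x 20 x 10 BPNN in the abstract:
# 17 phyllometric inputs, 20 hidden units, 10 cultivar classes.
n_in, n_hid, n_out = 17, 20, 10

W1 = rng.normal(0.0, 0.5, (n_in, n_hid))
b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.5, (n_hid, n_out))
b2 = np.zeros(n_out)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Synthetic stand-in data: 50 samples with one-hot "cultivar" labels.
X = rng.normal(size=(50, n_in))
T = np.eye(n_out)[rng.integers(0, n_out, 50)]

lr = 0.5
losses = []
for _ in range(200):
    # Forward pass through the two sigmoid layers.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    # Quadratic (sum-of-squares) error criterion.
    losses.append(0.5 * np.sum((Y - T) ** 2))
    # Backward pass: delta rule for sigmoid units.
    dY = (Y - T) * Y * (1.0 - Y)
    dH = (dY @ W2.T) * H * (1.0 - H)
    W2 -= lr * H.T @ dY / len(X)
    b2 -= lr * dY.sum(0) / len(X)
    W1 -= lr * X.T @ dH / len(X)
    b1 -= lr * dH.sum(0) / len(X)
```

After training, `Y` holds the network's 10-way output for each sample, and `losses` should be decreasing.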

Free access

Francisca López-Granados, M. Teresa Gómez-Casero, José M. Peña-Barragán, Montserrat Jurado-Expósito and Luis García-Torres

, 2008 ). The MLP neural model is a fully connected multilayer feed-forward supervised learning network with symmetric sigmoid activation functions, trained by the back-propagation algorithm to minimize a quadratic error criterion. Fully connected
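The "symmetric sigmoid" activation mentioned in this excerpt is commonly taken to be the hyperbolic tangent; a quick sketch of its forward value and the derivative factor used in the backpropagated delta (an assumption for illustration, not the authors' code):

```python
import math

def symmetric_sigmoid(x):
    # tanh maps inputs to (-1, 1) and is symmetric about the origin,
    # unlike the logistic sigmoid, which maps to (0, 1).
    return math.tanh(x)

def symmetric_sigmoid_deriv(x):
    # d/dx tanh(x) = 1 - tanh(x)^2, the factor multiplied into the
    # backpropagated error when minimizing a quadratic criterion.
    return 1.0 - math.tanh(x) ** 2
```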

Free access

Arthur Villordon, Christopher Clark, Tara Smith, Don Ferrin and Don LaBonte

= Greedy method, value of Ridge parameter = 1.0E-8), multilayer perceptron (backpropagation neural network classifier), Gaussian radial basis function network, support vector machine, various decision tree procedures (decision stump, M5P, and REPtree), and

Open access

Jinshi Cui, Myongkyoon Yang, Daesik Son, Seongmin Park and Seong-In Cho

by using fruit-quality factors obtained from tomato samples, were developed. The models consisted essentially of three parts: input, hidden, and output layers. Also, both a supervised learning method and Levenberg-Marquardt back-propagation
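Levenberg-Marquardt back-propagation, mentioned in this excerpt, replaces the plain gradient step with a damped Gauss-Newton update, θ ← θ − (JᵀJ + μI)⁻¹Jᵀr. A minimal sketch on a toy least-squares fit (an illustration only; the problem and all names here are invented, unrelated to the tomato data):

```python
import numpy as np

def lm_step(theta, residual_fn, jacobian_fn, mu):
    """One Levenberg-Marquardt update: theta - (J^T J + mu I)^{-1} J^T r."""
    r = residual_fn(theta)
    J = jacobian_fn(theta)
    A = J.T @ J + mu * np.eye(len(theta))
    return theta - np.linalg.solve(A, J.T @ r)

# Toy problem: fit y = a * exp(b * x) to noiseless synthetic data.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(0.5 * x)

def residual(theta):
    a, b = theta
    return a * np.exp(b * x) - y

def jacobian(theta):
    a, b = theta
    e = np.exp(b * x)
    # Columns: d(residual)/da and d(residual)/db.
    return np.column_stack([e, a * x * e])

theta = np.array([1.0, 0.0])  # deliberately poor starting guess
mu = 1e-2
for _ in range(50):
    new_theta = lm_step(theta, residual, jacobian, mu)
    # Simple damping schedule: accept the step only if the error drops.
    if np.sum(residual(new_theta) ** 2) < np.sum(residual(theta) ** 2):
        theta, mu = new_theta, mu * 0.5
    else:
        mu *= 2.0
```

As μ shrinks the update approaches a Gauss-Newton step, which is what makes Levenberg-Marquardt training converge much faster than plain gradient-descent backpropagation on small networks.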