2005-10-13, 09:34 AM   #4
sperry

Sounds like that neural net can't learn anything. Even if the weights are being modified during the learning procedure, each perceptron in the layer will be outputting the same value to the next layer, which throws away any learned behavior: identical units seeing identical inputs get identical updates, so they never differentiate from one another. Essentially, the network acts as a simple adding machine whose result is scaled by the learned weights.

...I think, I never really learned much about neural networks.
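Here's a tiny NumPy sketch of what I mean (not from the original homework; the network shape, data, and numbers are all made up for illustration). Every hidden unit starts with identical weights, and even after training they're still identical, so the layer behaves like one scaled unit:

[code]
# Minimal sketch, assuming a 3-input, 5-hidden-unit, 1-output net with
# all hidden units initialized to the same weights.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # 4 samples, 3 inputs (made-up data)
y = rng.normal(size=(4, 1))          # made-up targets

W1 = np.full((3, 5), 0.1)            # all 5 hidden units share the same weights
W2 = np.full((5, 1), 0.1)
lr = 0.01

for step in range(100):
    h = np.tanh(x @ W1)              # every column of h is identical
    pred = h @ W2
    err = pred - y

    # Backprop by hand (squared error); identical units get identical gradients.
    dW2 = h.T @ err
    dh = err @ W2.T
    dW1 = x.T @ (dh * (1 - h ** 2))

    W1 -= lr * dW1
    W2 -= lr * dW2

# All hidden-unit weight columns are still identical after training:
print(np.allclose(W1, W1[:, :1]))    # True -> the layer never differentiates
[/code]

Since the units stay clones of each other, the whole hidden layer is just one unit repeated five times, which is why the net can't learn anything interesting.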

And JC, stop getting other people to do your homework.