
Answer by greeness for Creating a basic feed forward perceptron neural network for multi-class classification

One thing that should help is to use cross-entropy error instead of classification error or mean-squared error (MSE) for a multi-class problem like this (especially for evaluation). This is a nice article that explains the idea. I will quote its example here:

Suppose we are predicting a person’s political party affiliation (democrat, republican, other) from independent data such as age, sex, annual income, and so on. ...

Now suppose you have just three training data items. Your neural network uses softmax activation for the output neurons, so there are three output values that can be interpreted as probabilities. For example, suppose the neural network's computed outputs and the target (desired) values are as follows:

computed       | targets              | correct?
-----------------------------------------------
0.3  0.3  0.4  | 0  0  1 (democrat)   | yes
0.3  0.4  0.3  | 0  1  0 (republican) | yes
0.1  0.2  0.7  | 1  0  0 (other)      | no

This neural network has a classification error of 1/3 ≈ 0.33. Notice that the NN just barely gets the first two training items correct and is way off on the third training item. Now consider a second set of outputs:

computed       | targets              | correct?
-----------------------------------------------
0.1  0.2  0.7  | 0  0  1 (democrat)   | yes
0.1  0.7  0.2  | 0  1  0 (republican) | yes
0.3  0.4  0.3  | 1  0  0 (other)      | no

This NN also has a classification error of 1/3 ≈ 0.33. But this second NN is much better than the first, because it nails the first two training items and just barely misses the third. To summarize, classification error is a very crude measure of error. See below for a comparison of the classification error and average cross-entropy error in the two cases:

Neural network | Classification error   | Average cross-entropy error
--------------------------------------------------------------------
NN1            |        0.33            |     1.38
NN2            |        0.33            |     0.64
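
For concreteness, here is a quick NumPy sketch that reproduces both measures. The array values are copied straight from the two tables above; the helper function names are mine:

```python
import numpy as np

# Softmax outputs and one-hot targets copied from the two tables above.
targets = np.array([[0, 0, 1],   # democrat
                    [0, 1, 0],   # republican
                    [1, 0, 0]])  # other
nn1 = np.array([[0.3, 0.3, 0.4],
                [0.3, 0.4, 0.3],
                [0.1, 0.2, 0.7]])
nn2 = np.array([[0.1, 0.2, 0.7],
                [0.1, 0.7, 0.2],
                [0.3, 0.4, 0.3]])

def classification_error(outputs, targets):
    # Fraction of examples whose argmax prediction misses the true class.
    return np.mean(outputs.argmax(axis=1) != targets.argmax(axis=1))

def avg_cross_entropy(outputs, targets):
    # Mean of -log(probability the network assigned to the true class).
    return -np.mean(np.log((outputs * targets).sum(axis=1)))

for name, out in [("NN1", nn1), ("NN2", nn2)]:
    print(name, classification_error(out, targets), avg_cross_entropy(out, targets))
# NN1: 0.33, 1.38   NN2: 0.33, 0.64 -- same classification error,
# very different cross-entropy.
```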

To use cross-entropy error in training, you need a different cost function (see details here):

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{j=1}^{k} 1\{y^{(i)} = j\} \log \frac{e^{\theta_j^T x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^T x^{(i)}}}
where m is the number of training examples, k is the number of classes, y^{(i)} is the label of the i-th example, x^{(i)} is its feature vector, 1\{\cdot\} is the indicator function, and \theta holds the weight parameters.
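
As a minimal sketch (my own translation, not the linked article's code), the cost can be written directly in NumPy. I am assuming theta is stored as a (k, n) matrix with one row of weights per class and y holds integer labels; note that the double sum collapses to picking out the log-probability of each example's true class:

```python
import numpy as np

def softmax_cost(theta, X, y):
    """Cross-entropy cost J(theta) for softmax (multinomial) regression.

    theta : (k, n) weight matrix, one row per class (assumed layout)
    X     : (m, n) feature matrix, one row per example
    y     : (m,)   integer class labels in {0, ..., k-1}
    """
    m = X.shape[0]
    logits = X @ theta.T                              # (m, k): theta_j^T x^(i)
    logits = logits - logits.max(axis=1, keepdims=True)  # stabilize exp
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)         # softmax probabilities
    # Average of -log(probability of each example's true class).
    return -np.mean(np.log(probs[np.arange(m), y]))
```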

