Question
I'm having trouble seeing what the threshold actually does in a single-layer perceptron. The data is usually separated no matter what the value of the threshold is. It seems a lower threshold divides the data more equally; is this what it is used for?
Answer 1:
Actually, you only set a threshold when you aren't using a bias; otherwise, the threshold is 0.
Remember that a single neuron divides your input space with a hyperplane. OK?
Now imagine a neuron with 2 inputs X = [x1, x2], 2 weights W = [w1, w2], and threshold TH. This equation shows how the neuron works:
x1·w1 + x2·w2 = TH
which is equivalent to:
x1·w1 + x2·w2 - 1·TH = 0
That is, this is the equation of the hyperplane that divides your input space.
Notice that this neuron only works if you set the threshold manually. The solution is to turn TH into another weight:
x1·w1 + x2·w2 - 1·w0 = 0
where the term 1·w0 is your BIAS. Now you can still draw a hyperplane in your input space without setting a threshold manually (i.e., the threshold is always 0). But if you do set the threshold to another value, the weights will simply adapt themselves to satisfy the equation, i.e., the weights (INCLUDING the bias) absorb the threshold's effect.
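The equivalence above can be checked with a small sketch (the function names here are illustrative, not from the original post): a neuron with an explicit threshold TH behaves exactly like a neuron with threshold 0 whose bias weight w0 equals TH.

```python
# Sketch: absorbing an explicit threshold into a bias weight.
# fires_with_threshold and fires_with_bias are hypothetical helper names.

def fires_with_threshold(x, w, th):
    # Neuron fires when x1*w1 + x2*w2 >= TH.
    return x[0] * w[0] + x[1] * w[1] >= th

def fires_with_bias(x, w, w0):
    # Same neuron rewritten with threshold 0 and a bias term:
    # x1*w1 + x2*w2 - 1*w0 >= 0.
    return x[0] * w[0] + x[1] * w[1] - w0 >= 0

# The two formulations agree whenever w0 == TH.
w, th = [0.4, 0.6], 0.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert fires_with_threshold(x, w, th) == fires_with_bias(x, w, th)
```

In practice this is why bias-based formulations are preferred: the learning rule can adjust w0 just like any other weight, instead of requiring a hand-picked threshold.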
Answer 2:
The sum of the products of the weights and the inputs is calculated in each node, and if the value is above some threshold (typically 0) the neuron fires and takes the activated value (typically 1); otherwise it takes the deactivated value (typically -1). Neurons with this kind of activation function are also called Artificial neurons or linear threshold units.
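This activation rule can be sketched in a few lines of Python (a minimal illustration, assuming the typical values mentioned above: threshold 0, activated value +1, deactivated value -1):

```python
def activate(inputs, weights, threshold=0.0):
    # Weighted sum of inputs; the neuron "fires" (+1) when the sum
    # exceeds the threshold, otherwise it outputs the deactivated value (-1).
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s > threshold else -1

print(activate([1, 1], [0.5, 0.6]))    # fires: 0.5 + 0.6 = 1.1 > 0
print(activate([1, 1], [-0.5, -0.6]))  # does not fire: -1.1 <= 0
```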
Answer 3:
I think I understand now, with help from Daok. I just wanted to add information for other people to find.
The equation for the separator for a single-layer perceptron is
Σwjxj+bias=threshold
This means that if the weighted sum plus bias is higher than the threshold, or
Σwjxj + bias > threshold, the input gets classified into one category, and if
Σwjxj + bias < threshold, it gets classified into the other.
The bias and the threshold really serve the same purpose, to translate the line (see Role of Bias in Neural Networks). Being on opposite sides of the equation, though, they are "negatively proportional".
For example, if the bias was 0 and the threshold 0.5, this would be equivalent to a bias of -0.5 and a threshold of 0.
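That numeric example can be verified directly (a small sketch; the `classify` helper is illustrative, not from the original post):

```python
def classify(x, w, bias, threshold):
    # Returns 1 if sum(wj * xj) + bias > threshold, else 0.
    return int(sum(wj * xj for wj, xj in zip(w, x)) + bias > threshold)

# Bias 0 with threshold 0.5 gives the same decisions as
# bias -0.5 with threshold 0, for every input.
w = [0.3, 0.7]
for x in [(0.1, 0.2), (0.5, 0.5), (0.9, 0.9)]:
    assert classify(x, w, bias=0.0, threshold=0.5) == classify(x, w, bias=-0.5, threshold=0.0)
```

Subtracting the threshold from both sides of the inequality is exactly the "negatively proportional" relationship described above.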
Source: https://stackoverflow.com/questions/6554792/whats-the-point-of-the-threshold-in-a-perceptron