The Role of the Bias Input in Neural Networks

Reposted from: https://stackoverflow.com/questions/2480650/role-of-bias-in-neural-networks

I think that biases are almost always helpful. In effect, a bias value allows you to shift the activation function to the left or right, which may be critical for successful learning.

It might help to look at a simple example. Consider a 1-input, 1-output network that has no bias.

The output of the network is computed by multiplying the input (x) by the weight (w0) and passing the result through an activation function (e.g. a sigmoid function), so the network computes sig(w0*x).
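As a minimal sketch of that no-bias network in Python (the helper names `sigmoid` and `forward` are mine, not from the original answer):

```python
import math

def sigmoid(z):
    # Standard logistic function: squashes any real number into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w0):
    # No-bias network: the output is just the weighted input
    # passed through the sigmoid.
    return sigmoid(w0 * x)

print(forward(2.0, 1.0))  # sigmoid(2.0) ~= 0.88
```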

Plotting that function for various values of w0, you can see that changing the weight w0 essentially changes the "steepness" of the sigmoid. That's useful, but what if you wanted the network to output 0 when x is 2? Just changing the steepness of the sigmoid won't really work -- you want to be able to shift the entire curve to the right.
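One way to see why steepness alone can't do this: no matter what w0 is, sig(w0 * 0) = sig(0) = 0.5, so every no-bias curve is pinned to the point (0, 0.5). Continuing the sketch above:

```python
# Every no-bias sigmoid passes through (0, 0.5); w0 only changes the slope.
for w0 in (0.5, 1.0, 2.0, -5.0):
    print(w0, forward(0.0, w0))  # prints 0.5 every time
```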

That's exactly what the bias allows you to do. If we add a bias to that network (a constant input of 1.0 with its own weight, w1), then the output of the network becomes sig(w0*x + w1*1.0). Plotting the output for various values of w1, the curve keeps its shape but slides horizontally.
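In code, the bias is just one more term added to the weighted sum before the activation; a sketch extending the functions above (the name `forward_with_bias` is mine):

```python
def forward_with_bias(x, w0, w1):
    # The bias acts as a constant 1.0 input with its own weight w1,
    # shifting the sigmoid left or right along the x axis.
    return sigmoid(w0 * x + w1 * 1.0)
```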

Having a weight of -5 for w1 shifts the curve to the right, which allows us to have a network that outputs 0 when x is 2.
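Checking that numerically with the sketch above (assuming w0 = 1.0 for concreteness): sig(1*2 + (-5)*1.0) = sig(-3) ≈ 0.047, which is effectively 0.

```python
print(forward_with_bias(2.0, 1.0, -5.0))  # sigmoid(-3.0) ~= 0.047
```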
