
Learning neural network weights

To understand this section, let us assume that the person in question will eventually be affected by heart disease, which in this example means that the correct output of our sigmoid function is 0.

We begin by assigning some random non-zero values to the weights in the equation, as shown in the following diagram:

We do this because we do not really know what the initial value of the weights should be.

We now do what we learned in the previous section: we move in the forward direction through our network, from the input layer to the output layer. We multiply the features by the weights, sum the products, and apply the sum to the sigmoid function. Here is what we obtain as the final output:

The weighted sum obtained is 4,109, which, when applied to the activation function, gives us a final output of 1, the complete opposite of the answer we were looking for.
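To make the forward pass concrete, here is a minimal Python sketch. The four feature values and the random initial weights are hypothetical stand-ins for the numbers in the diagram; what matters is that the weighted sum comes out large and positive, just as the 4,109 in the text does:

```python
import math

def sigmoid(z):
    # Squashes any real number into the (0, 1) range
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical feature values: age, lifestyle, genes, body weight
features = [70.0, 3.0, 8.0, 95.0]

# Random non-zero initial weights, as in the diagram
weights = [30.0, 3.0, 50.0, 16.0]

# Forward pass: multiply, sum, then apply the activation
z = sum(w * x for w, x in zip(weights, features))
print(z)           # 4029.0, a large positive sum
print(sigmoid(z))  # 1.0, since the sigmoid saturates for large inputs
```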

What do we do to improve the situation? The answer is a backward pass: we move through our model from the output layer to the input layer and adjust the weights, so that the next forward pass produces a better result.

To counter this, the neural network will try to vary the values of the weights, as depicted in the following diagram:

It lowers the weight on the age feature so that age contributes negatively to the sum, slightly increases the weight on lifestyle because that feature contributes positively, and assigns negative weights to the genes and body weight features.
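The text describes this adjustment qualitatively; in practice it is done with gradient descent. Here is a minimal sketch of one such update for a single sigmoid neuron, assuming a cross-entropy loss (an assumption, since the section does not name a loss function), for which the error signal is simply the difference between the prediction and the target:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def backward_pass(features, weights, target, learning_rate=0.01):
    # Recompute the forward pass to see how wrong we currently are
    z = sum(w * x for w, x in zip(weights, features))
    prediction = sigmoid(z)
    error = prediction - target  # positive when the output is too high
    # Move each weight against its feature's contribution to the error,
    # so weights on large positive features (age, body weight) drop fastest
    return [w - learning_rate * error * x
            for w, x in zip(weights, features)]
```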

We do another forward pass, and this time we obtain a smaller sum of 275, but the sigmoid function still produces an output of 1:

We do a backward pass again, and this time we may have to vary the weights even further:

The next time we do a forward pass, the equation produces a negative value, and if we apply this to the sigmoid function, we obtain a final output of 0:
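The progression across the three forward passes comes down to how the sigmoid saturates, which a quick numerical check makes clear (the -12 for the final pass is a hypothetical value, since the text only says the sum is negative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# The three forward passes from the text
print(sigmoid(4109))  # 1.0: far into the saturated region
print(sigmoid(275))   # 1.0: a much smaller sum, but still saturated
print(sigmoid(-12))   # ~0.000006: a negative sum finally yields ~0
```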

Comparing this 0 to the required value of 0, we realize it is time to stop, because the network has now learned to predict this example correctly.

A forward pass and a backward pass together are called one iteration. In reality, we have 1,000, 100,000, or even millions of training examples, and before we change the weights, we take into account the contribution of each of these examples. Basically, we sum up the contributions of all the examples, and only then change the weights.
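As a sketch of what one such iteration over a whole dataset might look like, again assuming the hypothetical cross-entropy setup from the earlier snippet, the per-example contributions are accumulated first and the weights are changed only once at the end:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_iteration(dataset, weights, learning_rate=0.01):
    # dataset is a list of (features, target) pairs
    gradient = [0.0] * len(weights)
    for features, target in dataset:
        z = sum(w * x for w, x in zip(weights, features))
        error = sigmoid(z) - target
        for i, x in enumerate(features):
            gradient[i] += error * x  # sum every example's contribution
    # Only now, after seeing all the examples, do we change the weights
    return [w - learning_rate * g for w, g in zip(weights, gradient)]
```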
