- Neural Networks with Keras Cookbook
- V Kishore Ayyadevara
There's more...
In the previous section, we built a simple regression model (Y = a*x + b), writing a function to identify the optimal values of a and b. In this section, we will build a simple neural network with a hidden layer connecting the input to the output, on the same toy dataset that we worked with in the previous section.
We define the model as follows (the code file is available as Neural_networks_multiple_layers.ipynb in GitHub):
- The input is connected to a hidden layer that has three units
- The hidden layer is connected to an output layer that has one unit
Let us go ahead and code up the strategy discussed above, as follows:
- Define the dataset and import the relevant packages:
from copy import deepcopy
import numpy as np
x = [[1],[2],[3],[4]]
y = [[2],[4],[6],[8]]
We use deepcopy so that the original variable's values remain unchanged when the copy is modified.
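To see why this matters, compare a plain assignment (which only copies the reference, so changes are shared) with deepcopy (which creates a fully independent object):

```python
from copy import deepcopy

original = [[1.0, 2.0], [3.0, 4.0]]
alias = original             # same object; changes are shared
copied = deepcopy(original)  # fully independent copy

alias[0][0] = 99.0
print(original[0][0])  # 99.0 - mutating the alias changed the original
print(copied[0][0])    # 1.0  - the deep copy is unaffected
```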
- Initialize the weight and bias values randomly. The hidden layer has three units in it. Hence, there are a total of three weight values and three bias values – one corresponding to each of the hidden units.
Additionally, the final layer has one unit that is connected to the three units of the hidden layer. Hence, a total of three weights and one bias dictate the value of the output layer.
The randomly-initialized weights are as follows:
w = [np.array([[-0.82203424, -0.9185806, 0.03494298]]), np.array([0., 0., 0.]), np.array([[1.0692896], [0.62761235], [-0.5426246]]), np.array([0.])]
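Assuming the weights are stored as NumPy arrays (so that individual elements can later be perturbed in place), the four components have the following shapes, matching the three weights and three biases of the hidden layer and the three weights and one bias of the output layer:

```python
import numpy as np

w = [np.array([[-0.82203424, -0.9185806, 0.03494298]]),   # input->hidden weights
     np.array([0., 0., 0.]),                              # hidden-layer biases
     np.array([[1.0692896], [0.62761235], [-0.5426246]]), # hidden->output weights
     np.array([0.])]                                      # output bias

print([layer.shape for layer in w])  # [(1, 3), (3,), (3, 1), (1,)]
```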
- Implement the feed-forward network where the hidden layer has a ReLU activation in it:
def feed_forward(inputs, outputs, weights):
    pre_hidden = np.dot(inputs, weights[0]) + weights[1]
    hidden = np.where(pre_hidden < 0, 0, pre_hidden)
    out = np.dot(hidden, weights[2]) + weights[3]
    squared_error = np.square(out - outputs)
    return squared_error
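The np.where call is what implements the ReLU activation: negative pre-activations are clipped to zero, while non-negative ones pass through unchanged. A quick illustration:

```python
import numpy as np

pre_hidden = np.array([[-1.5, 0.0, 2.3]])
hidden = np.where(pre_hidden < 0, 0, pre_hidden)
print(hidden)  # the negative value is clipped to 0; the rest pass through
```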
- Define the back-propagation function similarly to what we did in the previous section. The only difference is that we now have to update the weights in more layers.
In the following code, we are calculating the original loss at the start of an epoch:
def update_weights(inputs, outputs, weights, epochs):
    for epoch in range(epochs):
        org_loss = feed_forward(inputs, outputs, weights)
In the following code, we copy the weights into two temporary variables: wts_tmp, which is perturbed one element at a time, and wts_tmp2, which accumulates the updates:
        wts_tmp = deepcopy(weights)
        wts_tmp2 = deepcopy(weights)
In the following code, we update each weight value by a small amount (while keeping every other weight unchanged) and then calculate the loss corresponding to the updated weight value. This is done for every weight in every layer of the network.
The change in the squared loss (del_loss) is attributed to the change in that weight value. We repeat this step for all the weights that exist in the network:
        for i, layer in enumerate(reversed(weights)):
            for index, weight in np.ndenumerate(layer):
                wts_tmp[-(i+1)][index] += 0.0001
                loss = feed_forward(inputs, outputs, wts_tmp)
                del_loss = np.sum(org_loss - loss)/(0.0001*len(inputs))
The weight value is then updated in proportion to the decrease in loss, scaled down by the learning rate parameter: a greater decrease in loss updates the weight by a larger amount, while a smaller decrease updates it by a smaller amount:
                wts_tmp2[-(i+1)][index] += del_loss*0.01
                wts_tmp = deepcopy(weights)  # reset the perturbed copy for the next weight
Finally, we assign the updated weights at the end of every epoch and, once all epochs have run, return them:
        weights = deepcopy(wts_tmp2)
    return weights
- Run the function for a given number of epochs so that the weights are updated that many times:
update_weights(x,y,w,1)
The output of the preceding code is the set of updated weight and bias values after one epoch of training.
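The snippets above can be assembled into one self-contained script. As a sanity check (and departing from the single epoch used above), the sketch below trains for 100 epochs to confirm that the loss decreases:

```python
import numpy as np
from copy import deepcopy

def feed_forward(inputs, outputs, weights):
    pre_hidden = np.dot(inputs, weights[0]) + weights[1]
    hidden = np.where(pre_hidden < 0, 0, pre_hidden)  # ReLU
    out = np.dot(hidden, weights[2]) + weights[3]
    return np.square(out - outputs)

def update_weights(inputs, outputs, weights, epochs):
    for epoch in range(epochs):
        org_loss = feed_forward(inputs, outputs, weights)
        wts_tmp = deepcopy(weights)   # perturbed one element at a time
        wts_tmp2 = deepcopy(weights)  # accumulates the updates
        for i, layer in enumerate(reversed(weights)):
            for index, weight in np.ndenumerate(layer):
                wts_tmp[-(i+1)][index] += 0.0001
                loss = feed_forward(inputs, outputs, wts_tmp)
                del_loss = np.sum(org_loss - loss)/(0.0001*len(inputs))
                wts_tmp2[-(i+1)][index] += del_loss*0.01
                wts_tmp = deepcopy(weights)  # reset before the next weight
        weights = deepcopy(wts_tmp2)
    return weights

x = np.array([[1.], [2.], [3.], [4.]])
y = np.array([[2.], [4.], [6.], [8.]])
w = [np.array([[-0.82203424, -0.9185806, 0.03494298]]),
     np.array([0., 0., 0.]),
     np.array([[1.0692896], [0.62761235], [-0.5426246]]),
     np.array([0.])]

loss_before = np.mean(feed_forward(x, y, w))
w = update_weights(x, y, w, 100)
loss_after = np.mean(feed_forward(x, y, w))
print('loss before:', loss_before)
print('loss after :', loss_after)
```

Note that with these particular initial weights, two of the hidden units start with negative pre-activations and so are zeroed out by the ReLU, but the loss still falls as the remaining parameters are adjusted.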
In the preceding steps, we learned how to build a neural network from scratch in Python. In the next section, we will learn about building a neural network in Keras.