
PyTorch non-linear activations

PyTorch already implements most of the common non-linear activation functions for us, and they can be used like any other layer. Let's look at a quick example of how to use the ReLU function in PyTorch:

import torch
from torch.autograd import Variable
from torch.nn import ReLU

sample_data = Variable(torch.Tensor([[1, 2, -1, -1]]))
myRelu = ReLU()
myRelu(sample_data)

Output:

Variable containing:
1 2 0 0
[torch.FloatTensor of size 1x4]

In the preceding example, we take a tensor containing two positive and two negative values and apply a ReLU to it, which thresholds the negative numbers to 0 and retains the positive numbers as they are.
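As a side note, the same operation is also available in functional form through torch.nn.functional.relu, which can be handy when you don't want to instantiate a layer object. The following is a minimal sketch of that usage, written in the same Variable style as the example above:

import torch
import torch.nn.functional as F
from torch.autograd import Variable

sample_data = Variable(torch.Tensor([[1, 2, -1, -1]]))
# F.relu applies max(0, x) element-wise, without creating a layer object
F.relu(sample_data)  # Variable containing: 1 2 0 0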

Now that we have covered most of the details required for building a network architecture, let's build a deep learning architecture that can be used to solve real-world problems. In the previous chapter, we used a simple approach so that we could focus only on how a deep learning algorithm works. We will not use that style anymore; instead, we will build the architecture the way it is meant to be built in PyTorch.
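To give a flavor of what that idiomatic style looks like, here is a minimal, illustrative sketch: a network defined as a subclass of nn.Module, with layers declared in __init__ and composed in forward. The class name and layer sizes are placeholders for illustration, not part of the original example:

import torch.nn as nn

class SampleNetwork(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(SampleNetwork, self).__init__()
        # Layers are declared once in __init__ ...
        self.layer1 = nn.Linear(input_size, hidden_size)
        self.relu = nn.ReLU()
        self.layer2 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        # ... and composed in forward; the ReLU activation
        # is used like any other layer
        out = self.layer1(x)
        out = self.relu(out)
        out = self.layer2(out)
        return out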
