Activation functions
We now know that an ANN is created by stacking individual computing units called perceptrons. We have also seen how a perceptron works and have summarized it as: output 1 if w·x + b > 0, and output 0 otherwise.
That is, it either outputs a 1 or a 0 depending on the values of the weight, w, and bias, b.
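To make this rule concrete, here is a minimal Python sketch of such a perceptron; the specific values of x, w, and b are illustrative assumptions, not taken from the text:

```python
# Minimal sketch of a single-input perceptron with a hard threshold.
# The weight and bias values below are illustrative assumptions.

def perceptron(x, w, b):
    """Output 1 if w*x + b > 0, otherwise 0."""
    z = w * x + b
    return 1 if z > 0 else 0

# A tiny change in the input flips the output abruptly:
print(perceptron(x=0.49, w=2.0, b=-1.0))  # z = -0.02 -> outputs 0
print(perceptron(x=0.51, w=2.0, b=-1.0))  # z =  0.02 -> outputs 1
```

This abrupt flip in the last two lines is exactly the problem examined next.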
Let's look at the following diagram to understand why there is a problem with just outputting either a 1 or a 0. The following is a diagram of a simple perceptron with just a single input, x:
[Figure: a simple perceptron with a single input, x, a weight, w, and a bias, b]
For simplicity, let's call z = w·x + b, where the following applies:
- w is the weight of the input, x, and b is the bias
- a is the output, which is either 1 or 0
Here, as the value of z changes, at some point, the output, a, changes from 0 to 1. As you can see, the change in output a is sudden and drastic:
[Figure: the output, a, plotted against z, jumping abruptly from 0 to 1]
What this means is that for some small change, Δz, we get a dramatic change in the output, a. This is not particularly helpful if the perceptron is part of a network, because if every perceptron changes its output this drastically, the network becomes unstable and hence fails to learn.
Therefore, to make the network more efficient and stable, we need to slow down the way each perceptron learns. In other words, we need to replace this sudden change in output from 0 to 1 with a more gradual one:
[Figure: the output, a, changing gradually from 0 to 1 as z increases]
This is made possible by activation functions. Activation functions are functions that are applied to a perceptron so that instead of outputting a 0 or a 1, it outputs any value between 0 and 1.
This means that each neuron can learn more slowly and at a greater level of detail by using smaller changes, Δa. Activation functions can be thought of as transformation functions that transform binary values into a sequence of smaller values between a given minimum and maximum.
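As a sketch of this idea, applying one such function, the sigmoid (introduced formally below), to the same z turns the abrupt flip into a gradual change; the z values are the same illustrative ones as before:

```python
import math

def sigmoid(z):
    """Squash z into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# The same small change in z now moves the output only slightly:
print(sigmoid(-0.02))  # ~0.495
print(sigmoid(0.02))   # ~0.505
```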
There are a number of ways to transform the binary outcomes to a sequence of values, namely the sigmoid function, the tanh function, and the ReLU function. We will have a quick look at each of these activation functions now.
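As a preview, here are minimal Python definitions of the three functions in their standard textbook forms; the sections that follow discuss each in detail:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))  # output in (0, 1)

def tanh(z):
    return math.tanh(z)                # output in (-1, 1)

def relu(z):
    return max(0.0, z)                 # 0 for negative z, identity otherwise

# Compare the three on a few sample values:
for z in (-2.0, 0.0, 2.0):
    print(f"z={z:+.1f}  sigmoid={sigmoid(z):.3f}  tanh={tanh(z):+.3f}  relu={relu(z):.1f}")
```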