- Neural Networks with R
- Giuseppe Ciaburro, Balaji Venkateswaran
Rectified Linear Unit
The Rectified Linear Unit (ReLU) has been the most widely used activation function since 2015. It applies a simple threshold condition and has advantages over the other functions. The function is defined by the following formula:

f(x) = max(0, x)
The following figure shows the ReLU activation function:

[Figure: plot of the ReLU activation function, flat at 0 for x < 0 and linear with slope 1 for x ≥ 0]
The output ranges from 0 to infinity. ReLU is widely used in deep neural networks for computer vision and speech recognition. There are various other activation functions as well, but we have covered the most important ones here.
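To make the definition concrete, here is a minimal sketch of ReLU in R. The function name relu is our own choice for illustration; the body relies only on base R's pmax, which takes element-wise maxima:

```r
# ReLU: returns max(0, x) element-wise.
relu <- function(x) {
  pmax(0, x)
}

# A quick check on a few sample inputs:
x <- c(-2, -0.5, 0, 1, 3)
relu(x)
# [1] 0 0 0 1 3
```

Because the function is piecewise linear, its gradient is simply 0 for negative inputs and 1 for positive inputs, which makes backpropagation through ReLU layers very cheap to compute.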