- Deep Learning Quick Reference
- Mike Bernico
The Adam optimizer
Adam is one of the best-performing optimizers I know of, and it's my first choice. It works well across a wide variety of problems. It combines the best parts of both momentum and RMSProp into a single update rule:

$$m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t$$

$$v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2$$

$$\hat{m}_t = \frac{m_t}{1 - \beta_1^t} \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t}$$

$$\theta_t = \theta_{t-1} - \frac{\eta \, \hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}$$

where $\epsilon$ is some very small number to prevent division by 0.
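As a concrete illustration (not code from this book), here is a minimal NumPy sketch of a single Adam step following the update rule above; the function name `adam_step` and the defaults (learning rate 0.001, $\beta_1 = 0.9$, $\beta_2 = 0.999$, $\epsilon = 10^{-8}$) are simply the commonly used values, assumed here for the example:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameters theta, given the gradient at step t (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad           # first moment: the momentum-style term
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment: the RMSProp-style term
    m_hat = m / (1 - beta1 ** t)                 # bias correction for the first moment
    v_hat = v / (1 - beta2 ** t)                 # bias correction for the second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # eps prevents division by zero
    return theta, m, v
```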
Adam is often a great choice and a great place to start when you're prototyping, so save yourself some time by starting with it.
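In Keras, for example, you can simply hand the optimizer to `model.compile`; the tiny regression model below is a placeholder of my own to show where Adam plugs in, and the learning rate shown is just the library's usual default rather than anything prescribed here:

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import Adam

# Placeholder model: a single hidden layer on 10 input features.
model = Sequential([
    Input(shape=(10,)),
    Dense(32, activation='relu'),
    Dense(1)
])

# Adam with its default hyperparameters is usually a solid starting point.
model.compile(optimizer=Adam(learning_rate=0.001), loss='mse')
```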