- Mastering Machine Learning on AWS
- Dr. Saket S.R. Mengle, Maximo Gurmendez
Gradient descent
The gradient descent algorithm is another popular way to estimate the parameters of a linear regression model. Gradient descent minimizes a function iteratively: we start with a set of initial parameter values and repeatedly adjust them in the direction that reduces the error. The gradient is the vector of partial derivatives of the error function with respect to the parameters, and the idea is to descend along it toward the lowest point of the error surface. Variants of gradient descent include batch gradient descent, which uses all observed examples in each iteration, and stochastic gradient descent, which updates the parameters using only one observation at a time. Batch gradient descent therefore takes more accurate steps than stochastic gradient descent, but each iteration is much more expensive, making it less suitable for large datasets.
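The two variants can be sketched in a few lines of NumPy. This code is not from the book; the learning rates and iteration counts are illustrative values chosen for the toy dataset below.

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.02, n_iters=5000):
    """Minimize mean squared error for y ~ X @ w + b, computing the
    gradient over the full dataset at every iteration."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_iters):
        residual = X @ w + b - y                 # error on every example
        w -= lr * (2.0 / n) * (X.T @ residual)   # step opposite the gradient
        b -= lr * (2.0 / n) * residual.sum()
    return w, b

def stochastic_gradient_descent(X, y, lr=0.01, n_epochs=200, seed=0):
    """Same objective, but update the parameters after each single
    observation, visiting the examples in shuffled order each epoch."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            residual = X[i] @ w + b - y[i]       # error on one example only
            w -= lr * 2.0 * residual * X[i]
            b -= lr * 2.0 * residual
    return w, b

# Recover the line y = 3x + 1 from noiseless data
X = np.arange(10, dtype=float).reshape(-1, 1)
y = 3.0 * X[:, 0] + 1.0
w_batch, b_batch = batch_gradient_descent(X, y)
w_sgd, b_sgd = stochastic_gradient_descent(X, y)
```

Each batch step here touches all ten examples and follows the exact gradient, while each stochastic step costs a single example but follows a noisy estimate of it; both recover the true slope and intercept on this small, noiseless dataset.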
There is a vast amount of research on regression algorithms, as they are well suited to predicting continuous variables. We encourage you to explore linear regression libraries and try the different variants they provide, evaluating their efficiency and effectiveness on test datasets.