- Hands-On Neural Networks
- Leonardo De Marchi Laura Mitchell
Softmax
The softmax function is a generalization of the sigmoid function. While the sigmoid gives us the probability for a binary output, softmax allows us to transform an unnormalized vector into a probability distribution. That means that softmax outputs a vector whose values all lie between 0 and 1 and sum to 1.
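As an illustrative sketch (not code from the book), softmax can be computed in NumPy by exponentiating each component and dividing by the sum of the exponentials; subtracting the maximum first is a common numerical-stability trick that leaves the result unchanged:

```python
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability; softmax is invariant
    # to adding a constant to every component, so the result is unchanged.
    shifted = z - np.max(z)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

logits = np.array([2.0, 1.0, 0.1])   # example unnormalized scores
probs = softmax(logits)
print(probs)        # every value is in (0, 1)
print(probs.sum())  # the values sum to 1
```

Note that the largest logit always maps to the largest probability, which is why softmax is typically used as the output layer for multi-class classification.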