- Machine Learning for OpenCV
- Michael Beyeler
Understanding logistic regression
Despite its name, logistic regression can actually be used as a model for classification. It uses a logistic function (or sigmoid) to convert any real-valued input x into a predicted output value ŷ that takes values between 0 and 1, as shown in the following figure:

Rounding ŷ to the nearest integer effectively classifies the input as belonging either to class 0 or class 1.
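The two steps described above can be sketched in a few lines of NumPy (the value 2.0 below is just an arbitrary example input):

```python
import numpy as np

def sigmoid(x):
    """Logistic (sigmoid) function: squashes any real-valued x into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

y_hat = sigmoid(2.0)        # a real-valued input mapped to a value in (0, 1)
label = int(round(y_hat))   # round to the nearest integer -> class 0 or 1
```

Note that sigmoid(0) is exactly 0.5, so the rounding step corresponds to a decision boundary at x = 0.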
Of course, most often, our problems have more than one input or feature value, x. For example, the Iris dataset provides a total of four features. For the sake of simplicity, let's focus here on the first two features, sepal length--which we will call feature f1--and sepal width--which we will call f2. Using the tricks we learned when talking about linear regression, we know we can express the input x as a linear combination of the two features, f1 and f2:

However, in contrast to linear regression, we are not done yet. From the previous section, we know that the sum of products would result in a real-valued output--but we are interested in a categorical value, zero or one. This is where the logistic function comes in: it acts as a squashing function, σ, that compresses the range of possible output values to the range [0, 1]:
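Putting the two pieces together, a minimal sketch looks like this. The weights w1, w2 and bias b below are made-up placeholder values purely for illustration; in practice they would be learned from the training data:

```python
import numpy as np

def sigmoid(x):
    """Logistic (sigmoid) function: squashes any real-valued x into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical weights and bias (in practice, learned during training)
w = np.array([0.8, -0.5])
b = 0.1

# One sample: sepal length (f1) and sepal width (f2), in cm
f = np.array([5.1, 3.5])

x = np.dot(w, f) + b   # linear combination of the two features
y_hat = sigmoid(x)     # squash the result into the range [0, 1]
label = int(round(y_hat))
```

The linear-regression machinery produces x, and the sigmoid turns it into a value we can round to a class label.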

Now let's apply this knowledge to the Iris dataset!
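Before diving into the OpenCV implementation, here is a quick sanity-check sketch using scikit-learn instead, restricted to the first two features and the first two classes so the problem stays binary (the split parameters are arbitrary choices):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Keep only the first two features (sepal length f1, sepal width f2)
# and the first two classes, so we have a binary problem.
X, y = load_iris(return_X_y=True)
mask = y < 2
X, y = X[mask, :2], y[mask]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

clf = LogisticRegression().fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

Setosa and versicolor are well separated by their sepal measurements, so even this two-feature model should classify the test set almost perfectly.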