- Mastering Machine Learning with R
- Cory Lesmeister
Classification methods and linear regression
So, why can't we use the least squares regression method that we learned in the previous chapter for a qualitative outcome? Well, as it turns out, you can, but at your own risk. Let's assume for a second that you have an outcome you're trying to predict and it has three different classes: mild, moderate, and severe. You and your colleagues also assume that the difference between mild and moderate is equivalent to the difference between moderate and severe, and that the relationship is linear. You can create a dummy variable where 0 is equal to mild, 1 is equal to moderate, and 2 is equal to severe. If you have reason to believe this, then linear regression might be an acceptable solution. However, qualitative labels such as these can carry a high level of measurement error that biases the OLS estimates, and in most business problems there's no scientifically acceptable way to convert a qualitative response into a quantitative one.

What if you have a response with only two outcomes, say fail and pass? Again, using the dummy variable approach, we can code the fail outcome as 0 and the pass outcome as 1. Using linear regression, we could then build a model where the predicted value is the probability of an observation passing or failing. However, the estimates of Y from that model will most likely fall outside the probability constraints of [0, 1] and hence be difficult to interpret.
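A quick numerical sketch makes the last point concrete. The data below are hypothetical, and the snippet uses Python's NumPy (the book itself works in R) to fit an ordinary least squares line to a 0/1 pass/fail outcome; the fitted values at the extremes of the predictor range escape [0, 1]:

```python
import numpy as np

# Hypothetical data: an exam score (predictor) and a pass/fail outcome
# dummy-coded as 1/0, as described above.
x = np.array([35.0, 45.0, 50.0, 55.0, 60.0, 65.0, 70.0, 80.0, 90.0, 95.0])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 1], dtype=float)

# Ordinary least squares fit: y ≈ b0 + b1 * x
X = np.column_stack([np.ones_like(x), x])
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]

# Predictions at the extremes of the predictor range fall outside [0, 1],
# so they cannot be read as probabilities.
print(b0 + b1 * 30)   # negative "probability"
print(b0 + b1 * 100)  # "probability" greater than 1
```

This is exactly the interpretability problem that motivates logistic regression, which constrains the predicted values to lie between 0 and 1.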