
Logistic regression for classification

In the previous section, we learned how to predict continuous values. There's another common task in ML: the task of classification. Separating dogs from cats and spam from not spam, or even identifying the different objects in a room or scene—all of these are classification tasks.

Logistic regression is a long-established classification technique. It provides the probability of an event taking place, given an input value. The events are represented as categorical dependent variables, and the probability of a particular dependent variable being 1 is given using the logit function:
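The equation referenced above did not survive extraction. A standard form of the logistic (sigmoid) function, written with the W^T X + b notation used in the bullet points below, is:

```latex
\hat{Y}_{pred} = \sigma(W^T X + b) = \frac{1}{1 + e^{-(W^T X + b)}}
```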

Before going into the details of how we can use logistic regression for classification, let's examine the logit function (also called the sigmoid function because of its S-shaped curve). The following diagram shows how the logit function and its derivative vary with respect to the input X: the sigmoidal function (blue) and its derivative (orange).

A few important things to note from this diagram are the following:

  • The value of the sigmoid (and hence Ypred) lies in the interval (0, 1)
  • The derivative of the sigmoid is highest when W^T X + b = 0.0, and its maximum value is just 0.25 (at that same point, the sigmoid has a value of 0.5)
  • The steepness of the sigmoid's slope depends on the weights, and the position of the derivative's peak depends on the bias
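The properties listed above can be checked numerically. Here is a minimal sketch (not the book's notebook) that evaluates the sigmoid and its derivative over a range of inputs:

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(z):
    """Derivative of the sigmoid: sigma(z) * (1 - sigma(z))."""
    s = sigmoid(z)
    return s * (1.0 - s)

# Evaluate on a symmetric grid that includes z = 0 exactly
z = np.linspace(-10.0, 10.0, 2001)

# sigmoid(0) = 0.5 and the derivative peaks there with value 0.25;
# all sigmoid values stay strictly inside (0, 1)
```

Treating z as W^T X + b, the derivative's peak sits where W^T X + b = 0, which matches the second bullet point above.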

I would suggest you play around with the Sigmoid_function.ipynb program available at this book's GitHub repository, to get a feel for how the sigmoid function changes as the weight and bias change.
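If you don't have the notebook handy, a quick stand-alone sketch shows the same behavior; the weight/bias pairs here are illustrative, not taken from the book:

```python
import numpy as np

def sigmoid(z):
    # Logistic (sigmoid) function
    return 1.0 / (1.0 + np.exp(-z))

x = np.linspace(-5.0, 5.0, 1001)

# A larger weight makes the S-curve steeper; the bias shifts the point
# where the output crosses 0.5, which occurs at x = -b / w.
for w, b in [(1.0, 0.0), (5.0, 0.0), (1.0, 2.0)]:
    y = sigmoid(w * x + b)
    crossing = x[np.argmin(np.abs(y - 0.5))]
    print(f"w={w:.1f}, b={b:.1f} -> crosses 0.5 near x = {crossing:.2f}")
```

Comparing the first two pairs shows the slope effect of the weight; comparing the first and third shows the bias sliding the transition point to the left.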
