
Summary

In this chapter, we covered quite a lot of ground, didn't we?

In short, we learned a lot about different supervised learning algorithms, how to apply them to real datasets, and how to implement everything in OpenCV. We introduced classification algorithms such as k-NN and logistic regression and discussed how they can be used to predict labels belonging to two or more discrete categories. We introduced several variants of linear regression (such as Lasso regression and ridge regression) and discussed how they can be used to predict continuous variables. Last but not least, we got acquainted with the Iris and Boston datasets, two classics in the history of machine learning.
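As a brief recap of that workflow, here is a minimal sketch of training a k-NN classifier on the Iris dataset with OpenCV's ml module. It assumes scikit-learn is available for loading and splitting the data; the choice of k=3 and the 80/20 split are purely illustrative:

```python
import numpy as np
import cv2
from sklearn import datasets
from sklearn.model_selection import train_test_split

# Load the Iris dataset; OpenCV expects 32-bit floats for the samples
iris = datasets.load_iris()
X = iris.data.astype(np.float32)
y = iris.target.astype(np.int32)

# Hold out 20% of the data for testing (illustrative split)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Train a k-NN classifier using OpenCV's ml module
knn = cv2.ml.KNearest_create()
knn.train(X_train, cv2.ml.ROW_SAMPLE, y_train)

# Classify the test samples using their three nearest neighbors
_, y_pred, _, _ = knn.findNearest(X_test, 3)
print("Test accuracy:", np.mean(y_pred.ravel() == y_test))
```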

In the following chapters, we will go into much greater depth on these topics and see some more interesting examples of where these concepts can be useful.

But first, we need to talk about another essential topic of machine learning: feature engineering. Often, data does not come in nicely formatted datasets, and it is our responsibility to represent the data in a meaningful way. Therefore, the next chapter will be all about representing features and engineering data.
