
Differential calculus

At the core of calculus lie derivatives. The derivative is defined as the instantaneous rate of change of a given function with respect to one of its variables, and the study of finding derivatives is known as differentiation. Geometrically, the derivative at a point is given by the slope of the tangent line to the graph of the function, provided that the derivative exists and is defined at that point.

Differentiation is the reverse of integration. It has several applications: in physics, for example, the derivative of displacement is velocity, and the derivative of velocity is acceleration. Derivatives are also widely used to find the maxima and minima of a function.
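As a quick illustration, the following SymPy sketch differentiates a sample displacement function, s(t) = g·t²/2 (an illustrative choice, motion under constant acceleration g), to recover velocity and acceleration:

```python
import sympy as sp

t, g = sp.symbols('t g')

# Illustrative displacement under constant acceleration g: s(t) = g * t^2 / 2
s = g * t**2 / 2

v = sp.diff(s, t)   # velocity is the derivative of displacement: g*t
a = sp.diff(v, t)   # acceleration is the derivative of velocity: g
print(v, a)         # prints: g*t g
```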

Within machine learning, we deal with functions that operate on variables or features with hundreds or more dimensions. We calculate the derivative of the function in each dimension of the variable and combine these partial derivatives into a vector, which gives us what is called the gradient. Similarly, taking the second-order derivatives of the gradient gives us a matrix termed the Hessian.

Knowledge of the gradient and the Hessian helps us define quantities such as the direction of descent and the rate of descent, which tell us how we should travel through our function space in order to reach the bottom-most point, that is, to minimize the function.
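To make these quantities concrete, here is a minimal NumPy sketch that estimates the gradient and Hessian of a toy two-dimensional function with finite differences and uses the negative gradient as a descent direction; the toy function and step sizes are illustrative choices:

```python
import numpy as np

def f(x):
    # Toy objective: f(x) = x0^2 + 3*x1^2, with its minimum at the origin.
    return x[0]**2 + 3.0 * x[1]**2

def numerical_gradient(f, x, eps=1e-5):
    # Central-difference estimate of the partial derivative in each dimension.
    grad = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        grad[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad

def numerical_hessian(f, x, eps=1e-4):
    # Differencing the gradient in each dimension gives the Hessian matrix.
    n = x.size
    hess = np.zeros((n, n))
    for i in range(n):
        step = np.zeros_like(x)
        step[i] = eps
        hess[:, i] = (numerical_gradient(f, x + step)
                      - numerical_gradient(f, x - step)) / (2 * eps)
    return hess

x = np.array([1.0, 2.0])
g = numerical_gradient(f, x)
print("gradient:", g)                          # approximately [2, 12]
print("Hessian:\n", numerical_hessian(f, x))   # approximately [[2, 0], [0, 6]]
print("descent direction:", -g)                # moving against the gradient decreases f
```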

The following is an example of a simple objective function (linear regression with weights x, N data points, and D dimensions) in vectorized notation:

f(x) = (1/2N)‖Ax − y‖²

Here, A is the N × D matrix of data points and y is the vector of N target values.
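A minimal NumPy sketch of this objective, assuming a synthetic data matrix A and target vector y, computes its gradient, (1/N)·Aᵀ(Ax − y), and its constant Hessian, (1/N)·AᵀA, and runs a few gradient-descent steps:

```python
import numpy as np

# Synthetic data for illustration: N = 100 points, D = 3 features.
rng = np.random.default_rng(0)
N, D = 100, 3
A = rng.normal(size=(N, D))
y = A @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=N)

def objective(x):
    # f(x) = (1 / 2N) * ||A x - y||^2, the vectorized least-squares objective.
    r = A @ x - y
    return (r @ r) / (2 * N)

def gradient(x):
    # grad f(x) = (1 / N) * A^T (A x - y): one partial derivative per dimension of x.
    return A.T @ (A @ x - y) / N

def hessian():
    # The Hessian of this objective is constant: (1 / N) * A^T A.
    return A.T @ A / N

# A few gradient-descent steps: move against the gradient to reduce the objective.
x = np.zeros(D)
for _ in range(200):
    x -= 0.1 * gradient(x)
print("learned weights:", x)
print("objective value:", objective(x))
```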

The method of Lagrange multipliers is a standard calculus technique for maximizing or minimizing a function when constraints are involved.
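As a quick illustration, the following SymPy sketch applies the method to a toy problem, maximizing f(x, y) = xy subject to x + y = 1, by solving the stationarity conditions of the Lagrangian:

```python
import sympy as sp

x, y, lam = sp.symbols('x y lambda')

# Toy constrained problem: maximize f(x, y) = x*y subject to x + y = 1.
f = x * y
g = x + y - 1

# Lagrangian: L = f - lambda * g; stationary points satisfy grad L = 0.
L = f - lam * g
equations = [sp.diff(L, v) for v in (x, y, lam)]
solution = sp.solve(equations, (x, y, lam), dict=True)
print(solution)  # [{x: 1/2, y: 1/2, lambda: 1/2}]
```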
