
  • Machine Learning in Java
  • AshishSingh Bhatia, Bostjan Kaluza

Data transformation

Data transformation techniques transform the dataset into the format that a machine learning algorithm expects as input, and may even help the algorithm learn faster and achieve better performance; this process is also known as data munging or data wrangling. Standardization, for instance, assumes that the data follows a Gaussian distribution and transforms the values in such a way that the mean is 0 and the standard deviation is 1, as follows:

z = (x - μ) / σ

Here, μ is the mean of the attribute and σ is its standard deviation.

Normalization, on the other hand, scales the values of an attribute to a small, specified range, usually between 0 and 1:

x' = (x - min) / (max - min)
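Both transformations are straightforward to apply by hand. The following is a minimal sketch in plain Java, assuming a single numeric attribute stored in a double array (the class and method names are illustrative, not from any particular library):

```java
import java.util.Arrays;

public class ScalingSketch {

    // Standardization: z = (x - mean) / standard deviation
    static double[] standardize(double[] x) {
        double mean = Arrays.stream(x).average().orElse(0.0);
        double variance = Arrays.stream(x)
                .map(v -> (v - mean) * (v - mean))
                .average().orElse(0.0);
        double sd = Math.sqrt(variance);
        double[] z = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            z[i] = (x[i] - mean) / sd;
        }
        return z;
    }

    // Min-max normalization: x' = (x - min) / (max - min)
    static double[] normalize(double[] x) {
        double min = Arrays.stream(x).min().orElse(0.0);
        double max = Arrays.stream(x).max().orElse(1.0);
        double[] n = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            n[i] = (x[i] - min) / (max - min);
        }
        return n;
    }

    public static void main(String[] args) {
        double[] data = {2.0, 4.0, 6.0, 8.0};
        // standardize: resulting values have mean 0 and standard deviation 1
        System.out.println(Arrays.toString(standardize(data)));
        // normalize: maps the minimum to 0.0 and the maximum to 1.0
        System.out.println(Arrays.toString(normalize(data)));
    }
}
```

Note that both methods would need guards for degenerate inputs (a constant attribute makes the divisor zero); toolboxes that do this for you handle such cases internally.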

Many machine learning toolboxes automatically normalize and standardize the data for you.

The last transformation technique is discretization, which divides the range of a continuous attribute into intervals. Why should we care? Some algorithms, such as decision trees and Naive Bayes, prefer discrete attributes. The most common ways to select the intervals are as follows:

  • Equal width: The range of the continuous variable is divided into k intervals of equal width
  • Equal frequency: Supposing there are N instances, each of the k intervals contains approximately N/k instances
  • Min entropy: This approach recursively splits the intervals as long as the decrease in entropy, which measures disorder, is greater than the increase in entropy introduced by the interval split (Fayyad and Irani, 1993)

The first two methods require us to specify the number of intervals, while the last method sets the number of intervals automatically; however, it requires the class variable, which means it won't work for unsupervised machine learning tasks.
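The first two methods above can be sketched in a few lines of Java. This is an illustrative implementation, not library code: equal width computes a bin index from the attribute's range, while equal frequency sorts the instances and assigns roughly N/k consecutive instances to each bin.

```java
import java.util.Arrays;

public class DiscretizeSketch {

    // Equal width: split [min, max] into k intervals of width (max - min) / k
    static int[] equalWidth(double[] x, int k) {
        double min = Arrays.stream(x).min().orElse(0.0);
        double max = Arrays.stream(x).max().orElse(0.0);
        double width = (max - min) / k;
        int[] bins = new int[x.length];
        for (int i = 0; i < x.length; i++) {
            int b = (int) ((x[i] - min) / width);
            bins[i] = Math.min(b, k - 1); // the maximum value falls into the last bin
        }
        return bins;
    }

    // Equal frequency: sort instances by value, then give each of the
    // k bins approximately N/k consecutive instances
    static int[] equalFrequency(double[] x, int k) {
        Integer[] order = new Integer[x.length];
        for (int i = 0; i < x.length; i++) order[i] = i;
        Arrays.sort(order, (a, b) -> Double.compare(x[a], x[b]));
        int[] bins = new int[x.length];
        for (int rank = 0; rank < x.length; rank++) {
            bins[order[rank]] = Math.min(rank * k / x.length, k - 1);
        }
        return bins;
    }
}
```

For example, with k = 2, equal width puts the values {1, 2, 3, 10} into bins {0, 0, 0, 1} because 1, 2, and 3 all fall in the lower half of the range, whereas equal frequency would place two instances in each bin regardless of the gap.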
