Comparing scaled data using the Pearson correlation coefficient
Another way to measure how closely two items are related is to examine their individual trends. For example, two items that both show an upward trend are strongly related, and so are two items that both show a downward trend. To keep the algorithm simple, we only consider linear trends. This measure of correlation is called the Pearson correlation coefficient. The closer the coefficient is to zero, the weaker the linear relationship between the two data sets.
The Pearson correlation coefficient for a sample is calculated using the following formula:
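r = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{\sqrt{\left(n \sum x_i^2 - \left(\sum x_i\right)^2\right)\left(n \sum y_i^2 - \left(\sum y_i\right)^2\right)}}

Here, n is the number of paired values and each sum runs over the two samples. The pearson function defined in the steps below computes each of these sums directly.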
How to do it...
Create a new file, which we will call Main.hs, and perform the following steps:
Implement main to compute the correlation coefficient between two lists of numbers:
main :: IO ()
main = do
  let d1 = [3,3,3,4,4,4,5,5,5]
  let d2 = [1,1,2,2,3,4,4,5,5]
  let r = pearson d1 d2
  print r
Define the function to compute the Pearson coefficient:
pearson :: [Double] -> [Double] -> Double
pearson xs ys = (n * sumXY - sumX * sumY) /
                sqrt ((n * sumX2 - sumX * sumX) *
                      (n * sumY2 - sumY * sumY))
  where
    n     = fromIntegral (length xs)
    sumX  = sum xs
    sumY  = sum ys
    sumX2 = sum $ zipWith (*) xs xs   -- sum of squared x values
    sumY2 = sum $ zipWith (*) ys ys   -- sum of squared y values
    sumXY = sum $ zipWith (*) xs ys   -- sum of pairwise products
Run the code to print the coefficient:
$ runhaskell Main.hs
0.9128709291752768
How it works...
The Pearson correlation coefficient measures the degree of linear relationship between two variables, and its value always lies between -1 and 1. The magnitude of the coefficient describes how strongly the variables are related: values near -1 or 1 indicate a nearly perfect linear relationship, while values near zero indicate little or no linear relationship. If the coefficient is positive, the two variables increase and decrease together; if it is negative, one variable increases as the other decreases.
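To see the sign of the coefficient in action, the following sketch adds a hypothetical helper, checkSign, to Main.hs. It reuses the recipe's pearson function and sample data, reversing one list so that the two series trend in opposite directions:
-- Hypothetical helper (not part of the recipe) demonstrating how the sign
-- of the coefficient follows the direction of the trend.
checkSign :: IO ()
checkSign = do
  let d1 = [3,3,3,4,4,4,5,5,5]
      d2 = [1,1,2,2,3,4,4,5,5]
  print (pearson d1 d2)            -- both series trend upward: positive result
  print (pearson d1 (reverse d2))  -- opposite trends: negative result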