A Note on Positive Semi-definiteness of Some Non-Pearsonian Correlation Matrices
SK Mishra Department of Economics North-Eastern Hill University Shillong, Meghalaya (India) mishrasknehu@yahoo.com

I. Introduction: A correlation matrix, ℜ, is a real and symmetric m × m matrix whose elements satisfy $-1 \le r_{ij} \le 1$ for $i, j = 1, 2, \ldots, m$, with $r_{ii} = 1$. The Pearsonian (or product-moment) correlation coefficient, e.g. $r_{12}$ between two variates $x_1$ and $x_2$, each in n observations, is given by the formula:

$$ r(x_1, x_2) = \frac{\operatorname{cov}(x_1, x_2)}{\sqrt{\operatorname{var}(x_1)\,\operatorname{var}(x_2)}} \qquad (1) $$

where $\bar{x}_a = \frac{1}{n}\sum_{k=1}^{n} x_{ka}$, $\operatorname{cov}(x_1, x_2) = \frac{1}{n}\sum_{k=1}^{n} x_{k1} x_{k2} - \bar{x}_1 \bar{x}_2$ and $\operatorname{var}(x_a) = \operatorname{cov}(x_a, x_a)$; $a = 1, 2$. A little algebra also gives us the identity:

$$ r(x_1, x_2) = \frac{\frac{1}{4}\left[\operatorname{var}(x_1 + x_2) - \operatorname{var}(x_1 - x_2)\right]}{\sqrt{\operatorname{var}(x_1)\,\operatorname{var}(x_2)}} \qquad (2) $$
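The equivalence of (1) and (2) is easy to verify numerically. The following minimal sketch (the function names are ours, not the author's) computes r both ways, using population moments as in the definitions above:

```python
import numpy as np

def pearson_r(x1, x2):
    """Pearson's r via formula (1): cov(x1, x2) / sqrt(var(x1) * var(x2))."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    cov = np.mean(x1 * x2) - np.mean(x1) * np.mean(x2)  # population covariance
    return cov / np.sqrt(np.var(x1) * np.var(x2))       # np.var: population variance

def pearson_r_identity(x1, x2):
    """Pearson's r via identity (2), using variances of the sum and difference."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    num = 0.25 * (np.var(x1 + x2) - np.var(x1 - x2))
    return num / np.sqrt(np.var(x1) * np.var(x2))
```

Both functions agree with `np.corrcoef` up to rounding, since the 1/n factors cancel in the ratio.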

The Pearsonian correlation matrix is necessarily positive semi-definite (meaning that all its eigenvalues are non-negative), since it is obtained as a quadratic form of a real data matrix X(n, m). It also implies, contrapositively, that if ℜ is not a positive semi-definite matrix, then X(n, m) is not a real matrix.

II. Robust Measures of Correlation: The Pearsonian coefficient of correlation as a measure of association between two variates is highly prone to the deleterious effects of outlier observations. Statisticians have therefore proposed a number of formulas, other than the one that obtains Pearson's coefficient, that are less affected by errors of observation, perturbation or the presence of outliers in the data. Some of them transform the variables, say $x_1$ and $x_2$, into $z_1 = \varphi_1(x_1)$ and $z_2 = \varphi_2(x_2)$, where $\varphi_a(x_a)$, $a = 1, 2$, is a linear (or nonlinear) monotonic (order-preserving) rule of transformation or mapping of $x_a$ to $z_a$; then $r(z_1, z_2)$, obtained by the appropriate formula, is taken as a robust measure of $r(x_1, x_2)$. Others use different measures of central tendency, dispersion and co-variation, such as the median for the mean, the mean deviation for the standard deviation, and so on. In what follows, we present a few formulas for obtaining different types of correlation coefficient.

II.1. Spearman's Rank Correlation Coefficient: If $x_1$ and $x_2$ are two variables, both in n observations, and $z_1 = R(x_1)$ and $z_2 = R(x_2)$ are their rank numerals, with $R(\cdot)$ as the rank-ordering rule, then Pearson's formula applied to $(z_1, z_2)$ obtains Spearman's correlation coefficient (Spearman, 1904). There is a simpler (but less general) formula for the rank correlation coefficient, given as:

$$ \rho(x_1, x_2) = r(z_1, z_2) = 1 - \frac{6\sum_{k=1}^{n}(z_{k1} - z_{k2})^2}{n(n^2 - 1)} \qquad (3) $$
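Formula (3) can be sketched directly in Python (the helper names are ours; the sketch assumes no tied observations, in which case (3) is exact):

```python
import numpy as np

def ranks(x):
    """Rank numerals 1..n (assumes distinct values; ties are not handled)."""
    x = np.asarray(x, float)
    r = np.empty(len(x))
    r[np.argsort(x)] = np.arange(1, len(x) + 1)
    return r

def spearman_rho(x1, x2):
    """Spearman's rho via formula (3): 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    z1, z2 = ranks(x1), ranks(x2)
    n = len(z1)
    return 1.0 - 6.0 * np.sum((z1 - z2) ** 2) / (n * (n ** 2 - 1))
```

In the absence of ties this coincides with Pearson's formula applied to the rank numerals.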

II.2. Signum Correlation Coefficient: Let $c_1$ and $c_2$ be measures of central tendency or location (such as the arithmetic mean or the median) of $x_1$ and $x_2$ respectively. We transform the variables to $z_{ka} = (x_{ka} - c_a)/|x_{ka} - c_a|$ if $|x_{ka} - c_a| > 0$, else $z_{ka} = 1$; $a = 1, 2$. Then $r(z_1, z_2)$ is the signum correlation coefficient (Blomqvist, 1950; Shevlyakov, 1997). Due to the special nature of the transformation, we have:

$$ r(z_1, z_2) \cong \operatorname{cov}(z_1, z_2) = \frac{1}{n}\sum_{i=1}^{n} z_{i1} z_{i2} \qquad (4) $$
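A minimal sketch of (4), with the median as the location measure (the function name is ours):

```python
import numpy as np

def signum_r(x1, x2):
    """Blomqvist's signum correlation, formula (4), with the median as c_a.
    Observations falling exactly on the median get z = 1, following the
    convention in the text."""
    z = []
    for x in (np.asarray(x1, float), np.asarray(x2, float)):
        d = x - np.median(x)
        z.append(np.where(np.abs(d) > 0, np.sign(d), 1.0))
    return float(np.mean(z[0] * z[1]))
```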

In this study we will use the median as the measure of central tendency to obtain signum correlation coefficients.

II.3. Kendall's Tau: If $x_1$ and $x_2$ are two variables, both in n observations, and $z_1 = R(x_1)$ and $z_2 = R(x_2)$ are their rank numerals, with $R(\cdot)$ as the rank-ordering rule, we define, over the pairs $(k, l)$, $c_{kl} = 1$ if ($z_{k1} > z_{l1}$ and $z_{k2} > z_{l2}$), else $c_{kl} = 0$; $k, l = 1, 2, \ldots, n$. Then

$$ \tau(z_1, z_2) = -1 + \frac{4\sum_{k,l} c_{kl}}{n^2 - n} \qquad (5) $$

is a measure of association called Kendall's Tau.
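A direct sketch of (5), summing the concordance indicator over all ordered pairs (the function name is ours; concordance of the ranks equals concordance of the raw values):

```python
import numpy as np
from itertools import permutations

def kendall_tau(x1, x2):
    """Kendall's tau via formula (5): tau = -1 + 4*C / (n^2 - n), where C
    counts ordered pairs (k, l) with z_k1 > z_l1 and z_k2 > z_l2; each
    concordant unordered pair is counted exactly once."""
    z1, z2 = np.asarray(x1, float), np.asarray(x2, float)
    n = len(z1)
    C = sum(1 for k, l in permutations(range(n), 2)
            if z1[k] > z1[l] and z2[k] > z2[l])
    return -1.0 + 4.0 * C / (n * n - n)
```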

II.4. Bradley's Absolute Correlation Coefficient: Bradley (1985) showed that if $(u_k, v_k)$, $k = 1, \ldots, n$, are n pairs of values such that the variables u and v have the same median (= 0) and the same mean deviation from the median, $\frac{1}{n}\sum_{k=1}^{n} |u_k| = \frac{1}{n}\sum_{k=1}^{n} |v_k| = d \neq 0$ (both of which conditions may be met by any pair of variables when suitably transformed), then the absolute correlation may be defined as:

$$ \rho(u, v) = \frac{\sum_{k=1}^{n}\left(|u_k + v_k| - |u_k - v_k|\right)}{\sum_{k=1}^{n}\left(|u_k + v_k| + |u_k - v_k|\right)} \qquad (6) $$
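Formula (6) is straightforward once u and v have been brought to a common zero median and common mean deviation; the transformation helper below is our illustration of the "suitable transformation" mentioned above:

```python
import numpy as np

def bradley_transform(x):
    """Centre on the median and scale by the mean absolute deviation from it,
    so that the preconditions of formula (6) hold."""
    d = np.asarray(x, float) - np.median(x)
    return d / np.mean(np.abs(d))

def bradley_abs_r(u, v):
    """Bradley's absolute correlation, formula (6)."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    s, t = np.abs(u + v), np.abs(u - v)
    return float(np.sum(s - t) / np.sum(s + t))
```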

II.5. Shevlyakov Correlation Coefficient: Hampel et al. (1986) defined the median of absolute deviations (from the median), $s_H(x_a) = \operatorname{med}_k |x_{ka} - \operatorname{med}_k(x_{ka})|$; $a = 1, 2$, which is a very robust measure of scale. Using this measure, Shevlyakov (1997) defined the median correlation:

$$ r_{med} = \frac{\operatorname{med}^2|u| - \operatorname{med}^2|v|}{\operatorname{med}^2|u| + \operatorname{med}^2|v|} \qquad (7) $$

where u and v are given by $u_k = (x_{k1} - \operatorname{med}(x_1))/s_H(x_1) + (x_{k2} - \operatorname{med}(x_2))/s_H(x_2)$ and $v_k = (x_{k1} - \operatorname{med}(x_1))/s_H(x_1) - (x_{k2} - \operatorname{med}(x_2))/s_H(x_2)$; $k = 1, 2, \ldots, n$.
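A minimal sketch of $r_{med}$, formula (7) (the helper is ours; it assumes $s_H(x_a) > 0$):

```python
import numpy as np

def shevlyakov_rmed(x1, x2):
    """Shevlyakov's median correlation, formula (7), built from the robust
    scale s_H: the median of absolute deviations from the median."""
    def standardise(x):
        x = np.asarray(x, float)
        med = np.median(x)
        s_H = np.median(np.abs(x - med))  # assumed > 0
        return (x - med) / s_H
    z1, z2 = standardise(x1), standardise(x2)
    mu = np.median(np.abs(z1 + z2)) ** 2  # med^2 |u|
    mv = np.median(np.abs(z1 - z2)) ** 2  # med^2 |v|
    return (mu - mv) / (mu + mv)
```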
III. Are Robust Correlation Matrices Positive Semi-definite? In this study we investigate whether the correlation matrix ℜ, whose elements $r_{ij}$ are robust measures of correlation (obtained by formulas such as Spearman's, Blomqvist's, Kendall's, Bradley's or Shevlyakov's), is positive semi-definite. We use the dataset given in Table-1 as the base data for our experiments. We have carried out ten thousand experiments for each method (Spearman's ρ, Blomqvist's signum, Kendall's tau, Bradley's absolute r, and Shevlyakov's $r_{med}$) of computing the robust correlations $r_{ij} \in \Re$. In each experiment, the base data (Table-1) have been perturbed by a small quantity (between -0.25 to 2.5), generated randomly (and distributed uniformly) over the n observations on all eight variables. In each experiment, the correlation matrix ℜ has been computed and its eigenvalues obtained. If any eigenvalue (at least one) of the correlation matrix has been found to be negative, the occurrence has been counted as a failure (of the matrix to be positive semi-definite); otherwise it has been counted as a success. Three such sets of experiments have been carried out with three different seeds for generating the random numbers (for perturbation).
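The experimental design just described can be sketched as follows (a minimal illustration, not the original program: the trial count, noise scale and eigenvalue tolerance here are our own choices):

```python
import numpy as np

def psd_failure_rate(base, corr_fn, trials=100, scale=0.25, seed=13317):
    """Perturb the base data with small uniform noise, build the m x m
    correlation matrix element-wise with corr_fn, and count how often its
    smallest eigenvalue is negative (a failure of positive semi-definiteness)."""
    rng = np.random.default_rng(seed)
    n, m = base.shape
    failures = 0
    for _ in range(trials):
        data = base + rng.uniform(-scale, scale, size=(n, m))
        R = np.eye(m)
        for i in range(m):
            for j in range(i + 1, m):
                R[i, j] = R[j, i] = corr_fn(data[:, i], data[:, j])
        if np.linalg.eigvalsh(R).min() < -1e-10:
            failures += 1
    return failures / trials
```

Used with a Pearsonian corr_fn, the failure rate is zero, as Section I predicts.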
Table-1: Base Dataset for Computation of Robust Correlation Matrices by Different Methods

SL. No.  x1  x2  x3  x4  x5  x6  x7  x8
      1  18   3   1   2   3   3   2   7
      2  17  17   6   6  13  10  16  16
      3   6   1   6   1  10  16   5   7
      4   5   4   2   1   3   2   6   1
      5  21  10   8   3   7   7  12   8
      6  14  20   7   3  14  16  19  14
      7   3  15  12   1   2  15  10   9
      8   4  13   1   1   1   6   8   6
      9  18  16   4   1   1   4   5   5
     10   4  14   8   4   3  16  12  14
     11  17  14   8   9  15  11  20  13
     12   3   9   4   6   4   4   4   6
     13   7   5   5   2  12   9  13  10
     14  12   6   6   3   2   8   9   8
     15   1   5   3   4  12  15  12  11
     16  11   1   7   1   3   2   3   1
     17   9  12   6   8  12  16  20  16
     18  16   5   3   1   6   3   3   7
     19  10  11   7  10   8  14  13  15
     20   8   2   9   4   9  11   7   2
     21  13   3   9   1   7   5   3   9
     22  15   4   7   5   6  15  12  15
     23  17  16  11   5   9  10  10  12
     24   2   7   5   1   3   1   1   1
     25   3   7   3   1   2  12   4   8
     26  20  19   4   4   8  13  14  10
     27  19  18   6   2  16  14  19  12
     28   3  14   9   3  11   5  10   3
     29   8   2   7   1  10   4   2   1
     30  16  21  11   9  10  18  18  17
     31   5  10   4   3  12   2  11   6
     32  21  17   9   8  11  13  15  10
     33  14   8   4   3   5   6  10  13
     34   9   6   1   1   2  10   8   4
     35  19  16   7   2   1   6   7   9
     36  19  15  10   7   4  17  17  15
     37  22   5   7   1   6   3   4   6
Note: This dataset has been perturbed in our experiments.
IV. Results, Discussion and Conclusion: Our findings reveal that Spearman's rho, Blomqvist's signum, Kendall's tau and Bradley's absolute correlation formulas yield positive semi-definite correlation matrices (without any failure). Of these, the positive semi-definiteness of the matrices based on the first three measures (Spearman's rho, Blomqvist's signum and Kendall's tau) is expected, since they are Pearsonian correlation matrices of transformed variables. However, the failure rate of Shevlyakov's formula is very high: about 81 percent (more exactly 81.47%, 80.94% and 81.41% for the three random number seeds 13317, 31921 and 17523, respectively). Two sample correlation matrices (and their eigenvalues) are presented in Table-2.1 and Table-2.2; the smallest eigenvalue very often turns out to be negative.

Table-2.1. Shevlyakov's Robust Correlation Matrix and its Eigenvalues (Sample-1)
Variable      x1       x2       x3       x4       x5       x6       x7       x8
x1       1.00000  0.59053  0.27080  0.13168  0.20532  0.10974  0.41999  0.40193
x2       0.59053  1.00000  0.34953  0.65555  0.56049  0.58320  0.85656  0.66425
x3       0.27080  0.34953  1.00000  0.19778  0.21678  0.22393  0.25129  0.34782
x4       0.13168  0.65555  0.19778  1.00000  0.47509  0.66117  0.83071  0.60398
x5       0.20532  0.56049  0.21678  0.47509  1.00000  0.67361  0.67501  0.39314
x6       0.10974  0.58320  0.22393  0.66117  0.67361  1.00000  0.79762  0.70722
x7       0.41999  0.85656  0.25129  0.83071  0.67501  0.79762  1.00000  0.74020
x8       0.40193  0.66425  0.34782  0.60398  0.39314  0.70722  0.74020  1.00000
Eigenvalues (in descending order): 4.62370, 1.17033, 0.79635, 0.61840, 0.42434, 0.16789, 0.14943, 0.04956

Table-2.2. Shevlyakov's Robust Correlation Matrix and its Eigenvalues (Sample-2)

Variable      x1       x2       x3       x4       x5       x6       x7       x8
x1       1.00000  0.58154  0.36318  0.28763  0.20059 -0.01382  0.43710  0.45526
x2       0.58154  1.00000  0.48691  0.65733  0.46806  0.50631  0.86147  0.63320
x3       0.36318  0.48691  1.00000  0.32810  0.17847  0.44367  0.23434  0.43348
x4       0.28763  0.65733  0.32810  1.00000  0.42870  0.60102  0.85698  0.62314
x5       0.20059  0.46806  0.17847  0.42870  1.00000  0.61081  0.67040  0.41083
x6      -0.01382  0.50631  0.44367  0.60102  0.61081  1.00000  0.86103  0.73863
x7       0.43710  0.86147  0.23434  0.85698  0.67040  0.86103  1.00000  0.69714
x8       0.45526  0.63320  0.43348  0.62314  0.41083  0.73863  0.69714  1.00000
Eigenvalues (in descending order): 4.67015, 1.20149, 0.83475, 0.58967, 0.42884, 0.25716, 0.13572, -0.11778

Table-2.3. Nearest PSD Shevlyakov's Correlation Matrix and its Eigenvalues (from Table-2.2)

Variable     x1      x2      x3      x4      x5      x6      x7      x8
x1       1.0000  0.5915  0.3543  0.2958  0.2016  0.0056  0.4087  0.4499
x2       0.5915  1.0000  0.4766  0.6668  0.4692  0.5287  0.8286  0.6270
x3       0.3543  0.4766  1.0000  0.3196  0.1775  0.4236  0.2638  0.4391
x4       0.2958  0.6668  0.3196  1.0000  0.4296  0.6195  0.8300  0.6180
x5       0.2016  0.4692  0.1775  0.4296  1.0000  0.6130  0.6672  0.4102
x6       0.0056  0.5287  0.4236  0.6195  0.6130  1.0000  0.7969  0.7265
x7       0.4087  0.8286  0.2638  0.8300  0.6672  0.7969  1.0000  0.7150
x8       0.4499  0.6270  0.4391  0.6180  0.4102  0.7265  0.7150  1.0000
Eigenvalues (in descending order): 4.64125994, 1.18540099, 0.816828682, 0.585393293, 0.420774306, 0.243645189, 0.106686193, 1.14002254E-005

The observed failure rate of Shevlyakov's correlation matrix raises the question whether it can be used directly for further analysis. While the product-moment correlation coefficient (of Karl Pearson) is highly sensitive to outliers, the robust nature of Shevlyakov's correlation coefficient is attractive. Unfortunately, its robustness goes along with its being extremely prone to non-positive semi-definiteness, which makes it unsuitable for multivariate analysis. In view of these findings, it is safer to use Spearman's ρ, Blomqvist's signum, Kendall's tau or Bradley's absolute r rather than Shevlyakov's correlation coefficient for constructing correlation matrices for any further analysis. This difficulty may, of course, be overcome in two steps: first obtaining the correlation matrix by Shevlyakov's formula and then, if the matrix is found to be non-positive semi-definite, converting it to the nearest positive semi-definite matrix. A number of methods are available for this purpose, among which Rebonato and Jäckel (1999), Higham (2002), Mishra (2004) and Mishra (2008-a, 2008-b) may be of some relevance. While the methods of Rebonato & Jäckel and Higham define nearness by the L2 Minkowski-type norm, Mishra permits any of the L1, L2 or L∞ Minkowski-type norms. As an exercise, we obtain the L2-nearest positive semi-definite (PSD) correlation matrix from the matrix presented in Table-2.2 and present it in Table-2.3. It may be noted that the corresponding correlation coefficients (elements of the matrix in Table-2.2 vis-à-vis those in Table-2.3) are very close and statistically not different from each other. Hence, the PSD matrix in Table-2.3 may be used for further analysis.
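A much simpler repair than computing the exact L2-nearest matrix is a one-shot eigenvalue clipping; the sketch below is our illustration, not Higham's alternating-projection algorithm, though it usually lands close to the L2-nearest solution:

```python
import numpy as np

def clip_to_psd_correlation(R, eps=0.0):
    """One-shot repair: clip negative eigenvalues to eps, reconstruct, and
    rescale so the diagonal is again unity. Not the exact L2-nearest
    correlation matrix (see Higham, 2002), but often close to it."""
    w, V = np.linalg.eigh((R + R.T) / 2.0)     # symmetrise defensively
    R_psd = (V * np.clip(w, eps, None)) @ V.T  # V diag(max(w, eps)) V'
    d = np.sqrt(np.diag(R_psd))
    R_psd = R_psd / np.outer(d, d)             # restore unit diagonal
    np.fill_diagonal(R_psd, 1.0)
    return R_psd
```

Rescaling after the clip keeps the result a valid correlation matrix, since a congruence transform of a PSD matrix remains PSD.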

References
• Blomqvist, N. (1950) "On a Measure of Dependence between Two Random Variables", Annals of Mathematical Statistics, 21(4): 593-600.
• Bradley, C. (1985) "The Absolute Correlation", The Mathematical Gazette, 69(447): 12-17.
• Hampel, F.R., Ronchetti, E.M., Rousseeuw, P.J. and Stahel, W.A. (1986) Robust Statistics: The Approach Based on Influence Functions, Wiley, New York.
• Higham, N.J. (2002) "Computing the Nearest Correlation Matrix – A Problem from Finance", IMA Journal of Numerical Analysis, 22: 329-343.
• Mishra, S.K. (2004) "Optimal Solution of the Nearest Correlation Matrix Problem by Minimization of the Maximum Norm", available at SSRN: http://ssrn.com/abstract=573241
• Mishra, S.K. (2008-a) "A Note on Solution of the Nearest Correlation Matrix Problem by von Neumann Matrix Divergence", available at SSRN: http://ssrn.com/abstract=1106882
• Mishra, S.K. (2008-b) "The Nearest Correlation Matrix Problem: Solution by Differential Evolution Method of Global Optimization", Journal of Quantitative Economics, New Series, 6(1&2): 240-262.
• Rebonato, R. and Jäckel, P. (1999) "The Most General Methodology to Create a Valid Correlation Matrix for Risk Management and Option Pricing Purposes", Quantitative Research Centre, NatWest Group, http://www.rebonato.com/CorrelationMatrix.pdf
• Shevlyakov, G.L. (1997) "On Robust Estimation of a Correlation Coefficient", Journal of Mathematical Sciences, 83(3): 434-438.
• Spearman, C. (1904) "The Proof and Measurement of Association between Two Things", American Journal of Psychology, 15: 88-93.
