The Non-Autocorrelation Assumption in Linear Regression

Hello, friends! ^__^ Thank you for visiting this humble site of mine.

I hope you all stay healthy and successful!

On this occasion, there are two questions to address:

 [1] Question:

 Does the non-autocorrelation assumption need to be tested when using linear regression?

 [2] Question:

 What are the consequences if the non-autocorrelation assumption is not satisfied when using linear regression?

These two questions will be answered directly by the experts themselves, namely:

  1. Andy Field, in his book “Discovering Statistics Using SPSS, 3rd Edition”
  2. James P. Stevens, in his book “Applied Multivariate Statistics for the Social Sciences, 5th Edition”
  3. Hair et al., in their book “Multivariate Data Analysis, 7th Edition”
  4. Damodar Gujarati, in his book “Basic Econometrics, 4th Edition”

Okay, let's take a look at what each of them has to say. ^__^

[1] Explanation according to Andy Field

Field (2009:220) states the following.

 Independent errors: For any two observations the residual terms should be uncorrelated (or independent). This eventuality is sometimes described as a lack of autocorrelation. This assumption can be tested with the Durbin-Watson test, which tests for serial correlations between errors. Specifically, it tests whether adjacent residuals are correlated. The test statistic can vary between 0 and 4 with a value of 2 meaning that the residuals are uncorrelated. A value greater than 2 indicates a negative correlation between adjacent residuals, whereas a value below 2 indicates a positive correlation. The size of the Durbin-Watson statistic depends upon the number of predictors in the model and the number of observations. For accuracy, you should look up the exact acceptable values in Durbin and Watson’s (1951) original paper. As a very conservative rule of thumb, values less than 1 or greater than 3 are definitely cause for concern; however, values closer to 2 may still be problematic depending on your sample model.

From the passage above, we can draw a bit of information: the non-autocorrelation assumption does need to be satisfied/tested when using linear regression. To detect whether autocorrelation is present, the Durbin-Watson test statistic can be used.
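Field's description of the statistic can be put into a short sketch. This is only a minimal illustration, not SPSS output: the `durbin_watson` helper and the simulated residuals below are assumptions made for the example. The function simply computes the ratio Field describes, so independent residuals should give a value near 2.

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: the sum of squared successive
    differences of the residuals divided by their sum of squares.
    It ranges from 0 to 4; a value near 2 suggests no first-order
    autocorrelation, values toward 0 suggest positive correlation,
    and values toward 4 suggest negative correlation."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Simulated residuals with no serial pattern -> statistic near 2
rng = np.random.default_rng(0)
e_indep = rng.normal(size=500)
print(round(durbin_watson(e_indep), 2))  # close to 2 for uncorrelated residuals
```

As Field notes, the exact critical values depend on the number of predictors and observations, so in practice you would still compare the statistic against the Durbin-Watson tables rather than rely on the 1-to-3 rule of thumb alone.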

[2] Explanation according to James P. Stevens

Stevens (2009:90) states the following.

 Recall that in linear regression model it is assumed that the errors are independent (NON-AUTOCORRELATION) and follow a normal distribution with constant variance. The normality assumption can be checked through use of the histogram of the standardized or studentized residuals, as we did in Table 3.2 for the simple linear regression example. The independence assumption implies that the subjects are responding independently of one another. This is an important assumption.

From the passage above, we can draw a bit of information: the non-autocorrelation assumption is an important assumption in linear regression, and it needs to be satisfied or tested.

[3] Explanation according to Hair et al.

Hair et al. (2009:217) state the following.

 Independence of the Error Terms. We assume in regression that each predicted value is independent, which means that the predicted value is not related to any other prediction; that is, they are not sequenced by any variable. We can best identify such an occurrence by plotting the residuals against any possible sequencing variable. If the residuals are independent, the pattern should appear random and similar to the null plot of residuals. Violations will be identified by a consistent pattern in the residuals.

From the passage above, we can again conclude that the non-autocorrelation assumption is an important assumption in linear regression, and it needs to be satisfied or tested.
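Hair et al.'s suggestion of plotting the residuals against a possible sequencing variable can be sketched as follows. The data, the model, and the sign-change check are all assumptions made for illustration: with independent residuals, adjacent residuals should change sign roughly half the time, which is a crude numeric stand-in for eyeballing a null plot of residuals against observation order.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = np.linspace(0, 10, n)
y = 3.0 + 2.0 * x + rng.normal(scale=1.0, size=n)  # true line plus noise

# Fit a simple linear regression and compute the residuals
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

# In practice you would plot `residuals` against the observation
# order and look for a consistent pattern. A crude numeric proxy:
# count how often adjacent residuals change sign; independent
# residuals change sign about half the time.
sign_changes = np.sum(np.diff(np.sign(residuals)) != 0)
print(round(sign_changes / (n - 1), 2))  # near 0.5 if the pattern looks random
```

A long run of residuals with the same sign (a sign-change fraction well below 0.5) would be exactly the kind of "consistent pattern" Hair et al. warn about.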

[4] Explanation according to Gujarati

Gujarati (2004:488) states the following.

[1] If the assumption of the classical linear regression model that the errors or disturbances entering into the population regression function (PRF) are random or uncorrelated is violated, the problem of serial or autocorrelation arises.

 [2] Autocorrelation can arise for several reasons, such as inertia or sluggishness of economic time series, specification bias resulting from excluding important variables from the model or using incorrect functional form, the cobweb phenomenon, data massaging, and data transformation. As a result, it is useful to distinguish between pure autocorrelation and “induced” autocorrelation because of one or more factors just discussed.

Although in the presence of autocorrelation the OLS estimators remain unbiased, consistent, and asymptotically normally distributed, they are no longer efficient. As a consequence, the usual t, F, and χ² tests cannot be legitimately applied. Hence, remedial measures may be called for.

From the passage above, we can extract the following information:

 [1] Non-autocorrelation is one of the assumptions of linear regression.

[2] The non-autocorrelation assumption can also be stated as: the errors (disturbances) are uncorrelated with one another, i.e. random.

[3] When the non-autocorrelation assumption is violated, the resulting condition is called autocorrelation.

[4] When autocorrelation occurs, the estimators produced by ordinary least squares (OLS) remain unbiased, consistent, and asymptotically normally distributed, but they are no longer efficient. As a consequence, the usual t, F, and χ² tests are no longer legitimate/valid.
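Gujarati's point can be illustrated with a small simulation, assuming (purely for illustration) disturbances that follow an AR(1) process with ρ = 0.8. The OLS slope still lands near its true value, consistent with unbiasedness, but the Durbin-Watson statistic falls well below 2, flagging the positive autocorrelation.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 400
rho = 0.8  # AR(1) coefficient of the disturbances (assumed for illustration)

# Generate AR(1) errors: u_t = rho * u_{t-1} + eps_t
eps = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]

x = np.linspace(0, 10, n)
y = 1.0 + 0.5 * x + u  # true intercept 1.0, true slope 0.5

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

# Durbin-Watson statistic; positive autocorrelation pushes it toward 0
dw = np.sum(np.diff(residuals) ** 2) / np.sum(residuals ** 2)
print("slope:", round(slope, 2), "DW:", round(dw, 2))
```

The slope estimate is near 0.5, but the DW value far below 2 is the signal that the usual OLS standard errors, and hence the usual t and F tests, should not be trusted here.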

So yes, the non-autocorrelation assumption DOES need to be tested!

Okay, that's all for now. I hope these explanations from the experts are useful for all of us!

Please feel free to correct me if there are any mistakes or shortcomings. ^__^