Overcoming the Autocorrelation Problem in Data Analysis

Several approaches can be used when autocorrelation is present in a regression analysis: adding independent variables to the model, transforming the variables, and autoregression.

  • Addition of Independent Variables Often the reason autocorrelation occurs in regression analyses is that one or more important predictor variables have been left out of the analysis. For example, suppose a researcher develops a regression forecasting model that attempts to predict sales of new homes by sales of used homes over some period of time. Such a model might contain significant autocorrelation.

    The exclusion of the variable “prime mortgage interest rate” might be a factor driving the autocorrelation between the other two variables. Adding this variable to the regression model might significantly reduce the autocorrelation.
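One way to check whether an added variable actually helps is to compare the Durbin-Watson statistic of the residuals before and after: values near 2 indicate little first-order autocorrelation, while values near 0 indicate strong positive autocorrelation. The sketch below applies this to the home-sales example using plain NumPy; the data, coefficients, and variable names are synthetic and purely illustrative.

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: ~2 means no first-order autocorrelation,
    values near 0 indicate strong positive autocorrelation."""
    diffs = np.diff(residuals)
    return np.sum(diffs ** 2) / np.sum(residuals ** 2)

def ols_residuals(X, y):
    """Residuals from an ordinary least-squares fit (intercept added)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return y - X1 @ beta

# Hypothetical data: new-home sales driven by used-home sales AND a
# trending interest rate; omitting the rate leaves a trend in the residuals.
rng = np.random.default_rng(0)
n = 60
rate = np.linspace(8.0, 4.0, n)           # prime mortgage rate (trending down)
used = 100 + rng.normal(0, 5, n)          # used-home sales
new = 50 + 0.4 * used - 6.0 * rate + rng.normal(0, 2, n)

dw_without = durbin_watson(ols_residuals(used, new))
dw_with = durbin_watson(ols_residuals(np.column_stack([used, rate]), new))
print(dw_without, dw_with)  # dw_with should sit much closer to 2
```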

  • Transforming Variables When the inclusion of additional variables is not helpful in reducing autocorrelation to an acceptable level, transforming the data in the variables may help to solve the problem. One such method is the first-differences approach. With the first-differences approach, each value of X is subtracted from each succeeding time period value of X; these “differences” become the new and transformed X variable.

    The same process is used to transform the Y variable. The regression analysis is then computed on the transformed X and Y variables, yielding a new model that is, ideally, free of significant autocorrelation effects. Another approach is to generate new variables from the period-to-period percentage changes and regress these new variables. A third is to use autoregression models.
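The first-differences and percentage-change transformations described above can be sketched in a few lines of NumPy; the series below is hypothetical.

```python
import numpy as np

# Hypothetical quarterly series for X and Y (e.g. sales figures)
x = np.array([112.0, 118.0, 121.0, 130.0, 128.0, 135.0, 141.0])
y = np.array([45.0, 47.0, 49.0, 53.0, 52.0, 55.0, 58.0])

# First differences: each value minus the preceding period's value;
# one observation is lost in the process.
dx = np.diff(x)          # x[t] - x[t-1]
dy = np.diff(y)

# Alternative transformation: period-to-period percentage changes
pct_x = np.diff(x) / x[:-1] * 100

# Regress the transformed Y on the transformed X (simple OLS with intercept)
X1 = np.column_stack([np.ones(len(dx)), dx])
b, *_ = np.linalg.lstsq(X1, dy, rcond=None)
print(b)  # intercept and slope of the differenced model
```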

  • Autoregression Autoregression is a forecasting technique that takes advantage of the relationship of values (Yt) to previous-period values (Yt-1, Yt-2, Yt-3, . . .). It is a multiple regression technique in which the independent variables are time-lagged versions of the dependent variable: we try to predict a value of Y from values of Y in previous time periods. The independent variable can be lagged for one, two, three, or more time periods. An autoregressive model containing independent variables for three time periods looks like this:

Ŷ = b0 + b1Yt-1 + b2Yt-2 + b3Yt-3

Autoregression can be a useful tool in locating seasonal or cyclical effects in time series data. For example, if the data are given in monthly increments, autoregression using variables lagged by as much as 12 months can search for the predictability of previous monthly time periods. If data are given in quarterly time periods, autoregression of up to four periods removed can be a useful tool in locating the predictability of data from previous quarters. When the time periods are in years, lagging the data by yearly periods and using autoregression can help in locating cyclical predictability.
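A three-lag autoregressive model like the one above can be fit by building time-lagged copies of Y as the predictor columns. This sketch uses plain NumPy and a hypothetical monthly series; the function names are illustrative.

```python
import numpy as np

def fit_autoregression(y, lags=3):
    """Least-squares fit of Y-hat = b0 + b1*Y[t-1] + ... + bp*Y[t-p]."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    target = y[lags:]                                      # Y[t]
    X = np.column_stack(
        [np.ones(n - lags)]                                # intercept b0
        + [y[lags - k : n - k] for k in range(1, lags + 1)]  # Y[t-1]..Y[t-p]
    )
    coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
    return coeffs  # [b0, b1, ..., bp]

def predict_next(y, coeffs):
    """One-step-ahead forecast from the most recent lagged values."""
    p = len(coeffs) - 1
    recent = np.asarray(y, dtype=float)[-1 : -p - 1 : -1]  # Y[t-1], Y[t-2], ...
    return coeffs[0] + coeffs[1:] @ recent

# Hypothetical monthly series with a strong dependence on prior months
series = [100, 103, 101, 106, 104, 109, 107, 112, 110, 115, 113, 118]
b = fit_autoregression(series, lags=3)
print(predict_next(series, b))
```

For monthly data, passing `lags=12` builds the twelve lagged columns needed to search for the seasonal predictability described above.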

Cite this page

Overcoming the Autocorrelation Problem in Data Analysis. (2016, Oct 24).
