Autocorrelation and homoscedasticity are both properties of the error terms in regression and time series models. Autocorrelation refers to the degree of correlation between values of a variable (or of the model's errors) at different points in time, while homoscedasticity is the assumption that the variance of the errors is constant across all observations, regardless of the values of the independent variables.
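As a minimal sketch of the first concept, the lag-k autocorrelation of a series can be computed as the correlation between the series and a shifted copy of itself. The function name `lag_autocorr` and the AR(1) coefficient of 0.7 below are illustrative choices, not anything prescribed by the discussion above.

```python
import numpy as np

def lag_autocorr(x, lag=1):
    """Sample correlation between a series and itself shifted by `lag` steps."""
    x = np.asarray(x, dtype=float)
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

# Example: an AR(1)-style series y[t] = 0.7 * y[t-1] + noise shows strong lag-1 autocorrelation
rng = np.random.default_rng(0)
e = rng.normal(size=500)
y = np.empty(500)
y[0] = e[0]
for t in range(1, 500):
    y[t] = 0.7 * y[t - 1] + e[t]

print(lag_autocorr(y, lag=1))                       # close to 0.7
print(lag_autocorr(rng.normal(size=500), lag=1))    # close to 0 for white noise
```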
The relationship between the two lies in their implications for regression models. If autocorrelation is present in the errors, it violates the independence assumption of ordinary least squares (OLS): the coefficient estimates remain unbiased but are no longer efficient, and the conventional standard errors are biased, so t- and F-tests become unreliable. Similarly, if homoscedasticity is violated, the errors are heteroscedastic, which likewise leaves OLS estimates inefficient and the usual standard errors biased.
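Both violations can be checked on the fitted residuals. The following sketch assumes statsmodels is available and uses two standard diagnostics, the Durbin-Watson statistic for first-order autocorrelation and the Breusch-Pagan test for heteroscedasticity; the simulated data (errors that are both autocorrelated and variance-increasing in x) is purely for illustration.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(1)
n = 300
x = rng.uniform(0, 10, n)

# Simulate errors that are both autocorrelated and heteroscedastic
e = np.empty(n)
e[0] = rng.normal()
for t in range(1, n):
    e[t] = 0.6 * e[t - 1] + rng.normal(scale=0.5 + 0.3 * x[t])
y = 2.0 + 1.5 * x + e

X = sm.add_constant(x)
res = sm.OLS(y, X).fit()

# Durbin-Watson: values near 2 indicate no first-order autocorrelation;
# values well below 2 suggest positive autocorrelation in the residuals
print("Durbin-Watson:", durbin_watson(res.resid))

# Breusch-Pagan: a small p-value suggests the error variance depends on the regressors
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(res.resid, res.model.exog)
print("Breusch-Pagan p-value:", lm_pvalue)
```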
In essence, both autocorrelation and heteroscedasticity can lead to misleading inference from a regression analysis if they are not properly addressed, for example through generalized least squares, model respecification, or robust standard errors.
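One common remedy, sketched below as a continuation of the diagnostic example above, is to keep the OLS point estimates but replace the conventional covariance matrix with HAC (Newey-West) standard errors, which are robust to both heteroscedasticity and autocorrelation. The choice of `maxlags=5` here is an arbitrary illustration; in practice the lag length should be chosen based on the data.

```python
# Refit with HAC (Newey-West) covariance: same coefficients, robust standard errors
res_hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 5})
print(res_hac.bse)   # robust standard errors
print(res.bse)       # conventional OLS standard errors, for comparison
```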