What is lag in a time series? - Mathematics Stack Exchange I am curious about what a lagged time series is. On Investopedia, I saw an article that said: "Autocorrelation is the degree of similarity between a time series and a lagged version of itself over successive intervals." Could someone please explain what "lagged" means, and why autocorrelation matters in time series analysis?
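To make "lagged" concrete: the lag-$k$ version of a series is the same series shifted back by $k$ steps, so autocorrelation at lag $k$ measures how well $x_t$ predicts $x_{t+k}$. A minimal sketch with NumPy (the `lag_autocorr` helper and the random-walk example are illustrative, not from the question):

```python
import numpy as np

# A "lagged" series is the same series shifted in time: the lag-k version
# of x aligns x[t-k] against x[t].
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=200))  # a random walk: strongly autocorrelated

def lag_autocorr(x, k):
    """Sample autocorrelation between x[t] and its lag-k version x[t-k]."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.dot(x[k:], x[:-k]) / np.dot(x, x)

print(lag_autocorr(x, 1))   # near 1: consecutive values move together
print(lag_autocorr(x, 50))  # weaker at a longer lag
```

For a random walk the lag-1 autocorrelation is close to 1, which is exactly why autocorrelation matters in analysis: it tells you how much of the series' future is already implied by its past.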
signal processing - Lag of a delagged exponential moving average ... From the paper, and even from its title, "ZERO LAG (well, almost)", the adaptive filter described there is not exactly $0$ lag. The adaptive filter algorithm is designed to provide a compromise between reactivity (low delay) and smoothing: it tries to be more reactive than a plain EMA while still smoothing the data.
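One common "de-lagging" construction (a sketch of the general idea, not the exact adaptive filter from the paper) is to feed the EMA the momentum-boosted input $2x_t - x_{t-\ell}$ with $\ell \approx (\text{period}-1)/2$, which roughly cancels the EMA's group delay at the cost of some noise amplification:

```python
import numpy as np

def ema(x, period):
    """Plain exponential moving average with alpha = 2/(period+1)."""
    alpha = 2.0 / (period + 1)
    out = np.empty(len(x), dtype=float)
    out[0] = x[0]
    for t in range(1, len(x)):
        out[t] = alpha * x[t] + (1 - alpha) * out[t - 1]
    return out

def zlema(x, period):
    """'De-lagged' EMA: run the EMA on 2*x[t] - x[t-lag], which roughly
    cancels the EMA's delay of about (period-1)/2 samples on trends."""
    x = np.asarray(x, dtype=float)
    lag = (period - 1) // 2
    delagged = x.copy()
    delagged[lag:] = 2 * x[lag:] - x[:-lag]
    return ema(delagged, period)
```

On a steady trend the de-lagged version tracks the input almost exactly where the plain EMA trails by about $(\text{period}-1)/2$ samples; on noisy data the $2x_t - x_{t-\ell}$ term amplifies noise, which is the reactivity/smoothing trade-off the paper's adaptive gain is meant to manage.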
Delay embedding for an irregularly-sampled time series All the examples I have seen of time-delay embedding involve regularly (evenly) sampled time series, where lagged versions of the observed series can easily be created. However, I am interested in time series that are irregularly sampled in time.
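For contrast with the irregular case, here is what the easy, regular-grid construction looks like, with the obvious (if lossy) workaround for irregular sampling: interpolate onto a regular grid first, then take lagged copies. This is a minimal sketch, not a recommendation; interpolation smooths the data and assumes the sampling is dense enough to resolve the dynamics:

```python
import numpy as np

def delay_embed_irregular(t, x, dim, tau, dt):
    """Delay-embed an irregularly-sampled series (t, x) by first linearly
    interpolating onto a regular grid with spacing dt, then stacking
    dim lagged copies separated by delay tau.

    Caveat: interpolation is only one workaround; it low-pass filters the
    data and can distort the reconstructed attractor if gaps are large."""
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    t_grid = np.arange(t[0], t[-1], dt)
    x_grid = np.interp(t_grid, t, x)   # linear interpolation onto the grid
    k = int(round(tau / dt))           # delay expressed in grid steps
    n = len(x_grid) - (dim - 1) * k    # number of complete delay vectors
    return np.column_stack([x_grid[i * k : i * k + n] for i in range(dim)])
```

When the input happens to be regularly sampled already, the interpolation is exact and this reduces to the usual lagged-copies construction; for genuinely irregular data, Gaussian-process or kernel-based reconstructions are gentler alternatives to linear interpolation.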