Time Series Short Notes

## Table of Contents

- Overview of Time Series Analysis
- Box-Jenkins Methodology
- ARIMA Model
- Autocorrelation Function (ACF)
- Autoregressive Models (AR)
- Moving Average Models (MA)
- ARMA and ARIMA Models
- Building and Evaluating an ARIMA Model
- Reasons to Choose and Cautions

## Overview of Time Series Analysis:

- Time series data consists of observations collected over time, often at regular intervals (e.g., daily, monthly, yearly).
- The main goals of time series analysis include understanding the underlying patterns and trends, identifying any cyclical or seasonal components, and making forecasts for future values.
- Time series analysis techniques are widely used in various fields, such as economics, finance, meteorology, and engineering, among others.

## Box-Jenkins Methodology:

- Developed by statisticians George Box and Gwilym Jenkins in the 1970s.
- It provides a systematic approach for identifying, estimating, and diagnosing ARIMA models.
- The methodology involves the following iterative steps:
  1. Model Identification: Examine the data patterns, stationarity, and autocorrelation/partial autocorrelation functions to determine the tentative ARIMA model orders (p, d, q).
  2. Parameter Estimation: Use techniques such as maximum likelihood estimation to estimate the model parameters.
  3. Diagnostic Checking: Analyze the residuals to ensure the model assumptions are met (e.g., white noise, no autocorrelation).
  4. Forecasting: Use the fitted model to make forecasts for future time periods.
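
The iterative loop can be sketched end-to-end on a toy AR(1) series. This is a minimal pure-NumPy illustration (a real analysis would typically use statsmodels' `ARIMA` class); the simulated series, the coefficient 0.7, and the seed are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: simulate an AR(1) series y_t = 0.7*y_(t-1) + eps_t
n, phi = 5000, 0.7
eps = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

# 1. Identification: a large lag-1 sample autocorrelation suggests an AR term
y0 = y - y.mean()
rho1 = (y0[1:] * y0[:-1]).sum() / (y0 ** 2).sum()

# 2. Estimation: conditional least squares for phi (regress y_t on y_(t-1))
phi_hat = (y[1:] @ y[:-1]) / (y[:-1] @ y[:-1])

# 3. Diagnostic checking: residuals should be close to white noise,
#    i.e. their own lag-1 autocorrelation should be near zero
resid = y[1:] - phi_hat * y[:-1]
r0 = resid - resid.mean()
resid_rho1 = (r0[1:] * r0[:-1]).sum() / (r0 ** 2).sum()

# 4. Forecasting: the one-step-ahead forecast of an AR(1) is phi_hat * y_n
forecast = phi_hat * y[-1]
```

If the residual diagnostics in step 3 fail, the Box-Jenkins loop returns to step 1 with revised orders.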

## ARIMA Model:

- ARIMA stands for Autoregressive Integrated Moving Average.
- It is a versatile model that combines three components:
  - Autoregressive (AR) component: models the current value as a linear combination of its past values.
  - Integrated (I) component: handles non-stationarity by taking differences of the time series.
  - Moving Average (MA) component: models the current value as a linear combination of past error terms.
- The orders of the ARIMA model are represented as ARIMA(p, d, q), where:
  - p is the order of the AR component (number of past values used)
  - d is the order of differencing (number of times the series is differenced)
  - q is the order of the MA component (number of past error terms used)
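
The d (integrated) component can be illustrated with NumPy's `diff`: a series with a linear trend is non-stationary, but one round of differencing removes the trend. The slope 0.5 and the noise level here are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Non-stationary series: linear trend plus noise
t = np.arange(200)
y = 0.5 * t + rng.standard_normal(200)

# d = 1: first difference y_t - y_(t-1) removes the linear trend
dy = np.diff(y)  # length 199 (one observation is lost per difference)

# The differenced series fluctuates around the trend slope (0.5)
# instead of growing without bound
```

Each round of differencing shortens the series by one observation, which is why d is kept as small as possible in practice.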

## Autocorrelation Function (ACF):

- The ACF measures the correlation between a time series and its lagged values.
- It helps identify the presence and pattern of autocorrelation in the data.
- The ACF plot displays the autocorrelation coefficients at different lags.
- The pattern of the ACF can provide insights into the appropriate AR and MA orders for an ARIMA model.
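
A sample ACF can be computed directly from its definition (statsmodels' `acf`/`plot_acf` do the same and add confidence bands). The helper name `sample_acf` is an assumption for this example; applied to white noise, it shows the expected pattern of 1 at lag 0 and near-zero values elsewhere.

```python
import numpy as np

def sample_acf(y, nlags):
    """Sample autocorrelation coefficients for lags 0..nlags."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    denom = np.sum(y ** 2)
    return np.array([np.sum(y[k:] * y[:len(y) - k]) / denom
                     for k in range(nlags + 1)])

rng = np.random.default_rng(42)
white_noise = rng.standard_normal(2000)
acf = sample_acf(white_noise, nlags=5)
# acf[0] is exactly 1; lags 1..5 should fall inside the usual
# +/- 1.96/sqrt(n) white-noise band
```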

## Autoregressive Models (AR):

- In an AR(p) model, the current value is modeled as a linear combination of its p previous values and a random error term.
- The AR(p) model is represented as: y_t = c + φ_1·y_(t-1) + φ_2·y_(t-2) + … + φ_p·y_(t-p) + ε_t
- AR models are useful when the time series exhibits autocorrelation in its values.
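
An AR(p) process can be simulated directly from the recursion above and checked against theory: the Yule-Walker equations give ρ_1 = φ_1/(1 − φ_2) for an AR(2). The coefficients (0.5, 0.3) are assumptions chosen to keep the process stationary.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate an AR(2) process: y_t = 0.5*y_(t-1) + 0.3*y_(t-2) + eps_t
n, phi1, phi2 = 20000, 0.5, 0.3
eps = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + eps[t]

# Yule-Walker for AR(2): rho_1 = phi1 / (1 - phi2)
rho1_theory = phi1 / (1 - phi2)

# Sample lag-1 autocorrelation should match the theoretical value
y0 = y - y.mean()
rho1_sample = (y0[1:] * y0[:-1]).sum() / (y0 ** 2).sum()
```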

## Moving Average Models (MA):

- In an MA(q) model, the current value is modeled as a linear combination of the current and q previous error terms.
- The MA(q) model is represented as: y_t = μ + ε_t + θ_1·ε_(t-1) + θ_2·ε_(t-2) + … + θ_q·ε_(t-q)
- MA models are useful when the time series exhibits autocorrelation in its errors.
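
The defining feature of an MA(q) process is that its ACF cuts off after lag q: for an MA(1), theory gives ρ_1 = θ_1/(1 + θ_1²) and ρ_k = 0 for k > 1. A quick NumPy simulation confirms this; the coefficient 0.6 is an assumption for the example.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate an MA(1) process: y_t = eps_t + 0.6*eps_(t-1)
n, theta = 20000, 0.6
eps = rng.standard_normal(n + 1)
y = eps[1:] + theta * eps[:-1]

# Theory: rho_1 = theta / (1 + theta^2), and rho_k = 0 for k > 1
rho1_theory = theta / (1 + theta ** 2)

y0 = y - y.mean()
denom = (y0 ** 2).sum()
rho1 = (y0[1:] * y0[:-1]).sum() / denom   # should be near rho1_theory
rho2 = (y0[2:] * y0[:-2]).sum() / denom   # should be near zero (cut-off)
```

This sharp cut-off in the ACF is exactly the pattern used in the identification step to pick the MA order q.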

## ARMA and ARIMA Models:

- An ARMA(p, q) model combines both AR(p) and MA(q) components.
- The ARIMA(p, d, q) model extends the ARMA model by including an integrated component (I) of order d, which accounts for non-stationarity by taking differences of the time series.
- ARIMA models are flexible and can capture a wide range of time series patterns, making them widely applicable.

## Building and Evaluating an ARIMA Model:

- Stationarity checking: Check whether the time series is stationary (constant mean and variance over time, with autocovariance depending only on the lag) by examining plots and performing statistical tests such as the Augmented Dickey-Fuller (ADF) test. If it is not, apply differencing or transformations to achieve stationarity.
- Model identification: Examine the ACF and Partial Autocorrelation Function (PACF) plots to determine the tentative orders of the AR and MA components.
- Parameter estimation: Use methods like maximum likelihood estimation to estimate the model parameters.
- Diagnostic checking: Analyze the residuals to ensure they are white noise (no autocorrelation) and meet the model assumptions.
- Model selection: Compare different ARIMA models using information criteria (e.g., AIC, BIC) and choose the best-fitting model.
- Forecasting: Use the selected ARIMA model to generate forecasts for future time periods, along with prediction intervals.
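
The workflow above can be sketched in pure NumPy on a toy AR(2) series: estimate candidate AR models by conditional least squares, compare them by AIC, check the residuals, and forecast one step ahead. In practice statsmodels handles these steps (`ARIMA` for fitting, `adfuller` for stationarity, `acorr_ljungbox` for diagnostics); the helper `fit_ar`, the coefficients, and the seed below are assumptions for the example, and an intercept is omitted because the simulated series has mean zero.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy data: an AR(2) process (coefficients are assumptions for the example)
n = 2000
eps = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * y[t - 1] + 0.25 * y[t - 2] + eps[t]

def fit_ar(y, p):
    """Conditional least squares for a zero-mean AR(p).

    Returns (coefficients, AIC, residuals)."""
    n = len(y)
    # Columns are the lagged series y_(t-1), ..., y_(t-p)
    X = np.column_stack([y[p - j:n - j] for j in range(1, p + 1)])
    target = y[p:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ coef
    sigma2 = resid @ resid / len(target)
    aic = len(target) * np.log(sigma2) + 2 * (p + 1)
    return coef, aic, resid

# Model selection: compare AR(1)..AR(3) by AIC (lower is better)
results = {p: fit_ar(y, p) for p in (1, 2, 3)}
aics = {p: results[p][1] for p in results}
best_p = min(aics, key=aics.get)

# Diagnostic check: residuals of the chosen model should look like
# white noise (lag-1 autocorrelation near zero)
resid = results[best_p][2]
r0 = resid - resid.mean()
resid_rho1 = (r0[1:] * r0[:-1]).sum() / (r0 ** 2).sum()

# One-step-ahead forecast: dot the coefficients with the most recent values
coef = results[best_p][0]
forecast = coef @ y[-1:-best_p - 1:-1]
```

Because the data were generated by an AR(2), the AIC comparison should strongly prefer order 2 over order 1; the underfit AR(1) leaves visible autocorrelation in its residuals, which is exactly what the diagnostic step is designed to catch.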

## Reasons to Choose and Cautions:

- ARIMA models are widely used due to their flexibility and ability to capture various time series patterns.
- They can handle non-stationarity, trends, and seasonal components.
- However, ARIMA models have some limitations and assumptions:
  - Stationarity assumption: the time series should be stationary or made stationary through differencing or transformations.
  - Linearity: ARIMA models assume linear relationships, which may not hold for highly nonlinear data.
  - Structural breaks and outliers: ARIMA models can be sensitive to structural breaks, outliers, or changes in the underlying data-generating process.
  - Data requirements: building accurate ARIMA models typically requires a sufficiently large historical dataset.
- Caution should be exercised when interpreting and applying ARIMA models, and diagnostic checks should be performed to ensure the model assumptions are met.