This is the second part of a three-part blog series. You can read the first article here.

Types of Time Series Data

Let's start with the different types of time series data, followed by detailed descriptions of the modelling techniques used to analyze them.

Stationary Time Series

Stationary time series have statistical properties that stay constant over time: a fixed mean, a fixed variance, and a stable autocorrelation structure. They are like steady streams, staying consistent over time. Because the mean is constant, the overall level doesn't drift up or down, and the relationship between past and future values doesn't change much either. This makes them relatively predictable and easier to analyze.

Stationary time series are desirable for many time series analysis techniques, such as autoregressive integrated moving average (ARIMA) modelling, as they simplify the modelling process and facilitate accurate forecasting.

Non-Stationary Time Series

Non-stationary time series are characterized by unstable statistical properties, with mean, variance, or other characteristics changing over time. Non-stationarity can arise from trends, seasonality, or other systematic patterns in the data. Non-stationary series require special attention in analysis and modelling to account for these changing properties. Techniques for handling non-stationarity include detrending, differencing, seasonal adjustment, or incorporating trend and seasonality components into the modelling process.

Common methods for distinguishing stationary from non-stationary series include visual inspection of plots, statistical tests such as the Augmented Dickey-Fuller (ADF) test and the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test, and examination of autocorrelation plots.
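As an informal illustration (a sketch, not a substitute for a proper ADF or KPSS test), the snippet below simulates a random walk, a classic non-stationary series whose variance grows over time, and shows that first-order differencing recovers a stationary series. The `half_variances` helper is a made-up name used only for this quick check:

```python
import random
import statistics

random.seed(42)

# Simulate a non-stationary random walk: each value is the previous
# value plus a random shock, so the variance grows over time.
shocks = [random.gauss(0, 1) for _ in range(500)]
walk = []
level = 0.0
for s in shocks:
    level += s
    walk.append(level)

# First-order differencing recovers the shocks, which are stationary.
diffed = [walk[t] - walk[t - 1] for t in range(1, len(walk))]

# A rough, informal check: compare the variance of the first and second
# halves of each series. For the random walk the halves differ widely;
# for the differenced series they are similar.
def half_variances(series):
    mid = len(series) // 2
    return statistics.variance(series[:mid]), statistics.variance(series[mid:])

print("random walk halves:", half_variances(walk))
print("differenced halves:", half_variances(diffed))
```

In practice you would confirm this with a formal test (for example, `adfuller` from the statsmodels library) rather than eyeballing variances.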

Seasonal Time Series

Imagine having data over time, like monthly sales figures. If you notice that these numbers go up and down regularly, like getting higher around holidays, that's called seasonal data. It's like a pattern that repeats itself every year, showing how things change based on the seasons.

Trend-stationary Time Series

If you look at data over a longer period, like several years, and notice that it's generally going up or down, that's a trend. For example, think about how populations in cities grow over time. A trend like this affects the overall behaviour of the data, showing a long-term change in a certain direction. A series is called trend-stationary when it becomes stationary once that deterministic trend is removed. Understanding these trends helps us see how things change over time and predict the future.

After understanding the characteristics of time series data, it's important to know the concept of white noise, as it is a fundamental element in time series analysis.

Understanding White Noise

White noise is a process whose data points are generated by a sequence of independent and identically distributed random variables. It represents random fluctuations with a constant mean and variance, and there is no relationship between successive observations. For instance, if you hear a faint sound in a quiet room, we can call it white noise: a random sound with no pattern. The sound stays at the same level on average and doesn't get louder or softer over time. Each sound you hear is completely independent of the others. So, if you hear a loud burst of noise, it doesn't mean the next sound will be quiet. They're not connected.
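These properties are easy to check numerically. The sketch below (illustrative only, with an assumed sample size of 2000) generates Gaussian white noise and computes its sample autocorrelation, which should be near zero at every non-zero lag because successive observations are independent:

```python
import random
import statistics

random.seed(0)

# White noise: independent, identically distributed draws with
# constant mean (0) and constant variance (1).
noise = [random.gauss(0, 1) for _ in range(2000)]

def autocorr(series, lag):
    """Sample autocorrelation of `series` at the given lag."""
    mean = statistics.fmean(series)
    num = sum((series[t] - mean) * (series[t - lag] - mean)
              for t in range(lag, len(series)))
    den = sum((x - mean) ** 2 for x in series)
    return num / den

# Successive observations are unrelated, so the autocorrelation at any
# non-zero lag should be close to zero (roughly within 2/sqrt(n) of it).
for lag in (1, 2, 5):
    print(f"lag {lag}: {autocorr(noise, lag):+.3f}")
```

A real series whose autocorrelations all look like this is indistinguishable from noise, which is exactly why white noise serves as the baseline described above.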

While white noise may seem entirely random, it serves as a baseline for detecting meaningful signals in the data. Analysts can determine whether the observed patterns or trends are statistically significant or merely random fluctuations by comparing the properties of a given time series to those of white noise. In signal detection theory, analysts can isolate and extract meaningful signals embedded within the data by filtering out the white noise component.

White noise is used in quality control processes to assess the randomness and variability of manufacturing processes or experimental data. Deviations from white noise behaviour may indicate underlying issues or anomalies. It is also utilized in statistical hypothesis testing to evaluate the validity of statistical models or assumptions.

Understanding the characteristics of different types of time series data and selecting the appropriate modelling technique are essential steps in building accurate forecasting models and extracting meaningful insights from the data.

Time Series Modelling Techniques

Different modelling techniques are used to analyze time series data and identify its patterns.

The following are detailed descriptions of the most common models:

Autoregressive (AR) Time Series Model

Autoregressive (AR) time series models describe a relationship where the current value of a variable depends on its past values. In other words, the future behaviour of the variable is assumed to be a linear function of its past behaviour. By incorporating information from previous observations, AR models can effectively capture the inertia or momentum inherent in the data. This allows them to forecast future values based on historical patterns and trends.

AR models are employed in quality control and process monitoring to detect deviations from expected patterns in manufacturing processes or system behaviour. In climate and environmental studies, AR models are used to analyze and predict weather patterns, temperature fluctuations, and other environmental variables.
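The idea can be made concrete with a tiny simulation. The sketch below (with an assumed, illustrative coefficient of 0.7 and no intercept) generates an AR(1) process, where each value is a linear function of the previous one plus noise, and recovers the coefficient by least-squares regression of each value on its predecessor:

```python
import random

random.seed(1)

# Simulate an AR(1) process: x_t = phi * x_{t-1} + noise.
# phi = 0.7 is an assumed illustrative value, not from any real dataset.
phi = 0.7
x = [0.0]
for _ in range(5000):
    x.append(phi * x[-1] + random.gauss(0, 1))

# Estimate phi by least-squares regression of x_t on x_{t-1}
# (no intercept, since the simulated process has zero mean).
num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
den = sum(v * v for v in x[:-1])
phi_hat = num / den
print(f"true phi = {phi}, estimated phi = {phi_hat:.3f}")

# One-step-ahead forecast: the expected next value given the last one.
forecast = phi_hat * x[-1]
```

The estimated coefficient lands close to the true 0.7, and the forecast is simply that coefficient applied to the most recent observation, which is the "momentum" intuition described above.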

Moving Average (MA) Time Series Model

Moving average (MA) models analyze and predict future values based on a weighted average of past forecast errors. Unlike autoregressive (AR) models, which relate the variable's current value to its own past values, MA models relate the current value to the errors made by past forecasts, which makes them well suited to capturing short-lived shocks in the data.

A related but distinct idea is moving-average smoothing. Imagine you have a bunch of data points, like daily temperatures or monthly sales. Instead of looking at one data point at a time, you look back at several of them and average them together. This smoothing reduces random jumps and fluctuations, making the overall trend or pattern easier to see. The moving averages described below average past observations in this way, rather than past forecast errors.
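A small simulation makes the MA model's signature visible. The sketch below (with an assumed, illustrative coefficient of 0.6) generates an MA(1) process, where each value is the current error plus a weighted share of the previous error; its autocorrelation is clearly non-zero at lag 1 but roughly zero beyond it, which is how MA processes differ from AR processes:

```python
import random

random.seed(7)

# Simulate an MA(1) process: x_t = e_t + theta * e_{t-1}, where e_t is
# white noise. theta = 0.6 is an assumed illustrative value.
theta = 0.6
errors = [random.gauss(0, 1) for _ in range(5001)]
x = [errors[t] + theta * errors[t - 1] for t in range(1, len(errors))]

def autocorr(series, lag):
    """Sample autocorrelation of `series` at the given lag."""
    mean = sum(series) / len(series)
    num = sum((series[t] - mean) * (series[t - lag] - mean)
              for t in range(lag, len(series)))
    den = sum((v - mean) ** 2 for v in series)
    return num / den

# Signature of MA(1): autocorrelation at lag 1 is theta / (1 + theta^2)
# (about 0.44 here), but approximately zero at lags 2 and beyond.
print("lag 1:", round(autocorr(x, 1), 3))
print("lag 2:", round(autocorr(x, 2), 3))
```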

Types of Moving Averages:

Simple Moving Average (SMA):
Simple Moving Average calculates the average of a set number of data points over a specific period, giving equal weight to each point. It is often used in financial and technical analyses to smooth out price data over a specified period, helping traders and analysts identify trends and potential entry or exit points in the market. It is also commonly used in various fields, such as engineering, signal processing, and environmental science, to analyze trends in data over time.

Cumulative Moving Average (CMA): A Cumulative Moving Average calculates the average of all data points up to the current point in time, updating with each new observation and giving equal weight to every point seen so far. It is commonly used in quality control processes and manufacturing to monitor a process's average performance over time. Because it incorporates the entire history, it reacts slowly to recent changes, making it better suited to tracking long-run performance than to spotting sudden shifts.

Exponential Moving Average (EMA): An Exponential Moving Average assigns exponentially decreasing weights to past observations, giving more weight to recent data and responding quickly to changes. It is widely used in financial markets and technical analysis to track price movements and identify trends; its responsiveness to recent data makes it particularly useful for short-term trading strategies and for smoothing out noisy data. EMA also appears in signal processing, forecasting, and trend analysis more broadly, wherever recent data points matter more than older observations.
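The three averages can be sketched in a few lines each. This is an illustrative implementation on a made-up sales series (the function names, window size, and `alpha` value are all assumptions for the example, not standard parameters):

```python
def sma(series, window):
    """Simple moving average: equal-weight mean over a sliding window."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

def cma(series):
    """Cumulative moving average: mean of all points seen so far."""
    out, total = [], 0.0
    for i, v in enumerate(series, start=1):
        total += v
        out.append(total / i)
    return out

def ema(series, alpha):
    """Exponential moving average: weights decay by (1 - alpha) each
    step back in time, so recent observations count more."""
    out = [series[0]]
    for v in series[1:]:
        out.append(alpha * v + (1 - alpha) * out[-1])
    return out

data = [10, 12, 11, 13, 15, 14, 16, 18]  # e.g. monthly sales (made up)
print("SMA(3):", sma(data, 3))
print("CMA:   ", [round(v, 3) for v in cma(data)])
print("EMA:   ", [round(v, 3) for v in ema(data, alpha=0.5)])
```

Comparing the outputs shows the trade-offs described above: the SMA lags by half its window, the CMA barely moves as new points arrive, and the EMA tracks the latest values most closely.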


In the next part, we'll explore more advanced modelling techniques, such as ARIMA, SARIMA, and VAR models. These methods enable us to capture complex patterns, account for seasonality, and understand dynamic interactions between variables in time series data.

Thank you for reading this article. See you in the next one.