
The essence of time series forecasting is to take a series of measures or observations from the past and project them into the future.

Time series forecasting looks at the past, then to the future.

It assumes, initially, that the same conditions which pertained in the past will continue into the future. Generally, there are three basic components to any form of time series:

  • Level – the baseline value of the series: what the future would look like if there were no unexpected movements up or down, no noise, and no seasonality.
  • Trend – the general direction of the results over the entire period.
  • Seasonality – the effect of recurring seasonal patterns on the data.

Noise

Noise is what statisticians call the random variation which affects the overall pattern of prior-year data but has no other significance. When gathering data for any form of time series analysis, it is important to try to eliminate noise and to concentrate solely on the trends and values of the underlying data.

For example, one month may show a spike in orders to particular suppliers. Why was this? The answer is that, at that time, there was a sudden demand for a rush order from a customer so extra materials had to be bought in to meet it. It wasn't repeated, so the value of the extra purchases should be eliminated as noise from the past data.

Naïve forecasting

Naïve forecasting involves adding a growth rate to the previous period's figure.

Naïve forecasting is best summed up by the principle of "last period + x%". It basically involves taking a previous period and adding a percentage growth rate to those numbers, or simply using the last period for this period. Anomalies and outliers can be removed prior to doing this, but it is a simple and quick approach. It can give a quick and easy start to forecasting and can be used to evaluate the outcomes of more sophisticated forecasting approaches.

Unfortunately, it can produce wildly error-prone forecasts. There is a simple reason for this which is that the market is not linear. Using this approach in businesses with any form of trend, seasonality, spikes or discontinuities will produce a distorted forecast simply because irregularities generally don't happen at the same level, in the same way, or at the same time. This throws out the calculations.
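The "last period + x%" rule described above can be sketched in a few lines of Python; the function name and the growth-rate parameter are illustrative, not part of any standard library:

```python
def naive_forecast(last_actual, growth_rate=0.0):
    """'Last period + x%': carry the previous period's actual forward,
    optionally inflated by an assumed percentage growth rate."""
    return last_actual * (1 + growth_rate)

# Last month's sales of 4,000 projected forward with an assumed 5% growth:
print(naive_forecast(4000, 0.05))  # ≈ 4,200
# With no growth rate, this period's forecast is simply last period's actual:
print(naive_forecast(4000))  # 4000
```

Anomalies and outliers would be stripped from `last_actual` before this step, as noted above.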

Moving averages

This is quite a simple process, familiar to accountants and included here for completeness. The process forecasts the current period by taking the average of a number of previous periods. To calculate the average for, say, January, assume the sales figures for a period were:

Month	Sales
November	3,600
December	4,000

For January, the average would be (3,600 + 4,000) ÷ 2 = 3,800.

This is a simple calculation which, when used on a rolling basis, tends to smooth out the trends and eliminate short-term fluctuations. It works reasonably well in a business with a more or less steady level of business, however it is of limited use in businesses where the trade is seasonal or irregular, or subject to substantial discontinuities.

What it does do is highlight a trend, but there is always a slight lag in identifying this because it is based on past numbers. The longer the period used to calculate the average, the greater the lag. A variation of this is the weighted moving average.
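The rolling calculation above can be sketched as a short Python function; the names are illustrative:

```python
def moving_average_forecast(history, window=2):
    """Forecast the next period as the simple mean of the last `window` actuals."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# November 3,600 and December 4,000 give a January forecast of 3,800:
print(moving_average_forecast([3600, 4000]))  # 3800.0
```

A longer `window` smooths fluctuations more aggressively but, as noted, increases the lag behind any trend.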

Weighted moving average

The same methodology is used in these calculations but, instead of taking the actual number for a previous period, a weighting is applied to it to reflect its significance. For example:

Month	Sales (€)	Weighting	Weighted value (€)
March	5,500	0.2	1,100
April	6,200	0.3	1,860
May	7,400	0.5	3,700
June (forecast)			6,660

Again, it is useful in indicating trends, but the weighting can be highly subjective. So, as with naïve forecasts or simple moving averages, it has its uses in indicating trends but is of limited value where businesses are in a more fluid and uncertain environment.
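The weighted calculation can be sketched in Python as follows, reproducing the worked example above; the weights are the (subjective) choices from the table:

```python
def weighted_moving_average(history, weights):
    """Apply weights (which should sum to 1) to the most recent periods,
    oldest first, and sum the weighted values to forecast the next period."""
    recent = history[-len(weights):]
    return sum(value * weight for value, weight in zip(recent, weights))

# March 5,500, April 6,200, May 7,400 weighted 0.2 / 0.3 / 0.5:
print(weighted_moving_average([5500, 6200, 7400], [0.2, 0.3, 0.5]))  # ≈ 6,660
```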

Exponential smoothing

Exponential smoothing focuses on the most recent data.

Exponential smoothing is a form of weighted moving average that gives more weight to the most recent data. It looks at how accurate any previous forecast was against what actually happened and takes that into account when calculating a number for the next period.

The basic formula for simple exponential smoothing is:

Forecast for next period = (smoothing factor × actual value from previous period) + ((1 – smoothing factor) × previous forecast)

The calculation includes a smoothing factor, or parameter, known as the alpha and signified by the Greek letter α. The rationale for the alpha is buried deep in mathematics, so most forecasters simply accept it for what it is. The accuracy of a given alpha can be judged by how close the forecasts come to the actual results, provided the underlying assumptions and conditions hold true.

The alpha ranges from 0 to 1 and determines the weight assigned to the most recent observation. A larger alpha discounts older observations more rapidly: an alpha of 1 gives the same result as a naïve forecast, while an alpha of 0 leaves the result unchanged from the previous forecast. A higher alpha reacts quickly to trends but is more volatile, whereas a lower alpha lags behind trends. Many forecasters use an alpha of around 0.3 to 0.5, and choosing the right alpha matters because it directly shapes the resulting forecast.
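The formula above translates directly into a one-step Python function; the default alpha of 0.5 and the example figures are illustrative:

```python
def exponential_smoothing(actual_prev, forecast_prev, alpha=0.5):
    """Simple exponential smoothing:
    next forecast = alpha * previous actual + (1 - alpha) * previous forecast."""
    return alpha * actual_prev + (1 - alpha) * forecast_prev

# Previous forecast of 6,000 against an actual of 6,660, with alpha = 0.5:
print(exponential_smoothing(6660, 6000))  # 6330.0
```

Note the two limiting cases described above: with `alpha=1` the result equals the previous actual (a naïve forecast), and with `alpha=0` it equals the previous forecast.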

Explore more forecasting techniques in John Taylor's 4-hour course, Real World Forecasting.
