Treasury Practice

Regression analysis

Published: Oct 2011

Regression analysis, sometimes referred to simply as ‘regression’, is a statistical tool used to analyse the relationship between two or more variables. It is employed in many areas of forecasting and financial analysis because it can help analysts understand how strongly a set of ‘independent’ (explanatory) variables is related to a ‘dependent’ (outcome) variable.

Unlike correlation, which looks at the interdependence of variables, regression works on the assumption that the independent variable(s) has a one-way causal effect on the dependent variable. Take, for example, the write-downs banks faced during the financial crisis in relation to the number of sub-prime mortgages they had on their books: in general, one would expect that the more sub-prime loans a bank made, the larger the write-downs it faced.

Types of regression

There are two basic types of regression: simple linear and multiple. Regression analysis that deals with just one independent variable (usually denoted by \(X\)) and one dependent variable (usually denoted by \(Y\)) is termed simple linear regression. In its simplest form, a linear regression describes a straight-line relationship between the two variables that is assumed to hold throughout the data. Values of the two variables, taken from an existing data set, are used to develop a model whose aim is to predict the value of \(Y\) for given values of \(X\).

Multiple regression works on a similar premise, but uses two or more independent variables. Each independent variable is denoted with a subscript, such as \(X_1\), \(X_2\) and so on. Again, the aim is to build a model that can forecast the value of \(Y\).
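To make the distinction concrete, here is a minimal Python sketch (the data are invented purely for illustration) that fits a simple regression with one independent variable and a multiple regression with two, using NumPy's least-squares solver:

import numpy as np

# Simple linear regression: one independent variable X, one dependent variable Y
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 4.3, 6.2, 7.9, 10.1])

# Build a design matrix [1, X] so the solver returns both the intercept a and the slope b
A = np.column_stack([np.ones_like(X), X])
(a, b), *_ = np.linalg.lstsq(A, Y, rcond=None)
print(f"Simple:   Y = {a:.2f} + {b:.2f}X")

# Multiple regression: two independent variables X1 and X2
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X2 = np.array([0.5, 1.5, 1.0, 2.5, 2.0])
Y2 = np.array([3.0, 6.5, 6.8, 10.9, 10.2])

A2 = np.column_stack([np.ones_like(X1), X1, X2])
(a2, b1, b2), *_ = np.linalg.lstsq(A2, Y2, rcond=None)
print(f"Multiple: Y = {a2:.2f} + {b1:.2f}X1 + {b2:.2f}X2")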

Methodology for simple linear regression

It is useful to examine the data graphically before the regression analysis itself begins. The first step is to plot the values of the variables on a scatter graph. The analysis can start once a ‘line of best fit’ has been drawn through the points. This line summarises the data and draws out the relationship between the variables more explicitly.
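As an illustration of this graphical step, the following Python sketch (again with invented data) plots the points and overlays a least-squares line of best fit:

import numpy as np
import matplotlib.pyplot as plt

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([2.2, 3.9, 6.1, 8.3, 9.8, 12.1])

# np.polyfit with degree 1 returns the slope and intercept of the best-fit line
b, a = np.polyfit(X, Y, 1)

plt.scatter(X, Y, label="Observed data")
plt.plot(X, a + b * X, color="red", label=f"Best fit: Y = {a:.2f} + {b:.2f}X")
plt.xlabel("X (independent variable)")
plt.ylabel("Y (dependent variable)")
plt.legend()
plt.show()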

The equation

Linear regression: \(Y = a + bX\)

Where:

  • \(Y\) is the dependent variable we are trying to predict.
  • \(b\) is the slope of the line of best fit.
  • \(X\) is the independent variable.
  • \(a\) is the intercept: the value of \(Y\) at the point where the line of best fit crosses the y-axis.

Not all data points will lie directly on this line of best fit – other factors also influence the dependent variable. The vertical distance between a data point and the line of best fit is termed the ‘regression residual’. This is sometimes factored into the regression equation as an error term, here denoted ‘c’ (more conventionally written \(e\) or \(\varepsilon\)), such that \(Y = a + bX + c\).
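The following sketch (with invented data) shows how \(a\) and \(b\) can be estimated by ordinary least squares, and how the residual for each data point is simply the gap between the observed \(Y\) and the fitted line:

import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

# Least-squares estimates: slope from covariance/variance, intercept from the means
b = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
a = Y.mean() - b * X.mean()

fitted = a + b * X
residuals = Y - fitted  # the 'regression residuals' described above

print(f"Y = {a:.2f} + {b:.2f}X")
print("Residuals:", np.round(residuals, 2))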

Practical application

Regression calculations could be used in the treasury function to determine the extent to which specific variable factors, such as interest rates or the price of a commodity, influence movements in the price of an asset. In receivables management, for example, the impact of early-payment discounts offered to customers might also lend itself to regression analysis.
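As a hypothetical example of the first application, the sketch below regresses weekly changes in an asset's price on changes in a benchmark interest rate and a commodity price. The figures are invented, and statsmodels is simply one common choice of library:

import numpy as np
import statsmodels.api as sm

rate_change = np.array([0.10, -0.05, 0.00, 0.15, -0.10, 0.05, 0.20, -0.15])
commodity_change = np.array([1.2, -0.8, 0.4, 2.0, -1.5, 0.6, 2.4, -1.8])
asset_change = np.array([0.9, -0.7, 0.3, 1.8, -1.4, 0.5, 2.1, -1.6])

# Stack the independent variables and add an intercept column
X = sm.add_constant(np.column_stack([rate_change, commodity_change]))
model = sm.OLS(asset_change, X).fit()

print(model.params)    # intercept and the two slope coefficients
print(model.rsquared)  # proportion of the variation in the asset price explained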
