Financial Modelling in Microsoft Excel/Testing
The most important thing to remember is that you are the last line of defence. You aren’t just there to look for typos, and anything you miss could cost a lot of money or embarrassment, so be careful and don’t rush.
In the notes that follow, we’ll assume we are checking a model which is important enough to spend a day or two checking, and that we definitely want no errors. We’ll also consider what can be done to make a model easier to check.
We’ll use the words “check”, “test” & “review” a lot below. They all have the same objective: ensuring the model is correct, or close enough to correct.
Before you start
We’ll start with some basic questions before getting into the checking itself.
1. Are you the right person to check the model?
- Do you understand the business issues behind the model? This is important because you need to ensure that nothing has been left out, e.g. tax considerations. If you don’t know much about the business, then you probably shouldn’t be checking the model.
- Do you understand enough about the technicalities of the model to check it? If it is written in VBA and you can’t read it, then it’s going to be tough.
- Have you built models yourself? Are you experienced in what can go wrong?
2. Do you have all the documentation for the model?
This should include all relevant correspondence or documents, and anything that will help you understand the business which is being modelled.
3. Do you have enough time and budget to do the job properly?
Make a copy of the spreadsheet, rename it, and add a sheet called ‘Review’, which will hold all of your comments. Don’t work off the original spreadsheet.
The Review sheet should have columns such as a reference number, the sheet and cell concerned, your comment, its severity, and the builder’s response.
In practice, most checkers in business work with the original spreadsheet, making corrections or changes and then giving it back to the builder to check.
This is OK for most small spreadsheets, but it does confuse the role of builder and checker and you should avoid doing this for important or complex models. If you use a Review sheet, you should not make any changes to the original – the builder should do this after looking at your comments, and then give the model back to you for rechecking. Your job is to check, not to build.
Checking the functionality
The first thing to check, of course, is whether the model does what it is supposed to. You need to find out who wants it and what they asked for, and whether the model is adequate.
In our example of an annuity model, the specification is pretty clear about what is needed, so it should be quite easy to see if it appears to provide everything that was required.
Checking the logic
This is a crucial step, because research shows that logic errors and omissions are hard to pick up. This means you need to be extra careful.
You need to check the business logic in the model before getting into the detail of sheets and cells. Hopefully, the model builder has documented the logic clearly to make it easier to check. This means not only setting out the logic itself, but also providing a link to, or an extract of, the original documentation, to show where it came from. If this has not been done, you should request it.
The extract below is from an actual spreadsheet which calculates a profitability formula using a previous report, an extract of which is included on the right. This makes it easy for a checker to review the logic.
You can see that this is much better than if the model builder simply began using a mysterious formula like =1-1/(G6+(1-G6)/0.85), without explanation.
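One way to make such a formula checkable is to break it into named steps, each of which can be traced back to the source document. The sketch below does this in Python purely for illustration; the variable names and the meaning of G6 and 0.85 are assumptions, which is exactly the problem with the undocumented one-liner.

```python
def profitability(g6, factor=0.85):
    """Same calculation as =1-1/(G6+(1-G6)/0.85), broken into steps.
    The names here are hypothetical -- without documentation, a checker
    cannot know what g6 or the 0.85 factor represent."""
    adjusted = g6 + (1 - g6) / factor   # intermediate value, checkable on its own
    return 1 - 1 / adjusted

# The decomposed version must agree with the original one-line formula
assert abs(profitability(0.6) - (1 - 1 / (0.6 + (1 - 0.6) / 0.85))) < 1e-12
```

The same decomposition can be done in Excel itself by splitting the formula across helper cells with labels, which gives the checker the same step-by-step trail.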
As far as possible, you should write out your own logic before looking at the model, because it is easy to be influenced by what you see, and think “That looks OK”, whereas if you’d done it yourself first, you might have spotted something missing.
Identifying the risk factors
Now you understand what the model is supposed to do, and you have worked out the business logic, you should think about what could go wrong, and how serious that could be.
- some of the inputs might be dangerous, because users might put in incorrect values by mistake or through ignorance
- there may be combinations of inputs which require different treatment
- some assumptions (e.g. tax rate) may have a significant effect on results
- some inputs or calculations will have much more effect on the results than others
- formulae may not have been copied correctly across rows or columns
At this stage, you should also think about how you can check the model for reasonableness.
Checking the inputs
Your inputs will normally include some or all of the following:
Data: e.g. employee records, or transaction records. You need to check where the data came from, that it is correct, and that it has not been altered. Ideally, it should be kept completely separate from the rest of the model (e.g. in a separate worksheet or data file), to avoid contamination.
As with everything else, the data needs to be clear. If, for example, there are confusing or mysterious headings, there should be notes to explain them.
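One simple way to confirm that a data file has not been altered since you received it is to record a cryptographic hash of it. This is a minimal Python sketch of the idea, not part of any particular model:

```python
import hashlib

def fingerprint(path):
    """Return the SHA-256 hash of a data file. Record the hash when the
    data is received; if it later differs, the data has been altered."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()
```

Comparing the stored hash with a freshly computed one at sign-off gives a quick, objective check that the data is still the data you reviewed.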
Assumptions: these are the assumptions underlying the model, which are set by the builder and not by users. For example, the model may include a set of the current income tax rates, which obviously don’t need to be input by users because they are the same for everyone.
The assumptions should be set out in one place, clearly labelled and with explanations, and if they are date-sensitive (e.g. tax limits which change every year), the effective date should be shown. The cells containing values should be colour-coded to pick them out.
The assumptions need to be documented and justified, because they can have such an effect on the results. If you don’t feel that you could explain the reasons for the assumptions to another actuary, you haven’t done enough work.
If it hasn’t been done already, work out the limits for each assumption: for example, salary growth may be shown as 4%, but you might decide that it could vary between 0% and 8% in a year. This is important in checking that the model can deal with extreme inputs, and also in checking any sensitivity testing.
User inputs: this generally only applies where there are going to be users other than the model builder. User inputs can be very dangerous, because users can be very imaginative in what they enter. For example, if asked for an interest rate, they could put in any of 0.7, 7% or 7. The model has to deal with this.
For this reason, user inputs should be “controlled”, by constraining them to valid inputs. Excel’s data validation is the most powerful way to do this, because it will prevent users entering inputs that don’t fall in ranges you specify, or typing in text that isn’t on a list you specify, etc.
An alternative is to use the “controls” that Excel provides, e.g. dropdowns & checkboxes.
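Where inputs are read by code rather than typed into validated cells, the same discipline applies: interpret what you can, reject the rest. This Python sketch normalises interest-rate entries like those above; the rules (treat values of 1 or more as whole-number percentages, reject anything outside 0–50%) are illustrative assumptions, not a standard:

```python
def normalise_rate(raw):
    """Interpret a user-entered interest rate as a decimal fraction.
    Hypothetical rules: '7%' -> 0.07, 7 -> 0.07, 0.07 -> 0.07."""
    if isinstance(raw, str) and raw.strip().endswith("%"):
        value = float(raw.strip().rstrip("%")) / 100
    else:
        value = float(raw)
        if value >= 1:          # assume a whole-number percentage was meant
            value /= 100
    if not 0 <= value <= 0.5:   # reject implausible rates, e.g. 0.7 (70%?)
        raise ValueError(f"rate out of range: {raw!r}")
    return value
```

Note that an ambiguous entry like 0.7 is rejected rather than guessed at, which is the safer behaviour for a checker to insist on.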
As a checker, you should look at the way the inputs have been modelled, and think about what users could do to break the model. How strict you need to be will differ for each model, of course.
You should also check that the user inputs are explained well enough for an average user to understand them.
Checking the calculations
The calculations are where the inputs meet the business logic. This is where it is most important to lay the calculations out clearly and in detail, to make it easier to check them.
The most obvious approach is to follow the layout of the calculations and check the way in which the data, assumptions, inputs and logic have been combined to produce the eventual results.
Most commonly, the data is in rows, and formulae are in columns.
Formulae should run from left to right, top to bottom, both because this is how we read, and because Excel calculates in this order. Inputs should be at the top left, or on their own sheet.
Things to look for:
- formulae unexpectedly changing in the middle of a table - this is easiest to spot if you switch to a monospace font, e.g. Courier, and then click the Show Formulas button on the Formulas tab of the ribbon (or press Ctrl + `) to reveal all the formulae on the sheet
- hard coded numbers (e.g. tax rates) in formulae – these should be taken out and included with the assumptions, even if they are unlikely ever to change
- circularity – should be avoided if at all possible
The next step is to run some tests on the inputs. A simple technique is to set all the inputs to 0 or 1 (or whatever) so that the results are very simple indeed and any anomalies jump out immediately. For example, a financial projection with the investment rate and indexation set to nil should just produce a list of unadjusted cashflows, making it easier to see if they are correct. Another technique is to use “corner” inputs, i.e. put in the highest and lowest values for inputs and see if the model works correctly.
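The zero-inputs technique can be illustrated with a toy projection. This Python sketch is not any particular model; the point is the property being tested: with the rate and indexation set to nil, the output must equal the raw cashflows.

```python
def project(cashflows, rate, indexation):
    """Toy projection: index each cashflow, then discount it back."""
    return [cf * (1 + indexation) ** t / (1 + rate) ** t
            for t, cf in enumerate(cashflows)]

# With both inputs set to nil, the result should be the unadjusted
# cashflows -- any difference points straight at an error.
assert project([100, 200, 300], 0, 0) == [100, 200, 300]
```

The same test in Excel is just a matter of zeroing the input cells and eyeballing (or differencing) the output column against the raw data.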
You should think about what will happen if you do all your tests and find an error: when the error is fixed, you will have to redo them all. This could get tedious! Depending on the effort involved, you could set up an automated set of tests.
If the model is for internal use, you should try to include your tests in the model itself, so that anyone using it can see exactly what was tested, and perhaps even retest it.
If you can write code, it may be a good idea to set up a set of inputs and use VBA to run through them automatically. This can greatly speed up testing.
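The shape of such a harness is simple: a table of input sets with expected results, and a loop that runs each one and reports the failures. The Python sketch below uses a toy model and made-up cases; in Excel the equivalent loop would be VBA writing each input set into the input cells and reading the result cell back.

```python
def total_projected(rate, indexation, cashflows=(100, 200, 300)):
    """Toy model: total of indexed, discounted cashflows."""
    return sum(cf * (1 + indexation) ** t / (1 + rate) ** t
               for t, cf in enumerate(cashflows))

# Hypothetical test cases -- inputs plus the result you expect
test_cases = [
    {"rate": 0.0,  "indexation": 0.0,  "expected": 600.0},
    {"rate": 0.05, "indexation": 0.05, "expected": 600.0},  # inputs offset exactly
]

def run_tests(model, cases, tol=1e-9):
    """Return the cases whose actual result differs from the expected one."""
    return [(c, model(c["rate"], c["indexation"]))
            for c in cases
            if abs(model(c["rate"], c["indexation"]) - c["expected"]) > tol]
```

Once the cases are written down, rerunning them after each fix costs nothing, which removes the temptation to skip the retest.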
Where the business logic is written in VBA (i.e. VBA is doing most of the hard work), it may be difficult to test. You may be able to dump partial results to a sheet for testing.
You should look for ways to test the reasonableness of the model. For example, there may be figures that were done a year ago which should be similar to the results of the model, or there may be other models which can be used to produce test results for at least part of the current model.
You can also ask the builder to include cross-checks (e.g. totals for columns and rows should match) which help in testing the model.
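The classic cross-check is that the grand total of a table comes out the same whether you sum the row totals or the column totals. A minimal sketch of the idea, with made-up numbers:

```python
table = [
    [10, 20, 30],
    [40, 50, 60],
]
row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]

# The grand total computed both ways must agree; if it doesn't, a
# formula somewhere has not picked up every cell it should.
assert sum(row_totals) == sum(col_totals)
```

In a spreadsheet, the same check is a single cell comparing the two grand totals, flagging loudly when they differ.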
Sensitivity testing, i.e. changing one of the inputs and seeing what happens, is another way to test for reasonableness, especially if you have some idea of what to expect.
An extension of sensitivity testing is to find out how accurate the model needs to be, then estimate the possible errors in setting the assumptions and inputs (for example, an investment assumption of 7% might be out by 2% either way), and test the impact on the results. This gives a rough idea of how far out the results might be (because the user can’t possibly get all the assumptions absolutely correct), and whether the overall error is acceptable.
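That error-bound idea can be sketched with a toy calculation. The annuity function and the 7% ± 2% range below are illustrative assumptions, not figures from any real model:

```python
def annuity_value(rate, payment=1000, years=10):
    """Toy present value of a level annuity (illustrative only)."""
    return sum(payment / (1 + rate) ** t for t in range(1, years + 1))

base = annuity_value(0.07)
# the 7% assumption might plausibly be out by 2% either way;
# a higher rate gives a lower present value, and vice versa
high, low = annuity_value(0.05), annuity_value(0.09)

# rough relative uncertainty in the result from this one assumption
relative_spread = (high - low) / base
```

If the resulting spread is larger than the accuracy the model needs, either the assumption has to be pinned down better or the results need to be presented as a range.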
Test in chunks
If possible, test the model in chunks, so you can sign it off one part at a time. This makes checking more manageable and it should make finding errors much easier.
Spreadsheet testing tools
There are several spreadsheet testing tools which can help in checking complex spreadsheets. They look for things like inconsistent formulae, & orphan (unused) calculation cells.
Is it usable?
Put yourself in the shoes of the user, or, even better, find someone else to try using it. Is it clear, or is it confusing?
Can users accidentally change formulae, or put in ridiculous figures?
Remember that users will often not read instructions, and will use the model without thinking. The less expert the user, the more effort needs to go into protecting the model – and especially the inputs – from incorrect use.