1. Introduction
The Institute's mission, as set out by its founders, is to ‘carry out research into the economic and social forces that affect people's lives and to improve the understanding of those forces and the ways in which policy can bring about change’. In recognition of the key role that macroeconomic fluctuations and trends play in influencing living standards, one of the main ways in which the Institute has fulfilled this mission for much of its eighty years has been through the publication of its regular economic commentary and forecasts, underpinned by macroeconomic models.
The original impetus for the Institute's work in this area came in the mid-1950s when it was felt that “the lack of informed debate about economic policy gave the government an unduly great influence on public opinion” (Jones, 1998). With Treasury support and a grant from the Ford Foundation, the Institute began a programme of work on macroeconomic studies and short-term national income forecasting. This led to the quarterly publication of economic forecasts under the direction of Christopher Dow, later chief economist at the OECD and Executive Director and Advisor to the Governor at the Bank of England, starting in the first issue of the Review in January 1959 and continuing almost without interruption to this day.Footnote 1
The Institute's purpose in publishing economic forecasts was, and still is, to give a “wholly independent opinion on the state of the economy” (Worswick, 1971). This involves a description of where the economy appears to be heading, based on assumptions about the setting of policy. A good forecast is not necessarily very accurate, but should set out a clear narrative of the central case together with an assessment of the uncertainty around it, identify the main risks to the outlook, and provide some indication of their likely consequences if they were to materialise. As such, the main function of forecasts themselves is “to provide a structure for an assessment of the current economic situation” (Weale, 1998). That assessment then points the way to possible changes in policy intended to improve economic performance and ultimately people's lives.
When the first economic forecasts were put together, “there was not much use of equations” (Jones, 1998). But this changed relatively quickly and the Institute became one of the first organisations in the United Kingdom to build and develop a macroeconomic model. Box A reproduces an early version of the Institute model (taken from Surrey, 1971). This consists of ten behavioural equations and seven identities describing the UK economy. Today, it could be solved within seconds on any personal computer. But at the time it took considerable resources and creativity to develop and maintain a model of this type.
The original purpose of the model was mainly to be a tool for producing economic forecasts, and in the early years there seemed to be some optimism that better modelling would lead to more accurate forecasts. But it was very soon understood by practitioners that while macroeconomic models might help in forecasting, their main purpose is as guides to understanding the drivers of economic progress and the effects of different policies and other developments: “the aim of model-building cannot be, and never has been, to devise a machine which will deliver a forecast free from error. The aim is to understand the systematic relationships which influence employment, output and inflation consistently over the years” (Britton, 1983).
It was also recognised at an early stage that there needed to be a continuous process of modification and adaptation in the development of macroeconomic models: “Ideally one might envisage the process of model building as a journey through the economy in which successive structural relationships were firmly pinned down by research, so that a model increasing in size and sensitivity could be steadily constructed” (Worswick, 1971).
The accompanying article by Stephen Hall and Brian Henry describes part of the Institute's model building journey, focusing mainly on work undertaken up to 1987 when they were colleagues at the Institute, after which they left to join the Bank of England. During this time, they made many important contributions, including the introduction of rational expectations into the model. This period also saw a major change in focus when Andrew Britton and Simon Wren-Lewis took on the former Treasury and Bank world model in 1986 to use it for forecasts and research. This decision reflected the need to take account of the increasing openness of the UK economy and the importance of understanding globalisation and its impacts. The first global forecast was published by the Institute in 1987 and this aspect of the Institute's work has continually increased in importance since that date.
This article describes the new journey into studying the UK in a global context that started in the late 1980s. Ray Barrell, Nigel Pain and Peter Westaway joined the Institute from the Treasury in 1988 and Garry Young and Andy Blake joined them in 1990 when Simon Wren-Lewis moved to a university chair.Footnote 2 Coming from the Treasury, this group were naturally interested in economic policy and how it could affect the economy. At the Treasury they had learned, from Chris Melliss, Rod Whitaker and others, the importance of constructing models that not only fitted the data, but also had ‘sensible’ overall model properties. In practice this meant that, in response to shocks, the models should display smooth dynamic adjustment to long-run solutions that were consistent with the well-understood textbook economics that some of them had also taught at university. This background influenced how we and the others approached macroeconomic modelling during our time at the Institute, and encouraged a focus on the long-run properties of the model, including the specification of the supply side. The introduction of wealth effects and of stocks of assets owned and owed abroad, along with the impact of the capital stock on the supply side, meant that adjustment to long-run equilibria took longer in the new generation of models. This required much longer forecast horizons than before, and also significant changes to the solution software developed by Stephen Hall in the 1980s.
There have been other influences on how macroeconomic modelling developed at the Institute over this period. Of key importance has been the availability of funding to support research using macroeconomic models. A Macroeconomic Modelling Consortium, financed by the ESRC, HM Treasury and Bank of England, had been established in 1983 to coordinate support for a programme of research in macroeconomic modelling. The Consortium provided financial support to macroeconomic modelling at the Institute in four four-year phases until the Consortium was wound up and the research programme was discontinued in 1999. A further round of finance was provided by the ESRC in 1999, but there has been no public financial support specifically for macroeconomic modelling at the Institute since 2003.
It was clear even in the late 1980s that finance would become increasingly difficult to obtain from this source, and in 1989 a global model user group, planned by Simon Wren-Lewis and Andrew Britton, was started, with the Bank, the Treasury and a few City firms as inaugural members. The group has continually expanded, with many continental European central banks joining, the first being the Bank of Italy in 1989. The model has been used at various times by the OECD, the IMF and the World Bank, and the ECB and the European System of Central Banks have been a major source of support for some time. Private sector users have also increased in number and have always accounted for around half of the total membership. The user group provided an increasing proportion of the funding for macroeconomic modelling from 1990, and since 2003 macroeconomic modelling has been financed by sales of the National Institute Global Econometric Model (NiGEM) and by specific research projects based around the model. User-friendly software has been maintained and developed by Ian Hurst, who joined the Institute in 1992. The model expanded along with the user group, reflecting a commitment to modelling all EU and subsequently all EMU members, and both kept growing, with central banks, finance ministries and other important stakeholders needing tools to analyse the prospects they might face. In addition, as the world became increasingly integrated, developments outside Europe had to be analysed and models of countries such as Mexico and Brazil added.
The need to secure funds to finance macroeconomic modelling imposes a number of disciplines, whether the finance comes from public sector research grants or model sales. In either case it is necessary to demonstrate that the modelling teams are competent and have a decent academic reputation, and that the model itself is empirically coherent, consistent with accepted economic theory, and can be used to provide a quantified answer to important macroeconomic questions. The main consequence of the ending of public financial support for macroeconomic modelling at the Institute has been that the domestic economic model was cut down in scale and its core was embedded in NiGEM in 2000. This has meant that the approach to modelling the UK economy is now much less detailed than was possible at times in the past, but coverage of the rest of the world has increased from ten demand-side models in 1987 to 46 individual country models in 2018.
Another key influence on macroeconomic modelling in our time at the Institute was the oversight from the Macroeconomic Modelling Bureau based at the University of Warwick, under the direction of Professor Ken Wallis. The Bureau was also financed by the Macroeconomic Modelling Consortium. From its inception in 1983 to its close in 1999, the Bureau examined the properties of the various publicly funded macroeconomic models that were deposited with it. The regular publication of these model comparison exercises, initially in annual review volumes and subsequently in this Review, had a critical role in encouraging best practice in the UK modelling community. This role was further enhanced by the annual macroeconomic modelling conference held at Warwick that provided a supportive, as well as sometimes critical, environment for the exchange of ideas.
A further important influence on macroeconomic modelling at the Institute, and perhaps the reason it has outlasted other modelling groups, is that the model has always been used to produce an economic forecast, its original motivation, and for almost 30 years has been provided to active outside users as soon as each forecast has been completed. The discipline of keeping up to date with and commenting on economic developments that goes with economic forecasting, along with the existence of an active user community, has meant that the model has always needed to be relevant, with fully justifiable assumptions and up-to-date information every quarter.
This article is organised as follows. Section 2 briefly describes developments in some of the individual behavioural relationships in the model. Section 3 describes developments in policy analysis. Section 4 comments on forecasting performance and the Institute's involvement in the analysis of the current conjuncture, as it is referred to by our European partners. Section 5 concludes.
2. Developments in individual behavioural relationships
As can be seen from the structure of the 1971 version of the Institute model shown in Box A, the original macroeconomic models contained few structural relationships that appeared consistent with the optimising behaviour of economic agents or with the operation of market mechanisms that describe the economy. One of the key themes of macroeconomic modelling over the past forty years or so has been to improve the theoretical underpinnings of the models. One manifestation of this has been the preference of much of the profession for Dynamic Stochastic General Equilibrium (DSGE) models.Footnote 3 This approach has much to recommend it, and is popular in many policymaking institutions. But DSGE models have clear disadvantages. In particular, they often elevate the current version of theoretical purity above empirical coherence and end up with implausible overall properties.
We agree with Blanchard (2018) that there is a need for different classes of macroeconomic models with different degrees of theoretical purity: “at one end, the maximum theoretical purity is indeed the niche of DSGEs. For those models, fitting the data closely is less important than clarity of structure. Next come models used for policy purposes, for example, models by central banks or international organisations. Those must fit the data more closely, and this is likely to require in particular more flexible, less micro-founded lag structure.” We would put the models developed at the Institute over the past forty years or so in the latter category. They could be described as incorporating micro-founded long-run relationships with flexible lag structures that are fitted to the data.
Box A. NIESR UK Model of 1971
This box sets out the behavioural equations and identities for the 1971 vintage of the Institute model, as described in Surrey (1971, pp. 98–9).
Behavioural equations
1. Income from rent and self-employment (IRSE)
2. Income from dividends and net interest receipts (DNI)
3. Wage and salary bill (WS)
4. Tax on wages and salaries (TAXWS)
5. Other personal tax (TAXO)
6. Non–credit consumption (CEND)
7. Consumer price index, excluding indirect taxes (CPIFC)
8. Factor cost adjustment (FCA)
9. Stocks, level (S)
10. Imports of goods and services (IMP)
Identities
11. Consumer price index (CPI)
12. Real disposable wage income (RDW)
13. Other real disposable income (RDO)
14. Consumers’ expenditure (CE)
15. Total final sales (TFS)
16. Stockbuilding (INV)
17. Gross domestic product (GDP)
Variables in bold are treated as exogenous; these include the ‘add factors’, designated RESi, that enable forecasters to implement their forecast judgements. The exogenous variables are CG, current grants to persons; CR, consumer credit; EEC, employees’ National Insurance contributions; ERC, employers’ contributions to private superannuation schemes; EX, exports of goods and services; FP, forces’ pay; GFI, gross fixed investment; MP, import prices; NTA, net transfers abroad; PAC, public authorities’ consumption; T, indirect tax component of consumer prices; and TIM, a time trend.
The current NiGEM model is the latest manifestation of this general methodological approach. It is described in detail in Hantzsche, Lopresto and Young (2018). In what follows we describe some of the developments that led towards the current version of the NiGEM model.
Consumption, wealth and house prices
The largest single component of GDP on the expenditure side is consumption, and a great deal of the modelling activity of the NiGEM team was centred on analysing and forecasting its behaviour. It has always been clear that both personal income and personal wealth affect behaviour, but their relative impacts and the speed of reaction are largely empirical matters. A number of studies undertaken by Institute staff looked for wealth effects on aggregate consumption, and the importance of differences in country impacts is discussed in a panel context and then applied to the model in Barrell, Byrne and Dury (2003). Although some theorists have suggested that housing wealth should not count as aggregate net wealth, it clearly affects behaviour,Footnote 4 and from around 2000 onwards housing markets were built into NiGEM as data became available. It is clear that housing wealth has a much more rapid impact on consumption than does financial or stock market wealth, with an initial impact five times as large. This of course differs between countries, with financial wealth being more important in the US in the short term than in continental countries such as Germany and France. The background work on the relation between consumption and housing wealth is discussed in Barrell and Davis (2007), and the impacts of house prices and wealth on the reactions of consumers to financial crises are evaluated by Barrell, Davis and Pomerantz (2006). Although the consumption equations were estimated on data, current income can be seen as a proxy for expected income, and in policy analysis consumption in the model can be made fully forward looking, which affects fiscal multipliers and allows events expected in the future to affect consumption now. This capacity has been used in the evaluation of policies for extending working lives through raising retirement ages in Barrell, Kirby and Orazgani (2011).
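A stylised error-correction consumption function of the kind described here might take the following form; the structure is illustrative and the coefficients shown are not the estimated NiGEM parameters.

$$
\Delta c_t = a + b\,\Delta y_t + c_h\,\Delta hw_t + c_f\,\Delta fw_t - \lambda\left(c_{t-1} - \beta_y\, y_{t-1} - \beta_h\, hw_{t-1} - \beta_f\, fw_{t-1}\right) + \varepsilon_t
$$

Here lower-case letters denote logarithms: $c$ is consumption, $y$ real personal disposable income, $hw$ real housing wealth and $fw$ real financial wealth. The short-run terms $c_h$ and $c_f$ capture the finding that housing wealth has a larger initial impact than financial wealth, while the $\beta$ coefficients pin down the long-run solution; replacing current income with a model-consistent expectation of future income gives the forward-looking variant used in policy analysis.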
Factor demands and the labour market
On the supply side, Institute models have generally been based around a production function of varying degrees of complexity, depending on the application. For example, in their work for the UK model, Riley and Young (2007) looked at the effect of skill-biased technical change on equilibrium unemployment. They estimated the parameters of sectoral production functions allowing for substitution across five different skill groups. Factor demands in NiGEM have been based on a simpler approach using an underlying aggregate production function. This is assumed to be of the Constant Elasticity of Substitution (CES) form. Barrell and Pain (1997) use the implied labour demand relationship to estimate both the elasticity of substitution (which averages around a half) and the coefficient on technical progress. The availability of capital stock data makes it possible to integrate this back into a capital demand and investment relationship. This can also be used to infer capacity output given available factor inputs.
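In symbols, a CES technology with labour-augmenting technical progress, and the labour demand relationship it implies, can be sketched as follows; the notation is generic rather than the exact NiGEM specification.

$$
Y_t = A\left[s\,K_t^{-\rho} + (1-s)\left(L_t e^{\lambda t}\right)^{-\rho}\right]^{-1/\rho}, \qquad \sigma = \frac{1}{1+\rho}
$$

Equating the marginal product of labour to the real wage and taking logs gives

$$
\ln L_t = \text{constant} + \ln Y_t - \sigma \ln\!\left(\frac{w_t}{p_t}\right) - (1-\sigma)\,\lambda t
$$

so the estimated real wage elasticity identifies $\sigma$ (around a half on average) and the trend term the rate of technical progress. An analogous first-order condition for capital ties investment to the same technology, and inverting the production function for given factor inputs yields a measure of capacity output.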
Equilibrium unemployment or the NAIRU is determined by a labour demand equation based on the CES production function in the model and by forward-looking wage equations based on bargaining, as in Anderton and Barrell (1995). In the model the NAIRU varies over time, since it is specified as an identity depending on the real exchange rate, consumer and producer prices and the tax wedge, as well as on the factors affecting the wage bargain in the wage equation and the parameters of the production function. The existence of an equilibrium in the labour market in each of the economies covered by the model made it possible to undertake analysis of the factors affecting migration flows in Europe (Barrell, Gottschalk, Kirby and Orazgani, 2009) and their impact on the European economies and elsewhere (Barrell, Riley and Fitzgerald, 2010). The existence of a reasonable labour market with a clear, unique, but changing equilibrium was essential to much of the policy work undertaken with the global model.
3. Developments in policy analysis
The macroeconomic models developed at the Institute have always been used as frameworks for the discussion of economic policy. But, as discussed by Stephen Hall and Brian Henry in this Review, the 1976 Lucas critique called into question the validity of using models as then specified for this purpose. As the models became increasingly coherent, partly in reaction to the Lucas critique, policy analysis also needed to become more sophisticated. In particular, policy could no longer be assumed to be completely exogenous. Once current behaviour is allowed to be influenced by future policy, as it is under rational expectations, it becomes necessary to explain how future policy will respond to changes in the economy that might be brought about by current changes in policy. This means that feedback rules have to be included within the model to explain the reaction of governments and the monetary authorities to future developments.
Much of the work in introducing policy rules into the Institute models was carried out in the 1990s following the UK's exit from the European exchange rate mechanism and the adoption of an inflation target. A good summary of this general research theme is Westaway (1995).
Monetary policy rules determined the short-term interest rate, whilst the long-term rate affected investment decisions, financial asset prices, the price of houses and interest payments on government debt. These two interest rates are linked through the term structure, with current and expected short-term rates determining the long-term rate in both forecasts and policy analysis. The use of more than one interest rate in the models distinguished them from scaled-down DSGE models and allowed a much richer role for expectations in policy analysis.
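The link between the two rates can be written, under the expectations theory of the term structure described here, as the long rate being approximately an average of current and expected short rates; a term premium may be added in practice.

$$
(1 + R_t) = \left[\prod_{j=0}^{T-1}\left(1 + E_t\, i_{t+j}\right)\right]^{1/T} \quad\Rightarrow\quad R_t \approx \frac{1}{T}\sum_{j=0}^{T-1} E_t\, i_{t+j}
$$

Here $R_t$ is the rate on a $T$-period bond and $i_{t+j}$ the short-term policy rate, so announced or anticipated changes in the monetary rule move long rates, and hence investment, asset prices and debt interest, immediately.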
Prior to the early 1990s, a whole range of intermediate monetary targets had been adopted in the UK, and the Institute model had been used to analyse different scenarios with varying degrees of success. For example, keeping the exchange rate within a target band could be modelled by an appropriate nonlinear control rule but, whilst feasible, turned out to be cumbersome and certainly fragile. The adoption of inflation targeting made the specification of a satisfactory policy easier in the sense that the final target could be achieved by an appropriately designed rule.
Developing policy analysis
In the 1980s, two Institute modellers, Andy Blake and Peter Westaway, had been part of the ‘Stagflation Project’, a group centred on James Meade at Cambridge which had long considered economic policy as a problem that could be addressed with the help of experts in control. Much of that work involved formally designing policies using classical control methods – effectively what an economist would now call instrument rules – culminating in Weale et al. (1989). (Another member of that project, Martin Weale, subsequently became Director of the Institute in 1995.) A key part of such methods was an interactive design process in which policy rules were evaluated with the help of model simulations under a number of different scenarios, against a variety of design criteria such as overall stability. This meant using small approximating models, obtained either as reduced versions of the larger empirical models or as small open economy models that replicated the structure of larger models but were independently constructed.
This was something rather revolutionary in its time, and the marriage of small simulation models and empirical models proved fruitful. In particular, adopting an early form of the four-equation macroeconomic model based on the Fuhrer and Moore (1995) specification of the Phillips Curve produced the insight that small simulation models could effectively capture the salient features of larger empirical ones. There was an important caveat: small and large models behaved in qualitatively similar ways only once model inadequacies had been corrected. Policy experiments on small simulation models – models strikingly similar in their responses to a contemporary DSGE model despite essentially being constructed as small versions of the Institute model – revealed, when replicated on the Institute model, inadequacies that the forecasting process under different policy regimes had not.
For example, when the small model with an appropriate interest rate feedback rule was subjected to an inflation target shock such that the desired inflation rate was lowered, the inflation rate converged quickly to the new target rate. When the same rule was applied to the large empirical model, the inflation rate failed to reach the target and often displayed small but significant offsets. The problem was that some parts of the empirical model implied that a doubling of all prices would not leave the real equilibrium unchanged – static price homogeneity failed to hold. In other simulations this did hold, but a doubling of all inflation rates affected the real equilibrium – dynamic price homogeneity failed to hold. Once both of these features were corrected, policy experiments and impulse responses on the large empirical model replicated those of the small model. Thus long-run monetary neutrality could be imposed on the model with only moderate and wholly data-acceptable tweaking of the parameters. The model thereby embodied long-run properties that the New Classical literature had established as economically justified, whilst at the same time incorporating empirically-driven frictions that motivated short-run stabilisation. This was something of a departure from previous generations of Keynesian models, and whilst reflecting mainly developments by modelling practitioners around the world, at the Institute it was specifically driven by the new monetary policy arrangements in the UK.
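The two homogeneity conditions can be made concrete with a log-linear wage (or price) equation in error-correction form; this is a textbook illustration rather than a particular Institute equation.

$$
\Delta w_t = \sum_i \theta_i\, \Delta p_{t-i} - \mu\Big(w_{t-1} - \textstyle\sum_j \beta_j\, p_{t-1-j} - \gamma' x_{t-1}\Big) + \epsilon_t
$$

Static price homogeneity requires $\sum_j \beta_j = 1$, so that doubling all nominal prices doubles nominal wages and leaves the real equilibrium unchanged; dynamic homogeneity requires $\sum_i \theta_i = 1$, so that a permanent change in the rate of inflation does not shift the equilibrium real wage. Imposing and testing restrictions of this kind across the model's nominal equations is what delivered the long-run monetary neutrality described above.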
Against this background, the Institute was uniquely placed to contribute to the inflation targeting literature of the time, with exhaustive experiments detailed in Blake and Westaway (1994). The Cambridge Stagflation Project had imbued its members with the need for ‘integral control’, which involved keeping at least half an eye on the price level as much as on the inflation rate. This argument was at least temporarily lost after about 2000 when the Taylor rule (Taylor, 1993) became the ubiquitous interest rate rule of choice, and perhaps for good reason: such a rule works surprisingly well in normal times, times that the Institute modellers knew would be transient. One possible characterisation of the difference between the Institute approach to policy developed at this time and that of, say, Taylor rule advocates was a concern for a rule design that worked in the widest possible circumstances rather than for the rule with the highest empirical support. Whilst the model was a necessarily imperfect description of the empirical experience, policy rule design was treated as an entirely normative exercise.
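The difference between the two families of rules can be sketched as follows; the coefficients in the first line are Taylor's original illustrative values, and the price-level term in the second is one simple way of representing ‘integral control’, not a specific Institute rule.

$$
\text{Taylor rule:}\qquad i_t = r^{*} + \pi_t + 0.5\,(\pi_t - \pi^{*}) + 0.5\, y_t
$$

$$
\text{with integral control:}\qquad i_t = r^{*} + \pi_t + \phi_\pi\,(\pi_t - \pi^{*}) + \phi_y\, y_t + \phi_p\,(p_t - p_t^{*})
$$

where $y_t$ is the output gap and $p_t - p_t^{*}$ the deviation of the log price level from a target path growing at the target inflation rate. The extra term means that past inflation misses are clawed back rather than treated as bygones, which tends to make the rule more robust outside normal times at the cost of sharper short-run interest rate movements.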
A major policy-related strand of work was the investigation of the aforementioned inflation targeting framework. Recall the ‘inflation target’ shock, where the policymaker announces a lower (or these days perhaps higher) target value. The astounding (and what would now be called neo-Fisherian) result was that the announcement of a lower inflation target with an effectively designed policy rule meant an instantaneously lower nominal interest rate. Blake and Westaway (1996) argued, in keeping with Hall's pioneering work, that price-setters would need to learn that the policymaker was indeed committed to the new target, and that the policymaker would therefore need to show commitment and gain credibility. This revealed an equilibrium in which the need to demonstrate commitment yielded at least a modest tightening in policy in a small simulation model, and it turned out that the large empirical model mimicked the result.
In a similar vein, Pain, Weale and Young (1997) estimated feedback rules for UK government spending that have continued to be used as an important part of the Institute's analysis of fiscal policy.
A wide variety of policy rules have been implemented on NiGEM and numerous variants are available for policymakers and private-sector users who may wish to design their own interest rate feedbacks, fiscal policy rules and exchange rate arrangements in order to analyse the influence of different policy frameworks.Footnote 5 These frameworks had to take account of the zero lower bound for short-term interest rates, as Japan, one of the core NiGEM countries, reached this bound in the 1990s. This affected both forecasts and policy analyses, and a significant amount of work was undertaken on unconventional monetary policy (later known as QE) in 1998 and 1999, in collaboration with the Japanese National Institute, the Brookings Institution and various official modelling groups in the US and Europe.
The early versions of monetary policy rules included simple Taylor rules and nominal GDP targeting, as described in Barrell, Dury and Hurst (2003) and Barrell et al. (2004). The latter paper built on the work of Barrell and Sefton (1997), which discussed fiscal solvency and the debt stock. The model incorporated various solvency feedback rules for taxes from the early 1990s. In the 1980s many models were not stock-flow consistent and lacked a full description of how government deficits accumulated onto the debt stock, how the current account accumulated onto the foreign asset stock, and how saving and investment affected personal wealth and the capital stock. This stock-flow consistency, together with forward-looking expectations, is essential for the analysis of policies such as the evaluation of post-crisis reductions in VAT undertaken by Barrell and Weale (2009).
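Stock-flow consistency and the solvency rules referred to here can be illustrated by the government's accumulation identity together with a simple tax feedback; the notation and the feedback parameter are illustrative rather than the model's actual calibration.

$$
B_t = (1 + i_t)\,B_{t-1} + G_t - T_t, \qquad b_t \approx \frac{1 + i_t}{(1+\pi_t)(1+g_t)}\, b_{t-1} + d_t
$$

$$
\tau_t = \tau_{t-1} + \phi\,(b_{t-1} - b^{*}), \qquad \phi > 0
$$

where $B$ is the nominal debt stock, $b$ the debt-to-GDP ratio, $d_t$ the primary deficit ratio and $\tau_t$ the direct tax rate, which is raised whenever debt is above its target ratio $b^{*}$, ruling out explosive debt paths in simulations. Analogous identities cumulate the current account onto net foreign assets, and saving and investment onto personal wealth and the capital stock.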
Brexit
In addition to monetary and fiscal policy, the Institute models have been used to quantify the macroeconomic effect of a range of other policies. As an example, in the late 1990s, the Institute used the UK model to estimate the macroeconomic impact of UK withdrawal from the European Union (EU) (published in Pain and Young, 2004).Footnote 6 While there was an active debate at the time about the costs and benefits of EU membership, there was little public discussion of withdrawal from the EU. This meant that the analysis seemed to be purely hypothetical. But it became extremely relevant following the May 2015 general election when the new Conservative government immediately pledged to hold a referendum on the UK's membership of the EU by the end of 2017.
This prompted new model-based analysis of the risks (or rewards) posed by a UK exit from the EU and the balance of those risks. The paper by Ebell, Hurst and Warren (2016) extended previous work with the model, and suggested that in the long run output would be between 3 and 6 per cent lower than it would otherwise have been, with the higher figure taking into account the reduction in competitive pressure that would come from leaving the EU. The May 2016 Review, published six weeks ahead of the referendum, contained a range of analysis of the effects of EU exit, including a commentary on the economic consequences of leaving the EU. This put the long-term adverse effect on GDP of EU exit at about “1.5 to 3.7 per cent”. As was recognised at the time, there was considerable uncertainty about the effects on the UK economy of leaving the EU, not least because it was not clear what trade deals would be negotiated (Armstrong and Portes, 2016). Arguably of more importance than the estimated percentage effect on GDP is that this type of analysis provides a framework for discussion of policies by elucidating the various channels through which they take effect. In this way it can highlight the key factors that determine the response to a policy, how adverse effects might be mitigated, and where further research is needed.
Uncertainty
Another important development was the treatment of risk and uncertainty in both the forecast and policy analysis. A special edition of this Review in 1996 discussed this in detail. In that issue, Blake (1996) argued that the use of stochastic simulation conditioned on a ‘sensible’ monetary policy rule is a consistent way of displaying forecast uncertainty. This model-based analysis was developed essentially simultaneously with the Bank of England's fan chart, but on a somewhat different statistical basis. It suggested a way of communicating uncertainty that is probabilistically grounded in the data, and one that is now in widespread use.
While the effects of policies are uncertain, policy frameworks can be designed in such a way as to reduce uncertainty in the economy. Standard policy analysis, as discussed above, relies on notional innovations and investigates standard patterns of responses. Although this is useful, it does not help us look at uncertainty, as many shocks to many markets can happen, and all estimated equations in a model have a residual element. Evaluations of uncertainty using NiGEM have used whitened historical residuals, applied repeatedly and randomly to the model projections, in order to put forecast uncertainty bounds around those projections. The same bootstrapping technique has been used to evaluate policy regimes and policy rules.
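A minimal sketch of the bootstrapping idea is given below in Python, with a toy AR(1) process standing in for the full structural model; the residual series, parameter values and band percentiles are placeholders rather than anything taken from NiGEM.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_path(last_obs, horizon, shocks, rho=0.7, mean=1.5):
    """Project one path of (say) GDP growth from a toy AR(1) 'model' hit by resampled shocks."""
    path, y = np.empty(horizon), last_obs
    for h in range(horizon):
        y = mean + rho * (y - mean) + shocks[h]
        path[h] = y
    return path

# Placeholder for the whitened historical equation residuals described in the text.
residuals = rng.normal(0.0, 0.5, size=120)

horizon, n_reps, last_obs = 12, 5000, 1.2
paths = np.empty((n_reps, horizon))
for r in range(n_reps):
    draw = rng.choice(residuals, size=horizon, replace=True)  # one bootstrap draw of shocks
    paths[r] = simulate_path(last_obs, horizon, draw)

# Percentiles across replications give fan-chart style uncertainty bands around the projection.
bands = np.percentile(paths, [5, 20, 50, 80, 95], axis=0)
print(np.round(bands, 2))
```

Running the same resampled shocks under alternative policy rules, and recording for instance how often a zero lower bound binds, is the sense in which the technique can also be used to compare policy regimes.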
The uncertainty bounds produced by stochastic evaluation of policy rules are sometimes asymmetric, for instance when there is a possibility of the economy hitting the zero lower bound for short-term interest rates. This asymmetry is shown to have been important for low interest rate Japan around 2000 in Barrell (2001), with much larger negative than positive uncertainty bounds, as monetary policy would be unable to respond in a conventional way to further deflationary shocks once interest rates had fallen to zero. Barrell and Dury (2000a) used stochastic model repetitions to evaluate the impacts on the UK and European economies of the UK joining EMU, and suggested that the reduction in uncertainty associated with exchange rate stability with major trading partners would raise UK GDP because risk premia for investment and for trade would be reduced. Barrell and Dury (2000b) extended this analysis to monetary targeting regimes in general, showing that a price level target would be better than inflation targeting and other regimes as it would reduce long-run uncertainty and hence raise output. Barrell, Hall and Hurst (2006) evaluated the optimality of inflation targeting as against a nominal GDP target, and suggested that stochastic analysis indicated that the latter would reduce uncertainty more than the former. It is also possible to use stochastic simulation analyses to evaluate fiscal targeting regimes. Barrell and Pina (2004) looked at the feasibility of the Stability and Growth Pact in Europe, and suggested that the targets would often be breached without tighter fiscal deficit feedbacks onto taxes. Barrell, Hurst and Mitchell (2007) used similar techniques to look at cyclically adjusted budget balances in Europe in a world with uncertainty similar to that observed in the past.
4. Forecasting performance
It goes without saying that there is considerable uncertainty associated with economic forecasts. As noted earlier, the Institute has provided a regular quarterly forecast of the UK economy since 1959. For many years this was presented as a point forecast for the key variables of interest. In practice, most users would have understood that any forecast has an error range associated with it, but this was not made explicit until 1996 when the Institute started to publish its estimates of the probabilities of different outcomes for GDP growth and inflation. Ironically, it was around this time that the Institute achieved external recognition for the accuracy of its point forecasts. In February 1996, it was awarded the ‘Golden Guru’ award of the Independent newspaper, and in December came top of the Sunday Times table of 45 forecasters for the second successive year – a unique achievement described by that newspaper as “close to a miracle”.
Despite the occasional accuracy of its point forecasts, the Institute has been careful over the years to emphasise that these are prone to error. It has published many articles setting out the errors in its point forecasts, the most recent being Kirby et al. (2015).
The recent performance of point forecasts for UK GDP growth and inflation is shown in figures 1 and 2. Each dashed line shows the three-year-ahead forecast from the time the forecast was made. For the GDP growth forecasts, the top panel shows the forecast alongside the latest estimate of GDP growth available at the time each forecast was made, and the lower panel shows the forecast alongside the latest estimate of GDP growth available now. It can be seen from the lower panel that the starting point is often quite different from the latest estimate of GDP growth, reflecting the tendency for estimated growth to be revised. One of the themes of Institute work over the years has been the need for better measurement of the current state of the economy. The Institute has recently developed a GDP Tracker as a focus for its assessment of current trends, and is working with the Office for National Statistics to improve official estimates.
Figure 1. GDP growth forecasts and different vintages of back data
Figure 2. CPI inflation forecasts and back data
It is clear from recent forecast performance, shown in figures 1 and 2, that forecast errors appear to be due to two types of causes: a large set of individually unimportant factors that affect individual economic relationships (the residuals on the estimated equations) and occasional large-scale events, such as financial crises, whose timing is uncertain. Crises have large effects on the longer-term prospects of the economy, but the best a forecaster can do is give a data-dependent probability forecast of one happening, whilst normally basing the modal forecast on the assumption that it will not. These probabilities change over time and depend on economic factors. Institute forecasts have taken these possibilities into account for some time. The best example, at least in terms of outcomes, is probably the October 1998 forecast around the time of the Asian Financial Crisis. The possible outcomes for the world economy were described as bimodal, with the core forecast being the upper mode, in which the crisis did not spread to the US and European banking systems. The lower mode reflected the impacts of a banking crisis in these countries. This bimodality of possible outcomes is important and recurs every few years. In 2008, for instance, the Institute had been warning for some time that a banking crisis was becoming more likely, but it assumed that the authorities would be able to step in before the situation became too difficult. Unfortunately, they did not do so, and a sketch of the consequences was contained in the October 2008 Review, which was devoted entirely to the causes and consequences of the crisis. The scale of the disaster was significantly underestimated, but the importance of studying rare events and their impacts was emphasised.
Financial crises are endemic to market economies, and became more frequent in the OECD, decade by decade, after the collapse of the Bretton Woods system in the early 1970s. Crises are damaging, and generally leave a permanent scar on output, so if they can be avoided at reasonable cost it is worth doing so. Institute staff had been working on the determinants and effects of financial crises for some years before 2008. This work was of value when the team used the model and other tools to investigate the causes of, and responses to, the 2007–8 crisis in a report for the FSA (Barrell et al., 2009). The analysis linked the effects of crises to the spread between borrowing and lending rates, which had been put in place in the model before the crisis broke in 2007. A crisis raises risk premia, initially by large amounts, and this reduces investment spending and consumption plans. The policy response of raising bank capital requirements also increases the spread between borrowing and lending rates, and hence a cost–benefit analysis can be undertaken as long as it is possible to evaluate the costs of crises and the impacts of increased capital requirements on the probability of them happening. In 2008 this link was not fully understood, and the gap was filled by Institute work.
The work on the causes of crises, summarised in Barrell, Davis, Karim and Liadze (2010), was the first empirical paper to show the importance of capital and liquidity as defences against crises, and emphasised the role of house price bubbles as a cause of crises. It did this by focussing on the causes of crises in financially liberalised OECD countries after 1980 rather than trying to find a common cause across many heterogeneous countries and time periods. The results, which were presented at the BIS in Basel, the ECB and the Bank of England, were influential in the policy debate that followed, and supported the case for strengthening capital buffers.
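The flavour of this crisis-probability work can be conveyed by a small logit sketch; the data below are randomly generated placeholders and the variable names are hypothetical, standing in for the lagged capital, liquidity and house price terms discussed in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400  # country-year observations in a hypothetical OECD panel

data = pd.DataFrame({
    "crisis": rng.binomial(1, 0.05, n),                # 1 = onset of a systemic banking crisis
    "capital_ratio_lag": rng.normal(8.0, 2.0, n),      # lagged capital adequacy, per cent
    "liquidity_ratio_lag": rng.normal(20.0, 5.0, n),   # lagged liquid asset share, per cent
    "house_price_growth_lag": rng.normal(2.0, 5.0, n)  # lagged real house price growth, per cent
})

X = sm.add_constant(data[["capital_ratio_lag", "liquidity_ratio_lag", "house_price_growth_lag"]])
result = sm.Logit(data["crisis"], X).fit(disp=False)
print(result.params)

# Fitted probabilities of this type can feed a cost-benefit calculation: higher capital and
# liquidity ratios lower the predicted crisis probability, to be set against the wider
# lending spreads that higher requirements imply.
print(result.predict(X).mean())
```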
The Institute model proved useful in evaluating policy responses to the crisis, as in Barrell, Fic and Holland (2009) and Barrell, Fic and Liadze (2009), since it allowed researchers to assume that the world in a crisis would display different, and more backward-looking, behaviour than it would in normal times. The existence of an empirically justifiable but flexible description of the world, embedded in a framework such as NiGEM, allowed the Institute, in Barrell and Holland (2010), to contribute rapidly to the debate on completely new measures such as quantitative easing. Policy analysis is at the centre of macro modelling, and models must be maintained to deal with new policy problems.
More recent work by Carreras et al. (2018) has shown how the model may be used to analyse the effects of the introduction of macroprudential instruments.
In recent years forecast uncertainty has been depicted through the publication of often wide ‘fan charts’ for key variables of interest. The aim is to give an indication of the probability distribution of possible outcomes rather than to over-emphasise the point forecasts. But how reliable is this as an indicator of the unreliability of forecasts? The last published analysis of the Institute density forecasts for inflation is Mitchell (2005). This found that for the first ten years of use, the Institute overestimated the degree of forecast uncertainty. This no doubt partly reflects the unusual stability of that period and the lack of large shocks.
5. Conclusion
This article has described some of the developments in macroeconomic modelling at the Institute, focusing mainly on the past thirty years. Looking back over this period there are broadly four main developments that mirror changes in the economics profession more generally.
First, an increase in emphasis on the theoretical underpinning of the behavioural relationships in the model, particularly as regards the supply-side and long-run behaviour of the economy.
Second, an increased focus on the world economy. This has been essential given the increasing openness of the UK economy and the importance of understanding globalisation and its impacts. This has also been crucial in helping to finance the Institute's macroeconomic modelling work.
Third, an increase in focus on policy frameworks rather than specific policy measures. Institute work has shown how changes in the policy framework with regard to monetary, fiscal and macroprudential policy can make a difference to the volatility of the overall economy.
Fourth, there is a greater emphasis on the uncertainty around forecasts and how this can be characterised. There is a danger that focusing on forecast errors and the uncertainty around forecasts can be misconstrued as suggesting that the Institute does not have a clue about the future. That is not true. Many outcomes are possible and to focus on only one, other than as a narrative around the central case, is misleading. By presenting forecasts as a probability distribution, the Institute is setting out the odds of possible futures in a similar way to a bookmaker setting out the odds of a race. The forecast then provides “a scenario or set of scenarios to evaluate, to think about and discuss” (Chadha, 2017). That then provides a platform for thinking about possible futures and how they can be influenced by policy changes. In so doing the Institute remains true to the mission of its founders to “carry out research into the economic and social forces that affect people's lives and to improve the understanding of those forces and the ways in which policy can bring about change”.