
Britain's Fiscal Watchdog: A View from the Kennel

Published online by Cambridge University Press:  29 November 2013



Type
Sessional meetings: papers and abstracts of discussions
Copyright
Copyright © Institute and Faculty of Actuaries 2013 

The full lecture is available at http://www.actuaries.org.uk/.

Mr D. Cribb (CEO of the Institute and Faculty of Actuaries):

I am delighted to be joined by Robert Chote, who will be talking to us on the subject “Britain's Fiscal Watchdog: a View from the Kennel”. Robert is the Chairman of the Office for Budget Responsibility (OBR), a role he took up in October 2010. Prior to this, he was Director of the Institute for Fiscal Studies. He has been an adviser to the management of the International Monetary Fund, and Economics Editor of the Financial Times. He is a member of the Finance Committee of the University of Cambridge and a Governor of the National Institute of Economic and Social Research.

As Chairman of the OBR, Robert is tasked with providing independent economic forecasts as background to the preparation of the UK Budget. This is of great interest to members of the Institute and Faculty of Actuaries, who often use his outputs to help set the assumptions used to put a value on products such as pensions and life policies.

There are also many parallels between the work of the OBR and that of actuaries. At the very least, both involve building forecasts from complex models and rising to the challenge of explaining the outputs to lay audiences. We also both understand the importance of impartiality to our credibility.

We look forward to hearing Robert's insights into such a role at the heart of government.

Mr R. Chote (Chairman of the OBR):

It is a great pleasure to be here today and to have the opportunity to address such a forbiddingly numerate audience. Thank you very much for the invitation.Footnote 1

As an independent fiscal watchdog, the OBR shares an important underlying goal with the actuarial profession, namely the use of rigorous quantitative analysis to help promote better decision-making under conditions of uncertainty. In trying to fulfil this role, we also confront some of the same challenges: among them the need to develop tractable models of a complex and changing world; reliance on statistical data of varying quality; the need to emphasise uncertainty; and a requirement to communicate our findings to users with different needs and technical expertise.

Actuaries have been tackling these challenges for 250 years or so. We have been tackling them for three. What I want to do today is to reflect on our experience over those three years, and to explain why we were created; how we are setting about our tasks; and what difference we might have made to fiscal policymaking in the UK.

2. The spread of independent fiscal watchdogs

Let me start with some international and historical context.

Independent fiscal watchdogs, or ‘fiscal councils’ as they are often referred to, exist to provide authoritative and non-partisan analysis of public sector finances. Some countries have had them for many years: the Netherlands since 1945, Denmark since 1962, the US since 1974 and Belgium since 1989.

The idea gained fresh momentum among academic economists during the mid-1990s. This reflected a desire to emulate in some dimensions of fiscal policy the apparent success of independent central banks in taking the politics out of interest-rate setting and thereby increasing the credibility of governments’ promises to deliver low inflation. The explosion of government deficits and debts that followed the recent financial crisis provided further encouragement, partly because explicit government targets (‘fiscal rules’) had proved insufficient on their own to ensure prudent management of the public finances before the crisis, and partly because governments wanted to boost the credibility of their promises to act virtuously after the crisis. We have seen a wave of fiscal councils created as a result.

Different definitions of what constitutes a fiscal council encompass different institutions. The International Monetary Fund has defined a fiscal council as

“A permanent agency with a statutory or executive mandate to perform independently from partisan influence at least one of the following functions:–

  • to assess the Government's fiscal policies, plans and performance against macroeconomic objectives related to the long-term sustainability of public finances, short-to-medium term macroeconomic stability, and other official objectives;

  • to contribute to the use of unbiased macroeconomic and budgetary forecasts in budget preparation (through forecasting and scoring functions, or by proposing prudent levels of key parameters);

  • to identify sensible fiscal policy options and, possibly, formulate recommendations, and;

  • to facilitate the implementation of fiscal policy rules.”Footnote 2

Based on this definition, the IMF estimates that fiscal councils now operate in 27 countries, with the majority created since the crisis. Most of the recent additions are in the European Union, where the 2012 ‘Fiscal Compact’ treaty recommended that independent bodies monitor compliance with national fiscal rules, while the subsequent ‘two-pack’ agreement mandated Eurozone countries to establish an “independent fiscal body” to produce “independent macroeconomic forecasts” for budget preparation. Beyond Europe, fiscal councils have been created as parliamentary budget offices in the likes of Canada, Kenya and Australia, with a new PBO currently being set up in South Africa. Some countries have created completely new institutions, like the OBR, while others have adapted or augmented the role of existing institutions. In France, for example, the newly created High Council of Public Finance is based in the country's court of auditors (the ‘Cour des comptes’).

So what is the attraction of setting up a fiscal council?

The core analytical argument is that, left to their own devices, democratic governments are prone to what economists call ‘deficit bias’ and ‘pro-cyclicality’ in their management of the public finances. In English, this means that on average they borrow more than they should, and that in particular they spend too much or tax too little when the economy and the public finances both appear to be performing well.

Lars Calmfors and Simon Wren-LewisFootnote 3 have identified six potential sources of deficit bias that might justify the creation of a fiscal watchdog:–

  • First, over-optimism and differential access to information. In their published forecasts, governments may tend to be over-optimistic about the outlook for the public finances. They may be over-optimistic about the outlook for the economy, about the performance of revenues and spending in any given state of the economy, or both. On occasion this may be a cynical attempt to disguise the need for painful fiscal action until after an upcoming election. Ministers may be seduced by their own rhetoric and genuinely mistake a temporary cyclical improvement in the public finances for a permanent improvement arising from their own far-sighted policies. If outsiders express doubts about the forecasts, the Government can argue with justice that it has access to real-time information on spending and revenues that the outsiders lack. A watchdog has less temptation to be over-optimistic and can be given the same information that the government sees.

  • Second, impatience. Ministers may be less forward-looking than voters when they set their tax and spending policies, because they are focused on winning the next election. The voters may be unable to discipline this short-sighted behaviour effectively, because elections are not fought solely over the state of the public finances. A watchdog can focus discipline in this particular area. It can also encourage the electorate to resist irresponsible tax cuts or spending increases by publicising their likely long-term consequences.

  • Third, electoral competition. A government that knows it may be voted out of office at the next election may borrow more than is optimal because it knows there is some probability that the costs of doing so will have to be borne by its opponents. In extremis, a government that actively expects to lose an election may borrow more in order to reduce its opponents’ room for manoeuvre when they are in office. A watchdog can warn against both.

  • Fourth, ‘common pool’ problems. Governments may borrow too much because different powerful groups in society lobby for higher spending and tax cuts without considering the consequences for the public finances as a whole. This is likely to be a particular problem in countries where spending ministries are powerful relative to the finance ministry. A watchdog can strengthen the hand of a virtuous finance minister in resisting such pressure.

  • Fifth, use of fiscal policy for macroeconomic management. If a government uses fiscal policy to help manage the overall amount of spending in the economy, then fiscal watchdogs may help reduce deficit bias in the same way that independent central banks help prevent inflation bias – by helping them to keep their promises to behave responsibly. This does not mean that fiscal watchdogs will always stand in the way of increasing government borrowing. Indeed, if their remit permits they can be uniquely well placed to reassure voters and investors that the use of temporary fiscal stimulus measures to help stabilise the economy need not compromise the long-term sustainability of the public finances. We have seen this happen in Sweden, where the Fiscal Policy Council was more enthusiastic about the use of such measures during the recent crisis than the Government was.

  • Sixth, exploiting future generations. Today's taxpayers may be tempted to finance their consumption of public services by borrowing rather than taxing, because that leaves tomorrow's taxpayers to pick up the bill. A watchdog more sensitive to the verdict of history can act in effect as a representative of future generations (although it is not obvious why a government more concerned about the opinion of the current generation would defer to it).

The role and structure of fiscal councils vary quite a lot from country to country, probably more so than the design of central banks. The differences reflect a number of country-specific factors, among them the relative importance of the different sources of deficit bias; the system of government; the existence of other domestic institutions with a watchdog role; and the importance of supranational scrutiny.

Taking these factors in turn, the distinctive features of the OBR in part reflect:–

  • first, the perception that deficit bias has been exacerbated in the UK by episodes in which ministers have made systematically over-optimistic fiscal forecasts, as well as ‘moving the goalposts’ of their fiscal targets;

  • second, the fact that by international standards the Executive in the UK is powerful relative to Parliament in fiscal policymaking, while the Treasury (the finance ministry) is powerful relative to individual spending ministries;

  • third, the fact that there are already independent bodies – notably the Institute for Fiscal Studies – that regularly scrutinise the UK public finances, but without access to all the information available to ministers, and;

  • fourth, the fact that while the UK is formally subject to supranational fiscal scrutiny by the EU, this process has almost no domestic political impact.

3. The remit of the OBR

So let me now describe the remit of the OBR and how we compare internationally.

The OBR was set up in interim form immediately after the May 2010 general election. (The incoming Coalition Government wanted an independent assessment of the public finances before its ‘emergency Budget’ in June 2010, which increased the size and pace of the fiscal consolidation plan that it inherited from the outgoing Labour government). A three-person interim ‘Budget Responsibility Committee’ (BRC) was appointed, chaired by the former Treasury chief economic adviser Sir Alan Budd and aided by a small secretariat of Treasury officials. They produced a forecast for the public finances shortly before the June Budget, followed by a second forecast on Budget day itself that incorporated the impact of the newly announced measures.

With the June Budget forecasting exercise complete, the permanent structure of the OBR was then put in place. A new Budget Responsibility Committee was appointed in October 2010, consisting of myself, Steve Nickell, a distinguished economist and former member of the Bank of England's Monetary Policy Committee, and Graham Parker, an experienced former Treasury and Inland Revenue fiscal forecaster. An Oversight Board was also created, comprising the three BRC members plus two external members, who in effect function as our Non-Executive Directors. The OBR also agreed an annual budget with the Treasury of £1.75 million each year to 2014–15 (subsequently raised slightly to £1.775 million to cover the VAT on our office rental payments). This pays for a dedicated staff of 18 civil servants, plus the committee. The OBR has complete freedom over its staffing decisions.

The OBR's permanent role and structure was set out in legislation in the Budget Responsibility and National Audit Act, under which we became a statutory body on 4 April 2011. The Act states that the overarching role of the OBR is: “to examine and report on the sustainability of the public finances”.

In practice we have four main tasks set out below.

  • First, we are now responsible for producing the official five-year forecasts for the economy and the public finances that the Chancellor of the Exchequer was required to produce twice each year by the 1975 Industry Act. One accompanies the Chancellor's Budget (usually in March), while the other accompanies his Autumn Statement (usually in late November).

  • Second, we use these five-year forecasts to assess whether the Government has a better than 50 per cent chance of achieving the fiscal targets that it has set itself on its existing policies. There are currently two such targets:–

    • The ‘fiscal mandate’ requires the government to set policy so as to achieve balance or surplus in the structural current budget five years ahead (in other words, to ensure that it will not borrow other than to finance capital investment, after adjusting for the temporary impact of any spare capacity in the economy on revenues and spending). A stylised sketch of this adjustment follows this list.

    • The ‘supplementary target’ requires the ratio of debt to GDP to be falling in 2015–16. This target date does not roll forward.

  • Third, we scrutinise the fiscal cost or saving resulting from each tax and welfare spending measure announced in the Budget or Autumn Statement. The Treasury comes to us with a policy costing (usually prepared by HM Revenue & Customs or the Department for Work and Pensions) and we discuss it with the officials from those departments and suggest amendments. The Treasury publishes its own final estimate of the costing and we say publicly: “yes, we agree”, “no, we don't” or “we were not given sufficient time or information to reach a judgement”. If we agree, that costing is incorporated in our forecast. If we disagree, we incorporate our own costing. In the absence of sufficient time or information to make a judgement, we include the Treasury's costing or one of our own as a provisional estimate and then revisit it in the subsequent forecast.

  • Fourth, we analyse the health of the public sector balance sheet and the long-term sustainability of the public finances. Our analysis of long-term sustainability is based on 50-year projections of revenues, spending and financial transactions, which allow us to capture the influence of demographic and other long-term fiscal influences.
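To illustrate the cyclical adjustment behind the fiscal mandate, a stylised version of the calculation can be written as

$$ \text{structural current balance}_t \;\approx\; \text{headline current balance}_t \;-\; \varepsilon \times \text{output gap}_t , $$

with all terms expressed as shares of GDP, the output gap measured in per cent of potential output, and ε the assumed sensitivity of the current balance to the cycle. The parameter value here is purely illustrative rather than the OBR's published methodology: with ε = 0.5 and an output gap of −3 per cent, a headline current deficit of 4 per cent of GDP would imply a structural current deficit of roughly 4 − 0.5 × 3 = 2.5 per cent of GDP, the part of the deficit that would not disappear as spare capacity is used up.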

Almost all this work is focused on the public finances at a UK-wide level.

The Government has also asked us to forecast the Scottish receipts for four taxes that it plans to devolve to the Scottish Government from April 2015 onwards – namely income tax, stamp duty land tax, landfill tax and the aggregates levy. We have been publishing these forecasts since March 2012, alongside our UK revenue forecasts.

So how does this remit compare with that of our counterparts in other countries?

The first point to make is that like all other existing independent fiscal watchdogs, and in marked contrast to independent central banks, we are not ourselves policymakers. The Government is not compelled to do what we think would be necessary to give itself a better than 50 per cent chance of meeting its fiscal targets, and we certainly do not have any tax or spending instruments under our control.

Beyond that common feature, fiscal council remits differ widely from country to country. By international standards, the OBR has a relatively narrow remit, focusing very much on fiscal rather than broader policy analysis. In contrast, the Danish, Dutch, German, South Korean and Swedish watchdogs (for example) all comment on employment, growth and other structural policies too. The US Congressional Budget Office (CBO) looks at labour markets, employment policy and climate change. The CBO and the Canadian Parliamentary Budget Office also examine particular public services spending streams and projects in areas such as defence and healthcare.

Our mandate is also narrower than that of some of our counterparts in that we confine ourselves to positive analysis rather than making normative policy recommendations. Indeed, Calmfors and Wren-Lewis argue that the OBR is “the most extreme case of positive analysis”, because Parliament has instructed us not even to look at the impact of different policy options, let alone to recommend between them. Rather we are to confine ourselves to analysis of the current policies of the current government. At the other end of the spectrum, the CPB Netherlands Bureau for Economic Policy Analysis formally assesses the economic and fiscal consequences of political parties’ manifestos ahead of general elections. It will be interesting to see if there is any demand for the OBR to look at alternative policies once we have established ourselves more firmly in our current role.

In common with roughly half our counterparts, the OBR produces and publishes macroeconomic forecasts – but in our case purely as an input into our forecasts for the public finances, rather than to draw wider macroeconomic policy conclusions. The Chancellor of the Exchequer gets his macroeconomic policy advice from the Treasury, based on its own work and informed by that of other economists.

Most watchdogs that publish macroeconomic and public finance forecasts do so in order to compare them to their governments’ published forecasts. (The National Assembly Budget Office in South Korea is a good example). Indeed, it can be hard to critique a government's fiscal forecast authoritatively without a reasonably fully articulated alternative forecast of your own.

The UK is unusual in that the Government no longer publishes economic and fiscal forecasts itself. It has in effect ‘contracted out’ the task of producing the official forecasts to the OBR and ministers then have to decide how to respond to them. This requires the OBR to interact more closely with ministers and officials as they make their fiscal policy decisions than most other fiscal councils have to do.

4. Independence and accountability

The whole point of creating an independent fiscal watchdog is for it to provide (and to be seen to provide) analysis that is based on professional judgement rather than politically motivated wishful thinking. Achieving and demonstrating independence from political influence is therefore of paramount importance. (This is especially true for the OBR, because of the behind-the-scenes working relationship with ministers and officials that is necessitated by our remit). Independence has to be accompanied by accountability. Fiscal watchdogs may not have formal policymaking powers, but they are publicly funded bodies that can exert significant influence over important policy decisions. So they need to be accountable for their actions.

Independence and accountability can be underpinned by ensuring that watchdogs have clear formal rights and responsibilities, set down in legislation and in other published agreements with their governments and parliaments. But, as I shall discuss in a moment, just as important (and probably much more important) is how the watchdog behaves in practice and, in particular, how transparently it operates.

The OBR's formal rights and responsibilities are set out in three documents:–

  • the Budget Responsibility and National Audit Act, our framing legislation;

  • the Charter for Budget Responsibility, a document required by the Act, in which the Government sets out its approach to fiscal policy, and;

  • a Memorandum of Understanding (MOU) between the OBR and those government departments with which it interacts most, namely HM Treasury, HM Revenue and Customs and the Department for Work and Pensions.

These documents, all on our websiteFootnote 4, set out some things we have to do, some things we cannot do, and an overarching duty that we are free to fulfil as we see fit, subject to these ‘do's and don'ts’ and to some broad principles of good behaviour.

The Act sets out the overarching duty of the OBR to “examine and report on the sustainability of the public finances”, but also identifies specific duties to:–

  • prepare two sets of fiscal and economic forecasts each financial year;

  • assess the extent to which the fiscal mandate is likely to be achieved;

  • assess the accuracy of our previous forecasts; and

  • analyse the sustainability of the public finances.

The Act gives us “complete discretion in the performance of [our] duty”, as long as we perform it “objectively, transparently and impartially” and as long as we take into account current government policies but not alternative policies. (It should be said that defining ‘current government policies’ is not always as straightforward as it sounds, particularly when making projections over five decades rather than five years). The Act also requires us to operate “efficiently and cost-effectively”.

The Charter says that our independence includes complete discretion to decide:–

  • the methodology underpinning our forecasts, assessments and analyses;

  • the judgments we make in producing these outputs;

  • the content of our publications; and

  • our work programme of research and additional analysis.

The Charter specifies some material that we have to include in our forecasts and gives the Chancellor the right to determine the length of the forecast horizon (subject to a minimum of five years). It states that the Government remains responsible for policy decisions and costings and says that we “should not provide normative commentary on the particular merits of government policies”, tempting though that sometimes is.

Importantly, the Act gives the OBR “right of access (at any reasonable time) to all Government information which it may reasonably require for the performance of its duty”. We are “entitled to require from any person holding or accountable for any government information any assistance or explanation which the Office reasonably thinks necessary for that purpose”. Any disputes over access to information or assistance would be discussed and hopefully resolved by the Forecast Liaison Group, which is chaired by the OBR and which includes representatives of the four MOU signatories. The MOU states that: “Where it is not possible to reach agreement, issues may be escalated to the Chair of the OBR and Permanent Secretaries as appropriate”. If we were still being denied information or assistance that I believed we needed to do our job properly then I would simply make those concerns public.

A fiscal watchdog's effectiveness can be undermined by restricting the resources it has available to it, as well as the information it has access to. The Canadian Parliamentary Budget Office (PBO) had its budget reduced in 2009–10 after it released controversial reports on the costs of Canada's engagement in Afghanistan and the economic and fiscal outlook. After two years in existence the Hungarian Fiscal Council was stripped of its secretariat and transformed into a more toothless body after criticising its government's Budget for over-optimistic assumptions and lack of transparency. So no fiscal watchdog can afford to be complacent. The OBR's budget is formally part of the Treasury's (although separately identified within it) and so we could be subjected to a similar squeeze. Our main protection is to have our budget fixed several years in advance, and published, which would make it hard to exert financial pressure on a year-by-year basis.

Appointments are another potential pressure point. In Belgium, the High Council of Finance was left in limbo for several years, unable to make policy recommendations, after the Finance Minister refused to renew the mandate of existing council members or appoint new ones. In Canada the future of the PBO is currently in some doubt after the government failed to appoint a permanent successor to the outgoing Parliamentary Budget Officer before the end of his term of office.

The Canadian and Hungarian episodes demonstrate that making a fiscal watchdog formally responsible to Parliament, rather than to the Executive, does not necessarily protect it from political pressure. (On the IMF definition, fiscal councils now split roughly evenly between parliamentary budget offices, bodies attached to the Executive, and stand-alone institutions). The OBR is a ‘non-departmental public body’ under the aegis of the Treasury, but we are also formally accountable to Parliament via the Treasury Select Committee (TSC) of the House of Commons.

This accountability to Parliament has a number of elements.

  • First, the four flagship publications we are required to publish each year all have to be laid formally before Parliament, which means in practice that we can only publish them when Parliament is sitting and able to discuss them.

  • Second, the TSC can call us to give evidence on our work at any time, and it routinely does so shortly before it quizzes the Chancellor and his officials after each Budget and Autumn Statement. (We also now give regular evidence to the Finance Committee of the Scottish Parliament).

  • Third, the TSC has a role in determining the membership of the Budget Responsibility Committee. When a new member has to be appointed, the Chancellor of the Exchequer names his preferred candidate following a formal application and interview process run by the civil service. The TSC then holds confirmation hearings with the candidate and can veto the Chancellor's choice. (This is in contrast to appointments to the Bank of England's Monetary Policy Committee. For these the TSC holds hearings with new members, and expresses views on their suitability, but cannot formally veto them). The TSC also has a veto over any attempt by the Chancellor to sack a BRC member before the end of his or her term of office.

It should be noted that the Chairman of the TSC and the majority of its members come from the government parties. This has led some observers to question how reliable a guardian of our independence it can be. That said, the TSC is widely recognised by political observers to behave independently from the Executive.

At the end of the day the OBR needs to be accountable and responsive to the Chancellor and to parliamentarians, but to remain independent of both. Our ultimate responsibility is to the general public and we owe them our best judgement, whether politicians like what we say or not. It is nonetheless essential for the OBR's legitimacy that we enjoy broad support across the political spectrum. It is therefore welcome that there was cross party support for the Budget Responsibility and National Audit Act and that the Opposition supports our continued existence.

Striking a balance between accountability and independence will not always be easy. The external members of our Oversight Board can play a valuable role here. Their main task is to help ensure that we fulfil our duty in line with the principles of good behaviour set out in the Act: they formally report on our success or failure in doing so each year in our Annual Report. They also offer a second opinion and a channel of communication if we in the OBR are ever particularly concerned about the behaviour of the politicians, or if the politicians are concerned about ours.

Formal safeguards for independence have come to be seen as increasingly important as fiscal watchdogs have increased in number. Around half of all fiscal councils now have legal provisions for their independence, but this proportion rises to three-quarters for those established since 2005. The oldest fiscal council, in the Netherlands, had its limited formal independence bolstered by a decree in 2012.

Existing and putative fiscal councils will also be able to draw strength from the Principles for Independent Fiscal Institutions Footnote 5 being drawn up under the auspices of the Organisation for Economic Cooperation and Development (OECD). The current draft contains 23 principles covering local ownership, independence and non-partisanship, mandate, resources, relationship with the legislature, access to information, transparency, communication, and external evaluation. The principles should help fiscal councils resist attacks on their independence by highlighting any deviation from international best practice. At the same time the principles should also serve to encourage high standards in councils’ own behaviour.

Direct comparisons are hard to make, but I suspect that the OBR's formal governance and accountability arrangements offer pretty strong underpinning for our independence when judged by international standards. Formal safeguards, where they exist, are only helpful up to a point. Ultimately fiscal watchdogs have to rely on the informal independence and legitimacy that they develop by building a reputation for transparency and for impartial and rigorously underpinned analysis.

5. Have we made a difference?

I have described the role and structure of the OBR, and the formal foundations for our independence and accountability. Now let me turn to the way in which the OBR operates in practice and whether we have made any difference to fiscal policymaking in the UK. I will focus on three issues: the fiscal policy process, the transparency of official fiscal analysis and the performance of official fiscal forecasts.

The fiscal policy process

The creation of the OBR has made a significant difference to the way in which Budgets and Autumn Statements to Parliament are prepared, presented and pronounced upon. By introducing us into the process, the Chancellor has certainly sacrificed some of his predecessors’ flexibility, as well as exposing himself to greater uncertainty: the underlying fiscal forecast is no longer under his control and every tax and spending measure he intends to announce has to be subjected to our scrutiny. At the same time I hope that we have injected more rigour into the process and increased people's confidence in the integrity of the published figures and analysis, whether or not they agree with the precise forecasts and costings.

Let me explain how the process now works.

  • The Charter for Budget Responsibility requires the Chancellor, under normal circumstances, to give the OBR at least 10 weeks’ notice of a Budget or an Autumn Statement. Once the date is set, the OBR and the Treasury agree a timetable according to which we will exchange the information necessary for the OBR to produce the economic and fiscal forecasts and for the Treasury to produce its ‘scorecard’ of costings for the policy measures.

  • The OBR begins by preparing a first-round economic forecast, drawing on the economic data released since the previous forecast and some preliminary judgements on the outlook for the economy. Using determinants derived from this forecast (such as the outlook for wages, profits, consumer spending, unemployment and inflation), the OBR then commissions forecasts for each individual tax and spending stream from HM Revenue and Customs, the Department for Work and Pensions and other departments. These are then collated and used to generate forecasts for overall spending and receipts and for the various measures of public sector borrowing and debt.

  • The results of the first-round economic and fiscal forecasts are then sent to the Chancellor, together with the OBR's initial assessment of the margin by which it believes the Government is likely to hit or miss its fiscal targets in the absence of new policy measures. For the March 2013 Budget the Chancellor received this ‘pre-measures’ forecast about six weeks before his statement.

  • The economic and fiscal forecast then undergoes two further iterations, each incorporating new data plus further judgements on the economy and on the detail of the tax and revenue forecasts. The OBR then sends the Chancellor a final ‘pre-measures’ economic and fiscal forecast and an accompanying assessment of performance against his fiscal targets. This provides a stable base from which the Chancellor can take his final policy decisions, knowing what he needs to do to meet or miss his formal fiscal targets or other objectives with whatever margin he deems appropriate. In March 2013 the Chancellor received this forecast about two weeks before his statement.

  • At the same time as working on the forecasts, the OBR begins scrutinising the tax and welfare spending measures that the Chancellor is thinking of including in the statement. First the Treasury shows us a draft ‘scorecard’, an initial list of possible measures. We then discuss the scrutiny we think each measure would require with the Treasury and the responsible department (usually HMRC or DWP), based on its complexity and similarity to previous measures. The department will then send a ‘costing note’ to the OBR, setting out the details of the policy and estimating the amount of money it will raise or cost in each year of the forecast. The OBR discusses the analysis with the department and the Treasury, suggesting changes and iterating until we are happy to endorse the estimates as ‘reasonable and central’ or until the Treasury and we agree to disagree (which has not happened yet). In the case of tax measures, these discussions focus on identifying the relevant tax base and judging the potential behavioural impact of the measure from the experience of similar measures or from estimates of relevant elasticities. (A stylised sketch of such a behavioural adjustment appears after this list.)

  • At the outset of the forecast process, the OBR and the Treasury agree deadlines by which the OBR must be told of a proposed policy measure if it is to guarantee: first, to include its impact in the final post-measures economic forecast, and; second, to reach a judgement on the scorecard costing. In March 2013, the deadline to guarantee inclusion in the economic forecast was nine days before the statement.

  • The OBR does not scrutinise the spending of individual Whitehall departments on public services and administration, such as schools and hospitals. Instead it takes a judgement on the extent to which departments in aggregate will over or under spend the ‘Departmental Expenditure Limits’ (DELs) set for them by the Treasury. For those years of the forecast for which DELs have not yet been set, the Government currently announces a target for overall public spending growth. Combined with the OBR's bottom-up forecasts for welfare, debt interest, locally-financed and other so-called ‘Annually Managed Expenditure’, this target yields an implied envelope for DELs in those years beyond the Spending Review, which we publish.

  • During the week before the statement, the OBR prepares its final economic and fiscal forecasts for publication, by amending the final pre-measures forecast to reflect the economic and fiscal impact of the scorecard package. The final scorecard can look very different from the first draft; some measures drop off as the statement draws closer, while others are added on.

  • Even for measures that remain on the scorecard throughout, details such as the precise changes in tax allowances and rates may be refined during and after the scrutiny process. Minor changes can be incorporated into the forecast after the deadlines referred to above, with the OBR typically closing the final post-measures forecasts on the Friday prior to the statement. This allows the Treasury to fine-tune the measures to achieve the ‘bottom line’ of net giveaways and takeaways that it wants in each year.

  • On the day of the statement itself, the OBR publishes the final post-measures forecasts in the Economic and Fiscal Outlook (EFO), along with an explanation of the impact that the policy measures have had on the forecasts and on the Government's performance against its fiscal targets. The Treasury publishes its final scorecard costings, alongside other documentation on the economy and its policy decisions. The Chancellor gives his own summary of our forecasts, and the impact that they have had on his policy decisions, when he makes his statement to Parliament. We then hold a press conference to take people through the forecasts in detail and to answer any questions.
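To make the behavioural element of those tax costings concrete, here is a minimal sketch of a costing with a simple taxable-income elasticity adjustment. The tax base, rates, elasticity and function names are all hypothetical illustrations rather than actual HMRC or OBR figures or methodology:

    # Stylised costing of a hypothetical 1p rise in a tax rate, with a simple
    # behavioural adjustment. All figures are illustrative only.

    def static_yield(tax_base, rate_change):
        """Mechanical yield assuming taxpayers do not change their behaviour."""
        return tax_base * rate_change

    def behavioural_offset(tax_base, old_rate, new_rate, elasticity):
        """Receipts lost because the taxable base shrinks as the net-of-tax
        rate falls (a stylised elasticity-of-taxable-income adjustment)."""
        change_in_net_of_tax_rate = ((1 - new_rate) - (1 - old_rate)) / (1 - old_rate)
        base_shrinkage = elasticity * change_in_net_of_tax_rate * tax_base
        return base_shrinkage * new_rate  # receipts lost on the smaller base

    tax_base = 100.0              # £bn of taxable income (hypothetical)
    old_rate, new_rate = 0.40, 0.41
    elasticity = 0.45             # hypothetical elasticity of taxable income

    static = static_yield(tax_base, new_rate - old_rate)
    offset = behavioural_offset(tax_base, old_rate, new_rate, elasticity)
    print(f"Static yield:            £{static:.2f}bn")
    print(f"Behavioural offset:      £{offset:.2f}bn")
    print(f"Post-behavioural yield:  £{static + offset:.2f}bn")

In practice, the challenge and scrutiny discussions described above are about whether the equivalents of these assumptions, the size of the relevant tax base and the behavioural response, are reasonable and central.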

One consequence of having a formal timetable and external scrutiny by the OBR is that it is now much harder for Chancellors to add or drop measures from the package just as the documents are about to be printed, as sometimes happened in the past. Coalition government also makes last minute changes more difficult, as the key decisions have to be agreed by both governing parties. The TSC has in the past expressed concern that requiring the Chancellor to notify us of policy decisions by particular deadlines might stand in the way of ‘late political decisions’. In extremis, the Chancellor could always announce a measure without our scrutiny, leaving us to assess its impact in the subsequent forecast. In any event, most observers would not regard ‘late political decisions’ as one of the historic strengths of British Budget making.

To underline our independence, we endeavour to be as transparent as we can about our interactions with the Government during the forecast and costing process. All substantive meetings with the Chancellor, his special advisers and private office are logged on our website. We set out the dates on which we submit the draft forecasts to the Chancellor and when he and his officials receive the final publication, so that they can prepare his statement. We publish the deadlines by which we have to be informed of policy measures and we list any measures that were revealed to us after these deadlines and say how we have dealt with them.

Notwithstanding this transparency, some observers question whether we can ever be truly independent, given our behind-the-scenes interactions with officials and ministers. As I have argued, it is these interactions that allow us to add value to the work already being done by the IFS and others in scrutinising the public finances on the basis of publicly available information. It is also important to realise that our engagement with Government is not an adversarial bilateral stand-off with ‘us’ on one side of the table and ‘them’ on the other. For example, we deal with multiple departments and with both officials and their political masters, and their objectives and incentives are not necessarily the same. In the numerous trilateral challenge and scrutiny meetings at which we have discussed policy measures or parts of the fiscal forecast, the Treasury and HMRC or DWP have never arrived with a predetermined “line to take” that they then proceed to defend against us. There has always been a genuine desire by everyone to probe what the data and analysis has to tell us.

Needless to say, all these departments have far greater resources than we do, and some observers wonder whether we can avoid having the wool pulled over our eyes as a result. I do not think that this is too much of a risk if we maintain the right mix of skills: a combination of inside experience and outside perspective. This is one reason why taking some staff and BRC members from the ranks of former Treasury, HMRC and DWP officials is a strength rather than a weakness. They are well equipped to challenge their former departments and typically relish the opportunity too.

So what about outcomes? To date we have prepared EFO forecasts for six ‘non-emergency’ Budgets and Autumn Statements, and on three of those occasions our ‘pre-measures’ forecast has suggested that the Chancellor is more likely than not to breach one of his fiscal rules in the absence of additional policy action.

  • In our November 2011 forecast, we showed him on course to breach the fiscal mandate, primarily because we had revised down our forecasts for potential GDP and had thereby increased our estimate of the ‘structural’ budget deficit i.e. that element which would not disappear automatically as the economy returned to full strength. On that occasion the Chancellor chose to extend the expected period of public spending restraint by an additional year so as to bring himself back on course. (This was facilitated by the fact that the target date for the mandate rolls forward one year each year).

  • In December 2012 and March 2013 our ‘pre-measures’ forecasts showed the Chancellor on course to breach his supplementary target to reduce Public Sector Net Debt as a share of GDP in 2015–16, primarily because we had lowered our medium-term projections for nominal GDP growth, which reduces cash receipts and pushes up public spending as a share of GDP. On both occasions the Chancellor accepted the forecasts, but concluded that it would be better to expect to breach the rule than to announce an additional near-term fiscal tightening that would weaken economic growth.

What would have happened if a Chancellor had still been responsible for these forecasts, as well as for the policy decisions? Would the outcome have been the same, or might a more convenient set of fiscal projections have downplayed the threat of a breach? It is for others to speculate. Either way, these episodes underline three important features of the new arrangements. First, that the OBR is not afraid to tell the Chancellor that it expects him to breach his formal targets, if that is its judgement. Second, that the decision about what to do in response rightly remains in the hands of elected politicians; and third, that that decision can go either way.

Tax and spending decisions may of course be influenced by objectives other than the formal targets. In the March 2013 Budget the Chancellor chose to squeeze spending by Whitehall departments at the end of the fiscal year, and to push some into future years. This ensured that our published forecast showed the most widely-watched measure of the budget deficit falling by £0.1 billion between 2011–12 and 2012–13, rather than rising as it would otherwise have done.

Needless to say, a forecast change in the deficit of £0.1 billion is fiscally and statistically insignificant, dwarfed by the likely revisions to the outturn data and by the average error in forecasting the budget deficit even at this short time horizon. In the old days the deficit forecast could probably have been kept on a downward path simply by tweaking the pre-measures forecast: it is good for the integrity of the process that this is no longer possible. In the absence of that option, and given the uncertainty surrounding the forecast and the outturn data, is it sensible to use policy measures to fine-tune changes in the budget deficit forecast to such a degree? That, in the final analysis, is a decision for the Chancellor to take.

It is possible that Chancellors may someday justify their actions on the grounds that they simply disagree with the OBR's forecast. They would be entirely within their rights to do so and that need not undermine the new arrangements. It would be recognition of the huge uncertainty that lies around all such forecasts.

The transparency of official fiscal analysis

Having previously spent almost 20 years diligently scrutinising Budget and Autumn Statement documents (first as a journalist and then at the Institute for Fiscal Studies), I was convinced when I joined the OBR that the biggest contribution we could make to the quality of fiscal policy and debate was to increase significantly the transparency of official fiscal forecasts and analysis. I hope we have done so.

Being transparent in the analysis we publish is important for several reasons.

  • First, and most obviously, we have been created “to examine and report on the sustainability of the public finances”. Transparency is essential to do that effectively. Fiscal forecasting (unlike most macroeconomic forecasting) is a highly disaggregated exercise. It involves quantifying numerous flows of spending, revenues and financial transactions, each of which can be affected by a wide range of economic and non-economic determinants, and for which the accounting treatment can sometimes be complex and counterintuitive. This requires a lot of information and analytical tools that are unavailable or obscure to private sector or academic economists, or where the benefits for them of getting on top of that information or developing those tools would not outweigh the costs. Even dedicated institutions like the Institute for Fiscal Studies only have the resources, information and analytical tools to look in detail at a subset of the public finances. Uniquely, we have to undertake a comprehensive bottom-up analysis, and Parliament has given us the resources and the access to information and analytical expertise that is necessary to do so. It is therefore incumbent on us to paint as comprehensive and comprehensible a picture as possible, levelling the playing field as best we can for everyone who wants to understand the public finances.

  • Second, transparency is necessary to build and maintain trust. We were created because of the long-standing perception that official fiscal forecasts and projections were too often shaped by political considerations rather than by professional judgement. As members of an independent and non-partisan body, that I hope will survive to work alongside future governments of every political colour, my BRC colleagues and I do not face the same temptations that ministers do when coming up with our forecasts and analysis. We cannot expect people to take what we say on trust simply because of who we are. We need to “show our working” as best we can, so that people can satisfy themselves that our analysis is indeed based on professional judgement, even if they disagree with the conclusions that we reach.

  • Third, and more parochially, transparency is a valuable source of self-discipline. By setting out our central forecasts, and the reasons for the changes in those forecasts, in considerable quantitative detail, we avoid the temptation to make analytically dubious but presentationally convenient judgements, safe in the knowledge that no one will ever spot them.

One potential downside of presenting forecasts and projections in detail is that this can create a spurious sense of precision and thus de-emphasise the enormous uncertainty that lies around any point forecast for the public finances. (This is why the Bank of England has historically published less quantitative detail of its macroeconomic forecasts than we do). Given the particular need we have to “show our working”, this quantitative detail is essential in our outputs. We have made sure that we accompany it with a much more extensive discussion of uncertainty than was provided in the pre-OBR era. In the case of our medium term fiscal forecasts, we illustrate and quantify uncertainty in three ways.

  • First, we show the probability distribution around our central forecasts for the headline and target measures of the budget deficit that would be implied by the size and distribution of past official forecast errors. This allows us to give the percentage probability that the Government will meet the fiscal mandate if our forecasts are expected to be as accurate as past official ones. (A stylised sketch of this calculation appears after this list.)

  • Second, we show how sensitive the target fiscal variables are to different values for some of the key parameters in our economic forecast: notably the pace of economic growth; the amount of spare capacity in the economy; and the interest rate that the Government has to pay on its borrowing. This should help people with a different central forecast for the economy see how significant we think that difference would be in determining the Government's chances of hitting its targets.

  • Third, we use scenario analysis to highlight the impact of other important economic judgements in the forecast. These have included different scenarios for oil prices, the exchange rate, the behaviour of real wages and the outlook for the Eurozone. In choosing our scenarios, we try to address issues that are the subject of debate in the wider forecasting community, but which are not captured sufficiently in our sensitivity analysis.
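As a stylised illustration of the first of these three approaches, the probability of meeting a target can be read off a distribution fitted to past forecast errors. The central forecast, the error dispersion and the normality assumption below are all hypothetical; the published fan charts draw on the actual distribution of past official errors rather than this simplified calculation.

    # Stylised fan-chart calculation: the chance that the structural current
    # budget is in balance or surplus at the target horizon, assuming future
    # forecast errors resemble past ones. All numbers are hypothetical.
    from statistics import NormalDist

    central_forecast = 0.8   # forecast structural current surplus, % of GDP
    error_sd = 1.5           # standard deviation of past errors at this horizon

    outturn_dist = NormalDist(mu=central_forecast, sigma=error_sd)
    prob_meeting_mandate = 1 - outturn_dist.cdf(0)   # probability outturn >= 0
    print(f"Chance of balance or surplus: {prob_meeting_mandate:.0%}")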

Consistent with our desire for transparency, and our recognition of the importance of uncertainty, once a year we look back and systematically compare our economic and fiscal forecasts with the subsequent outturns and try to explain the inevitable differences as best we can. In October 2012, for example, we focused on trying to explain why the budget deficit had shrunk very much as expected over the previous two fiscal years, even though the real economy had performed much less strongly than we forecast in June 2010. Evaluating forecast performance regularly and publicly is important for several reasons: as an exercise in accountability, as a means of improving future forecast performance, and as a way of helping people better understand the forecasting process and its limitations.

Beyond these examples, there are numerous other ways in which we have tried to increase the transparency of official fiscal analysis since the OBR was created. In our medium-term forecast publications, for example:

  • we publish a much more comprehensive range of forecast outputs and assumptions, including in response to requests for hitherto unpublished data;

  • we provide diagnostics for changes in key revenue forecasts, separating the impact of methodological changes, economic determinants and other factors;

  • we present more detailed quantitative explanations for the changes in forecasts for tax credits, social security and public sector pension spending;

  • we provide more detailed information on the within-year spending agreements reached between the Treasury and Whitehall departments;

  • we provide more detail on financial transactions, which affect Public Sector Net Debt without affecting Public Sector Net Borrowing;

  • we highlight the impact of policy changes and statistical classification decisions that have a temporary impact on key public finance measures;

  • we present forecasts for receipts on an accruals rather than a cash basis;

  • we have reduced the size and number of obscure ‘National Accounts adjustments’ that appear in the forecasts;

  • we compare our forecasts and key assumptions with those of external forecasters more systematically; and

  • we have published a series of briefing papers explaining key parts of our forecast methodology and also hope later this year to publish full documentation for the core macroeconomic model we use.

We also publish a monthly commentary on the public finance outturn statistics, to help people interpret the latest data in light of our most recent forecasts. For example, we highlight special factors that affect the monthly profile of revenue and spending flows. These special factors often mean that a linear extrapolation of trends over the year to date offers a misleading guide to the likely full-year outcome.
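A minimal sketch of that extrapolation point, using invented monthly receipts with a January spike of the kind produced by self-assessment income tax (the figures are purely illustrative):

    # Hypothetical monthly receipts (£bn), April to March, with a January
    # spike from self-assessment payments. All figures are invented.
    receipts = [40, 41, 42, 41, 40, 42, 43, 42, 41, 60, 44, 43]

    year_to_date = receipts[:6]                 # April to September outturns
    naive_full_year = sum(year_to_date) * 12 / 6
    actual_full_year = sum(receipts)

    print(f"Naive linear extrapolation: £{naive_full_year:.0f}bn")
    print(f"Actual full-year total:     £{actual_full_year:.0f}bn")
    # The extrapolation misses the seasonal spike and understates the full-year
    # total, which is why in-year trends need profile-aware interpretation.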

We have also tried to increase the scope and transparency of the long-term fiscal analysis we produce, in our Fiscal sustainability reports. These build on the Long-term public finance reports published by the previous government.

One important advance is that we now integrate balance sheet analysis with flow analysis, helped by the publication since 2011 of Whole of Government Accounts (WGA) for the UK. The WGA are compiled on commercial accounting principles and thus cover a wider range of assets and liabilities than the conventional National Accounts measures. The WGA also provide information on provisions and contingent liabilities, which help shed light on the risks to our central fiscal forecasts and projections. (We pay particular attention to the contingent liabilities reported by HMRC, which highlight risks to future tax receipts). Balance sheet analysis provides a useful snapshot of the fiscal impact of past government decisions, but it has its limitations as a guide to fiscal sustainability. One obvious limitation is that the WGA balance sheet, like the National Accounts balance sheet, does not incorporate the public sector's most valuable financial asset, which is the present value of future tax receipts. As a consequence these measures routinely show large net liability or negative net asset positions, which overstate the fragility of the public finances. Another limitation is that the size of some of the highest-profile liabilities on the WGA balance sheet, most notably the present value of future public service pension payments, depends crucially on the discount rate used to convert the projected flows into a one-off up-front sum, and this discount rate can move quite sharply from year to year.

For these reasons, and because of their more forward-looking nature, we focus more on flow projections when assessing fiscal sustainability. In doing so, we make 50-year projections for receipts, spending and significant financial transactions (notably the granting and repayment of student loans), based on what we regard as the most sensible definition of ‘unchanged’ government policy. This highlights in particular the impact of the projected ageing of the population. We have also tried to look at a wider range of non-demographic influences, such as health service productivity on the spending side and the impact of technological change, resource depletion and labour market developments on the receipts side. Armed with these central projections, we can then make projections of public sector indebtedness and estimate what if anything needs to be done to return the public finances to a sustainable position, for example, stabilising the debt-to-GDP ratio at particular levels. As with our medium term projections, we test the sensitivity of these central projections to different values for key parameters, such as the speed and extent of population ageing, morbidity, net migration and productivity growth.
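The arithmetic behind that final step can be sketched with the standard debt-dynamics identity, which is not specific to the OBR but underlies most such sustainability calculations. The interest rate, growth rate and target debt ratio below are illustrative assumptions only:

$$ d_{t+1} \;=\; \frac{1+r}{1+g}\, d_t \;-\; pb_t , \qquad pb^{*} \;=\; \frac{r-g}{1+g}\, d^{*} , $$

where d is the ratio of debt to GDP, pb the primary budget balance as a share of GDP, r the effective nominal interest rate on government debt and g the growth rate of nominal GDP. The second expression gives the primary balance needed to hold debt constant at a chosen ratio d*. For example, with an illustrative r of 4 per cent, g of 3.5 per cent and a target debt ratio of 75 per cent of GDP, a primary surplus of roughly (0.04 − 0.035)/1.035 × 75 ≈ 0.4 per cent of GDP would stabilise debt at that level; anything less would leave the ratio rising.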

Returning briefly to our medium term forecasts, the Treasury has also helped to improve transparency by publishing ‘policy costings documents’ at the time of the Budgets and Autumn Statements. These set out the analytical and empirical basis for the costing of each tax and spending measure, including a description of the tax base and an assessment of any assumptions about behavioural effects. They reflect the conclusions of the challenge and scrutiny discussions that we engage in with the relevant departments and they include an OBR annex in which we highlight particular uncertainties and risks to the estimates.

Needless to say, providing additional information is not the be-all and end-all of promoting transparency. This can even be counterproductive if you simply bury people in a blizzard of detail. So we also try to provide user-friendly summaries of our work, distilling the main themes and conclusions for the less specialist consumer.

The performance of official fiscal forecasts

The most distinctive feature of the OBR's remit is that we now produce the official economic and fiscal forecasts that Chancellors of the Exchequer were previously required by law to publish. Historically Chancellors have been ambivalent about this obligation: on the one hand most did not like publishing forecasts, because they were bound to be criticised when the forecasts inevitably turned out to be wrong; but on the other hand, they took great interest in what the forecasts said, because they wanted them to demonstrate that their policies were (or would be) a success.

As an independent body, thankfully we do not face the same pressure to publish forecasts that imply the success or otherwise of government policy. Like Chancellors in the past, we do find ourselves judged in part on the accuracy of our predictions. However, freed from the political imperative to engage in ‘conviction forecasting’, it is easier for us to be honest about the uncertainties that lie around any forecast and to change them when the facts or our judgements change. I have described already how we try to be transparent in both these respects.

In thinking about the OBR's forecasting role it is important to remember that we are here to assess the outlook for the public finances, where we have access to information and analysis that is not available easily or at all to outsiders. When it comes to macroeconomic forecasting, we have no significant informational advantage over the 38 private sector, academic and international forecasters whose predictions the Treasury collates every month. Given the reputation-sapping nature of short and medium-term macroeconomic forecasting, one obvious question is why we do it at all. Why do we not take the average of the outside forecasts, or those of the Bank of England, and use them to drive our forecast for the public finances?

Unfortunately, producing a highly disaggregated medium-term fiscal forecast requires a different kind of macroeconomic forecast from that produced by most outside bodies. For example, we need a forecast that extends five years (to cover the Government's target horizon), that contains a detailed breakdown of the components of nominal income and expenditure, and that contains estimates of potential GDP and the ‘output gap’ (so that we can assess progress against fiscal targets that are adjusted for the state of the economic cycle). Only six of the 38 forecasters polled by the Treasury each month submit a five-year output gap forecast. The Bank of England forecasts over a three-year horizon and only publishes its forecasts for real GDP and the target measure of CPI inflation. Generating our own economic forecast, informed of course by the views of outside forecasters, also makes it easier for us to undertake sensitivity and scenario analysis and to incorporate the expected impact of newly decided policy measures. It also helps us to ensure that our economic and fiscal forecasts are mutually consistent.

Most public discussion of macroeconomic forecast accuracy focuses on forecasts for real GDP: the volume of goods and services produced in the economy. Our near-term forecasts for GDP growth tend to be near the average of a wide outside range, sometimes above and sometimes below. But the path of real GDP is much less important in explaining the behaviour of the public finances than other variables: nominal GDP (the total value of cash spending in the economy), the components of nominal incomes and spending, average earnings growth, retail and consumer price inflation, labour market developments and interest rates. This was evident from the fact that in 2010–11 and 2011–12 the budget deficit shrank as a share of GDP much as the OBR had forecast in June 2010, even though real GDP growth over this period was far weaker than we and most other forecasters expected. This was largely because nominal GDP and the labour market held up more strongly than we would have expected had we known how weak real GDP growth would turn out to be.

You also have to be wary of focusing too much on real GDP, because the estimates are prone to revision. Nominal GDP estimates are relatively stable after a year or two, but the statisticians can change their minds about the relative contribution of real GDP growth and whole economy inflation to changes in nominal GDP years after the event. This helps explain why the recession of the early 1990s now looks significantly shorter, shallower and smoother than it appeared in the mid-1990s.

Fiscal forecasting is certainly a difficult business, but what can we say about the likely impact, and the impact to date, of transferring the task from Chancellors to the OBR? Increasing the transparency and humility of official forecasts would be reason enough to do this, but can we expect it to make the forecasts more accurate?

If you believe that ministers routinely allow political considerations to infect their forecasts, while fiscal councils always rely purely on dispassionate professional judgement, then on average you would expect the answer to be “yes”. But to demonstrate this conclusively, even after the event, would be impossible. To do so, you would need to compare the accuracy of OBR and ministerial forecasts, each made at the same time and on the basis of the same information, and you would need a very long run of these forecasts so that you could distinguish reliably between the consequences of luck and judgement. We will not be able to do this, as Chancellors no longer publish official forecasts of their own.

We can compare the forecasts that the OBR has made to date with what has happened subsequently, and then see if the average errors are bigger or smaller than those in previous official forecasts. This has obvious limitations as a guide to relative forecast accuracy. Most fundamentally, we are not comparing like with like. For example, we may be looking at periods in which the underlying behaviour of the public finances was inherently more or less predictable, in which the size and distribution of unforeseeable shocks was different, or in which policymakers responded differently when the public finances diverged from expectations. As the OBR has only produced seven forecasts so far, the sample is still very small.

The OBR's absolute errors in predicting Public Sector Net Borrowing as a share of GDP have generally been smaller than the average absolute errors in official forecasts over the previous 20 years. The same is true for our forecasts for receipts and public spending separately (see Annex A). Of course, past performance is no guarantee of future performance, and it is worth remembering that in our forecasts to date we have progressively revised down our estimates of nominal GDP. These revisions have relatively little impact on forecasts for receipts as a share of GDP, as those are in effect forecasts of the average tax rate. But they will make earlier forecasts for public spending as a share of GDP look less accurate over time, as most departmental spending is planned in cash terms, while welfare and debt interest costs are linked more to inflation and interest rates than to nominal GDP. So when nominal GDP is revised down, the error in predicting these categories of spending as a share of GDP will rise even if the errors in cash terms remain small.
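A stylised numerical example, with made-up figures, may make that last point concrete: if cash spending turns out exactly as forecast but nominal GDP is revised down, the forecast of spending as a share of GDP looks worse even though the cash forecast was spot on.

```python
# Illustrative only: how a nominal GDP revision changes the apparent accuracy
# of a spending-to-GDP forecast. All figures are made up for the example.

forecast_spending_cash = 700.0   # £bn of cash spending assumed at forecast time
forecast_nominal_gdp   = 1750.0  # £bn of nominal GDP assumed at forecast time

outturn_spending_cash  = 700.0   # cash spending comes in exactly as forecast
outturn_nominal_gdp    = 1700.0  # but nominal GDP is revised down

forecast_share = forecast_spending_cash / forecast_nominal_gdp   # 40.0% of GDP
outturn_share  = outturn_spending_cash / outturn_nominal_gdp     # about 41.2% of GDP

print(f"cash forecast error: {outturn_spending_cash - forecast_spending_cash:.1f} £bn")
print(f"share-of-GDP forecast error: {(outturn_share - forecast_share) * 100:.1f} percentage points")
# The cash error is zero, yet spending as a share of GDP is around 1.2 points
# higher than forecast, purely because the denominator has moved.
```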

You could compare the OBR's forecasts to those of outside forecasters, but this is more problematic as a guide to relative forecast accuracy for fiscal forecasts than for macroeconomic forecasts. As noted, the OBR has no significant informational advantage over outside forecasters when making macroeconomic forecasts, so there is no reason to expect them to change their views significantly after seeing ours. It follows that there is no reason to believe that the accuracy of outside macroeconomic forecasts would be much affected if the OBR shut up shop. I doubt that the same is true for fiscal forecasts, given the range of non-economic determinants and statistical classification decisions that affect them. Anecdotally, quite a few outside forecasters seem to start with the official fiscal forecast and then pitch their own above or below it, reflecting their views on the economy. I strongly suspect that without an initial official forecast to use as a baseline, the accuracy of most outside fiscal forecasts would be significantly worse.

Before joining the OBR, I spent many years commenting on other people's economic and fiscal forecasts, rather than producing them. From that position it was always tempting to mock the afflicted and even to reject the whole exercise as fundamentally pointless. Difficult it may be; pointless it is not. Tax and spending decisions have effects that persist long into the future and they have to be based on some assessment of how the world is likely to evolve in the absence of those decisions and as a consequence of them. It is good for accountability that those assessments are made explicit and transparent. Forecasts are also a way of summarising in a consistent fashion your imperfect understanding of what is going on in the world and they provide a baseline against which to judge the significance of new information. Forecast errors may be pounced upon, but they are not necessarily mistakes and they are the way we learn about the true state of the world. So our task is not only to produce the best forecasts we can, but also to explain the uncertainties around them and what they can and cannot be expected to achieve.

5. Conclusion

As any of you who have children will know, three can be a difficult age, and we are still enjoying our toddler years. I hope that what I have had to say has gone some way to persuade you that independent fiscal watchdogs can play a valuable role in promoting better policy decisions and more informed public debate.

Different institutions are confronting this challenge in different ways in different countries and we are all learning from each other. We have been very gratified by the response to our work so far from those who take an informed interest in the public finances in this country. We are always keen to know how we could fulfil our role more effectively. This audience is as well informed on many of the challenges that confront us as any we could wish for, so please don't hold back.

Mr Cribb: Thank you, Robert, for a very thorough run-down of the challenges you have in the day-to-day job of treading that line of independence when there are so many interested stakeholders.

Could I have the first question please?

Mr A. G. McLean, F.I.A.: You mentioned the United States has had a body like your own since 1975. That did not seem to stop the United States having the problems that it has had over the last several years.

You could argue that part of that problem was a mistake they made about giving mortgages to neighbourhoods that should not have had them. If you had existed in 1975 do you think we would have escaped the problems we have had for the past decade or two?

Mr Chote: That is a very good question. I think the short answer is “No”. The fiscal councils in effect have been designed to fight the war before last.

What you can reasonably say is that if you had had independent bodies like the OBR in place in the run-up to the financial crisis, you might have had the public finances in stronger shape when it hit. You would have had public sector net debt lower, possibly; you would have had the budget deficit lower.

Arguably, that would have put you in a stronger position to make an aggressive Keynesian response when the crisis hit. But nobody would claim that the existence of a fiscal council would have prevented the financial crisis or the scale of the adjustments that we are confronting now.

Partly, that is because fiscal councils have been created to try to open up the black box that gets you from a view of the economy to a view of how spending and receipts are going to behave. They are not an answer to getting a fundamentally better view of the economy.

We do not have any information on what is going on with growth, inflation, etc., that is not available to everybody else, and it would be dishonest to claim that we would have spotted that coming any more than anybody else did.

In the US, the institution we are talking about is the Congressional Budget Office. In terms of the running of fiscal policy, one important difference between their structure and ours, and the way in which fiscal policy operates more generally, is that in the UK the executive, the government, gets the fiscal policy that it wants through Parliament.

In the United States that is not the case. Clearly, Congress has a much greater influence, and there is much more to-ing and fro-ing and a political process for actually working out what fiscal policy is going to be, whether or not it is right. You can see that in some of the debates about wrestling with the medium-term fiscal challenges there now.

Mr Cribb: If I could just ask a follow-on question: what is the prevalence of fiscal councils in the fast-growth economies, perhaps some of the tiger economies? If they are not there, do you think those countries should be introducing them now, during periods of growth, rather than following the historical pattern of waiting until things hit a downturn?

Mr Chote: It is certainly true that the majority of these are in Europe. The best known of the East Asian ones is the National Assembly Budget Office in Korea which, as I say, is one of the largest, along with the Congressional Budget Office and the Dutch office, and does produce a very impressive set of outputs on this.

I was seeing people from the Japanese Foreign Ministry this morning, who are thinking about the possibility of setting one up. In some of the more emerging economies, where institutions like the IMF play a much more powerful role in providing supra-national scrutiny, particularly if they are borrowing countries, that can create a different set of issues.

The IMF generally tends to go around encouraging people to have these sorts of bodies. But certainly there has been more of it in Europe in particular than in the Far East so far. It is interesting to see the model spreading.

Miss B. J. Illingworth, F.I.A.: You talked a lot about how you try and make your information accessible to the public. I was just wondering what more you think could be done by outside bodies to promote that further. Plainly, you are doing a lot, but you are probably dependent on other people doing their bit as well.

Mr Chote: That is true. It is in everybody's interest that the material which the Office for National Statistics produces across the whole range of economic analysis, and which is the source for a lot of the background material, is as accessible as possible and explained to people as clearly as possible. They are also responsible, together with the Treasury, for the public spending and revenue numbers that are published.

If you have ever tried accessing the website of the Office for National Statistics, you will know that a lot of work has been done, but there is still some to be done.

I mentioned the Whole of Government Accounts. This exercise has been very many years in gestation. It is now providing a new set of information on what is going on in the public sector finances in a way that is much more digestible and comparable for people coming at it from the private sector.

We have had only two or three iterations of that so far. It will be more useful as we have more of it. It remains to be seen whether that work is going to prove to be merely interesting or useful as well.

It is throwing light on areas like the scale of the potential fiscal consequences of clinical negligence claims, for example. Until I had seen the WGA, I had no idea of the weight of that particular issue. The same goes for nuclear decommissioning costs.

There are some areas where there is interesting work out there that more people will get to see and know and understand.

In terms of what informed consumers can do, we have spent a lot of our time explaining the uncertainties that lie around point forecasts; explaining to people, for example, that when the net present value of public service pension liabilities leaps, that reflects what has happened to the discount rate, which follows high quality corporate bond yields, rather than any fundamental change in the underlying view of the likely flows of pension payments.

There are particular areas where bodies that have particular expertise can help inform the public debate. One has to be realistic on this. There is always a temptation with the Whole of Government Accounts in particular: I have been a journalist; you go through and find the largest negative nominal stock number you can, and that is probably your starting point.

There is a lot of work and education to be done for everybody, and we need to play our part in that as well.

Mr D. B. Martin, F.F.A.: I understand there is a proposal for GDP in the future to include research and development. I wonder if you would like to say something about that and, particularly, about the scepticism that there might be, because it is suggested that it would give a better figure for GDP. The scepticism from the public is that this is interfering unnaturally with the statistics.

Mr Chote: Yes. We know that that is going to happen. The expectation is that it will be included in the national accounts not this year but in 2014, and that it will increase the level of GDP.

What is less clear is what impact it will have on the rate of change between any particular points. Is that likely, for example, to make the rise in output ahead of the recession bigger and the fall bigger, etc?

So will it change people's perceptions of the shape of the economic cycle, or will it basically just take the existing numbers you have and move them all up by a roughly equal amount across time? If the latter, it is less likely to cast light on the success or otherwise of policy.

It is certainly not unique to the UK. The US is looking at this as well. There is a question about whether you should be treating intangible investment in this area as well. That will make the economy look bigger. Whether it changes people's view about how well or badly we have done through the economic cycle remains to be seen.

These sorts of methodological changes can make quite a difference to the perceived path of the economy. The recession of the early 1990s now looks about 20% to 25% shallower and a year shorter, and there is no longer a double dip, if you compare it to the way those figures looked in 1994.

In the case of the recession of the early 1990s, history was not rewritten until the national accounts of 1998 and onwards. Anybody who produces a forecast and looks smug because the outturn looks very much in line with it need only wait: disappointment inevitably lies round the corner.

Mr J. M. Ellacot, F.I.A.: A personal observation, Mr Chote, is that since the time when you were heading up the Institute for Fiscal Studies your media presence, at least from my perspective, seems to have dropped somewhat. Clearly, as you have described, you are much more on the inside now.

I would be quite interested in your personal observations as to whether that insider view, and the contact you have described with senior policymakers, means that you are more effective than perhaps you were externally, when you had a media presence and were able to pressure the Government that way.

Mr Chote: It is a very different job. It was a key part of the job that I used to do at the Institute for Fiscal Studies. For one thing, we were ranging over a much wider policy area so there were more opportunities to express views in the media on different topics.

The other role which we had then, which we do not have now, is comparing and contrasting what the different parties are saying. We did a lot of work on the different parties’ approaches to the deficit and what they were intending to do in terms of tax and spending plans; the differences in how we thought the budget deficit was likely to evolve under the three different parties’ proposals.

That is something that Parliament has explicitly told us they do not want us to do, in contrast to my esteemed Dutch colleagues, who do do that. Although they do it, they do not pop up on the television every week doing it. They have a formal process: it is done when there are manifestos on which to take a serious judgement.

There was some debate when we were set up about whether we should have this role of looking at alternative policy options. I said when I was going through my confirmation hearing that if nobody told me one way or the other, then I would respond in that sort of way, because as an independent body that would have been the appropriate thing to do.

That said, I can quite understand, and I think it is quite sensible, that we were instructed not to play that role initially.

We were created at a time in which there was a huge political debate over the wisdom of how quickly you ought to do the fiscal consolidation and how much of it should come from tax versus spending.

To have thrown an organisation that had not had a chance to establish a reputation for impartiality straight into that hugely charged political territory would have been asking for trouble.

In time, commenting on those sorts of things in a limited way may be good for perceptions of your independence. The IFS did it for many, many years and it was seen as a strength rather than a weakness. Our Dutch counterparts are in the same position, too.

It will be interesting to see if people want us to play more of that role. Similarly, as I said, Parliament instructs us not to talk about the wisdom of particular policy measures. If the Government announces a decision to allow people to give up their employment rights in favour of tax advantages, it is no longer my job to say whether that is a good idea or not.

Mr M. G. White, F.I.A.: The Banking Standards Commission is going to report soon. Can you make any comments on the impact of accounting standards, particularly the values placed on assets, on the finances of banks and, ultimately, more than just banks, as they developed in the lead up to the crisis?

Mr Chote: That is not an area at which we have looked specifically. But looking at what has been going on with the banking sector, both prospectively and looking backwards as well, is considerably important to a number of the things that we have to worry about, both on the fiscal side and on the economic side.

For example, if you look on the fiscal side, we need to be concerned about the profitability of the banking sector and what revenues the Government might be getting from that source.

If you go back to the pre-crisis period, the banking sector was roughly 8% of GDP but was contributing about 25% of corporation tax receipts. That is one of the reasons why you see the average tax rate for the economy falling since then: we have hit a particularly revenue-rich part of it.

Looking forward, forecasting what is going to happen is not simply a matter of saying what gross profits are looking like, because you have a lot of banks that are sitting on losses that have yet to be set off against tax. Even if you knew what the path of gross profits was going to be, knowing when that translates into renewed payments of tax is not straightforward at all.
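A highly simplified sketch of that timing effect, with invented figures and a deliberately crude tax treatment (full offset of brought-forward losses against profits, a single flat rate), is set out below; real loss-relief rules are of course considerably more complicated:

```python
# Stylised illustration of why brought-forward losses delay the return of
# corporation tax receipts. Figures, the 23% rate and the tax treatment are
# invented for the example; actual loss-relief rules are far more complicated.

def tax_receipts(profits, losses_brought_forward, rate=0.23):
    """Offset brought-forward losses against each year's profit before taxing the rest."""
    pool = losses_brought_forward
    receipts = []
    for profit in profits:
        offset = min(pool, max(profit, 0.0))
        pool -= offset
        receipts.append(round(max(profit - offset, 0.0) * rate, 2))  # rounded for readability
    return receipts

# A bank returning immediately to steady gross profits of 10 a year,
# but sitting on accumulated losses of 25 from the crisis years.
print(tax_receipts(profits=[10, 10, 10, 10, 10], losses_brought_forward=25.0))
# -> [0.0, 0.0, 1.15, 2.3, 2.3]: no tax for two years and only partial tax in the
#    third, even though gross profits have recovered from the outset.
```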

It is one of those areas where our access to the information and analysis within HMRC is obviously helpful, but only up to a point, because they cannot share with us taxpayer-confidential information.

In sectors where you have a relatively small number of firms providing a relatively large proportion of receipts, that can be quite awkward. We were sitting around a month or so ago, for example, trying to think about the overall path of financial sector bonuses and the consequences that would have for income tax payments.

You have to have those conversations without mentioning the names of any well-known high street or investment banks, which can occasionally be slightly surreal. So there are those sorts of issues.

On the economy side, there are a number of questions, one of which is very fundamental to the outlook for the underlying potential of the economy and therefore the underlying potential of tax receipts to rebound of their own accord without needing policy to do it. It is the question of what impact the plight of the financial system, the gumming up of credit conditions, is having on the productive potential of the economy.

We have the situation where most people think that the productive potential of the economy has fallen well below the trend that you would have seen if you had simply extrapolated a pre-crisis line.

It is linked to the productivity puzzle. Why is output per worker so much weaker than you anticipated?

The front-running contender, not to explain the whole story but to explain a reasonable chunk of it, is that the difficulties in the financial sector are preventing capital from being reallocated to potentially rapidly growing, innovative young firms, while at the same time a combination of low interest rates, low wage increases and a reluctance of banks to crystallise losses is keeping so-called zombie firms going long after you would have expected them to be written off by a recession of this depth.

On that particular issue, it is not something that we have looked at, but trying to understand the financial sector more broadly, and banks in particular, matters a lot to us, both for the fiscal outlook and for understanding what is going on in the economy more widely.

Mr C. A. Morley, F.I.A.: You mentioned that one of the benefits of setting up a fiscal council can be achieving a better balance of strength between the finance ministry and the spending departments of government.

Where do you think the UK was when the OBR was set up, and has it changed that balance? Was there any need to do so? Where does the UK fit in the international perspective there? Do we have a relatively good balance, or a relatively strong balance one way or the other compared to other countries?

Mr Chote: In some countries you have separate economics and finance ministries. In the UK the Treasury is relatively powerful compared to individual spending departments. That is partly an institutional observation, about the way the civil service operates in the Treasury and other departments. It is also a political observation: in some countries that have had multi-party coalition governments over longer periods of time, for example, it is easier for spending ministries to build a power base that puts them on a more equal footing with the Treasury.

One way to judge this is that it is quite striking that public spending obviously went up quite a lot in the run-up to the crisis, but it was not because departments were spending more than they had been given or had budgeted for. It was because the Government thought that there was a reasonable amount of money around at the time and set out relatively generous plans, and occasionally the money came in more strongly than they expected and they topped those plans up somewhat.

The striking feature, and we have been caught out by this in the last couple of years, is the extent to which government departments underspend the budgets that the Treasury gives them, even in a period in which there is a squeeze overall. There is a combination there of spending departments being afraid of overspending, on the grounds that the Treasury will come round with a baseball bat and beat them with it the following year, and of the Treasury itself not wanting to see that either.

We had a dramatic demonstration of this at the end of the fiscal year that has just finished, when the Treasury was going to departments and saying, “Please do not spend money at the end of the year; push it into the following year”, for cash management and planning reasons, but also because of the consequences that would have for the budget deficit.

We saw, last year, departments under-spending by £11 billion in total against the plans that had been set out only last summer. That is really very large by historical standards and shows a relative strength of top-down spending control in the UK that I suspect is probably greater than you would see in most other countries, though I would not have the data to back that up fully.

It is striking that in the years in which some people complain we were spending too much, it was not because departments were overstepping the limit; it was because the Government thought that the money was there for them to spend.

Mr Cribb: Thank you very much for that. It is probably time that we adjourned. Before we end, may I ask you once again to express your thanks to Mr Chote for this evening's talk. [Applause]

References

1 This lecture builds on and updates my Ken Dixon Lecture on 13 June 2011, in which I reflected on the OBR's first year in existence (http://budgetresponsibility.independent.gov.uk/wordpress/docs/Ken-Dixon-lecture.pdf). My thanks to Sir Alan Budd, Xavier Debrun, Tidiane Kinda and my OBR colleagues for useful comments on an earlier draft. Remaining errors are my own.

2 IMF, 2013, The Functions and Impact of Fiscal Councils (forthcoming; Washington: International Monetary Fund)

3 “What should fiscal councils do?”, by Lars Calmfors and Simon Wren-Lewis, http://www.finanspolitiskaradet.se/download/18.3b8af0c112ec0f3879380005563/whatshouldfiscalcouncilsdo.pdf