Why You Should Never Use the Hodrick-Prescott Filter
Often economic models are developed to describe a stationary environment. We may want to relate the model to observed non-stationary data without taking a stand on where the non-stationarity comes from. Hodrick and Prescott (1981, 1997) proposed a method for doing this which is widely used in economic research and policy analysis. Their method has some serious drawbacks. Fortunately, there is a better approach that avoids those problems.
There are many ways to characterise the HP filter.1 One of the easiest to understand is an approximation to the HP-detrended component noted by Cogley and Nason (1995). First calculate fourth differences of the original data – the change in the change in the change in the growth rate – and then take a long, smooth, weighted average of past and future values of those differences. The weights for that averaging step are graphed in Figure 1.
Figure 1 Weight given to value of ∆4yt+2-j (plotted as a function of j) in calculating cyclical component for date t implied by HP
Is this a reasonable procedure to apply to economic data? Theory suggests that variables like stock prices (Fama 1965), futures prices (Samuelson 1965), long-term interest rates (Sargent 1976, Pesando 1979), oil prices (Hamilton 2009), consumption spending (Hall 1978), inflation, tax rates, and money supply growth rates (Mankiw 1987) might all be well approximated by random walks. Certainly a random walk is often very hard to beat in out-of-sample forecasts (see, for example, Meese and Rogoff 1983 and Cheung et al. 2005 on exchange rates; Flood and Rose 2010 on stock prices; or Atkeson and Ohanian 2001 on inflation). If HP is not a good procedure to apply to a random walk, then we have no business using it generically on economic data.
If a variable follows a random walk, first-differencing removes everything that is predictable about the series. HP nevertheless goes on to take three more differences and then apply the smoothing weights in Figure 1. This puts all kinds of patterns into the HP-filtered series that have nothing to do with the original data-generating process and are solely an artefact of having applied the filter.
The top-left panel of Figure 2 shows the autocorrelation between the growth rate of the S&P 500 stock price index and its growth rate j quarters earlier. The top-right panel shows the same for real consumption spending. There is little evidence that these variables can be predicted from their own lagged values or from lagged values of the other variable (bottom panels), exactly as we expect with a random walk.
Figure 2 Autocorrelations and cross-correlations for first-difference of log of stock prices and real consumption spending
Source: Hamilton (forthcoming).
Figure 3 shows the autocorrelations and cross-correlations for HP-detrended stock prices and consumption. The rich dynamic behaviour has nothing to do with the true properties of the variables. The patterns in Figure 3 are summaries of the filter, not the data.
Figure 3 Autocorrelations and cross-correlations for HP-detrended log of stock prices and real consumption spending
Source: Hamilton (forthcoming).
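The point can be verified directly by simulation: feed the filter a pure random walk, whose first differences are serially uncorrelated by construction, and the HP-detrended series nevertheless comes out highly persistent. A self-contained sketch (the inline solver just implements the standard penalised-least-squares definition of HP; sample size and seed are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
y = np.cumsum(rng.standard_normal(400))   # a pure random walk

# HP trend: solve (I + lam * D'D) tau = y, D = second-difference matrix
n, lam = len(y), 1600.0
D = np.zeros((n - 2, n))
for i in range(n - 2):
    D[i, i:i + 3] = [1.0, -2.0, 1.0]
cycle = y - np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

def acf1(x):
    """Lag-1 autocorrelation."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

print(acf1(np.diff(y)))   # near zero: the differences are white noise
print(acf1(cycle))        # strongly positive: persistence created by the filter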
The top panel of Figure 4 shows the raw stock-price data in red and the HP-inferred trend in black. The black line tells the comforting story that the stock market rollercoaster only represents transient, cyclical dynamics. After all, we ended 2009 with stock prices near where they started the decade! The bottom panel shows the result of applying the HP filter with one caveat: for each date, we act as though that was the last date in the sample and look at what HP would have implied for the trend at that date. The picture is very different – the moves all look permanent now. A real-time observer would never know that stock prices were about to plunge in 2007 or take off in 2009. The patterns in the top panel could never have been recognised in real time because they have nothing to do with the true data-generating process. It's just a pretty picture that our imagination wants to impose on the data after the fact.
Figure 4 HP trend in stock prices as identified using the full sample (top panel) and using data actually available at each indicated date (bottom panel)
Source: Hamilton (forthcoming).
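The bottom-panel exercise is easy to replicate: re-run the filter on an expanding sample and keep only the last fitted trend value at each date. A sketch of this "one-sided" or real-time variant (function names and the minimum-observations cutoff are mine):

```python
import numpy as np

def hp_trend(y, lam=1600.0):
    """Two-sided HP trend via the closed-form linear solve."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

def real_time_hp_trend(y, lam=1600.0, min_obs=12):
    """At each date t, filter only data through t (all a real-time
    observer could use) and record the trend estimate for date t."""
    y = np.asarray(y, dtype=float)
    out = np.full(len(y), np.nan)
    for t in range(min_obs, len(y) + 1):
        out[t - 1] = hp_trend(y[:t], lam)[-1]
    return out
```

By construction the two versions agree at the final date of the sample and typically disagree everywhere else, which is exactly the discrepancy between the two panels of Figure 4.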
HP is not a sensible approach for a random walk. Is there any time-series process for which it would be a good idea? Hodrick and Prescott noted one very special case: if we assume that second differences of the trend are impossible to forecast, and that the deviation of the variable from the trend is impossible to forecast, then HP would give an optimal inference about the trend. However, if we accepted those assumptions, it would be straightforward to use the observed data to estimate the appropriate value of the HP smoothing parameter, commonly denoted λ. The standard practice for quarterly data is to assume that λ = 1,600. For every economic and financial time series I have looked at, I come up with an estimate from the data closer to λ = 1 – three orders of magnitude off from what practitioners assume.
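To see what "estimating λ from the data" can mean: in that special case the optimal λ is the ratio of the cycle variance to the trend-innovation variance, and both are identified from the autocovariances of the second-differenced data (for white-noise cycle c and trend innovation η, the second difference of the observed series has γ0 = σ²η + 6σ²c and γ2 = σ²c). A method-of-moments sketch on simulated data with true λ = 1 (this simple estimator is my own illustration, not the procedure used in the article):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
sigma_eta, sigma_c = 1.0, 1.0           # true lambda = (sigma_c/sigma_eta)**2 = 1
trend = np.cumsum(np.cumsum(sigma_eta * rng.standard_normal(n)))
y = trend + sigma_c * rng.standard_normal(n)

d2 = np.diff(y, 2)                      # second differences of the observed data

def acov(x, j):
    """Sample autocovariance at lag j."""
    x = x - x.mean()
    return np.dot(x[: len(x) - j], x[j:]) / len(x)

# gamma_0 = sigma_eta^2 + 6 sigma_c^2  and  gamma_2 = sigma_c^2
sigma_c2_hat = acov(d2, 2)
sigma_eta2_hat = acov(d2, 0) - 6.0 * sigma_c2_hat
lam_hat = sigma_c2_hat / sigma_eta2_hat
print(lam_hat)                          # recovers a value near 1, not 1,600
```

The exercise shows λ is an estimable quantity under the model's own assumptions; the article's point is that such data-based estimates land near 1, far from the conventional 1,600.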
I conclude that not only is HP almost certainly the wrong approach for most economic variables we will encounter; there does not exist any variable to which one can point for which the commonly followed practice would be the optimal thing to do.
Fortunately there is a better alternative. I first propose a practical definition of the trend that we intend to remove. Sometimes economists define a trend in terms of an infinite-horizon forecast. But the problem with that concept is that we can never find out about an infinite horizon from a finite sample. I propose that we instead focus more practically on a two-year horizon, something about which we do have useful information in typical sample sizes. I would further argue that the primary reason we'll miss with a two-year forecast is cyclical developments – a recession arrives that we did not anticipate, or the expansion is stronger than expected.
One might think that even to make a two-year-ahead forecast, we need to know something about the underlying process and trend. However, this is not the case. We can always form a usable forecast based on a linear function of the four most recent values of the series. The optimal forecast within this class exists and can be estimated from the data, and the error associated with the forecast is stationary for a wide class of time-series processes.
Suppose for example that the growth rate of a variable is stationary. We can always write the level of the variable two years from now as the level today plus the sum of the changes over the next two years. If the growth rate is stationary, we have just written the level two years from now as a linear function of the level today plus something stationary.
Alternatively, suppose that it is the change in the growth rate that is stationary. If we know the level today, the level the previous period, and the change in the growth rate each quarter over the next two years, we would know the level two years from now. This allows us to write the level two years from now as a linear function of the two most recent levels plus something stationary. If instead fourth differences are needed to achieve stationarity, we can write the level two years from now as a linear function of the four most recent levels plus something stationary.
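The bookkeeping behind the first two cases can be written out explicitly. With a horizon of h quarters (h = 8 for two years), the identities are (standard difference-operator algebra; this notation is mine, not the article's):

```latex
\begin{aligned}
y_{t+h} &= y_t + \sum_{j=1}^{h} \Delta y_{t+j},
  &&\text{if } \Delta y_t \text{ is stationary},\\
y_{t+h} &= (h+1)\,y_t - h\,y_{t-1}
  + \sum_{j=1}^{h} (h+1-j)\,\Delta^{2} y_{t+j},
  &&\text{if } \Delta^{2} y_t \text{ is stationary}.
\end{aligned}
```

In each case the summation term is stationary by assumption, and the remainder is an exact linear function of the most recent one or two levels. The fourth-difference case works the same way with a linear combination of the four most recent levels, which is why four lags cover all of these possibilities.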
What happens in any of the above cases if we run an ordinary least squares regression of the level two years from now on a constant and the four most recent values of the level? If the regression picks coefficients that make the residuals stationary, the average squared residual will tend to some stable number in a large enough sample. If it picks coefficients that make the residuals nonstationary, the average squared residual goes off to infinity. Because ordinary least squares tries to minimise the residual sum of squares, in a large sample it should give an estimate of the population object of interest, namely the linear forecast of the variable two years ahead. Simple regression thus offers a reasonable approach to removing the trend as defined here for a wide class of possible processes.
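The regression just described – the level eight quarters ahead on a constant and the four most recent levels, with the residual taken as the cyclical component – takes only a few lines. A sketch (the function name and defaults are mine; h = 8 and four lags follow the text's quarterly two-year setup):

```python
import numpy as np

def regression_filter(y, h=8, p=4):
    """Cycle = residual from OLS of y[t+h] on a constant and
    y[t], y[t-1], ..., y[t-p+1]; trend = the fitted values."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # one regression row per forecast origin t = p-1, ..., n-h-1
    X = np.column_stack(
        [np.ones(n - h - p + 1)]
        + [y[p - 1 - k : n - h - k] for k in range(p)]
    )
    target = y[p - 1 + h :]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    fitted = X @ beta
    # output aligned with dates p-1+h, ..., n-1
    return fitted, target - fitted
```

As with HP on a linear series, a deterministic trend is absorbed entirely into the fitted values and the cycle is zero, so the residual only picks up movements the four recent levels could not forecast.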
For the baseline case of a random walk, the best two-year-ahead forecast is just the level today. The cyclical component from the above definition would in that case simply be the two-year difference of the series. I have found for most economic variables that the two-year difference is pretty similar to the regression residuals.
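A quick check of that baseline on simulated data: for a random walk the estimated coefficients should put roughly unit weight on the current level and negligible weight on the extra lags, so the residual nearly coincides with the two-year difference. A self-contained sketch (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
y = np.cumsum(rng.standard_normal(5000))   # random walk
h, p, n = 8, 4, len(y)
X = np.column_stack(
    [np.ones(n - h - p + 1)]
    + [y[p - 1 - k : n - h - k] for k in range(p)]
)
beta, *_ = np.linalg.lstsq(X, y[p - 1 + h :], rcond=None)
cycle = y[p - 1 + h :] - X @ beta
diff8 = y[p - 1 + h :] - y[p - 1 : n - h]  # the two-year (8-quarter) change
# beta is approximately [0, 1, 0, 0, 0], so cycle tracks diff8 closely
print(np.corrcoef(cycle, diff8)[0, 1])
```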
We can also identify exactly the same concept – the error associated with a two-year-ahead linear forecast – in any theoretical economic model. If the theoretical model has the property that shocks die out after two years, the forecast error would exactly equal the deviation from the steady state, which is indeed the concept modellers often have in mind when they think of the cyclical component. If shocks do not fully die out after two years, then part of what I am labelling the trend corresponds to longer-lived shocks. In any case, one can estimate a population object in nonstationary data that is exactly analogous to the corresponding object in the stationary model, allowing the kind of apples-to-apples comparison that users of HP were hoping to obtain.
Figure 5 applies this procedure to the components of the US national income accounts. Consumption is less volatile and investment is more volatile than GDP, but the cyclical components of all three move together. The cyclical component of exports is often moving separately, while the transient component of government spending is dominated by military shocks such as the Korean and Vietnam wars and the Reagan military build-up.
Figure 5 Regression residuals (black) and two-year changes (red) for 100 times the log of components of US national income accounts
Source: Hamilton (forthcoming).
References
Atkeson, A and L E Ohanian (2001), "Are Phillips curves useful for forecasting inflation?", Quarterly Review, Federal Reserve Bank of Minneapolis, 25(1): 2-11.
Cheung, Y-W, M D Chinn and A G Pascual (2005), "Empirical exchange rate models of the nineties: Are any fit to survive?", Journal of International Money and Finance, 24(7): 1150-1175.
Cogley, T and J M Nason (1995), "Effects of the Hodrick-Prescott filter on trend and difference stationary time series: Implications for business cycle research", Journal of Economic Dynamics and Control, 19(1-2): 253-278.
Fama, E F (1965), "The behavior of stock-market prices", The Journal of Business, 38(1): 34-105.
Flood, R P and A K Rose (2010), "Forecasting international financial prices with fundamentals: How do stocks and exchange rates compare?", Globalization and Economic Integration, Chapter 6, Edward Elgar Publishing.
Hall, R E (1978), "Stochastic implications of the life cycle-permanent income hypothesis: Theory and evidence", Journal of Political Economy, 86(6): 971-987.
Hamilton, J D (2009), "Understanding crude oil prices", Energy Journal, 30(2): 179-206.
Hamilton, J D (forthcoming), "Why you should never use the Hodrick-Prescott filter", Review of Economics and Statistics.
Hodrick, R J and E C Prescott (1981), "Postwar US business cycles: An empirical investigation", working paper, Northwestern University.
Hodrick, R J and E C Prescott (1997), "Postwar US business cycles: An empirical investigation", Journal of Money, Credit and Banking, 29(1): 1-16.
Mankiw, N G (1987), "The optimal collection of seigniorage: Theory and evidence", Journal of Monetary Economics, 20(2): 327-341.
Meese, R A and K Rogoff (1983), "Empirical exchange rate models of the seventies: Do they fit out of sample?", Journal of International Economics, 14(1-2): 3-24.
Pesando, J E (1979), "On the random walk characteristics of short- and long-term interest rates in an efficient market", Journal of Money, Credit and Banking, 11(4): 457-466.
Samuelson, P (1965), "Proof that properly anticipated prices fluctuate randomly", Industrial Management Review, 6(2): 41-49.
Sargent, T J (1976), "A classical macroeconometric model for the United States", Journal of Political Economy, 84(2): 207-237.
[1] See Hamilton (forthcoming) for references and discussion.
Source: https://voxeu.org/article/why-you-should-never-use-hodrick-prescott-filter