Perfection in forecasting does not exist. It can't: forecasting is a mix of intuition, common sense and history, and that mix can never yield anything precise. Early empirical work handled the uncertainty in the world with some very nifty assumptions. Think about it: if you reduce the whole of humanity to a set of identical robots programmed to maximise utility, then yes, you can predict the future. For any conditional statement, the "if" part is really important; for us economists, though, it usually gets swept under the rug. So what do we do when people buy the "wrong" bundle, or when rational expectations gives us completely wrong predictions? Then we look at imperfect knowledge economics.
Thanks to Roman Frydman and Michael Goldberg, we have a new way of looking at things. The first step is to let go of the idea that you will ever be able to precisely predict anything economics-related. Once that weight has been lifted off your shoulders, you can look for the next best solution.
They don't mean to completely discard all economic theory. Rather, they focus on things that are observable and whose movements have at least a fair bit of certainty. Here's a concrete example relating to currency fluctuations.
Rational expectations predicts that exchange rates should not deviate from parity in the long term. If you believe that, then the falling dollar should lead you to take no policy action. Behavioural economists, on the other hand, claim that currencies can depart from their parity values for long periods because of irrational trading, not because traders think that macroeconomic fundamentals are changing. Here too we come to the conclusion that policy will be not only unnecessary but ineffective as well.
So then, why has the dollar been falling? The imperfect knowledge perspective is that traders don't just look at deviations from parity. Euro holders may be looking at America's current-account deficit, rising euro interest rates, and so on. The belief that macroeconomic fundamentals are changing leads euro holders to bid the dollar further from parity, and this in turn increases the riskiness of holding euros in the long run (since currencies do return to their parity values in the long run). It is this precarious balance between belief and riskiness that determines how long the currency deviates from parity.
By assuming that people have imperfect knowledge and that there is some degree of learning, we get much more accurate empirical results.
5 comments:
But can this imperfect knowledge be quantified and optimised by some predetermined, context-specific rule (or some other set of rules for choosing methods)? And what about forecasts versus predictions?
And what about those statistical procedures borrowed from engineering which do not assume any underlying economic theory at all?
Kalman filtering methods, wavelet-based methods, nonparametrics etc.? These have been augmented by the very simple economic idea (which to my mind is an extension of the Wold decomposition) that any economic time series can be broken down into four components - trend, cyclical, seasonal and irregular - the so-called Structural Time Series models. Indeed, ARIMA-based forecasting methods are considered black-box approaches in this line of thought. These methods have some critics and some very influential support. See especially the special issue of The Economic Journal (Jan 1997), "Controversy: On Modelling the Long-Run in Applied Economics", with classic essays by Clive Granger, Hashem Pesaran and Andrew Harvey.
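To make the four-component idea concrete, here is a minimal sketch of an additive decomposition into trend, seasonal and irregular parts (ignoring the cyclical component, and assuming an odd seasonal period so the centred moving average is symmetric). This is an illustrative moving-average decomposition, not the state-space estimation that structural time series models actually use:

```python
# Sketch: additive decomposition y_t = trend_t + seasonal_t + irregular_t.
# Uses a centred moving average for the trend; assumes an odd period.
# This is NOT how structural models are estimated - just an illustration.

def decompose(y, period):
    n = len(y)
    half = period // 2

    # Trend: centred moving average (undefined near the edges).
    trend = [None] * n
    for t in range(half, n - half):
        window = y[t - half:t + half + 1]
        trend[t] = sum(window) / len(window)

    # Seasonal: average detrended value at each position in the cycle,
    # centred so the seasonal pattern sums to zero over one period.
    means = []
    for s in range(period):
        vals = [y[t] - trend[t] for t in range(s, n, period)
                if trend[t] is not None]
        means.append(sum(vals) / len(vals))
    grand = sum(means) / period
    means = [m - grand for m in means]
    seasonal = [means[t % period] for t in range(n)]

    # Irregular: whatever the trend and seasonal parts do not explain.
    irregular = [y[t] - trend[t] - seasonal[t] if trend[t] is not None
                 else None for t in range(n)]
    return trend, seasonal, irregular
```

On a series built as a linear trend plus a repeating seasonal pattern, the sketch recovers both components exactly away from the edges; on real data the irregular component absorbs the noise.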
I don't have any idea about Kalman filtering or even ARIMA modelling.. but I think the main idea was about removing inconsistencies between empirical findings and theoretical predictions.. why don't you do a post about the paper you cited? I'll try to get my hands on it in the meanwhile.
P.S. Do you lose out on accuracy when you follow these methods you've mentioned?
I meant Wold decomposition, and not Granger representation.
State-space models are borrowed from engineering, where they are the main way of approaching time series, and the most general.
I should write a post on the 'actual' methodology of research, because a lot of economists simply pay lip-service to methodology and go about designing models as it suits them.
Some very interesting work is being carried out by David Hendry on model selection and the general-to-specific modelling methodology. Will try to put up some posts, but the thing is, it is very hard to know these things completely, and it would be pointless to post about methodology if it were half-baked.
That would be a really good post.. It's something I've always wanted to read about.. You've got your hands dirty with quite a bit of research haven't you? Do post about the methodology of research.. I'm looking forward to it!