Regular readers know that your humble blogger has consistently observed that the only barrier to publication in the premier economics journals is the ability to write a sentence (plus, I assume, that the author has a PhD in economics or finance).
It is clear that the reviewers of these articles put zero effort into a) asking whether the idea presented makes any sense or b) checking the accuracy of the data analysis.
To prove these points, I offered up the works of Gary Gorton and Reinhart & Rogoff.
The former developed the idea of "informationally insensitive" debt and gave bank demand deposits as an example of this type of debt.
I don't know about you, but when I helped my children open bank accounts as six-year-olds, I had to answer the question: how do I know I can get my money back? My answer, which is the answer virtually every parent I have talked to has given as well, was "the government guarantees you will get your money back."
Last time I checked, the existence of a government deposit guarantee was information.
The idea of informationally insensitive debt confuses how hard it is to independently assess an investment with the notion of insensitivity. All investments are informationally sensitive. Some, like demand deposits with government insurance, are just easier to assess than others, like structured finance securities.
Professor Gorton also developed the idea of a "safe" asset. Excuse me, but there is no such thing as a safe asset. All assets have risk. Some assets have less risk than others, but they are all risky. Don't believe me? Just look at the Capital Asset Pricing Model that underlies modern finance.
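For readers who want to see it in symbols, here is the textbook CAPM relation (the notation is the standard one from any finance text, nothing specific to Gorton's papers): every asset's expected return is the risk-free rate plus a premium for the systematic risk that asset carries.

```latex
% Textbook CAPM relation (standard notation; illustration only)
E[R_i] = R_f + \beta_i \left( E[R_m] - R_f \right),
\qquad
\beta_i = \frac{\mathrm{Cov}(R_i, R_m)}{\mathrm{Var}(R_m)}
```

The structure of the model makes the point: every asset has a beta, and any asset whose returns move with the market commands a risk premium; even the "risk-free" rate is an idealization of the model rather than a claim that a truly riskless asset exists.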
Please note that the ideas of informationally insensitive debt and the safe asset were born of the need to explain the "run on the banks" that occurred at the beginning of the current financial crisis.
A run that the Financial Crisis Inquiry Commission explained was the result of the simple fact that no lending bank could figure out whether a borrowing bank was solvent. The "run on the banks" was nothing more than banks reducing their lending because they could no longer determine whether a borrowing bank could repay.
As your humble blogger pointed out both in the lead-up to the financial crisis and after the crisis began, the reason it was impossible to determine whether a bank was solvent was opacity. A point the Bank of England's Andrew Haldane drove home when he called banks "black boxes".
Had the reviewers paid a modest amount of attention, papers embodying ideas like these would never have been published.
What lesson can economists draw from the ruckus over a flaw found in an influential study by two Harvard University scholars?

The fact that an economics professor works at an Ivy League school or an MIT, a Berkeley or a Stanford doesn't mean that a) their idea isn't stupid and b) they haven't made a mistake in their analysis.
Our suggestion: Do a better job of checking one another’s work.
Empirical research has enjoyed an unaccustomed level of public attention since April 16, when a group of researchers published a critique of work by the economists Carmen Reinhart and Kenneth Rogoff. The critique pointed out a spreadsheet error -- now famous thanks to the blogosphere and “The Colbert Report” -- in a 2010 study on the relationship between government debt and economic growth. ...
There’s only one reliable way to verify empirical findings: Try to replicate them. ...
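To make concrete what replication means in practice, here is a toy sketch in Python (the country labels and growth numbers are invented for illustration; this is not the actual Reinhart-Rogoff spreadsheet) of the kind of check that catches a formula averaging over the wrong range of rows:

```python
# Toy illustration of a replication check: recompute a published average
# from the raw data and compare it to the reported figure.
# All numbers below are invented for illustration only.

growth_by_country = {
    "A": 2.1, "B": 1.4, "C": -0.3, "D": 3.0, "E": 0.8,
}

# Suppose the published spreadsheet formula accidentally averaged only the
# first three rows instead of all five (a range error of the kind the
# Reinhart-Rogoff critique identified).
reported_average = sum(list(growth_by_country.values())[:3]) / 3

# A replication recomputes the statistic from the full dataset.
replicated_average = sum(growth_by_country.values()) / len(growth_by_country)

print(f"reported:   {reported_average:.2f}")
print(f"replicated: {replicated_average:.2f}")
if abs(reported_average - replicated_average) > 1e-9:
    print("Discrepancy found -- time to go back to the original spreadsheet.")
```

The point of the exercise is not the arithmetic but the habit: recompute the published statistic from the raw data and flag any discrepancy.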
Replication rarely leads to career success.
“Ideas” people -- those exciting scholars generating new insights into how society functions -- are the stars of the profession.

By this definition, with the FDR Framework, I should be a star (if only I had a PhD in economics; but then, neither did Adam Smith, who gave us the invisible hand of the market, nor Walter Bagehot, who invented the modern central bank).
After all, I only explain the basis for how the global financial system operates and why it failed in the run-up to our current financial crisis.
And yes, the FDR Framework has shown that it works both for predicting a financial crisis and for identifying which policies will and won't work to end one. I used it to predict our current financial crisis and the failure of our existing policy responses to end it.
Those who do the grindingly difficult work of checking whether the stars’ insights are actually true rarely get recognized. Who can name an economist who achieved fame through replication?

Milton Friedman, who replicated Anna Schwartz's work (or was it the other way around in their collaboration)?
Editors of academic journals prefer to give what scarce space they have to exciting new ideas, rather than rehashing old debates. ...

The preference for giving space to new ideas doesn't mean that the ideas shouldn't first be vetted to see if they make any sense.
The editors seem to have forgotten this admittedly low barrier to publication.
For a scholar, replication offers an unappealing bet. ... Tails, you find a serious flaw, but your results still probably won’t be published and you’ve earned enemies who may try to land some reputational punches against you. ...

In other academic fields, the author does not know who reviews their article. This way, when serious flaws are uncovered, it happens BEFORE the article is published and enemies aren't created.