I hate beating up on Gillian Tett, because even a writer as clever as she is is ultimately no better than her sources, and she seems to be spending too much time with the wrong sort of technocrats.
Her latest piece correctly decries the fact that no one has the foggiest idea of what might have happened if Greece defaulted (note that we are likely to revisit this issue in the not-too-distant future).
But she makes the mistake of assuming the problem could have been solved (in the mathematical sense, that the outcome could have been predicted with some precision) by having better data. That is a considerable and unwarranted logical leap...
This point needs to be repeated: since market participants are responsible for their losses, they have a significant incentive not to take on a bigger exposure to any source of loss than they can afford to lose.
Since important information about the last crisis has been given short shrift, it’s a given that more data won’t necessarily yield a commensurate increase in understanding. We’ve lamented how, for instance, a critically important BIS paper debunking the role of the saving glut in the crisis and the use of the “natural” rate of interest in economic models has been largely ignored. Similarly, from what we can tell, there is perilously little understanding of how heavily synthetic and synthetic CDOs turned a US housing bubble that would have died a natural death in 2005 into a global financial crisis.

Gillian is focused on disclosure of granular level data and not on the failure of the economics profession.
And Tett’s focus on “data”, no doubt reflecting the preoccupation of officialdom, is a big tell. Economists routinely exhibit “drunk under the streetlight” syndrome: they prize analyzing big datasets, aren’t good at developing them (this was a huge beef of Nobel Prize winner Wassily Leontief), and are pretty bad at doing qualitative research (they’d rather do thought experiments, and even when they undertake survey research, the resulting studies bear strong hallmarks of a failure to properly develop and validate the survey instrument).

All of which are fantastic reasons why we should not make our financial system dependent on economists. It is, after all, not the fault of the granular level data that economists cannot analyze it.
However, none of these problems with economists applies to how market participants use granular level data under the FDR Framework to assess risk and limit their exposures to what they can afford to lose.
Now, to the prospects for performing diagnostics and preventing the next crisis. On the one hand, it is a disgrace that the authorities didn’t have a good grip on who was on the wrong side of Greek credit default swaps.
The US banks were thought to be reasonably exposed; that’s one reason Treasury Secretary Geithner was unduly interested in this situation (recall how Geithner intervened, in what was seen as a decisive manner, against an Irish effort to haircut €30 billion of unguaranteed bonds).
This is inexcusable, particularly in the wake of the financial crisis.
We’ve harped on the fact that the likely reason Bear was bailed out was its credit default swap exposures. At the time of Bear’s failure, Lehman, UBS, and Merrill were seen as next. The authorities went into Mission Accomplished mode rather than mounting a full bore, international effort to get to the bottom of CDS exposures. And the Greek affair suggests they’ve continued to sit on their hands.
This matters because, as Lisa Pollack illustrated in a neat little post, supposedly hedged positions across counterparties can quickly become unhedged if one counterparty fails. So a basic data gathering exercise would at least help in identifying who is particularly active and has high exposures to specific counterparties and products.

Without this basic data, market participants cannot assess the risk of each of their exposures, monitor their exposures for changes in risk, and adjust the amount and price of their exposures according to their independent assessment of the current level of risk.
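Pollack's point can be sketched with a toy example (the counterparty names and notional amounts below are hypothetical, not drawn from her post): a book that nets to zero across counterparties stops being flat the moment one of them fails.

```python
# Toy CDS book on a single reference entity, netted across counterparties.
# Notionals: positive = protection bought, negative = protection sold.
# (Names and amounts are purely illustrative.)
positions = {
    "DealerA": +100,  # bought 100mm of protection from DealerA
    "DealerB": -100,  # sold 100mm of protection to DealerB
}

def net_exposure(book, failed=frozenset()):
    """Net protection position, ignoring trades facing failed counterparties."""
    return sum(notional for cp, notional in book.items() if cp not in failed)

print(net_exposure(positions))               # 0: the book looks fully hedged
print(net_exposure(positions, {"DealerA"}))  # -100: DealerA's failure leaves a naked short
```

The "hedge" was never a property of the positions alone; it depended on every counterparty performing, which is exactly why knowing who is heavily exposed to whom matters.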
As a result, it is difficult for market participants to take the steps necessary to ensure that they do not have a bigger exposure to losses from a specific source than they intend.
But this is of less help with big financial firms than you might think. While Lehman was correctly seen as undercapitalized well in advance, pretty much no one foresaw Bear’s failure. It went down in a mere ten days. Confidence is a fragile thing.

Had Bear been required to provide ultra transparency and disclose on an ongoing basis its current asset, liability, and off-balance sheet exposure details, everyone would have been able to foresee Bear's failure.
The source of confidence in the financial system comes from the ability of market participants to independently assess all the useful, relevant information (in Bear's case, its disclosures under ultra transparency).
When they do not have this information and cannot assess what is going on in a 'black box' like Bear, market participants have a natural incentive to get their money back and find another investment.
Similarly, while some positions are not very liquid or all that easy to hedge (think of our favorite bête noire, second liens on US homes), in general big financial firms have dynamic balance sheets.

Which is why they need to disclose on an ongoing basis what is on them!
With more extensive reporting, could regulators have seen and intervened in MF Global’s betting the farm on short-dated Italian government debt? Even if they had perceived the risk, Corzine would have argued that the trade would work out (and it did, even though the firm failed by levering it too much)....

Why have the regulators step in when market discipline would have intervened instead?
Had MF Global had to disclose under ultra transparency the details of its bet from day one, every market participant dealing with MF Global would have been in a position to assess the risk of this bet and adjust their exposure accordingly.
While it is hard to object to having better data, and we desperately need better information in some key policy areas (the lack of good information in the housing/mortgage arena and in student debt is appalling), more data is unlikely to get us as far in the financial markets sphere as Tett hopes.
The problem, as we and others have discussed before, is that the financial system is tightly coupled....

This confuses a symptom of the problem, tight coupling, with the problem that gives rise to tight coupling: opacity.
The reason the financial system is tightly coupled is the lack of data. Without the granular level data provided by ultra transparency, market participants cannot assess the risk of loss, and as a result they have tended to underestimate risk and take on more exposure than they can afford to lose.
There are many reasons why tightly coupled systems are really difficult to model....

Which is why your humble blogger has been focusing on ultra transparency: it is a solution that can be implemented by all market participants rather than one that relies on a model.
If we want to reduce the frequency and severity of financial crises, it isn’t a data problem. It’s a systems design problem....

Actually, it is a data problem, as the global financial system that was designed in the 1930s is based on transparency.
So the answer does not lie in better data. It lies in the willingness of the authorities to stare down the financial services industry. And the next financial crisis is likely to be a necessary, but perhaps even then not sufficient, condition for that change in attitude.

The answer does lie in better data, as one of its benefits is that it is the market, which is bigger than the financial services industry, that stares down the industry.