Thursday, March 8, 2012

FT's Gillian Tett shines light on need for ultra transparency

In her Financial Times column, Gillian Tett looks at the need to collect all the useful, relevant data and make it available to the market so that it can be used.

Her column covers many of the topics this blog has discussed, including Wall Street's Opacity Protection Team, the Office of Financial Research (where transparency goes to die), and the question of who is capable of analyzing the data and has an incentive to do so.
If Greece were to descend into disorderly default, how big would the financial hit be? That question has been stirring up intense debate in the markets; and, of course, among regulators too. With negotiations about Greek debt having dominated the headlines, it has been important to understand the “what if” scenarios. 
But amid the number-crunching, there is a bigger question too: just how much information is really available about the likely impact of any Greek shock?
Very little.
After all, when the financial system went into shock in 2008, following the Lehman Brothers collapse, it became clear that modern finance was plagued by a data fog; although investors could find timely information about equities, say, there was very limited transparency about other areas, such as the repo market or credit derivatives (partly because banks had a commercial interest in maintaining that fog).
Please re-read the highlighted text, as it neatly summarizes why there is a need for ultra transparency throughout the financial system and for the creation of the Mother of all financial databases.

It is not only regulators who want to answer questions like what the impact of the Greek debt restructuring will be; market participants like investors want answers too.
Since then, global regulators have pledged reforms. And in some senses, considerable progress has been made. 
Today banks and other financial institutions are filing far more detailed reports on those repo and credit derivatives trades, and regulators are exchanging that information between themselves. 
Meanwhile, in Washington a new body – the Office of Financial Research (OFR) – has been established to monitor those data flows and in July US regulators will take another important step forward when they start receiving detailed, timely trading data from hedge funds, for the first time.
This is just more of the same: regulators trying to maintain their information monopoly.
But there is a catch: although these reports are now flooding in, what is still critically unclear is whether the regulators – or anybody else – has the resources and incentives to use that data properly.
If this data were made available to all market participants, there is no doubt that they would have the resources and the incentive to use it properly. There is a significant amount of money to be made from doing so!

Said another way, market participants not only have a monetary incentive to use the data, they also have the expertise to analyze it.
The bitter irony is that this information tsunami is hitting just as institutions such as the Securities and Exchange Commission are seeing their resources squeezed; getting the all-important brain power – or the computers – to crunch those numbers is getting harder by the day. 
That means that important data – on Greece, or anything else – could end up languishing in dark corners of cyberspace. That is a profound pity in every sense. 
After all, if the data could be properly deployed, it might do wonders to show how the modern global financial system really works (or not, in the eurozone.) 
Where "properly deployed" means that all market participants have access to it.
Conversely, if data ends up partly unused, that not just creates a pointless cost for banks and asset managers – but could also expose government agencies to future political and legal risk, if it ever emerges in a future crisis that data had been ignored.  
The US repo market is a case in point. Before 2007, this was a sector that epitomised the sense of data fog: timely information about activity was patchy, partly because investor and public interest was extremely low. 
The lack of timely information was not the fault of the investors; it was the fault of the regulators.

Regulators are responsible for ensuring that market participants have access to all the useful, relevant information in an appropriate, timely manner.  This applies to all the currently opaque corners of finance like banks, structured finance securities, Libor rate setting...
But when the crisis erupted at Bear Stearns and Lehman Brothers in 2008, the importance of the repo market became clear. So regulators started to demand far more detailed data from banks, money market funds and other institutions; most notably, they now collect so-called N-MFP submissions, which contain the first granular data on individual securities.... 
A team from ratings agency Fitch recently spent several weeks combing through those obscure N-MFP forms, for example, and then collated them to provide a fascinating – and once unimaginable – portrait of this shadowy sector.
Showing that market participants do know how to use the granular-level data that ultra transparency would provide.
This reveals that the use of structured finance collateral has recently risen, but haircuts on repo trades have fallen; the top five institutions now represent just 60 per cent of the market, down from 80 per cent in 2008.* 
But this snapshot only emerged because Fitch stumbled on these forms and then took the initiative to laboriously comb through thousands of individual files. Without that ad hoc effort, there is no system in place to let a casual observer collate those trades, or track the market as a whole. 
Showing that the market does have an incentive to assess the data that granular-level ultra transparency would make available.
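To make concrete the kind of collation Fitch had to do by hand, here is a minimal sketch in Python. The fund names, counterparties and dollar figures are made up for illustration (this is not the actual N-MFP schema or Fitch's methodology); the sketch simply aggregates per-filing repo exposures by counterparty and computes the market share held by the top five institutions, the sort of concentration figure cited above.

```python
# Minimal sketch of collating N-MFP-style filings to measure repo market
# concentration. All records below are illustrative placeholders; the real
# N-MFP schema and figures differ, and Fitch's methodology is not shown here.

from collections import defaultdict

# Hypothetical (fund, counterparty, repo_amount_usd) records, as might be
# extracted from individual money market fund filings.
filings = [
    ("Fund A", "Bank 1", 5_000_000_000),
    ("Fund A", "Bank 2", 3_000_000_000),
    ("Fund B", "Bank 1", 4_000_000_000),
    ("Fund B", "Bank 3", 2_000_000_000),
    ("Fund C", "Bank 2", 1_500_000_000),
    ("Fund C", "Bank 4", 1_000_000_000),
]

# Aggregate repo exposure by counterparty across all filings.
exposure = defaultdict(int)
for _fund, counterparty, amount in filings:
    exposure[counterparty] += amount

# Share of the total market held by the five largest counterparties.
total = sum(exposure.values())
top_five = sorted(exposure.values(), reverse=True)[:5]
top_five_share = sum(top_five) / total

print(f"Top-5 counterparty share of repo collateral: {top_five_share:.0%}")
```

The point of the sketch is not the arithmetic, which is trivial, but that it only works if the underlying filings are available to whoever wants to run it.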
Maybe this will change. Recent innovations in IT in theory make it easier than ever to track complex global data flows.... 
Yesterday, for example, IBM's Watson signed up its first financial client, Citigroup. Watson is designed to track complex global data flows and turn them into real information.
But real change requires political will – and resources. Sadly, that remains patchy, at best. And don’t expect the banks to take the lead: although bank trade groups say they tentatively support the introduction of LEIs, they have also just written to regulators complaining about the costs of the OFR, in an effort to clip its wings. 
For the moment, in other words, there are still plenty of financial players who seem disinclined to blow away the data fog; be that in Greece, or anywhere else.
Naturally, Wall Street's Opacity Protection Team does not want the data fog to be blown away. Wall Street has a financial interest in keeping it in place.

As Yves Smith observed on Naked Capitalism, no one on Wall Street received large bonuses for designing low-margin, transparent products.

Opacity gives Wall Street the opportunity to profit from other market participants' mis-assessment of risk and value.
