Thursday, July 19, 2012

Big Data and The LIBOR Opportunity


Does anyone else think LIBOR is ridiculous because human intervention leaves it so easy to take advantage of? In a world where Google tracks every click everyone makes while logged in, we can’t track every loan that a major bank makes? Granted, there are many more variables involved in a loan, such as the credit rating of the party taking out the loan, the age of the person(s), the length of repayment, fixed or variable interest rates, the percent size of the down payment if applicable, and more depending on the loan type.
The point is, in one day of clicking around on the Internet I generate more data than all of the information needed to detail a new consumer loan. It seems simple to me that software could aggregate all of the credit and loan data used by, say, the 100 largest banks in the world, or at least those in UN member countries, and track in real time and unambiguously what rates are actually out there. Plus, with all of that data, better information about regional interest rates would be available. This system, which of course would be implemented by several for-profit companies for lack of an ambitious government, would fall under heavy regulation.
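
To make that concrete, here is a minimal sketch of what I have in mind, in Python, with field names (bank, loan_type, region, rate_percent, and so on) that are purely my own invention for illustration. The only real idea is that the benchmark falls out of observed transactions instead of submitted estimates:

```python
from dataclasses import dataclass
from statistics import median
from typing import List

# Hypothetical record: the handful of fields each reporting bank would submit.
@dataclass
class LoanRecord:
    bank: str            # reporting institution
    loan_type: str       # e.g. "mortgage", "auto", "credit_line"
    region: str          # e.g. "Dubuque, IA"
    credit_rating: str   # borrower's credit tier
    term_years: float    # length of repayment
    rate_percent: float  # introductory interest rate
    fixed_rate: bool     # fixed vs. variable

def benchmark_rate(reports: List[LoanRecord], loan_type: str) -> float:
    """A benchmark computed from observed transactions rather than submitted
    estimates: the median rate across all reported loans of the given type."""
    return median(r.rate_percent for r in reports if r.loan_type == loan_type)
```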

For example, in the last two months one friend of mine took out a 30-year variable-rate mortgage at 2.87% for the first five years, and another bought a new car with a five-year 0% interest loan. It seems simple to me to track that kind of data and make the summary widely available for free. Plus, there is a ton of money in it! Just think: what if you had a critical mass of large banks reporting all their loan transactions, but only you were able to break it down to the micro scale? The national and international rates could be published and updated in real time for free on the Internet, but to know what is happening in Dubuque, Iowa (or Chicago or Manhattan or Dubai or Bombay) would cost you a fee. In a way it makes interest rates more like the price of oil, which is pretty transparent.
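
Sticking with the made-up LoanRecord fields from the sketch above, the regional breakdown could be as simple as grouping the reported rates by location:

```python
from collections import defaultdict
from statistics import median

def rates_by_region(reports):
    """Median reported rate per region: the broad national figure could be
    published for free while this local breakdown is sold for a fee."""
    by_region = defaultdict(list)
    for r in reports:
        by_region[r.region].append(r.rate_percent)
    return {region: median(rates) for region, rates in by_region.items()}
```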

As for the regulation, unless a government wants to take the lead and run the software, I imagine it would be easy to pass a law post-LIBOR requiring the XX largest banks to comply and report a small number of details about every new loan. That is where the compliance aspect comes in: if there were ever a question about the published rate on, say, July 19th, 2012, it would be possible to review each and every new loan or line of credit reported on that date and match it with actual documents. Estimating that every American takes out three loans or lines of credit each year (I hope that is high), that amounts to about 2.5 million new loans per day. Assuming the developed world is about ten times the size of the US and that new loans per year per person is the same, that is only 25 million new loans per day. I realize that sounds like a lot, but if you can boil it down to the credit rating of the person, the location, and the terms of the loan (introductory interest rate, years of the loan, etc.), you have only a handful of cells in a row of a spreadsheet. Everything is documented. The way I see it, it's a win-win for everyone except the traders who made money off of fixing LIBOR and the LIBOR reporters who must have profited from fixing it.
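
For what it's worth, the back-of-the-envelope math behind those figures is just this, assuming roughly 310 million Americans in 2012 and taking my three-loans-per-person guess at face value:

```python
US_POPULATION = 310_000_000        # rough 2012 figure
LOANS_PER_PERSON_PER_YEAR = 3      # my (hopefully high) guess

us_loans_per_day = US_POPULATION * LOANS_PER_PERSON_PER_YEAR / 365
developed_world_loans_per_day = us_loans_per_day * 10

print(round(us_loans_per_day))               # roughly 2.5 million
print(round(developed_world_loans_per_day))  # roughly 25 million
```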


One last note, not directly related, but regarding accurate documentation. I am a huge fan of documenting everything. I will be the first to admit I do not want to relive the horror of my sins, but I certainly want to settle ambiguities with truth and honesty. I feel that documentation is related to education: the more documentation there is about an incident, the more we can learn about what actually happened, and with more education we can come to the best solution. This is true of many things.
