January 20, 2015
I’ve been heads-down working on SpreadServe recently, so haven’t paid as much attention to the etrading topics that I used to blog about. Thanks to an update from mdavey, I’ve been catching up on the excellent, thought-provoking content that jgreco has been posting on his plans for a new US Treasury trading venue, organised as a limit order book, with buy and sell side trading on an equal footing. I enjoyed the post on internalization and adverse selection. His points about single dealer platforms are well founded too, though my own experience in rates trading is that it’s difficult to get client flow on to SDPs: by their very nature they can’t offer multi-dealer RFQs, which are critical for real money clients that must prove best execution for regulatory reasons. Of course, if the inter-dealer prices from BrokerTec, eSpeed and EuroMTS were public in the same way as equity prices from major exchanges are public, then more solutions to the best execution problem would be possible. As jgreco rightly points out, transparency is key.
Now I want to raise a few questions prompted by jgreco’s posts, both pure tech, and market microstructure…
- Java? Really? I wonder if it’s inspired by LMAX’s Java exchange implementation, their custom collections and Disruptor. I would have expected C++, but then I’m an old school C++er.
- Is that really the workflow ? That must be a tier 2 or 3 bank. All my experience has been at tier 1 orgs, where all pricing and RFQ handling is automated. If a trader quotes a price by voice, it’s a price made by the bank’s own pricing engines. Those engines will be coded in C++, driven by Eurex futures or on-the-run USTs, and showing ticking prices on the trader desktop. Mind you, obfuscation techniques were used to frustrate step 2: copy and paste quote. After you’ve spent a fortune building a rates etrading infrastructure, you don’t want everyone backing out your curve from your Bloomberg pages.
- Will DirectMatch have decimal pricing, or are you going to perpetuate that antiquated 1/32nd stuff?
- How will you handle settlement/credit risk? Will each trade result in two, with counterparties facing off with a clearing house?
- How do you shift liquidity? When liquidity is concentrated at a single venue, it’s difficult to move. The only case I know of is German Govt Futures moving from Liffe to Eurex. I guess UST liquidity is fragmented across D2D and D2C venues, so it’s not concentrated all in one place, improving DirectMatch’s chances of capturing some of the flow.
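On the 1/32nd point, for anyone unfamiliar with the convention: UST prices are quoted as a handle plus 32nds, with a trailing “+” for half a 32nd and, in some feeds, a trailing digit for eighths of a 32nd. Here’s a minimal Python sketch of the conversion to decimal – the function name and the exact quote format are my own assumptions, not any venue’s spec:

```python
from fractions import Fraction

def ust_to_decimal(quote: str) -> Fraction:
    """Convert a UST quote in 32nds (e.g. '99-16', '99-16+', '99-162')
    to an exact price as a Fraction of par. The digit after the 32nds
    is eighths of a 32nd; '+' means half a 32nd."""
    handle, _, tail = quote.partition("-")
    ticks = Fraction(int(tail[:2]))          # whole 32nds
    rest = tail[2:]
    if rest == "+":
        ticks += Fraction(1, 2)              # half a 32nd
    elif rest:
        ticks += Fraction(int(rest), 8)      # eighths of a 32nd
    return int(handle) + ticks / 32

print(float(ust_to_decimal("99-16")))    # 99.5
print(float(ust_to_decimal("99-16+")))   # 99.515625
```

Using Fraction keeps the arithmetic exact; converting to float at the edges is harmless since 32nds and their eighths are dyadic. You can see why decimal pricing is simpler.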
September 29, 2014
Zac Townsend‘s post on how Standard Treasury aims to turn banks into platforms is intriguing. There’s certainly no lack of ambition in his goal. But I do wonder if he’s setting himself up to tilt against the very nature of both banks and platforms. One of the key phrases in Zac’s post is: “allowing developers to think of banks as platforms”. I’ll just unpack that a little. First, platforms, as explicated in Evans & Hagiu’s excellent Invisible Engines. Platforms are multi-sided markets: one side pays for access to the revenue generating customers that an effective platform aggregates by offering free or cheap access. For example, in gaming, game devs pay licenses to platform (console) owners so they can sell to gamers, while the console manufacturers sell consoles at or even below cost. In financial trading, clients pay Bloomberg for access to information & liquidity, and dealers get access to the platform without paying fees to Bloomberg. Famously, Google and Facebook offer services free to consumers to enable them to sell to advertisers. So if banks are going to spend a load of cash adopting Standard Treasury tech so they can become more like real software platforms, who is going to pay?
Let’s bear in mind that banks are already liquidity platforms. They charge fees for access to the liquidity they provide by aggregating client capital. They disguise fees by making some things “free”, and charging for others when they cross sell. If a software platform commoditises or aggregates their services, they lose the cross sell, and with it the margins. They will certainly resist that prospect. So any software platform that integrates banks with software services needs to offer the prospect of more margin on existing deal flow, or new deal flow, to justify the cost of adoption. Judging by Zac’s post, it looks as if he thinks the new deal flow would come from the underbanked via mobile apps. Will that deal flow justify the cost of implementing Standard Treasury tech? I’m sceptical…
Standard Treasury should also be considering the cost of decommissioning those expensive legacy systems. In banking, old and new systems tend to run in parallel until all stakeholders are confident that the new system supports all legacy system functionality. So new tech that promises cost savings tends to cause a cost spike until the old system really has been put to bed. And, believe me, that can be a lengthy and painful process! I have first hand experience of systems that have resisted retirement for decades…
March 1, 2012
Thanks to reddit I’ve just discovered Bret Victor. I watched the Invention video, and enjoyed the whole theme on tightening the feedback loop between changing code and seeing results. The later part on moral crusading was interesting if not entirely convincing. So I checked out the web site, and am reading Magic Ink. Wow ! This is a full blown vision of doing software differently. Back in the 90s I got really excited by, in turn, Brad Cox’s vision, Patterns, and Open Source. About 10 years ago I discovered dynamically typed languages with Python and Smalltalk. And that’s the last time I had a real rush of excitement about some new approach in software. Sure, I’ve dabbled in functional languages like F#, and played with various OSS projects. But for the most part my attention has been on the trading topics that fascinate me, like electronic limit order books.
So what’s Magic Ink about ? Victor divides software into three categories: information, manipulation and communication software. He focuses on information software, which is most apps really. And that includes most financial and trading apps. And then he proceeds to argue that there’s too much interactivity, and that interaction is bad. The way forward is context sensitivity combined with history and graphic design. Counterintuitive, and utterly convincing. A joy to read !
I can’t help wondering what the UX crew over at Caplin think of this ? I haven’t seen them blogging on it. Victor’s views have radical implications for how etrading apps should work. I’d expect Sean Park to be pushing this angle with his portfolio companies too…
August 29, 2011
Fascinating post from Quantivity – I’m hoping for more on the same topic from him. Many of the advantages listed would be enjoyed by any small non real money fund: hedge, prop, family office etc. Of course there are some serious obstacles that small, (relatively) unregulated funds face, and Lars Kroijer describes them in detail in Money Mavericks. And a lack of legacy technology is indeed an advantage in building trading systems quickly. That said, a relatively recent pre-existing framework, whether vendor-supplied or built in house, can be a big advantage; a classic example is gateways for exchange/ECN connectivity.
February 2, 2011
Fascinating blog on HFT implementation from WK. He comments: “a variation on this structure is to store the Limits in a sparse array instead of a tree.” More detail on the implications for L1 and L2 cache behaviour of trees versus arrays for limits would be welcome. I’m assuming a C++ implementation here of course, though WK points out you can make Java go fast if you avoid GC, which chimes with the experience of the LMAX guys. I ask because I interviewed with a big HFT firm last year: they gave me a programming exercise based on L1/L2 cache behaviour.
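For the curious, here’s a toy sketch of the sparse-array idea in Python. A real implementation would of course be C++ (or GC-free Java) where the flat, contiguous layout is what buys the cache-friendly behaviour; the hypothetical ladder below just shows the mechanics of indexing levels by tick and tracking the best:

```python
# Sketch only: price levels in a flat array indexed by tick, so level
# lookup is O(1) with no pointer chasing, versus walking a tree.
TICK = 0.01                  # assumed tick size
P_MIN, P_MAX = 0.0, 200.0    # assumed price band
N = int((P_MAX - P_MIN) / TICK) + 1

class BidLadder:
    def __init__(self):
        self.qty = [0] * N   # aggregate size at each price level
        self.best = -1       # index of best (highest) bid, -1 if empty

    def idx(self, price: float) -> int:
        return round((price - P_MIN) / TICK)

    def add(self, price: float, size: int):
        i = self.idx(price)
        self.qty[i] += size
        if i > self.best:
            self.best = i

    def remove(self, price: float, size: int):
        self.qty[self.idx(price)] -= size
        # If the best level emptied, walk down to the next populated one.
        # In practice the best moves only a few ticks, so the scan is short.
        while self.best >= 0 and self.qty[self.best] == 0:
            self.best -= 1

    def best_bid(self):
        return None if self.best < 0 else P_MIN + self.best * TICK

book = BidLadder()
book.add(101.25, 500)
book.add(101.24, 300)
book.remove(101.25, 500)
print(round(book.best_bid(), 2))   # 101.24
```

The trade-off is memory: you burn an array slot for every representable price, populated or not, in exchange for predictable access patterns – which is exactly where the L1/L2 cache questions come in.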
February 1, 2011
I’m reading Zuckerman’s Greatest Trade Ever, an account of how John Paulson’s hedge fund profited from the credit crunch. There’s a lot of anecdotage and general background. But among all that there’s some good detail on implementation. How to implement a view of the markets as a trade is a key question for any trader. Drobny’s House of Money is excellent on this. Zuckerman’s book has some good stuff on why shorting the bonds or equity of home loan origination companies didn’t work, why CDSs on sub prime MBSs didn’t become tradeable until 2005, and why they were the right vehicle for shorting. Also on why using CDSs means negative carry, and why that’s generally a difficult thing for any portfolio. Taleb has some good comments on why his out of the money options strategy suffered from the same problem.
January 28, 2011
So I’ve signed myself up for the LMAX UCL algo trading conference – see you there ! I’m looking forward to Martin Thompson’s talk, and hoping I can bend his ear on a few server engineering questions over drinks at the end of the afternoon. I’m also keen to know more about the LMAX market design. In the infoq presentation I linked earlier there are some intriguing comments about ensuring low and stable latency for market makers. This makes me wonder what the terms are for market maker participation on the LMAX order books. Do market makers have a different API than market takers ? Do makers get processing priority for placing, pulling and amending orders ? Can maker orders cross with other maker orders, or only market taker orders ?
September 25, 2010
If an FX dealer quotes a currency rate – say USDCHF – as 0.9877, that means USD is the base currency and CHF the term currency. The quoted rate is the number of Swiss Francs per US Dollar. If a client trades at the quoted rate, then the dealt currency is the one that specifies the unit of the trade. So if USD is the dealt currency, and the trade size is 10,000,000, then we’re trading 10,000,000 USD. Which means a trade size of 9,877,000 CHF. In this case CHF is the contra currency.
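The dealt/contra arithmetic above can be sketched in a few lines of Python. The function name and signature are my own invention for illustration; the only real content is multiply when the base currency is dealt, divide when the term currency is dealt:

```python
def contra_amount(rate: float, base_ccy: str, dealt_ccy: str,
                  dealt_amount: float) -> float:
    """Amount of the contra currency for an FX trade quoted term-per-base,
    e.g. USDCHF 0.9877 = 0.9877 CHF per 1 USD."""
    if dealt_ccy == base_ccy:
        return dealt_amount * rate   # base dealt: contra is the term ccy
    return dealt_amount / rate       # term dealt: contra is the base ccy

# 10,000,000 USD dealt at USDCHF 0.9877 -> 9,877,000 CHF contra
print(round(contra_amount(0.9877, "USD", "USD", 10_000_000)))   # 9877000
```

Run it the other way – 9,877,000 CHF dealt – and you get the 10,000,000 USD back, modulo floating point noise.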
September 14, 2010
Thanks to Matt for the heads up on this article. In pt III Tusar asserts that Trade-At “essentially creates the CLOB [consolidated limit order book]–the thing that everyone agrees would stifle innovation.” Now my knowledge of algo equity execution is certainly not state of the art. But it’s news to me that there’s a body of opinion that thinks CLOBs stifle innovation. Certainly the advantage of CLOBs is that they aggregate liquidity for a security, and provide a public, visible live price during trading, and a record of open and close prices too. They also tend to come with clearing arrangements. This is exactly the arrangement that the industry has been inching towards for the OTC stuff that was so problematic in 2008, with the introduction of swaps clearing, which should improve transparency.
I suspect that Tusar is talking his own book. Plus ça change ! Of course major broker dealers like Goldman want “innovation” in the form of multiple dark pools and execution venues, including their own. That leads to fragmented liquidity, which causes opacity and hinders price discovery. Which is good for Goldman, and bad for clients who will need expensive Goldman execution services.
Update: there’s an enlightening discussion of Trade-At here.
April 28, 2010
The price action on Greek government bonds over the last couple of days has been staggering. I’ve been watching the short end 2yr GGBs. Yesterday the yield went out from 14% in the morning, through 15% to 16% in the afternoon. This morning the yield hit 20% on the 2012 GGBs. At the time of writing it’s calmed down to 17%. This compares to yields of 0.8 to 2% for benchmark EGBs, depending on coupon.
For those not familiar with fixed income pricing, yields are inversely related to prices: rising yields mean falling prices. Prices are quoted as a percentage of par. For benchmark German, French, Belgian or Dutch EGBs, short end prices are ~99 to 105% of par. Bear in mind the coupons ! Those high GGB yields mean investors and speculators are only prepared to pay ~67 to 77% of par. Which reflects the market’s view of the default risk.
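To make the inverse relation concrete, here’s a toy annual-compounding pricer for a hypothetical 2yr bond with a 4% coupon – a rough sketch, not a proper day-count-aware yield calculator:

```python
def price_from_yield(face: float, coupon_rate: float,
                     yield_rate: float, years: int) -> float:
    """Present value of coupons plus principal, annual pay, as % of par."""
    coupon = face * coupon_rate
    pv = sum(coupon / (1 + yield_rate) ** t for t in range(1, years + 1))
    pv += face / (1 + yield_rate) ** years
    return pv

print(round(price_from_yield(100, 0.04, 0.02, 2), 2))   # 103.88, benchmark EGB-style yield
print(round(price_from_yield(100, 0.04, 0.20, 2), 2))   # 75.56, distressed GGB-style yield
```

Same bond, same cashflows: discount at 2% and you’re above par, discount at 20% and you’re in the mid 70s – which is exactly the haircut the market was demanding on those GGBs.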
The flows are intriguing too, but I won’t comment here ;)