Algorithms and Tokenization
On October 1st, SEC Chairman Jay Clayton spoke with Brian Brooks, Acting Comptroller of the Currency, at a virtual event organized by the Chamber of Digital Commerce. Clayton spoke about how he believed all securities would eventually become tokenized using blockchain technology:
“If you talk about trading today, all trading is electronic. Our exchanges have gone electronic. Every trade you do... it gets routed through an electronic algorithm executed electronically. That was not the case 20 years ago. It may very well be the case that just as you had stock certificates and now you have entries, digital entries for representing stock. It may very well be the case that those all become tokenized.”
Twenty years ago, I was working at the forefront of algorithmic trading. Today, I am working at the forefront of tokenized securities. So let me tell you a story about what electronic and algorithmic trading was like 20 years ago:
In 2000, my first job on Wall Street was at a company called ITG. My job was to sell algorithmic trading solutions to hedge funds and institutional asset managers. ITG had a big first-mover advantage, as it had developed a trading platform called QuantEx – short for Quantitative Execution. QuantEx was built on top of a programming language that only ran on Unix. So, to use QuantEx, you had to have a Unix workstation.
The first challenge was selling the product. The only way to demo it was on a Unix computer, and our customers didn’t have Unix machines. There was also no Webex, no Zoom, no screen sharing of any kind. A few Unix laptops were on the market by the time I got into the business, but my predecessor used to schlep around NYC dragging a Sun SPARCstation, monitor and keyboard in a trolley behind him so he could do demos. His nickname was, and still is, “The Mule”.
When a client signed up, we had to order the physical hardware for them and have it shipped to their location. We’d then travel to their office to set up and install QuantEx. The public internet was too slow and unreliable, so we would also have to order a dedicated “T1 line” – a physical cable that the telco connected from the nearest access point to the client’s offices. One client in rural Indiana was too remote to get a T1 line installed, so he had satellite connectivity. Latency under the best of circumstances would have been over a second, and likely a lot more. Today, latency is measured in millionths of a second.
The algorithmic trading strategies themselves, although cutting-edge at the time, were also pretty basic. For example, many algos simply followed a generic schedule derived from historical data. Modern algos are optimized in real time on a stock-by-stock basis, incorporating sophisticated models of volume, volatility and liquidity.
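To make the contrast concrete, here is a minimal sketch of that old “generic schedule” approach: slicing a parent order across the day in proportion to a historical intraday volume profile, the way a simple VWAP-style algo of that era might. The bucket times and volume weights below are illustrative, not real market data.

```python
# Toy sketch of the "generic schedule" approach: slice a parent order
# across the day in proportion to a historical intraday volume profile.
# The profile numbers here are illustrative, not real market data.

historical_volume_profile = {  # share of daily volume per half-hour bucket
    "09:30": 0.12, "10:00": 0.09, "10:30": 0.07, "11:00": 0.06,
    "11:30": 0.05, "12:00": 0.04, "12:30": 0.04, "13:00": 0.05,
    "13:30": 0.06, "14:00": 0.07, "14:30": 0.08, "15:00": 0.10,
    "15:30": 0.17,  # U-shaped curve: heavy at the open and the close
}

def vwap_schedule(order_qty: int, profile: dict[str, float]) -> dict[str, int]:
    """Allocate order_qty across time buckets in proportion to the profile."""
    schedule = {t: int(order_qty * w) for t, w in profile.items()}
    # Put any rounding remainder into the final bucket.
    schedule["15:30"] += order_qty - sum(schedule.values())
    return schedule

print(vwap_schedule(100_000, historical_volume_profile))
```

The schedule is fixed before the open and never reacts to the market, which is exactly why the real-time, per-stock optimization of modern algos was such a step up.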
Sometimes an electronic order wasn’t even fully electronic – it could get picked up by a human and traded manually at the exchange or on a market-making desk.
Stocks were traded in fractions, not decimals, so the minimum price increment for a stock was 1/16 of a dollar. The spread between the best bid and offer was therefore at least 6.25 cents, and often 12.5 cents or more – far wider than today’s penny spreads, and a real cost to investors.
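The arithmetic is simple but the impact was real. Here is a quick sketch comparing the round-trip cost of crossing a one-tick-wide spread under sixteenths versus pennies (the order size is illustrative):

```python
from fractions import Fraction

# Minimum price increments: sixteenths (pre-decimalization) vs. pennies (today)
tick_fractional = Fraction(1, 16)   # $0.0625
tick_decimal = Fraction(1, 100)     # $0.01

shares = 10_000
for label, tick in [("1/16", tick_fractional), ("penny", tick_decimal)]:
    # Buying at the offer and later selling at the bid loses one full
    # spread; with a one-tick-wide market that is tick * shares.
    cost = float(tick * shares)
    print(f"{label} tick: minimum spread ${float(tick):.4f}, "
          f"round-trip cost on {shares:,} shares = ${cost:,.2f}")
```

On a 10,000-share order, that is $625 versus $100 before the spread is even a tick wider than the minimum.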
While I was lucky to be working at one of the most advanced firms on the Street, one that embraced the technology, the same could not be said for most of the industry. My peers at investment banks faced stiff resistance from the established desks and were prevented from selling into the most lucrative accounts. The New York Stock Exchange – the epitome of trading on Wall Street – was one of the most ardent defenders of the ‘old way’ of doing things, favoring the ‘specialist’ system, where each security had a human trader on the floor assigned to make markets in it exclusively. The NYSE is still known for its glorious trading floor, which makes a great backdrop for journalists reporting on the market, but most trading actually takes place in data centers 30 miles away in Mahwah, New Jersey.
Despite all this, electronic trading was still a big improvement over what came before it. Adoption increased significantly from 2000 onwards, driven by competition from firms such as ITG, Archipelago, Instinet, Island and Flextrade. Technologies such as the FIX protocol, order management systems and broadband connectivity made it easier and easier for firms to integrate algorithmic trading into their workflow. And changes in market structure such as decimalization and colocation helped bring down costs and improve performance.
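FIX deserves a word, because it became the plumbing that made all of this interoperable: a plain tag=value wire format that buy-side systems still use to route orders to brokers today. Below is a minimal sketch of assembling a FIX 4.2 NewOrderSingle (MsgType 35=D); the CompIDs, symbol, quantity and price are made up for illustration.

```python
# Minimal sketch of building a FIX 4.2 NewOrderSingle (35=D), the kind
# of message an OMS-to-broker link carries. CompIDs, symbol, quantity
# and price are illustrative, not from any real session.
SOH = "\x01"  # FIX field delimiter

def fix_message(fields: list[tuple[int, str]]) -> str:
    body = SOH.join(f"{tag}={val}" for tag, val in fields) + SOH
    head = f"8=FIX.4.2{SOH}9={len(body)}{SOH}"          # tag 9: body length
    checksum = sum((head + body).encode()) % 256        # tag 10: byte sum mod 256
    return head + body + f"10={checksum:03d}{SOH}"

order = fix_message([
    (35, "D"),                   # MsgType: NewOrderSingle
    (49, "BUYSIDE"),             # SenderCompID (illustrative)
    (56, "BROKER"),              # TargetCompID (illustrative)
    (34, "1"),                   # MsgSeqNum
    (52, "20001002-14:30:00"),   # SendingTime
    (11, "ORD-0001"),            # ClOrdID
    (21, "1"),                   # HandlInst: automated, no human intervention
    (55, "IBM"),                 # Symbol
    (54, "1"),                   # Side: buy
    (38, "10000"),               # OrderQty
    (40, "2"),                   # OrdType: limit
    (44, "112.0625"),            # Price (112 1/16 in old fractional terms)
    (60, "20001002-14:30:00"),   # TransactTime
])
print(order.replace(SOH, "|"))   # substitute delimiter for readability
```

The body length (tag 9) and the mod-256 checksum (tag 10) are the only computed fields; everything else is plain text, which is a big part of why FIX spread so quickly across the industry.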
Trading is now many times cheaper and more efficient than 20 years ago, and as Chairman Clayton noted, electronic trading is the norm.
The same will come to pass for digital securities. Tokenization using blockchain is a better technology that can deliver improved transparency, streamlined workflows, faster settlement and lower costs. We face many challenges just as electronic trading did 20 years ago: blockchain is not well understood or accepted today; established players and even regulators are resistant to change; certain market participants such as digital custodians have not yet emerged; and workflows and user interfaces can be clunky and unintuitive for many people.
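To make “tokenization with compliance built in” concrete, here is a toy sketch of the core idea: a ledger where transfers settle instantly but only clear between whitelisted, KYC-approved investors. This is a simplified illustration of restricted-transfer security tokens in general, not any particular standard or production system.

```python
# Toy model of a permissioned security token ledger: transfers settle
# instantly, but only between whitelisted (KYC-approved) investors.
# A simplified illustration, not any specific standard or product.

class SecurityTokenLedger:
    def __init__(self, issuer: str, total_supply: int):
        self.whitelist = {issuer}            # issuer is approved by default
        self.balances = {issuer: total_supply}

    def approve_investor(self, investor: str) -> None:
        """Add an investor once off-chain KYC/AML checks have passed."""
        self.whitelist.add(investor)

    def transfer(self, sender: str, receiver: str, qty: int) -> None:
        if receiver not in self.whitelist:
            raise PermissionError(f"{receiver} has not passed compliance")
        if self.balances.get(sender, 0) < qty:
            raise ValueError("insufficient balance")
        self.balances[sender] -= qty
        self.balances[receiver] = self.balances.get(receiver, 0) + qty
        # Settlement is the ledger update itself: no multi-day wait.

ledger = SecurityTokenLedger("ISSUER", 1_000_000)
ledger.approve_investor("ALICE")
ledger.transfer("ISSUER", "ALICE", 25_000)
print(ledger.balances)
```

The point of the sketch is that the compliance check and the settlement are the same step, rather than separate processes bolted together across intermediaries.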
There are lots of smart people and great companies innovating in the tokenized securities space alongside Texture Capital – Arca Labs, Archax, Securitize, Securrency, TokenSoft, tZero, Vertalo and many others – while advocates such as the Chamber of Digital Commerce and the Wall Street Blockchain Alliance are driving the industry forward. Right now we are focused on the foundational layers of a tokenized security market: token issuance, compliance, trading, and integration with legacy systems and with each other. Soon, the industry will begin harnessing the creative ideas emerging in DeFi, followed by further innovations we haven’t even thought of yet.
And in 20 years, a future Chairman of the SEC will be reminiscing about a time when securities weren’t tokenized.