Talk of automation, algorithms and data in equity trading has reached an all-time high.  The sell-side – brokers and technologists – has a tremendous vested interest in complex offerings.  Is it overkill? Trading desks at many institutions and hedge funds, as well as service providers, may be squandering valuable resources on issues that don’t really matter.

At base, we underestimate the real value of the experienced trader, over-rely on automation, and focus too much on distractions that just don’t matter.  Maybe we need to go back to basics.

Best Execution and Trading Fundamentals 

The premise of Best Execution has never changed.  The fundamental concepts – minimizing slippage or realizing a benchmark while controlling market impact – remain the same in every market. In the post-Regulation NMS era, sourcing liquidity in a fragmented market has introduced new complications.  However, this has not compromised the basic tenets.
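
These fundamentals reduce to simple arithmetic. As a hypothetical illustration (not any firm’s methodology), the sketch below computes slippage for a single parent order against an arrival-price benchmark and an interval VWAP benchmark:

```python
# Hypothetical illustration of best-execution arithmetic: slippage versus
# an arrival-price benchmark and an interval VWAP benchmark.

def vwap(prints):
    """Volume-weighted average price over a list of (price, shares) prints."""
    total_shares = sum(shares for _, shares in prints)
    return sum(price * shares for price, shares in prints) / total_shares

def slippage_bps(avg_fill_price, benchmark, side):
    """Signed slippage in basis points; positive means worse than the benchmark."""
    sign = 1 if side == "buy" else -1
    return sign * (avg_fill_price - benchmark) / benchmark * 10_000

# A hypothetical buy order: arrival price, our fills, and market-wide prints.
arrival_price = 25.00
fills = [(25.02, 4_000), (25.05, 6_000)]            # (price, shares) we executed
market_prints = [(25.01, 50_000), (25.06, 80_000)]  # (price, shares) traded market-wide

avg_fill = vwap(fills)
print(f"Slippage vs. arrival: {slippage_bps(avg_fill, arrival_price, 'buy'):.1f} bps")
print(f"Slippage vs. VWAP:    {slippage_bps(avg_fill, vwap(market_prints), 'buy'):.1f} bps")
```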

The trading environment is certainly complex.  But greater reliance on technology may not be the answer to mastering the markets.  Better focus and smarter use of basic automation are the avenues to superior results.

Enter the Noise

Service providers overlook the importance of the human element in the trading workflow.  Ask them outright and they’ll deny it.  Yet examine their offerings.  Across the spectrum of providers you’ll see an ever-wider array of algorithms, infrastructure solutions to manage “Big Data,” and analytics and automation designed to guard against the perils of high frequency traders.

Technology and analytics evolve, as do markets and infrastructure.  But is “more” always “better”?  It may be sacrilege to some, but Big Data, advanced algorithms, and (oh yes) the continued threat of high frequency traders may be distractions unworthy of the attention they receive.

Big Data

Big Data is kind of a big joke.  Though regarded as a unique concept, it is really a discussion of scalable infrastructure.  Scalable data infrastructures are not new, nor is data analysis.

As it pertains to trading, Big Data involves the collection, management, and use of tremendous amounts of tick and execution data and derivative market analytics.

There is a clear need to manage larger data sets.  But to what end?  How much more refined can smart order routers, advanced algorithms, and analytics become?  And where is the pay-off in all that work?  A point exists where the returns obtained through research and development do not justify the cost. This is where views on the importance of Big Data diverge.

There is a great deal of discussion about how Big Data can help improve execution quality, especially with regard to smart order routers.  Certainly, tremendous amounts of data are required to run a smart order router.  It is arguable, however, whether still more data can make execution quality appreciably better.  The same holds true for analytics.  Does the existence of more data influence the benefit and advancement of analytics?  If the answer is “no,” as we believe it is, then why the continued sizable investment in their development and enhancement?

A range of tangents has evolved with the increasing attention garnered by Big Data.  The emergence of certification programs, designed to give credence to individuals’ data processing and management capabilities, is one example. These certifications are often granted by organizations that are neither widely known nor universally respected, but that help promote the individuals comprising them.  It is always interesting when an issue takes on a life of its own.

One area may well benefit from a broader and more available data set:  transaction cost analysis (TCA) for the buy-side.

The ability of institutional traders to assess execution performance independently – without the help of a broker – has obvious merit.  There are inherent conflicts of interest between brokers and clients in order routing, driven by exchange maker-taker pricing models.  It makes sense for buy-side firms to assess broker performance, provided it can be done cost-effectively.
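
As a hypothetical sketch of what independent, buy-side TCA could look like (the records and field layout are illustrative, not any vendor’s schema), per-broker slippage can be computed directly from a firm’s own fill data:

```python
# Hypothetical buy-side TCA sketch: share-weighted arrival-price slippage per broker,
# computed from the firm's own fill records (illustrative data, not a vendor schema).
from collections import defaultdict

fills = [
    # broker, side, arrival price, fill price, shares
    ("BrokerA", "buy",  25.00, 25.03, 5_000),
    ("BrokerA", "sell", 40.10, 40.05, 2_000),
    ("BrokerB", "buy",  25.00, 25.01, 5_000),
]

cost_bps = defaultdict(list)
for broker, side, arrival, px, shares in fills:
    sign = 1 if side == "buy" else -1
    bps = sign * (px - arrival) / arrival * 10_000  # positive = worse than arrival
    cost_bps[broker].append((bps, shares))

for broker, rows in cost_bps.items():
    total = sum(shares for _, shares in rows)
    avg = sum(bps * shares for bps, shares in rows) / total  # share-weighted average
    print(f"{broker}: {avg:.1f} bps average slippage on {total:,} shares")
```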

Specialty Algorithms

Basic algorithms predicated on time- and volume-weighted strategies are of considerable value.  They easily and accurately automate the actions a trader would otherwise take while monitoring orders and initiating trades by hand.
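
The automation involved is straightforward. A minimal sketch of the scheduling logic behind such strategies (simplified for illustration, not any broker’s production implementation) might look like this:

```python
# Minimal sketch of time- and volume-weighted order slicing.
# Simplified for illustration; real implementations add limit logic, randomization, etc.

def twap_schedule(total_shares, num_slices):
    """Split a parent order into equal child orders, one per time bucket."""
    base, remainder = divmod(total_shares, num_slices)
    return [base + (1 if i < remainder else 0) for i in range(num_slices)]

def vwap_schedule(total_shares, volume_profile):
    """Split a parent order in proportion to an expected intraday volume profile."""
    total_volume = sum(volume_profile)
    schedule = [round(total_shares * v / total_volume) for v in volume_profile]
    schedule[-1] += total_shares - sum(schedule)  # absorb rounding in the last slice
    return schedule

# Hypothetical 50,000-share order sliced over 13 half-hour buckets,
# using a stylized U-shaped intraday volume profile.
print(twap_schedule(50_000, 13))
print(vwap_schedule(50_000, [12, 9, 7, 6, 5, 5, 5, 5, 6, 7, 8, 10, 15]))
```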

Basic algorithms are commodities and do little to differentiate brokers.  Brokers that develop their own algorithmic suites have pushed the envelope, building arrays of algorithms to handle special situations.  These algorithms, in theory, automate the trading of hard-to-trade securities by becoming more or less aggressive depending on market conditions.

Specialized algorithms designed to handle “complex” trading scenarios (e.g., automating less-liquid securities or performing more intricate execution strategies) are of limited value.  Under the back-to-basics premise, the discerning eye of an experienced trader can assess price action far better than an algorithm.  That assessment of price action need not occur at the milli- or micro-second level but over a period of seconds or minutes.  It is for the same reason that most traders would never place an illiquid, hard-to-trade security in an algorithm.  Experienced traders can do a better job than algorithms in these less extreme instances.  After all, algorithms were developed to support – not replace – savvy traders.

Still, many brokers continue to develop and distribute specialty algorithms in an effort to differentiate themselves. Continued expansion of these offerings is an inefficient use of sell-side resources.

Specialty algorithms that seek liquidity across the full spectrum of displayed and non-displayed venues have value.  In such cases traders are principally concerned with capturing liquidity, potentially at the expense of obtaining truly optimal execution quality.  The tradeoff is reasonable given the difficulty of filling certain orders.

High Frequency Trading

The debate about the merits and detriments of HFT continues unabated.  There is considerable noise around the subject as most parties have a vested interest in their viewpoint. One problem is the lack of universally accepted and objective research on what is at heart a secretive practice.

There is tremendous banter over whether high frequency traders are “cheating,” manipulating markets and gaming other participants, or whether they are in fact enhancing overall market liquidity.  There are many types of high frequency strategies, and the research that is available, including ours, suggests only a small proportion are predatory.  It is interesting to note that most predatory strategies operate by ferreting out “less intelligent” algorithms, not by “attacking” human traders.

To the extent that high frequency traders may have an adverse impact on the execution quality of less sophisticated traders (human and electronic), the impact is likely minimal at exchange venues.  This is because child orders, which are generally small in size, are only nominally impacted.  When these child orders are taken in aggregate, the value taken by HFT is likely very small.

The real concern is not activity within exchange venues but the impact high frequency traders can have by gaming prices within dark pools.  Given that non-displayed venues often draw reference prices from the NBBO, the ability of high frequency traders to influence prices within dark pools is a definite concern.  This is of particular concern when an institutional trader is attempting to execute a block-sized order within a non-displayed venue.  Dark pool operators are cognizant of this issue, and those who provide access to high frequency traders diligently monitor activity.  We are confident this issue is largely under control.
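
To make the mechanics concrete, the toy example below (hypothetical quotes and sizes, not a model of any venue) shows how a midpoint-pegged dark pool price simply follows the NBBO, so even a brief shift in the displayed quote moves the price at which a resting block would cross:

```python
# Toy illustration of NBBO-derived dark pool pricing (hypothetical quotes, not a venue model).
# Many non-displayed venues cross orders at the NBBO midpoint, so the crossing price
# for a resting block follows whatever the displayed quote happens to be.

def nbbo_midpoint(best_bid, best_ask):
    """Reference price used by a midpoint-pegged dark pool."""
    return (best_bid + best_ask) / 2

resting_block_shares = 100_000  # institutional sell block resting in the dark pool

# Midpoint before and after a brief move in the displayed market.
print(nbbo_midpoint(25.00, 25.02))  # 25.01: price the block would cross at initially
print(nbbo_midpoint(24.98, 25.00))  # 24.99: a two-cent NBBO shift moves the cross down

# On a 100,000-share block, a $0.02 move in the reference price is worth $2,000.
print(resting_block_shares * 0.02)
```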

Back to Basics

Cutting through industry noise is imperative.  A large amount of “unbiased” thought leadership is, in fact, highly biased; it sometimes subtly, and often unabashedly, promotes the views of service providers. Chatter surrounding “topics du jour” should be disregarded in favor of smart use of resources and effective trading.

For the buy-side, getting back to basics means balancing human trading with effective algorithm use.  Working complicated orders through “specialty” algorithms is often sub-optimal and can result in less-than-desired execution quality.  Humans should therefore oversee complicated activity, and only the most straightforward orders should be worked through algorithms.  The exception is liquidity-seeking algorithms that span both displayed and non-displayed venues.

Experienced traders have capabilities that algorithms and smart order routers lack.  Humans can view, understand and interpret price action with intelligence and discretion that automated trading tools predicated upon voluminous data analysis simply do not – cannot – have.  This affords human traders a tremendous leg up in terms of intraday execution strategy and decision making.

For the sell-side, getting back to basics means refocusing on what constitutes optimal institutional trading and dedicating resources to higher-margin initiatives.  In many cases, this means shifting the focus of current business initiatives, especially the resources dedicated to specialty algorithms.  On the surface, these algorithms may appear to provide service differentiation. In the often cold reality of today’s markets, they are not effective for most traders and represent a misuse of valuable resources.


Matt Samelson
Principal, Director of Equities
Phone:  203-274-8970 ext 201
Email: msamelson@woodbineassociates.com  
