High Frequency Trading – Every Second Counts – Trading Off Speed Against Security

Posted on : 31-03-2014 | By : richard.gale | In : Cyber Security


History – same principles, different technology

High frequency trading (HFT) is the latest name for an old trading practice: arbitrage – identifying and exploiting the difference between prices in two markets to make money. This relies on two basic factors – a difference in the price of the same commodity in two places, and a communication mechanism to transmit information between the markets with that price differential.

Where there is more than one opportunity to trade the same thing and there is a slight difference in price, there is money to be made. In the ‘old’ days this could be between two markets selling, say, grain in different market towns. For various reasons (a difference in the number of buyers and sellers that day, or even the weather) a bushel of corn could be $10 in one town and $12 in the next. Before fast communication methods this was just the way it was. With the advent of the telegraph, clever traders could buy in one market and sell in the other, making a clear profit on the difference.
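To make the arithmetic concrete, here is a minimal sketch of the calculation in Python; the prices are the grain example above, and the cost-per-unit parameter is an illustrative assumption rather than anything from the article.

```python
# Minimal sketch of the arbitrage arithmetic: buy in the cheaper market,
# sell in the dearer one. The cost_per_unit parameter (transport, fees) is
# an illustrative assumption.

def arbitrage_profit(buy_price: float, sell_price: float,
                     quantity: float, cost_per_unit: float = 0.0) -> float:
    """Gross profit from exploiting a price gap between two markets."""
    return (sell_price - buy_price - cost_per_unit) * quantity

# The grain example: $10 a bushel in one town, $12 in the next.
print(arbitrage_profit(buy_price=10.0, sell_price=12.0, quantity=100))  # 200.0 before costs
```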

Arbitrage developed into a significant trading tool and, as communications improved, it became faster and more sophisticated. The time differential in price shortened and the prices themselves grew closer together. When the telegraph (ticker tape) was introduced, the time gap could be minutes or hours. With high speed networks, seconds and then milliseconds became the difference between profit and loss.

Trends – Speed is all

HFT is now a major business. It is estimated that more than 30% of trades in some markets are executed by HFT systems. The profit on any single pair of trades may be very small, but the sheer number of transactions can add up to a significant amount of money.

Transaction speeds are way beyond the capabilities of human traders: black box systems pump thousands of trades through every second and are constantly changing and improving their trading models. Firms spend serious amounts of money on the fastest possible hardware and networks. Stock and commodity exchanges have now created profitable new markets in selling space in their data centres to firms wanting to reduce latency to a minimum.

Some companies are going further: to shave 3 milliseconds off the data transmission times between New York and Chicago, trading firms invested over $800m to build a direct fibre optic link. It’s an amazing story and well worth reading more about. There was even some talk of boring through the earth to remove the extra distance imposed by its curvature – either way, unfortunately for them, another firm then built a series of line-of-sight microwave links which effectively took them out of the market. We actually wrote an article a few years back about a project we heard of which is taking this to the next level of speed….

With transaction and reaction speeds measured in milliseconds, the opportunities for errors multiply – there have been a number of notable spikes in volatility. In the infamous Flash Crash of 2010, in an already jittery market, the Dow Jones fell 600 points in 5 minutes only to recover half of that in the next 20 minutes. The trigger was perhaps a ‘fat fingered’ trade (where someone added too many zeros), but the movement was magnified by algorithms taking advantage of, or defending against losses from, the rapid change in price.
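As an illustration of the kind of pre-trade control that can catch an extra-zeros error before it reaches the market, here is a minimal sketch; the threshold multiple and the order sizes are assumptions for the example, not a description of any exchange’s actual checks.

```python
# Minimal sketch of a pre-trade "fat finger" check.
# The 10x threshold and the order sizes are illustrative assumptions only.

def looks_fat_fingered(order_qty: int, typical_qty: float, max_multiple: float = 10.0) -> bool:
    """Flag an order that is wildly larger than the typical size for this instrument."""
    return order_qty > typical_qty * max_multiple

typical = 5_000        # hypothetical typical order size
order = 5_000_000      # three extra zeros typed by mistake
if looks_fat_fingered(order, typical):
    print("Order held for manual confirmation")
```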

The market reacts immediately to news feeds. This is nothing new, but as new sources of information arrive, their accuracy often isn’t checked until after the event.

It used to be that news was researched, verified and then printed, becoming available in newspapers the next day. Radio and then television made this process much faster, but information was still generally verified. Rolling news channels, with their constant demand for events, may have increased the speed to near-live but potentially reduced accuracy.

There are ‘instant’ market data services such as Bloomberg and Reuters, where trained journalists report and publish, but public messaging platforms such as Facebook and Twitter now allow anyone to publish virtually anything instantly, without any verification of its accuracy.

Most trading firms treat the noise of Twitter as an additional influencer but not the sole source of truth. There is suspicion, though, that some traders are feeding their HFT systems with raw, unverified feed data. This is what caused another mini crash when the Syrian Electronic Army hacked the Associated Press Twitter account and falsely reported an explosion at the White House.
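One simple mitigation is to require the same story from more than one independent feed before it can influence a model. A minimal sketch of that idea follows; the source names and the two-source threshold are hypothetical.

```python
# Minimal sketch: only let a headline influence trading once it has been
# corroborated by more than one independent source. Source names are hypothetical.

from collections import defaultdict

CONFIRMATIONS_REQUIRED = 2
sources_seen = defaultdict(set)

def corroborated(headline: str, source: str) -> bool:
    """Return True only once the same headline has arrived from enough distinct sources."""
    sources_seen[headline].add(source)
    return len(sources_seen[headline]) >= CONFIRMATIONS_REQUIRED

print(corroborated("Explosion at the White House", "twitter"))     # False - a single, unverified source
print(corroborated("Explosion at the White House", "newswire_a"))  # True - now corroborated
```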

Security – What could happen and is it actually possible?

So what are the security implications for HFT? There appears to be a direct conflict between security and speed. Any attempt to verify, encrypt or otherwise secure HFT trades will inevitably slow down the messages, disadvantaging the firm that does it.

To counter this, a level of security is built into the network itself. Trading networks are generally partitioned from the rest of the firm and often have their own physical wires and high speed routers to minimise ‘hops’ and the impact of other traffic. Beyond this, are there other ways to ensure security and protect against the possible attacks or breaches that could occur?

Slowing down rivals

The competitive advantage HFT firms have is getting their trades in faster than the opposition. An obvious attack would be to slow down a rival’s speed somehow. If network speed could be reduced by, say, putting additional traffic on the network, or even by carrying out a Denial of Service (DoS) attack to disrupt or disable the competition, it could present a significant (if illegal) trading window. In the same way, if the network could be reprogrammed to route traffic elsewhere (perhaps by falsely signalling a failure on the main route), the rival’s trades would arrive (too) late.

Modification

What if these attacks could be taken a step further? If a rival managed to change a competitor’s code, they could introduce additional steps to slow processing down. At the extreme, they could modify the amounts or the security being traded, or even turn a buy into a sell. Fanciful maybe, but not impossible.

What is perhaps more likely is that firms will cause these issues themselves through human error. This has happened before, most significantly at Knight Capital, when the wrong software was activated in the live environment, causing a $460m trading error and the eventual demise of the firm.

 

Can this dichotomy of the need for speed versus security ever be resolved? Anything which improves one will negatively impact the other.

Bring back the Ticker

The essential ingredient of HFT is a constantly moving price in more than one location, with those locations out of synchronisation. Driven by demand from traders, exchanges have improved their update speeds to the point where prices are constantly changing, instantly reacting to changes in supply and demand. Some thought has been given to publishing the price at regular intervals, in the way ‘ticker tape’ used to flow from telegraphs. If this interval was set at, say, once a second, it would eliminate most of HFT’s reasons to exist. A nice idea, but difficult to implement now the door has been opened and the massive investment in technology has been made.
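To show what a once-a-second ticker means in practice, here is a minimal sketch that samples a continuously moving price only on a fixed tick; the random price feed is a stand-in purely for illustration.

```python
# Minimal sketch of throttled price publication: instead of publishing every
# change, sample and publish the latest price once per fixed interval.
# The random price feed below is a hypothetical stand-in for a live market.

import random
import time

def publish_ticker(price_feed, interval_seconds: float = 1.0, ticks: int = 3) -> None:
    """Publish only the most recent price at each fixed tick."""
    for _ in range(ticks):
        time.sleep(interval_seconds)
        print(f"tick: {price_feed():.2f}")

publish_ticker(lambda: 100 + random.uniform(-0.5, 0.5))
```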

Gentlemen’s agreements

Assuming that it is not possible to introduce additional steps into the in-line HFT trading process, is there another way to sort out the inevitable errors after the event? There are always trading errors, and most organisations do enough business with each other to have informal agreements in place to resolve issues. Some exchanges also provide this service. However, this did not help Knight Capital and may not help others in the future. If multiple trading parties have issues at the same time, the problem may have a systemic impact on financial services, and then on the ‘real world’ soon after.

Legislation

There have been many calls for the cessation of HFT, as it is considered destabilising, dangerous and out of control. Traders will always look for the next way to improve profits, so if a way is found of closing down HFT you can bet a similar successor system will arise. Ensuring enough reserves, liquidity and controls to manage the bumps and issues along the way is probably the best that can be hoped for.

 

We’ve been looking at HFT for a while now and it’s a fascinating area – pulling in the best brains from trading and technology. Michael Lewis – famous for his exposés of the financial world in Liar’s Poker – published a book on the subject of HFT a couple of years back; it’s a great read!

 

BROADScale – Cloud Assessment

Posted on : 30-04-2013 | By : jo.rose | In : Cloud


We are well into a step-change in the way that underlying technology services are delivered.  Cloud Computing in its various guises is gaining industry acceptance.  Terms such as Software as a Service (SaaS), Platform as a Service (PaaS), Private Cloud, Hybrid Cloud, Infrastructure as a Service (IaaS) and so on have made their way into the vocabulary of the CIO organisation.

Cloud Computing isn’t new. Indeed, many organisations have been sourcing applications or infrastructure in a utility model for years, although it is only recently that vendors have rebranded these offerings (“Cloud Washing”).

With all the hype it is vital that organisations consider carefully their approach to Cloud as part of their overall business strategy and enterprise architecture.

Most importantly, it is not a technology issue and should be considered first and foremost from the standpoint of Business, Applications and Operating Model.

Organisations are facing a number of common challenges:

  • Technology budgets are under increasing pressure, with CIOs looking to extract more value from existing assets with less resource
  • Data Centre investment continues to grow with IT departments constantly battling the issue of power consumption and physical space constraints
  • Time to market and business innovation sit uncomfortably alongside the speed with which IT departments can transform and refresh technology
  • Increases in service level management standards and customer intimacy continue to be at the forefront

Cloud Computing can assist in addressing some of these issues, but only as part of a well thought out strategy as it also brings with it a number of additional complexities and challenges of its own.

Considering the bigger picture, a “Strategic Cloud Framework”

Before entering into a Cloud deployment, organisations should look at all of the dimensions which drive their technology requirements, not the technology itself.  These will shape the Cloud Framework and include:

  • Governance – business alignment, policies and procedures, approval processes and workflow
  • Organisation – changes to operating models, organisation, interdependencies, end-to-end processes, roles and responsibilities
  • Enterprise Architecture – application profiling to determine which applications are suitable, such as those with irregular / spiky utilisation, loose coupling, low latency dependency, commodity workloads, and development and test environments (see the sketch after this list)
  • Sourcing – internal versus external, Cloud providers positioning, service management, selection approach and leverage
  • Investment Model – business case, impact to technology refresh cycle, cost allocation, recharge model and finance
  • Data Security – user access, data integrity and availability, identity management, confidentiality, IP, reputational risk, legislation, compliance, storage and retrieval processes
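As a rough illustration of the application-profiling dimension above, a scoring sketch along these lines could be used to shortlist candidates; the attributes and weights are assumptions for the example, not part of the BROADScale methodology.

```python
# Rough sketch of application profiling for cloud suitability.
# Attributes and weights are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class AppProfile:
    name: str
    spiky_utilisation: bool    # irregular demand benefits from elastic capacity
    loosely_coupled: bool      # few hard dependencies on on-premise systems
    latency_sensitive: bool    # hard low-latency needs count against public cloud
    commodity_workload: bool   # e.g. development and test environments

def cloud_suitability(app: AppProfile) -> int:
    """Higher scores suggest a better public-cloud candidate."""
    score = 0
    score += 2 if app.spiky_utilisation else 0
    score += 2 if app.loosely_coupled else 0
    score += 1 if app.commodity_workload else 0
    score -= 2 if app.latency_sensitive else 0
    return score

print(cloud_suitability(AppProfile("dev/test grid", True, True, False, True)))       # 5: strong candidate
print(cloud_suitability(AppProfile("trading platform", False, False, True, False)))  # -2: keep in-house
```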

The BROADScale service

At Broadgate Consultants we have developed an approach to address the business aspects of the Cloud strategy.  Our consultants have experience in the underpinning technology but also understand that it is led from the Business domain and can help organisations determine the “best execution venue” for their business applications.

Our recommended initial engagement depends on the size, scale and scope of the Cloud assessment required:

  1. Initial – High Level analysis of capability, maturity and focus areas
  2. Targeted – Specific review around a business function or platform
  3. Deep – Complete analysis and application profiling

At the end of the assessment period we will provide a report and discuss the findings with you.  It will cover the areas outlined in the “Strategic Cloud Framework” and provide you with a roadmap and plan of approach.

During the engagement, our consultant will organise workshops with key stakeholders and align with the IT Strategy and Architecture.

For more details and to schedule an appointment contact us on 0203 326 8000 or email BROADScale@broadgateconsultants.com

Broadgate Predicts 2013 – Survey Results

Posted on : 27-03-2013 | By : jo.rose | In : Data, Finance, General News, Innovation, IoT


In January we surveyed our clients, colleagues and partners against our predictions for 2013. We are pleased to say that we now have the results, the highlights of which are included below.

Key Messages

Infrastructure as a Service, Cloud and a shift to Data Centre & Hosted Services scored the highest, outlining the move from on-premise to a more utility based compute model.

Strategies to rationalise apps, infrastructure and organisations remain high on the priority list. However, removing the technology burden built up over many years is proving difficult.

Many commented on the current financial constraints within organisations and their impact on the predictions in terms of technology advancement.

Response Breakdown

[Chart: breakdown of survey responses]

Of the total responses received, the vast majority concurred with the predictions for 2013. A total of 78% either “Agreed” or “Strongly Agreed” (broadly in line with the 2012 survey).

Ranking

[Chart: predictions ranked by level of agreement]

The diagram above shows the results in order from highest scoring to lowest. The continued growth in Infrastructure as a Service had the top overall ranking with 91% and the least was Crowd-funding with 53% agreement.

Respondents

[Chart: breakdown of survey respondents]

We sent our predictions out to over 700 of our clients and associates. Unlike our previous years’ survey, we wanted to get feedback from all levels and functions, so alongside CIOs, COOs and technology leaders we also surveyed SMEs on both the buy and sell side of service delivery organisations.

We would like to thank all respondents for their input and particularly for the many that provided additional insight and commentary.

If you would like a copy of the full report, please email jo.rose@broadgateconsultants.com.

Broadgate Predicts 2013 – Preview

Posted on : 29-01-2013 | By : john.vincent | In : Innovation


Last month we published our 2013 Technology Predictions and asked our readers to give us their view through a short survey. We have had a great response…so much so that we are keeping it open for 2 more weeks.

However, we thought we would share a few of the findings so far, prior to us producing the final report.

Current Ranking

As we stand, the predictions that generated the most agreement are:

  1. Infrastructure Services Continue to Commoditise
  2. Samsung/Android gain more ground over Apple
  3. Data Centre/Hosting providers continue to grow

Some interesting commentary against these:

Many companies have come to terms with the security/regulatory issues concerning commoditisation and cloud services, although many still choose to build in-house for now. It will take some significant time to see IaaS address the legacy infrastructure burden.

On the Apple debate, respondents agreed enough to place it 2nd but differed a lot in terms of how this will develop…there is a feeling that Apple is struggling to continue to innovate ahead of the market, that consumers are wiser now, and that there is a cost pressure which, if relieved, would cause users to stay with them.

Regarding Data Centres, the importance of cloud and managed services continues to drive expansion. Within heavily regulated industries such as Financial Services there continues to be a preference to build rather than buy, but respondents questioned for how long. Having your own DC is not a competitive advantage.

At the other end of the scale, the predictions that respondents disagreed with most were:

  • Instant Returns on Investment required (followed closely by)
  • More Rationalisation of IT Organisations

Again, a selection of the additional comments:

Whilst there still exists demand for long term and large corporate technology initiatives, the stance is starting to change somewhat towards more agile, focused investments. Unfortunately, legacy issues and organisational culture continue to block progress.

Whilst market conditions and technology evolution are facilitating a reduction in workforce, respondents cited other, equally strong forces working to counteract this, in areas such as risk and control, plus offshore operations delivering less value than expected.

Please continue to send us your thoughts before we close!

Interestingly, the largest number of “No Comment” responses (40%) came against the prediction that “Crowd-funding services continue to gain market share”…maybe an article for February.

Is “Cloud Banking” set to explode?

Posted on : 24-11-2011 | By : john.vincent | In : Cloud, Finance


There are conflicting views on the maturity, positioning and suitability of cloud computing at an enterprise level. Couple that with an increasingly dynamic and evolving marketplace and it is easy to see why it is difficult for organisations to define a roadmap appropriate to their business. What isn’t in doubt is that cloud computing, in whichever form, is changing the landscape of business technology.

However, what is the situation within banking and financial services? I was recently at an event to discuss datacentre innovation with a number of infrastructure managers, architects and consultants in the financial industry. We discussed the fact that demand for IT services outstrips supply, and the difficulties this causes internal technology organisations in dealing with capacity planning, infrastructure utilisation and optimisation, and space and energy requirements. For many, simply building a new datacentre facility to deal with the ebbs and flows of demand is not an option.

We explored techniques and experiences around improving virtualisation and utilisation, and also energy efficiency, with cap-ex and op-ex savings of between 25% and 35%. Naturally the conversation moved on to how cloud computing might help in terms of moving power from internally hosted systems to a “best execution” venue.

What was clear was that the starting point for the majority of financial services infrastructure managers was fairly negative in terms of building the higher value cloud models, i.e. anything above Infrastructure as a Service (IaaS), into their technology strategy. Some of the reasons tabled included: “We’re not there yet in terms of maturity”, “Why would we tie ourselves into an external provider?”, “Our data privacy requirements mean that cloud is out for now”, “The regulatory authorities just won’t let us”, and so on.

These are, of course, all valid concerns. However, another comment that resonated with me was the anecdote of a business user who, for whatever reason, had decided to turn to Amazon Web Services for provision of compute power and then simply included it as a line item on their expenses submission. Not good, but it demonstrates perfectly the tipping point we are at. The question is, how will infrastructure leaders within Financial Services react to a fast-moving market, some of which is driven by user perception and some by the ability to change a service provider?

It is a difficult conundrum. The impact of cloud on the organisation and culture is something we are exploring, but for now let’s look at a few of our predictions for the next two years and why we think that the adoption of cloud within banking and financial services will accelerate.

1) The security issue stops being a blocker:

This is a key area, and one in which we believe enterprise security teams will work closely with IT and business users to determine an approach. FS organisations have for years used external providers to manage applications and related data, including Software as a Service, and the same rigour should be applied to allow the appropriate application portfolios to run with external cloud providers (in addition to private).

2) Platform as a Service (PaaS) will see significant growth:

We will see more than just PaaS providers adding multiple distinct environments. In our conversations with technology service providers, we believe that many ISVs will transition their applications to PaaS to provide a richer set of business services, particularly for retail banking and corporate systems.

3) The commoditisation of Infrastructure as a Service gathers pace:

The technology discussion will move on from “How do we build and operate the infrastructure?” and start to consider what can be achieved with cloud at a business services level. Some FS technology organisations we speak to are already starting this debate, as the “nuts and bolts” of how to use IaaS are moving into a more commoditised space.

4) Private cloud will continue to expand and provide a “Spring Board” for externalisation:

Having dipped their toe, or perhaps watched others, banks will become more in tune with using private cloud for their IT environments. As budgets continue to be constrained and FS organisations tackle unused or underutilised environments, they will be forced to rethink their IT strategies and shift to adopting scalable cloud infrastructures. In turn, these infrastructures and applications will be considered for transfer to external providers.

5) “Captive” datacentre growth slows and shifts to cloud providers:

This takes us back to the opening discussion on datacentre efficiency. We believe that FS organisations who currently provision their IT environments within ever-expanding datacentres will shift to a “best execution” venue to take advantage of the scalability, on-demand and defined costs of cloud computing. Many companies have already transitioned large portions of their infrastructure to private clouds by introducing virtualised solutions. In parallel with this reduction in internal datacentre footprint, they will need to take advantage of the benefits and economies of scale of public clouds.