The ultimate way to move beyond trading latency?

Posted on : 29-03-2019 | By : richard.gale | In : Finance, Uncategorized


A number of power surges and outages have been experienced in the East Grinstead area of the UK in recent months. The utility companies involved have traced the cause to one of three high-capacity feeds into a global investment bank's data centre facility.

The profits created by the same bank's London-based proprietary trading group have increased tenfold over the same period.

This bank employs 1% of the world's best post-doctoral theoretical physics graduates to help build its black-box trading systems.

Could there be a connection? Wild and unconfirmed rumours have been circulating within the firm that a major breakthrough has been made in removing the problem of latency – the physical limit on the time it takes a signal to travel down a wire, ultimately governed by the speed of light.

For years traders have been trying to reduce execution latency to gain an edge in a highly competitive, fast-moving environment. The focus has moved from seconds to milliseconds and now to microsecond savings.
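
To put those units in context, here is a rough back-of-the-envelope sketch of pure propagation delay; the route lengths and the fibre slow-down factor below are illustrative assumptions rather than measured figures.

```python
# Back-of-the-envelope propagation delay. All figures are illustrative.
SPEED_OF_LIGHT = 299_792_458          # metres per second, in a vacuum
FIBRE_FACTOR = 0.68                   # light travels at roughly 2/3 c in optical fibre

def one_way_delay_us(distance_km: float) -> float:
    """One-way propagation delay over fibre, in microseconds."""
    seconds = (distance_km * 1_000) / (SPEED_OF_LIGHT * FIBRE_FACTOR)
    return seconds * 1_000_000

print(f"London-New York (~5,600 km): {one_way_delay_us(5_600):,.0f} us one way")
print(f"Data centre to exchange (50 km): {one_way_delay_us(50):,.0f} us one way")
```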

Many financial services and technology organisations have attacked the problem by reducing data hops and routing, going as far as placing their hardware physically close to the source of the data (such as in an exchange's data centre) to minimise latency, but no one has solved the issue – yet.

It sounds like this bank may have gone one step further. It is known that at the boundary of the speed of light, physics as we know it changes (quantum mechanics is an example where the time/space continuum becomes 'fuzzy'). Conventional physics states that travelling faster than the speed of light – and so seeing into the future – would require infinite energy and is therefore not possible.

Investigations with a number of insiders at the firm have resulted in an amazing and almost unbelievable insight. They have managed to build a device which 'hovers' over the present and immediate future – little detail is known about it, but it is understood to be based on the previously unproven 'Alcubierre drive' principle. This allows the trading system to predict (in reality, observe) the next direction of the market, providing an invaluable trading advantage.

The product is still in test mode, as trading ahead of data it has already traded against produces outages in the system: it tries to correct the error in the future data, which again changes the data, ad infinitum… The prediction model only allows a small glimpse into the immediate future, which also limits the window of opportunity for trading.

The power requirements of the equipment are so large that it has had to be moved to the data centre environment, where consumption can be more easily hidden (or not, as the power outages showed).

If the bank really has cracked this problem then it will have the ultimate trading advantage – the ability to see into the future and trade with 'inside' knowledge legally. Unless another bank is doing something similar in the 'trading arms race', the bank will quickly become dominant and the others may go out of business.

The US Congress has apparently discovered some details of this mechanism and is requesting that the bank disclose details of the project. The bank is understandably reluctant to do this as it has spent over $80m developing the system and wants to make some return on its investment.

If this system goes into true production mode, surely it cannot be long before financial regulators outlaw the tool, as it will both distort and ultimately destroy the markets.

Of course the project has a codename…. Project Tachyons

No one from the company was available to comment on the accuracy of the claims.

Battle of the Algorithms: Quantum v Security

Posted on : 28-03-2018 | By : kerry.housley | In : Cyber Security, data security, FinTech, Innovation, Predictions


Like black holes, quantum computing was for many years nothing more than a theoretical possibility. It was something that physicists believed could exist, but it hadn’t yet been observed or invented.

Today, quantum computing is a proven technology with the potential to accelerate advances in all aspects of our lives; the scope is limitless. However, this very same computing power that can enhance our lives can also do a great deal of damage, as it touches many of the everyday tasks that we take for granted. Whether you're sending money via PayPal or ordering goods online, you're relying on security systems based on cryptography. Cryptography is a way of keeping these transactions safe from cyber criminals hoping to catch some of the online action (i.e. your money!). Modern cryptography relies on mathematical calculations so complex—using such large numbers—that attackers can't crack them. Quantum could change this!

Cybersecurity systems rely on uncrackable encryption to protect information, but such encryption could be seriously at risk as quantum develops. The threat is serious enough that it’s caught the interest of the US agency National Institute of Standards and Technology (NIST). Whilst acknowledging that quantum computers could be 15 to 20 years away, NIST believes that we “must begin now to prepare our information security systems to be able to resist quantum computing.”

Many believe that quantum computers could rock the current security protocols that protect global financial markets and the inner workings of government. Quantum computers are so big and expensive that—outside of global technology companies and well-funded research universities—most will be owned and maintained by nation-states. Imagine the scenario where a nation-state intercepts the encrypted financial data that flows across the world and is able to read it as easily as you are reading this article. Rogue states may be able to leverage the power of quantum to attack the banking and financial systems at the heart of western business centres.

The evolution of the quantum era could have significant consequences for cyber security, where we will see a new phase in the race between defenders and attackers of our information. Cryptography will be the battlefield on which this war of the future is fought, and the contenders are already preparing for a confrontation that could take place in the coming years. The evolution of quantum computing will crack some cryptographic codes – but how serious is the threat?

In theory, a quantum computer would be able to break most of the current algorithms, especially those based on public keys. A quantum computer can factor large numbers at a much higher speed than a conventional one. A brute-force attack (testing all possible passwords at high speed until you get the right one) would be a piece of cake with a machine that boasts these characteristics.
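
A little illustrative arithmetic shows why raw brute force is hopeless on classical hardware, and why the quantum concern is really about algorithms (such as Shor's factoring) rather than faster guessing; the guess rate below is an assumption, not a benchmark.

```python
# Illustrative arithmetic only: exhaustive key search on classical hardware.
GUESSES_PER_SECOND = 10**12           # assumed rate for a very powerful attacker
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def expected_years(key_bits: int) -> float:
    """Expected time to find a key by trying half of the key space."""
    return (2 ** key_bits / 2) / GUESSES_PER_SECOND / SECONDS_PER_YEAR

for bits in (56, 128, 256):
    print(f"{bits}-bit key: ~{expected_years(bits):.2e} years")

# 56-bit keys fall in hours; 128-bit and larger keys take longer than the age
# of the universe, which is why the quantum threat comes from algorithms such
# as Shor's (factoring) and Grover's (faster search), not raw guessing speed.
```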

On the other hand, with this paradigm shift in computing will also come great hope for privacy. Quantum cryptography will make things very difficult for cybercriminals. While current encryption systems are secure because intruders who attempt to access information can only do so by solving complex problems, with quantum cryptography they would have to violate the laws of quantum mechanics, which, as of today, is impossible.

Despite these developments we don't believe there is any cause for panic. As it currently stands, quantum computers are not going to break all encryption. Although they are exponentially more powerful than standard computers for certain problems, they are awkward to use: algorithms must be written precisely or the answers they return cannot be read, and the machines themselves are not easy to build and implement.

It is unlikely that hacktivists and cybercriminals could afford quantum computers in the foreseeable future. What we need to remember is that most attacks in today's threat landscape target the user, where social engineering plays as large a part as technical expertise, if not larger. If a human can be persuaded to part with a secret in inappropriate circumstances, all the cryptography in the world will not help, quantum or not!

It is important that organisations understand the implications that quantum computing will have on their legacy systems, and take steps to be ready. At a minimum, that means retrofitting their networks, computers, and applications with encryption that can withstand a quantum attack.

Quantum computing presents both an unprecedented opportunity and a serious threat. We find ourselves in a pre-quantum era, we know it’s coming but we don’t know when…

Are you ready for Y2Q (Years to Quantum)?

5 Minutes With Mark Prior

Posted on : 18-12-2015 | By : Maria Motyka | In : 5 Minutes With


Which recent tech innovations are you the most excited about?

I get most excited about how my business can benefit from technology (whether it’s new or not). It’s my team’s job to understand our business; its processes, strategy and competitor landscape and bring technology to bear to address those challenges.
Smith and Williamson is a very client-centric business – there is a great opportunity to leverage even well-established technology like IPT, workflow and document management to improve the service we provide to clients. Additionally, cloud-based collaboration tools offer new ways to engage with our clients one-to-one and perhaps open up new markets for services.

Like all industries, if we can both improve the service to the client through technology and at the same time lower the cost of servicing them, we will be successful.

From a pure technology perspective I'm looking forward to improvements in the price and functionality of end-user devices – particularly low-cost 2-in-1 Windows devices displacing the desktop or traditional clamshell laptop as the default end-user device. I hope the combination of these devices, Windows 10, Office 365, Wi-Fi and IPT will provide a better mobile platform that's easier to manage and support and offers a seamless user experience regardless of location and connection type.

Looking ahead I’m also interested in how graphene will impact IT – whether it’s in battery technology or the size and speed of microprocessors, it appears to have the potential to be revolutionary (and it was invented in the UK!!).

 

How do you see business applications in wealth management adopting As-a-Service operating models?

Firms buy solutions that best meet their needs – how those solutions are delivered is often secondary. However, vendors that deliver their solution (only) as a service are, I feel, better placed to rapidly adapt and evolve their offering as it's a single code set, single port etc. This should keep their costs down, and by passing those savings to customers they will drive adoption and create a virtuous circle. It should also mean they can focus development resource on new features rather than maintaining multiple code sets and branches.

 

In your opinion, what are the biggest data security risks that financial organisations are currently facing and how can they be overcome?

I think everyone understands the need for perimeter security, good patch management, access controls etc. But I think an area that is sometimes overlooked is "end users" either inadvertently or deliberately exposing data. We need to ensure we classify our data based on risk, educate our employees and have appropriate audit trails and controls based on data classification (all easier said than done). Services like MS Office 365 and OneDrive mean this has to be driven as much by policy and education as by IT.

 

Why did you choose Broadgate to assist you? What value has working with Broadgate brought to your team?

I’ve known the team for many years and trust them to do a good job for their clients.

Broadgate’s engagement style is collaborative and consultative, unlike other firms where every conversation is viewed as a selling opportunity.

 

Which technology trends do you predict will be a key theme for 2016?

Every year we think it will be cloud – maybe this year it will happen (though personally I'm not sure it will). Financial services firms are still hesitant to put client data into the public cloud, and many firms say the cost of cloud is more than the marginal cost of adding capacity to their own facilities.
Hosting strategies are difficult to formulate as the options are many and varied with no clear leaders. I think Google will drive into MS's market share (a few years ago I can't recall anyone seriously considering alternatives to MS Office), which should ensure healthy competition and better options for customers.

Highlights of 2014 and some Predictions for 2015 in Financial Technology

Posted on : 22-12-2014 | By : richard.gale | In : Innovation


A number of emerging technology trends have impacted financial services in 2014. Some of these will continue to grow and enjoy wider adoption through 2015 whilst additional new concepts and products will also appear.

Financial Services embrace the Start-up community

What has been apparent, in London at least, is the increasing connection between tech and FS. We have been pursuing this for a number of years by introducing great start-up products and people to our clients, and the growing influence of TechMeetups, Level39 etc. within the financial sector follows this trend. We have also seen some interesting innovation with seemingly legacy technology – our old friend Lubo from L3C offers mainframe 'on demand' and cut-price, secure Oracle databases and IBM S3 in the cloud! Innovation and digital departments are now the norm in most firms, staffed with clever, creative people encouraging often slow-moving, cumbersome organisations to think and (sometimes) act differently. Will FS fall out of love with tech in 2015? We don't think so. There will be a few bumps along the way, but the potential, upside and energy of start-ups will move deeper into large organisations.

Cloud Adoption

FS firms are finally facing up to the cloud. Over the last five years we have bored too many people within financial services talking about the advantages of the cloud. Our question 'why have you just built a £200m datacentre when you are a bank, not an IT company?' was met with many answers, but two themes were 'security' and 'we are an IT company'… Finally, driven by user empowerment (see our previous article on 'user frustration vs. empowerment'), banks and other financial organisations are 'embracing' the cloud, mainly with SaaS products and IaaS using private and public clouds. The march to the cloud will accelerate over the coming years. Looking back from 2020 we will see massively different IT organisations within banks: the vast majority of infrastructure will be elsewhere, development will be done by the business users, and the 'IT department' will be a combination of rocket-scientist data gurus and procurement experts managing and tuning contracts with vendors and partners.

Mobile Payments

Mobile payments have been one of the most discussed subjects of the past year. Not only do mobile payments enable customers to pay without getting their wallets out, but paying with a phone or wearable will be the norm in the future. With new entrants coming online every day, offering mobile payment solutions that are faster and cheaper than competitors' is on every bank's agenda. Labelled 'disruptors' due to the disruptive impact they are having on businesses within the financial services industry (in particular banks), many of these new entrants are either large non-financial brands with a big customer base or start-up companies with fresh new solutions to existing issues.

One of the biggest non-financial companies to enter the payments sector in 2014 was Apple. Some experts believe that Apple Pay has the power to disrupt the entire sector. Although Apple Pay has 500 banks signed up and there is competition among card issuers to get their card set as the default option on Apple devices, some banks are still worried that Apple Pay and other similar services will make their branches less important. If Apple chose to go into retail banking seriously by offering current accounts then the banks would have plenty more to worry about.

Collaboration

The fusion of development, operations and business teams to provide agile, focussed solutions has been one of the growth areas of 2014. The 'DevOps' approach has transformed many otherwise slow, ponderous IT departments, getting them talking to the business and operational consumers of their systems and providing better, faster, closer-fitting applications and processes. This trend is only going to grow, and 2015 may be the year it really takes off. The repercussions for 2016 are that too many projects will become 'DevOpped' and start failing through focussing on short-term solutions rather than long-term strategy.

Security

Obviously the Sony Pictures hack is on everyone's mind at the moment, but cyber attack from countries with virtually unlimited will, if not resources, is a threat that most firms cannot protect against. Most organisations have had a breach of some type this year (and the others probably don't know it's happened). Security has risen up to the boardroom and threat mitigation is now published in most firms' annual reports. We see three themes emerging to combat this.

– More of the same, more budget and resource is focussed on organisational protection (both technology and people/process)
– Companies start to mitigate with the purchase of Cyber Insurance
– Governments start to move from defence/inform to attacking the main criminal or politically motivated culprits

We hope you’ve enjoyed our posts over the last few years and we’re looking forward to more in 2015.

Twitter.com/broadgateview

 

 

Broadgate Big Data Dictionary

Posted on : 28-10-2014 | By : richard.gale | In : Data


A couple of years back we were getting to grips with big data and thought it would be worthwhile putting a couple of articles together to help explain what the fuss was all about. Big Data is still here and its adoption is growing, so we thought it would be worthwhile updating and re-publishing. Let us know what you think.

We have been interested in Big Data concepts and technology for a while. There is a great deal of interest and discussion with our clients and associates on the subject of obtaining additional knowledge & value from data.

As with most emerging ideas there are different interpretations and meanings for some of the terms and technologies (including the thinking that ‘big data’ isn’t new at all but just a new name for existing methods and techniques).

With this in mind we thought it would be useful to put together a few terms and definitions that people have asked us about recently to help frame Big Data.

We would really like to get feedback, useful articles & different views on these to help build a more definitive library of Big Data resources.

Analytics 

Big Data Analytics is the processing and searching through large volumes of unstructured and structured data to find hidden patterns and value. The results can be used to further scientific or commercial research, identify customer spending habits or find exceptions in financial, telemetric or risk data to indicate hidden issues or fraudulent activity.

Big Data Analytics is often carried out with software tools designed to sift and analyse large amounts of diverse information being produced at enormous velocity. Statistical tools used for predictive analysis and data mining are utilised to search and build algorithms.
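
As a flavour of the simplest kind of exception-finding described above, here is a minimal sketch that flags transactions far outside an account's normal range; the data and threshold are invented for illustration, and real fraud models are far more sophisticated.

```python
# Minimal sketch: flag values far from an account's normal spending pattern.
# Data and threshold are invented for illustration.
from statistics import mean, stdev

def flag_outliers(amounts, z_threshold=2.0):
    """Return amounts more than z_threshold standard deviations from the mean."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if sigma and abs(a - mu) / sigma > z_threshold]

transactions = [42.0, 38.5, 51.2, 40.1, 39.9, 44.3, 2500.0, 41.7]
print(flag_outliers(transactions))    # -> [2500.0]
```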

Big Data

The term Big Data describes amounts of data that are too big for conventional data management systems to handle. The volume, velocity and variety of data overwhelm databases and storage. The result is that either data is discarded or unable to be analysed and mined for value.

Gartner has coined the term ‘Extreme Information Processing’ to describe Big Data – we think that’s a pretty good term to describe the limits of capability of existing infrastructure.

There has always been “big data” in the sense that data volumes have always exceeded the ability for systems to process it. The tool sets to store & analyse and make sense of the data generally lag behind the quantity and diversity of information sources.

The actual amounts and types of data that Big Data relates to are constantly being redefined, as database and hardware manufacturers keep moving those limits forward.

Several technologies have emerged to manage the Big Data challenge. Hadoop has become a favourite tool to store and manage the data, traditional database manufacturers have extended their products to deal with the volumes, variety and velocity and new database firms such as ParAccel, Sand & Vectorwise have emerged offering ultra-fast columnar data management systems. Some firms, such as Hadapt, have a hybrid solution utilising tools from both the relational and unstructured world with an intelligent query optimiser and loader which places data in the optimum storage engine.

Business Intelligence

The term Business Intelligence (BI) has been around for a long time, and the growth of data, and then Big Data, has focused more attention on this space. The essence of BI is to obtain value from data to help build business benefits. Big Data itself could be seen as BI – it is a set of applications, techniques and technologies that are applied to an entity's data to help produce insight and value from it.

There are a multitude of products that help build Business Intelligence solutions – ranging from the humble Excel to sophisticated (aka expensive) solutions requiring complex and extensive infrastructure to support. In the last few years a number of user friendly tools such as Qlikview and Tableau have emerged allowing tech-savvy business people to exploit and re-cut their data without the need for input from the IT department.

Data Science

This is, perhaps, the most exciting area of Big Data. This is where the Big Value is extracted from the data. One of our data scientist friends described it as follows: "Big Data is plumbing and Data Science is the value driver…"

Data Science is a mixture of scientific research techniques, advanced programming and statistical skills (or hacking), philosophical thinking (perhaps previously known as 'thinking outside the box') and business insight. Basically it's being able to think of new or different questions to ask, be technically able to translate them into a machine-based format, process the result, interpret it and then ask new questions based on the results of the previous set…

A diagram by blogger Drew Conway describes some of the skills needed – which maybe explains the lack of skills in this space!

 

In addition, Pete Warden (creator of the Data Science Toolkit) and others have raised caution about the term Data Science – "anything that needs science in the name is not a real science" – but confirm the need for a definition of what Data Scientists do.

Database

Databases can generally be divided into structured and unstructured.

Structured databases are the traditional relational database management systems such as Oracle, DB2 and SQL Server, which are fantastic at organising large volumes of transactional and other data, with the ability to load and query the data at speed and with integrity in the transactional process to ensure data quality.

Unstructured databases are technologies that can deal with any form of data thrown at them and then distribute it out to a highly scalable platform. Hadoop is a good example, and a number of firms now produce, package and support the open-source product.

Feedback Loops

Feedback loops are systems where the output from the system is fed back into it to adjust or improve the system's processing. Feedback loops exist widely in nature and in engineering systems – think of an oven: heat is applied to warm it to a specific temperature, which is measured by a thermostat; once the correct temperature is reached the thermostat tells the heating element to shut down, until feedback from the thermostat says it is getting too cold and it turns on again… and so on.
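
A toy simulation makes the loop explicit; the temperatures and heating rates below are made up purely to show the feedback principle.

```python
# Toy oven/thermostat loop: the measured temperature is fed back to decide
# whether the heater should be on. Numbers are invented for illustration.
def simulate_oven(target=180.0, steps=25):
    temperature = 20.0
    for _ in range(steps):
        heater_on = temperature < target          # feedback decision
        temperature += 8.0 if heater_on else -3.0
        print(f"{temperature:6.1f} C  heater {'ON' if heater_on else 'off'}")

simulate_oven()
```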

Feedback loops are an essential part of extracting value from Big Data. Building in feedback and then incorporating Machine Learning methods starts to allow systems to become semi-autonomous; this lets the Data Scientists focus on new and more complex questions whilst testing and tweaking the feedback from their previous systems.

Hadoop

Hadoop is one of the key technologies supporting the storage and processing of Big Data. Hadoop grew out of Google's published work on its distributed Google File System and MapReduce processing tools. It is an open-source product under the Apache banner but, like Linux, is distributed by a number of commercial vendors that add support, consultancy and advice on top of the product.

Hadoop is a framework for running applications on large clusters of commodity hardware. The Hadoop framework transparently provides applications both reliability and data motion. Hadoop implements a computational paradigm named map/reduce, where the application is divided into many small fragments of work, each of which may be executed or re-executed on any node in the cluster. In addition, it provides a distributed file system that stores data on the compute nodes, providing very high aggregate bandwidth across the cluster. Both map/reduce and the distributed file system are designed so that node failures are automatically handled by the framework.

So Hadoop could almost be seen as a (big) bucket into which you can throw any form and quantity of data; it will organise it, know where that data resides, and be able to retrieve and process it. It also accepts that there may be holes in the bucket and uses additional resources to patch itself up – all in all a very clever bucket!

Hadoop runs on a scheduling basis, so when a question is asked it breaks up the query, shoots the pieces out to different parts of the distributed network in parallel, and then waits for and collates the answers.

Hive

Hive provides a high level, simple, SQL type language to enable processing of and access to data stored in Hadoop files. Hive can provide analytical and business intelligence capability on top of Hadoop. The Hive queries are translated into a set of MapReduce jobs to run against the data. The technology is used by many large technology firms in their products including Facebook and Last.FM. The latency/batch related limitations of MapReduce are present in Hive too but the language allows non-Java programmers to access and manipulate large data sets in Hadoop.

Machine Learning

Machine learning is one of the most exciting concepts in the world of data. The idea is not new at all, but the focus on utilising feedback loops of information and algorithms that take actions and change depending on the data, without manual intervention, could improve numerous business functions. The aim is to find new or previously unknown patterns and linkages between data items to obtain additional value and insight. An example of machine learning in action is Netflix, which is constantly trying to improve its movie recommendation system based on a user's previous viewing, their characteristics and also the characteristics of its other customers with a similar set of attributes.
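
As a very rough sketch of the idea behind such recommenders, the snippet below scores users by the similarity of their rating vectors; the users, films and ratings are invented for illustration and bear no relation to how Netflix actually works.

```python
# Toy similarity-based recommender: users with similar rating vectors are
# assumed to like similar films. All data is invented for illustration.
from math import sqrt

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

ratings = {                        # films rated 0-5, 0 = not seen
    "alice": [5, 4, 0, 1],
    "bob":   [4, 5, 1, 0],
    "carol": [1, 0, 5, 4],
}

def most_similar_user(user):
    candidates = ((cosine(ratings[user], v), name)
                  for name, v in ratings.items() if name != user)
    return max(candidates)[1]

print(most_similar_user("alice"))  # -> 'bob', so suggest films bob rated highly
```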

MapReduce

Mapreduce is a framework for processing large amounts of data across a large number of nodes or machines.

http://code.google.com/edu/parallel/img/mrfigure.png
Map Reduce diagram (courtesy of Google)

Mapreduce works by splitting out (or mapping) requests into multiple separate tasks to be performed on many nodes of the system and then collates and summarises the results back (or reduces) to the outputs.

MapReduce is based on the Java language and is the basis of a number of the higher-level tools (Hive, Pig) used to access and manipulate large data sets.

Google (amongst others) developed and use this technology to process large amounts of data (such as documents and web pages trawled by its web crawling robots). It allows the complexity of parallel processing, data location and distribution and also system failures to be hidden or abstracted from the requester running the query.
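
The pattern itself is easy to show without the Java framework; the word count below is a plain-Python sketch of the map, shuffle and reduce phases, not Hadoop code.

```python
# Plain-Python word count showing the map, shuffle and reduce phases.
# This is a sketch of the pattern, not Hadoop/MapReduce framework code.
from collections import defaultdict

documents = ["big data is big", "data science needs data"]

# Map: turn each input fragment into (key, value) pairs
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group all values belonging to the same key
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce: summarise each key's values
word_counts = {word: sum(counts) for word, counts in grouped.items()}
print(word_counts)   # {'big': 2, 'data': 3, 'is': 1, 'science': 1, 'needs': 1}
```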

MPP

MPP stands for massively parallel processing, and it is the concept which gives the ability to process the volumes (and velocity and variety) of data flowing through systems. Chip processing capabilities are always increasing, but to cope with data volumes that are growing even faster, processing needs to be split across multiple engines. Technology that can split requests into equal(ish) chunks of work, manage the processing and then join the results has been difficult to develop. MPP can be centralised, with a cluster of chips or machines in a single or closely coupled cluster, or distributed, where the power of many distributed machines is used (think of 'idle' desktop PCs being used overnight as an example). Hadoop utilises many distributed systems for data storage and processing and also has fault tolerance built in, which enables processing to continue with the loss of some of those machines.

NoSQL

NoSQL really means 'not only SQL'; it is the term used for database management systems that do not conform to the traditional RDBMS model (transaction-oriented data management systems based on the ACID principles). These systems were developed by technology companies in response to the challenges raised by high volumes of data. Amazon, Google and Yahoo built NoSQL systems to cope with the tidal wave of data generated by their users.

Pig

Apache Pig is a platform for analysing huge data sets. It has a high-level language called Pig Latin, combined with a data management infrastructure that allows high levels of parallel processing. Again, like Hive, Pig Latin is compiled into MapReduce requests. Pig is also flexible, so additional functions and processing can be added by users for their own specific needs.

Real Time

The challenges in processing the "V"s of big data (volume, velocity and variety) have meant that some requirements have been compromised. In the case of Hadoop and MapReduce the compromise has been the interactive or instant availability of results. MapReduce is batch-oriented, in the sense that requests are sent for processing, scheduled to run and then the output is summarised. This works fine for the original purposes, but demand for more real-time or interactive results is growing. With a 'traditional' database or application, users expect the results to be available instantly or pretty close to instant. Google and others are developing more interactive interfaces to this kind of data: Google has Dremel (the inspiration for Apache Drill) and Twitter has released Storm. We see this as one of the most interesting areas of development in the Big Data space at the moment.

 

Over the next few months we have some guest contributors penning their thoughts on the future for big data, analytics and data science.  Also don’t miss Tim Seears’s (TheBigDataPartnership) article on maximising value from your data “Feedback Loops” published here in June 2012.

For the technically minded, Damian Spendel also published some worked examples using the 'R' language on data analysis and Value at Risk calculations.

These are our thoughts on the products and technologies – we would welcome any challenges or corrections and will work them into the articles.

 

Broadgate 2013 Predictions – how did we do?

Posted on : 30-12-2013 | By : richard.gale | In : Innovation


In December 2012 we identified some themes we thought would be important for the coming year. Let’s see how we got on…

1. Infrastructure Services continue to commoditise – for many organisations, Infrastructure as a Service (IaaS) is now mainstream. Technology advancement will continue to move the underlying infrastructure more towards a utility model and reduce costs in terms of software, hardware and resource.

This has happened and is continuing to grow; most organisations have the infrastructure in place to support IaaS with private clouds and virtualised environments. However, the flexibility and agility benefits have not always been realised, as large organisations' IaaS has sometimes been weighed down with the legacy change and build processes of the previous model. To circumvent this, many businesses are looking at public cloud for more flexible capacity. This will be the big growth area of 2014, especially with financial services organisations that have previously been hesitant in adopting public cloud solutions.

2. Application/Platform rationalisation – for many large firms there is still a large amount of legacy cost in terms of both disparate platforms, often aligned by business unit, and their sheer size/complexity. The next year will see an increase in rationalisation of application platforms to drive operational efficiency.

In 2013 the understanding and scale of the problem became more apparent but, with limited change/transformation budgets (in financial services mainly due to the burden of regulatory compliance requirements), not much action was taken. Now these complex webs of legacy applications are starting to fail and seriously constrain business growth. 2014 will be a 'crunch' year when these expensive problems have to be tackled head on, either through wholesale re-architecting or by giving someone else the problem of running them whilst new solutions are built.

3. Big Data/ Data Science grows and market starts to consolidate – 2012 was the year that Big Data technologies went mainstream…2013 will see an increased focus on Data Science resource and technology to maximise the analytical value. There will also be some consolidation at the infrastructure product level.

In financial services we saw a fair amount of discussion, some large proof of concept projects focusing on consolidation (many seem to be targeting the risk and finance areas), but not the levels of take up we expected. MasterCard have come in with a big data restaurant review concept. We may have been slightly premature with this one. We think the understanding of Data Science is starting to go mainstream and, as with Cloud, the demand will come more from the business rather than IT architects in 2014.

4. Data Centre/Hosting providers continue growth – fewer and fewer companies are talking about building their own data centres now, even the very large ones. With the focus on core business value, infrastructure will continue to be hosted externally driving up the need for provider compute power.

Many organisations either use external, more flexible hosting solutions or have an excess of capacity in their existing data centres. This will continue and grow in pace in 2014.

5. More rationalisation of IT organisations – 2012 saw large reductions in operational workforce, particularly in financial services. With revenues under more pressure this year (and in line with point 1) we will see further reductions in resource capacity and relocation to low cost locations, both nearshore and within the UK.

In the financial services sector this may be at an end. There will be growth in demand for IT skills in 2014 but there will be some reductions particularly in the infrastructure/BAU space due to the continued commoditisation of technology and move to XaaS services.

6. Crowd-funding services continue to gain market share – there have been many new entrants to this space over recent years with companies such as Funding Circle, Thin-Cats, Bank-to-the-Future and Kickstarter all doing well. We see this continuing to grow as access to funds from traditional lenders is still hard. The question is at what point will they step in.

This one was an easy prediction, as a low starting point combined with the banks' reluctance to lend, low interest rates and increasing interest in the tech sector inevitably led to high levels of growth. 2014 will continue this trend but with a higher degree of regulation after the first high-profile failure of a lending exchange…

7. ‘Instant’ Returns on investment required – growth of SaaS & BYOD is changing the perception of technology. People as consumers are now accustomed to an instant solution to a problem (by downloading an app or purchasing a service with a credit card). This, combined with historic patchy project successes, means that long lead-time projects are becoming harder to justify; IT departments are having to find near instant solutions to business problems.

Business users are leading IT departments on the adoption of SaaS in particular. IT is playing catch-up and the race will continue. We are not sure what 2014 will bring on this. It could be that IT departments regain control or, alternatively, are bypassed on a more frequent basis by impatient, IT savvy business users.

8. Technology Talent Wars – with start-ups disrupting traditional players in areas such as data analytics, social media and mobile payment apps, barriers to entry eroding and salaries on the rise we see a shift from talent wanting to join industries such as financial services and choosing new technology companies.

Relatively low demand from financial services firms (except for a few specific skills such as security) has deferred this. This is more likely to impact 2014 change and innovation programmes now.

9. Samsung/Android gain more ground over Apple – we already have seen the Apple dominance, specifically in relation to the Appstore, being eroded and this will continue as the potential of a more open platform becomes apparent to both developers and users of technology.

This has happened and will continue unless Apple can come up with some new magic. Phones/tablets are the new battleground, other operating systems such as Windows and potentially Jolla could disrupt the trend in 2014.

10. The death knell sounds for RIM/Blackberry – not much more to say. Most likely they will be acquired by one of the big new technology companies to gain access to the remaining smart phone users.

The only thing to add to this is that there may be a ‘dead-cat’ bounce for Blackberry in 2014.

 

Once again we hope you have enjoyed our monthly articles and have had a successful 2013. We wish you all the same for 2014!

 

What Tech companies can learn from Banks – There’s no such thing as a free lunch.

Posted on : 23-12-2013 | By : richard.gale | In : Finance, Innovation


Rewind to the 1990’s 

In the early '90s I moved from a Californian software start-up to a venerable merchant bank in the City. There were a number of changes in culture, which included being admonished for not wearing a jacket when walking through reception, but most thrilling was the staff restaurant… It was free and you could eat as much as you liked. I couldn't believe my luck! Older hands complained that they could no longer have a glass or two of wine with lunch (also free) and that the breakfasts 'weren't like they used to be'. I was amazed that the bank could afford to give away so much and munched my way through most of the decade there.

Banking Innovation

Apart from the impact on my waistline it was an exciting time. Historically, banking had been a straightforward affair, but now there was ever-increasing demand for new, innovative financial services, and the profitability of these could be immense. I worked with the derivatives team for a while and they made huge amounts of money creating and selling bonds to allow exposure to emerging stock markets. These securities were complex, and an increasing number of mathematical wizards and PhDs came into the department, attracted by the dual carrots of banking becoming fun and fashionable (again) and vast quantities of money delivered in bonuses.

Complexity

As more rivals joined the market the products became more complex and I, for one, soon lost the ability to work out what the underlying securities and risks were. I'm sure the clients had a better understanding than me, but I was not always convinced.

The drive for new and more exotic products accelerated and the intake of post-doctorates rose further. There were a number on the team that really were ‘rocket scientists’… The bank was increasing sales and profitability through the creation and trading of these products and everything was good.

Financial services firms moved into new sectors where they may not have had the same level of expertise, with varying degrees of success.

Concentration

As the demand for products grew and the supply of brain power was limited the inevitable happened and the price of skilled product innovators and traders went up. Also there were a significant number of acquisitions where global banks bought companies or teams with the lure of huge bonuses. This led to the concentration of skills within a small number of large organisations.

Costs

In addition to the high price of the Rainmakers, the cost of settling, accounting for and monitoring the trades was rising. Small departments that could rely on shared knowledge (and a degree of shouting) became too large, and compliance forced the separation of roles (particularly after the Barings affair). This resulted in a much increased level of fixed people costs for the banks, which was fine when business was growing but a heavy burden if growth slowed, as it was difficult to reduce the number of people without the processes failing.

Over-stretched

It is probably fair to say that a number of financial services firms over-stretched themselves financially and some of them both legally & morally too in pursuit of continued profitability & growth.

The now global, complex web of front/back office interactions, with teams of people trading and ensuring successful completion of trades, needed to be fed with ever more new types of products. Some of these (such as sliced and diced packages of mortgage-backed security derivatives) contributed to the banking crisis of 2008. Over-ambitious expansion plans through acquisition and merger increased unwise leverage further.

Lessons learnt

Financial services are now one of the most highly regulated and controlled industries in the world. The cost of doing business is extremely high, as reflected in the large amounts of money being spent on regulatory and compliance projects. This is resulting in a smaller number of larger organisations running most of the global banking sector, with reduced opportunities and, perhaps, less inclination to be innovative.

 

Fast-forward to now

 

Tech companies are massively innovative with new and exciting products emerging all the time

Technology products are the most exciting and most accessible they have ever been.

Tech companies can be immensely valuable and command huge stock market valuations

Too many to mention here… Google, Facebook, Twitter, Amazon, anything new etc.

Tech companies have virtually unlimited amounts of money available to them.

Tech companies attract the top talent

Tech companies are fashionable, can pay well and have the additional attraction of huge bonuses in the form of share options.

How these applications work is a mystery to the average consumer

We use, buy and promote products often without understanding or even caring about where our data is going because they make our lives easier or more fun.

 There are a small number of large companies dominating the market

Any small innovative company is snapped up by one of the global giants; they have very deep pockets, and price is almost irrelevant to them compared with market share.

Costs are increasing

The older, more established firms (Microsoft, Oracle etc.) have large cost bases, which impacts their ability to innovate and means a lot of hungry mouths to feed, eating into money potentially better used elsewhere, such as R&D. 'Newer' tech firms may not have reached that point but will sometime soon (Google's headcount grew from 20,000 in 2010 to 50,000+ in 2013). Not all of those people can be working on front-line product innovation.

Tech companies provide free lunch

Most technology companies are desperate to retain their valuable staff and provide many mechanisms to do this: free massages, childcare and lunches. I have been reliably informed that Google provide an unlimited buffet in their London campus, including lobster…

 

When will technology companies stop serving free lunches?

The technology sector is part way through a sustained boom. How long it will last is anyone's guess, but it would be good to think that the leaders of these companies can learn from the past and from the mistakes made by some of the banks. They should think about how they are growing, what areas they are getting into, how well they understand the products and risks, what impact this has on the agility and complexity of their business, and how they can prevent a drift towards complacency. How to stay aligned to the interests of your customers whilst remaining profitable for your shareholders is a difficult challenge when high levels of growth have been the norm.

If Tech companies do not recognise this and change then living up to those company slogans may get harder as employee numbers swell and profits get squeezed.

 

Self Diagnosing and Self Healing Systems

Posted on : 27-03-2013 | By : richard.gale | In : Innovation


Medical internet sites are leading the charge on self-diagnosis – working through a set of symptoms to produce a number of likely outcomes. In the automotive and aeronautical industries the concept of voting-based systems for 'mission-critical' decisions is well established (an Airbus has three sets of applications performing the same function, developed and tested by separate teams; 99.9% of the time they all make the same decision correctly, but if there is a dispute the majority 'wins').
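
A toy sketch of that 2-out-of-3 voting idea follows; it is purely illustrative and nothing like how real avionics software is written.

```python
# Toy 2-out-of-3 vote in the spirit of triple-redundant flight systems.
# Purely illustrative; real avionics voting is far more involved.
from collections import Counter

def majority_vote(results):
    """Return the answer that at least two of the three systems agree on."""
    answer, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        raise RuntimeError(f"no majority among {results}")
    return answer

print(majority_vote(["extend flaps", "extend flaps", "retract flaps"]))
# -> 'extend flaps'
```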

Many business systems rely on an army of people to change, fix, tune and oil the huge number of systems, applications and processes that reside in organisations. Why have the ideas used in other disciplines not been transferred to general business?

We think that there is a lot of long-term potential, but there is a long way to go, and the reasons are as follows:

Homogeneity – most systems use similar components or software, but business complexities result in very diverse implementations, and one firm's trade processing flow will look very different from another's. So the cost of producing a generic solution for understanding issues and resolving them automatically currently outweighs the benefits. One major bank we know has identified that 70% of its risk systems across investment banking and corporate have the same functions, but it is not going to consolidate them due to a combination of politics, strategic focus and potential regulatory impacts. If it did have the desire (and nerve!) to do this, it would be a perfect opportunity to build some simple feedback and decision-making abilities into the applications (we think, anyway…)

Impact – although business systems are large, they do not generally have the same impact or coverage. A medical self-diagnosis system requires human interaction but will also be broadly the same for seven billion people; if an Airbus A320 crashes due to systems failure, the number of people directly impacted is low, but the effect on the manufacturer, the airline and air travel generally is very visible and high.

Desire – most of these systems are 'good enough' and it is accepted practice to utilise a large team to support an application. Organisations look for efficiencies through standardisation, scaling, outsourcing and generally using lower-cost staff to support them. Organisations benchmark themselves against their peers, and if similar organisations are doing things a similar way then the desire for radical change can be reduced.

Risk – or fear of the unknown. There has been a great deal of research and experimentation with self-diagnosis/healing in electronic control systems, but the field is still young in the business applications space. Being an early mover could result in a very expensive failure, so risk-averse CIOs are unlikely to step up to this challenge without one of their peers going first.

Knowledge – this is, perhaps, the deciding factor in the usage of self-diagnosis. Electronic systems that control planes, although immensely complicated, usually have only a small number of potential outcomes; financial systems, with multiple forms of inputs, transformations, calculations, manual overrides, and legacy and diverse systems, can have an almost infinite number of outcomes or issues. No trading system is fully tested before it goes live, as the complexity of the testing process would mean the system would be obsolete before it was signed off. Couple that with a 10-year-old accounting engine written by 100 people (95 of whom have left the company), a bought-in messaging system and an outsourced settlement function, and it is little surprise that the inventive, creative minds of experienced people are needed to identify and resolve the myriad of issues emerging from the infrastructure.

So, for the short term at least, we think the armies of support staff across IT and business support are here to stay. But as technology continues to move forward we think there is a great opportunity for organisations to make a step-change in their support models and start building in self diagnosis and correction into their applications. The results in terms of operational efficiencies and reduced costs through errors and manual intervention could be enormous.
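
As a flavour of what a first step might look like, here is a minimal, hypothetical watchdog sketch: it diagnoses a component, applies a known remedy a few times and only then escalates to a human. All the function names and the scenario are invented placeholders, not a prescription for any particular system.

```python
# Hypothetical self-healing watchdog. Function names and the remedy are
# invented placeholders for whatever a real application would need.
import time

def feed_is_healthy() -> bool:
    # e.g. "has the trade feed produced a message in the last minute?"
    return False                      # simulate a fault for illustration

def restart_feed() -> None:
    print("remedy: restarting the feed consumer")

def watchdog(max_retries: int = 3, pause_seconds: float = 1.0) -> None:
    for attempt in range(1, max_retries + 1):
        if feed_is_healthy():
            print("diagnosis: feed healthy")
            return
        print(f"diagnosis: feed unhealthy, self-heal attempt {attempt}/{max_retries}")
        restart_feed()
        time.sleep(pause_seconds)
    print("self-heal failed: escalating to the support team")

watchdog()
```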

 

 


Broadgate Predicts 2013 – Survey Results

Posted on : 27-03-2013 | By : jo.rose | In : Data, Finance, General News, Innovation, IoT


In January we surveyed our clients, colleagues and partners against our predictions for 2013. We are pleased that we now have the results, the highlights of which are included below.

Key Messages

Infrastructure as a Service, Cloud and a shift to Data Centre & Hosted Services scored the highest, outlining the move from on-premise to a more utility based compute model.

Strategies to rationalise apps, infrastructure and organisations remain high on the priority list. However, removing the technology burden built up over many years is proving difficult.

Many commented on the current financial constraints within organisations and the impact to the predictions in terms of technology advancement.

Response Breakdown

[Chart: breakdown of survey responses]

Of the total responses received, the vast majority concurred with the predictions for 2013. A total of 78% either “Agreed” or “Strongly Agreed” (broadly in line with the 2012 survey).

Ranking

[Chart: predictions ranked by level of agreement, highest to lowest]

The diagram above shows the results in order from highest scoring to lowest. The continued growth in Infrastructure as a Service had the top overall ranking with 91% and the least was Crowd-funding with 53% agreement.

Respondents

[Chart: respondent profile]

We sent our predictions out to over 700 of our clients and associates. Unlike our previous years' surveys, we wanted to get feedback from all levels and functions, so alongside CIOs, COOs and technology leaders we also surveyed SMEs on both the buy and sell side of service delivery organisations.

We would like to thank all respondents for their input and particularly for the many that provided additional insight and commentary.

If you would like a copy of the full report, please email jo.rose@broadgateconsultants.com.

Broadgate Predicts – 2013

Posted on : 31-12-2012 | By : jo.rose | In : General News


As 2012 draws to a close we look forward to some themes for 2013.

These are our views, not those of analysts or market research firms. They are general observations of what we have determined during our interactions with clients over the past 12 months as to how we see the industry shaping up. Let us know what you think by completing the survey.

  1. Infrastructure Services continue to commoditise – for many organisations, Infrastructure as a Service (IaaS) is now mainstream. Technology advancement will continue to move the underlying infrastructure more towards a utility model and reduce costs in terms of software, hardware and resource.
  2. Application/Platform rationalisation – for many large firms there is still a large amount of legacy cost in terms of both disparate platforms, often aligned by business unit, and their sheer size/complexity. The next year will see an increase in rationalisation of application platforms to drive operational efficiency.
  3. Big Data/ Data Science grows and market starts to consolidate – 2012 was the year that Big Data technologies went mainstream…2013 will see an increased focus on Data Science resource and technology to maximise the analytical value. There will also be some consolidation at the infrastructure product level.
  4. Data Centre/Hosting providers continue growth – fewer and fewer companies are talking about building their own data centres now, even the very large ones. With the focus on core business value, infrastructure will continue to be hosted externally driving up the need for provider compute power.
  5. More rationalisation of IT organisations – 2012 saw large reductions in operational workforce, particularly in financial services. With revenues under more pressure this year (and in line with point 1) we will see more reductions in resource capacity and relocation to low cost locations, both nearshore and within the UK.
  6. Crowd-funding services continue to gain market share – there have been many new entrants to this space over recent years with companies such as Funding Circle, Thin-Cats, Bank-to-the-Future and Kickstarter all doing well. We see this continuing to grow as access to funds from traditional lenders is still hard. The question is at what point will they step in.
  7. ‘Instant’ Returns on investment required – growth of SaaS & BYOD is changing the perception of technology. People as consumers are now accustomed to an instant solution to a problem (by downloading an app or purchasing a service with a credit card). This, combined with historic patchy project successes, means that long lead-time projects are becoming  harder to justify; IT departments are having to find near instant solutions to business problems.
  8. Technology Talent Wars – with start-ups disrupting traditional players in areas such as data analytics, social media and mobile payment apps, barriers to entry eroding and salaries on the rise we see a shift from talent wanting to join industries such as financial services and choosing new technology companies.
  9. Samsung/Android gain more ground over Apple – we already have seen the Apple dominance, specifically in relation to the Appstore, being eroded and this will continue as the potential of a more open platform becomes apparent to both developers and users of technology.
  10. The death knell sounds for RIM/Blackberry – not much more to say. Most likely they will be acquired by one of the big new technology companies to gain access to the remaining smart phone users.
Click here to take part in our 2013 predictions survey.