Are you able to access all the data across your organisation?

Posted on : 31-03-2019 | By : richard.gale | In : Data, Finance

For many years data has been the lifeblood of the organisation and more recently, the value of this commodity has been realised by many companies (see our previous article “Data is like oil”).

Advances in technology, processing power and analytics mean that companies can collect and process data in real time. Most businesses are sitting on vast amounts of data, and those that can harness it effectively can gain a much deeper understanding of their customers, better predict their behaviour and improve the customer experience.

Our survey revealed that, whilst most companies understand the value of their data and the benefits it can bring, many clients are frustrated with the systems and processes that manage it. Some respondents did qualify that “most of the data” was available, whilst others admitted some was stranded.

“Data is in legacy silos, our long-term goal is to provide access through a consistent data management framework”

The legacy-system deficiencies we also discuss in this newsletter are partly, though not wholly, responsible for this. It is a particular issue in financial services, where many organisations are running on old systems that are too complex and too expensive to replace. Critical company data is trapped in silos, disconnected from and incompatible with the rest of the enterprise.

These silos present a huge challenge for many companies. Recalling the comment of one Chief Data Officer at a large institution:

“If I ask a question in more than one place, I usually get more than one answer!”

Data silos are expanding as companies collect too much data and hold onto it for longer than they need to. Big data has been a buzzword for a while now, but it is important that companies distinguish between big data and big bad data! The number of data sources is increasing all the time, so the issue must be addressed if the data is to be used effectively and return business value. The collection of a virtually unlimited amount of data needs to be managed properly, to ensure that everything stored has a purpose and can be protected.

Shadow data further exacerbates the issue. This data is unverified, often inaccurate and out of date. Oversharing results in it being stored in locations that are unknown and untraceable, creating yet more data silos hidden from the wider enterprise. Worse, this data is often treated as a valid source, relied upon and used as input into other systems, which can ultimately lead to bad business decisions.

The importance of a robust data governance and management strategy cannot be overstated, particularly for those serious about the digital agenda and customer experience. This is also a topic where business and IT leadership must align on both the product strategy and the underlying “data plumbing”. It is not just about systems, but about the organisation’s attitude to data and its importance in the life of every business process. Companies should implement a data management strategy which encompasses not only the internal platforms and governance, but also the presentation layer for business users, consumers and data insights.
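One pattern behind a “consistent data management framework” of the kind the survey respondent described is a single canonical access service that maps each silo’s field layout onto one agreed model, so every consumer gets the same answer to the same question. The sketch below is purely illustrative: the class names, field layout and sample record are all invented, not taken from any real system.

```python
from dataclasses import dataclass


@dataclass
class Customer:
    """Canonical customer record agreed across the enterprise."""
    customer_id: str
    name: str
    email: str


class LegacyCrmSource:
    """Hypothetical silo: returns customers in its own field layout."""
    def fetch(self, customer_id: str) -> dict:
        return {"id": customer_id, "full_name": "Ada Lovelace", "mail": "ada@example.com"}


class CustomerDataService:
    """Single point of access: every consumer asks here, so every
    consumer gets the same answer for the same question."""
    def __init__(self, source: LegacyCrmSource):
        self.source = source

    def get_customer(self, customer_id: str) -> Customer:
        raw = self.source.fetch(customer_id)
        # Map the silo's private field names onto the canonical model.
        return Customer(customer_id=raw["id"], name=raw["full_name"], email=raw["mail"])


service = CustomerDataService(LegacyCrmSource())
print(service.get_customer("C-001"))
```

Adding a second silo then means writing one more mapping, not teaching every downstream consumer a second vocabulary.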

The ultimate way to move beyond trading latency?

Posted on : 29-03-2019 | By : richard.gale | In : Finance, Uncategorized

A number of power surges and outages have been experienced in the East Grinstead area of the UK in recent months. The utility companies involved have traced the cause to one of three high-capacity feeds into a global investment bank’s data centre facility.

The profits created by the same bank’s London-based Proprietary Trading group have increased tenfold over the same period.

This bank employs 1% of the world’s best post-doctoral theoretical physics graduates to help build its black box trading systems.

Could there be a connection? Wild and unconfirmed rumours have been circulating within the firm of a major breakthrough in removing the problem of latency – the physical limit on the time it takes a signal to travel down a wire, ultimately governed by the speed of light.

For years traders have been trying to reduce execution latency to gain a competitive advantage in a highly competitive, fast-moving environment. The focus has moved from seconds to milli- and now microsecond savings.

Many financial services and technology organisations have attacked this problem by reducing data hops, optimising routing and even placing their hardware physically close to the source of data (such as in an exchange’s data centre) to minimise latency, but no one has solved the issue – yet.

It sounds like this bank may have gone one step further. It is known that at the boundary of the speed of light, physics as we know it changes (quantum mechanics, for example, is where the time/space continuum becomes ‘fuzzy’). Conventional physics states that travelling faster than the speed of light – and so seeing into the future – would require infinite energy, and so is not possible.

Investigation with a number of insiders at the firm has yielded an amazing and almost unbelievable insight. They have managed to build a device which ‘hovers’ over the present and the immediate future. Little detail is known about it, but it is understood to be based on the previously unproven ‘Alcubierre drive’ principle. This allows the trading system to predict (in reality, observe) the next direction in the market, providing an invaluable trading advantage.

The product is still in test mode, as trading ahead of data the system has already traded against is producing outages: the system tries to correct the error in the future data, which again changes the data, ad infinitum… The prediction model only allows a small glimpse into the immediate future, which also limits the window of opportunity for trading.

The power requirements for the equipment are so large that it has had to be moved to the data centre environment, where consumption can be more easily hidden (or not, as the power outages showed).

If the bank really does crack this problem then it will have the ultimate trading advantage – the ability to see into the future and trade with ‘inside’ knowledge, legally. Unless another bank is doing something similar in the ‘trading arms race’, the bank will quickly become dominant and the others may go out of business.

The US Congress has apparently discovered some details of this mechanism and is requesting that the bank disclose details of the project. The bank is understandably reluctant to do so, as it has spent over $80m on development and wants to make some return on its investment.

If this system goes into true production mode, surely it cannot be long before financial regulators outlaw the tool, as it will both distort and ultimately destroy the markets.

Of course, the project has a codename… Project Tachyons.

No one from the company was available to comment on the accuracy of the claims.

Do you believe that your legacy systems are preventing digital transformation?

Posted on : 14-03-2019 | By : richard.gale | In : Data, Finance, FinTech, Innovation, Uncategorized

According to the results of our recent Broadgate Futures Survey, more than half of our clients agreed that digital transformation within their organisation was being hampered by legacy systems. Indeed, no one “strongly disagreed”, confirming the extent of the problem.

Many comments suggested that this was not simply a case of budget constraints; the sheer size, scale and complexity of the transition had deterred organisations, which feared they were not adequately equipped to deliver successful change.

Legacy systems have a heritage going back many years, to the days of the mega mainframes of the 1970s and 1980s. This was a time when banks were the masters of technological innovation: we saw the birth of ATMs, BACS and international card payments. It was an exciting time of intense modernisation. Many of the core systems that run the finance sector today are the same ones that were built back then. The only problem is that, although these systems were built to last, they were not built for change.

The new millennium brought another significant development with the arrival of the internet, an opportunity the banks could have seized to develop new, simpler, more versatile systems. Instead, they adopted a different strategy and modified their existing systems; in their eyes there was no need to reinvent the wheel. They made additions and modifications as and when required. As a result, most financial organisations have evolved over the decades into complex networks of systems, a myriad of applications and an overloaded IT infrastructure.

The Bank of England itself has recently been severely reprimanded by a Commons Select Committee review, which found the Bank to be drowning in out-of-date processes in dire need of modernisation. Its legacy systems are overly complicated and inefficient; following the merger with the PRA in 2014, its IT estate comprises duplicated systems and extensive data overload.

Budget, as stated earlier, is not the only factor preventing digital transformation, although there is no doubt that these projects are expensive and extremely time-consuming. The complexity of the task and the fear of failure are further reasons why companies hold on to their legacy systems. Better the devil you know! Think back to the TSB outage (there were a few…): systems were down for hours and customers were unable to access their accounts following a system upgrade. The incident ultimately led to huge fines from the Financial Conduct Authority and the resignation of the Chief Executive.

For most organisations, abandoning their legacy systems is simply not an option, so they need to find ways to update them in order to connect to digital platforms and plug into new technologies.

Many of our clients believe that it is not the legacy systems themselves which are the barrier, but the inability to access the vast amount of data stored in their infrastructure. It is the data that is the key to digital transformation, so accessing it is a crucial piece of the puzzle.

“It’s more about legacy architecture and lack of active management of data than specifically systems”

By finding a way to unlock the data inside these out-of-date systems, banks can decentralise their data, making it available to the new digital world.

With advancements such as the cloud and APIs, it is possible to place an agility layer between the existing legacy systems and newly adopted applications. HSBC has successfully adopted this approach, using an API strategy to expand its digital and mobile services without needing to replace its legacy systems.
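In practice, an agility layer of this kind is often a thin translation service: it speaks the legacy core’s native format on one side and clean JSON on the other, so new digital channels never see the old system directly. The sketch below is a minimal illustration only; the fixed-width record layout, field offsets and function names are all invented for the example, not drawn from any real banking system.

```python
import json

# Hypothetical fixed-width balance record as a legacy core might emit it:
# account id (8), surname (10), currency (3), balance in pence (9).
LEGACY_RECORD = "00012345SMITH     GBP000150000"


def parse_legacy_record(record: str) -> dict:
    """Translate the legacy fixed-width layout into a JSON-ready dict.

    The offsets here are invented for illustration.
    """
    return {
        "account_id": record[0:8].lstrip("0"),
        "surname": record[8:18].strip(),
        "currency": record[18:21],
        "balance_pence": int(record[21:30]),
    }


def get_balance(account_id: str) -> str:
    """The agility-layer endpoint: callers receive clean JSON and never
    need to know the legacy record format exists."""
    payload = parse_legacy_record(LEGACY_RECORD)
    return json.dumps(payload)


print(get_balance("12345"))
```

Because all the format knowledge lives in the translation layer, the legacy core can later be replaced (or its record layout changed) without touching any of the digital applications consuming the JSON interface.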

Legacy systems are no longer the barrier to digital innovation that they once were. With some creative thinking and the adoption of new technologies, legacy can continue to be part of your IT infrastructure in 2019!

https://www.finextra.com/newsarticle/33529/bank-of-england-slammed-over-outdated-it-and-culture