Are you able to access all the data across your organisation?

Posted on : 31-03-2019 | By : richard.gale | In : Data, Finance


For many years data has been the lifeblood of the organisation, and more recently many companies have realised the value of this commodity (see our previous article "Data is like oil").

Advances in technology, processing power and analytics mean that companies can collect and process data in real time. Most businesses are sitting on vast amounts of data, and those that can harness it effectively gain a much deeper understanding of their customers, can better predict behaviour and can improve the customer experience.

Our survey revealed that whilst most companies understand the value of their data and the benefits it can bring, many expressed a level of frustration with the systems and processes that manage it. Some respondents did qualify that "most of the data" was available, whilst others admitted some was stranded.

 “Data is in legacy silos, our long-term goal is to provide access through a consistent data management framework”

The deficiencies regarding legacy systems that we also discuss in this newsletter are partly, though not wholly, responsible for this. It is a particular issue in financial services, where many organisations are running on old systems that are too complex and too expensive to replace. Critical company data is trapped in silos, disconnected and incompatible with the rest of the enterprise.

These silos present a huge challenge for many companies. Recalling a comment of one Chief Data Officer at a large institution:

“If I ask a question in more than one place, I usually get more than one answer!”

Data silos are expanding as companies collect too much data and hold onto it for longer than they need to. Big data has been a buzzword for a while now, but it is important that companies distinguish between big data and big bad data! The number of data sources is increasing all the time, so the issue must be addressed if the data is to be used effectively to return business value. The collection of a virtually unlimited amount of data must be managed properly to ensure that all data stored has a purpose and can be protected.

Shadow data further exacerbates the issue. This data is unverified, often inaccurate and out of date. Oversharing results in it being stored in areas that are unknown and untraceable, creating yet more data silos hidden from the wider enterprise. Worse, this data is treated as a valid source, relied upon and used as input into other systems, which can ultimately lead to bad business decisions.

The importance of a robust data governance and management strategy cannot be overstated, particularly for those serious about the digital agenda and customer experience. This is also a topic where business and IT leadership must align on the product strategy and the underlying "data plumbing". It is not just about systems but also about the organisation's attitude to data and its importance in the life of every business process. Companies should implement a data management strategy which encompasses not only the internal platforms and governance but also the presentation layer for business users, consumers and data insights.

The ultimate way to move beyond trading latency?

Posted on : 29-03-2019 | By : richard.gale | In : Finance, Uncategorized



A number of power surges and outages have been experienced in the East Grinstead area of the UK in recent months. The utility companies involved have traced the cause to one of three high-capacity feeds to a global investment bank's data centre facility.

The profits created by the same bank's London-based proprietary trading group have increased tenfold over the same period.

This bank employs 1% of the world's best post-doctoral theoretical physics graduates to help build its black-box trading systems.

Could there be a connection? Wild and unconfirmed rumours have been circulating within the firm that a major breakthrough has been made in removing the problem of latency: the physical limit on the time it takes a signal to travel down a wire, ultimately governed by the speed of light.

For years traders have been trying to reduce execution latency to gain an edge in a highly competitive, fast-moving environment. The focus has moved from seconds to milliseconds and now to microseconds.

Many financial services and technology organisations have attempted to solve this problem by reducing data hops, optimising routing, and even placing their hardware physically close to the source of data (such as in an exchange's data centre) to minimise latency, but no one has solved the issue – yet.
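To give a sense of the scales involved, below is a minimal sketch (purely illustrative, and nothing like the specialised hardware real trading firms use) of measuring round-trip latency at microsecond resolution in Python, using a loopback TCP echo server:

```python
import socket
import statistics
import threading
import time

HOST, PORT = "127.0.0.1", 9009  # illustrative loopback endpoint

def echo_server() -> None:
    """Accept one connection and echo each byte straight back."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            while data := conn.recv(1):
                conn.sendall(data)

threading.Thread(target=echo_server, daemon=True).start()
time.sleep(0.2)  # give the server a moment to start listening

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)  # send immediately
    samples = []
    for _ in range(1000):
        start = time.perf_counter_ns()
        cli.sendall(b"x")
        cli.recv(1)
        samples.append(time.perf_counter_ns() - start)

print(f"median round trip: {statistics.median(samples) / 1000:.1f} microseconds")
```

Even on loopback, a round trip typically costs tens of microseconds, which is why co-location and specialised networking hardware matter so much at this scale.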

It sounds like this bank may have gone one step further. It is known that at the boundary of the speed of light, physics as we know it changes (quantum mechanics is one example, where the time/space continuum becomes 'fuzzy'). Conventional physics states that travelling faster than the speed of light, and so seeing into the future, would require infinite energy and is therefore not possible.

Conversations with a number of insiders at the firm have produced an amazing and almost unbelievable insight. They have managed to build a device which 'hovers' over the present and immediate future. Little detail is known about it, but it is understood to be based on the previously unproven 'Alcubierre drive' principle. This allows the trading system to predict (in reality, observe) the next direction in the market, providing an invaluable trading advantage.

The product is still in test mode: trading ahead of data the system has already traded against produces outages, as it then tries to correct the error in the future data, which again changes the data, ad infinitum… The prediction model allows only a small glimpse into the immediate future, which also limits the window of opportunity for trading.

The power requirements for the equipment are so large that it has had to be moved to the data centre environment, where consumption can be more easily hidden (or not, as the power outages showed).

If the bank really does crack this problem then it will have the ultimate trading advantage: the ability to see into the future and trade with 'inside' knowledge legally. Unless another bank is doing something similar in the 'trading arms race', the bank will quickly become dominant and the others may go out of business.

The US Congress has apparently discovered some details of this mechanism and is requesting that the bank disclose details of the project. The bank is understandably reluctant to do so, as it has spent over $80m developing the system and wants to make some return on its investment.

If this system goes into true production mode, surely it cannot be long before financial regulators outlaw the tool, as it will both distort and ultimately destroy the markets.

Of course the project has a codename…. Project Tachyons

No one from the company was available to comment on the accuracy of the claims.

Do you believe that your legacy systems are preventing digital transformation?

Posted on : 14-03-2019 | By : richard.gale | In : Data, Finance, FinTech, Innovation, Uncategorized



According to the results of our recent Broadgate Futures Survey, more than half of our clients agreed that digital transformation within their organisation was being hampered by legacy systems. Indeed, no one "strongly disagreed", confirming the extent of the problem.

Many comments suggested that this was not simply a case of budget constraints; rather, the sheer size, scale and complexity of the transition had deterred organisations, which feared they were not adequately equipped to deliver successful change.

Legacy systems have a heritage going back many years to the days of the mega mainframes of the 70s and 80s. This was a time when banks were the masters of technological innovation. We saw the birth of ATMs, BACS and international card payments. It was an exciting time of intense modernisation. Many of the core systems that run the finance sector today are the same ones that were built back then. The only problem is that, although these systems were built to last, they were not built for change.

The new millennium brought another significant development with the introduction of the internet, an opportunity the banks could have seized to develop new, simpler, more versatile systems. Instead, they adopted a different strategy and modified their existing systems; in their eyes there was no need to reinvent the wheel. They made additions and modifications as and when required. As a result, most financial organisations have evolved over the decades into complex networks, a myriad of applications and an overloaded IT infrastructure.

The Bank of England itself has recently been severely reprimanded by a Commons Select Committee review, which found the Bank to be drowning in out-of-date processes in dire need of modernisation. Its legacy systems are overly complicated and inefficient; following the merger with the PRA in 2014, its IT estate comprises duplicated systems and extensive data overload.

Budget, as stated earlier, is not the only factor preventing digital transformation, although there is no doubt that these projects are expensive and extremely time-consuming. The complexity of the task and the fear of failure are other reasons why companies hold on to their legacy systems. Better the devil you know! Think back to the TSB outage (there were a few…), when systems were down for hours and customers were unable to access their accounts following a system upgrade. The incident ultimately led to huge fines from the Financial Conduct Authority and the resignation of the Chief Executive.

For most organisations, abandoning their legacy systems is simply not an option, so they need to find ways to update them in order to connect to digital platforms and plug into new technologies.

Many of our clients believe that it is not the legacy systems themselves which are the barrier, but rather the inability to access the vast amount of data stored within them. It is the data that is the key to digital transformation, so accessing it is a crucial piece of the puzzle.

“It’s more about legacy architecture and lack of active management of data than specifically systems”

By finding a way to unlock the data inside these out-of-date systems, banks can decentralise their data, making it available to the new digital world.

With the advent of the cloud and APIs, it is possible to sit an agility layer between the existing legacy systems and newly adopted applications. HSBC has successfully adopted this approach, using an API strategy to expand its digital and mobile services without needing to replace its legacy systems.
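As a rough illustration of the pattern (a minimal sketch, not HSBC's actual architecture), an agility layer can wrap a legacy lookup behind a clean, versioned JSON endpoint. Here `legacy_balance_lookup` is a hypothetical stand-in for whatever mechanism reaches the old core system:

```python
from flask import Flask, jsonify

app = Flask(__name__)

def legacy_balance_lookup(account_id: str) -> int:
    """Hypothetical stand-in for a call into the legacy core
    (screen-scrape, MQ message, stored procedure, etc.).
    Returns the balance in pence."""
    return {"12345": 104250}.get(account_id, 0)

@app.route("/api/v1/accounts/<account_id>/balance")
def balance(account_id: str):
    # The API layer translates the legacy response into a clean,
    # versioned JSON contract that mobile and web apps can consume.
    pence = legacy_balance_lookup(account_id)
    return jsonify(account=account_id, balance=pence / 100, currency="GBP")

if __name__ == "__main__":
    app.run(port=8080)
```

The legacy system stays untouched; new channels integrate against the stable API contract, and the plumbing behind it can be modernised later without breaking them.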

Legacy systems are no longer the barrier to digital innovation that they once were. With some creative thinking and the adoption of new technologies, legacy can continue to be part of your IT infrastructure in 2019!

https://www.finextra.com/newsarticle/33529/bank-of-england-slammed-over-outdated-it-and-culture

M&A – Cyber Security Due Diligence

Posted on : 31-08-2018 | By : richard.gale | In : Cyber Security, data security, Finance



Following the discovery of two data breaches affecting more than 1 billion Yahoo Inc. users, Verizon Communications Inc. reduced its offer to acquire the company by $350 million in 2017. This transaction illustrates how a company's reputation and future are impacted by cybersecurity; failure to investigate these measures during mergers and acquisitions could lead to costly integration, unexpected liability and higher overall enterprise risk.

We see almost daily the effect a data breach can have, with companies losing millions through direct losses, reputational damage and lost customer loyalty. A hurried or limited cybersecurity vetting process may miss exposures or key indicators of an existing or prior breach.

It is crucial to understand cybersecurity vulnerabilities, the damage that may occur in the event of a breach, and the effectiveness of the infrastructure that the target business has in place. An appropriate evaluation of these areas could significantly impact the value that the acquirer places on the target company and how the deal is structured. A security assessment of the to-be-acquired company is therefore essential.

It wasn’t that long ago that mergers and acquisition deals were conducted in a paper-based room secured and locked down to only those with permitted access.  These days the process has moved on and is now mostly online, with the secure virtual data room being the norm. Awareness of cyber security in the information gathering part of the deal making process is well established. It is the awareness and need to look at the cyber security of the target company itself that has traditionally been under emphasised, looking more at the technical and practical job of integrating the merged companies’ infrastructure.

Acquirers must assess the cyber risk of an organisation in the same way that they would assess overall financial risk. Due diligence is all about establishing the potential liabilities of the company you are taking on. According to the Verizon Data Breach Investigations Report, it takes an average of 206 days to discover a breach; often companies are breached without ever knowing. It is therefore important to look at cyber risk not just in terms of whether they have been breached, but also the likelihood and impact of a breach. An acquisition target that looks good at the time of closing the deal may not look quite so good a few months later.

The main reason for this lack of attention to the cyber threat is that M&A teams find it hard to quantify cyber risk, particularly given the time pressures involved. A cyber risk assessment at the M&A stage is crucial if the acquiring company wants to protect its investment. The ability to carry out this assessment and to quantify the business impact of a likely cyber breach with a monetary value is invaluable to deal makers. Broadgate's ASSURITY Assessment provides this information in a concise, value-specific way, using business language to measure risks, likelihood and cost of resolution.

A cyber security assessment should be part of every M&A due diligence process. If you don't know what you are acquiring in terms of intellectual property and cyber risk, how can you possibly know the true value of what you are acquiring?

 

The Opportunity for Intelligent Process Automation in KYC / AML

Posted on : 28-06-2018 | By : richard.gale | In : compliance, Data, Finance, FinTech, Innovation



Financial services firms have been preoccupied with meeting the rules and regulations for fighting financial crime for the best part of the past decade. Ever since HSBC received sanction from both UK and US regulators in 2010, many other firms have also been caught short in failing to meet society's expectations in this space. There have been huge programmes of change and remediation, amounting to tens of billions of any currency you choose, to try to get Anti-Financial Crime (AFC) or Know Your Customer (KYC) / Anti-Money Laundering (AML) policies, risk methodologies, data sources, processes, organisation structures, systems and client populations into shape – at least enough to meet the expectations of regulators, if not exactly stop financial crime.

The challenge for the industry is that financial crime is a massive and complex problem to solve. It covers not just the detection and prevention of money laundering, but also terrorist financing, bribery & corruption and tax evasion. Therefore, as the banks, asset managers and insurers have been doing, there is a need to focus upon all elements of the AFC regime, from education to process and all the other activities in between. Estimates of the scale of the problem vary, but the consensus is that somewhere between $3 trillion and $5 trillion is introduced into the financial system each year.

However, progress is being made. Harmonisation, clarity of industry standards and more consistency have come from the regulators with initiatives such as the 4th EU AML Directive. The importance of the controls is certainly better understood within financial services firms and by their shareholders. What has not yet progressed significantly are the processes of performing client due diligence and monitoring clients' subsequent activity. Most would argue that this is down to a number of factors, the greatest challenge possibly being the disparate and inconsistent nature of the data required to support these processes. Data needs to be sourced in many formats from country registries, stock exchanges, documents of incorporation, multiple media sources and so on. Still today many firms have a predominantly manual process to achieve this, even when much of the data is available in digital form. Many still do not automatically ingest data into their workflows and have poorly defined processes for progressing onboarding or monitoring activities. And this is for the regulations as they stand today; in the future this burden will further increase, as firms will be expected to take all possible efforts to determine the integrity of their clients, i.e. by establishing linkages to bad actors through data sources such as social media and the dark web that are not evident in traditional sources such as company registries.

There have been several advances in recent years in technologies that have enormous potential for supporting the AFC cause. Data vendors have made big improvements in providing broader and higher-quality data. Aggregation solutions such as Encompass offer services where the constituents of a corporate ownership structure can be assembled, and sanctions & PEP checks undertaken, in seconds rather than the current norm of multiple hours. This works well where the data is available from a reliable electronic source. However, it does not work where there are no, or only unreliable, sources of digital data, as is the case for trusts or in many jurisdictions around the world. Here we quickly get back to the world of paper and PDFs, which still require human horsepower to review and decision.
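The screening step itself is conceptually simple. Here is a minimal, illustrative sketch of fuzzy name-matching against a watch list; the names and threshold are invented, and real sanctions/PEP screening relies on far richer matching logic and vendor data:

```python
from difflib import SequenceMatcher

# Toy watch list; real services aggregate sanctions and PEP data
# from many registries and vendors (these names are invented).
WATCH_LIST = ["Ivan Petrov", "Acme Shell Holdings Ltd", "Maria Gonzalez"]

def screen(name: str, threshold: float = 0.85) -> list[tuple[str, float]]:
    """Return watch-list entries whose similarity to `name` meets the
    threshold. Fuzzy matching catches transliteration variants and
    minor spelling differences that exact matching would miss."""
    hits = []
    for entry in WATCH_LIST:
        score = SequenceMatcher(None, name.lower(), entry.lower()).ratio()
        if score >= threshold:
            hits.append((entry, round(score, 2)))
    return sorted(hits, key=lambda h: -h[1])

print(screen("Ivan Petroff"))  # likely hit despite the spelling variant
print(screen("John Smith"))    # no hits
```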

Getting the information in the first instance can be very time-consuming, with complex interactions between multiple parties (relationship managers, clients, lawyers, data vendors, compliance teams etc.) and multiple communication channels, i.e. voice, email and chat in their various forms. We also have the challenge of adverse media, where thousands of news stories are generated every day on the corporates and individuals that are the clients of financial firms. The news items can be positive or negative, but it takes tens of thousands of people to review, eliminate or investigate this mountain of data each day. The same challenges come with transaction monitoring, where individual firms can have thousands of 'hits' every day on 'unusual' payment patterns or 'questionable' beneficiaries. These also require review, repair, discounting or further investigation, and the clear majority are false positives that can be readily discarded.

Probably the most interesting opportunity for allowing the industry to see the wood for the trees in this data-heavy world is the maturing of Artificial Intelligence (AI) based, or 'Intelligent', solutions. The combination of Natural Language Processing (NLP) with Machine Learning can help the human find the needles in the haystack, or make sense of unstructured data that would ordinarily take much time to read and record. AI on its own is not a solution, but combined with process management (workflow), digitised multi-channel communications and even robotics it can achieve significant advances. In summary, 'Intelligent' processing can address three of the main data challenges within the AFC regimes of financial institutions:

  1. Sourcing the right data – Where data is structured and digitally obtainable it can be readily harvested, but it needs to be integrated into the process flows to be compared, analysed, accepted or rejected as part of a review process. Here AI can be used to perform these comparisons, support analysis and look for patterns of common or disparate data. Where the data is unstructured, i.e. embedded in a document (email / PDF / doc etc.), then NLP and machine learning can be used to extract the relevant data and turn the unstructured into structured form for onward processing.
  2. Filtering – With both transaction monitoring and adverse media reviews there is a tsunami of data and events presented to Compliance and Operations teams for sifting, reviewing, rejecting or further investigation. AI can be extremely effective at performing this sifting and presenting back only relevant results to users (see the sketch after this list). Done correctly this can reduce the burden by 90+% and, perhaps more importantly, never miss or overlook a case, providing reassurance that relevant data is being captured.
  3. Automating decisions – By using intelligent workflows, processes can be fully automated where simple decision-making is supported by AI, removing the need for manual intervention in many tasks and leaving the human to provide value at the complex end of problem solving.
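As a rough illustration of the filtering idea (point 2 above), the toy sketch below trains a text classifier to separate adverse from benign headlines and routes only likely-adverse items to an analyst. The headlines, labels and threshold are all invented; a production system would be trained on large volumes of analyst-labelled data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set: 1 = adverse, 0 = benign.
headlines = [
    "Regulator fines bank over sanctions breaches and laundering failures",
    "Director charged with bribery in procurement scandal",
    "Executive investigated for tax evasion scheme",
    "Company announces record quarterly earnings",
    "Firm opens new headquarters and hires 200 staff",
    "Retailer launches summer marketing campaign",
]
labels = [1, 1, 1, 0, 0, 0]

# TF-IDF features over words and word pairs, fed into a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(headlines, labels)

incoming = [
    "Prosecutors probe shipping firm for sanctions breaches",
    "Local bakery wins best pastry award",
]
# Route each story by the model's adverse-probability score.
for story, p_adverse in zip(incoming, model.predict_proba(incoming)[:, 1]):
    verdict = "escalate to analyst" if p_adverse >= 0.5 else "auto-discard"
    print(f"{p_adverse:.2f}  {verdict}: {story}")
```

In practice the interesting work is in tuning the threshold so that nothing genuinely adverse slips through while the bulk of benign noise is discarded automatically.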

Solutions are now emerging in the industry, such as OPSMATiX, one of the first Intelligent Process Automation (IPA) solutions. Devised by a group of industry business experts as a set of technologies that combine to make sense of data across different communication channels, it uses AI to turn unstructured data into structured, and applies robust workflows to optimally manage the resolution of cases, exceptions and issues. The data vendors, and solution vendors such as Encompass, are also embracing AI techniques and technologies to create 'smart filters' that can scour thousands, if not millions, of pieces of news and other media to discover, or discount, information of interest. This can be achieved in a tiny fraction of the time, and therefore cost, and more importantly with far better accuracy than a human can achieve. The outcome will be to liberate the human from the process: firms can either choose to reduce the costs of their operations or use people more effectively to investigate and analyse those events, information and clients that may be of genuine cause for concern, rather than deal with the noise.

Only once the process has been made significantly more efficient and the data brought under control can financial firms really start to address the insidious business of financial crime. Currently all the effort is still going into meeting the regulations rather than society's actual demand, which is to combat this global menace. Intelligent processing should unlock this capability.

 

Guest Author: David Deane, Managing Partner of FIMATIX and CEO of OPSMATiX. David has had a long and illustrious career in global Operations and Technology leadership with wholesale banks and wealth managers. Before creating FIMATIX and OPSMATiX, he was most recently the Global Head of KYC / AML Operations for a Tier 1 wholesale bank.

david.deane@fimatix.com

Welcoming Robots to the Team

Posted on : 30-05-2018 | By : richard.gale | In : Finance, FinTech, Innovation



Research suggests that the adoption of Robotic Process Automation (RPA) and AI technologies is set to double by 2019. This marks a fundamental change in how organisations work, and the potential impact on employees should not be underestimated.

For many years we have seen robots on the factory floor, where manual processes have been replaced by automation. This has drastically changed the nature of manufacturing and has inevitably led to a reduction in these workforces. It is understandable, therefore, that we can hear the trembling voices of city workers shouting, "the robots are coming!"

Robotic software should not be thought of as the enemy but rather as a friendly addition to the IT family, and a different approach is needed. If you were replacing an Excel spreadsheet with a software program, an employee would see this as an advantage, as it makes their job quicker and easier to do, and would therefore welcome the change. Looking at RPA in the same way will change how employees view its implementation and how they feel about it.

There is no doubt that in some cases RPA is intended as a cost saver, but organisations that see RPA simply as a cost-saving solution will reap the least rewards. For many companies that have already completed successful RPA programmes, the number one priority has been to eliminate repetitive work that employees didn't want or need to do. Approaching an RPA project in a carefully thought-out and strategic manner will show that RPA and employees can work together.

Successful transformation using RPA relies on an often-used but very relevant phrase: "it's all about the people, process and technology". You need all three in the equation. It is undeniable that automation is a disruptive technology which will affect employees' outlook and the way they work. Change management is key in managing these expectations. If robots are to be a part of your organisation, then your employees must be prepared and included.

Perhaps it’s time to demystify RPA, and see it for what is really is, just another piece of software! Automation is about making what you do easier to execute, with less mistakes and greater flexibility. It is important to demonstrate to your staff that RPA is part of a much wider strategic plan of growth and new opportunities.

It is vital to communicate with staff at every level, explaining the purpose of RPA and what it will mean for them. Ensure everyone understands the implications and the benefits of the transition to automation. Even though activities and relationships within an organisation may change, this does not necessarily mean a change for the worse.

Employees must be involved from the start of the process. Those individuals who have previously performed the tasks to be automated will be your subject matter experts. You will need to train several existing employees in RPA to manage the process going forward. Building an RPA team from current employees will ensure that you have their buy-in, which is crucial if the implementation is to be a success.

With any new software, training is often an afterthought. In the case of RPA, training is more important than ever, ensuring that the robots and employees understand each other and can work efficiently together. Training RPA experts internally will result in a value-added proposition for the future when it comes to maintaining or scaling your solution.

When analysing the initial RPA requirements, a great deal of thought must be given to the employees who are being replaced and where their skills can effectively be redeployed. Employee engagement increases when personnel feel that their contribution to the organisation is meaningful and widespread.

Consultation and collaboration throughout the entire process will help to ensure a smoother transition where everyone can feel the benefits. Following a successful RPA implementation, share the results with everyone in your organisation. Share the outcomes and what you have learnt, and highlight those employees and teams that have helped along the way.

The robots are coming! They are here to help and at your service!

OK Google, Alexa, Hey Siri – The Rise of Voice Control Technology

Posted on : 30-04-2018 | By : kerry.housley | In : Consumer behaviour, Finance, FinTech, Innovation, Predictions



OK Google, Alexa, Hey Siri… all too familiar phrases around the home now, but it was not that long ago that we did not know what a 'smartphone' was! Today most people could not live without one. Imagine not being able to check your email, instant message friends or watch a movie whilst on the move. How long will it be before we no longer need a keyboard, and talking to our computers is the norm?

The development of voice-activated technology in the home will ultimately revolutionise the way we command and control our computers. Google Home has enabled customers to shop with its partners, pay for the transaction and have goods delivered, all without the touch of a keyboard. How useful could this be if integrated into the office environment? Adding a voice to mundane tasks will enable employees to be more productive and free up time, allowing them to manage their workflow and daily tasks more efficiently.

Voice-based systems have grown more powerful with the use of artificial intelligence, machine learning, cloud-based computing power and highly optimised algorithms. Modern speech recognition systems, combined with text-to-speech voices that are almost indistinguishable from human speech, are ushering in a new era of voice-driven computing. As the technology improves and people become more accustomed to speaking to their devices, digital assistants will change how we interact with and think about technology.

There are many areas of business where this innovative technology will be most effective. Using voice control in customer service will transform the way businesses interact with their customers and improve the customer experience.

Many banks are in the process of introducing voice biometric technology, if they haven't done so already. Voice control enables quick access to telephone banking without the need to remember a password every time you call or log in. No need to wade through pages of bank account details or direct debits to make your online payments; instead, a digital assistant makes the payment for you.

Santander has trialled a system that allows customers to make transfers to existing payees on their account by using voice recognition. Customers access the process by speaking into an application on their mobile device.

Insurance companies are also realising the benefits voice control can bring to their customers. HDFC Insurance, an Indian firm, has announced the launch of its AI-enabled chatbot on Amazon's cloud-based voice service, Alexa. It aims to offer 24/7 customer assistance with instant solutions to customer queries, creating an enhanced customer service experience and allowing easy access to information about policies simply with the use of voice commands.

Voice could also help to streamline the claims process, where inefficiencies in claims documentation take up insurers' time and money. Claims processors spend as much as 50% of their day typing reports and documentation; speech recognition could rapidly reduce the time it takes to complete the process. US company Nuance claims that its Dragon Speech Recognition solution can enable agents to dictate documents three times faster than typing, with up to 99% accuracy, and to use simple voice commands to collapse the process further.

Retailers too are turning to this technology. With competition so tough on the high street, retailers are always looking for the ultimate customer experience, and many believe that voice control is a great way to achieve this. Imagine a mobile app where you could scan shopping items, then pay using a simple voice command or a selfie as you leave the store. No more queuing at the till.

Luxury department store Liberty is a big advocate of voice control and uses it for warehouse stock picking. Using headsets and a voice-controlled application connected to a central server, staff are told which products should be picked. For retailers, voice control is a hit on and off the shop floor.

So, how accurate is voice recognition? Accuracy rates are improving all the time, with researchers suggesting that some systems could be better than human transcription. In 1995 the error rate was 43%; today the major vendors claim an error rate of just 5%.
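For context, transcription accuracy is usually quoted as word error rate (WER): substitutions, deletions and insertions divided by the number of words in the reference transcript. A minimal sketch of the standard edit-distance calculation:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via word-level edit distance:
    (substitutions + deletions + insertions) / reference length."""
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One substituted word out of five -> 0.2
print(wer("transfer fifty pounds to savings",
          "transfer fifteen pounds to savings"))
```

A 5% word error rate still means roughly one word in twenty is wrong, which is why voice interfaces tend to keep commands short and confirm anything important.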

Security remains a concern for users, with verification still requiring two-factor authentication via mobile applications. However, as the technology develops there should be less need to confirm an individual's identity before commands can be completed.

As advances are made in artificial intelligence and machine learning, the sky will be the limit for Alexa and her voice control friends. In future, stopping what you are doing and typing in a command or search will start to feel a little strange and old-fashioned.

 

How long will it be before you can pick up your smartphone, talk to your bank and ask it to transfer £50 to a friend? Probably not as distant a prospect as you might think!

How is Alternative Data Giving Investment Managers the Edge?

Posted on : 29-03-2018 | By : richard.gale | In : Consumer behaviour, Data, data security, Finance, FinTech, Innovation



Alternative data (or 'alt-data') refers to data that is derived from non-traditional sources, covering a whole array of platforms such as social media, newsfeeds, satellite tracking and web traffic. There is a vast amount of data in cyberspace which, until recently, remained untouched. Here we shall look at the role of these unstructured data sets.

Information is the key to the success of any investment manager, and information that can give the investor the edge is by no means a new phenomenon. Traditional financial data, such as stock price history and fundamentals, has been the standard for determining the health of a stock. However, alternative data has the potential to reveal insights about a stock's health before traditional financial data does. This has major implications for investors.

If information is power, then unique information sourced from places not yet tapped gives those players the edge in a highly competitive market. Given that we're in what we like to call a data revolution, where nearly every move we make can be digitised, tracked and analysed, every company is now a data company. Everyone is both producing and consuming immense amounts of data in the race to make more money. People are well connected on social media platforms, and information is available to them in many different forms. Add geographical data into the mix and that's a lot of data about who's doing what, and why. Take Twitter: it is a great tool for showing what's happening in the world and what is being talked about. Being able to capture sentiment as well as data is a major advance in the world of data analytics.

Advanced analytical procedures can pull all this data together using machine learning and cognitive computing. Using this technology, we can take the unstructured data and transform it into usable data sets at rapid speed.
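As a toy illustration of the idea, the sketch below scores tweet sentiment against a hand-made lexicon. The words, weights and tweets are all invented; real systems use learned models over far larger corpora:

```python
# Scores are invented for illustration; production systems use learned
# models or curated financial-sentiment lexicons.
LEXICON = {
    "beat": 2, "upgrade": 2, "growth": 1, "strong": 1,
    "miss": -2, "downgrade": -2, "lawsuit": -2, "weak": -1,
}

def sentiment(post: str) -> float:
    """Average lexicon score of the words in a post:
    > 0 reads bullish, < 0 reads bearish."""
    scores = [LEXICON.get(w.strip(".,!?")) for w in post.lower().split()]
    hits = [s for s in scores if s is not None]
    return sum(hits) / len(hits) if hits else 0.0

tweets = [
    "Strong quarter, earnings beat across the board!",
    "Analysts downgrade after weak guidance and a lawsuit.",
]
for t in tweets:
    print(f"{sentiment(t):+.2f}  {t}")
```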

Hedge funds have been the early adopters, and investment managers, having now seen the light, are expected to spend $7bn on alternative data by 2020. All asset managers realise that this data can produce valuable insight and give them the edge in a highly competitive marketplace.

However, it could be said that if all investment managers research data in this way, they will all be put on the same footing and the competitive advantage will be lost. Commentators have suggested that, given the data pool is so vast and the combinations and permutations of analysis so complex, it is still highly likely that insights can be uncovered that have not been found by someone else. It all depends on the data scientist and where they decide to look. Far from creating a level playing field, where more readily available information simply leads to greater market efficiency, the impact of the information revolution is the opposite: it is creating hard-to-access pockets of long-term alpha generation for those players with the scale and resources to take advantage of it.

Which leads us to our next point. A huge amount of money and resource is required to research this data, which means only the strong will survive. A report last year by S&P found that 80% of asset managers plan to increase their investments in big data over the next 12 months. Only 6% of asset managers argue that it is not important. Where does this leave the 6%?

Leading hedge fund bosses have warned fund managers that they will not survive if they ignore the explosion of big data that is changing the way investors beat the markets. They are investing a lot of time and money to develop machine learning in areas of their business where humans can no longer keep up.

There is, however, one crucial issue which all investors should be aware of, and that is privacy. Do you know where that data originates from? Did that vendor have the right to sell the information in the first place? We have seen this illustrated over the last few weeks with the Facebook "data breach", where users' data reached Cambridge Analytica without the users' knowledge. This wiped $100bn off Facebook's value, so we can see the negative impact of using data without the owner's permission.

The key question in the use of alternative data ultimately is, does it add value? Perhaps too early to tell. Watch this space!

GDPR – The Countdown Conundrum

Posted on : 30-01-2018 | By : Tom Loxley | In : Cloud, compliance, Cyber Security, data security, Finance, GDPR, General News, Uncategorized



Crunch time is just around the corner and yet businesses are not prepared, but why?

"General Data Protection Regulation (GDPR): a new set of rules from the European Union which aims to simplify data protection laws and provide citizens across all member states with more control over their personal data."

It is estimated that just under half of businesses are unaware of incoming data protection laws that they will be subject to in just four months’ time, or how the new legislation affects information security.

Following a government survey, the lack of awareness about the upcoming introduction of GDPR has led the UK government to issue a warning to the public over businesses' shortfall in preparation for the change. According to the Digital, Culture, Media and Sport secretary Matt Hancock:

“These figures show many organisations still need to act to make sure the personal data they hold is secure and they are prepared for our Data Protection Bill”

GDPR comes into force on 25 May 2018, and potentially huge fines face those who are found to misuse, exploit, lose or otherwise mishandle personal data – as much as four per cent of global annual turnover. Organisations could also face penalties if they're hacked and attempt to hide what happened from customers.

There is also a very real and emerging risk of a huge loss of business. Third-party compliance and assurance is common practice now, and your clients will want to know that you are compliant with GDPR as part of doing business.

Yet regardless of the risks to reputation, the potential loss of business and the fines that come with non-compliance, the government survey found that many organisations aren't prepared – or aren't even aware – of the incoming legislation and how it will impact their information and data security strategy.

Not surprisingly, considering the ever-changing landscape of regulatory requirements they have had to adapt to, the finance and insurance sectors are said to have the highest awareness of the incoming legislation. Conversely, only one in four businesses in the construction sector is said to be aware of GDPR, and awareness in manufacturing is also poor. According to the report, just under half of businesses overall – including a third of charities – have subsequently made changes to their cybersecurity policies as a result of GDPR.

If your organisation is one of those unsure of its GDPR compliance strategy, areas to consider may include:

  • Creating or improving new cybersecurity procedures
  • Hiring new staff (or creating new roles and responsibilities for your additional staff)
  • Making concentrated efforts to update security software
  • Mapping your current data state: what you hold, where it's held and how it's stored (a minimal sketch of such a data map follows this list)
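To make that last point concrete, a data map can start life as a simple structured inventory of personal-data assets. The sketch below is purely illustrative; the fields and entries are invented, and real registers are usually kept in dedicated tooling:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class DataAsset:
    """One row of a personal-data inventory (fields are illustrative)."""
    name: str          # what the data is
    location: str      # system or store where it is held
    format: str        # how it is stored
    lawful_basis: str  # GDPR Article 6 basis for processing
    retention: str     # how long it is kept

inventory = [
    DataAsset("customer contact details", "CRM (EU datacentre)",
              "encrypted database", "contract",
              "6 years after account closure"),
    DataAsset("marketing email list", "email platform",
              "SaaS vendor store", "consent", "until consent withdrawn"),
]

print(json.dumps([asdict(a) for a in inventory], indent=2))
```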

In terms of getting help, this article is a great place to start: What is GDPR? Everything you need to know about the new general data protection regulations

However, if you’re worried your organisation is behind the curve there is still have time to ensure that you do everything to be GDPR compliant. The is an abundance of free guidance available from the National Cyber Security Centre and the on how to ensure your corporate cybersecurity policy is correct and up to date.

The ICO suggests that, rather than being fearful of GDPR, organisations should embrace it as a chance to improve how they do business. The Information Commissioner, Elizabeth Denham, stated:

“The GDPR offers a real opportunity to present themselves on the basis of how they respect the privacy of individuals, and over time this can play more of a role in consumer choice. Enhanced customer trust and more competitive advantage are just two of the benefits of getting it right”

If you require pragmatic advice on the implementation of GDPR data security and management, please feel free to contact us for a chat. We have assessed and guided a number of our clients through the maze of regulations, including GDPR. Please contact Thomas.Loxley@broadgateconsultants.com in the first instance.

 

Ripple Makes Waves

Posted on : 27-10-2017 | By : Tom Loxley | In : Bitcoin, Blockchain, Crytpocurrency, DLT, Finance, FinTech, Innovation


The big banks seem to have stopped resisting and begun embracing blockchain technology… well, at least when it comes to embedding the technology into their payments and transactions. Some time ago now, Ripple (the digital currency and distributed payment network) pitched its tent on the Swift network's front lawn and has been an increasingly irritating thorn in the payments provider's side. (Description of Ripple provided by coindesk.com.)

It seems that Swift can no longer deny the clear advantages of blockchain technology, or perhaps, in a savvy move, Swift has let Ripple do the hard work when it comes to the risk and testing involved in employing a new leading-edge technology. But it's more than just a new piece of code or tech that Ripple and the other big cryptocurrencies have brought to the financial services (FS) arena. It's more like a paradigm shift.

For years the financial institutions have been upgrading software and bolting on workarounds to try to keep up with new demands and an ever-evolving marketplace. This has resulted in a metaphorical Frankenstein of IT stacks and ageing, outdated technology. Some of the bigger institutions still have to wheel in the experts from the 80s to deal with their issues, because the tech is so old that today's IT workforce just doesn't get exposed to it. Talk about choke points or single points of failure.

Big names from the FS industry have come out swinging against the cryptocurrency boom, with the likes of Jamie Dimon of JPMorgan and Larry Fink of BlackRock voicing their issues with Bitcoin. Now I'm not a diehard fan of cryptocurrencies, and I see the obvious concern with what appears to be the massive bubble that is Bitcoin and some of the other more expensive digital currencies, but part of me hails them for the disruptive kick up the backside they seem to have given the FS industry.

Whatever you may personally think about Bitcoin and the early cryptocurrencies, their presence has created choice: a new way to transact value with some real benefits (transparency, security, more autonomy/control, speed and lower costs) using blockchain technology, or Distributed Ledger Technology (DLT) as it is becoming more widely known in FS circles. (As if renaming it and slightly tweaking the definition has somehow distanced them from admitting there is real value in something that was initially widely scoffed at.)
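The tamper-evidence at the heart of those benefits is easy to illustrate. Below is a minimal sketch of records chained together by hashes, where altering one block is immediately detectable; a real ledger adds consensus, digital signatures and peer-to-peer replication on top:

```python
import hashlib
import json
import time

def make_block(data: dict, prev_hash: str) -> dict:
    """Link a block to its predecessor by including the previous
    block's hash in the material that gets hashed."""
    block = {"data": data, "prev_hash": prev_hash, "ts": time.time()}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

chain = [make_block({"genesis": True}, prev_hash="0" * 64)]
chain.append(make_block({"from": "A", "to": "B", "amount": 100}, chain[-1]["hash"]))
chain.append(make_block({"from": "B", "to": "C", "amount": 40}, chain[-1]["hash"]))

# Tamper with the middle block, then validate every link.
chain[1]["data"]["amount"] = 1_000_000
for prev, block in zip(chain, chain[1:]):
    unhashed = {k: v for k, v in block.items() if k != "hash"}
    payload = json.dumps(unhashed, sort_keys=True).encode()
    ok = (hashlib.sha256(payload).hexdigest() == block["hash"]
          and block["prev_hash"] == prev["hash"])
    print("valid" if ok else "TAMPERED")  # the altered block fails validation
```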

By popularising blockchain technology, cryptocurrencies have forced the FS industry to take a serious look at its own technology, and this (in my opinion) seems to have busted the door open to other new FinTech ideas. In fact, it now seems that the bigger institutions are clawing to be first in the FinTech race, and woe betide the Innovation Executive who is responsible for passing on the next bit of groundbreaking software, especially if the competition picks it up. Innovation is the new name of the game.

Ripple has continued its assault on the Swift network by boasting that its distributed financial technology can help banks cut the time and cost of clearing transactions while allowing new types of high-volume, low-value global transactions. Ripple also hosted its conference, "Swell: The Future Is Here", over the same period in October and only a few miles away from Swift's Sibos event. It came out guns blazing, with Ben Bernanke and Tim Berners-Lee headlining, and has made it clear the time and location of the event were not a coincidence.

Ripple's tenacity seems to be paying off, with over 100 banks and FS organisations signing up to its network. Swift is hitting back with the third phase of its global payments initiative (SWIFT gpi), focusing on DLT. Many of the larger banks have joined forces with Swift to explore the DLT proof of concept, reporting initial success.

Swift is not the only large FS organisation exploring in this space. Indeed, despite Jamie Dimon’s opinion of Bitcoin, apparently he’s not opposed to the underlying technology. JPMorgan has used the Ethereum blockchain protocol as a base for Quorum, a DLT platform designed to support any application requiring high speed and high throughput processing of private transactions within a permissioned group of known participants.

Many other FS organisations are also exploring privately and collectively in consortiums to win the race and harness the power of the blockchain.

The irony here is that while we're all caught up in this whirlwind of disruption to the FS industry, at the end of it all what is the real impact a year or so down the line? Kelly, the insurance broker from Doncaster, South Yorkshire, makes the deposit payment on her new 4-bedroom semi-detached and says… hmmm… that was quicker than I remember a few years ago… and then gets on with her day. Or am I just being cynical?