Do you believe that your legacy systems are preventing digital transformation?

Posted on : 14-03-2019 | By : richard.gale | In : Data, Finance, FinTech, Innovation, Uncategorized


According to the results of our recent Broadgate Futures Survey, more than half of our clients agreed that digital transformation within their organisation was being hampered by legacy systems. Indeed, no one "strongly disagreed", confirming the extent of the problem.

Many comments suggested that this was not simply a case of budget constraints; the sheer size, scale and complexity of the transition had deterred organisations, who feared they were not adequately equipped to deliver successful change.

Legacy systems have a heritage going back many years, to the days of the mega mainframes of the 1970s and 80s. This was a time when banks were the masters of technological innovation. We saw the birth of ATMs, BACS and international card payments. It was an exciting time of intense modernisation. Many of the core systems that run the finance sector today are the same ones that were built back then. The only problem is that, although these systems were built to last, they were not built for change.

The new millennium brought another significant development with the introduction of the internet, an opportunity the banks could have seized to develop new, simpler, more versatile systems. Instead they adopted a different strategy and modified their existing systems; in their eyes there was no need to reinvent the wheel. They made additions and modifications as and when required. As a result, most financial organisations have evolved over the decades into complex networks, a myriad of applications and an overloaded IT infrastructure.

The Bank of England itself has recently been severely reprimanded by a Commons Select Committee review, which found the Bank to be drowning in out-of-date processes in dire need of modernisation. Its legacy systems are overly complicated and inefficient, and following the merger with the PRA in 2014 its IT estate comprises duplicated systems and extensive data overload.

Budget, as stated earlier, is not the only factor preventing digital transformation, although there is no doubt that these projects are expensive and extremely time consuming. The complexity of the task and the fear of failure are other reasons why companies hold on to their legacy systems. Better the devil you know! Think back to the TSB outage (there were a few…): systems were down for hours and customers were unable to access their accounts following a system upgrade. The incident ultimately led to huge fines from the Financial Conduct Authority and the resignation of the Chief Executive.

For most organisations abandoning their legacy systems is simply not an option so they need to find ways to update in order to facilitate the connection to digital platforms and plug into new technologies.

Many of our clients believe that it is not the legacy systems themselves which are the barrier, but the inability to access the vast amount of data stored in their infrastructure. It is the data that is the key to digital transformation, so accessing it is a crucial piece of the puzzle.

“It’s more about legacy architecture and lack of active management of data than specifically systems”

By finding a way to unlock the data inside these out-of-date systems, banks can decentralise their data, making it available to the new digital world.

With advancements such as the cloud and APIs, it is possible to sit an agility layer between existing legacy systems and newly adopted applications. HSBC has successfully adopted this approach, using an API strategy to expand its digital and mobile services without needing to replace its legacy systems.
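
To make the idea concrete, below is a minimal sketch of such an agility layer: a thin REST facade (written here in Python with Flask, purely for illustration) that exposes data held in a legacy system through a modern API without touching the system itself. The endpoint path, account structure and the stubbed legacy call are all assumptions for the example, not any bank's actual interface.

```python
from flask import Flask, jsonify

app = Flask(__name__)

def fetch_balance_from_legacy(account_id: str) -> dict:
    # Stand-in for a call into the legacy core system (e.g. a mainframe
    # transaction or stored procedure). Hard-coded here for illustration.
    return {"account_id": account_id, "balance": 1042.50, "currency": "GBP"}

@app.route("/api/v1/accounts/<account_id>/balance")
def get_balance(account_id):
    # The facade translates the legacy response into a clean JSON contract
    # that mobile apps and partner services can consume.
    return jsonify(fetch_balance_from_legacy(account_id))

if __name__ == "__main__":
    app.run(port=5000)
```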

Legacy systems are no longer the barrier to digital innovation that they once were. With some creative thinking and the adoption of new technologies legacy can continue to be part of your IT infrastructure in 2019!

https://www.finextra.com/newsarticle/33529/bank-of-england-slammed-over-outdated-it-and-culture

What will the IT department look like in the future?

Posted on : 29-01-2019 | By : john.vincent | In : Cloud, Data, General News, Innovation


We are going through a significant change in how technology services are delivered as we stride further into the latest phase of the Digital Revolution. The internet provided the starting pistol for this phase and now access to new technology, data and services is accelerating at breakneck speed.

More recently, the real enablers of a more agile, service-based technology have been virtualisation and orchestration technologies, which allow compute to be tapped on demand and remove the friction between software and hardware.

The impact of this cannot be overstated. Removing the need to manually configure and provision new compute environments was a huge step forward, and one which continues with developments in Infrastructure as Code ("IaC"), microservices and serverless technology.

However, whilst these technologies continually disrupt the market, the corresponding changes to overall operating models have, in our view, lagged behind (this is particularly true in larger organisations, which have struggled to shift from the old to the new).

If you take a peek into organisation structures today, they often still resemble those of the late 90s, where infrastructure capabilities were organised by specialism: data centre, storage, service management, application support and so on. There have been changes, particularly more recently with the shift to DevOps and continuous integration and delivery, but there is still a long way to go.

Our recent Technology Futures Survey provided a great insight into how our clients (290) are responding to the shifting technology services landscape.

“What will your IT department look like in 5-7 years’ time?”

There were no surprises in the large majority of respondents agreeing that the organisation will look different in the near future. The big shift is to a more service-focused, vendor-led technology model, with between 53% and 65% believing that this is the direction of travel.

One surprise was a relatively low consensus on the impact that Artificial Intelligence (“AI”) would have on management of live services, with only 10% saying it would be very likely. However, the providers of technology and services formed a smaller proportion of our respondents (28%) and naturally were more positive about the impact of AI.

The Broadgate view is that the changing shape of digital service delivery is challenging previous models and applying tension to organisations and providers alike. There are two main areas where we see this:

  1. With the shift to cloud based and on-demand services, the need for any provider, whether internal or external, has diminished
  2. Automation, AI and machine learning are developing new capabilities in self-managing technology services

We expect that the technology organisation will shift its focus to business products and to procuring the best-fit service providers. Central to this are AI and ML which, where truly intelligent (and not just marketing), can create a self-healing and dynamic compute capability with limited human intervention.

Cloud, machine learning and RPA will remove much of the need to manage and develop code

To really understand how the organisation model is shifting, we have to look at the impact that technology is having on the whole supply chain. We've long outsourced the delivery of services. However, if we look at the traditional service providers (IBM, DXC, TCS, Cognizant etc.) that in the first instance acted as brokers to these new digital technology innovations, we see that they are increasingly being disintermediated, with provisioning and management now directly in the hands of the consumer.

Companies like Microsoft, Google and Amazon have superior technical expertise and are continuing to expose it directly to the end consumer. Thus, the IT department needs to think less about how to build or procure from a third party, and more about how to build a framework of services which "knits together" a service model that can best meet their business needs with a layered, end-to-end approach. This fits perfectly with a more business-product-centric approach.

We don’t see an increase for in-house technology footprints with maybe the exception of truly data driven organisations or tech companies themselves.

In our results, the removal of cyber security issues was endorsed by 28% with a further 41% believing that this was a possible outcome. This represents a leap of faith given the current battle that organisations are undertaking to combat data breaches! Broadgate expect that organisations will increasingly shift the management of these security risks to third party providers, with telecommunication carriers also taking more responsibilities over time.

As the results suggest, the commercial and vendor management aspects of the IT department will become more important. This is often a skill which is absent in current companies, so a conscious strategy to develop capability is needed.

Organisations should update their operating model to reflect the changing shape of technology services; the close alignment of products and services to technology provision has never been as important as it is today.

Indeed, our view is that even if your model serves you well today, by 2022 it is likely to look fairly stale. This is because what your company currently offers to your customers is almost certain to change, which will require fundamental re-engineering across, and around, the entire IT stack.

The Challenges of Implementing Robotic Process Automation (RPA)

Posted on : 25-01-2019 | By : kerry.housley | In : Innovation, Uncategorized


We recently surveyed our clients on their views around the future of technology in the workplace and the changes that they think are likely to shape their future working environment. 

One of the areas identified by many clients as a major challenge was the adoption of RPA. We asked the question:

“Do you agree that RPA could improve the efficiency of your business?”

Around 65% of the respondents to our survey agreed that RPA could improve the efficiency of their business, but many commented that they were put off by the challenges that needed to be overcome in order for RPA deployment to be a success. 

“The challenge is being able to identify how and where RPA is best deployed, avoiding any detrimental disruption”

In this article we will discuss the challenges in more detail, and what steps can be taken to ensure a more successful outcome.

The benefits of RPA are:

  • Reduced operating costs
  • Increased productivity
  • Reduced employee workload, freeing up time for higher-value tasks
  • Get more done in less time! 

What Processes are Right for Automation? 

One of the challenges facing many organisations is deciding which processes are good candidates for automation and which to automate first. This line from Bill Gates offers some good advice:

“…automation applied to an inefficient operation will magnify the inefficiency”

It follows, therefore, that the first step in any automation journey is reviewing all of your business processes to ensure that they are running as efficiently as possible. You do not want to waste time, money and effort implementing a robot to carry out an inefficient process, which will reap no rewards at all.

Another challenge is choosing which process to automate first. In our experience, many clients have earmarked one of their most painful processes as process number one, in order to heal the pain. This fails more often than not, because the most painful process is often one of the most difficult to automate. Ideally, you want to pick a straightforward, highly repetitive process which will be easier to automate and whose simple results clearly show the benefits of automation. Buy-in at this stage from all stakeholders is critical if RPA is to be successfully deployed further in the organisation. Management needs to see the efficiency savings, and employees need to see how the robot can help them do their job more quickly and free up their time for more interesting work. Employee resistance and onboarding should not be underestimated. Keeping workers in the loop and reducing the perceived threat is crucial to your RPA success.
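
As a purely illustrative aid to that selection step, the short sketch below scores a handful of made-up candidate processes on volume, rules-based nature and exception rate. The criteria, weights and figures are assumptions, not an industry standard, but they capture the "straightforward, highly repetitive" test described above.

```python
# Illustrative scoring of candidate processes for automation. The criteria,
# weights and example figures are assumptions for the sketch only.
candidates = [
    {"name": "Invoice data entry",        "monthly_volume": 4000, "rule_based": True,  "exception_rate": 0.02},
    {"name": "Client onboarding checks",  "monthly_volume": 300,  "rule_based": False, "exception_rate": 0.25},
    {"name": "Daily report distribution", "monthly_volume": 20,   "rule_based": True,  "exception_rate": 0.01},
]

def automation_score(process):
    # Favour high-volume, rule-based work with few exceptions.
    volume_score = min(process["monthly_volume"] / 1000, 5)   # cap the volume effect
    rules_score = 3 if process["rule_based"] else 0
    exception_penalty = 10 * process["exception_rate"]
    return volume_score + rules_score - exception_penalty

for process in sorted(candidates, key=automation_score, reverse=True):
    print(f"{process['name']}: score {automation_score(process):.2f}")
```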

Collaboration is Key 

Successful RPA deployment is all about understanding and collaboration; if these are not approached carefully, the project can ultimately fail. In one sense, RPA is just like any other piece of software you will implement, but in another way it is not. Implementation involves close scrutiny of an employee's job, and the employee can feel threatened that the robot may take over and leave them redundant in the process.

IT and the business must work closely together to ensure that process accuracy, cost reduction and customer satisfaction benchmarks are met during implementation. RPA implementation success is both IT- and business-driven, with RPA governance sitting directly in the space between business and IT. Failure to maintain consistent communication between these two sides will mean that project governance is weak and that obstacles, such as potential integration issues between RPA and existing programs, cannot be dealt with effectively.

Don’t Underestimate Change 

Change management should not be underestimated: the implementation of RPA is a major change for an organisation, which needs to be planned for and carefully managed. Consistently working through the change management aspects is critical to making RPA successful. It is important to set realistic expectations and look at RPA from an enterprise perspective, focusing on the expected results and what will be delivered.

 RPA = Better Business Outcomes 

RPA is a valuable automation asset in a company's digital road map and can deliver great results if implemented well. However, RPA implementations have often not delivered the returns promised, impacted by the challenges we have discussed. Implementations that give significant consideration to the design phase and recognise the importance of broader change management will benefit from better business outcomes across the end-to-end process. Enterprises looking to embark on the RPA journey have the chance to take note, avoid the pitfalls and experience the success that RPA can bring.

Application Performance Management (APM)  – Monitor Every Critical Swipe, Tap and Click

Posted on : 30-08-2018 | By : richard.gale | In : App, Consumer behaviour, Innovation


Customers expect your business application to perform consistently and reliably at all times and for good reason. Many have built their own business systems based on the reliability of your application. This reliability target is your Service Level Objective (SLO), the measurable characteristics of a Service Level Agreement (SLA) between a service provider and its customer.

The SLO sets target values and expectations on how your service(s) will perform over time. It includes Service Level Indicators (SLIs)—quantitative measures of key aspects of the level of service—which may include measurements of availability, frequency, response time, quality, throughput and so on.

If your application goes down for longer than the SLO dictates, fair warning: All hell may break loose, and you may experience frantic pages from customers trying to figure out what’s going on. Furthermore, a breach to your SLO error budget—the rate at which service level objectives can be missed—could have serious financial implications as defined in the SLA.
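
As a rough worked example (the 99.9% target and 30-day window are assumed figures, not a recommendation), the error budget falls straight out of the SLO:

```python
# Worked example of an error budget; the figures are illustrative assumptions.
slo_target = 0.999                 # 99.9% availability agreed in the SLO
period_minutes = 30 * 24 * 60      # a 30-day month

error_budget_minutes = (1 - slo_target) * period_minutes
print(f"Allowed downtime this month: {error_budget_minutes:.1f} minutes")   # ~43.2

# If an incident has already consumed 30 minutes, only ~13 minutes remain;
# further risky deployments may need to wait until the budget resets.
remaining = error_budget_minutes - 30
print(f"Remaining error budget: {remaining:.1f} minutes")
```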

Developers are always eager to release new features and functionality. But these upgrades don’t always turn out as expected, and this can result in an SLO violation. Deployments and system upgrades will be needed, but anytime you make changes to applications, you introduce the potential for instability.

There are two companies currently leading the way in business service monitoring: New Relic and AppDynamics. AppDynamics has been named a Leader in the Gartner Magic Quadrant for APM for the last six years. This suite of application and business performance monitoring solutions ensures that every part of even the most complex, multi-cloud environment—from software to infrastructure to business outcomes—is highly visible, optimised and primed to drive growth. The need for such a monitoring tool is evidenced by the large number of Tier One banks which have taken it on board.

AppDynamics is a tool which enables you to track the numerous metrics for your SLI. You can choose which metrics to monitor, with additional tools that can deliver deeper insights into areas such as End User Monitoring, Business IQ and Browser Synthetic Monitoring.

The application can be broken down into the following components:

  • APM: Say your application relies heavily on APIs and automation. Start with a few APIs you want to monitor and ask, “Which of these APIs, if it fails, will impact my application or affect revenue?” These calls usually have a very demanding SLO.
  • End User Monitoring: EUM is the best way to truly understand the customer experience because it automatically captures key metrics, including end-user response time, network requests, crashes, errors, page load details and so on.
  • Business iQ: Monitoring your application is not just about reviewing performance data. Business iQ helps expose application performance from a business perspective, such as whether your app is generating revenue as forecast or experiencing a high abandon rate due to degraded performance.
  • Browser Synthetic Monitoring: While EUM shows the full user experience, sometimes it's hard to know if an issue is caused by the application or the user. Generating synthetic traffic will allow you to differentiate between the two (a minimal sketch of this idea follows below).
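
The sketch below shows the basic idea behind a synthetic check: call an endpoint on a schedule and record whether it met an availability and latency threshold. It is a hand-rolled illustration in Python, with an assumed placeholder URL and threshold; a product such as AppDynamics does far more, but the principle is the same.

```python
import requests

# Minimal synthetic check (illustrative). The endpoint and threshold below
# are placeholders, not a real service.
ENDPOINT = "https://example.com/api/health"
LATENCY_THRESHOLD_S = 0.5

def synthetic_check():
    try:
        response = requests.get(ENDPOINT, timeout=5)
        latency = response.elapsed.total_seconds()
        ok = response.status_code == 200 and latency < LATENCY_THRESHOLD_S
        return {"ok": ok, "status": response.status_code, "latency_s": latency}
    except requests.RequestException as exc:
        # Network failures count against availability too.
        return {"ok": False, "error": str(exc)}

print(synthetic_check())
```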

There is an SRE dashboard where you can view your KPIs:

  • SLO violation duration graph, response time (99th percentile) and load for your critical API calls
  • Error rate
  • Database response time
  • End-user response time (99th percentile)
  • Requests per minute
  • Availability
  • Session duration
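
Two of those KPIs, the 99th-percentile response time and the error rate, reduce to simple calculations over a request log. The sketch below uses an invented log purely to illustrate; in practice the figures come from the monitoring tool rather than a hand-written list.

```python
import statistics

# Illustrative request log: (response_time_ms, http_status) pairs.
requests_log = [(120, 200), (95, 200), (310, 200), (88, 500), (2300, 200),
                (140, 200), (105, 200), (99, 200), (87, 200), (1500, 504)]

latencies = sorted(t for t, _ in requests_log)
errors = sum(1 for _, status in requests_log if status >= 500)

# 99th-percentile response time and error rate, two of the KPIs listed above.
p99 = statistics.quantiles(latencies, n=100)[98]
error_rate = errors / len(requests_log)

print(f"p99 response time: {p99:.0f} ms")
print(f"error rate: {error_rate:.1%}")
```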

SLI, SLO, SLA and error budget aren’t just fancy terms. They’re critical to determining if your system is reliable, available or even useful to your users. You should be able to measure these metrics and tie them to your business objectives, as the ultimate goal of your application is to provide value to your customers.

Selecting a new “digitally focused” sourcing partner

Posted on : 18-07-2018 | By : john.vincent | In : Cloud, FinTech, Innovation, Uncategorized


It was interesting to see the recent figures this month from the ISG Index, showing that the traditional outsourcing market in EMEA has rebounded. Figures for the second quarter for commercial outsourcing contracts show a combined annual contract value (ACV) of €3.7Bn. This is up a significant 23% on 2017 and, for the traditional sourcing market, reverses a downward trend which had persisted for the previous four quarters.

This is an interesting change of direction, particularly against a backdrop of economic uncertainty around Brexit and the much "over-indulged" GDPR preparation. It seems that despite this, rather than hunkering down with a tin hat and stockpiling rations, companies in EMEA have invested in their technology service provision to support agile digital growth for the future. The global number also accelerated, up 31% to a record ACV of €9.9Bn.

Underpinning some of these figures has been a huge acceleration in the As-a-Service market. In the last 2 years the ACV attributed to SaaS and IaaS has almost doubled. This has been fairly consistent across all sectors.

So when selecting a sourcing partner, what should companies consider outside of the usual criteria including size, capability, cultural fit, industry experience, flexibility, cost and so on?

One aspect that is interesting from these figures is the influence that technologies such as cloud based services, automation (including AI) and robotic process automation (RPA) are having both now and in the years to come. Many organisations have used sourcing models to fix costs and benefit from labour arbitrage as a pass-through from suppliers. Indeed, this shift of labour ownership has fuelled incredible growth within some of the service providers. For example, Tata Consultancy Services (TCS) has grown from 45.7k employees in 2005 to 394k in March 2018.

However, having reached this heady number of staff, the technologies mentioned previously are threatening the model of some of these companies. As-a-Service providers such as Microsoft Azure and Amazon AWS now have platforms which are carving their way through technology service provision that would previously have been managed by human beings.

In the infrastructure space, commoditisation is well under way. Indeed, we predict that within 3 years the build, configure and manage skills in areas such as Windows and Linux platforms will rarely be in demand. DevOps models, and variants thereof, are moving at a rapid pace, with tools to support spinning up platforms on demand for application services now mainstream. Service providers often focus on their technology overlay "value add" in this space, with portals or orchestration products which can manage cloud services. However, the value of these is often questionable compared with direct access or commercial third-party products.

Secondly, as we’ve discussed here before, technology advances in RPA, machine learning and AI are transforming service provision. This of course is not just in terms of business applications but also in terms of the underpinning services. This is translating itself into areas such as self-service Bots which can be queried by end users to provide solutions and guidance, or self-learning AI processes which can predict potential system failures before they occur and take preventative actions.

These advances present a challenge to the workforce-focused outsource providers.

Given the factors above, and the market shift, it is important that companies take these into account when selecting a technology service provider. Questions to consider are:

  • What are their strategic relationships with cloud providers, not just at the "corporate" level, but do they have in-depth knowledge of the whole technology ecosystem at a low level?
  • Can they demonstrate skills in the orchestration and automation of platforms at an "infrastructure as code" level?
  • Do they have the capability to deliver process automation through techniques such as Bots, can they scale to enterprise, and where are their RPA alliances?
  • Does the potential partner have domain expertise, and is it open to partnership around new products and shared-reward/JV models?

The traditional sourcing engagement models are evolving, which has created new opportunities on both sides. Expect new entrants, without the technical debt and organisational overheads and with a more technology-solution focus, to disrupt the market.

The Opportunity for Intelligent Process Automation in KYC / AML

Posted on : 28-06-2018 | By : richard.gale | In : compliance, Data, Finance, FinTech, Innovation


Financial services firms have had a preoccupation with meeting the rules and regulations for fighting financial crime for the best part of the past decade. Ever since HSBC received sanction from both UK and US regulators in 2010, many other firms have also been caught short in failing to meet society's expectations in this space. There have been huge programmes of change and remediation, amounting to tens of billions of any currency you choose, to try to get Anti-Financial Crime (AFC) or Know Your Customer (KYC) / Anti-Money Laundering (AML) policies, risk methodologies, data sources, processes, organisation structures, systems and client populations into shape; at least to be able to meet the expectations of regulators, if not exactly stop financial crime.

The challenge for the industry is that Financial Crime is a massive and complex problem to solve. It is not just the detection and prevention of money laundering, but also needs to cover terrorist financing, bribery & corruption and tax evasion. Therefore, as the Banks, Asset Managers and Insurers have been doing, there is a need to focus upon all elements of the AFC regime, from education to process, and all the other activities in-between. Estimates as to the scale of the problem vary but the consensus is that somewhere between $3-5 trillion is introduced into the financial systems each year.

However, progress is being made. Harmonisation, clarity of industry standards and more consistency have come from the regulators with initiatives such as the 4th EU AML Directive. The importance of the controls is certainly better appreciated and understood within financial services firms and by their shareholders. Perhaps what has not yet progressed significantly are the processes of performing client due diligence and monitoring their subsequent activity. Most would argue that this is down to a number of factors, possibly the greatest challenge being the disparate and inconsistent nature of the data required to support these processes. Data needs to be sourced in many formats from country registries, stock exchanges, documents of incorporation, multiple media sources and so on. Still today, many firms have a predominantly manual process to achieve this, even when much of the data is available in digital form. Many still do not automatically ingest data into their workflows and have poorly defined processes for progressing onboarding or monitoring activities. This is for the regulations as they stand today; in the future this burden will further increase as firms will be expected to take all possible efforts to determine the integrity of their clients, i.e. by establishing linkages to bad actors through data sources such as social media and the dark web that are not evident in traditional sources such as company registries.

There have been several advances in recent years in technologies that have enormous potential for supporting the AFC cause. Data vendors have made big improvements in providing a broader and higher quality of data. Aggregation solutions such as Encompass offer services where the constituents of a corporate ownership structure can be assembled, and sanctions & PEP checks undertaken, in seconds rather than the current norm of multiple hours. This works well where the data is available from a reliable electronic source. However, it does not work where there are no, or only unreliable, sources of digital data, as is the case for Trusts or in many jurisdictions around the world. Here we quickly get back to the world of paper and PDFs, which still require human horsepower to review and decide upon.

Getting the information in the first instance can be very time consuming, with complex interactions between multiple parties (relationship managers, clients, lawyers, data vendors, compliance teams etc.) and multiple communications channels, i.e. voice, email and chat in its various forms. We also have the challenge of adverse media, where thousands of news stories are generated every day on the corporates and individuals that are the clients of financial firms. The news items can be positive or negative, but tens of thousands of people are consumed in reviewing, eliminating or investigating this mountain of data each day. The same challenges come with transaction monitoring, where individual firms can have thousands of 'hits' every day on 'unusual' payment patterns or 'questionable' beneficiaries. These also require review, repair, discounting or further investigation, the clear majority being false positives that can be readily discarded.

What is probably the most interesting opportunity for allowing the industry to see the wood for the trees in this data-heavy world is the maturing of Artificial Intelligence (AI) based, or 'Intelligent', solutions. The combination of Natural Language Processing (NLP) with Machine Learning can help the human find the needles in the haystack, or make sense of unstructured data that would ordinarily take much time to read and record. AI on its own is not a solution, but combined with process management (workflow), digitised multi-channel communications and even robotics it can achieve significant advances. In summary, 'Intelligent' processing can address three of the main data challenges within the AFC regimes of financial institutions:

  1. Sourcing the right data – Where data is structured and digitally obtainable it can be readily harvested, but it needs to be integrated into the process flows to be compared, analysed, accepted or rejected as part of a review process. Here AI can be used to perform these comparisons, support analysis and look for patterns of common or disparate data. Where the data is unstructured, i.e. embedded in a paper document (email / PDF / doc etc.), NLP and machine learning can be used to extract the relevant data and turn the unstructured into structured form for onward processing
  2. Filtering – With both transaction monitoring and adverse media reviews there is a tsunami of data and events presented to Compliance and Operations teams for sifting, reviewing, rejecting or further investigation. The use of AI can be extremely effective at performing this sifting and presenting back only relevant results to users (a small illustrative sketch follows this list). Done correctly this can reduce the burden by 90+%, but perhaps more importantly it need never miss or overlook a case, providing reassurance that relevant data is being captured
  3. Intelligent workflows – Processes can be fully automated where simple decision making is supported by AI, removing the need for manual intervention in many tasks and leaving the human to add value at the complex end of problem solving
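
As a purely illustrative sketch of the filtering idea (item 2 above), the snippet below trains a tiny text classifier to separate likely adverse-media headlines from noise. The headlines, labels and choice of library (scikit-learn) are assumptions for the example; a production system would use far richer NLP models and thousands of reviewed training cases.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny, made-up training set: headlines labelled as relevant adverse media (1)
# or noise (0). Purely illustrative.
headlines = [
    "Director charged with money laundering offences",
    "Firm fined for sanctions breaches in overseas unit",
    "Company announces record quarterly revenue",
    "CEO to speak at industry technology conference",
    "Executive investigated over bribery allegations",
    "Retailer opens flagship store in city centre",
]
labels = [1, 1, 0, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(headlines, labels)

# New items are scored and only likely-relevant ones go to a human reviewer.
incoming = [
    "Regulator probes bank over suspicious transactions",
    "Local firm sponsors charity fun run",
]
for item, flag in zip(incoming, model.predict(incoming)):
    print(f"{'REVIEW' if flag else 'discard'}: {item}")
```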

Solutions are now emerging in the industry, such as OPSMATiX, one of the first Intelligent Process Automation (IPA) solutions. Devised by a group of industry business experts, it is a set of technologies that combine to make sense of data across different communication channels, uses AI to turn the unstructured data into structured form, and applies robust workflows to optimally manage the resolution of cases, exceptions and issues. The data vendors, and solution vendors such as Encompass, are also embracing AI techniques and technologies to effectively create 'smart filters' that can scour through thousands, if not millions, of pieces of news and other media to discover, or discount, information of interest. This can be achieved in a tiny fraction of the time, and therefore cost, and more importantly with far better accuracy than a human can achieve. The outcome will be to liberate the human from the process: firms can either choose to reduce the costs of their operations or use people more effectively to investigate and analyse those events, items of information and clients that may be of genuine cause for concern, rather than dealing with the noise.

Only once the process has been made significantly more efficient, and the data brought under control, can financial firms really start to address the insidious business of financial crime. Currently all the effort is still going into meeting the regulations, and not society's actual demand, which is to combat this global menace. Intelligent processing should unlock this capability.

 

Guest Author: David Deane, Managing Partner of FIMATIX and CEO of OPSMATiX. David has had a long and illustrious career in Operations and Technology global leadership with wholesale banks and wealth managers. Before creating FIMATIX and OPSMATiX, he was most recently the Global Head of KYC / AML Operations for a Tier 1 wholesale bank.

david.deane@fimatix.com

Welcoming Robots to the Team

Posted on : 30-05-2018 | By : richard.gale | In : Finance, FinTech, Innovation


Research suggests that the adoption of Robotic Process Automation (RPA) and AI technologies is set to double by 2019. This marks a fundamental change in how organisations work, and the potential impact on employees should not be underestimated.

For many years we have seen robots on the factory floor, where manual processes have been replaced by automation. This has drastically changed the nature of manufacturing and has inevitably led to a reduction in these workforces. It is understandable, therefore, that we can hear the trembling voices of city workers shouting, "the robots are coming!"

Robotic software should not be thought of as the enemy but rather as a friendly addition to the IT family. A different approach is needed. If you were replacing an Excel spreadsheet with a software program, an employee would see this as an advantage, as it makes their job quicker and easier to do, and would therefore welcome the change. Looking at RPA in the same way will change how employees view its implementation and how they feel about it.

There is no doubt that in some cases RPA is intended as a cost saver but organisations that see RPA as simply a cost saving solution will reap the least rewards. For many companies who have already completed successful RPA programmes, the number one priority has been to eliminate repetitive work that employees didn’t want or need to do. Approaching an RPA project in a carefully thought out and strategic manner will provide results that show that RPA and employees can work together.

Successful transformation using RPA relies on an often-used but very relevant phrase: "it's all about the people, process and technology". You need all three in the equation. It is undeniable that automation is a disruptive technology which will affect employees' outlook and the way they work. Change management is key in managing these expectations. If robots are to be a part of your organisation, then your employees must be prepared and included.

Perhaps it's time to demystify RPA and see it for what it really is: just another piece of software! Automation is about making what you do easier to execute, with fewer mistakes and greater flexibility. It is important to demonstrate to your staff that RPA is part of a much wider strategic plan of growth and new opportunities.

It is vital to communicate with staff at every level, explaining the purpose of RPA and what it will mean for them. Ensure everyone understands the implications and the benefits of the transition to automation. Even though activities and relationships within an organisation may change, this does not necessarily mean a change for the worse.

Employees must be involved from the start of the process. Those individuals who have previously performed the tasks to be automated will be your subject matter experts. You will need to train several existing employees in RPA to manage the process going forward. Building an RPA team from current employees will ensure that you have their buy-in, which is crucial if the implementation is to be a success.

With any new software, training is often an afterthought. In the case of RPA, training is more important than ever, ensuring that the robots and employees understand each other and can work efficiently together. Training RPA experts internally will result in a value-added proposition for the future when it comes to maintaining or scaling your solution.

When analysing the initial RPA requirements, a great deal of thought must be given to the employees whose tasks are being replaced and where their skills can be effectively redeployed. Employee engagement increases when personnel feel that their contribution to the organisation is meaningful and widespread.

Consultation and collaboration throughout the entire process will help to ensure a smoother transition where everyone can feel the benefits. Following a successful RPA implementation, share the results with everyone in your organisation. Share the outcomes and what you have learnt, and highlight those employees and teams that have helped along the way.

The robots are coming! They are here to help and at your service!

AI Evolution: Survival of the Smartest

Posted on : 21-05-2018 | By : richard.gale | In : Innovation, Predictions


Artificial intelligence is getting very good at identifying things: let it analyse a million pictures, and it can tell with amazing accuracy which ones show a child crossing the road. But AI is hopeless at generating such images by itself. If it could do that, it would be able to create realistic but synthetic pictures depicting people in various settings, which a self-driving car could use to train itself without ever going out on the road.

The problem is, creating something entirely new requires imagination, and until now that has been a step too far for machine learning.

There is an emerging solution, first conceived by Ian Goodfellow during an academic argument in a bar in 2014. The approach, known as a generative adversarial network, or "GAN", takes two neural networks (the simplified mathematical models of the human brain that underpin most modern machine learning) and pits them against each other to identify flaws and gaps in each other's thought model.

Both networks are trained on the same data set. One, known as the generator, is tasked with creating variations on images it has already seen, perhaps a picture of a pedestrian with an extra arm. The second, known as the discriminator, is asked to identify whether the example it sees is like the images it has been trained on or a fake produced by the generator: basically, is that three-armed person likely to be real?

Over time, the generator can become so good at producing images that the discriminator can’t spot fakes. Essentially, the generator has been taught to recognize, and then create, realistic-looking images of pedestrians.
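
For readers curious about the mechanics, here is a heavily simplified, PyTorch-style training-loop sketch of that generator-versus-discriminator game. The layer sizes, data shape and hyper-parameters are illustrative assumptions only, and this is not the architecture behind any of the published results mentioned below.

```python
import torch
import torch.nn as nn

latent_dim, data_dim = 64, 784  # e.g. flattened 28x28 images (assumed shape)

generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, data_dim), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(data_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),  # real-vs-fake logit
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

def train_step(real_batch):
    batch = real_batch.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1) Train the discriminator to tell real examples from generated ones.
    noise = torch.randn(batch, latent_dim)
    fake_batch = generator(noise).detach()
    d_loss = loss_fn(discriminator(real_batch), real_labels) + \
             loss_fn(discriminator(fake_batch), fake_labels)
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # 2) Train the generator to fool the discriminator.
    noise = torch.randn(batch, latent_dim)
    g_loss = loss_fn(discriminator(generator(noise)), real_labels)
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```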

The technology has become one of the most promising advances in AI in the past decade, able to help machines produce results that fool even humans.

GANs have been put to use creating realistic-sounding speech and photorealistic fake imagery. In one compelling example, researchers from chipmaker Nvidia primed a GAN with celebrity photographs to create hundreds of credible faces of people who don't exist. Another research group made not-unconvincing fake paintings that look like the works of van Gogh. Pushed further, GANs can reimagine images in different ways, making a sunny road appear snowy, or turning horses into zebras.

The results aren’t always perfect: GANs can conjure up bicycles with two sets of handlebars, say, or faces with eyebrows in the wrong place. But because the images and sounds are often startlingly realistic, some experts believe there’s a sense in which GANs are beginning to understand the underlying structure of the world they see and hear. And that means AI may gain, along with a sense of imagination, a more independent ability to make sense of what it sees in the world. 

This approach is starting to provide programmed machines with something along the lines of imagination. This, in turn, will make them less reliant on human help to differentiate. It will also help blur the lines between what is real and what is fake. And in an age where we are already plagued by 'fake news' and doctored pictures, are we ready for seemingly real but constructed images and voices?

OK Google, Alexa, Hey Siri – The Rise of Voice Control Technology

Posted on : 30-04-2018 | By : kerry.housley | In : Consumer behaviour, Finance, FinTech, Innovation, Predictions


OK Google, Alexa, Hey Siri… all too familiar phrases around the home now, but it was not that long ago that we did not know what a 'smart phone' was! Today most people could not live without one. Imagine not being able to check your email, instant message friends or watch a movie whilst on the move. How long will it be before we no longer need a keyboard and talking to your computer is the norm?

The development of voice-activated technology in the home will ultimately revolutionise the way we command and control our computers. Google Home has enabled customers to shop with its partners, pay for the transaction and have goods delivered, all without the touch of a keyboard. How useful could this be if integrated into the office environment? Adding a voice to mundane tasks will enable employees to be more productive and free up time, allowing them to manage their workflow and daily tasks more efficiently.

Voice-based systems have grown more powerful with the use of artificial intelligence, machine learning, cloud-based computing power and highly optimised algorithms. Modern speech recognition systems, combined with text-to-speech voices that are almost indistinguishable from human speech, are ushering in a new era of voice-driven computing. As the technology improves and people become more accustomed to speaking to their devices, digital assistants will change how we interact with and think about technology.
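
As a small, hedged illustration of how accessible the basic speech-to-text step has become, the snippet below uses the open-source Python SpeechRecognition package to turn a spoken phrase into text. It assumes a microphone, the PyAudio dependency and network access to a free web speech service; commercial assistants layer intent recognition and dialogue management on top of this step.

```python
import speech_recognition as sr  # the third-party "SpeechRecognition" package

recognizer = sr.Recognizer()

# Capture a short utterance from the default microphone and send it to a
# cloud speech-to-text service (microphone and network access assumed).
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)
    print("Say something...")
    audio = recognizer.listen(source)

try:
    text = recognizer.recognize_google(audio)  # free Google Web Speech API
    print("You said:", text)
except sr.UnknownValueError:
    print("Speech was not understood")
except sr.RequestError as exc:
    print("Service unavailable:", exc)
```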

There are many areas of business where this innovative technology will be most effective. Using voice control in customer service will transform the way businesses interact with their customers and improve the customer experience.

Many banks are in the process of introducing voice biometric technology, if they haven't done so already. Voice control enables quick access to telephone banking without the need to remember a password every time you call or log in. There is no need to wade through pages of bank account details or direct debits to make your online payments; instead, a digital assistant makes the payment for you.

Santander has trialled a system that allows customers to make transfers to existing payees on their account by using voice recognition. Customers access the process by speaking into an application on their mobile device.

Insurance companies are also realising the benefits voice control can bring to their customers. HDFC Insurance, an Indian firm, has announced the launch of its AI-enabled chatbot on Amazon's cloud-based voice service, Alexa. It aims to offer 24/7 customer assistance with instant solutions to customer queries, thereby creating an enhanced customer service experience and allowing customers easy access to information about their policies simply with the use of voice commands.

It could also help to streamline the claims process, where inefficiencies in claims documentation take up insurers' time and money. Claims processors spend as much as 50% of their day typing reports and documentation; speech recognition could rapidly reduce the time it takes to complete the process. US company Nuance claims that its Dragon speech recognition solution can enable agents to dictate documents three times faster than typing, with up to 99% accuracy, and that simple voice commands can collapse the process further.

Retailers too are turning to this technology. With competition so tough on the high street, retailers are always looking for the ultimate customer experience, and many believe that voice control is a great way to achieve this. Imagine a mobile app where you could scan shopping items, then pay using a simple voice command or a selfie as you leave the store. No more queuing at the till.

Luxury department store Liberty is a big advocate of voice control and uses it for warehouse stock picking. Using headsets, a voice-controlled application issues commands to a central server about which products should be picked. For retailers, voice control is a hit on and off the shop floor.

So, how accurate is voice recognition? Accuracy rates are improving all the time, with researchers commenting that some systems could be better than human transcription. In 1995 the error rate was 43%; today the major vendors claim an error rate of just 5%.

Security remains a major consideration, with verification currently requiring two-factor authentication on mobile applications. However, as the technology develops there should be less of a need to confirm an individual's identity before commands can be completed.

As advances are made in artificial intelligence and machine learning, the sky will be the limit for Alexa and her voice control friends. In future, stopping what you are doing and typing in a command or search will start to feel a little strange and old-fashioned.

 

How long will it be before you can pick up your smart phone, talk to your bank and ask it to transfer £50 to a friend? Probably not as distant a prospect as you might think!

How is Alternative Data Giving Investment Managers the Edge?

Posted on : 29-03-2018 | By : richard.gale | In : Consumer behaviour, Data, data security, Finance, FinTech, Innovation


Alternative data (or 'alt-data') refers to data derived from non-traditional sources, covering a whole array of platforms such as social media, newsfeeds, satellite tracking and web traffic. There is a vast amount of data in cyberspace which, until recently, remained untouched. Here we shall look at the role of these unstructured data sets.

Information is the key to the success of any investment manager, and information that can give the investor the edge is by no means a new phenomenon. Traditional financial data, such as stock price history and fundamentals, has been the standard for determining the health of a stock. However, alternative data has the potential to reveal insights about a stock's health before traditional financial data does. This has major implications for investors.

If information is power, then unique information sourced from places not yet mined gives those players the edge in a highly competitive market. Given that we're in what we like to call a data revolution, where nearly every move we make can be digitised, tracked and analysed, every company is now a data company. Everyone is both producing and consuming immense amounts of data in the race to make more money. People are well connected on social media platforms and information is available to them in many different forms. Add geographical data into the mix and that's a lot of data about who's doing what, and why. Take Twitter: it is a great tool for showing what's happening in the world and what is being talked about. Being able to capture sentiment as well as data is a major advance in the world of data analytics.

Advanced analytical procedures can pull all this data together using machine learning and cognitive computing. Using this technology, we can take the unstructured data and transform it into usable data sets at rapid speed.
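
A minimal sketch of the sentiment step, using NLTK's rule-based VADER analyser, shows how unstructured text can be reduced to a numeric signal that can sit alongside traditional data. The example posts and the fictional company are invented for illustration.

```python
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

# One-off download of the small VADER lexicon used for rule-based sentiment.
nltk.download("vader_lexicon", quiet=True)

analyzer = SentimentIntensityAnalyzer()

# Made-up example posts about a fictional company; real pipelines would stream
# thousands of posts and aggregate scores over time and by topic.
posts = [
    "Loving the new product line from AcmeCo, queues out the door!",
    "AcmeCo deliveries late again, third time this month. Terrible.",
    "AcmeCo results due tomorrow, analysts expect modest growth.",
]

for post in posts:
    score = analyzer.polarity_scores(post)["compound"]  # -1 (negative) to +1 (positive)
    print(f"{score:+.2f}  {post}")
```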

Hedge funds have been the early adopters, and investment managers, who have now seen the light, are expected to spend $7bn by 2020 on alternative data. All asset managers realise that this data can produce valuable insight and give them the edge in a highly competitive marketplace.

However, it could be said that if all investment managers research data in this way, then that will put them all on the same footing and the competitive advantage is lost. Commentators have suggested that, given the data pool is so vast and the combinations and permutations of analysis so complex, it is still highly likely that data can be uncovered which has not yet been found by someone else. It all depends on the data scientist and where they decide to look. Far from creating a level playing field, where more readily available information simply leads to greater market efficiency, the impact of the information revolution is the opposite. It is creating hard-to-access pockets of long-term alpha generation for those players with the scale and resources to take advantage of it.

Which leads us to our next point. A huge amount of money and resource is required to research this data, which will mean only the strong survive. A report last year by S&P found that 80% of asset managers plan to increase their investment in big data over the next 12 months; only 6% argue that it is not important. Where does this leave the 6%?

Leading hedge fund bosses have warned fund managers they will not survive if they ignore the explosion of big data that is changing the way investors beat the markets. They are investing a lot of time and money to develop machine learning in areas of their business where humans can no longer keep up.

There is, however, one crucial issue which all investors should be aware of, and that is privacy. Do you know where that data originates from? Did the vendor have the right to sell the information in the first place? We have seen this illustrated over the last few weeks with the Facebook "data breach", where users' data was obtained by Cambridge Analytica without the users' knowledge. This wiped around $100bn off Facebook's value, so we can see the negative impact of using data without the owner's permission.

The key question in the use of alternative data ultimately is, does it add value? Perhaps too early to tell. Watch this space!