Do you believe that your legacy systems are preventing digital transformation?

Posted on : 14-03-2019 | By : richard.gale | In : Data, Finance, FinTech, Innovation, Uncategorized


According to the results of our recent Broadgate Futures Survey, more than half of our clients agreed that digital transformation within their organisation was being hampered by legacy systems. Indeed, no one “strongly disagreed”, confirming the extent of the problem.

Many comments suggested that this was not simply a case of budget constraints; the sheer size, scale and complexity of the transition had deterred organisations, who feared they were not adequately equipped to deliver successful change.

Legacy systems have a heritage going back many years, to the days of the mega mainframes of the 70s and 80s. This was a time when banks were the masters of technological innovation: we saw the birth of ATMs, BACS and international card payments. It was an exciting time of intense modernisation. Many of the core systems that run the finance sector today are the same ones that were built back then. The only problem is that, although these systems were built to last, they were not built for change.

The new millennium brought another significant development with the introduction of the internet, an opportunity the banks could have seized to develop new, simpler, more versatile systems. Instead, they adopted a different strategy and modified their existing systems; in their eyes there was no need to reinvent the wheel. They made additions and modifications as and when required. As a result, most financial organisations have evolved over the decades into complex networks, a myriad of applications and an overloaded IT infrastructure.

The Bank of England itself has recently been severely reprimanded by a Commons Select Committee review, which found the Bank to be drowning in out-of-date processes in dire need of modernisation. Its legacy systems are overly complicated and inefficient; following the merger with the PRA in 2014, its IT estate comprises duplicated systems and extensive data overload.

Budget, as stated earlier, is not the only factor preventing digital transformation, although there is no doubt that these projects are expensive and extremely time consuming. The complexity of the task and the fear of failure are further reasons why companies hold on to their legacy systems. Better the devil you know! Think back to the TSB outage (there were a few…): systems were down for hours and customers were unable to access their accounts following a system upgrade. The incident ultimately led to huge fines from the Financial Conduct Authority and the resignation of the Chief Executive.

For most organisations, abandoning their legacy systems is simply not an option, so they need to find ways to modernise them in order to connect to digital platforms and plug into new technologies.

Many of our clients believe that it is not the legacy systems themselves which are the barrier, but the inability to access the vast amount of data stored within them. It is the data that is the key to digital transformation, so accessing it is a crucial piece of the puzzle.

“It’s more about legacy architecture and lack of active management of data than specifically systems”

By finding a way to unlock the data inside these out-of-date systems, banks can decentralise their data, making it available to the new digital world.

With the advent of the cloud and APIs, it is possible to sit an agility layer between the existing legacy systems and newly adopted applications. HSBC has successfully adopted this approach, using an API strategy to expand its digital and mobile services without needing to replace its legacy systems.
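
As a rough sketch of such an agility layer, consider a thin REST API wrapping a legacy lookup, here using Python and Flask. Every name and data value below is hypothetical, and a real deployment would sit behind proper security and an API gateway:

```python
# Minimal sketch of an "agility layer": a modern REST API fronting a
# legacy core system. All names and data shapes are illustrative.
from typing import Optional
from flask import Flask, abort, jsonify

app = Flask(__name__)

def legacy_lookup(account_id: str) -> Optional[dict]:
    """Stand-in for a call into the legacy estate, e.g. a mainframe
    transaction or a batch-file extract. Entirely hypothetical data."""
    records = {"1001": {"holder": "A N Other", "balance": 125.50}}
    return records.get(account_id)

@app.route("/accounts/<account_id>")
def get_account(account_id):
    record = legacy_lookup(account_id)
    if record is None:
        abort(404)  # account not known to the core system
    # Digital channels consume clean JSON; the legacy system is untouched.
    return jsonify(record)

if __name__ == "__main__":
    app.run(port=8080)
```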

Legacy systems are no longer the barrier to digital innovation that they once were. With some creative thinking and the adoption of new technologies, legacy can continue to be part of your IT infrastructure in 2019!

https://www.finextra.com/newsarticle/33529/bank-of-england-slammed-over-outdated-it-and-culture

Selecting a new “digitally focused” sourcing partner

Posted on : 18-07-2018 | By : john.vincent | In : Cloud, FinTech, Innovation, Uncategorized


It was interesting to see the recent figures this month from the ISG Index, showing that the traditional outsourcing market in EMEA has rebounded. Figures for the second quarter for commercial outsourcing contracts show a combined annual contract value (ACV) of €3.7Bn. This is up a significant 23% on 2017 and, for the traditional sourcing market, reverses a downward trend which had persisted for the previous four quarters.

This is an interesting change of direction, particularly against a backdrop of economic uncertainty around Brexit and the much “over indulged” GDPR preparation. It seems that, despite this, rather than hunkering down with a tin hat and stockpiling rations, companies in EMEA have invested in their technology service provision to support agile digital growth for the future. The global number also accelerated, up 31% to a record ACV of €9.9Bn.

Underpinning some of these figures has been a huge acceleration in the As-a-Service market. In the last 2 years the ACV attributed to SaaS and IaaS has almost doubled. This has been fairly consistent across all sectors.

So when selecting a sourcing partner, what should companies consider outside of the usual criteria including size, capability, cultural fit, industry experience, flexibility, cost and so on?

One aspect that is interesting from these figures is the influence that technologies such as cloud-based services, automation (including AI) and robotic process automation (RPA) are having, both now and in the years to come. Many organisations have used sourcing models to fix costs and benefit from labour arbitrage as a pass-through from suppliers. Indeed, this shift of labour ownership has fuelled incredible growth within some of the service providers. For example, Tata Consultancy Services (TCS) has grown from 45.7k employees in 2005 to 394k in March 2018.

However, having reached this heady number of staff, the technologies mentioned previously are threatening the model of some of these companies. As-a-Service providers such as Microsoft Azure and Amazon AWS now have platforms which are carving their way through technology service provision that previously would have been managed by human beings.

In the infrastructure space, commoditisation is well under way. Indeed, we predict that within 3 years the build, configure and manage skills in areas such as Windows and Linux platforms will rarely be in demand. DevOps models, and variants thereof, are moving at a rapid pace, with tools to spin up platforms on demand to support application services now mainstream. Service providers often focus on their technology overlay “value add” in this space, with portals or orchestration products which can manage cloud services. However, the value of these is often questionable compared with direct access or commercial 3rd party products.
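
To illustrate the declarative idea behind that commoditisation, here is a toy “infrastructure as code” reconciler in Python. Real tools such as Terraform apply the same pattern at scale; everything below (the estate spec, the action strings) is invented for illustration:

```python
# Toy illustration of declarative infrastructure as code: describe the
# desired estate as data, and let a reconciler work out what to change.

desired = {
    "web-1": {"os": "linux", "size": "small"},
    "web-2": {"os": "linux", "size": "small"},
    "db-1": {"os": "linux", "size": "large"},
}

current = {
    "web-1": {"os": "linux", "size": "small"},
    "old-batch": {"os": "windows", "size": "large"},
}

def reconcile(current: dict, desired: dict) -> list:
    """Return the actions needed to move the estate to the desired state."""
    actions = []
    for name, spec in desired.items():
        if name not in current:
            actions.append(f"create {name} ({spec['os']}, {spec['size']})")
        elif current[name] != spec:
            actions.append(f"replace {name}")
    for name in current:
        if name not in desired:
            actions.append(f"destroy {name}")
    return actions

for action in reconcile(current, desired):
    print(action)  # create web-2 ..., create db-1 ..., destroy old-batch
```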

Secondly, as we’ve discussed here before, technology advances in RPA, machine learning and AI are transforming service provision. This is not just in terms of business applications but also the underpinning services. It translates into areas such as self-service bots which can be queried by end users to provide solutions and guidance, or self-learning AI processes which can predict potential system failures before they occur and take preventative action.

These advances present a challenge to the workforce focused outsource providers.

Given the factors above, and the market shift, it is important that companies take these into account when selecting a technology service provider. Questions to consider are:

  • What are their strategic relationships with cloud providers, and not just at the “corporate” level; do they have in-depth knowledge of the whole technology ecosystem at a low level?
  • Can they demonstrate skills in the orchestration and automation of platforms at an “infrastructure as code” level?
  • Do they have the capability to deliver process automation through techniques such as bots, can they scale to enterprise, and where are their RPA alliances?
  • Does the potential partner have domain expertise, and is it open to partnership around new products and shared reward/JV models?

The traditional sourcing engagement models are evolving, which has created new opportunities on both sides. Expect new entrants, without the technical debt and organisational overheads and with a more technology-solution focus, to disrupt the market.

The Opportunity for Intelligent Process Automation in KYC / AML

Posted on : 28-06-2018 | By : richard.gale | In : compliance, Data, Finance, FinTech, Innovation


Financial services firms have had a preoccupation with meeting the rules and regulations for fighting Financial Crime for the best part of the past decade. Ever since HSBC received sanction from both UK and US regulators in 2010, many other firms have also been caught short in failing to meet society’s expectations in this space. There have been huge programmes of change and remediation, amounting to tens of billions of any currency you choose, to try to get Anti-Financial Crime (AFC) or Know Your Customer (KYC) / Anti-Money Laundering (AML) policies, risk methodologies, data sources, processes, organisation structures, systems and client populations into shape, at least to be able to meet the expectations of regulators, if not exactly stop financial crime.

The challenge for the industry is that Financial Crime is a massive and complex problem to solve. It is not just the detection and prevention of money laundering, but also needs to cover terrorist financing, bribery & corruption and tax evasion. Therefore, as the Banks, Asset Managers and Insurers have been doing, there is a need to focus upon all elements of the AFC regime, from education to process, and all the other activities in-between. Estimates as to the scale of the problem vary but the consensus is that somewhere between $3-5 trillion is introduced into the financial systems each year.

However, progress is being made. Harmonisation and clarity of industry standards, and more consistency, have come from the regulators with initiatives such as the 4th EU AML Directive. The importance of the controls is certainly better understood within Financial Services firms and by their shareholders. Perhaps what has not yet progressed significantly are the processes of performing client due diligence and monitoring of their subsequent activity. Most would argue that this is down to a number of factors, possibly the greatest challenge being the disparate and inconsistent nature of the data required to support these processes. Data needs to be sourced in many formats from country registries, stock exchanges, documents of incorporation, multiple media sources etc. Still today many firms have a predominantly manual process to achieve this, even when much of the data is available in digital form. Many still do not automatically ingest data into their workflows and have poorly defined processes to progress onboarding or monitoring activities. This is for the regulations as they stand today; in the future this burden will further increase as firms will be expected to take all possible efforts to determine the integrity of their clients, i.e. by establishing linkages to bad actors through other data sources, such as social media and the dark web, not evident in traditional sources such as company registries.

There have been several advances in recent years with technologies that have enormous potential for supporting the AFC cause. Data vendors have made big improvements in providing a broader and higher quality of data. Aggregation solutions such as Encompass offer services where the constituents of a corporate ownership structure can be assembled, and sanctions & PEP checks undertaken, in seconds rather than the current norm of multiple hours. This works well where the data is available from a reliable electronic source. However, it does not work where there are no, or only unreliable, sources of digital data, as is the case for Trusts or in many jurisdictions around the world. Here we quickly get back to the world of paper and PDFs, which still require human horsepower to review and decide upon.

Getting the information in the first instance can be very time consuming, with complex interactions between multiple parties (relationship managers, clients, lawyers, data vendors, compliance teams etc.) and multiple communications channels, i.e. voice, email and chat in its various forms. We also have the challenge of Adverse Media, where thousands of news stories are generated every day on the corporates and individuals that are the clients of financial firms. The news items can be positive or negative, but it takes tens of thousands of people to review, eliminate or investigate this mountain of data each day. The same challenges come with transaction monitoring, where individual firms can have thousands of ‘hits’ every day on ‘unusual’ payment patterns or ‘questionable’ beneficiaries. These also require review, repair, discounting or further investigation, and the clear majority are false positives that can be readily discarded.

What is probably the most interesting opportunity for allowing the industry to see the wood for the trees in this data-heavy world is the maturing of Artificial Intelligence (AI) based, or ‘Intelligent’, solutions. The combination of Natural Language Processing with Machine Learning can help the human find the needles in the haystack, or make sense of unstructured data that would ordinarily require much time to read and record. AI on its own is not a solution, but combined with process management (workflow), digitised multi-channel communications and even Robotics it can achieve significant advances. In summary, ‘Intelligent’ processing can address 3 of the main data challenges within the AFC regimes of financial institutions:

  1. Sourcing the right data – Where data is structured and digitally obtainable it can be readily harvested, but it needs to be integrated into the process flows to be compared, analysed, accepted or rejected as part of a review process. Here AI can be used to perform these comparisons, support analysis and look for patterns of common or disparate data. Where the data is unstructured, i.e. embedded in a paper document (email / PDF / doc etc.), NLP and Machine Learning can be used to extract the relevant data and turn the unstructured into structured form for onward processing.
  2. Filtering – With both transaction monitoring and adverse media reviews there is a tsunami of data and events presented to Compliance and Operations teams for sifting, reviewing, rejecting or further investigation. The use of AI can be extremely effective at performing this sifting and presenting back only relevant results to users (a minimal sketch follows this list). Done correctly, this can reduce the burden by 90+% but, perhaps more importantly, never miss or overlook a case, providing reassurance that relevant data is being captured.
  3. Intelligent workflows – Processes can be fully automated where simple decision making is supported by AI, removing the need for manual intervention in many tasks and leaving the human to provide value at the complex end of problem solving.
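
To make the filtering point concrete, here is a minimal, hypothetical sketch using scikit-learn: a classifier scores incoming news items and only escalates those above a risk threshold. The training snippets, labels and threshold are invented; a production system would be trained and calibrated on large volumes of labelled cases:

```python
# Toy adverse-media filter: a text classifier that escalates only the
# stories it scores as risky. All data and the 0.5 threshold are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

train_texts = [
    "Director charged with money laundering offences",
    "Company fined for sanctions breaches",
    "Firm reports record quarterly profits",
    "New CEO appointed at retail group",
]
train_labels = [1, 1, 0, 0]  # 1 = adverse, 0 = benign

vectoriser = TfidfVectorizer()
model = LogisticRegression().fit(vectoriser.fit_transform(train_texts), train_labels)

incoming = [
    "Regulator opens bribery investigation into supplier",
    "Group announces sponsorship of local charity event",
]
scores = model.predict_proba(vectoriser.transform(incoming))[:, 1]

# Only items above the risk threshold reach a human analyst.
for story, score in zip(incoming, scores):
    if score > 0.5:
        print(f"ESCALATE ({score:.2f}): {story}")
    else:
        print(f"discard  ({score:.2f}): {story}")
```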

Solutions are now emerging in the industry, such as OPSMATiX, one of the first Intelligent Process Automation (IPA) solutions. Devised by a group of industry business experts, it is a set of technologies that combine to make sense of data across different communication channels, uses AI to turn the unstructured data into structured form, and applies robust workflows to optimally manage the resolution of cases, exceptions and issues. The data vendors, and solution vendors such as Encompass, are also embracing AI techniques and technologies to effectively create ‘smart filters’ that can scour through thousands, if not millions, of pieces of news and other media to discover, or discount, information of interest. This can be achieved in a tiny fraction of the time, and therefore cost, and more importantly with far better accuracy than a human can achieve. The outcome will be to liberate the human from the process: firms can either choose to reduce the costs of their operations or use people more effectively to investigate and analyse those events, information and clients that may be of genuine concern, rather than deal with the noise.

Only once the process has been made significantly more efficient, and the data brought under control, can financial firms really start to address the insidious business of financial crime. Currently all the effort is still going into meeting the regulations, and not society’s actual demand, which is to combat this global menace. Intelligent processing should unlock this capability.

 

Guest Author: David Deane, Managing Partner of FIMATIX and CEO of OPSMATiX. David has had a long and illustrious career in Operations and Technology global leadership with Wholesale Banks and Wealth Managers. Before creating FIMATIX and OPSMATiX, he was most recently the Global Head of KYC / AML Operations for a Tier 1 Wholesale Bank.

david.deane@fimatix.com

Welcoming Robots to the Team

Posted on : 30-05-2018 | By : richard.gale | In : Finance, FinTech, Innovation


Research suggests that the adoption of Robotic Process Automation (RPA) and AI technologies is set to double by 2019. This marks a fundamental change in how organisations work, and the potential impact on employees should not be underestimated.

For many years we have seen robots on the factory floor, where manual processes have been replaced by automation. This has drastically changed the nature of manufacturing and has inevitably led to a reduction in these workforces. It is understandable, therefore, that we can hear the trembling voices of city workers shouting, “the robots are coming!”

Robotic software should not be thought of as the enemy but rather as a friendly addition to the IT family. A different approach is needed. If you were replacing an Excel spreadsheet with a software program, an employee would see this as an advantage, as it makes their job quicker and easier to do, and would therefore welcome the change. Looking at RPA in the same way will change the way employees view its implementation and how they feel about it.

There is no doubt that in some cases RPA is intended as a cost saver but organisations that see RPA as simply a cost saving solution will reap the least rewards. For many companies who have already completed successful RPA programmes, the number one priority has been to eliminate repetitive work that employees didn’t want or need to do. Approaching an RPA project in a carefully thought out and strategic manner will provide results that show that RPA and employees can work together.

Successful transformation using RPA relies on an often-used but very relevant phrase: “it’s all about the People, Process and Technology”. You need all three in the equation. It is undeniable that automation is a disruptive technology which will affect employees’ outlook and the way they work. Change management is key in managing these expectations. If robots are to be a part of your organisation, then your employees must be prepared and included.

Perhaps it’s time to demystify RPA and see it for what it really is: just another piece of software! Automation is about making what you do easier to execute, with fewer mistakes and greater flexibility. It is important to demonstrate to your staff that RPA is part of a much wider strategic plan of growth and new opportunities.

It is vital to communicate with staff at every level, explaining the purpose of RPA and what it will mean for them. Ensure everyone understands the implications and the benefits of the transition to automation. Even though activities and relationships within an organisation may change, this does not necessarily mean a change for the worse.

Employees must be involved from the start of the process. Those individuals who have previously performed the tasks to be automated will be your subject matter experts. You will need to train several existing employees in RPA to manage the process going forward. Building an RPA team from current employees will ensure that you have their buy-in, which is crucial if the implementation is to be a success.

With any new software, training is often an afterthought. In the case of RPA, training is more important than ever, ensuring that the robots and employees understand each other and can work efficiently together. Training RPA experts internally will result in a value-added proposition for the future when it comes to maintaining or scaling your solution.

When analysing the initial RPA requirements, a great deal of thought must be given to the employees who are being replaced and where their skills can be effectively redeployed. Employee engagement increases when personnel feel that their contribution to the organisation is meaningful and widespread.

Consultation and collaboration throughout the entire process will help to ensure a smoother transition where everyone can feel the benefits. Following a successful RPA implementation, share the results with everyone in your organisation. Share the outcomes and what you have learnt, and highlight those employees and teams that have helped along the way.

The robots are coming! They are here to help and at your service!

OK Google, Alexa, Hey Siri – The Rise of Voice Control Technology

Posted on : 30-04-2018 | By : kerry.housley | In : Consumer behaviour, Finance, FinTech, Innovation, Predictions


OK Google, Alexa, Hey Siri…. All too familiar phrases around the home now, but it was not that long ago that we did not know what a ‘smart phone’ was! Today most people could not live without one. Imagine not being able to check your email, instant message friends or watch a movie whilst on the move. How long will it be before we no longer need a keyboard and talking to our computers is the norm?

The development of voice-activated technology in the home will ultimately revolutionise the way we command and control our computers. Google Home has enabled customers to shop with its partners, pay for the transaction and have goods delivered, all without the touch of a keyboard. How useful could this be if integrated into the office environment? Adding a voice to mundane tasks will enable employees to be more productive, freeing up time to manage their workflow and daily tasks more efficiently.

Voice-based systems have grown more powerful with the use of artificial intelligence, machine learning, cloud-based computing power and highly optimised algorithms. Modern speech recognition systems, combined with pristine text-to-speech voices that are almost indistinguishable from human speech, are ushering in a new era of voice-driven computing. As the technology improves and people become more accustomed to speaking to their devices, digital assistants will change how we interact with and think about technology.

There are many areas of business where this innovative technology will be most effective. Using voice control in customer service will transform the way businesses interact with their customers and improve the customer experience.

Many banks are in the process, if they haven’t done so already, of introducing voice biometric technology. Voice control enables quick access to telephone banking without the need to remember a password every time you call or log in. No need to wade through pages of bank account details or direct debits to make your online payments; instead, a digital assistant makes the payment for you.

Santander has trialled a system that allows customers to make transfers to existing payees on their account by using voice recognition. Customers access the process by speaking into an application on their mobile device.

Insurance companies are also realising the benefits voice control can bring to their customers. HDFC Insurance, an Indian firm, has announced the launch of its AI-enabled chatbot on Amazon’s cloud-based voice service, Alexa. It aims to offer 24/7 customer assistance with instant solutions to customer queries, creating an enhanced customer service experience and allowing customers easy access to information about their policies simply by using voice commands.

It could also help to streamline the claims process where inefficiencies in claims documentation take up insurers’ time and money. Claims processors spend as much as 50% of their day typing reports and documentation; speech recognition could rapidly reduce the time it takes to complete the process. US company Nuance claims that their Dragon Speech Recognition Solution can enable agents to dictate documents three times faster than typing with up to 99% accuracy. They can use simple voice commands to collapse the process further.

Retailers too are turning to this technology. With competition so tough on the high street, retailers are always looking for the ultimate customer experience, and many believe that voice control is a great way to achieve this. Imagine a mobile app where you could scan shopping items, then pay using a simple voice command or a selfie as you leave the store. No more queuing at the till.

Luxury department store Liberty is a big advocate of voice control and uses it for warehouse stock picking. Using headsets and a voice-controlled application, commands are issued via a central server about which products should be picked. For retailers, voice control is a hit on and off the shop floor.

So, how accurate is voice recognition? Accuracy rates are improving all the time, with researchers commenting that some systems could now be better than human transcription. In 1995 the error rate was 43%; today the major vendors claim an error rate of just 5%.
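
For context, the “error rate” quoted for speech recognition is usually the word error rate (WER): the word-level edit distance (substitutions, insertions and deletions) between the system’s transcript and a human reference, divided by the length of the reference. A minimal sketch of the standard calculation:

```python
# Minimal word error rate (WER): Levenshtein edit distance over words,
# divided by the number of words in the reference transcript.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits to turn the first i ref words into the first j hyp words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

print(wer("pay fifty pounds to alex", "pay fifteen pounds to alex"))  # 0.2
```

On this measure, a 5% error rate means roughly one word in twenty is transcribed incorrectly.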

Security remains a major concern for users, with verification still requiring two-factor authentication on mobile applications. However, as the technology develops there should be less of a need to confirm an individual’s identity before commands can be completed.

As advances are made in artificial intelligence and machine learning, the sky will be the limit for Alexa and her voice control friends. In future, stopping what you are doing and typing in a command or search will start to feel a little strange and old-fashioned.

 

How long will it be before you can pick up your smartphone, talk to your bank and ask it to transfer £50 to a friend? Probably not as distant a prospect as you might think!

How is Alternative Data Giving Investment Managers the Edge?

Posted on : 29-03-2018 | By : richard.gale | In : Consumer behaviour, Data, data security, Finance, FinTech, Innovation


Alternative data (or ‘Alt-Data’) refers to data that is derived from non-traditional sources, covering a whole array of platforms such as social media, newsfeeds, satellite tracking and web traffic. There is a vast amount of data in cyberspace which, until recently, remained untouched. Here we shall look at the role of these unstructured data sets.

Information is the key to the success of any investment manager, and information that can give the investor the edge is by no means a new phenomenon. Traditional financial data, such as stock price history and fundamentals, has been the standard for determining the health of a stock. However, alternative data has the potential to reveal insights about a stock’s health before traditional financial data does. This has major implications for investors.

If information is power, then unique information sourced from places not yet sourced is giving those players the edge in a highly competitive market. Given that we’re in what we like to call a data revolution, where nearly every move we make can be digitised, tracked and analysed, every company is now a data company. Everyone is both producing and consuming immense amounts of data in the race to make more money. People are well connected on social media platforms and information is available to them in many different forms. Add geographical data into the mix and that’s a lot of data about who’s doing what and why. Take Twitter: it is a great tool for showing what’s happening in the world and what is being talked about. Being able to capture sentiment as well as data is a major advance in the world of data analytics.

Advanced analytical procedures can pull all this data together using machine learning and cognitive computing. Using this technology, we can take the unstructured data and transform it into usable data sets at rapid speed.
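
As a toy illustration of that transformation, the sketch below reduces raw social-media posts to a crude numeric sentiment signal of the kind a quantitative model might consume. The word lists and posts are invented, and production systems would use trained language models rather than keyword counts:

```python
# Toy sentiment scorer for social-media text: turn unstructured posts
# into a numeric signal. Word lists and example posts are invented.
POSITIVE = {"beat", "strong", "growth", "record", "upgrade"}
NEGATIVE = {"miss", "weak", "recall", "lawsuit", "downgrade"}

def sentiment(post: str) -> int:
    """Count positive minus negative keywords in one post."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

tweets = [
    "Record quarter and strong guidance from $XYZ!",
    "$XYZ hit with product recall and shareholder lawsuit.",
]

# Aggregate into a crude daily signal a quant model might consume.
signal = sum(sentiment(t) for t in tweets) / len(tweets)
print(signal)  # +2 and -2 here cancel out to 0.0
```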

Hedge funds have been the early adopters, and investment managers, having now seen the light, are expected to spend $7bn on alternative data by 2020. All asset managers realise that this data can produce valuable insight and give them the edge in a highly competitive marketplace.

However, it could be said that if all investment managers research data in this way, then that will put them all on the same footing and the competitive advantage is lost. Commentators have suggested that, given the data pool is so vast and the combinations and permutations of analysis so complex, it is still highly likely that data can be found that has not been uncovered by someone else. It all depends on the data scientist and where they decide to look. Far from creating a level playing field, where more readily available information simply leads to greater market efficiency, the impact of the information revolution is the opposite. It is creating hard-to-access pockets of long-term alpha generation for those players with the scale and resources to take advantage of it.

Which leads us to our next point. A huge amount of money and resource is required to research this data, and this will mean only the strong survive. A report last year by S&P found that 80% of asset managers plan to increase their investments in big data over the next 12 months. Only 6% of asset managers argue that it is not important. Where does this leave the 6%?

Leading hedge fund bosses have warned fund managers they will not survive if they ignore the explosion of big data that is changing the way investors beat the markets. They are investing a lot of time and money to develop machine learning in areas of their business where humans can no longer keep up.

There is, however, one crucial issue which all investors should be aware of, and that is privacy. Do you know where that data originates? Did the vendor have the right to sell the information in the first place? We have seen this illustrated over the last few weeks with the Facebook “data breach”, where Facebook sold on some of its users’ data to Cambridge Analytica without the users’ knowledge. This has wiped $100bn off Facebook’s value, so we can see the negative impact of using data without the owner’s permission.

The key question in the use of alternative data ultimately is, does it add value? Perhaps too early to tell. Watch this space!

Battle of the Algorithms: Quantum v Security

Posted on : 28-03-2018 | By : kerry.housley | In : Cyber Security, data security, FinTech, Innovation, Predictions


Like black holes, quantum computing was for many years nothing more than a theoretical possibility. It was something that physicists believed could exist, but it hadn’t yet been observed or invented.

Today, quantum computing is a proven technology with the potential to accelerate advances in all aspects of our lives; the scope is limitless. However, this very same computing power that can enhance our lives can also do a great deal of damage, as it touches many of the everyday tasks that we take for granted. Whether you’re sending money via PayPal or ordering goods online, you’re relying on security systems based on cryptography. Cryptography is a way of keeping these transactions safe from cyber criminals hoping to catch some of the online action (i.e. your money!). Modern cryptography relies on mathematical calculations so complex, using such large numbers, that attackers can’t crack them. Quantum could change this!

Cybersecurity systems rely on uncrackable encryption to protect information, but such encryption could be seriously at risk as quantum develops. The threat is serious enough that it’s caught the interest of the US agency National Institute of Standards and Technology (NIST). Whilst acknowledging that quantum computers could be 15 to 20 years away, NIST believes that we “must begin now to prepare our information security systems to be able to resist quantum computing.”

Many believe that quantum computers could rock the current security protocols that protect global financial markets and the inner workings of government. Quantum computers are so big and expensive that, outside of global technology companies and well-funded research universities, most will be owned and maintained by nation-states. Imagine the scenario where a nation-state intercepts the encrypted financial data that flows across the world and is able to read it as easily as you are reading this article. Rogue states may be able to leverage the power of quantum to attack the banking and financial systems at the heart of the western business centres.

The evolution of the quantum era could have significant consequences for cyber security, where we will see a new phase in the race between the defenders and attackers of our information. Cryptography will be the battlefield on which this war of the future is fought, and the contenders are already preparing for a confrontation that could take place in the coming years. The evolution of quantum computing will crack some cryptographic codes, but how serious is the threat?

In theory, a quantum computer would be able to break most of the current algorithms, especially those based on public keys. A quantum computer can factor at a much higher speed than a conventional one. A brute-force attack (testing all possible passwords at high speed until you get the right one) would be a piece of cake with a machine that boasts these characteristics.
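
To see why factoring speed matters, recall that RSA-style public keys are only safe because recovering the prime factors of a large modulus is classically infeasible: trial division takes on the order of √n steps, whereas Shor’s algorithm on a quantum computer runs in polynomial time. A toy sketch of the classical approach:

```python
# Toy illustration of why RSA-style keys resist classical attack:
# trial division needs on the order of sqrt(n) steps, which is
# astronomical for real ~2048-bit moduli. Shor's algorithm on a
# quantum computer would factor n in polynomial time instead.
import math

def factor(n: int):
    """Classical trial division: O(sqrt(n)) steps in the worst case."""
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("n is prime")

# A 12-bit toy modulus; each extra bit of key length roughly doubles
# sqrt(n), i.e. the classical work required.
n = 61 * 53
print(factor(n))  # (53, 61)
```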

However, on the other hand, with this paradigm shift in computing will also come the great hope for privacy. Quantum cryptography will make things very difficult for cybercriminals. While current encryption systems are secure because intruders who attempt to access information can only do so by solving complex problems, with quantum cryptography they would have to violate the laws of quantum mechanics, which, as of today, is impossible.

Despite these developments we don’t believe there is any cause for panic. As it currently stands the reality is that quantum computers are not going to break all encryption. Although they are exponentially more powerful than standard computers, they are awkward to use as algorithms must be written precisely or the answers they return cannot be read, so they are not easy to build and implement.

It is unlikely that hacktivists and cybercriminals could afford quantum computers in the foreseeable future. What we need to remember is that most attacks in today’s threat landscape target the user, where social engineering plays as large a part as, if not larger than, technical expertise. If a human can be persuaded to part with a secret in inappropriate circumstances, all the cryptography in the world will not help, quantum or not!

It is important that organisations understand the implications that quantum computing will have on their legacy systems, and take steps to be ready. At a minimum, that means retrofitting their networks, computers, and applications with encryption that can withstand a quantum attack.

Quantum computing presents both an unprecedented opportunity and a serious threat. We find ourselves in a pre-quantum era; we know it’s coming, but we don’t know when…

Are you ready for Y2Q (Years to Quantum)?

Will Robotic Process Automation be responsible for the next generation of technical debt?

Posted on : 28-03-2018 | By : kerry.housley | In : FinTech, Innovation, Predictions, Uncategorized


All hail the great Bill Gates and his immortal words:

“The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency.”

With the Robotic Process Automation (RPA) wave crashing down all about us, and as we all scramble around trying to catch a ride on its efficiency, cost-saving and performance-optimising goodness, we should take a minute to heed Mr Gates’ wise words and remember that poorly designed processes done more efficiently will still be ineffectual. In theory, you’re just getting better at doing things poorly.

Now before we go any further, we should state that we have no doubt about the many benefits of RPA and in our opinion RPA should be taken advantage of and utilised where appropriate.

Now with that said…

RPA lends itself very well to quick fixes and fast savings, which are very tempting to any organisation. However, there are many organisations with years of technical debt built up already through adding quick fixes to fundamental issues in their IT systems. For these organisations, the introduction of RPA (although very fruitful in the short term) will actually add more technological dependencies to the mix. This will increase their technical debt if not maintained effectively, and eventually it will become unsustainable and very costly.

RPA will increase dependencies on other systems, adding subtle complex levels of interoperability, and like any interdependent ecosystem, when one thing alters there is an (often unforeseen) knock-on effect in other areas.

An upgrade that causes a subtle change to a user interface will cause the RPA process to stop working, or worse, the process will keep working but do the wrong thing.
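
A minimal sketch of that failure mode, with invented “screens” standing in for the user interface: the bot below reads fields by column position, and when an upgrade inserts a new column it does not crash, it quietly acts on the wrong data:

```python
# Sketch of RPA brittleness: a bot that reads a screen-scraped table
# by column position. The "screens" below are invented stand-ins.
def pay_invoices(rows):
    for row in rows:
        supplier, amount = row[0], row[1]  # bot assumes amount is column 2
        print(f"paying {amount} to {supplier}")

old_screen = [["Acme Ltd", "250.00"]]
pay_invoices(old_screen)  # works as designed

# After an upgrade the UI inserts a "currency" column. The bot does not
# crash; it keeps running and quietly treats the currency as the amount.
new_screen = [["Acme Ltd", "GBP", "250.00"]]
pay_invoices(new_screen)  # "paying GBP to Acme Ltd" -- wrong, but no error
```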

Consider this; what happens when an RPA process that has been running for a few years needs updating or changing? Will you still have the inherent expert understanding of this particular process at the human level or has that expertise now been lost?

How will we get around these problems?  Well, as with most IT issues, an overworked and understaffed IT department will create a quick workaround to solve the problem, and then move on to the myriad of other technical issues that need their attention. Hey presto… technical debt.

So, what is the answer? Of course, we need to stay competitive and take advantage of this new blend of technologies. It just needs to be a considered decision; you need to go in with your eyes open and understand the mid- and long-term implications.

A big question surrounding RPA is who owns this new technology within organisations? Does it belong to the business side or the IT side and how involved should your CIO or CTO be?

It’s tempting to say that processes are designed by the business side, and because RPA is simply going to replace the human element of an already existing process, this can all be done by the business side; we don’t need to (or want to) involve the CIO in this decision. However, you wouldn’t hire a new employee into your organisation without HR being involved, and the same is true of introducing new tech into your systems. True, RPA is designed to sit outside/on top of your networks and systems, in which case it shouldn’t interfere with your existing network, but at the very least the CIO and IT department should have oversight of RPA being introduced into the organisation. They can then be aware of any issues that may occur as a result of any upgrades or changes to the existing system.

Our advice would be that organisations should initially only implement RPA measures that have been considered by both the CIO and the business side to be directly beneficial to the strategic goals of the company.

Following this, you can then perform a proper opportunity assessment to find the optimum portfolio of processes. Generally, low or medium complexity processes or sub-processes will be the best initial options for RPA, if your assessment shows that the Full Time Equivalent (FTE) savings are worth it, of course. Ultimately, you should be looking for the processes with the best return and simplest delivery.

A final point on software tools and vendors. Like most niche markets of trending technology, RPA is awash with companies offering various software tools. You may have heard of some of the bigger and more reputable names like UiPath and Blue Prism. It can be a minefield of offerings, so understanding your needs and selecting an appropriate vendor will be key to making the most of RPA. In order to combat the build-up of technical debt, tools provided by the vendor to enable maintenance and management of the RPA processes are essential.

For advice on how to begin to introduce RPA into your organisation, vendor selection or help conducting a RPA opportunity assessment, or for help reducing your technical debt please email Richard.gale@broadgateconsultants.com.

 

2017 – A great year for the hackers

Posted on : 29-12-2017 | By : Tom Loxley | In : Cloud, compliance, Cyber Security, Data, data security, FinTech, GDPR, Uncategorized


This year saw some of the biggest data breaches so far; we saw cover-ups exposed and ransoms reaching new highs.

Of course, it’s no secret that when it comes to cybersecurity this was a pretty bad year, and I’m certain that there are many CIOs, CISOs, CTOs and indeed CEOs wondering what 2018 has to offer from the hackers.

The 2018 threat landscape is sure to be full of yet more sophisticated security attacks. However, the big win for 2017 is that people have woken up to the threat: “not if, but when” has finally been acknowledged, and people are becoming as proactive and creative as the attackers in protecting their companies. The old adage of “offence is the best form of defence” still rings true.

With that in mind we’re going to look back at some of what 2017 had to offer, the past may not predict the future, but it certainly gives you a good place to start your planning for it.

So let’s take a look at some of the most high profile data breaches of 2017.

Equifax (you guessed it) – No doubt you’ll have heard of this breach, and because of its huge scale it’s very likely that if you weren’t directly affected yourself, you’ll know someone who was. This breach was, and still is, being highly publicised, and for good reason. A plethora of litigation and investigations followed the breach in an effort to deal with the colossal scale of personal information stolen. This includes over 240 individual class-action lawsuits, an investigation opened by the Federal Trade Commission, and more than 60 government investigations from U.S. state attorneys general, federal agencies and the British and Canadian governments. More recently, a rare 50-state class-action suit has been served on the company.

Here are some of the facts:

  • 145.5 million people potentially affected (a figure recently revised by Equifax, 2.5 million more than it initially reported).
  • The number of U.K. consumers affected is unknown; Equifax said it is still determining the extent of the breach for U.K. consumers.
  • 8,000 potential Canadian victims (recently revised down from 100,000).
  • High-profile senior leaders have left since the breach. Former CEO Richard Smith retired (Smith is reported to have banked a $90 million retirement golden handshake), and the chief information officer and chief security officer have also “left”.
  • There are an unknown number of internal investigations taking place against board members (including the chief financial officer and general counsel) for selling stock after the breach’s discovery, but before its public disclosure.
  • The breach lasted from mid-May through July.
  • The hackers accessed people’s names, Social Security numbers, birth dates, addresses and, in some instances, driver’s license numbers.
  • They also stole credit card numbers for about 209,000 people and dispute documents with personal identifying information for about 182,000 people.

Uber – The big story here wasn’t so much the actual breach, but the attempt to cover it up. The breach itself actually happened in 2016. The hackers stole the personal data of 57 million Uber customers, and Uber paid them $100,000 to cover it up. However, the incident wasn’t revealed to the public until this November, when the breach was made known by the new Uber CEO Dara Khosrowshahi.

Uber has felt the impact of the backlash for the cover-up globally and on varying scales: from the big guns in the US, where three senators introduced a bill that could make executives face jail time for knowingly covering up data breaches, right through to the city of York in the UK, which voted against renewing Uber’s licence on December 23 due to concerns about the data breach.

Deloitte – According to a report from the Guardian in September earlier this year, a Deloitte global email server was breached, giving the attackers access to emails to and from the company’s staff, not to mention customer information on some of the company’s most high-profile public and private sector clients. Although the breach was discovered in March 2017, it is thought that the hackers had been in the company’s systems since October or November 2016. During this period, the hackers could have had access to information such as usernames, passwords, IP addresses and architectural design diagrams. Deloitte confirmed the breach, saying that the hack had taken place through an admin account and that only a few clients were impacted by the attack.

Now, if I covered even half of the high-profile cyber-attack cases in detail, this article would look more like a novel. Plus, as much as I love to spend my time delighting you, my dear readers, it is Christmas, which means I have bad TV to watch, family arguments to take part in and copious amounts of calories (alcohol) to consume and feel guilty about for the next 3 months. So, with that in mind, let’s do a short recap of some of the other massive exploits and data breaches this past year:

  1. Wonga, the payday loan firm, suffered a data breach which may have affected up to 245,000 customers in the UK.
  2. WannaCry and Bad Rabbit: these massive ransomware attacks affected millions of computers around the world, including the NHS.
  3. The NSA was breached by a group called The Shadow Brokers. They stole and leaked around 100GB of confidential information and hacking tools.
  4. WikiLeaks Vault 7 leak: WikiLeaks exposed the CIA’s secret documentation and user guides for hacking tools targeting the Mac and Linux operating systems.
  5. Due to a vulnerability, Cloudflare unwittingly leaked customer data from Uber, OKCupid and 1Password.
  6. Bell Canada was threatened by hackers with the leak of 9 million customer records. When the company refused to pay, some of the information was published online.
  7. Other hacks include Verizon, Yahoo, Virgin America, Instagram… the list goes on.

So, all in all, not a great year. But looking on the bright side, if you weren’t on the wrong end of a cyber-attack this year, or even if you were, there are plenty of lessons that can be learnt from the attacks that took place, and some easy wins you can get by doing the basics right. We’ll be exploring some of these in our newsletter in 2018 and delving into the timelines of some of the more high-profile attacks to help our readers understand and deal with an attack if they’re ever unfortunate enough to be in that situation. But if you can’t wait that long and want some advice now, please feel free to get in touch anytime.

 

Ripple Makes Waves

Posted on : 27-10-2017 | By : Tom Loxley | In : Bitcoin, Blockchain, Crytpocurrency, DLT, Finance, FinTech, Innovation


The big banks seem to have stopped resisting and begun embracing blockchain technology…well, at least when it comes to embedding the technology into their payments and transactions. Some time ago now Ripple (the digital currency and distributed payment network) pitched its tent on the Swift network’s front lawn and has been an increasingly irritating thorn in the payments provider’s side. (Description of Ripple provided by coindesk.com.)

It seems that Swift can no longer deny the clear advantages of blockchain technology, or perhaps, in a savvy move, Swift has let Ripple do the hard work when it comes to the risk and testing involved in employing a new leading-edge technology. But it’s more than just a new piece of code or tech that Ripple and the other big cryptocurrencies have brought to the financial services (FS) arena. It’s more like a paradigm shift.

For years the financial institutions have been upgrading software and bolting on workarounds to try to keep up with new demands and an ever-evolving marketplace. This has resulted in a metaphorical Frankenstein’s monster of IT stacks and ageing, outdated technology. Some of the bigger institutions still have to wheel in the experts from the 80s to deal with their issues because the tech is so old that the IT workforce of today just don’t get exposed to it. Talk about choke points or single points of failure.

Big names from the FS industry have come out swinging against the cryptocurrencies boom with the likes of Jamie Dimon of JPMorgan and Larry Fink of Blackrock voicing their issues with Bitcoin. Now I’m not a diehard fan of cryptocurrencies, and I see the obvious concern with what appears to be the massive bubble that is Bitcoin and some of the other more expensive digital currencies, but there is part of me that hails them for the disruptive kick up the backside they seem to have given the FS industry.

Whatever you may personally think about Bitcoin and the early cryptocurrencies, their presence has created choice: a new way to transact value with some real benefits (transparency, security, more autonomy/control, speed and lower costs) using blockchain technology, or Distributed Ledger Technology (DLT) as it is becoming more widely known in FS circles. (As if renaming it and slightly tweaking the definition has somehow distanced them from admitting there is real value in something that was widely scoffed at initially.)

By popularising blockchain technology, cryptocurrencies have forced the FS industry to take a serious look at their technology and this (in my opinion) seems to have busted the door open to other new FinTech ideas. In fact, it now seems that the bigger institutions are clawing to be the first in the FinTech race and woe betide the Innovation Executive who is responsible for passing on the next bit of groundbreaking software, especially if the competition picks it up. Innovation is the new name of the game.

Ripple has continued its assault on the Swift network by boasting that its distributed financial technology can help banks cut the time and cost of clearing transactions and at the same time allowing new types of high-volume, low-value global transactions. Ripple also hosted their conference called “Swell: The Future Is Here” over the same period in October and only a few miles away from Swift’s Sibos event. They came out guns blazing with Ben Bernanke and Tim Berners-Lee headlining at their event and have made it clear the time and location of the event was not a coincidence.

Ripple’s tenacity seems to be paying off, with over 100 banks and FS organisations signing up to its network. Swift is hitting back with the 3rd phase of its global payment initiative (SWIFT gpi) focussing on DLT. Many of the larger banks have joined forces with Swift to explore the DLT Proof of Concept, reporting initial success.

Swift is not the only large FS organisation exploring in this space. Indeed, despite Jamie Dimon’s opinion of Bitcoin, apparently he’s not opposed to the underlying technology. JPMorgan has used the Ethereum blockchain protocol as a base for Quorum, a DLT platform designed to support any application requiring high speed and high throughput processing of private transactions within a permissioned group of known participants.

Many other FS organisations are also exploring privately and collectively in consortiums to win the race and harness the power of the blockchain.

The irony here is that while we’re all caught up in this whirlwind of disruption to the FS industry, at the end of it all, what is the real impact a year or so down the line? Kelly, the insurance broker from Doncaster, South Yorkshire, makes the deposit payment on her new 4-bedroom semi-detached and says… hmmm… that was quicker than I remember a few years ago… and then gets on with her day. Or am I just being cynical?