Are you able to access all the data across your organisation?

Posted on : 31-03-2019 | By : richard.gale | In : Data, Finance


For many years data has been the lifeblood of the organisation and more recently, the value of this commodity has been realised by many companies (see our previous article “Data is like oil”).

Advances in technology, processing power and analytics mean that companies can collect and process data in real time. Most businesses are sitting on vast amounts of data, and those that can harness it effectively can gain a much deeper understanding of their customers, better predict their behaviour and improve their customer experience.

Our survey revealed that whilst most companies understand the value of their data and the benefits it can bring, many clients expressed a level of frustration with the systems and processes that manage it. Some respondents did qualify that “most of the data” was available, whilst others admitted some was stranded.

 “Data is in legacy silos, our long-term goal is to provide access through a consistent data management framework”

The deficiencies in legacy systems that we also discuss in this newsletter are partly, though not wholly, responsible for this. It is a particular issue in financial services, where many organisations are running on old systems that are too complex and too expensive to replace. Critical company data is trapped in silos, disconnected and incompatible with the rest of the enterprise.

These silos present a huge challenge for many companies. Recalling a comment from one Chief Data Officer at a large institution:

“If I ask a question in more than one place, I usually get more than one answer!”

Data silos are expanding as companies collect too much data and hold onto it for longer than they need to. Big data has been a buzzword for a while now, but it is important that companies distinguish between big data and big bad data! The number of data sources is increasing all the time, so the issue must be addressed if the data is to be used effectively to return business value. Collecting a virtually unlimited amount of data has to be managed properly, to ensure that all data stored has a purpose and can be protected.

Shadow data further exacerbates the issue. This data is unverified, often inaccurate and out of date. Oversharing results in it being stored in places that are unknown and untraceable, creating yet more data silos hidden from the wider enterprise. Worse, this data is then treated as a valid source and used as input into other systems, which can ultimately lead to bad business decisions being made.

A robust data governance and management strategy is something whose importance cannot be overstated, particularly for those serious about the digital agenda and customer experience. This is also a topic where business and IT leadership must align on the product strategy and the underlying “data plumbing”. It is not just about systems, but about the organisation’s attitude to data and its importance in the life of every business process. Companies should implement a data management strategy which encompasses not only the internal platforms and governance, but also the presentation layer for business users, consumers and data insights.

Do you believe that your legacy systems are preventing digital transformation?

Posted on : 14-03-2019 | By : richard.gale | In : Data, Finance, FinTech, Innovation, Uncategorized



According to the results of our recent Broadgate Futures Survey, more than half of our clients agreed that digital transformation within their organisation was being hampered by legacy systems. Indeed, no one “strongly disagreed”, confirming the extent of the problem.

Many comments suggested that this was not simply a case of budget constraints; rather, the sheer size, scale and complexity of the transition had deterred organisations, fearful that they were not adequately equipped to deliver successful change.

Legacy systems have a heritage going back many years, to the days of the mega mainframes of the 1970s and 80s. This was a time when banks were the masters of technological innovation: we saw the birth of ATMs, BACS and international card payments. It was an exciting time of intense modernisation. Many of the core systems that run the finance sector today are the same ones that were built back then. The only problem is that, although these systems were built to last, they were not built for change.

The new millennium brought another significant development with the introduction of the internet, an opportunity the banks could have seized to develop new, simpler, more versatile systems. Instead, they adopted a different strategy and modified their existing systems; in their eyes there was no need to reinvent the wheel. They made additions and modifications as and when required. As a result, most financial organisations have evolved over the decades into complex networks, a myriad of applications and an overloaded IT infrastructure.

The Bank of England itself has recently been severely reprimanded by a Commons Select Committee review, which found the Bank to be drowning in out-of-date processes in dire need of modernisation. Its legacy systems are overly complicated and inefficient; following the merger with the PRA in 2014, its IT estate comprises duplicated systems and extensive data overload.

Budget, as stated earlier, is not the only factor preventing digital transformation, although there is no doubt that these projects are expensive and extremely time-consuming. The complexity of the task and the fear of failure are further reasons why companies hold on to their legacy systems. Better the devil you know! Think back to the TSB outage (there were a few…): systems were down for hours and customers were unable to access their accounts following a system upgrade. The incident ultimately led to an investigation by the Financial Conduct Authority and the resignation of the Chief Executive.

For most organisations, abandoning their legacy systems is simply not an option, so they need to find ways to update them in order to connect to digital platforms and plug into new technologies.

Many of our clients believe that it is not the legacy systems themselves which are the barrier, but rather the inability to access the vast amount of data stored in their infrastructure. It is the data that is the key to digital transformation, so accessing it is a crucial piece of the puzzle.

“It’s more about legacy architecture and lack of active management of data than specifically systems”

By finding a way to unlock the data inside these out-of-date systems, banks can decentralise their data, making it available to the new digital world.

With the advent of the cloud and APIs, it is possible to sit an agility layer between the existing legacy systems and newly adopted applications. HSBC has successfully adopted this approach, using an API strategy to expand its digital and mobile services without needing to replace its legacy systems.
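As a purely illustrative sketch (not HSBC’s actual design), the pattern looks something like this in Python: a thin Flask API exposes a clean, modern contract while delegating to a placeholder legacy lookup behind the scenes. The endpoint shape and the legacy interface are assumptions made for the example.

```python
# A minimal sketch of an "agility layer": a modern REST API fronting a
# legacy system. The endpoint shape and the legacy interface are
# illustrative assumptions, not any bank's actual design.
from flask import Flask, jsonify

app = Flask(__name__)

def fetch_account_from_legacy(account_id: str) -> dict:
    # Placeholder for the call into the legacy estate, e.g. a mainframe
    # transaction, a message-queue request or a batch-file lookup.
    return {"accountId": account_id, "balance": "1000.00", "currency": "GBP"}

@app.route("/api/v1/accounts/<account_id>")
def get_account(account_id: str):
    # Digital channels (mobile apps, partners) consume this clean JSON
    # contract and never touch the legacy system directly.
    return jsonify(fetch_account_from_legacy(account_id))

if __name__ == "__main__":
    app.run(port=8080)
```

The value of the layer is that the legacy call can later be re-pointed at a replacement system without the consumers of the API ever noticing.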

Legacy systems are no longer the barrier to digital innovation that they once were. With some creative thinking and the adoption of new technologies, legacy can continue to be part of your IT infrastructure in 2019!

https://www.finextra.com/newsarticle/33529/bank-of-england-slammed-over-outdated-it-and-culture

What will the IT department look like in the future?

Posted on : 29-01-2019 | By : john.vincent | In : Cloud, Data, General News, Innovation



We are going through a significant change in how technology services are delivered as we stride further into the latest phase of the Digital Revolution. The internet fired the starting pistol for this phase, and access to new technology, data and services is now accelerating at breakneck speed.

More recently, the real enablers of more agile, service-based technology have been virtualisation and orchestration, which allow compute to be tapped into on demand and remove the friction between software and hardware.

The impact of this cannot be overstated. The removal of the need to manually configure and provision new compute environments was a huge step forward, and one which continues with developments in Infrastructure as Code (“IaC”), microservices and serverless technology.
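To make the contrast with manual provisioning concrete, here is a hypothetical Python sketch using the AWS boto3 SDK to create compute entirely from code. The region, AMI ID and instance type are placeholder assumptions, and declarative IaC tools such as Terraform carry the same idea further.

```python
# Hypothetical sketch: provisioning a server with code instead of manual
# configuration. Region, AMI and instance type are invented placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "provisioned-by", "Value": "code"}],
    }],
)
print("Launched:", response["Instances"][0]["InstanceId"])
```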

However, whilst these technologies continually disrupt the market, the corresponding changes to overall operating models have, in our view, lagged behind (this is particularly true in larger organisations, which have struggled to shift from the old to the new).

If you take a peek into organisation structures today, they often still resemble those of the late 90s, where infrastructure capabilities were organised by specialism: data centre, storage, service management, application support and so on. There have been changes, most recently with the shift to DevOps and continuous integration and delivery, but there is still a long way to go.

Our recent Technology Futures Survey provided great insight into how our clients (290 respondents) are responding to the shifting technology services landscape.

“What will your IT department look like in 5-7 years’ time?”

There were no surprises in the large majority of respondents agreeing that the organisation would look different in the near future. The big shift is to a more service-focused, vendor-led technology model, with between 53% and 65% believing that this is the direction of travel.

One surprise was the relatively low consensus on the impact that Artificial Intelligence (“AI”) will have on the management of live services, with only 10% saying it was very likely. However, the providers of technology and services formed a smaller proportion of our respondents (28%) and were naturally more positive about the impact of AI.

The Broadgate view is that the changing shape of digital service delivery is challenging previous models and applying tension to organisations and providers alike. There are two main areas where we see this:

  1. With the shift to cloud based and on-demand services, the need for any provider, whether internal or external, has diminished
  2. Automation, AI and machine learning are developing new capabilities in self-managing technology services

We expect that the technology organisation will shift to focus more on business products and on procuring the best-fit service providers. Central to this are AI and ML which, where truly intelligent (and not just marketing), can create a self-healing, dynamic compute capability with limited human intervention.

Cloud, machine learning and RPA will remove much of the need to manage and develop code

To really understand how the organisational model is shifting, we have to look at the impact technology is having on the whole supply chain. We’ve long outsourced the delivery of services. However, if we look at the traditional service providers (IBM, DXC, TCS, Cognizant etc.) that initially acted as brokers to these new digital technology innovations, we see that they are increasingly being disintermediated, with provisioning and management now directly in the hands of the consumer.

Companies like Microsoft, Google and Amazon have superior technical expertise, and they are continuing to expose it directly to the end consumer. Thus, the IT department needs to think less about whether to build or procure from a third party, and more about how to build a framework of services which “knits together” a service model that can best meet business needs with a layered, end-to-end approach. This fits perfectly with a more business-product-centric approach.

We don’t see in-house technology footprints increasing, with the possible exception of truly data-driven organisations or tech companies themselves.

In our results, the removal of cyber security issues was endorsed by 28% with a further 41% believing that this was a possible outcome. This represents a leap of faith given the current battle that organisations are undertaking to combat data breaches! Broadgate expect that organisations will increasingly shift the management of these security risks to third party providers, with telecommunication carriers also taking more responsibilities over time.

As the results suggest, the commercial and vendor management aspects of the IT department will become more important. This is a skill which many companies currently lack, so a conscious strategy to develop the capability is needed.

Organisations should update their operating model to reflect the changing shape of technology services; the close alignment of products and services to technology provision has never been more important than it is today.

Indeed, our view is that even if your model serves you well today, by 2022 it is likely to look fairly stale. This is because what your company currently offers to your customers is almost certain to change, which will require fundamental re-engineering across, and around, the entire IT stack.

GDPR – A Never Ending Story

Posted on : 28-06-2018 | By : richard.gale | In : compliance, Consumer behaviour, Cyber Security, Data, data security, GDPR



For most of us, the run up to the implementation of GDPR meant that we were overwhelmed by privacy notices and emails begging us to sign up to mailing lists. A month on, what is the reality of this regulation and what does it mean for businesses and their clients?

There was much agonising by companies who were racing to comply, concerned that they would not meet the deadline and worried what the impact of the new rules would mean for their business.

If we look at the regulation from a simple, practical level all GDPR has done is to make sure that people are aware of what data they hand over and can control how it’s used. That should not be something new.

Understanding where data is and how it is managed correctly is not only fundamental to regulatory compliance and customer trust, but also to providing the highly personalised and predictive services that customers crave. Therefore, the requirements of regulation are by no means at odds with the strategies of data-driven finance firms, but in fact are perfectly in tune.

Having this knowledge is great for business, as clients will experience a more transparent relationship, and with transparency comes trust. Businesses may end up with a smaller customer base to market to, but those customers will be more willing and engaged, which should lead to greater sales conversion.

The businesses that will see a negative impact are those that collect data by tricking people with dubious tactics. The winners will be the companies that collect data in open and honest ways, then use that data to clearly benefit customers. Those companies will deliver good experiences that foster loyalty; loyalty drives consumers to share more data; and better data allows for even better, more relevant customer experiences.

If we look at the fundamentals of financial services, clients are often handing over their life savings, which they entrust to companies to nurture and grow. Regardless of GDPR, businesses shouldn’t rely on regulation to keep them in check but should instead always have customer trust at the top of their agenda. No trust means no business.

The key consideration is what you can offer that will inspire individuals to want to share their data.

Consumers willingly give their financial data to financial institutions when they become customers. An investment company may want to ask each prospect how much money she is looking to invest, what her investment goal is, what interests she has and what kind of investor she is. If these questions are asked “so we can sell to you better,” it is unlikely that the prospect will answer or engage. But, if these questions are asked “so that we can send you a weekly email that describes an investment option relevant to you and includes a few bullets on the pros and cons of that option,” now the prospect may happily answer the questions because she will get something from the exchange of data.

Another advantage of GDPR is the awareness requirement. All companies must ensure that their staff know about GDPR and understand the importance of data protection. This is a great opportunity to review your policies and procedures and address the company culture around client information and how it should be protected. With around 50% of security breaches being caused by careless employees, the reputational risks and potential damage to customer relationships are significant, as are the fines that can be levied by the ICO for privacy breaches.

Therefore, it is important to address the culture to make sure all staff take responsibility for data security and the part that they play in it. Whilst disciplinary codes may be tightened up to make individuals more accountable, forward-thinking organisations will take this opportunity to positively engage with staff and reinforce a culture of genuine customer care and respect.

A month on, it is important to stress that being GDPR ready is not the same as being done! Data protection is an ongoing challenge requiring regular review and updates in a fast-moving threat environment.

With some work upfront, GDPR is a chance to clean your data and review your processes, making everything more streamlined and benefiting both your business and your clients.

Everyone’s a winner!


kerry.housley@broadgateconsultants.com


The Opportunity for Intelligent Process Automation in KYC / AML

Posted on : 28-06-2018 | By : richard.gale | In : compliance, Data, Finance, FinTech, Innovation



Financial services firms have been preoccupied with meeting the rules and regulations for fighting financial crime for the best part of the past decade. Ever since HSBC received sanction from both UK and US regulators in 2010, many other firms have also been caught short in failing to meet society’s expectations in this space. There have been huge programmes of change and remediation, amounting to tens of billions of any currency you choose, to try to get Anti-Financial Crime (AFC) or Know Your Customer (KYC) / Anti-Money Laundering (AML) policies, risk methodologies, data sources, processes, organisation structures, systems and client populations into shape; at least enough to meet the expectations of regulators, if not exactly to stop financial crime.

The challenge for the industry is that financial crime is a massive and complex problem to solve. It covers not just the detection and prevention of money laundering, but also terrorist financing, bribery & corruption and tax evasion. Therefore, as the banks, asset managers and insurers have been doing, there is a need to focus upon all elements of the AFC regime, from education to process and all the other activities in between. Estimates of the scale of the problem vary, but the consensus is that somewhere between $3-5 trillion is introduced into the financial system each year.

However, progress is being made. Harmonisation, clarity of industry standards and greater consistency have come from the regulators with initiatives such as the 4th EU AML Directive. The appreciation and understanding of the importance of the controls are certainly better within financial services firms and among their shareholders. Perhaps what has not yet progressed significantly are the processes of performing client due diligence and monitoring their subsequent activity. Most would argue that this is down to a number of factors, possibly the greatest being the disparate and inconsistent nature of the data required to support these processes. Data needs to be sourced in many formats from country registries, stock exchanges, documents of incorporation, multiple media sources and so on. Still today, many firms have a predominantly manual process for this, even when much of the data is available in digital form. Many still do not automatically ingest data into their workflows, and have poorly defined processes for progressing onboarding or monitoring activities.

And this is for the regulations as they stand today. In future the burden will only increase, as firms will be expected to take all possible steps to determine the integrity of their clients, for example by establishing linkages to bad actors through data sources such as social media and the dark web that are not evident in traditional sources such as company registries.

There have been several advances in recent years in technologies with enormous potential to support the AFC cause. Data vendors have made big improvements in providing broader, higher-quality data. Aggregation solutions such as Encompass offer services where the constituents of a corporate ownership structure can be assembled, and sanctions & PEP checks undertaken, in seconds rather than the current norm of multiple hours. This works well where the data is available from a reliable electronic source. It does not work where there are no reliable sources of digital data, as is the case for trusts or in many jurisdictions around the world; here we quickly get back to the world of paper and PDFs, which still require human horsepower to review and decision.
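To give a flavour of what such automation does under the hood, here is a toy Python sketch of one small step: fuzzy-matching a prospective client’s name against a sanctions list. The names and threshold are invented for the example, and real screening services are vastly more sophisticated.

```python
# Toy illustration of automated sanctions screening via fuzzy name
# matching. The list, names and threshold are invented for the example.
from difflib import SequenceMatcher

SANCTIONS_LIST = ["Ivan Petrov", "Acme Trading FZE", "Jon Doe"]

def screen(name: str, threshold: float = 0.85) -> list:
    """Return sanctions-list entries that closely resemble `name`."""
    return [
        entry for entry in SANCTIONS_LIST
        if SequenceMatcher(None, name.lower(), entry.lower()).ratio() >= threshold
    ]

print(screen("Jon Do"))      # close match -> flags "Jon Doe" for review
print(screen("Jane Smith"))  # no plausible match -> empty list
```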

Getting the information in the first instance can be very time-consuming, with complex interactions between multiple parties (relationship managers, clients, lawyers, data vendors, compliance teams etc.) and multiple communication channels, i.e. voice, email and chat in its various forms. There is also the challenge of adverse media: thousands of news stories are generated every day on the corporates and individuals that are clients of financial firms. The items can be positive or negative, but it takes tens of thousands of people to review, eliminate or investigate this mountain of data each day. The same challenges come with transaction monitoring, where individual firms can have thousands of ‘hits’ every day on ‘unusual’ payment patterns or ‘questionable’ beneficiaries. These also require review, repair, discounting or further investigation, and the clear majority are false positives that can be readily discarded.

Probably the most interesting opportunity for allowing the industry to see the wood for the trees in this data-heavy world is the maturing of Artificial Intelligence (AI) based, or ‘intelligent’, solutions. The combination of Natural Language Processing (NLP) with Machine Learning can help the human find the needles in the haystack, or make sense of unstructured data that would ordinarily take much time to read and record. AI on its own is not a solution, but combined with process management (workflow), digitised multi-channel communications and even robotics, it can achieve significant advances. In summary, ‘intelligent’ processing can address three of the main data challenges within the AFC regimes of financial institutions:

  1. Sourcing the right data – where data is structured and digitally obtainable it can be readily harvested, but it needs to be integrated into the process flows to be compared, analysed, accepted or rejected as part of a review process. Here AI can be used to perform these comparisons, support analysis and look for patterns of common or disparate data. Where the data is unstructured, i.e. embedded in a document (email / PDF / doc etc.), NLP and machine learning can be used to extract the relevant data and turn the unstructured into structured form for onward processing
  2. Filtering – with both transaction monitoring and adverse media reviews there is a tsunami of data and events presented to compliance and operations teams for sifting, reviewing, rejecting or further investigation. AI can be extremely effective at performing this sifting and presenting back only the relevant results to users (a simple sketch follows this list). Done correctly, this can reduce the burden by 90+% but, perhaps more importantly, never miss or overlook a case, providing reassurance that all relevant data is being captured
  3. Intelligent workflows – processes can be fully automated where simple decision-making is supported by AI, removing the need for manual intervention in many tasks and leaving the human to add value at the complex end of problem solving
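As a rough illustration of the filtering idea in point 2 above, the sketch below trains a tiny text classifier to separate adverse-media stories that warrant review from background noise. The headlines and labels are invented; a production system would use far larger labelled corpora and much richer NLP.

```python
# Minimal sketch of AI-assisted adverse-media filtering: a classifier
# scores stories so analysts only review likely hits. All training data
# here is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

headlines = [
    "Director charged with laundering offshore funds",
    "Regulator fines firm over sanctions violations",
    "Company sponsors local charity marathon",
    "Quarterly results beat analyst expectations",
]
labels = [1, 1, 0, 0]  # 1 = relevant to financial crime, 0 = noise

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(headlines, labels)

new_story = ["Executive investigated for bribery of foreign officials"]
# Probability the story is relevant and should be routed to an analyst.
print(model.predict_proba(new_story)[0][1])
```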

Solutions are now emerging in the industry, such as OPSMATiX, one of the first Intelligent Process Automation (IPA) solutions. Devised by a group of industry business experts, it is a set of technologies that combine to make sense of data across different communication channels, using AI to turn unstructured data into structured, and applying robust workflows to optimally manage the resolution of cases, exceptions and issues. The data vendors, and solution vendors such as Encompass, are also embracing AI techniques and technologies to create ‘smart filters’ that can scour thousands, if not millions, of pieces of news and other media to discover, or discount, information of interest. This can be achieved in a tiny fraction of the time, and therefore cost, and more importantly with far better accuracy than a human can achieve. The outcome will be to liberate the human from the process: firms can either reduce the cost of their operations, or use people more effectively to investigate and analyse the events, information and clients that may be of genuine concern, rather than dealing with the noise.

Only once the process has been made significantly more efficient and the data brought under control can financial firms really start to address the insidious business of financial crime. Currently all the effort is still going into meeting the regulations, rather than society’s actual demand, which is to combat this global menace. Intelligent processing should unlock this capability.


Guest Author: David Deane, Managing Partner of FIMATIX and CEO of OPSMATiX. David has had a long and illustrious career in global Operations and Technology leadership with wholesale banks and wealth managers. Before creating FIMATIX and OPSMATiX, he was most recently the Global Head of KYC / AML Operations for a Tier 1 wholesale bank.

david.deane@fimatix.com

GDPR – Are You Ready?

Posted on : 30-04-2018 | By : kerry.housley | In : compliance, Consumer behaviour, Cyber Security, Data, data security, GDPR



It is less than a month until the General Data Protection Regulation (GDPR) comes into force, but after two years of preparation, how many businesses are GDPR ready? The latest flurry of figures suggests that many businesses are nowhere near prepared for the new legislation’s demands: that they re-establish a legal basis for using people’s data (whether that’s consent or otherwise), are able to respond quickly to subject access requests, can delete people’s data if asked to… the list goes on!

So, what does all this mean for your organisation? Well, firstly, there is no need to panic. Hopefully, you have made a start on your compliance journey, even if you’re not going to make the deadline. Any business that deals with personal data in the UK is currently bound by the terms of the Data Protection Act, and if you comply with the Data Protection Act, then you will have made a great start towards GDPR compliance. Regardless of GDPR, any business that takes the needs of its customers seriously will already be taking the appropriate steps to protect its customers’ information. Cyber crime and data theft are ever increasing, and organisations must be prepared for a breach and be confident they can deal with it quickly with minimum fallout. Reputational damage can lose you customers and seriously dent your profits.

There has been much GDPR hype over the last few years, with talk of extortionate fines and punitive actions should your business fail to comply. The frenzy whipped up by the media and the new GDPR “experts” is unfounded, says Elizabeth Denham, the Information Commissioner. The Information Commissioner’s Office (ICO) does not intend to start dishing out harsh fines as soon as the regulation comes into force, and neither will it target smaller organisations because they are easier to catch. The purpose of the ICO has always been to protect people’s data and to help businesses do so by providing policy and guidance. It follows the carrot-before-the-stick approach and has always viewed issuing large fines as a last resort. Ms Denham has been quoted as saying the implementation of GDPR will not alter this business-friendly approach.

That said, there is no denying the new regulation and the obligations it places upon all businesses to comply. At this late stage, with around a month to go, all organisations which have not yet addressed GDPR should try to achieve as much as possible in the run-up to the 25th May deadline, to build up their compliance and demonstrate that information security is a priority for their business.

  • Show that your organisation takes GDPR seriously, has taken action and has a plan in place to become GDPR ready.
  • Evidence of action taken is crucial.
  • Review all the personal data you hold: where it is, what it is, why you need it, how long you need to hold it for, and who you share it with.
  • Identify whether you are the data controller or the data processor of this data.
  • Review all policies and procedures in place around data protection and identify any gaps.
  • Review all contracts with parties who process personal data on your behalf, updating each with a data privacy clause which shows that the processor is protecting the data on your behalf as the controller.
  • Demonstrate that you have tried and tested Incident Response and Data Recovery plans in place should a breach occur.

You’re far less likely to suffer a significant fine if you can show documentation of the GDPR-compliant processes you have implemented, along with a detailed roadmap for achieving anything that you still need to do.

GDPR isn’t all about the race to comply. Once you have tackled your data protection issues, your customers will be happy, and you will have minimised the data breach risk for your organisation. Everyone’s a winner!

How is Alternative Data Giving Investment Managers the Edge?

Posted on : 29-03-2018 | By : richard.gale | In : Consumer behaviour, Data, data security, Finance, FinTech, Innovation



Alternative data (or ‘alt-data’) refers to data that is derived from non-traditional sources, covering a whole array of platforms such as social media, newsfeeds, satellite tracking and web traffic. There is a vast amount of data in cyberspace which, until recently, remained untouched. Here we shall look at the role of these unstructured data sets.

Information is the key to the success of any investment manager, and information that can give the investor an edge is by no means a new phenomenon. Traditional financial data, such as stock price history and fundamentals, has been the standard for determining the health of a stock. However, alternative data has the potential to reveal insights about a stock’s health before traditional financial data can. This has major implications for investors.

If information is power, then unique information sourced from places not yet sourced is giving those players the edge in a highly competitive market. Given that we’re in what we like to call a data revolution, where nearly every move we make can be digitised, tracked and analysed, every company is now a data company. Everyone is both producing and consuming immense amounts of data in the race to make more money. People are well connected on social media platforms, and information is available to them in many different forms. Add geographical data into the mix and that’s a lot of data about who is doing what, and why. Take Twitter: it is a great tool for showing what’s happening in the world and what is being talked about. Being able to capture sentiment as well as data is a major advance in the world of data analytics.

Advanced analytical procedures can pull all this data together using machine learning and cognitive computing. Using this technology, we can take unstructured data and transform it into usable data sets at rapid speed.
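As a crude, hedged illustration of that idea, the Python sketch below reduces unstructured posts to a structured sentiment score. The posts and word lists are invented; real pipelines use trained NLP models over licensed data feeds.

```python
# Naive lexicon-based sentiment scoring: unstructured posts in,
# structured numeric signal out. Word lists and posts are invented.
import re

POSITIVE = {"beat", "growth", "strong", "upgrade"}
NEGATIVE = {"miss", "recall", "lawsuit", "downgrade"}

def sentiment(text: str) -> int:
    words = set(re.findall(r"[a-z]+", text.lower()))
    return len(words & POSITIVE) - len(words & NEGATIVE)

posts = [
    "Strong quarter, earnings beat across the board",
    "Product recall announced, lawsuit risk rising",
]
scores = [sentiment(p) for p in posts]
print(scores, "aggregate:", sum(scores) / len(scores))
```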

Hedge funds were the early adopters; investment managers have now seen the light and are expected to spend $7bn on alternative data by 2020. All asset managers realise that this data can produce valuable insight and give them the edge in a highly competitive marketplace.

However, it could be said that if all investment managers research data in this way, that will put them all on the same footing and the competitive advantage will be lost. Commentators have suggested that, given the data pool is so vast and the combinations and permutations of analysis so complex, it is still highly likely that data can be uncovered that has not been found by someone else. It all depends on the data scientist and where they decide to look. Far from creating a level playing field, where more readily available information simply leads to greater market efficiency, the impact of the information revolution is the opposite: it is creating hard-to-access pockets of long-term alpha generation for those players with the scale and resources to take advantage of them.

Which leads us to our next point: a huge amount of money and resource is required to research this data, which means only the strong will survive. A report last year by S&P found that 80% of asset managers plan to increase their investment in big data over the next 12 months. Only 6% of asset managers argue that it is not important. Where does this leave the 6%?

Leading hedge fund bosses have warned fund managers that they will not survive if they ignore the explosion of big data that is changing the way investors beat the markets. They are investing a lot of time and money to develop machine learning in areas of the business where humans can no longer keep up.

There is, however, one crucial issue which all investors should be aware of, and that is privacy. Do you know where that data originates from? Did the vendor have the right to sell the information in the first place? We have seen this illustrated over the last few weeks with the Facebook “data breach”, in which data on millions of Facebook users ended up with Cambridge Analytica without those users’ knowledge. The episode wiped $100bn off Facebook’s value, so we can see the negative impact of using data without the owner’s permission.

The key question in the use of alternative data ultimately is: does it add value? It is perhaps too early to tell. Watch this space!

Beware the GDPR Hacktivist DDoS Threat

Posted on : 28-02-2018 | By : Tom Loxley | In : compliance, Cyber Security, Data, data security, GDPR, Uncategorized



Getting GDPReady is on most organisations’ agendas at the moment. However, what if, after all the effort, cost and time spent becoming compliant with GDPR, I told you that you could have opened your organisation up to a serious distributed denial-of-service (DDoS) style threat?

Whilst we all know that GDPR is a requirement for all businesses, it is largely for the benefit of the public.

For instance, with GDPR individuals now have the right to have the personal data organisations hold on them revealed, or deleted and “forgotten”. Now imagine if masses of people, in a focused effort, decided to ask for their information at once, overwhelming the target organisation. The result could be crippling and, in the wrong hands, could be used as a DDoS-style attack.

Before we go any further, let’s just consider for one moment the amount of work, manpower, cost and time involved in processing a request to be forgotten, or to produce all the information currently held on a single individual. Even for organisations which have mapped their data, stored it efficiently and created a smooth process exactly for this purpose, there is still a lot of effort involved.

Hacktivism is the act of hacking, or breaking into a computer system, for a politically or socially motivated purpose, so technically speaking your defences against other cyber attacks would normally protect you. But in this case, hacktivist groups could cause serious damage to an organisation without the need for any technical or cyber expertise, and there is even uncertainty as to whether or not it would be illegal.

So, could GDPR requests for data deletion and anonymity be used as a legal method to disrupt organisations? I am not suggesting the occasional request would cause an issue, but a coordinated mass of requests, which organisations will now be legally obliged to process, could result in a DDoS-style attack.

Organisations will be trapped by their compliance. What is the alternative? Fail to comply with GDPR and face fines of 4% of annual turnover or €20,000,000 (whichever is greater). The scary thing here is: what is stopping a politically or morally motivated group which takes issue with your company from using this method? It’s easy and low risk for them, and potentially crippling to some organisations, so why not?

How will the ICO possibly distinguish between complaints about organisations genuinely failing to comply with the regulation and those which have been engineered for the purpose of a complaint?

With so many organisations still being reported as unprepared for GDPR, and the ICO keen to prove GDPR will work by making early examples of those who don’t comply to show it means business, my worry is that there will be a bit of a gold rush of litigation in the first few months after the May 2018 compliance deadline, in much the same way as PPI claims affected financial services lenders.

For many companies, the issue is that the prospect of preparing for GDPR seems complicated and daunting, and the information on the ICO website is sometimes rather ambiguous, which doesn’t help matters. The truth is that for some companies it will be far more difficult than for others, and finding help, either internally or by outsourcing, will be essential on their journey to prepare and implement effective GDPR-compliant policies and processes.

Broadgate Consultants can advise and assist you in securing and managing your data, assessing and mitigating your risks and implementing the right measures and solutions to get your organisation secure and GDPReady.

For further information, please email thomas.loxley@broadgateconsultants.com.


Be aware of “AI Washing”

Posted on : 26-01-2018 | By : john.vincent | In : Cloud, Data, General News, Innovation



I checked, and it’s almost five years ago now that we wrote about the journey to cloud and mentioned “cloud washing”, the process by which technology providers re-positioned previous offerings as “cloud enabled”, “cloud ready” and the like.

Of course, the temptation to do this is natural. After all, if the general public can trigger a 200% increase in share price simply by re-branding your iced tea company to “Long Blockchain“, then why not.

And so we enter another “washing” phase, this time in the form of a surge in Artificial Intelligence (AI) powered technologies. As the enterprise interest in AI and machine learning gathers pace, software vendors are falling over each other to meet the market demands.

Indeed, according to Gartner, by 2020:

AI technologies will be virtually pervasive in almost every new software product and service

This is great news and the speed of change is outstanding. However, it does pose some challenges for technology leaders and decision makers as the hype continues.

Firstly, we need to apply the “so what?” test to the claims of AI enablement. The fact that a product has AI capabilities doesn’t automatically propel it to the top of the selection criteria. It needs to be coupled with true business value rather than simply being a sales and marketing tool.

Whilst that sounds obvious, before you cry “pass me another egg, Vincent”, it does warrant a pause for reflection. Human behaviour, and the pressure to generate business value against a more difficult backdrop, can easily drive a penchant for the latest trend (anyone seen “GDPR compliant” monikers appearing?).

On the subject of bandwagon jumping, Gartner says:

Similar to greenwashing, in which companies exaggerate the environmental-friendliness of their products or practices for business benefit, many technology vendors are now “AI washing” by applying the AI label a little too indiscriminately

The second point is to ask the question “Is this really AI, or automation?”. I sat in a number of vendor presentations through 2017 where I asked exactly that. After much deliberation, pontification and several “well, umms”, we agreed that it was actually the latter we were discussing. Indeed, these terms are often interchanged at will during pitches, which can be somewhat disconcerting.

The thing is, automation doesn’t have the “Blade Runner-esque” cachet of AI, which conjures up the usual visions that the film industry has imprinted on our minds (of course, to counter this we’ve now got Robotic Process Automation!).

So what’s the difference between AI and automation? The basic definitions are:

  • Automation is software that follows pre-programmed ‘rules’.
  • Artificial intelligence is designed to simulate human thinking.

Automation is everywhere and has been an important part of industry for decades. It enables machines to perform repetitive, monotonous tasks, freeing up time for human beings to focus on the activities that require more reasoning, rationale and a personal touch. This drives a more productive and efficient business and personal life.

The difference with automation is that it requires manual configuration and set-up. It is smart, but it has to follow set instructions and workflow.

AI, however, is not developed simply to follow a set of predefined instructions. It is designed to mimic human behaviour: to continuously seek patterns, learn from its data and “experiences”, and determine the appropriate course of action or response based on these parameters. This all comes under the general heading of “machine learning”.
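To make the distinction concrete, here is an illustrative Python sketch contrasting the two: a fixed, pre-programmed rule next to a model that learns the same triage task from examples. The rule and the training data are invented for the example.

```python
# Automation vs AI in miniature. The rule and examples are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Automation: a pre-programmed rule, applied identically every time.
def automated_triage(ticket: str) -> str:
    return "urgent" if "outage" in ticket.lower() else "routine"

# Machine learning: behaviour learned from labelled examples, so it can
# generalise to phrasings the rule's author never anticipated.
examples = [
    "total outage in region",
    "site down for all users",
    "please reset my password",
    "request for a new monitor",
]
labels = ["urgent", "urgent", "routine", "routine"]
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(examples, labels)

ticket = "Customers report the site is down"
print(automated_triage(ticket))    # the rule misses it: 'routine'
print(model.predict([ticket])[0])  # the learned model generalises: 'urgent'
```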

The common “fuel” that drives both automation and AI is data. It is the lifeblood of the organisation, and we now live in an environment where we talk about “data-driven” technologies at the centre of the enterprise.

Whilst it’s hard to ignore all the hype around AI, it is important for decision makers to think carefully not only about what they want to achieve, but also about how to filter out the “AI washing”.

2017 – A great year for the hackers

Posted on : 29-12-2017 | By : Tom Loxley | In : Cloud, compliance, Cyber Security, Data, data security, FinTech, GDPR, Uncategorized


This year saw some of the biggest data breaches so far; we saw cover-ups exposed and ransoms reaching new highs.

Of course, it’s no secret that when it comes to cybersecurity this was a pretty bad year, and I’m certain that there are many CIOs, CISOs, CTOs and indeed CEOs wondering what 2018 has to offer from the hackers.

The 2018 threat landscape is sure to be full of yet more sophisticated security attacks. However, the big win for 2017 is that people have woken up to the threat: “not if, but when” has finally been acknowledged, and people are becoming as proactive and creative as the attackers in protecting their companies. The old adage that offence is the best form of defence still rings true.

With that in mind, we’re going to look back at some of what 2017 had to offer. The past may not predict the future, but it certainly gives you a good place to start your planning for it.

So let’s take a look at some of the highest-profile data breaches of 2017.

Equifax (you guessed it) – No doubt you’ll have heard of this breach and, because of its huge scale, it’s very likely that if you weren’t directly affected yourself, you’ll know someone who was. This breach was, and still is, highly publicised, and for good reason. A plethora of litigation and investigations followed the breach in an effort to deal with the colossal scale of personal information stolen. This includes over 240 individual class-action lawsuits, an investigation opened by the Federal Trade Commission, and more than 60 government investigations from U.S. state attorneys general, federal agencies and the British and Canadian governments. More recently, a rare 50-state class-action suit has been served on the company.

Here are some of the facts:

  • 145.5 million people potentially affected (a figure recently revised by Equifax, now 2.5 million more than it initially reported).
  • An unknown number of U.K. consumers affected. Equifax said it is still determining the extent of the breach for U.K. consumers.
  • 8,000 potential Canadian victims (recently revised down from 100,000).
  • High-profile senior leaders have left since the breach. Former CEO Richard Smith retired (Smith is reported to have banked a $90 million retirement golden handshake), and the chief information officer and chief security officer have also “left”.
  • An unknown number of internal investigations are taking place against board members (including the chief financial officer and general counsel) for selling stock after the breach’s discovery but before its public disclosure.
  • The breach lasted from mid-May through July.
  • The hackers accessed people’s names, Social Security numbers, birth dates, addresses and, in some instances, driver’s license numbers.
  • They also stole credit card numbers for about 209,000 people, and dispute documents with personal identifying information for about 182,000 people.

Uber – The big story here wasn’t so much the actual breach as the attempt to cover it up. The breach itself actually happened in 2016: hackers stole the personal data of 57 million Uber customers, and Uber paid them $100,000 to cover it up. However, the incident wasn’t revealed to the public until this November, when the breach was made known by the new Uber CEO Dara Khosrowshahi.

Uber has felt the impact of the backlash against the cover-up globally and on varying scales: from the big guns in the US, where three senators introduced a bill that could make executives face jail time for knowingly covering up data breaches, right through to the city of York in the UK, which voted against renewing Uber’s licence on 23 December due to concerns about the data breach.

Deloitte – According to a report in the Guardian in September this year, a Deloitte global email server was breached, giving the attackers access to emails to and from the company’s staff, not to mention customer information on some of the company’s most high-profile public and private sector clients. Although the breach was discovered in March 2017, it is thought that the hackers had been in the company’s systems since October or November 2016. During this period, the hackers could have had access to information such as usernames, passwords, IP addresses and architectural design diagrams. Deloitte confirmed the breach, saying that the hack had taken place through an admin account and that only a few clients were impacted by the attack.

Now, if I covered even half of the high-profile cyber-attack cases in detail, this article would look more like a novel. Plus, as much as I love to spend my time delighting you, my dear readers, it is Christmas, which means I have bad TV to watch, family arguments to take part in and copious amounts of calories (alcohol) to consume and feel guilty about for the next three months. So, with that in mind, let’s do a short recap of some of the other massive exploits and data breaches of this past year:

  1. Wonga, the payday loan firm, suffered a data breach which may have affected up to 245,000 customers in the UK.
  2. WannaCry and Bad Rabbit: these massive ransomware attacks affected millions of computers around the world, including the NHS.
  3. The NSA was breached by a group called The Shadow Brokers, who stole and leaked around 100GB of confidential information and hacking tools.
  4. WikiLeaks’ Vault 7 leak exposed the CIA’s secret documentation and user guides for hacking tools targeting the Mac and Linux operating systems.
  5. Due to a vulnerability, Cloudflare unwittingly leaked customer data from Uber, OKCupid and 1Password.
  6. Bell Canada was threatened by hackers with the leak of 9 million customer records. When the company refused to pay, some of the information was published online.
  7. Other hacks include Verizon, Yahoo, Virgin America, Instagram… the list goes on.

So, all in all, not a great year. But looking on the bright side, if you weren’t on the wrong end of a cyber-attack this year, or even if you were, there are plenty of lessons that can be learnt from the attacks that took place, and some easy wins you can get by doing the basics right. We’ll be exploring some of these in our newsletter in 2018, delving into the timelines of some of the higher-profile attacks to help our readers understand and deal with an attack if they’re ever unfortunate enough to be in that situation. But if you can’t wait that long and want some advice now, please feel free to get in touch anytime.