Could You Boost Your Cybersecurity With Blockchain?

Posted on : 28-11-2017 | By : Tom Loxley | In : Blockchain, Cloud, compliance, Cyber Security, Data, data security, DLT, GDPR, Innovation


Securing your data, the smart way

 

The implications of blockchain technology are being felt across many industries. In fact, its disruptive effect on Financial Services is changing the fundamental ways we bank and trade, and its presence is also being felt in Defense, Business Services, Logistics, Retail and beyond. The potential applications seem endless, although not all of them are practical or worth pursuing. Like anything with genuine potential and value, blockchain is accompanied by the buzzwords, trends and fads that undermine it, as many try to jump on the bandwagon and cash in on the hype.

However, one area where tangible progress is being made and where blockchain technology can add real value is in the domain of cybersecurity and in particular data security.

Your personal information and data are valuable: worth stealing, worth protecting, and many criminals are working hard to exploit them. Data collection began to ramp up in the late ’90s with the popularity of the internet, and the hoarding of our personal and professional data has now reached fever pitch. We live in the age of information, and information is power: it translates directly to value in the digital world.

However, some organisations, public and private sector alike, have dealt with our information so flippantly and negligently that they don’t even know what they hold, how much of it they have, or where and how it is stored.

Lists of our information are emailed to multiple people as spreadsheets, downloaded and saved onto desktops, copied, chopped, pasted and reformatted into different document types, then uploaded to cloud storage systems and duplicated in CRMs (customer relationship management systems), and so on… are you lost yet? Well, so is your information.

This negligence doesn’t happen with any malice or negative intent, but simply through a lack of awareness and a lack of process or procedure around data governance (or a failure to implement what process and procedure do exist).

Human nature dictates that we take the easiest route. Combine this with deadlines that need to be met and a reluctance to delete anything in case we need it later, and we end up with information being continually copied, replicated and stored in every nook and cranny of hard drives, networks and clouds until we no longer know what is where. As if this weren’t bad enough, it also makes the information nearly impossible to secure.

In fact, for most, it’s simply easier to buy more cloud storage or a bigger hard drive than it is to maintain a clean, data-efficient network.

Big budgets aren’t the key to securing data either. Equifax is still hurting from an immense cybersecurity breach earlier this year, in which cybercriminals accessed the personal data of approximately 143 million U.S. Equifax consumers. And Equifax isn’t alone: a full list of the serious data breaches of the last year or two would leave you both scarred and bored by its sheer length. The scale is hard to comprehend – the money criminals have ransomed out of companies and individuals, the volume of data stolen, the number of companies breached. The numbers are huge, and growing.

So it’s no surprise that anything in the tech world that can vastly aid cybersecurity – and in particular the securing of information – is going to be in pretty high demand.

Enter blockchain technology

 

The beauty of a blockchain is that it kills two birds with one stone: controlled security and order.

Blockchains provide immense benefits when it comes to securing our data: the blockchain technology that underpins the cryptocurrency Bitcoin has never been breached since its inception over eight years ago.

Blockchains store their data on an immutable record: once data is stored, it’s not going anywhere. Each block (or piece of information) is cryptographically chained to the next block in chronological order, and multiple copies of the blockchain are distributed across a number of computers (or nodes), so if an attempted change is made anywhere on the blockchain, all the nodes become aware of it.

For a new block of data to be added, there must be consensus amongst the other nodes (on a private blockchain, the number of nodes is up to you). This means that once information is stored on the blockchain, in order to change or steal it you would have to reverse-engineer near-unbreakable cryptography (perhaps hundreds of times, depending on how many other blocks of information were stored after it), and then do the same on every other node that holds a copy of the blockchain.
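
To make that concrete, here is a minimal sketch in Python of a hash-chained ledger. It is our own toy illustration of the tamper-evidence principle, not Bitcoin’s or any vendor’s implementation, and every name in it is hypothetical:

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass
class Block:
    index: int
    data: str
    prev_hash: str

    def hash(self) -> str:
        # Each block's hash covers its contents *and* the previous
        # block's hash, so altering any block breaks every later link.
        payload = json.dumps([self.index, self.data, self.prev_hash])
        return hashlib.sha256(payload.encode()).hexdigest()

def append(chain: list, data: str) -> None:
    prev_hash = chain[-1].hash() if chain else "0" * 64
    chain.append(Block(len(chain), data, prev_hash))

def is_valid(chain: list) -> bool:
    # Any node can re-verify the whole chain independently; an edit
    # to a stored block shows up as a broken hash link.
    return all(chain[i].prev_hash == chain[i - 1].hash()
               for i in range(1, len(chain)))

chain = []
append(chain, "record A")
append(chain, "record B")
append(chain, "record C")
print(is_valid(chain))        # True

chain[0].data = "tampered"    # an attacker edits an early block...
print(is_valid(chain))        # False -- every node would spot this
```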

It also means that everything you store on a blockchain is transparently monitored and recorded. Another benefit of using blockchains for data security is that private blockchains are permissioned, so accountability and responsibility are enforced by definition – and in my experience, when people become accountable for what they do, they tend to care a lot more about how they do it.
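
Building on the toy chain above, a permissioned write might be sketched like this. Again, this is a hypothetical illustration of ours (the allow-list and identities are invented, and this is not Gospel’s mechanism), showing how permissioning makes every change attributable:

```python
from datetime import datetime, timezone

# Hypothetical allow-list: on a private, permissioned chain the
# operator decides exactly who may write.
PERMISSIONED_WRITERS = {"alice", "bob"}

audit_log = []  # who wrote what, and when

def permissioned_append(chain: list, user: str, data: str) -> None:
    if user not in PERMISSIONED_WRITERS:
        raise PermissionError(f"'{user}' has no write permission")
    append(chain, data)  # reuses append() from the sketch above
    # Every accepted write is attributable to a named identity.
    audit_log.append((user, data, datetime.now(timezone.utc).isoformat()))

permissioned_append(chain, "alice", "record D")  # accepted and logged
permissioned_append(chain, "eve", "record E")    # raises PermissionError
```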

One company that has taken the initiative in this space is Gospel Technology. Gospel Technology has taken data security a step further than simply storing information on a blockchain: it has added another clever layer of security that enables the safe transfer of information to those who do not have access to the blockchain. This makes it well suited to dealing with third parties, or with those within organisations who don’t hold permissioned access to the blockchain but need certain files.

One of the issues with blockchains is the user interface. It’s not always pretty or intuitive, but Gospel has taken care of this too, with a simple and elegant platform that makes data security easy for the end user. The company describes its product, Gospel®, as an enterprise-grade security platform, underpinned by blockchain, that enables data to be accessed and tracked with absolute trust and security.

The applications for Gospel are many, and in the current environment this kind of solution is a growing requirement for organisations across many industries, especially with the regulatory implications of GDPR coming to the fore and the financial penalties for breaching it.

From our point of view as a consultancy in the cybersecurity space, we see the genuine concern, and the need for clarity, understanding and assurance, among our clients and the organisations we speak to on a daily basis. The realisation that data and cyber security can no longer be taken lightly has begun to hit home. The issue for most businesses is that there are so many solutions out there that it’s hard to know what to choose, and so many threats that trying to stay on top of them without dedicated staff is nearly impossible. The good news, however, is that good-quality solutions do exist, and with a little effort and guidance, and a considered approach to your organisation’s security, you can turn back the tide and protect your organisation well.

Investment Management – what’s left to outsource

Posted on : 30-11-2016 | By : richard.gale | In : Finance


Many Investment Management (IM) firms have outsourced significant business functions: settlement, collateral management and accounting departments have been ‘lifted out’ of a significant number of IM companies and are now run as a service by a smaller number of specialised financial services organisations.

We think the next phase of outsourcing will cover the middle office and some front office functions, as IM firms focus on their ability to out-perform, reduce time to market for new products and reduce costs. Regulation is a key driver: the complexity of dealing with constant regulatory change is increasing costs and constraining IM firms’ ability to move into new, more profitable markets. New investment themes such as liability-driven investing, and securities such as OTC derivatives, are much more widely used in investment firms than, say, five years ago. There is also an avalanche of regulation in flight (AIFM, Dodd-Frank, MiFIR & Solvency II, to name a few) to enforce reporting and risk management. As a result, operational activities such as collateral management are becoming much more complex than transacting with conventional securities.

A few months back we discussed the future of middle office outsourcing with Maha Khan Phillips in Best Execution magazine and we want to expand on those thoughts here.

Another trend we see is that the Investment Banking industry is starting to look at outsourcing non-value-add functions to reduce costs and help streamline its business areas. Investment banks are being impacted in a similar way to IM firms at the turn of the century, with reduced income and a focus on cost reduction.

Outsourcing history and developments

The first phase of outsourcing was often a simple ‘lift-out’, in which the back office was separated as a whole – people, systems and processes – with a line drawn across the organisation splitting the remaining front/middle office from the outsourced back office. This was driven by a number of factors, but cost reduction and the drive for better returns were at the core.

As an approach, the lift-out worked and enabled the IM organisation to focus on its core business of investing money. Over time, as the industry has matured, the limitations of the approach have become clear. The ability to respond to new business requirements can be reduced: flexibility in the operating model to react to changes such as business focus, new asset classes and volume variations is often slowed by the split between organisations. The outsourcers have a number of clients with differing requirements and a limited ability to change, which can impact speed of delivery.

These factors have led to operational challenges and friction between client and supplier, which in turn has prompted a reassessment of the services and the relationship. The client has a number of choices available and, as the earlier contracts mature, firms are identifying this period as an opportunity to review the current state against alternative strategies. The choices are broadly:

  1. Insource. Undo the lift-out and bring services back in-house. Some organisations have done this with varying degrees of success, but the underlying rationale for outsourcing, and the business case underpinning it, need to be closely examined.
  2. Migrate to a new outsourcer. This is potentially one of the more complex options, but also an opportunity to re-engineer the business. There are often complex interactions between client and supplier that exist because of the way the outsource was constructed historically. This ‘web’ of interfaces, processes and procedures will need to be cleaned up and logically split in order to migrate, and the complexity of moving from one outsource supplier to another goes a level beyond that of the original move out of the (client) organisation.
  3. Stay with the existing supplier and work together to improve the service, relationship and capabilities.
  4. A combination of the above, not excluding outsourcing more functions of the client firm.

Assuming the client does not strategically wish to insource the functions, one of the most important activities is to grow the client/supplier relationship into an aligned partnership. This is the time when the parties need to work together to construct a roadmap towards a more efficient, cost-effective and flexible model that delivers optimised services and the capacity to grow.

This trend is gathering pace as firms look to ‘smarter’ outsourcing, which bundles up groups of functions and lets someone else look after the day-to-day management whilst the firm enjoys consistent service and pricing. Significant middle office functions are in scope, including what are traditionally seen as front office capabilities such as deal execution and compliance monitoring.

Interestingly, the buy-side has led the way on outsourcing. Investment banks have previously been too busy ‘running’ to keep up – growing new business areas – and have been wary of outsourcing as a brake on their flexibility and ability to expand. Their focus has been on IT infrastructure, testing & development, and creating ‘captives’ in lower-cost locations for operations. Now that cost and regulatory pressures are proving a heavy burden, banks are spending more time and energy looking into outsourcing their non-proprietary functions. We think this will be one of the trend areas for the next few years.

This is an updated version of our article first published in 2012. The thoughts are still very relevant and we wanted to share them again.

www.twitter.com/broadgateview

Agile. Is it the new name for in-sourcing?

Posted on : 30-01-2015 | By : richard.gale | In : Innovation


Business, IT and clothing are all similar in as much as they can both lead and follow fashions & trends.

Looking at IT specifically, there is a trend to commoditise and outsource as much as possible in order to concentrate on the core ‘business’ of growing the business. As we all know, this has many advantages for the bottom line and keeps the board happy: there is certainty of service & cost, headcount is down, and the CIO has something to talk about in exec meetings.

At the coalface the story is often a different one, with users growing increasingly frustrated with an SLA-driven service, business initiatives strangled by cumbersome change processes, and support resting in the hands of a dwindling number of IT staff with deep experience of the applications and the organisation.

So a key question is: how do you tackle the upward-looking cost/headcount/service mentality whilst keeping the ability to support and change the business in a dynamic, fulfilling way?

Agile is a hot topic in most IT and business departments. It emerged from several methodologies of the 1990s, with roots back to the ’60s, and has taken hold as a way of delivering change quickly to a rapidly changing business topology.

At its core Agile relies on:

  • Individuals & interaction – over process and tools
  • Customer communication & collaboration in the creation process – over agreeing scope/deliverables up front
  • Responding to changing demands and environment – over a blinkered adherence to a plan

The basis of Agile, though, relies on a highly skilled, articulate, business- and technology-aware project team that is close to, and includes, the business. In theory this is not the opposite of an outsourced, commodity-driven approach, but in reality the outcome often is.

When we started working on projects in investment organisations in the early ’90s, most IT departments were small, focused on a specific part of the business, and the team often sat next to the trader, accountant or fund manager. Projects were formal, but the day-to-day interaction, prototyping, idea generation and information gathering could be very informal, with mutual trust and respect between the participants. The development cycle was often lengthy, but any proposed changes and enhancements could be storyboarded and walked through on paper to ensure the end result would be close to the requirement.

In the front office, programmers would sit next to the dealer, and system changes and tweaks would be delivered almost in real time to react to a change in trading conditions or new opportunities (it is true to say this is still the case in the more esoteric trading world, where the split between trader and programmer is very blurry). This world, although unstructured, is not that far away from Agile today.

Our thinking is that businesses & IT departments are increasingly using Agile not only for its approach to delivering projects but also, perhaps unconsciously, as a method of bypassing the constraints of the outsourced IT model: the utilisation of experienced, skilled, articulate, geographically close resources who can think through and around business problems is starting to move otherwise stalled projects forward, enabling the business to develop & grow.

The danger, of course, is that as it becomes more fashionable, Agile risks going mainstream (some organisations have already built offshore Agile teams) and then becoming ‘last year’s model’, or obsolete. We have no doubt that a new, improved ‘next big thing’ will come along to supplant it.

 


Why’s my computer so slow? Maybe someone is digging for virtual gold.

Posted on : 30-06-2014 | By : richard.gale | In : Cyber Security


We’ve discussed the rise and fall and rise of virtual currencies in a couple of previous articles (When are Bitcoins going to crash and what’s next?,  The hidden costs of transacting with virtual currencies).

Creating new currency (whether it be Bitcoin, Dogecoin, Litecoin etc.) involves running increasingly complex cryptographic algorithms that consume computing power. The reward for this problem-solving is a virtual coin, and the amount of work required to ‘earn’ a ‘coin’ is constantly rising. ‘Miners’, as the creators are called, are always looking for new and creative ways to build more coins, and the cost of processing power sometimes outweighs the worth of the output.
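
As a rough illustration of why mining consumes so much compute, here is a toy proof-of-work loop in Python. It is a deliberately simplified sketch of the Bitcoin-style hashing puzzle, not any real miner’s code:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Search for a nonce whose SHA-256 hash starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra leading zero multiplies the expected number of hashes by 16,
# which is why mining rewards tempt people towards other people's hardware.
print(mine("example block", 4))  # ~65,000 hashes on average
print(mine("example block", 5))  # ~1,000,000 hashes on average
```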

A phenomenon that will only grow in frequency and impact is the misuse of other people’s computers to do this. A few examples are outlined below where organisations were unwittingly hosting unauthorised external mining activities (maybe some terminology from the Californian gold rush would be appropriate – are they virtual “claim jumpers” or “processing poachers”?).

Harvard University research servers have been used to mine dogecoins. A powerful cluster of machines known as ‘Odyssey’ was hijacked – misused, really, as the user had legitimate access – and a mining operation ran for an unknown period of time. The perpetrator has now had their access revoked, but it is not known how profitable the operation was.

In another example, the US National Science Foundation’s supercomputers were taken over for bitcoin mining. The researcher accused of setting up the mining operation said he was ‘conducting research’, and it is thought around $8,000 worth of bitcoins were produced.

There are other occurrences of this phenomenon, including rogue Android applications which have reportedly taken over people’s mobile phones to carry out mining activities (although a very large number of phones would be needed to make this at all valuable).

We think these examples reflect a wider problem. People can have legitimate access to huge amounts of computing power; this is especially true in academic, governmental and larger enterprise environments. How can the need to run large simulations or experiments be differentiated from more sinister misuses of that excess power?

This whole space is a difficult area to analyse. What is ‘normal’ and what is ‘abnormal’? We’ve been thinking about how to differentiate the two and are now working with a really smart new security company that can help with this (and many other) security issues.

The product, Darktrace, has been built by ex-MI5 and GCHQ scientists, and it grew out of the need to protect the UK’s critical network infrastructure (energy & water supplies, communications & transport) against terrorist or foreign-state cyber-attack. The team at Darktrace quickly realised that the current suite of protection could not prevent most insider attacks (whether intentional or accidental), so a new model was needed.

Darktrace sits at the centre of your network, listens and learns about the behaviour of users, connected devices and the network itself, and then alerts when something abnormal or unusual occurs. It has no preconceptions about the environment when it is installed: it learns for a period of 2-4 weeks and then shouts (usually to the security operations team, or an external team such as the Mandiant response units) when something odd happens. Darktrace views the appliance almost like the immune system of a body: it understands what healthy looks like, and alerts its ‘antibodies’ to investigate, and if necessary destroy, any potential threat.

The product uses some clever probabilistic algorithms that constantly learn and build on their knowledge of your environment. Take the user ‘Fred’ as an example. Fred normally logs in to the network after 8:00am, accesses mail and three file servers, and logs out before 7:00pm. If Fred suddenly starts logging in at 02:00am, searching eight different file servers for documents containing the word ‘Patent’ and then exporting them outside the organisation to a site in Ukraine, the activity would be marked as ‘unusual’ and alerted on. It could be legitimate activity if Fred’s role has changed, but probably not. Traditional cyber-technologies may not catch this sort of issue, as they look for specific patterns or types of behaviour rather than general deviations from the norm.
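
Darktrace’s actual models are probabilistic and far more sophisticated, but a toy sketch of the baseline-and-deviation idea, with invented users and features, might look like this:

```python
from collections import defaultdict

# Build a simple per-user profile from historical events:
# the hours at which they log in and the servers they touch.
history = [
    ("fred", 8, "mail"), ("fred", 9, "files-1"),
    ("fred", 14, "files-2"), ("fred", 18, "files-3"),
]

profiles = defaultdict(lambda: {"hours": set(), "servers": set()})
for user, hour, server in history:
    profiles[user]["hours"].add(hour)
    profiles[user]["servers"].add(server)

def unusual(user: str, hour: int, server: str) -> list:
    """Return the ways this event deviates from the user's learned profile."""
    profile = profiles[user]
    reasons = []
    if hour not in profile["hours"]:
        reasons.append(f"login at {hour:02d}:00 is outside normal hours")
    if server not in profile["servers"]:
        reasons.append(f"first ever access to server '{server}'")
    return reasons

print(unusual("fred", 9, "mail"))     # [] -- matches the learned baseline
print(unusual("fred", 2, "files-8"))  # two deviations -> worth an alert
```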

We have been working with Darktrace and can install the appliance on your network to perform the analysis for you. We can do this for a period of 4-8 weeks (to give the system enough time to learn the environment and to gather sufficient data to work with) and can provide analysis of any unusual behaviour, and advice to your security team, throughout that period. In that time we would expect to see some unusual activity, which should demonstrate the value to your organisation.

If you would like to learn more about this please do contact us.

Is it the time for Joint Shared Services?

Posted on : 29-11-2013 | By : john.vincent | In : Innovation


Last month we wrote about how the rate of technology change is outpacing the internal IT departments of organisations. It certainly seems that the “squeeze” is on, with cloud and external providers offering more agile compute services at the infrastructure level (now at an on-demand cost which can compete), and business consumers procuring what they need, when they need it and, of course, where they need it through Software as a Service (SaaS) providers.

Two years ago, CIOs could still raise the virtual “Red Card” at these external forces, citing risk, compliance, data security, cost and the like, particularly in areas such as financial services (although we constantly heard anecdotes of technology services being bought on credit cards in the front office and expensed back). Today, however, it is more a case of working out how to protect digital assets and company reputation from the increased decentralisation of technology governance (business/end-user empowerment), whilst continuing to deliver operational services against a backdrop of having to justify value.

So, whilst this move of technology governance to the corporate edges continues, the question is “What approach should organisations take to sourcing their underpinning infrastructure commodity services?”

We have seen decades of ebb and flow in the sourcing of technology services: outsourcing, offshoring, nearshoring, rightshoring (we may have finally run out of prefixes…), managed services and the like. Internally, organisations have coupled this operating model with shared service functions such as Finance, Human Resources and Operations to deliver further efficiencies. What is less prevalent, however, is collaboration between client organisations.

Large service providers have shown the benefits, through economies of scale, of running client technology platforms. However, whatever your position on outsourcing technology, many would argue that the clients themselves do not benefit fully from these efficiencies. This is natural where there is a fragmented delivery chain and limited client-side collaboration. So, is the time right to extend the shared service model and create shared services, or joint ventures, between peer organisations?

If you take the infrastructure layer, then we think… YES. As we said in our previous article, where is the business (or, more importantly, brand) value in having technicians crafting infrastructure services? There are pockets and exceptions, but typically the “compute plumbing” supporting business applications does not drive competitive advantage. In today’s fast-moving landscape, however, it is very easy to erode value through rigid or elongated timescales for service provisioning.

The pace of change is clearly illustrated by the transformed data centre market. Back in 2005/2006, many large corporate CIOs were scrambling to purchase their own data centres as space and power became scarce. Fast forward to today and many of those same organisations are sitting on surplus capacity.

In the space of a few years, driven by the revolution in virtualisation and cloud computing, it would now seem a bad strategy to build and manage your own facility.

The question to ask is how organisations can collaborate to source their compute requirements together for mutual benefit. For back office processing there have been “carve-outs”, collaborations and joint ventures, such as in the investment management and insurance markets. Following on from this, there is no reason why peer organisations couldn’t combine to create an SPV/JV for their underlying infrastructure requirements. This has the potential to bring many benefits, including:

  • Increased market leverage for commodity service pricing
  • Reduced fixed overheads and move from Capex to Opex
  • Improved standards and policies in areas such as security and risk management (through collective influence)
  • Increased agility and time to market
  • Enhanced technology innovation 
  • Improved focus on core business competencies

There are many others (and no doubt many counter-arguments, which we are happy to receive…).

So what stops organisations proceeding? Above all, we are talking about a cultural shift which, if driven from the technology organisation itself (the CIO), is unlikely to get much traction. This level of change cannot be technology-driven; it needs to be a top-down, business-led discussion.

It also doesn’t apply only to technology. Many years ago (I think in the late ’90s) I attended a conference where the speaker talked about measuring real company value and how organisations would, over time, “jettison” those operations that didn’t contribute to the customer proposition. What is left in the final end game? In the extreme example, it is simply those creating the Strategy and Brand, with everything else sourced from the market. When you think about it, it does make sense.

In previous years we have produced our predictions for the coming 12 months. We don’t see this happening in that timeframe, but at least opening up the discussion should be on CEOs’ “to-do” lists in 2014…

Technology Empowerment vs. Frustration: A User(s) Guide

Posted on : 30-04-2012 | By : richard.gale | In : General News


One of the most exciting aspects of running an IT consultancy is the variety of views and opinions we get to hear from our clients, teams, suppliers & partners. This month we want to focus on the relationship between business users of technology and the IT departments that supply their solutions. As with most ‘marriages’ this is a complex, ever-changing interaction, but two factors are key to it: empowerment and frustration.

We think we are on the cusp of a major change in the balance of power between users and IT departments. This happens very rarely, so we are watching with interest how it develops over the next few years. Business users are now digitally aware, often frustrated by tech departments and confident enough to bypass them. This is a dangerous time for the traditional IT team, and trying to control and close down alternatives would be a mistake – and is probably too late anyway.

The graph below highlights how users’ frustration with IT has increased whilst their ability to control it has diminished. There was a brief (golden?) period in the 1990s when desktop computing and productivity tools helped business users become more self-sufficient, but that was eroded as IT took back control of the desktop.

 

Business Frustration vs. Empowerment, 1970 onwards

The 70s – the decade of opportunity

Obviously computing did not start in 1970 (although Unix time did start on Jan 1st of that year…), but the ’70s was perhaps the time when IT started making major positive impacts on organisations. Payroll, invoicing, purchasing and accounting functions started to become more widely automated and computerised. The productivity gains from some of these applications transformed businesses, and the suppliers (IBM, IBM, IBM etc.) did phenomenally well out of it. User empowerment was minimal, but frustration was also low, as demand for additional functions and flexibility was limited.

The 80s – growing demands and an awakening workforce

The 1980s saw the rise of the desktop, with Apple and Microsoft fighting for top-dog position. This explosion of functionality was initially exciting for the home user and quickly grew to be utilised and exploited by organisations. Productivity tools such as spreadsheets, word processing and email allowed business users to create and modify their own working practices and processes. The adoption of desktops accelerated towards the end of the decade, so we mark this decade as: empowerment up (and growing), frustration down.

The 90s – Power to the people (sort of… for a while)

Traditional IT departments recognised the power of the utility PC and adjusted (and grew) to support the business. Networks, and with them file sharing and, just as importantly, backups, became the norm. Business departments were becoming more autonomous with the power the PC gave them: macros and Visual Basic add-ons turned into business-critical applications, and innovative companies were producing new software all the time. Business users were free to download and run pretty much anything on their work computer. The complexity of IT infrastructure and applications was increasing exponentially… so inevitably things began to creak and break. End-user applications (or EUCs, as they became known) could be intolerant of change (such as a new version of Excel); they were also often put together in an ad-hoc fashion to solve a particular problem and then woven into a complicated business process which became impossible to change. This, with the additional twist of the ‘computer virus’, gave the IT department the opportunity to lock down users’ PCs and force applications to be developed by the new in-house development teams. Result for the 1990s: user frustrations rising, demands rising, and empowerment on the way down.

The 00s – Control and process

The dawn of the new millennium brought the first dot-com crash, and the lockdown of user PCs continued apace. The impacts from the ’90s – unsupportable applications, viruses, desktop complexity – were joined by higher levels of regulation, audit and internal controls. Combined with a focus on saving money in the still-expanding IT departments, this further reduced users’ ability to ‘do IT’. In large organisations most PCs were constrained to such an extent that they could only be used for basic email, word processing and Excel (now the only spreadsheet in town). Any new application had to go through a lengthy evaluation, purchasing, configuration, security testing, ‘packaging’ and finally installation process before it could be used for business. Inevitably, user frustration rose to dangerous levels and empowerment was further degraded.

The 10s – A digital workforce demands power

The controls and restrictions of the ’00s then ran into significant budgetary restrictions on IT departments. Costs were, and are, being squeezed, and fewer, less experienced resources are dealing with increasing demands and pace. Frustration levels peaked to the point where relationships between IT and the business were breaking down. Outsourcing parts of IT organisations made some significant savings on budgets but did nothing to reduce user concerns around delivery and service (at least in the short term).

Some users started to ‘rebel’. The increasing visibility of software as a service (SaaS) enabled certain functions to implement simple but functionally rich solutions for a team or department relatively easily, and without much (or any) IT involvement. Salesforce.com did amazingly well with an easy-to-use, globally available, infrastructure-free product which did everything a sales team needed and could be purchased on a credit card and expensed… Internal productivity tools such as SharePoint started being used for complex workflow processes – by the business, without the need for IT.

At the same time, personal devices such as smartphones, tablets and laptops (BYOD) became the norm for the business community, whose members want – and are demanding – the ability to share business data on these tools.

Public cloud usage by business users is also starting to gather pace, and the credit card/utility model means some functions do not use IT at all where quick creation and turnaround of data or processing is needed (whether that is wise or not is a different question).

So what are IT departments doing to ensure they can continue to help business units in the future:

  • Become much more focused on business needs (obvious, but it needs to be addressed as a priority)
  • Encourage the use of BYOD – in the end it will save the firm money through not having to purchase hardware
  • Aggressively addressing traditional structures and costs – ask questions such as
    • “Why can’t we get someone else to run this for us?” – whether outsource, cloud or SaaS
    • “Why don’t you have a SaaS/Cloud enabled product?”
  • Become a service broker to the business – looking ahead and managing service and supplier rather than infrastructure, applications or process.

User empowerment rising but user demands and frustrations still high

The 20s – Business runs business and Utilities run IT

What will happen in the next few years? Who can tell, but trends we are seeing include:

  • There will be a small number of large firms with massive computing capacity – most other organisations will simply use this power as required.
  • There will be new opportunities for financial engineering, such as exchange trading of computing & processing power, storage & network capacity.
  • IT infrastructure departments in the majority of organisations will have disappeared.
  • IT for business organisations will consist of Strategy, Architecture, Business Design, (small, specialised) Development focusing on value-add tooling and integration, and Relationship and Supply management of providers, products and pricing.

All of these point to more power for the business user, but one emerging trend which may reverse that is the ongoing impact of legislation and regulation. This could limit the business’s capability to be ‘free’, and the lockdown of IT may begin again – this time driven more by government onto the external suppliers of the service – resulting in rising frustration levels and reduced empowerment… it will be interesting to see how this goes.

 

 

The next phase of Investment Management outsourcing

Posted on : 27-02-2012 | By : richard.gale | In : Finance


Many Investment Management (IM) firms have outsourced significant business functions: settlement, collateral management and accounting departments have been ‘lifted out’ of a significant number of IM companies and are now run as a service by a smaller number of specialised financial services organisations.

We think the next phase of outsourcing will cover the middle office and some front office functions, as IM firms focus on their ability to out-perform, reduce time to market for new products and reduce costs. Regulation is a key driver: the complexity of dealing with constant regulatory change is increasing costs and constraining IM firms’ ability to move into new, more profitable markets. For example, OTC derivatives are much more widely used in investment firms than, say, five years ago, and there is an avalanche of regulation in flight (Dodd-Frank, MiFIR & Solvency II, to name a few) to enforce reporting and risk management. As a result, operational activities such as collateral management are becoming much more complex than transacting with conventional securities.

Last month we discussed the future of middle office outsourcing with Maha Khan Phillips in January’s edition of Best Execution magazine and we want to expand on those thoughts here.

Another trend we see is that the Investment Banking industry is starting to look at outsourcing non-value-add functions to reduce costs and help streamline its business areas. Investment banks are being impacted in a similar way to IM firms at the turn of the century, with reduced income and a focus on cost reduction.

Outsourcing history and developments

The first phase of outsourcing was often a simple ‘lift-out’, in which the back office was separated as a whole – people, systems and processes – with a line drawn across the organisation splitting the remaining front/middle office from the outsourced back office. This was driven by a number of factors, but cost reduction and the drive for better returns were at the core.

As an approach, the lift-out worked and enabled the IM organisation to focus on its core business of investing money. Over time, as the industry has matured, the limitations of the approach have become clear. The ability to respond to new business requirements can be reduced: flexibility in the operating model to react to changes such as business focus, new asset classes and volume variations is often slowed by the split between organisations. The outsourcers have a number of clients with differing requirements and a limited ability to change, which can impact speed of delivery.

These factors have led to operational challenges and friction between client and supplier, which in turn has prompted a reassessment of the services and the relationship. The client has a number of choices available and, as the earlier contracts mature, firms are identifying this period as an opportunity to review the current state against alternative strategies. The choices are broadly:

  1. Insource. Undo the lift-out and bring services back in-house. Some organisations have done this with varying degrees of success, but the underlying rationale for outsourcing, and the business case underpinning it, need to be closely examined.
  2. Migrate to a new outsourcer. This is potentially one of the more complex options, but also an opportunity to re-engineer the business. There are often complex interactions between client and supplier that exist because of the way the outsource was constructed historically. This ‘web’ of interfaces, processes and procedures will need to be cleaned up and logically split in order to migrate, and the complexity of moving from one outsource supplier to another goes a level beyond that of the original move out of the (client) organisation.
  3. Stay with the existing supplier and work together to improve the service, relationship and capabilities.
  4. A combination of the above, not excluding outsourcing more functions of the client firm.

Assuming the client does not strategically wish to insource the functions, one of the most important activities is to grow the client/supplier relationship into an aligned partnership. This is the time when the parties need to work together to construct a roadmap towards a more efficient, cost-effective and flexible model that delivers optimised services and the capacity to grow.

This trend is gathering pace as firms look to ‘smarter’ outsourcing, which bundles up groups of functions and lets someone else look after the day-to-day management whilst the firm enjoys consistent service and pricing. Significant middle office functions are in scope, including what are traditionally seen as front office capabilities such as deal execution and compliance monitoring.

Interestingly, the buy-side has led the way on outsourcing. Investment banks have previously been too busy ‘running’ to keep up – growing new business areas – and have been wary of outsourcing as a brake on their flexibility and ability to expand. Their focus has been on IT infrastructure, testing & development, and creating ‘captives’ in lower-cost locations for operations. Now that cost and regulatory pressures are proving a heavy burden, banks are spending more time and energy looking into outsourcing their non-proprietary functions. We think this will be one of the trend areas for the next few years.
