Investment Management in the Cloud – Is it Time to Move?

Posted on : 29-01-2016 | By : Jack.Rawden | In : Cloud, Finance


One of Broadgate’s key predictions for 2016 is the continued acceleration of cloud technologies within organisations. Finance, often a trailblazer for new technologies, has been slow to adopt the cloud for a variety of reasons (discussed below). As the technology matures, the arguments against moving to the cloud become weaker and weaker. In this article we discuss, based on conversations with our clients, some of the major reasons why cloud adoption in investment management has been slow.

Security is at the forefront of any CIO’s mind when looking to move to the cloud. The perceived loss of control and ownership of data, plus concerns about how a service provider might secure it, can be a worry. There is also the risk of data being intercepted in transit. The counter-argument is simple: we often find SaaS firms have bigger budgets for cyber security, and as specialist providers holding multiple organisations’ data they have greater exposure if a breach were to occur. If they were to lose your data, would you renew your contract with them? Probably not.

As a result, the level of security is often dramatically better than that of a traditional financial institution, and providers often have:

  • An infrastructure that is designed to be more secure with additional safeguards in place
  • Greater levels of encryption
  • More resources dedicated to keeping data secure
  • Up-to-date security principles and best practices
  • More technology to detect threats and breaches

 

Moving data to a site other than your own raises not only security concerns but also questions about regulatory compliance and ownership.  New EU data protection rules, due to come into force in 2017 (see this useful article on them from Computerworld UK http://www.computerworlduk.com/security/10-things-you-need-know-about-new-eu-data-protection-regulation-3610851/), mean financial regulators may investigate how and where sensitive data, particularly client data, is being stored.  International providers are overcoming this by opening targeted data centres, for example EU-only or UK-only depending on the classification of the data. Amazon Web Services, Microsoft Azure and major financial providers such as BlackRock all have such offerings, which comply with the major current and forthcoming regulations. Usefully, if an organisation is split between countries, it is possible to implement local data centres and ownership, e.g. a Swiss data centre and a UK data centre for the same functionality.
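
The idea of segregating data by classification and domicile can be sketched as a simple routing rule. The region names and the rule table below are purely illustrative, not any provider's actual offering:

```python
# Illustrative sketch: route a record to a regional data centre based on
# data classification and client domicile. Region names and rules are
# hypothetical and stand in for provider-specific region choices.

REGION_RULES = {
    ("client_data", "CH"): "eu-central-swiss",  # Swiss client data stays in Switzerland
    ("client_data", "GB"): "uk-south",          # UK client data stays in the UK
}

def storage_region(classification: str, domicile: str) -> str:
    """Pick a data-centre region; fall back to a general EU region."""
    return REGION_RULES.get((classification, domicile), "eu-west")

print(storage_region("client_data", "CH"))  # eu-central-swiss
print(storage_region("market_data", "GB"))  # eu-west
```

The point of keeping the rules in one table is that compliance decisions stay auditable in a single place rather than scattered through application code.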

Another key concern mentioned when moving to the cloud is potential performance issues. What happens if the cloud “goes down” or connectivity is lost? What happens if there are latency issues between the cloud and the local machine?

Connectivity and uptime of servers, in our experience, are still important factors when agreeing contracts and service levels.  However, SaaS providers rarely fail to hit these levels, and if issues do occur large providers benefit from economies of scale: live backups, better failsafes and more resources to bring systems back online.  With that we often find the recovery and business continuity plans are better than those of a typical investment manager.

Latency can often be more difficult to overcome and may require changes to infrastructure.  However, dedicated connections, now widely supported by cloud providers, mean that speeds are fast and users are often unaware that solutions are hosted off site.

Given the above considerations, there has often been trepidation about moving operations, particularly critical operations such as trading, into a cloud environment. Organisations often wait for a competitor to make a move to see how they fare, or stick with more traditional methods on the logic of “if it ain’t broke, don’t fix it”.  This has been compounded by the fact that traditional investment management products have been slow to adapt their offerings, sticking with current on-premise solutions rather than offering SaaS-based alternatives.  However, products such as BlackRock Aladdin, a standardised, full-functionality cloud-hosted platform, are trailblazing this area.

So, having overcome the potential issues, why would a firm want to move its offering to the cloud? This will be the focus of a future article, but the major factors are:

  • Increased Agility
  • Reduced software/hardware maintenance
  • Ability for investment managers to focus on investments over technology
  • Reduced time to market for new products

With the advantages of the cloud becoming recognised, we are finding that this is an area where vendors and investment managers are really focusing.  Traditional vendors are adapting their products to provide cloud-based services and are producing some excellent new products, and investment managers see the potential service improvements, cost savings and maintenance savings discussed above. It’s an area that is changing and adapting on a monthly basis, and it’s where we will be watching for new technology improvements in 2016.

Five minutes with…

Posted on : 27-11-2015 | By : Maria Motyka | In : 5 Minutes With, Cloud, Cyber Security, Innovation


We are doing a series of interviews with leaders to get their insight on the current technology market and business challenges. Here in our first one, we get thoughts from Stephen O’Donnell, who recently took up the post of CIO for UK & Ireland at G4S.

Which technology trends do you predict will be a key theme for 2016?

“The key trend is the adoption of cloud technology moving from the SME market space, where it is already strong, to really making an impact in the enterprise space.

We’ve seen cloud and SaaS being adopted by smaller companies and now it will be adopted by bigger enterprises. We’ve also seen support for cloud based services from major system integrators and software suppliers like Microsoft, SAP and so on. The time for IT delivered as a service has come and the cloud is about to become all-encompassing across the entire IT world.

This has big implications in the way that CIOs and business leaders need to manage their systems, moving away from low-level management of infrastructure to the management of services and concerns about service integration.

Fundamentally it’s a bit like the Hollywood movie industry moving from the silent movie era to the talking era. Not all of the actors made it through – they did not have the skills and experience and I think this is what will happen in the IT industry. Some IT leaders will have difficulties, others will be more successful thanks to their deeper understanding of the business impact of IT, how automation and cloud based services can really help businesses drive competitiveness and agility, reduce risk and cut costs.”

 

You recently joined G4S, the world’s leading international security solutions group, as CIO. What is your vision for the future of technology services there?

“G4S are adopting the cloud very aggressively. We have 622,000 employees, we’re a really large entity and we have stopped using Microsoft technology and are now using Google and the cloud instead. This consists of Google Apps for work, Google Docs for word processing, Google Sheets for spreadsheets and Gmail for email and collaboration platforms. In terms of the cloud, we use Google Drive for storage, everything is now in the cloud and we access it through a browser.

You have no idea how much simpler the world becomes. All of the complexities fade away. It’s now very much about managing the cloud contract and ensuring that the end-users are familiar with the technology and are appropriately supported. It’s very simple, it integrates extremely well with any device. We’ve seen very happy customer experience – whether using a chromebook, a Mac, a PC with a browser – people can access the systems in the same way and just as securely. Wifi capabilities in the office also become a lot simpler and we don’t have to be worried about highly secured corporate networks.

I think everyone would agree that the world is moving away from landlines to mobile communications. From standard telephone calls to IP-based telephone calls: using – in the consumer space Skype and WhatsApp, in the business space Google Hangouts, Skype for Business and so on – we see a massive adoption of that in business. We’ve really adopted Google Hangouts for collaboration and conferencing and have moved away from desk phones to cellphones.

Even when you look at the shape of our business… we have a huge number of people and the vast majority of them are working on customer site because they are security guards there, they do facility management, they’re doing cash in transit. They’re working in public services, working for hospitals… Having landlines just doesn’t make sense.

The whole company has gone mobile. I don’t have a desk phone and – actually – you know what? I don’t miss it at all. I have a cellphone and it works extremely well; when I want to collaborate I use some of the internet-based tools like Hangouts. Equally, why do you need a fax? When was the last time you sent or received a fax…?

Migration from fixed to mobile has been a key change in the workplace and I’ll be surprised if more companies don’t adopt this. It’s all about simplifying the environment and being more economical.”

 

In your opinion, what are the greatest challenges IT leaders face in terms of securing organisations’ critical data?

“It’s a very relevant question. In the aftermath of the Paris attacks by ISIS someone said the terrorists only have to be lucky once and the authorities need to be lucky all of the time. I think the same applies to corporate security and corporate data security.

Everyone is under absolutely intense attack, and given the complexity of our systems we have to assume that, regardless of what we do, some of our critical data will become exposed.

It could be through employees or through contractors whom we trust who might choose to do the wrong thing, or it might be via external agents, who manage to overcome our security systems either by using technology or by stealth, for example phishing attacks getting access to our data.

I think the key thing is that we can put all the peripheral protections around our data: firewalls, secure data centres, the man guards on the gates and so on. But we still have to encrypt the data.

We have to adopt digital rights management so that we can restrict the data to those who are supposed to see it and ensure that anyone who steals it won’t be able to use it due to encryption.

If you can’t publish your corporate data on the internet and know it’s safe, then it’s not safe. So it really needs to be encrypted and protected. That’s the core principle.”

 

You spent two years at Broadgate, what was the most rewarding client project you delivered working with them as a consultant?

“That’s a really difficult question as all my projects at Broadgate have been quite exciting. If you don’t mind I’ll tell you about the highlights of the things that I did as a Broadgate Consultant.

I worked in the insurance business as Chief Technology Officer, where I took a massive two-year development backlog and cut it down to delivering in real time. My change programme involved taking the company from being a waterfall software delivery shop to being an agile delivery shop.

It involved the entire Development Team and Project Managers, and the end result was that in a very short period of six months, we changed the business and its view of the IT department’s ability to deliver. A very positive outcome.”

 

It’s interesting how your work was also about changing businesses’ view on the importance of IT protection?

“I very much agree. I think that very often businesses wrongly focus merely on cost-cutting.

It is also worth noting that a radical process, such as an operating model change, can be difficult for incumbent teams to deliver. Bringing in a fresh pair of hands, someone who doesn’t have the business-as-usual activities to get on with and can focus on change, really accelerates such projects and helps the business.

At a large retail bank, I went into the voice communications department. The organisation was spending £55m a year on third-party costs – telecommunications, calls etc. My work there was to introduce a new operating model, consolidating the business into a single telecoms entity and cutting costs. In a very short period of time (11 months), I saved the company £27m and simultaneously dramatically improved the service levels offered to the business, so it was a real success.

Another engagement was a short but exciting project at a wealth management client with a business imperative to modernise their IT platforms. It was a really exciting piece of work with the CIO, and we made the decision not to modernise the IT platforms but to migrate functionality into the cloud. My piece of work was the new cloud strategy: assessing costs, determining the approach, identifying critical success factors and considering the things that might get in the way of the client executing on their vision.”

 

What do you see as the biggest technology disrupters in data centre services?

“Just like everything else in the world, IT is commoditising and lately we’ve seen this accelerating.

Everyone uses IT, the younger generation check their Facebook and Instagram several times an hour, it’s an absolutely essential business tool – try to work without email – absolutely impossible.

The industry commoditises and consolidates and IT is becoming a service. We see large global organisations delivering IT services that are ready to be consumed, you don’t have to self-assemble them. If you buy a car you expect it to come with tyres and a steering wheel. That’s not how IT has been consumed – you had to buy all the parts separately and assemble them. That’s changing. It is all commoditising, it’s becoming holistic, delivered as a service.”

 

Sinking in a data storm? Ideas for investment companies

Posted on : 30-06-2013 | By : richard.gale | In : Data


All established organisations have oceans of data and only very basic ways to navigate a path through it.

This data builds up over time through interaction with clients, suppliers and other organisations. It is usually stored in different ways on disconnected systems and documents.

Trying to identify what it means on a single system is a big enough challenge; trying to do this across a variety of applications is a much bigger problem, with different meanings and interpretations of the fields and terms in each system.

How can a company get a ‘360’ view of its client when the client has different identifiers in various applications and there is no way of connecting them together? How can you measure the true value of your client when you can only see a small amount of the information you hold about them?
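
To make the identifier problem concrete, here is a toy sketch (all records and field names are invented) of linking the same client across two systems using a crude normalised-name key, the kind of matching a ‘360’ view ultimately depends on:

```python
# Illustrative sketch: the same client appears under different identifiers
# in a CRM and a trading system; a normalised-name key links them.
# All records, field names and identifiers here are invented.

def norm(name: str) -> str:
    """Crude normalisation: lowercase, keep only letters and digits."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

crm     = [{"crm_id": "C-17", "name": "Acme Capital Ltd"}]
trading = [{"acct": 9041, "counterparty": "ACME CAPITAL LTD."}]

# Build a single merged view keyed on the normalised name
view = {}
for r in crm:
    view.setdefault(norm(r["name"]), {}).update(r)
for r in trading:
    view.setdefault(norm(r["counterparty"]), {}).update(r)

print(view[norm("Acme Capital Ltd")])
# one merged record holding both the CRM id and the trading account
```

Real identity resolution is far harder than this (addresses, legal entity hierarchies, fuzzy matching), but the shape of the problem is exactly this: building a shared key where the source systems never agreed on one.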

Many attempts have been made to join and integrate these data sets (through architected common data structures, data warehouses, messaging systems, business intelligence applications etc.), but it has proved a very expensive and difficult problem to solve. These kinds of projects take a long time to implement and the business has often moved on by the time they are ready. In addition, early benefits are hard to find, so these projects often fall victim to termination if a round of cost cutting is required.

So what can be done? Three of the key problems are: identifying value in the data, the duration and cost of data projects, and the ability to deal with a changing business landscape.

There is no silver bullet, but we have been working with a number of Big Data firms and have found that a key value they offer is the ability to quickly load large volumes of data (both traditional database content and unstructured documents, text and multimedia). The technology is relatively cheap, the hardware required is generic, and both can be easily sourced from cloud vendors.

Using a Hadoop-based data store on the Amazon cloud, or on a set of spare servers, enables large amounts of data to be uploaded and made available for analysis.

So that can help with the first part: having disparate data in one place. How do you then start extracting additional value from that data?

We have found a good way is to start asking questions of the data: “what is the total value of the business client X does with my company?”, “what is our overall risk if this counterparty fails?” or “what is my cost of doing business with supplier A vs. supplier B?”. If you start building question sets against the data, then test and retest, you can refine the questions, data and results, and answers with higher levels of confidence start appearing. What often happens is that the answers create new questions, and so on.
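
In miniature, the question-asking loop looks like this, with Python’s built-in sqlite3 standing in for a Hadoop/Hive-style store (tables and figures are invented):

```python
# Illustrative sketch of asking questions against a consolidated store.
# sqlite3 stands in for a big-data query engine; the principle is the same:
# land the disparate data, then iterate on the questions.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE trades (client TEXT, value REAL)")
db.execute("CREATE TABLE fees (client TEXT, fee REAL)")
db.executemany("INSERT INTO trades VALUES (?, ?)",
               [("ClientX", 1_000_000.0), ("ClientX", 250_000.0),
                ("ClientY", 400_000.0)])
db.executemany("INSERT INTO fees VALUES (?, ?)", [("ClientX", 12_000.0)])

# "What is the total value of the business client X does with my company?"
total, = db.execute(
    "SELECT SUM(value) + COALESCE("
    "  (SELECT SUM(fee) FROM fees WHERE client = 'ClientX'), 0) "
    "FROM trades WHERE client = 'ClientX'").fetchone()
print(total)  # 1262000.0
```

Each answer tends to prompt the next question (“which products drive that value?”), which is exactly the test-and-retest refinement loop described above.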

There is nothing new about using data sets to enquire and test but the emerging Big Data technologies allow larger, more complex sets of data to be analysed and cheaper cloud ‘utility’ computing power makes the experimentation economically viable.

What is also good about this is that as the business grows and moves on to new areas, systems or processes, loading the new data sets should be straightforward and fast. The questions can be re-run and the results reappraised quickly and cheaply.

As we have discussed previously, we think the most exciting areas within Big Data are data science and analytics: finding which questions to ask and refining the results.

Visualisation of these results is another area where we see some exciting developments and we will be writing an article on this soon.

 

 

BROADScale – Cloud Assessment

Posted on : 30-04-2013 | By : jo.rose | In : Cloud


We are well into a step-change in the way that underlying technology services are delivered.  Cloud Computing in its various guises is gaining industry acceptance.  Terms such as Software as a Service (SaaS), Platform as a Service (PaaS), Private Cloud, Hybrid Cloud, Infrastructure as a Service (IaaS) and so on have made their way into the vocabulary of the CIO organisation.

Cloud Computing isn’t new.  Indeed, many organisations have been sourcing applications or infrastructure in a utility model for years, although it is only recently that vendors have rebranded these offerings (“Cloud Washing”).

With all the hype it is vital that organisations consider carefully their approach to Cloud as part of their overall business strategy and enterprise architecture.

Most importantly, it is not a technology issue and should be considered first and foremost from the standpoint of the Business, Applications and Operating Model.

Organisations are facing a number of common challenges:

  • Technology budgets are under increasing pressure, with CIOs looking to extract more value from existing assets with fewer resources
  • Data Centre investment continues to grow with IT departments constantly battling the issue of power consumption and physical space constraints
  • Time to market and business innovation sit uncomfortably alongside the speed with which IT departments can transform and refresh technology
  • Increases in service level management standards and customer intimacy continue to be at the forefront

Cloud Computing can assist in addressing some of these issues, but only as part of a well thought out strategy as it also brings with it a number of additional complexities and challenges of its own.

Considering the bigger picture, a “Strategic Cloud Framework”

Before entering into a Cloud deployment, organisations should look at all of the dimensions which drive their technology requirements, not the technology itself.  These will shape the Cloud Framework and include:

  • Governance – business alignment, policies and procedures, approval processes and workflow
  • Organisation – changes to operating models, organisation, interdependencies, end-to-end processes, roles and responsibilities
  • Enterprise Architecture – application profiling to determine which applications are suitable, such as irregular / spiky utilisation, loosely coupled, low latency dependency, commodity, development and test
  • Sourcing – internal versus external, Cloud providers positioning, service management, selection approach and leverage
  • Investment Model – business case, impact to technology refresh cycle, cost allocation, recharge model and finance
  • Data Security – user access, data integrity and availability, identity management, confidentiality, IP, reputational risk, legislature, compliance, storage and retrieval processes
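
Application profiling of the kind described above can be sketched as a simple scoring exercise. The criteria, example profiles and the idea of a single numeric score below are illustrative only, not part of the BROADScale methodology:

```python
# Illustrative sketch: score applications against cloud-suitability
# criteria (spiky utilisation, loose coupling, latency tolerance,
# commodity workload). Criteria and profiles are hypothetical.

CRITERIA = ("spiky_utilisation", "loosely_coupled",
            "latency_tolerant", "commodity")

def cloud_score(profile: dict) -> int:
    """Count how many suitability criteria an application meets."""
    return sum(1 for c in CRITERIA if profile.get(c, False))

dev_test = {"spiky_utilisation": True, "loosely_coupled": True,
            "latency_tolerant": True, "commodity": True}
trading  = {"spiky_utilisation": False, "loosely_coupled": False,
            "latency_tolerant": False, "commodity": False}

print(cloud_score(dev_test))  # 4 -> strong cloud candidate
print(cloud_score(trading))   # 0 -> keep on premise for now
```

In practice each criterion would be weighted and assessed per application during the profiling workshops, but even a simple tally like this makes the “best execution venue” discussion concrete.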

The BROADScale service

At Broadgate Consultants we have developed an approach to address the business aspects of the Cloud strategy.  Our consultants have experience in the underpinning technology but also understand that it is led from the Business domain and can help organisations determine the “best execution venue” for their business applications.

Our recommended initial engagement depends on the size, scale and scope of services in terms of the Cloud assessment.

  1. Initial – High Level analysis of capability, maturity and focus areas
  2. Targeted – Specific review around a business function or platform
  3. Deep – Complete analysis and application profiling

At the end of the assessment period we will provide a report and discuss the findings with you.  It will cover the areas outlined in the “Strategic Cloud Framework” and provide you with a roadmap and plan of approach.

During the engagement, our consultant will organise workshops with key stakeholders and align with the IT Strategy and Architecture.

For more details and to schedule an appointment contact us on 0203 326 8000 or email BROADScale@broadgateconsultants.com

Technology Innovation – “Life moves pretty fast…”

Posted on : 25-09-2012 | By : john.vincent | In : Cloud, Data, Innovation


We recently held an event with senior technology leaders where we discussed the current innovation landscape and had some new technology companies present in the areas of Social Media, Data Science and Big Data Analytics. Whilst putting together the schedule and material, I was reminded of a quote from that classic ’80s film, Ferris Bueller’s Day Off:

“Life moves pretty fast. If you don’t stop and look around once in a while, you could miss it”

When you look at today’s challenges facing leadership involved with technology this does seem very relevant. Organisations are fighting hard just to stand still (or survive)….trying to do more with less, both staff and budget. And whilst dealing with this prevailing climate, around them the world is changing at an ever increasing rate. Where does Technology Innovation fit in then? Well for many, it doesn’t. There’s no time and certainly no budget to look at new way of doing things. However, it does really depend a little on definition.

  • Is switching to more of a consumption-based/utility model, be that cloud or whatever makes it more palatable to communicate, classified as innovation?
  • Is using any of the “big data” technologies to consolidate the many pools of unstructured and structured data into a single query-able infrastructure innovation?
  • Is providing a BYOD service for staff, or providing iPads for executives or sales staff to do presentations or interface with clients, innovation?

No, not really. This is simply the evolution of technology. The question is, do technology organisations themselves even keep up with this? We were interested in the results of the 2012 Gartner CIO Agenda Report. The three technology areas that CIOs ranked highest in terms of priority were:

  1. Analytics and Business Intelligence
  2. Mobile Technologies
  3. Cloud Computing (SaaS, IaaS, PaaS)

That in itself isn’t overly surprising. What we found more interesting was looking at how these CIOs saw the technologies evolving from Emerging, through Developing, to Mainstream. We work a lot with Financial Services companies, so have picked that vertical for the graphic below;

The first area around Big Data/Analytics is largely in line with our view of the market. We see a lot of activity in this space (and some significant hype as well). However, we do concur that by 2015 we expect to see this Mainstream, with an increased focus on Data Science as a practice.

Mobile has certainly emerged already and we would expect it to be more in line with the first category. On the device side, technology is moving at a fast pace (in the mobile handset space, look at the VIRTUS chipset, which transmits large volumes of data at ultra-high speeds of a reported 2 gigabits per second. That’s 1,000 times faster than Bluetooth!).

In the area of corporate device support, business application delivery and BYOD, we already see a lot of traction in some organisations. Alongside this, new entrants are disrupting the market in mobile payments (such as Monitise).

Lastly, and most surprisingly, whilst financial services see Cloud delivery as a top priority they also see it as Emerging from now through the next 5 years. That can’t be right, can it? (Btw – if you look at the Retail vertical for the same questions, they see all three priorities as Mainstream in the same period).

That brings us back to the question…what do CIOs consider as Innovation? Reading between the lines of the Gartner survey it clearly differs by vertical. Are financial services organisations less innovative? I’m not sure they are…more conservative, perhaps, but that is to be understood to some degree (see the recently launched Fintech Innovation Lab sponsored by Accenture and many FS firms).

No, what would worry me as a leader within FS is the opening comment from Mr Bueller. Technology and Innovation is certainly moving fast and perhaps the pressure on operational efficiencies, whilst undoubtedly needed, could ultimately detract from bringing new innovation to benefit business and drive competitive value?

There is also a risk that in this climate and with barriers to entry reducing, new entrants could actually gain market share with more agile, functionally rich products and services. We wrote before about the rise of new technology entrepreneurs…there is certainly a danger that this talent pool completely by-passes the financial services technology sector.

Perhaps we do need to “take a moment to stop and look around”. Who in our organisation is responsible for Innovation? Do we have effective Process and Governance? Do we nurture ideas from Concept through to Commercialisation? Some food for thought…

London 2012: The Technology Powered Games

Posted on : 28-05-2012 | By : jo.rose | In : General News


With less than 2 months to go until the 2012 Olympics and Paralympics hit London we thought we’d take a look at the role that technology is playing, both supporting the event itself and also some considerations for business operations during a potentially disruptive summer.

The Technology Operations Centre (TOC) in Canary Wharf has been up and running for the last 6 months, from where it will provide central control and monitoring for all of the systems supporting the games, staffed by members of the Organising Committee’s team and the selected delivery partners.

During the Games, the TOC will oversee critical applications, as well as monitoring 900 servers, 1,000 network and security devices and 9,500 PCs. In total over 5,000 technology staff, including 2,500 volunteers, will be involved in delivering the Olympics technology.

Breaking Records already

Not surprisingly, there are some technology firsts at this year’s games along with the usual impressive lists of stats, speeds, feeds etc… Here are a few:

  • Big Data: One of the major challenges for the London 2012 Games will be the sheer amount of data generated. Estimates put this at 30% more than that of the Beijing Olympics four years ago, providing real-time information to fans, commentators and broadcasters around the world.
  • Access Channels: BT’s single voice and data network will provide access to 94 locations (including the Olympic Village and 34 competition venues). It will also support 80,000 connections and includes 5,500km of internal cabling plus 1,800 wireless access points. During the event Virgin Media are also providing free WiFi at 80 underground stations.
  • Bandwidth: As a consequence of the increase in data and access points, London 2012 is going to be very bandwidth-intensive. BT has provisioned four times the network capacity of Beijing and during peak times the network traffic is expected to hit 60Gbps.  To put this in context, the infrastructure is capable of transmitting the entire contents of Wikipedia every 0.5 seconds!
  • Mobile Payments: Another technology to be introduced this year is that of near-field communications mobile payments. Samsung have announced that “a limited edition showcase device enabled with Visa’s mobile payment application, Visa payWave, will be available for Samsung and Visa sponsored athletes and trialists”.
  • 3D TV:  The BBC will broadcast live 3D coverage to homes across the UK as part of a 3D trial, with coverage including the opening and closing ceremonies and the men’s 100m final. The free-to-air broadcast of these events will be available to anyone who has access to a 3D TV set and to HD Channels, regardless of which digital TV provider they use.

One interesting aside, the actual systems recording results will be on a separate network from a security perspective, so “runners” will take physical printouts to 3rd parties following each event.

Contingency Planning

Of course, the other area where technology is playing its part is in providing remote access to applications, systems and data during the games. Numerous governing bodies have advised businesses to allow staff to work from home to ease travel congestion for the duration of the event.

Indeed, many organisations have tested their contingency plans already for the games in terms of remote working (May 18th was “National work from home day”). The policies across organisations differ greatly, and we are not going to get into the productivity debate…but suffice to say that Olympics aside, changing attitudes to home working and the increased availability of cloud based applications could mean that over half of employees will work from home over the next decade anyway (see survey by Virgin Media).

What isn’t certain is the impact that a surge in people going online to watch the London Olympics may have on the internet. The Cabinet Office and the London Games organising committee (LOCOG) are advising businesses that “due to an increased number of people accessing the internet” during this summer’s Games “internet services may be slower” or “in very severe cases there may be drop outs”.

It has also warned that businesses may even face bandwidth rationing…“ISPs may introduce data caps during peak times to try and spread the loading and give a more equal service to their entire customer base.”

This poses a bit of a dilemma given the previous advice about allowing staff to work from home to reduce the impact on transport infrastructure. Businesses need to check with their ISPs regarding their contracts and the expected impact on the service they will be able to offer during the Games, particularly with respect to managing peak demand (corporate networks will also be tested, with homeworkers increasing the demand for collaboration through screen sharing, file transfer and video-conferencing).

Final word to the LOCOG…“In developing your business continuity plan for the Games you will need to ensure that any increase in homeworking is supported by appropriate IT, and that internal systems and ISP’s have been engaged in the planning process so that the demands on the system can be understood and managed.”

Simple advice…sure it will be a great success, with technology playing its part quietly behind the scenes.

Technology Empowerment vs. Frustration: A User(s) Guide

Posted on : 30-04-2012 | By : richard.gale | In : General News

Tags: , , , , , , , , , , , , ,

0

One of the most exciting aspects of running an IT consultancy is the variety of views and opinions we get to hear from our clients, teams, suppliers and partners. This month we want to focus on the relationship between business users of technology and the IT departments that supply solutions. As with most ‘marriages’ this is a complex, ever-changing interaction, but two factors are key to it: empowerment and frustration.

We think we are on the cusp of a major change in the balance of power between users and IT departments. This happens very rarely, so we are watching with interest how it develops over the next few years. Business users are now digitally aware, often frustrated by tech departments and confident enough to bypass them. This is a dangerous time for the traditional IT team; trying to control and close down alternatives would be a mistake, and is probably too late anyway.

The graph below highlights how users’ frustration with IT has increased whilst their ability to control it has diminished. There was a brief (golden?) period in the 1990s when desktop computing and productivity tools helped business users become more self-sufficient, but that was eroded as IT took back control of the desktop.

 

Business Frustration vs. Empowerment 1970 onwards


The 70s – the decade of opportunity

Obviously computing did not start in 1970 (although Unix time did start on Jan 1st that year…), but the ’70s was perhaps the time when IT started making major positive impacts on organisations. Payroll, invoicing, purchasing and accounting functions started to become more widely automated and computerised. The productivity gains from some of these applications transformed businesses, and the suppliers (IBM, IBM, IBM etc.) did phenomenally well out of it. User empowerment was minimal, but frustration was also low as demand for additional functions and flexibility was limited.
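As an aside, that Unix epoch is easy to check for yourself. A minimal Python sketch (purely illustrative):

```python
from datetime import datetime, timezone

# Unix time counts seconds elapsed since the epoch: 1 January 1970 (UTC).
# Timestamp 0 therefore maps to midnight on that date.
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch.isoformat())  # 1970-01-01T00:00:00+00:00
```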

The 80s – growing demands and an awakening workforce

The 1980s saw the rise of the desktop, with Apple and Microsoft fighting for top-dog position. This explosion of functionality was exciting for the home user initially and was quickly utilised and exploited by organisations. Productivity tools such as spreadsheets, word processing and email allowed business users to create and modify their own working practices and processes. The adoption of desktops accelerated towards the end of the decade, so we mark this decade as: empowerment up (and growing) and frustration down.

The 90s – Power to the people (sort of… for a while)

Traditional IT departments recognised the power of the utility PC and adjusted (and grew) to support the business. Networks, and with them file sharing and, as importantly, backups, became the norm. Business departments were becoming more autonomous with the power the PC gave them. Macros and Visual Basic add-ons turned into business-critical applications, and new software was being produced by innovative companies all the time. Business users were free to download and run pretty much anything on their work computer. The complexity of IT infrastructure and applications was increasing exponentially, so inevitably things began to creak and break. End-user applications (or EUCs as they became known) could be intolerant of change (such as a new version of Excel); they were also often put together in an ad-hoc fashion to solve a particular problem and then woven into a complicated business process that became impossible to change. This, with the additional twist of the ‘computer virus’, gave the IT department the opportunity to lock down users’ PCs and force applications to be developed by the new, in-house development teams. Result for the 1990s: user frustrations rising, demands rising and empowerment on the way down.

The 00s – Control and process

The dawn of the new millennium brought the first crash of the dot-coms, and the lockdown of user PCs continued at pace. The impacts from the ’90s – unsupportable applications, viruses, the complexity of the desktop – were joined by higher levels of regulation, audit and internal controls. Combined with a focus on saving money in the still-expanding IT departments, these caused a further reduction in users’ ability to ‘do IT’. In large organisations most PCs were constrained to such an extent that they could only be used for basic email, word processing and Excel (now the only spreadsheet in town). Any new application required for business use had to go through a lengthy process of evaluation, purchasing, configuration, security testing, ‘packaging’ and finally installation. So, inevitably: user frustration was rising to dangerous levels and empowerment was further degraded.

The 10s – A digital workforce demands power

The controls and restrictions of the ’00s then ran into significant budgetary restrictions on IT departments. Costs were, and are, being squeezed; fewer and less experienced resources are dealing with increasing demands and pace. Frustration levels were peaking to the point that relationships between IT and the business were breaking down. Outsourcing parts of IT organisations made some significant budget savings but did nothing to reduce user concerns around delivery and service (at least in the short term).

Some users started to ‘rebel’. The increasing visibility of software as a service (SaaS) enabled certain functions to implement simple but functionally rich solutions for a team or department relatively easily and without much (or any) IT involvement. Salesforce.com did amazingly well with an easy-to-use, globally available, infrastructure-free product which did everything a sales team needed and could be purchased on a credit card and expensed… Internal productivity tools such as SharePoint started being used for complex workflow processes – by the business, without need for IT.

At the same time personal devices such as smartphones, tablets and laptops (BYOD) became the norm for the business community. Users want, and are demanding, the ability to share business data on these tools.

Public cloud usage by business users is also starting to gather pace and the credit card/utility model means some functions do not use IT for certain areas where quick creation and turnaround of data/processing is needed (whether that is wise or not is a different question).

So what are IT departments doing to ensure they can continue to help business units in the future?

  • Become much more focused on business needs (obvious, but it needs to be addressed as a priority)
  • Encourage the use of BYOD – in the end it will save the firm money by not having to purchase hardware
  • Aggressively address traditional structures and costs – asking questions such as
    • “Why can’t we get someone else to run this for us?” – whether outsource, cloud or SaaS
    • “Why don’t you have a SaaS/cloud-enabled product?”
  • Become a service broker to the business – looking ahead and managing services and suppliers rather than infrastructure, applications or process.

User empowerment rising but user demands and frustrations still high

The 20s – Business runs business and Utilities run IT

What will happen in the next few years? Who can tell, but trends we are seeing include:

  • There will be a small number of large firms with massive computing capacity – most other organisations will simply use this power as required.
  • There will be new opportunities for financial engineering, such as exchange-trading computing and processing power, storage and network capacity.
  • IT infrastructure departments in the majority of organisations will have disappeared.
  • IT for business organisations will consist of strategy, architecture, business design, (small, specialised) development focusing on value-add tooling and integration, and relationship and supply management of providers, products and pricing.

All these point to more power for the business user, but one emerging trend which may reverse that is the ongoing impact of legislation and regulation. This could limit the business’s capability to be ‘free’, and the lockdown of IT may begin again – this time imposed more by government onto the external suppliers of the service, resulting in increasing frustration levels and reduced empowerment… It will be interesting to see how this goes.

 

 

Integrated ‘IS command and control’ – can cloud-based services deliver it?

Posted on : 30-04-2012 | By : jo.rose | In : General News

Tags: , , , , , , ,

0

For years, many IS organisations have lacked efficient and effective IS command and control processes. Key IS processes involving internal teams, business stakeholders, projects, executives and third-party providers have fallen short. They largely rely on weak, paper-based processes; reporting has been manual, with dozens of processes run using spreadsheets, documents and shared lists.

Integrated IS governance has been elusive. Convincing senior managers that they have any kind of line-of-sight, and that they are seeing ‘one version of the truth’, has been impossible.

The traditional approach is to spend lots of time and cost worrying about low level integration of data and tools. Investment is spent on workflows and customising SharePoint. But the result can still be a set of shared lists. Some of this is valid, but for the most part it misses the point.

The core information needed to run an effective IS organisation is accessible anyway. It’s just a case of pulling it together into a ‘governance model’ based largely on industry-standard core processes. This needs a consolidation process, coupled with a set of tools for data capture, automation and control. The more effective 80:20 approach delivers IS processes and governance as a series of small steps; impressive results can be obtained in a few weeks.
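To make the consolidation idea concrete, here is a minimal, purely illustrative Python sketch. The data sources, field names and report structure are all hypothetical (not any particular product’s model); the point is simply that separate trackers can be merged into one report without elaborate system integration:

```python
# Hypothetical, ad-hoc sources of the kind usually held in spreadsheets
# and shared lists. Field names here are illustrative only.
projects = [
    {"name": "CRM upgrade", "status": "Amber", "owner": "PMO"},
    {"name": "Data centre move", "status": "Green", "owner": "Infrastructure"},
]
helpdesk = [
    {"service": "Email", "open_incidents": 3},
    {"service": "Trading platform", "open_incidents": 0},
]

def one_version_of_the_truth(projects, helpdesk):
    """Consolidate separate trackers into a single governance view."""
    return {
        "projects_at_risk": [p["name"] for p in projects if p["status"] != "Green"],
        "services_with_incidents": [h["service"] for h in helpdesk if h["open_incidents"] > 0],
    }

report = one_version_of_the_truth(projects, helpdesk)
print(report)
# {'projects_at_risk': ['CRM upgrade'], 'services_with_incidents': ['Email']}
```

In practice the inputs would be pulled from existing tools rather than hard-coded, but the consolidation step itself stays this simple.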

Making a difference

Key processes support a rapid delivery model. A good approach is to look at the three levels of prioritisation below (though never forget that organisations can pick and mix the delivery approach).

Tier 1

  • IS performance visibility and reporting (and connecting this to core data sources such as helpdesk and data centre tools).
  • Service provider reporting (and connecting this to the IS and business value chain).
  • PMO automation (phase 1 – portfolio, programme and project visibility).

Tier 2

  • Business reporting from IS – providing a focused view of IS operational and project performance for each business stakeholder group.
  • Resource management, stakeholder reporting.
  • Automation of business change requests.

Tier 3

  • Business requests for project investment, portfolio prioritisation.
  • Third Party Management for major contracts.
  • Scenario planning for investment.

Approached in this way, organisations can make headway quickly. Much of the tier 1 and 2 priorities can be delivered in 8-12 weeks as long as the following simple rules are adopted:

  • Think about the core ‘decision points’ in and around IS. Buy in a ready-made IS governance model; don’t ask someone to design it from scratch.
  • Go top-down, not bottom-up. You will get there faster and obtain results that will build confidence with the business.
  • Be a ‘consolidator of information’. Don’t get bogged down in trying to build elaborate data and system integration for reporting. Cloud products do this for you and quickly connect to existing data sources.
  • Accept 80:20 completeness. Start from what you have, deliver ‘quick wins’ that make a difference and work outwards in small steps.
  • Look at what you can get out of the box. Consider using a Software-as-a-Service (SaaS) platform (see below). Avoid lengthy and expensive integration projects; our experience is that they rarely deliver effective results.

What’s clear is that a game-changing approach is needed to deliver integrated command and control processes for IS departments. Choosing the right SaaS solution lets you do this: SaaS can be implemented quickly, with impressive and fast results. Look for an out-of-the-box solution that offers nothing less than these key IS governance processes:

  • IS performance reporting to internal and external ‘governance points’.
  • Portfolio, programme and project management.
  • Business workflows for requests and approvals, supporting both the business units and third parties.
  • Common IS-wide processes for issue and risk management, escalation, reporting, resource management, timesheets.
  • Management of third-party service provider contracts, including commercial, operational and transformational performance.

So can cloud services enable integrated IS command and control? In truth, they look like the most strategic way forward for the majority of organisations.

Contribution by Stephen Randall, CEO of Execview, a ground-breaking enterprise platform for delivering and managing business transformation, operational excellence and strategic outcomes, available as SaaS. For more information contact us at info@execview.com or visit www.execview.co.uk