Has the agile product delivery model been too widely adopted?

Posted on : 30-01-2019 | By : richard.gale | In : Uncategorized


As a consultancy, we have the benefit of working with many clients across almost all industry verticals. Specifically, over the last 7-8 years we have seen a huge uptake in the shift from traditional project delivery models towards more agile techniques.

The combination of people, process and technology with this delivery model has been hugely beneficial in increasing both the speed of execution and the alignment of business requirements with products. That said, in more recent years we have observed an almost religious adoption of agile, often, in our view, at the expense of pragmatism and execution focus. A purist approach to agile, where traditional development is completely replaced in one fell swoop, results in failure for many organisations, especially those that rely on tight controls, rigid structures and cost-benefit analysis.

Despite its advantages, many organisations struggle to transition to agile successfully, leading to an unnecessarily high project failure rate. There are several common causes, but chief among them is the lack of an agile-ready culture.

This has been evident in our own client discussions, which have centred around "organisational culture at odds with agile values" and "lack of business customer or product owner availability" as challenges for adopting and scaling agile. Agile as a methodology requires a corresponding agile culture to ensure success. It's no good committing to implementing in an agile way when the organisation is anything but agile!

Doing Agile v Being Agile

Adopting an Agile methodology in an organisation which has not fully embraced Agile can still reap rewards (estimates vary, but a benchmark is around a 20% increase in benefits). If, on the other hand, the firm has truly embraced an agile approach from CEO to receptionist, then the sky is the limit and improvements of 200% plus have been reported!

Investing in the change management required to build an agile culture is the key to making a successful transition to agile and experiencing all of the competitive advantages it affords. Through this investment, your business leadership, IT leadership and IT teams can align, collaborate and deliver quality solutions for customers, as well as drive organisational transformation—both today and into the future.

There are certain projects where shoehorning them into agile processes just serves to slow down delivery with no benefit. Some of this may come from the increase in DevOps delivery, but we see it stifling many infrastructure or underpinning projects, which still lend themselves to a more waterfall delivery approach.

The main difference between agile methodologies and waterfall methodologies is the phased approach that waterfall takes (define requirements, freeze requirements, begin coding, move to testing, etc.) as opposed to the iterative approach of agile. However, there are different ways to implement a waterfall methodology, including iterative waterfall, which still practices the phased approach but delivers in smaller release cycles.

Today, more and more teams would say that they are using an agile methodology, when in fact many of those teams are likely to be using a hybrid model that includes elements of several agile methodologies as well as waterfall.

It is crucial to bring together people, processes and technologies and identify where it makes business sense to implement agile; agile is not a silver bullet. An assessment of the areas where agile would work best is required, which will then guide the transition. Many organisations kick off an agile project without carrying out this assessment and find that following this path is just too difficult. A well-defined transitional approach is a prerequisite for success.

We all understand that today’s business units need to be flexible and agile to survive but following an agile delivery model is not always the only solution.

A few tips for securing data in the cloud

Posted on : 30-11-2016 | By : john.vincent | In : Cloud, Cyber Security, Data, Uncategorized


In our view, we've finally reached the point where the move from internally built and managed technology to cloud-based applications, platforms and compute services is now the norm. There are a few die-hard "remainers", but the public has chosen – the only question now is one of pace.

Cloud platform adoption brings a host of benefits, from agility in deployment and cost efficiency to improved productivity and collaboration, amongst others. Of course, the question of security is at the forefront, and quite rightly so. As I write this, the rolling data breach news continues, with today's being that of potentially compromised accounts at the National Lottery.

We are moving to a world where the governance of cloud-based services becomes increasingly complex. For years organisations have sought to find, capture or shut down internal pockets of "shadow IT", seeing them as a drag on efficiency and a source of risk. In today's new world, however, these shadows are more fragmented, with services and data moving towards the end-user edge of the corporate domain.

So with more and more data moving to the cloud, how do we protect against malicious activity, breaches, fraud or general internal misuse? Indeed, regarding the last point, the Forrsights Security Survey stated:

“Authorised users inadvertently exposing sensitive information was the most common cause of data breaches in the past 12 months.”

We need to think of the challenge in terms of people, process and technology. Often, we have a tendency to jump straight to an IT solution, so let's come to that later. Firstly, organisations need to look at a few fundamental pillars of good practice:

  1. Invest in User Training and Awareness – it is important that all users throughout an organisation understand that security is a collective responsibility. The gap between front and back office operations is often too wide, but in the area of security organisations must instil a culture of shared accountability. Understanding and educating users on the risks, in a collaborative way rather than merely enforcing policy, is probably the top priority for many organisations.
  2. Don't make security a user problem – we need to secure the cloud-based data and assets of an organisation in a way that balances protection with the benefits that cloud adoption brings. Often, the tendency can be to raise the bar to a level that constrains both user adoption and productivity. We often hear that IT is leading the positioning of the barrier irrespective of the business processes or outcomes. This tends to lead to an overly risk-averse approach that ignores the disruption to business processes. The result? Either a winding back of the original solution, or users taking the path of least resistance, which often increases risk.

On the technology side, there are many approaches to securing data in the cloud.  Broadly, these solutions have been bundled in the category of Cloud Access Security Broker (CASB), which is software or a tool that sits in between the internal on-premise infrastructure and the cloud provider, be that software, platform or other kind of as-a-service. The good thing about these solutions is that they can enforce controls and policies without the need to revert to the old approach of managing shadow IT functions, effectively allowing for a more federated model.

Over recent years, vendors have come to market to address the issue through several approaches. One of the techniques is through implementing gateways that either use encryption or tokenisation to ensure secure communication of data between internal users and cloud based services. However, with these the upfront design and scalability can be a challenge given the changing scope and volume of cloud based applications.
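
To make the tokenisation idea concrete, here is a minimal Python sketch of a token vault: sensitive values are swapped for opaque tokens before data crosses the corporate boundary, and the mapping never leaves the premises. The class and field names are invented for illustration; real gateway products add key management, persistence and format-preserving tokens.

```python
import secrets

class TokenVault:
    """Illustrative in-memory vault: swaps sensitive values for opaque
    tokens before data leaves the corporate boundary, and swaps them
    back when data returns. The mapping itself stays on-premise."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenise(self, value: str) -> str:
        # Reuse the existing token so the cloud side sees a stable identifier
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenise(self, token: str) -> str:
        return self._token_to_value[token]

vault = TokenVault()
record = {"name": "Alice Example", "account": "GB29NWBK60161331926819"}
outbound = {k: vault.tokenise(v) for k, v in record.items()}
# The cloud service only ever stores tokens; detokenising restores the record
restored = {k: vault.detokenise(v) for k, v in outbound.items()}
assert restored == record
```

The scalability challenge mentioned above shows up here too: every application that needs the real values must route through the vault, which is why upfront design matters as the number of cloud applications grows.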

Another solution is to use an API-based approach, such as that of Cloudlock (recently purchased by Cisco). This platform uses a programmatic approach to cloud security on key SaaS platforms, addressing areas such as Data Loss Prevention, Compliance and Threat Protection with User and Entity Behaviour Analytics (UEBA). The last of these uses machine learning to detect anomalies in cloud activities and access.
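
As a rough illustration of the anomaly detection idea, the sketch below flags days whose activity deviates sharply from a user's own baseline. This is a deliberately simplified statistical stand-in for the machine learning a real UEBA product would apply; the function name, threshold and data are all invented for illustration.

```python
from statistics import mean, stdev

def flag_anomalies(daily_downloads, threshold=3.0):
    """Flag days whose activity deviates from the user's baseline by
    more than `threshold` standard deviations -- a toy stand-in for
    the behavioural models a real UEBA product uses."""
    mu, sigma = mean(daily_downloads), stdev(daily_downloads)
    return [i for i, n in enumerate(daily_downloads)
            if sigma and abs(n - mu) / sigma > threshold]

# Four weeks of normal activity, then a bulk-download spike on the last day
history = [12, 9, 11, 10, 8, 13, 11] * 4 + [10, 900]
print(flag_anomalies(history))  # → [29]
```

The point is the shape of the approach, not the maths: the baseline comes from the user's own history, so the same absolute volume can be normal for one user and anomalous for another.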

Hopefully some food for thought in the challenge of protecting data in the cloud, whichever path you take.

The Blockchain Revolution

Posted on : 28-08-2015 | By : richard.gale | In : Cyber Security


We've been excited by the potential of blockchain, and in particular bitcoin technology, for a while now (Bitcoins: When will they crash? More on Bitcoins… Is someone mining on my machine?). We even predicted that bitcoins would start to go mainstream in our 2015 predictions. We may be a little ahead of ourselves there, but the possibilities of the blockchain, the underpinning technology of cryptocurrencies, are starting to gather momentum in the financial services world.

Blockchain technology contains the following elements, which are essential to any financial transaction:

  1. Security – Blockchain data is secure as each part of the chain is linked to the next, and many copies of that data are stored among the many thousands of ‘miners’ in an encrypted format that is currently considered impractical to break. Even if a proportion of these miners were corrupt with criminal intent, the voting of the majority would ensure integrity
  2. Full auditability – Every block in the chain has current and historic information relating to that transaction, the chain itself has everything that ever happened to it. The data is stored in multiple places and so there is a very high degree of assurance that the account is full and correct
  3. Transparency – All information is available in a consistent way to anyone with a valid interest in the data
  4. Portability – The information can be available anywhere in the world, apart from certain governments’ legislation there are few or no barriers to trade using blockchain technology
  5. Availability – There are many copies of each blockchain available in virtually every part of the world, so blockchains should always be available for use
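
The security and auditability points above come down to the same mechanism: each block embeds a hash of its predecessor, so tampering with any historic entry invalidates every link after it. A minimal Python sketch of that idea (not a real implementation: there is no mining, networking or consensus here, and the transaction strings are invented):

```python
import hashlib
import json

def block_hash(fields):
    # Hash the block's contents, including the previous block's hash --
    # this inclusion is what chains each block to its predecessor
    return hashlib.sha256(json.dumps(fields, sort_keys=True).encode()).hexdigest()

def add_block(chain, payload):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "payload": payload, "prev_hash": prev}
    block["hash"] = block_hash({k: block[k] for k in ("index", "payload", "prev_hash")})
    chain.append(block)

def verify(chain):
    for i, b in enumerate(chain):
        expected = block_hash({k: b[k] for k in ("index", "payload", "prev_hash")})
        if b["hash"] != expected:
            return False                      # block contents were altered
        if i and b["prev_hash"] != chain[i - 1]["hash"]:
            return False                      # link to predecessor is broken
    return True

chain = []
for tx in ["alice->bob:10", "bob->carol:4", "carol->dave:1"]:
    add_block(chain, tx)
assert verify(chain)

chain[1]["payload"] = "bob->mallory:4000"     # tamper with history...
assert not verify(chain)                      # ...and verification fails
```

In a real network, thousands of nodes each hold a copy of the chain and vote on the valid version, which is what turns this simple tamper-evidence into the integrity guarantee described above.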

The blockchain technology platform is flexible enough to incorporate additional functions and processes without compromising its underlying strengths.

All major banks and a number of innovative startups are looking at ways blockchain can change the way transactions are executed. There are significant opportunities for both scale and efficiency using this technology. Areas being researched include:

  • Financial trading and settlement. Fully auditable, automated chain of events with automated payments, reporting and completion globally and instantly
  • Retail transactions. End-to-end transactions delivered automatically without the opportunity for loss or fraud
  • Logistics and distribution. Automatically attached to physical and virtual goods with certified load information enabling swift transit across nations
  • Personal data. Passports, medical records and government related information can be stored encrypted but available and trusted

There are still some significant challenges with blockchain technology:
  1. Transactional throughput – low by banking standards (tens of transactions per second at present rather than tens of thousands)
  2. Fear and lack of understanding of the technology – this is slowing down thinking and adoption
  3. Lack of skills to design and build – scarce resources in this space and most are snapped up by start-ups
  4. Complexity and lack of transparency – even though the technology itself is transparent, the leap from the decades-old processes used in banks’ back offices, for example, to a blockchain programme can be a large one. In the case of time-critical trading or personal information, security concerns about who can view the data come to the fore.
  5. Will something else replace it? – will the potentially large investment in the technology be wasted on the ‘next big thing’?

We think blockchain could have a big future. Some people are even saying it will revolutionise government, cutting spending by huge amounts. If blockchain transactions were used to buy things, then sales tax and the various amounts due to retailers, wholesalers and manufacturers could be paid immediately and automatically. The salesperson could have their blockchain credit straightaway too.
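
The automatic-settlement idea above is, at heart, simple arithmetic executed at the moment of sale. A toy Python sketch, with every party and percentage invented purely for illustration:

```python
def settle(sale_price: float, splits: dict) -> dict:
    """Toy illustration of automatic settlement: each party in the chain
    is paid its share the instant the transaction completes, rather than
    through weeks of invoicing and reconciliation."""
    assert abs(sum(splits.values()) - 1.0) < 1e-9, "shares must total 100%"
    return {party: round(sale_price * share, 2) for party, share in splits.items()}

payouts = settle(120.00, {"sales_tax": 0.20, "retailer": 0.35,
                          "wholesaler": 0.20, "manufacturer": 0.20,
                          "salesperson": 0.05})
# → {'sales_tax': 24.0, 'retailer': 42.0, 'wholesaler': 24.0,
#    'manufacturer': 24.0, 'salesperson': 6.0}
```

On a blockchain the equivalent logic would run as part of the transaction itself, which is where the efficiency and anti-fraud claims come from: there is no separate reconciliation step in which money can go missing.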

Blockchains could remove huge levels of inefficiency and potential for fraud. They could also put a significant number of jobs at risk, as reflected in John Vincent’s article on the future of employment.

Agile. Is it the new name for in-sourcing?

Posted on : 30-01-2015 | By : richard.gale | In : Innovation


Business, IT and clothing are all similar in so much as they can lead and follow fashions and trends.

Looking at IT specifically, there is a trend to commoditise and outsource as much as possible to concentrate on the core ‘business’ of growing a business. As we all know, this has many advantages for the bottom line and keeps the board happy, as there is certainty of service and cost, headcount is down and the CIO has something to talk about in the exec meetings.

At the coalface the story is often a different one, with users growing increasingly frustrated with the SLA-driven service, business initiatives starting to be strangled by cumbersome change processes, and support often resting in the hands of a dwindling number of IT staff with deep experience of the applications and organisation.

So a key question is: how do we tackle the upward-looking cost/headcount/service mentality whilst keeping the ability to support and change the business in a dynamic, fulfilling way?

Agile is a hot topic in most IT and business departments. It emerged from several methodologies of the 1990s, with roots back to the ’60s, and has taken hold as a way of delivering change quickly to a rapidly changing business landscape.

At its core Agile relies on:

  • Individuals & interaction – over process and tools
  • Customer communication & collaboration in the creation process – over agreeing scope/deliverables up front
  • Responsiveness to changing demands and environment – over blinkered adherence to a plan

The basis of Agile, though, relies on a highly skilled, articulate, business- and technology-aware project team that is close to, and includes, the business. In theory this is not the opposite of an outsourced, commodity-driven approach, but in reality the outcome often is.

When we started working on projects in investment organisations in the early ‘90s most IT departments were small, focused on a specific part of the business and the team often sat next to the trader, accountant or fund manager. Projects were formal but the day to day interaction, prototyping, ideas and information gathering could be very informal with a mutual trust and respect between the participants. The development cycle was often lengthy but any proposed changes and enhancements could be story boarded and walked through on paper to ensure the end result would be close to the requirement.

In the front office, programmers would sit next to the dealer, and systems changes and tweaks would be delivered almost in real time to react to a change in trading conditions or new opportunities (it is true to say this is still the case in the more esoteric trading world, where the split between trader and programmer is very blurry). This world, although unstructured, is not that far away from Agile today.

Our thinking is that businesses and IT departments are increasingly using Agile not only for its approach to delivering projects but also, perhaps unconsciously, as a method of bypassing the constraints of the outsourced IT model. The utilisation of experienced, skilled, articulate, geographically close resources who can think through and around business problems is starting to move otherwise stalled projects forward, enabling the business to develop and grow.

The danger is, of course, that as it becomes more fashionable, Agile will become mainstream (some organisations have already built offshore Agile teams) and then ‘last year’s model’, or obsolete. We have no doubt that a new, improved ‘next big thing’ will come along to supplant it.

 

Broadgate Predictions for 2015

Posted on : 29-12-2014 | By : richard.gale | In : Innovation


We’ve had a number of lively discussions in the office and here are our condensed predictions for the coming year.  Most of our clients work with the financial services sector so we have focused on predictions in these areas.  It would be good to know your thoughts on these and your own predictions.

 

Cloud becomes the default

There has been widespread resistance to the cloud in the FS world. We’ve been promoting the advantages of demand-based or utility computing for years, and in 2014 there seemed to be acceptance that cloud (whether external applications such as SalesForce or on-demand platforms such as Azure) can provide advantages over traditional ‘build and deploy’ set-ups. Our prediction is that cloud will become the ‘norm’ for FS companies in 2015; building in-house will become the exception, mostly reserved for integration.

‘Intrapreneur’ becomes widely used (again)

We first came across the term intrapreneur in the late ’80s in the Economist. It highlighted some forward-thinking organisations’ attempts to change culture: to foster, employ and grow internal entrepreneurs, people who think differently and have a start-up mentality within large firms, to make them more dynamic and fast-moving. The term came back into fashion in the tech boom of the late ’90s, mainly via large consulting firms desperate to hold on to their young, smart workforce that was being snapped up by Silicon Valley. We have seen the resurgence of that movement, with banks competing with tech for the top talent and the consultancies trying to find enough people to fulfil their client projects.

Bitcoins or similar become mainstream

Crypto-currencies are fascinating. Their emergence in the last few years has only really touched the periphery of finance: starting as an academic exercise, being used by underground and cyber-criminals, then adopted by tech-savvy consumers and firms. We think there is a chance a form of electronic currency may become more widely used in the coming year. There may be a trigger event – such as rapid inflation combined with currency controls in Russia – or a significant payment firm, such as MasterCard or PayPal, may start accepting it.

Bitcoins or similar gets hacked so causing massive volatility

This is almost inevitable. The algorithms and technology mean that Bitcoin will be hacked at some point. This will cause massive volatility and loss of confidence, and then its demise, but a stronger currency will emerge. The reason it is inevitable is that the tech used to create bitcoins relies on the speed of computer hardware to slow their creation. If someone works around this or utilises an as-yet-undeveloped approach such as quantum computing, then all bets are off. Also, and perhaps more likely, someone will discover a flaw or bug in the creation process, short-cut the process or just up the numbers in their account and become (virtually) very rich very quickly.
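
The “speed of computer hardware” point refers to proof-of-work: creating a block means brute-forcing a hash puzzle, so faster hardware (or a mathematical shortcut) directly changes how quickly coins can be created. A toy Python sketch of that puzzle, with the block data and difficulty invented for illustration:

```python
import hashlib
from itertools import count

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash starts with `difficulty` zero hex
    digits. There is no shortcut other than brute force, so the rate of
    block creation is bounded by raw hashing speed -- the property the
    article refers to."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

nonce = mine("block-42", difficulty=4)
digest = hashlib.sha256(f"block-42{nonce}".encode()).hexdigest()
assert digest.startswith("0000")
```

Each extra zero of difficulty multiplies the expected work by sixteen; a quantum algorithm or a flaw that let someone skip this search is exactly the “all bets are off” scenario described above.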

Mobile payments, via a tech company, become mainstream

This will be one of the strongest growth areas in 2015. Apple, Google, PayPal, Amazon, the card companies and most of the global banks are desperate to get a bit of the action. Whoever gets it right – with trust and easy-to-use, great products – will make a huge amount of money, tie consumers to their brand and also know a heck of a lot more about them and their spending habits. Payments will only be the start; banking accounts and lifestyle finance will follow. This one product could transform technology companies (as they are the ones most likely to succeed) beyond recognition and make existing valuations seem minuscule compared to their future worth.

Mobile payments get hacked

Almost as inevitable as bitcoins getting hacked. Who knows when or how, but it will happen; it will not have as great an impact, though, as it will on the early crypto-currencies.

Firms wake up to the value of Data Science over Big Data

Like cloud, big data is something many firms have been talking up over the last couple of years. We still see situations where people are missing the point. Loading large amounts of disparate information into a central store is all well and good, but asking the right questions of it and understanding the outputs is what it’s all about. If you don’t think about what you need the information for, then it will not provide value or insight to your business. We welcome the change in thinking from Big Data to Data Science.

The monetisation of an individual’s personal data results in a multi-billion dollar valuation for an unknown start-up

A long sentence… but the value of people’s data is high and the price firms currently pay for it is low to nothing. If someone can start to monetise that data, it will transform the information industry. There are companies and research projects out there working on approaches and products. One or more will emerge in 2015, to be bought by one of the existing tech players or to become that multi-billion dollar firm. They will have the converse effect on Facebook, Google and others that rely on that free information to power their advertising engines.

Cyber Insurance becomes mandatory for firms holding personal data (OK maybe 2016)

It wouldn’t be too far-fetched to assume that all financial services firms are currently compromised, either internally or externally. Most firms have encountered either direct financial or indirect losses in the last few years. Cyber or internet security protection measures now form part of most companies’ annual reports. We think that, in addition to the physical, virtual and procedural protection, there will be huge growth in cyber-insurance protection, and it may well become mandatory in some jurisdictions, especially where personal data protection is concerned. Insurance companies will make sure there are levels of protection in place before they insure, so forcing companies to improve their security further.

Regulation continues to absorb the majority of budgets….

No change then.

We think 2015 is going to be another exciting year in technology and financial services and are really looking forward to it!

 

Highlights of 2014 and some Predictions for 2015 in Financial Technology

Posted on : 22-12-2014 | By : richard.gale | In : Innovation


A number of emerging technology trends have impacted financial services in 2014. Some of these will continue to grow and enjoy wider adoption through 2015 whilst additional new concepts and products will also appear.

Financial Services embrace the Start-up community

What has been apparent, in London at least, is the increasing connection between tech and FS. We have been pursuing this for a number of years by introducing great start-up products and people to our clients, and the growing influence of TechMeetups, Level39 etc. within the financial sector follows this trend. We have also seen some interesting innovation with seemingly legacy technology – our old friend Lubo from L3C offers mainframe ‘on demand’ and cut-price, secure Oracle databases and IBM S3 in the cloud! Innovation and digital departments are the norm in most firms now, staffed with clever, creative people encouraging often slow-moving, cumbersome organisations to think and (sometimes) act differently and to embrace different ways of thinking. Will FS fall out of love with tech in 2015? We don’t think so. There will be a few bumps along the way, but the potential, upside and energy of start-ups will start to move deeper into large organisations.

Cloud Adoption

FS firms are finally facing up to the cloud. Over the last five years we have bored too many people within financial services talking about the advantages of the cloud. Our question ‘why have you just built a £200m datacentre when you are a bank, not an IT company?’ was met with many answers, but two themes were ‘Security’ and ‘We are an IT company’… Finally, driven by user empowerment (see our previous article on user frustration vs. empowerment), banks and other financial organisations are embracing the cloud, mainly with SaaS products and IaaS using private and public clouds. The march to the cloud will accelerate over the coming years. Looking back from 2020, we expect to see massively different IT organisations within banks: the vast majority of infrastructure will be elsewhere, development will be done by the business users, and the ‘IT department’ will be a combination of rocket-scientist data gurus and procurement experts managing and tuning contracts with vendors and partners.

Mobile Payments

Mobile payments have been one of the most discussed subjects of the past year. Not only do mobile payments let customers pay without getting their wallets out, but paying with a phone or wearable will be the norm in the future. With new entrants coming online every day, offering mobile payment solutions that are faster and cheaper than competitors’ is on every bank’s agenda. Labelled ‘disruptors’ due to the disruptive impact they are having on businesses within the financial services industry (banks in particular), many of these new entrants are either large non-financial brands with a big customer base or start-up companies with fresh new solutions to existing issues.

One of the biggest non-financial companies to enter the payments sector in 2014 was Apple. Some experts believe that Apple Pay has the power to disrupt the entire sector. Although Apple Pay has 500 banks signed up, and there is competition among card issuers to get their card set as the default option on Apple devices, some banks are still worried that Apple Pay and other similar services will make their branches less important. If Apple chose to go into retail banking seriously by offering current accounts, then the banks would have plenty more to worry about.

Collaboration

The fusion of development, operations and business teams to provide agile, focussed solutions has been one of the growth areas of 2014. The ‘DevOps’ approach has transformed many otherwise slow, ponderous IT departments, getting them talking to the business and operational consumers of their systems and providing better, faster and closer-fit applications and processes. This trend is only going to grow, and 2015 may be the year it really takes off. The repercussion for 2016 is that too many projects will become ‘DevOpped’ and start failing through focussing on short-term solutions rather than long-term strategy.

Security

Obviously the Sony Pictures hack is on everyone’s mind at the moment, but protection against cyber attack from countries with virtually unlimited will, if not resources, is a threat that most firms cannot defend against. Most organisations have had a breach of some type this year (and the others probably don’t know it’s happened). Security has risen up to the boardroom, and threat mitigation is now published in most firms’ annual reports. We see three themes emerging to combat this:

– More of the same, more budget and resource is focussed on organisational protection (both technology and people/process)
– Companies start to mitigate with the purchase of Cyber Insurance
– Governments start to move from defence/inform to attacking the main criminally or politically motivated culprits

We hope you’ve enjoyed our posts over the last few years and we’re looking forward to more in 2015.

Twitter.com/broadgateview

 

 

Cloud as an “Innovation Enabler”

Posted on : 30-06-2014 | By : john.vincent | In : Cloud


It seems that most people we come across in our daily activities now agree that cloud computing is a key disrupter to “traditional” technology service delivery. We no longer start conversations with “let’s define what we mean by cloud computing” or “cloud means different things to different people”, or have to ensure all documents carry descriptions of public, private and hybrid cloud as laid out by NIST (the National Institute of Standards and Technology).

People get cloud now. Of course, there are still naysayers and those that raise the security, compliance or regulatory card, but those voices are becoming fainter (indeed, if you look closer you’ll often find that the objection actually stems from a cultural fear, such as loss of control).

If we look at the evolution and adoption of cloud technology, it has predominantly been focused around two business drivers, efficiency and agility. The first of these took some time just from an economic business case perspective. As with most new technologies or ideologies, economies of scale create the tipping point for accelerating adoption but we have now reached the point where the pure cost benefits of on-demand infrastructure are compelling when compared to the internally managed alternative.

The agility angle requires more of a shift in the operating model and mindset for technology organisations. CIOs are generally used to owning and managing infrastructure in “tranches” – deploying additional compute capability for new applications or removing it for consolidation, rationalisation and changes in business strategy.

What cloud technologies provide is the capability for matching demand and supply of compute resource without step changes. To deliver this, however, requires improved forecasting, provisioning and monitoring processes within the technology organisation.

So that’s where most organisations have positioned the cloud. However, what about using cloud to drive business innovation?

A recent McKinsey study on Cloud and Innovation made the following point:

The problem in many cases is that adopting cloud technologies is an IT initiative, which means that cloud solutions are all around improving IT and IT productivity. But that’s not where growth is going to come from. . . Incremental investments in productivity don’t drive growth. . . Investments need to go into innovation and disruptive business models . . . Unless companies are asking themselves how to use the cloud to disrupt their own business models or someone else’s, then adopting the cloud is just another IT project.

This observation encapsulates the current situation well – we often see cloud in the category of “another IT project”. We saw something similar with the whole “Big Data” hype (not that we like that label) in recent years, when some IT organisations were building capabilities with products like Hadoop without really knowing what the business objectives or value were. Sound familiar?

Building further on this, we see the problem with driving innovation through cloud based technology as two-fold.

Firstly, many organisations still struggle to foster innovation, whether within the company boundaries or via external ventures and partnerships. We have written about this in previous articles (here, as related to innovation in banks). Although things are developing, with companies building “Digital Business Units” as completely separate entities (staffed with both business and IT stakeholders) or sponsoring/funding start-up programmes, it is still too slow. Sadly, innovation is too often just an objective on a performance appraisal which was “achieved” through something fairly uninspiring.

The second point is that positioning cloud technology as an enabler of innovation requires a high degree of abstraction between current and future state. It needs people to work together who understand and can shape:

  • The current value of a business and history from a people, process, asset and customer perspective
  • How cloud technology can innovate and underpin new digital channels, such as mobile, social, payments, the internet-of-things and the like
  • How to change the mindset of peer C-level executives to embrace the “art of the possible” – to take decisions that will bring a step change in the company’s client services

The challenge facing many organisations is that the shift to innovative cloud-based services, which connect clients, services, data and devices on a potentially huge scale, is not supported by traditional technology architectures. It jars with the old, tried-and-tested way of designing technology infrastructure within defined boundaries.

However, if organisations do not adapt and innovate, then the real threat comes from those companies who know nothing other than “innovating in the cloud”. They started there, and use it not only as an efficiency and agility tool but to deliver new and disruptive cloud-based business services. To compete, traditional organisations will need to evolve their own cloud-based innovation.

“Software Defined Everything” powering next generation data centres

Posted on : 23-12-2013 | By : john.vincent | In : Data

Tags: , , , , , , , ,

1

In October we wrote about how technology is outpacing IT departments and how organisations are reacting to catch up (or trying to). We won’t go back over the reasons for this, but one topic that is interesting to explore further, particularly as it relates to this speed of change, is the shift to “Software Defined” technology.

The concepts have been around for a few years now and the pieces of the puzzle are maturing into what some are labelling the Software Defined Data Centre (SDDC). We’ll look at these pieces a little more, but below is the Forrester definition of SDDC:

“…an abstracted and pooled set of shared resources. But the secret sauce is in the automation that slices up and allocates those shared resources on-demand, without manual tinkering.”

I like this simple statement, although I’m sure on the technology side many will reject the word “tinkering” ;-).
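That “secret sauce” of automated, on-demand allocation can be caricatured in a few lines of Python. This is a minimal, invented sketch (the class, tenant names and unit counts are ours, not any vendor API) of a shared pool that slices itself up for tenants and reclaims capacity without manual tinkering:

```python
class ResourcePool:
    """Toy model of a pooled set of shared resources with on-demand allocation."""

    def __init__(self, total_units):
        self.total_units = total_units
        self.allocations = {}  # tenant name -> units currently held

    def available(self):
        # Whatever is not allocated remains in the shared pool.
        return self.total_units - sum(self.allocations.values())

    def allocate(self, tenant, units):
        # Slice off a share of the pool for this tenant, on demand.
        if units > self.available():
            raise RuntimeError(
                f"pool exhausted: {units} requested, {self.available()} free")
        self.allocations[tenant] = self.allocations.get(tenant, 0) + units

    def release(self, tenant):
        # Return the tenant's slice to the pool - no manual tinkering.
        return self.allocations.pop(tenant, 0)


pool = ResourcePool(total_units=100)
pool.allocate("app-a", 40)
pool.allocate("app-b", 30)
print(pool.available())   # 30 units left in the shared pool
pool.release("app-a")
print(pool.available())   # 70 - app-a's slice is back in the pool
```

The real value in an SDDC is, of course, that this allocate/release loop is driven by policy and demand rather than by a human raising a ticket.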

The SDDC market is being driven predominantly by innovations in processing power and memory, increased demand for resource pooling and the underpinning networking configuration. There has also been a shift in pricing models from hardware-based to software-based, alongside an increased acceptance of the need for multi-tenancy support and an erosion of vendor lock-in. Companies providing software defined solutions and virtualisation are positioning for growth and competitive advantage in this market, creating new solutions and intelligent, integrated management platforms.

In 2013 the global SDDC market was estimated at $396 million, and it is expected to grow to $5.41 billion by 2018. Pretty impressive if the forecasts are correct!

So, what are the components that form the SDDC?

  1. Software Defined Compute – virtualisation is arguably the biggest game changer of the last decade in how technology services are delivered. Companies like VMWare have provided a solution for IT organisations to break down the physical barriers of compute resources and provide a much more agile infrastructure. Developments in the Software Defined Server will continue with servers optimised for data centre/cloud workloads, such as the HP Moonshot, which uses 89 percent less power and 80 percent less space than traditional server systems and claims to reduce complexity by 97 percent.
  2. Software Defined Storage – puts the emphasis on storage services instead of storage hardware, in functional areas such as replication, de-duplication and policy based management. The abstraction between the software and hardware allows for greater flexibility without having to consider the infrastructure attributes, allowing for storage to become a logical shared pool on commodity hardware. There are numerous vendors that claim to be in this space, with the number reaching “epidemic levels”, so expect to see a further tightening of the definition and consolidation over the coming years.
  3. Software Defined Network – we wrote about SDN at the beginning of this year as the last piece in the “virtualisation puzzle”. Essentially, SDN decouples the network services from the underlying infrastructure (i.e. the network interface level and associated networking software/protocols). By doing this a greater degree of flexibility can be achieved – it essentially eliminates the need for applications to understand the internal workings of the technical network components, such as routers, bridges and switches.
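The decoupling described in the third point can be sketched in miniature. Below is an invented toy (not any real SDN controller API): the application declares *what* connectivity it wants, and the controller programmes the matching rule into every underlying device, so the application never needs to know about routers, bridges or switches:

```python
class SdnController:
    """Toy controller: translates high-level connectivity intents into
    per-switch flow rules, hiding the network internals from applications."""

    def __init__(self, switches):
        self.switches = switches                    # devices on the path
        self.flow_tables = {s: [] for s in switches}

    def allow(self, src, dst, port):
        # The application states its intent; the controller pushes the
        # corresponding rule to each device in the path.
        rule = {"match": (src, dst, port), "action": "forward"}
        for switch in self.switches:
            self.flow_tables[switch].append(rule)
        return rule


ctrl = SdnController(["edge-1", "core-1", "edge-2"])
ctrl.allow("app-server", "db-server", 5432)
print(len(ctrl.flow_tables["core-1"]))   # 1 - every switch now carries the rule
```

In a production SDN the southbound programming would go over a protocol such as OpenFlow, but the separation of intent from device configuration is the same idea.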

The roadmap to SDDC

Just looking at the projected growth numbers, you can see that we have some way to go on the SDDC journey. Indeed, the reality of piecing these solutions together into a complete, dynamic, automated and efficient SDDC just isn’t there yet.

However, what organisations can do now is build the overall roadmap and associated architecture to get them to SDDC. Whilst the full benefit will take some time, the good news is that each step delivers a positive benefit which can be realised immediately.

Some of the considerations/directional pointers in the roadmap:

  • Keep it Open – build a software defined environment based on a virtualised data centre that includes compute, storage and networking resources built on open interfaces and an integrated framework (such as OpenStack).
  • Think about security – SDDC will significantly change the way organisations have to think about security. With solutions further abstracted into software and cyber-attacks on the increase, it is important that the security risks are addressed.
  • Change the Operating Model – as the roadmap evolves, so the organisation needs to change with it. Technology skills in areas such as automation and provisioning will become more important, as will softer skills such as the ability to manage the business demand pipeline and vendors.

We’re not producing a “Broadgate Predicts” this year, but safe to say, 2014 will be interesting to watch in terms of Software Defined Everything.

—–

If you would like to find out more, we are pleased to recommend the next IT in Business event on Tuesday 28th January where this topic will be discussed with peer organisations – click here to register.

 

 

Has technology outpaced internal IT departments?

Posted on : 31-10-2013 | By : john.vincent | In : Data

Tags: , , , , , , , , , , , ,

4

In technology we love to put a box around something, or define it in a clear and concise way. Indeed, it makes a lot of sense in many technical disciplines to do this, such as architecture, development, processes, policies, infrastructure and so on. We talk about “the stack”, or “technology towers”, or “Reference Architectures”…it provides a common language for us to define our compute needs. “This switch operates at layer 3 versus layer 2” etc…

In the same way we put our technology human capital into nice, neat boxes. Simple, repeatable stuff: 1) Open up Powerpoint…2) Insert SmartArt…3) Hierarchy-Organisation Chart…and away we go. CIO, next level… Head of Infrastructure, Head of Operations, CTO, Head of Applications, Head of Networks, Architecture, COO (can’t have enough of those)…

The general taxonomy of technology organisations has barely changed since the mid-1980s and actually, until maybe the last 5 or so years, this has been fine. Whilst technology has evolved, it has done so “within the boxes”. We have gone through shifts in operating model and approach, from mainframe to distributed and back again, but the desktop, data, storage, server, mid-range and so on services have remained, and with them the support organisations around them.

However, things are somewhat different now. The pace of change through Consumerisation, Commoditisation and Cloud (the 3Cs) has redefined the way that businesses engage and capitalise on technology in both work and home life. At the forefront, it comes down to three main business drivers:

  • Increased Agility – access to applications and service provisioning should be as close to instantaneous as the laws of physics will allow
  • Increased Mobility – the ability to access applications anywhere, on any device at any time
  • Increased Visibility – a rich data and application environment to improve business intelligence and decision making

To the end user, everything else is just noise. Security, availability, DR, performance, big data analytics…this just gets sorted. Apple does it. Amazon does it, therefore my IT organisation should be the same. In fact better.

So, how does the traditional IT organisation fit with the new paradigm? Well, the 3Cs certainly provide significant challenges. The issue is that you have something that was previously contained within a silo now breaking down the barriers. Today’s compute requirements are “fluid” in nature and don’t fit well with the previous operating models. Data, once centralised, contained and controlled, is now moving to the organisational edges. Applications need to be accessible through multiple channels and deployed quickly. Resources need to scale up (and down) to meet, and more importantly match, business consumption.

How does the organisation react to these challenges? Does it still fit neatly into a “stack” or silo? Probably not. How many people, processes and departments does the service pass through in order to provision, operate and control? Many, in most cases. Can we apply our well-constructed ITIL processes and an SLA? No. Can we scale quickly for new business requirements from a people perspective? Unlikely…

So what is the impact? Well, it wasn’t that long ago that CIOs spent much of their time declaring war on Shadow IT departments within business functions. With “Alex Ferguson-like” vigour they either moved them into the central technology organisation or squeezed them out, through cost or service risk.

However, it seems that the Shadow IT trend is back. Is this a reaction to the incumbent organisation being unable to provide the requisite level of service? Probably.

I guess the question we should ask is whether the decentralised model, giving more autonomy to business users for certain functions, is actually where we should be heading anyway. Even within IT departments, the split between ownership, definition and execution of services has evolved through global standards and regional/local service deployment. Now perhaps it’s time to go further and really align the business and technology service delivery, with a much smaller central control of the important stuff, like security, architecture, underpinning services (like networks), vendor management and disaster recovery.

And then there’s the question of who actually needs to run the underlying technology “compute”. The cloud naysayers are still there although the script is starting to wear a bit thin. There are very few sacred cows…can internal teams really compete long term? The forward thinking are laying out a clear roadmap with targets for cloud/on-demand consumption.

The old saying of “we are a [insert business vertical], not an IT company” is truer today than ever. It may just be that it took the 3Cs to force the change.

The aggregation of marginal gains – what can we learn from the sport of cycling?

Posted on : 30-09-2013 | By : richard.gale | In : General News

Tags: , , , , , , , , , ,

0

Sir David Brailsford is the major driver behind a revolution in the fortunes of British Cycling. The UK is now one of the most successful cycling nations with two successive Tour de France winners from Team Sky, a team that was put together barely 4 years ago. Fifteen years ago British cycling was languishing in the lower divisions, now it is riding high in the world rankings.

One of the most interesting techniques Brailsford has applied to cycle coaching is the “aggregation of marginal gains” – the sum of analysing and making many small changes to an environment or training plan. Many examples have been quoted, such as heating bib shorts before use to keep the muscles warm, wiping tyres down with alcohol before the start of races to clean grit off, and employing a chef to provide optimised meals for the riders.

One specific example of this is the Team Sky bus. Every competitor has a bus but, before Brailsford and his team, none had thought about it in the same way. Team Sky started from scratch and built it out to provide the perfect environment to support the riders on the tours. Every part of the riders’ routine was analysed and an environment was then designed to meet their needs perfectly. Riders need lots of clean, dry kit, they need lots of nutritious, interesting food, and they need somewhere private to discuss the day’s events and plan for the next one. So the bus included washing machines (muffled, of course), meeting rooms, and kitchen and sleeping areas customised for the riders.

The attention to detail (and an almost unlimited budget) showed through when two brand new Volvo coaches were torn apart and 9,000 man-hours of kitting out took place. This process involved the coaches, riders and other staff, with continuous feedback refining the result until the two buses were, in effect, an additional pair of team members. Initially the rival teams dismissed the buses, nicknamed “Death Stars”, as just another bus (albeit an expensive one – they ended up costing around £750k each), but as Sky’s daily results on the tours jumped up the leader boards they came to learn from and respect the thought processes involved.

So what lessons can we learn from the Sky approach? Well, the techniques they use have been borrowed from business, but it is the consistent application of them which makes them work so well.

GB cycling & the Sky team have a similar philosophy based on the following core principles:

Setting ambitious goals

From a standing start in 2010, Brailsford said Team Sky would win the Tour de France within five years. This was seen as ludicrous by the cycling establishment. He disrupted conventional thinking by applying scientific methods to the sport and, with Bradley Wiggins’ victory in 2012, it actually took them three years.

We think this ‘shooting for the stars’ ambition can work just as well for business. Aiming for what could be done, not what is being done, changes the way people think within companies and, given the right environment, support and drive, that ambition does create winning organisations.

Focus on the end result

What is important? All around there is noise, interference and distraction, so keeping the ‘blinkers’ on to aim for the end-game is critical. That said, blindly ignoring feedback or responses around you can be fatal too, so ensuring you are aiming for the right end result is also critical.

Teamwork & Ensuring the whole team has one vision

All organisations have teams. Team GB and Sky have ensured the right mix of individuals form a team with a common, shared goal. This is something which is part directed, part inbuilt and always reinforced. Everyone understands the obligations and rewards of having the single winning vision.

Analyse everything

Data is everything, and unlocking its hidden value is another key to the team’s success. Everyone in the team understands the value of capturing as much information as possible, and that data is analysed and replayed in as near real-time as possible. Sky riders sometimes forgo the glory of the ‘hands-free’ roll over the finishing line to punch the completion message into their bike computers.

Control & Discipline

There is a poster at the entrance to the team bus with the team rules, re-emphasising the importance of the vision and goals of the team. It does not spell out the penalties for infringement, but a number of people have left the team after breaching the rules either during or before their stint with Sky.

Grow the person

This is the aim of most businesses, but both GB and Sky aim to get inside their team members’ heads to understand their motivations, desires and ambitions. This energy is then focussed in such a way as to build and improve the team whilst maximising the personal objectives of the individual.

Plan, and plan for flexibility

Team GB and Sky management and riders spend a large amount of their time planning for every eventuality, including differing weather conditions, team strengths, rivals’ changing strategies and any other factors that can influence the race. They then produce the strategic plan of the race, the day, the hour or the hill. The important piece is that any changing circumstances are fed into the plan to modify it, or indeed create a new plan as required. It is strong enough to hold up and work, but flexible enough to change and still be a success.

 

All these attributes can be applied to most business areas, and it is the ability to plan and refine every detail which has provided British cycling and Sky with their continued success. Small continuous improvements bring marginal gains to both sport and business teams.

What is also critical is that the strategy or ‘big picture’ is going in the right direction. There is no point bringing the right pillow if the bus is parked in the wrong town.