Feedback Loops: How to maximise value from your Big Data

Posted on : 27-06-2012 | By : richard.gale | In : General News



Closing the feedback loop

With businesses becoming increasingly sensitive to customer opinion of their brand, monitoring consumer feedback is becoming ever more important.  Additionally, the recognition of social media as an important and valid source of customer opinion has brought about a need for new systems and a new approach.

Traditional approaches, such as reactive responses to press coverage by a PR department or infrequent customer surveys conducted online or by phone, are part of extremely slow-cycle feedback loops that are no longer adequate to capture ever-changing shifts in public sentiment.

They represent a huge hindrance to any business looking to improve brand relations; delay in feedback costs real money.  Inevitably, the manual steps in traditional approaches create huge delays before information reaches its users.  These days we need constant, low-latency feedback: the information needs to be almost real-time.  Wait a few moments too long, and the intelligence you captured could already be stale and useless.


A social media listening post

Witness the rise of the “social media listening post”: a new breed of system designed to plug directly into social networks, automatically watching for brand feedback around the clock.  Some forward-thinking companies have already built such systems.  How does yours keep track right now?  If your competitors have it and you don’t, does that give them a competitive advantage over you?

I’d argue that most big brands need such a system these days.  Gone are the days when businesses could wait months for surveys or focus groups to trickle back with a sampled response from a small, select group.  In that time your brand could have suffered ongoing damage, and by the time you find out, valuable customers have been lost.  Intelligence is now available on a near-instantaneous basis; can you afford not to use it?

Some emerging “Big Data” platforms offer the perfect tool for monitoring public sentiment toward a company or brand, even in the face of the rapid explosion in data volumes from social media, which could easily overwhelm traditional BI analytics tools.  By implementing a social media “listening post” on cutting-edge Big Data technology, organisations now have the opportunity to unlock a new dimension in customer feedback and insight into public sentiment toward their brands.

Primarily, we must design the platform for low-latency continuous operation to allow complete closure of the feedback loop.  Events (news, ad campaigns etc.) can then be monitored for near-real-time positive/negative/neutral public response, bringing rapid responses and corrections in strategy into the realm of possibility.  You might, for example, pull a new ad campaign early if public reaction to the material proved disastrous and unexpected.  It is also about understanding the trends and topics of interest to a brand’s audience, and who the influencers are.  Social media platforms like Twitter offer a rich, granular model for exploring this complex web of social influence.
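To illustrate what “closing the loop” means in code, here is a minimal Python sketch of a rolling sentiment window that could flag a campaign for review in near-real time.  All names and thresholds here are hypothetical, and the sentiment scores are assumed to come from an upstream classifier:

```python
from collections import deque
from statistics import mean

class FeedbackLoop:
    """Track a rolling window of sentiment scores so a campaign can be
    reviewed (or pulled early) while it is still running.

    Scores are assumed to arrive from an upstream sentiment classifier:
    +1 positive, 0 neutral, -1 negative.
    """

    def __init__(self, window_size=1000, alert_threshold=-0.3):
        # deque(maxlen=...) discards the oldest score automatically,
        # giving a fixed-size sliding window over the stream.
        self.scores = deque(maxlen=window_size)
        self.alert_threshold = alert_threshold

    def record(self, score):
        self.scores.append(score)

    def current_sentiment(self):
        # Mean sentiment over the most recent window of mentions.
        return mean(self.scores) if self.scores else 0.0

    def should_alert(self):
        # True when recent sentiment has turned notably negative.
        return self.current_sentiment() < self.alert_threshold
```

In a real deployment the `record` calls would be fed continuously from the streaming capture layer, and an alert might trigger a human review of the campaign material.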

The three main challenges inherent in implementing a social media listening post are:

  • Data volume
  • Complexity of data integration – e.g. unstructured, semi-structured, evolving schema etc
  • Complexity of analysis – e.g. determining sentiment: is it really a positive or negative statement with respect to the brand?

To gain a complete picture of public opinion towards your brand or organisation through social media, many millions of web sites and data services must be consumed continuously, around the clock.  They must then be analysed in complex ways, far beyond traditional data warehouse query functionality.  Sentiment analysis on its own poses a considerable challenge and is still an emerging discipline as a science; even more advanced Machine Learning techniques may prove necessary to correctly interpret all the signals in the data.  Data formats vary greatly among social media sources, ranging from regular ‘structured’ data through semi- and unstructured forms to complex poly-structured data with many dimensions.  This structural complexity poses extreme difficulty for traditional data warehouses and up-front ETL (Extract-Transform-Load) approaches, and demands a far more flexible data consumption platform.
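To make the sentiment challenge concrete, here is a deliberately naive Python sketch of keyword-based sentiment scoring.  The word lists are invented for illustration, and the approach fails on exactly the hard cases (negation, sarcasm, context) that motivate the Machine Learning techniques discussed above:

```python
# Illustrative word lists only; a production system would use a trained model.
POSITIVE = {"love", "great", "excellent", "recommend", "happy"}
NEGATIVE = {"hate", "terrible", "awful", "broken", "disappointed"}

def classify_sentiment(text, brand):
    """Return 'positive', 'negative' or 'neutral' for a mention of `brand`,
    or None if the text does not mention the brand at all.

    This is a bag-of-words sketch: it cannot tell "not great" from "great",
    which is precisely why real sentiment analysis is hard.
    """
    words = set(text.lower().split())
    if brand.lower() not in words:
        return None  # not a brand mention
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"
```

A tweet such as “I don’t love Acme” would be misclassified as positive by this sketch, which is the kind of signal a trained classifier is needed to catch.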

So how do we architect a system like this?  Generally speaking, at its core you will need some kind of distributed data capture and analysis platform.  Big Data platforms were designed to address problems where you have Volume, Variety, or Velocity of data – and most often, all three.  In this particular use-case, we need to look towards the cutting-edge of the technology, and look for a platform which supports near-real time, streaming data capture and analysis, with the capability to implement Machine Learning algorithms for the analytics/sentiment analysis component.

For the back-end, a high-throughput data capture/store/query capability is required, suitable for continuous streaming operation, probably with redundancy/high-availability, and with a non-rigid schema layer capable of evolving over time as the data sources evolve.  So-called “NoSQL” database systems (the term in fact stands for “Not Only SQL” rather than no SQL at all) such as Cassandra, HBase or MongoDB offer excellent properties for high-volume streaming operation and would be well suited to the challenge.  There are also commercial derivatives of some of these platforms on the market, such as the Acunu Data Platform, which commercialises Cassandra.
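As a rough illustration of the “non-rigid schema” property (this is an in-memory Python toy, not the API of any real NoSQL product), the sketch below shows the document/wide-row idea: rows are keyed by brand and timestamp, and each row is an arbitrary dict, so a new source with new fields needs no schema migration:

```python
import time
from collections import defaultdict

class ListeningStore:
    """Toy stand-in for a schema-flexible store such as Cassandra or MongoDB.

    Each brand maps to a time-ordered list of mention 'documents'; documents
    from different sources can carry completely different fields.
    """

    def __init__(self):
        self.rows = defaultdict(list)  # brand -> list of mention dicts

    def write(self, brand, mention):
        # Stamp the arrival time if the source did not supply one.
        mention.setdefault("ts", time.time())
        self.rows[brand].append(mention)

    def scan(self, brand, since=0.0):
        """Time-range scan: the typical query shape for streaming data."""
        return [m for m in self.rows[brand] if m["ts"] >= since]
```

Note that the tweet and the blog-review documents below share only the timestamp field, yet both are stored and queried identically, which is the flexibility that up-front ETL schemas lack.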

Additionally, a facility for complex analytics will be required to derive useful insight from the data you capture, most likely via parallel, shared-nothing computation due to the extreme data volumes.  For this component, paradigms like MapReduce are a natural choice, offering linear scalability and unlimited flexibility in implementing custom algorithms; Machine Learning libraries such as the Apache Mahout project have grown up to provide a toolbox of analytics on top of the MapReduce programming model.  Hadoop is an obvious choice for exploiting the MapReduce model, but since the objective here is near-real-time streaming capability, it may not always be the best fit.  Cassandra and HBase (which in fact runs on Hadoop) can be good choices, since they offer low-latency characteristics coupled with MapReduce analytic capabilities.
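The MapReduce idea itself can be sketched in a few lines of plain Python.  This toy map and reduce tally sentiment per brand over invented sample mentions; it is the same computational shape that a Hadoop or Mahout job would distribute across many machines, with the shuffle step implicit here:

```python
from collections import defaultdict
from itertools import chain

def map_mentions(mention):
    # Map phase: emit one (brand, sentiment) pair per brand in a post.
    for brand in mention["brands"]:
        yield (brand, mention["sentiment"])

def reduce_counts(pairs):
    # Reduce phase: tally sentiment labels per brand.  In a real cluster
    # the framework would group pairs by key before reduction.
    counts = defaultdict(lambda: defaultdict(int))
    for brand, sentiment in pairs:
        counts[brand][sentiment] += 1
    return counts

# Invented sample data for illustration.
mentions = [
    {"brands": ["acme"], "sentiment": "positive"},
    {"brands": ["acme", "globex"], "sentiment": "negative"},
]
totals = reduce_counts(chain.from_iterable(map_mentions(m) for m in mentions))
```

Because the map phase is stateless per record and the reduce phase groups by key, the same logic scales linearly when spread over a shared-nothing cluster.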

Finally, some form of front-end visualisation/analysis layer will be necessary to graph and present results in a usable visual form.  There are some new open-source BI analytics tools around which might do the job, as well as a variety of commercial offerings in this area.  The exact package to select depends strongly on the desired insight and form of visualisation, and so is beyond the scope of this article; the one clear requirement is that it must interface with whatever back-end storage layer you choose.

Given the cutting-edge nature of many of the systems required, a solid operational team is essential to maintain and tune the system for continuous operation.  Many of these products have complex tuning requirements demanding specialist skills and dedicated headcount.  Some of the commercial open-source offerings have support packages that can help mitigate this requirement, but either way, the need for operational resource must never be ignored if the project is to be a success.

The technologies highlighted here are evolving rapidly, with variants or entirely new products appearing frequently, so it would not be unreasonable to expect significant advancement in this field within a 6-12 month timeframe.  This will likely translate into advancement on two fronts: increased functional capability of the leading distributed data platforms, in areas such as query interfaces and indexing, and reduced operational complexity and maintenance requirements.

Tim Seears – CTO Big Data Partnership.

Big Data Partnership and Broadgate Consultants are working together to help organisations unlock the value in their big data.  This partnership allows us to provide a full suite of services from thought-leadership, strategic advice and consultancy through delivery and implementation of robust, supportable big data solutions.  We would be delighted to discuss further the use case outlined above should you wish to explore adapting these concepts to your own business.

New skills for Project Managers: What is required in today’s environment?

Posted on : 27-06-2012 | By : richard.gale | In : General News



Do project managers now need different skills to succeed?

In the last few years the skill-set required of project managers has changed. The traditional hard-bitten, deliverable-focused and sometimes blinkered project manager is still required (otherwise many projects would not get delivered at all), but new skills are needed to match a faster-changing business and technology environment. Agility, flexibility, creativity and, above all, the ability to collaborate are becoming critical parts of the project manager’s toolkit.

Traditional Project Structure

The traditional role of the project manager was to solve a specific problem by completing a project to specification, on time and on budget. These were their success criteria, generally within the framework of a set methodology such as PRINCE2.

After a few projects and a bit of training, the execution of a project was relatively straightforward, and pretty much fixed in scope, time and cost. The traditional lifecycle was as follows:

  • Starting up a Project
  • Initiating a Project
  • Directing a Project
  • Controlling a Stage
  • Managing Stage Boundaries and Scope
  • Managing Product Delivery
  • Closing a Project

As projects progressed, requirements could change through a change request process, with estimates of the impact on time and cost, and a change board or steering committee would help guide the project to success. Generally, though, projects finish as they start, with the same objectives and goals; those that don’t generally don’t finish (successfully, anyway).

Successful Project Manager Skills

So what makes a good project manager? Asking around several of our consultants and clients, a few core characteristics generally exist in successful PMs:

  • Organisational ability
  • Discipline
  • Focus & Drive
  • People Skills
  • Communication
  • Openness
  • Pragmatism
  • Thick skin
  • Sense of humour…

In addition, business knowledge of the delivery area is essential: not the detailed skills of the BAs and technical teams, but enough to understand and talk coherently about the subject.

Emerging Project Structures

The business world is changing and becoming more uncertain. Timeframes are being compressed, and internal and external events are having bigger impacts on the running of organisations. This, coupled with the social expectations and skill-sets of the next generation of users, means that project managers require a new set of tools in their toolbox. Often a clear remit or scope for a project is not available, or it changes with events and time.

New Skills Required For Project Managers

To cope with these challenges, additional skills are needed for a project to be successful:

Collaboration – the command, control and direct aspects of delivery are still critical, but because PMs need the buy-in, co-operation and knowledge of a broad team (often not under their direct management), collaboration and empathy/emotional intelligence become more and more important.

Agility – organisations and their environments are changing at a faster pace, so the ability to take stock, and the strength to change direction mid-flight, is now a required skill. Blindly completing a project and marking it as a ‘success’ because the original (now defunct) deliverables were completed on time and on budget is no longer acceptable.

Creativity – is becoming more important as solutions and desired outcomes change frequently. Creativity has sometimes been seen as a disadvantage in a project manager: it could be ‘distracting’ and mean the goals are not met. In fact, identifying and executing creative solutions to the tricky problems encountered on the path of a project are some of the most valued skills a PM can possess.

We have a team of project managers who, along with their battle scars and medals from successful previous projects and programmes, also have the people and creative skills to deliver projects in current and future environments. Please contact Jo and we can see how we might help.

“Black Swans” – are global project failures predictable?

Posted on : 26-06-2012 | By : jo.rose | In : General News



The technology landscape is littered with the remains of failed projects and, from our discussions over the first half of 2012, it appears that for large scale change projects this isn’t a problem for which we’ve found a cure.

There are of course many well-known technology failures, both in private enterprise and the public sector, which have been deemed newsworthy enough to hit the press. However, there are also many which continue under the radar and, in the current climate of austerity and focus on efficiency, continue to astound in terms of wasted resources.

So what do these projects have in common?

Well, we’re not going to go into a general view of the pitfalls, do’s and don’ts or best practices. What is apparent, though, is that if a project includes the word “global” in its scope or name, then it has an increased chance of falling short of the business objectives, overrunning its dates or blowing the original budget (and most likely all three). Indeed, many of these could be classified as “black swans”, a term coined by the author and scholar Nassim Nicholas Taleb “to describe high-impact events that are rare and unpredictable but in retrospect seem not so improbable” (see Harvard Business Review for an example of the huge impact of some of these failures on organisations, including bankruptcy).

We’ve all seen examples of these projects over the years, often using the name itself to boldly announce the expected end-state deliverable: “Global Consolidated Finance Platform”, “Global Common Desktop”, “Global Enterprise Project Tracking” and so on. But on reflection, how many of these delivered to the original brief? And were the signs of derailment obvious?

There are a few questions (which might appear obvious) that maybe don’t always get attention…

Where does the sponsorship and accountability for the project reside? If the answer is within the technology organisation, then an alarm bell should immediately sound. There are still many projects we hear about where the technology delivery and governance are decoupled from the real business requirements, whether in terms of functionality, efficiency, time to market or, ultimately, revenue generation.

How long will the project take to complete? The world we live in is moving to much more agile, faster-paced delivery, with “hackathons” building applications overnight, business users looking to take more control and infrastructure being deployed with a “right-click”, and yet how many projects span 6, 12 or 18 months plus? By the time you get past the halfway point of some of these projects, the business environment, the objectives and, crucially, often the people themselves have changed. And that makes failure easy both to forget and to conceal. We spoke with someone recently whose global rollout project, well past its scheduled end point, had completed only 40% of the environment but was now apparently in “BAU mode”… really? Better to define a smaller set of faster deliverables which can be modified iteratively.

How realistic is the “global” project objective? Over recent years we’ve centralised governance into Centres of Excellence (COEs) whilst, where sensible, keeping local delivery accountability. However, particularly in multi-nationals, there are always inherent business processes, technology and, most importantly, culture that, if not treated properly, will quickly derail even the most laudable initiative. Replacing the project tracking and governance platform across the globe has huge benefits overall, but what about the level of re-engineering of local processes, remediation of interfaces and standardisation of functionality? At the outset there is often a cry of “not invented here”, accompanied by a collective digging in of the heels. That shouldn’t stop the project trying, but the stakeholder management needs to be coupled with a healthy dose of pragmatism.

Is it possible to fail fast? Like the black swan in the linked example, once large-scale projects get going they are difficult to stop and can consume resources at an alarming rate. So do we make enough use of prototyping? And can we create the right conditions to stop projects without huge political fallout? Again, this can be a cultural thing. For example, in the US some of the most successful entrepreneurs have been through a few business failures prior to hitting the jackpot. It’s not seen as failure, more as resilience. Stopping a project is OK; maybe a switch to less short-term objectives linked to compensation will encourage this.

Let’s not be too pessimistic though! There are plenty of successful global projects, and we are all adjusting to new market conditions and the future of technology services 😉