How Much is Your Data Worth?

Posted on : 29-05-2012 | By : richard.gale | In : Data




Data is like Oil, sort of…

  • We are completely dependent upon it to go about our daily lives
  • It is difficult and expensive to locate and extract, and vast tracts of it are currently inaccessible.
  • As technology improves we are able to obtain more of it but the demand constantly outpaces supply.
  • The raw material is not worth much in itself; it is the processing that provides the value – fuels and plastics in the case of oil, business intelligence in the case of data.
  • It lubricates the running of an organisation in the same way as oil does for a car.
  • The key difference between oil and data is that the supply of data is increasing at an ever faster rate whilst the amount of oil is fixed.


So how can data be valued and what exploration mechanisms are available to exploit this asset?

The recent valuation of Facebook at ~$100B shows the value the market places on data. Facebook itself has tangible (accounts-friendly) assets of around $8-10B, but the potential value of its data and growth gives rise to the high price investors are willing to pay.

The Facebook and other social media IPO valuations have highlighted that a company’s data worth has not been built into the price of most established organisations. The economic value of a firm’s information assets has recently been termed ‘data equity’, and a new economics discipline, Infonomics, is emerging to provide a structure and foundation for measuring the value of data.

The value, and so the price, of organisations could radically alter as the value of their data becomes more transparent. Data equity will at some point be added to the balance sheets of established firms, potentially significantly affecting share prices. Think about Dun & Bradstreet, the business intelligence service: they have vast amounts of information on businesses and individuals which is sold to help organisations make decisions on creditworthiness. Does the price of D&B reflect the value of that data? Probably not.

Organisations are starting to appreciate the value locked up in their data and are utilising technologies to process and analyse Big Data both within and external to them. These Big Data tools are the geological maps and exploration platforms of the information world.

Some of these tools were covered in our previous blog, but it is worth remembering the fundamentals which give rise to the Big Data challenge:

  • The volume of data is rising at an ever increasing rate
  • The velocity of that data rushing into and past organisations is accelerating
  • The variety of data has overwhelmed conventional indexing systems

Innovative technology and methods are improving the odds of finding that data and extracting value from it.

How can an organisation gain value from its data? What are forward-thinking firms doing to invest in and protect their data?

1. Agree a Common Language

Data means many things to different firms, departments and people. If there is no common understanding of what a ‘client’, a ‘sale’ or an ‘asset’ is then, at the very least, confusion will reign, and most likely poor business decisions will be made from the poor data.

This task is not to be underestimated. As organisations grow they build new functions with different thinking, and as they acquire or are themselves bought, the ‘standard’ definitions of what data means can change and blur. Getting a handle on organisation-wide data definitions is a critical and complex set of tasks that needs leadership and buy-in. Building a data fabric into an organisation is a thankless but necessary activity if longer-term value is to be achieved from the firm’s data.
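One way to picture a common language in practice is a data dictionary that maps each department’s local term to a single canonical definition. This is a minimal illustrative sketch; the terms and mappings are invented, not taken from any particular firm:

```python
# A minimal sketch of an organisation-wide data dictionary: each local term
# used by a department resolves to one agreed, canonical concept, so that
# 'client', 'customer' and 'account holder' all mean the same thing firm-wide.
# The terms below are illustrative examples, not a real taxonomy.

CANONICAL = {
    "client": "customer",
    "customer": "customer",
    "account holder": "customer",
    "sale": "sale",
    "deal": "sale",
}

def canonical_term(local_term: str) -> str:
    """Resolve a department-specific term to the firm-wide definition."""
    try:
        return CANONICAL[local_term.lower()]
    except KeyError:
        raise ValueError(f"No agreed definition for term: {local_term!r}")
```

The useful property is the failure mode: a term with no agreed definition is rejected loudly rather than silently interpreted differently by each system.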

2. Quality, Quality, Quality

The old adage of rubbish in, rubbish out still rings true. All organisations have multiple ‘golden sources’ of data, often with legacy transformation and translation rules shunting the data between systems. If a new delivery mechanism is built, it is often implemented by reverse engineering the existing feeds to behave the same, rather than by looking at the underlying data quality and logic; the potential for issues with one of the many consuming systems makes it too risky to do anything else. An alternative is to build a new feed for each new consumer system, which de-risks the issue in one sense but builds a bewildering array of pipes crossing the organisation. In any organisation of size it is worth accepting that there will be multiple golden copies of data; the challenge is to make sure they are consistent and have quality checks built in. Reconciling sets of data across systems is great, but it doesn’t actually check that the data is correct – just that it matches another system…
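The distinction between reconciliation and quality can be sketched in a few lines. This is illustrative only: the record fields and rules are invented, and real checks would be far richer. The point is that the two functions answer different questions:

```python
# Illustrative only: two 'golden copies' of client credit limits, reconciled
# key by key, plus a separate validity check. Matching across systems shows
# consistency; only the validity rules say anything about correctness.

def reconcile(system_a: dict, system_b: dict) -> set:
    """Return the keys whose values differ (or are missing) between systems."""
    keys = set(system_a) | set(system_b)
    return {k for k in keys if system_a.get(k) != system_b.get(k)}

def quality_check(record: dict) -> list:
    """Basic validity rules -- agreement with another system is not enough."""
    issues = []
    if not record.get("name"):
        issues.append("missing name")
    if record.get("credit_limit", 0) < 0:
        issues.append("negative credit limit")
    return issues

# Hypothetical systems: a CRM and a risk platform holding credit limits.
crm  = {"C001": 5000, "C002": 1000}
risk = {"C001": 5000, "C002": 1500, "C003": 200}
breaks = reconcile(crm, risk)  # C002 disagrees, C003 is missing from the CRM
```

Two systems can reconcile perfectly and both be wrong; that is why the quality rules sit alongside, not inside, the reconciliation.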

3. Timeliness

Like most things, data has a time value. As one Chief Data Officer of a large bank recently commented, ‘data has a half-life’: the value decays over time. Ensuring the right data is in the right place at the right time is essential, and out-of-date, valueless data needs to be identified as such. For example, a correct prediction of tomorrow’s weather is useful, a report of today’s weather is interesting, and a report of yesterday’s weather has little value.
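The half-life remark can be taken literally as exponential decay: the value of a piece of data halves every fixed interval. The numbers here are purely illustrative, but they make the weather example concrete:

```python
# 'Data has a half-life' read literally: value halves every half_life units
# of age. The initial value of 100 and the 1-day half-life are invented
# figures for illustration, not anything from the article.

def data_value(initial_value: float, age: float, half_life: float) -> float:
    """Value of a piece of data after 'age' time units of decay."""
    return initial_value * 0.5 ** (age / half_life)

# A weather forecast worth 100 units when fresh, with a 1-day half-life:
tomorrow  = data_value(100, 0.0, 1.0)  # 100.0 -- tomorrow's forecast, useful
today     = data_value(100, 1.0, 1.0)  # 50.0  -- today's weather, interesting
yesterday = data_value(100, 2.0, 1.0)  # 25.0  -- yesterday's, little value
```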

4. Organisational Culture

Large organisations are always ‘dealing’ with data problems and providing new solutions to improve data quality; many large, expensive programmes have been started to solve ‘data’. Thinking about data needs to be more pervasive than that: its accuracy, ownership, consistency and time value need to be part of the culture and fabric of the organisation. Articulating the value of data can help immensely with this.


5. Prioritisation

Understanding what is important, rather than having a blanket way of dealing with data, is key. For some data it doesn’t matter if it is wrong or out of date, because it is either not consumed (which raises the obvious question: why have it?) or irrelevant to the process. Other data is critical for a business to survive, so a risk-based approach to data quality needs to be used, with data graded and classified by its value.
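A risk-based approach boils down to a grading function: score each data set on business impact and freshness needs, and spend the quality effort on the top tier. The tiers and thresholds below are invented for illustration:

```python
# A sketch of risk-based data grading. The 1-5 impact scale, the tier names
# and the thresholds are all illustrative assumptions, not a standard.

def grade(business_impact: int, must_be_current: bool) -> str:
    """Classify a data set into a quality tier by impact and freshness need."""
    if business_impact >= 4:
        return "critical"   # full quality checks, named owner, monitored
    if business_impact >= 2 or must_be_current:
        return "standard"   # periodic checks and reconciliation
    return "low"            # minimal effort -- or ask why it is held at all

# Hypothetical examples: client credit limits vs. an unused legacy report.
client_limits = grade(5, True)    # 'critical'
legacy_report = grade(1, False)   # 'low'
```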

6. Data Ownership

Someone needs to be accountable for, and the owner of, data and data governance within an organisation. This does not mean they have to manage each piece of data, but they do need to set the strategy and vision for it. More large organisations are now creating a Chief Data Officer role to ensure there is this ownership, strategy and discipline with regard to their data.

Data is at the core of a business, and there is a growing acknowledgement of its potential value.

As the ability to extract information and intelligence from data improves, there will be some disruptive changes in the market value of firms that hold the sort of data which can improve their market share or profitability, or which can potentially be traded.

Companies that hold huge amounts of information about their customers – banks, shops, telecoms firms – will be well positioned to take advantage of it if they can manage to organise and exploit it.


London 2012: The Technology Powered Games

Posted on : 28-05-2012 | By : jo.rose | In : General News



With less than 2 months to go until the 2012 Olympics and Paralympics hit London we thought we’d take a look at the role that technology is playing, both supporting the event itself and also some considerations for business operations during a potentially disruptive summer.

The Technology Operations Centre (TOC) in Canary Wharf has been up and running for the last six months. From here it will provide central control and monitoring for all of the systems supporting the Games, staffed by members of the Organising Committee’s team and the selected delivery partners.

During the Games, the TOC will oversee critical applications, as well as monitoring 900 servers, 1,000 network and security devices and 9,500 PCs. In total over 5,000 technology staff, including 2,500 volunteers, will be involved in delivering the Olympics technology.

Breaking Records already

Not surprisingly, there are some technology firsts at this year’s games along with the usual impressive lists of stats, speeds, feeds etc… Here are a few:

  • Big Data: One of the major challenges for the London 2012 Games will be the sheer amount of data generated. Estimates put this at 30% more than that of the Beijing Olympics four years ago, providing real-time information to fans, commentators and broadcasters around the world.
  • Access Channels: BT’s single voice and data network will provide access to 94 locations (including the Olympic Village and 34 competition venues). It will support 80,000 connections, with 5,500km of internal cabling and 1,800 wireless access points. During the event Virgin Media is also providing free WiFi at 80 Underground stations.
  • Bandwidth: As a consequence of the increase in data and access points, London 2012 is going to be very bandwidth-intensive. BT has provisioned four times the network capacity of Beijing, and during peak times network traffic is expected to hit 60Gbps. To put this in context, the infrastructure is capable of transmitting the entire contents of Wikipedia every 0.5 seconds!
  • Mobile Payments: Another technology to be introduced this year is that of near-field communications mobile payments. Samsung have announced that “a limited edition showcase device enabled with Visa’s mobile payment application, Visa payWave, will be available for Samsung and Visa sponsored athletes and trialists”.
  • 3D TV:  The BBC will broadcast live 3D coverage to homes across the UK as part of a 3D trial, with coverage including the opening and closing ceremonies and the men’s 100m final. The free-to-air broadcast of these events will be available to anyone who has access to a 3D TV set and to HD Channels, regardless of which digital TV provider they use.
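The bandwidth figure above can be sanity-checked with a little arithmetic. Dividing by eight converts bits to bytes, and the compressed text of English Wikipedia ran to a few gigabytes in 2012, so the half-second claim is plausible for the text alone (the Wikipedia size is our rough assumption, not an official figure):

```python
# Rough sanity check of the 60Gbps claim. 60 gigabits per second is
# 7.5 gigabytes per second, i.e. 3.75 GB every half second -- in the same
# ballpark as a compressed text dump of English Wikipedia circa 2012
# (that dump size is an assumption for illustration).

peak_gbps = 60                                # quoted peak network traffic
gigabytes_per_second = peak_gbps / 8          # 8 bits per byte -> 7.5 GB/s
per_half_second = gigabytes_per_second * 0.5  # 3.75 GB every 0.5 seconds
```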

One interesting aside: the actual systems recording results will be on a separate network for security reasons, so “runners” will take physical printouts to third parties after each event.

Contingency Planning

Of course, the other area where technology is playing its part is in providing remote access to applications, systems and data during the Games. Numerous governing bodies have advised businesses to allow staff to work from home to ease travel congestion for the duration of the event.

Indeed, many organisations have already tested their contingency plans for the Games in terms of remote working (May 18th was “National work from home day”). Policies differ greatly across organisations, and we are not going to get into the productivity debate… but suffice to say that, Olympics aside, changing attitudes to home working and the increased availability of cloud-based applications could mean that over half of employees will work from home over the next decade anyway (see survey by Virgin Media).

What isn’t certain is the impact that a surge in people going online to watch the London Olympics may have on the internet. The Cabinet Office and the London Games organising committee (LOCOG) are advising businesses that “due to an increased number of people accessing the internet” during this summer’s Games, “internet services may be slower” and “in very severe cases there may be drop outs”.

It has also warned that businesses may even face bandwidth rationing…“ISPs may introduce data caps during peak times to try and spread the loading and give a more equal service to their entire customer base.”

This poses a bit of a dilemma given the previous advice to allow staff to work from home to reduce the impact on transport infrastructure. Businesses need to check with their ISPs regarding their contracts and the expected impact on the service they will receive during the Games, particularly with respect to managing peak demand (corporate networks will also be tested, with homeworkers increasing the demand for collaboration through screen sharing, file transfer and video-conferencing).

Final word to LOCOG… “In developing your business continuity plan for the Games you will need to ensure that any increase in homeworking is supported by appropriate IT, and that internal systems and ISPs have been engaged in the planning process so that the demands on the system can be understood and managed.”

Simple advice… we’re sure it will be a great success, with technology playing its part quietly behind the scenes.