What will the IT department look like in the future?

Posted on : 29-01-2019 | By : john.vincent | In : Cloud, Data, General News, Innovation


We are going through a significant change in how technology services are delivered as we stride further into the latest phase of the Digital Revolution. The internet provided the starting pistol for this phase and now access to new technology, data and services is accelerating at breakneck speed.

More recently the real enablers of a more agile and service-based technology have been the introduction of virtualisation and orchestration technologies which allowed for compute to be tapped into on demand and removed the friction between software and hardware.

The impact of this cannot be overstated. The removal of the need to manually configure and provision new compute environments was a huge step forwards, and one which continues with developments in Infrastructure as Code (“IaC”), microservices and serverless technology.
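The shift IaC describes can be reduced to a simple idea: declare the desired end state and let tooling work out the steps needed to reach it. A minimal sketch of that reconciliation pattern in Python (the resource names are illustrative; this is the principle, not a real provisioning tool):

```python
# A minimal sketch of declarative provisioning: compare the desired state
# with the actual state and emit only the actions needed to converge.

def plan(desired, actual):
    """Return the create/destroy actions needed to reach the desired state."""
    to_create = sorted(set(desired) - set(actual))
    to_destroy = sorted(set(actual) - set(desired))
    return [("create", r) for r in to_create] + [("destroy", r) for r in to_destroy]

desired = {"web-1", "web-2", "db-1"}
actual = {"web-1", "db-2"}

for action, resource in plan(desired, actual):
    print(action, resource)
```

Real IaC tools (Terraform, CloudFormation and the like) add dependency graphs and provider APIs on top, but the plan/apply loop above is the core of why manual configuration disappears.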

However, whilst these technologies continually disrupt the market, the corresponding changes to overall operating models have, in our view, lagged behind (this is particularly true in larger organisations, which have struggled to shift from the old to the new).

If you take a peek into organisation structures today they often still resemble those of the late 90s, where infrastructure capabilities were organised around specialists such as data centre, storage, service management, application support and so on. There have been changes, most recently with the shift to DevOps and continuous integration and delivery, but there is still a long way to go.

Our recent Technology Futures Survey provided a great insight into how our clients (290) are responding to the shifting technology services landscape.

“What will your IT department look like in 5-7 years’ time?”

There were no surprises in the large majority of respondents agreeing that the organisation would look different in the near future. The big shift is to a more service-focused, vendor-led technology model, with between 53% and 65% believing that this is the direction of travel.

One surprise was a relatively low consensus on the impact that Artificial Intelligence (“AI”) would have on management of live services, with only 10% saying it would be very likely. However, the providers of technology and services formed a smaller proportion of our respondents (28%) and naturally were more positive about the impact of AI.

The Broadgate view is that the changing shape of digital service delivery is challenging previous models and applying tension to organisations and providers alike. There are two main areas where we see this:

  1. With the shift to cloud-based and on-demand services, the need for a dedicated provider, whether internal or external, has diminished
  2. Automation, AI and machine learning are developing new capabilities in self-managing technology services

We expect that the technology organisation will shift to focus more on business products and procuring the best fit service providers. Central to this is AI and ML which, where truly intelligent (and not just marketing), can create a self-healing and dynamic compute capability with limited human intervention.
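In very simplified form, the self-healing capability described above is a control loop: detect an unhealthy service and remediate it without human intervention. A hypothetical sketch (the `Service` class is a stand-in, not a real orchestrator API):

```python
# Hypothetical self-healing control loop: a health check drives an
# automated remediation (restart) with no human in the loop.

class Service:
    def __init__(self):
        self.healthy = False
        self.restarts = 0

    def health_check(self):
        return self.healthy

    def restart(self):
        self.restarts += 1
        self.healthy = True  # assume a restart recovers the service

def heal(service, max_attempts=3):
    """Restart an unhealthy service until it recovers or attempts run out."""
    for _ in range(max_attempts):
        if service.health_check():
            return True
        service.restart()
    return service.health_check()

svc = Service()
print(heal(svc), svc.restarts)  # recovered after one automated restart
```

Where the "truly intelligent" part comes in is replacing the fixed health check and fixed remediation with ones learned from operational data.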

Cloud, machine learning and RPA will remove much of the need to manage and develop code

To really understand how the organisation model is shifting, we have to look at the impact that technology is having on the whole supply chain. We’ve long outsourced the delivery of services. However, if we look at the traditional service providers (IBM, DXC, TCS, Cognizant etc.) that in the first instance acted as brokers for these new digital technology innovations, we see that they are increasingly being disintermediated, with provisioning and management now directly in the hands of the consumer.

Companies like Microsoft, Google and Amazon have superior technical expertise, and they continue to expose it directly to the end consumer. Thus, the IT department needs to think less about how to build or procure from a third party, and more about how to build a framework of services which “knits together” a service model that can best meet their business needs with a layered, end-to-end approach. This fits perfectly with a more business product-centric approach.

We don’t see in-house technology footprints increasing, with the possible exception of truly data-driven organisations or tech companies themselves.

In our results, the removal of cyber security issues was endorsed by 28%, with a further 41% believing that this was a possible outcome. This represents a leap of faith given the current battle that organisations are waging to combat data breaches! Broadgate expects that organisations will increasingly shift the management of these security risks to third-party providers, with telecommunication carriers also taking on more responsibility over time.

As the results suggest, the commercial and vendor management aspects of the IT department will become more important. This is a skill which is often absent in companies today, so a conscious strategy to develop the capability is needed.

Organisations should update their operating model to reflect the changing shape of technology services; the close alignment of products and services to technology provision has never been as important as it is today.

Indeed, our view is that even if your model serves you well today, by 2022 it is likely to look fairly stale. This is because what your company currently offers to your customers is almost certain to change, which will require fundamental re-engineering across, and around, the entire IT stack.

Selecting a new “digitally focused” sourcing partner

Posted on : 18-07-2018 | By : john.vincent | In : Cloud, FinTech, Innovation, Uncategorized


It was interesting to see the recent figures this month from the ISG Index, showing that the traditional outsourcing market in EMEA has rebounded. Figures for the second quarter for commercial outsourcing contracts show a combined annual contract value (ACV) of €3.7Bn. This is significantly up 23% on 2017 and for the traditional sourcing market, reverses a downward trend which had persisted for the previous four quarters.

This is an interesting change of direction, particularly against a backdrop of economic uncertainty around Brexit and the much over-indulged GDPR preparation. It seems that despite this, rather than hunkering down with a tin hat and stockpiling rations, companies in EMEA have invested in their technology service provision to support agile digital growth for the future. The global number also accelerated, up 31% to a record ACV of €9.9Bn.

Underpinning some of these figures has been a huge acceleration in the As-a-Service market. In the last 2 years the ACV attributed to SaaS and IaaS has almost doubled. This has been fairly consistent across all sectors.

So when selecting a sourcing partner, what should companies consider outside of the usual criteria including size, capability, cultural fit, industry experience, flexibility, cost and so on?

One aspect that is interesting from these figures is the influence that technologies such as cloud based services, automation (including AI) and robotic process automation (RPA) are having both now and in the years to come. Many organisations have used sourcing models to fix costs and benefit from labour arbitrage as a pass-through from suppliers. Indeed, this shift of labour ownership has fuelled incredible growth within some of the service providers. For example, Tata Consultancy Services (TCS) has grown from 45.7k employees in 2005 to 394k in March 2018.

However, having reached this heady number of staff, the technologies mentioned previously are threatening the model of some of these companies. As-a-Service providers such as Microsoft Azure and Amazon AWS now have platforms which are carving their way through technology service provision that would previously have been managed by human beings.

In the infrastructure space, commoditisation is well under way. Indeed, we predict that within 3 years the build, configure and manage skills in areas such as Windows and Linux platforms will rarely be in demand. DevOps models, and variants thereof, are moving at a rapid pace, with tools to spin up platforms on demand to support application services now mainstream. Service providers often focus on their technology overlay “value add” in this space, with portals or orchestration products which can manage cloud services. However, the value of these is often questionable compared with direct access or commercial 3rd party products.

Secondly, as we’ve discussed here before, technology advances in RPA, machine learning and AI are transforming service provision. This of course is not just in terms of business applications but also in terms of the underpinning services. This is translating itself into areas such as self-service Bots which can be queried by end users to provide solutions and guidance, or self-learning AI processes which can predict potential system failures before they occur and take preventative actions.
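In toy form, the "predict potential system failures before they occur" idea reduces to flagging a metric that drifts well above its recent baseline; real AIOps products are of course far more sophisticated, so treat the threshold and window below as illustrative assumptions:

```python
# Toy predictive check: warn when the newest metric sample exceeds
# 1.5x the average of the samples in the window just before it.

def predict_failure(samples, window=5, factor=1.5):
    """True if the newest sample is well above its recent baseline."""
    if len(samples) <= window:
        return False  # not enough history to establish a baseline
    baseline = sum(samples[-window - 1:-1]) / window
    return samples[-1] > factor * baseline

cpu_load = [40, 42, 41, 43, 42, 44]
print(predict_failure(cpu_load))         # False: steady state
print(predict_failure(cpu_load + [90]))  # True: early warning, act now
```

The preventative action (scaling out, restarting, failing over) would then be triggered from this signal rather than from a hard failure.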

These advances present a challenge to the workforce focused outsource providers.

Given the factors above, and the market shift, it is important that companies take these into account when selecting a technology service provider. Questions to consider are:

  • What are their strategic relationships with cloud providers, and not just at the “corporate” level, but do they have in depth knowledge of the whole technology ecosystem at a low level?
  • Can they demonstrate skills in the orchestration and automation of platforms at an “infrastructure as code” level?
  • Do they have capability to deliver process automation through techniques such as Bots, can they scale to enterprise and where are their RPA alliances?
  • Does the potential partner have domain expertise, and is it open to partnership around new products and shared reward/JV models?
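One way to make questions like these actionable in a selection process is a simple weighted scorecard. The criteria and weights below are assumptions for illustration only, not a recommended model:

```python
# Illustrative weighted scorecard for comparing sourcing partners.
# Weights must sum to 1.0; ratings are on a 0-10 scale.

WEIGHTS = {
    "cloud_alliances": 0.3,   # strategic cloud relationships, depth of knowledge
    "iac_automation": 0.3,    # orchestration / infrastructure-as-code skills
    "rpa_capability": 0.2,    # bots, enterprise scale, RPA alliances
    "domain_expertise": 0.2,  # industry knowledge, shared-reward openness
}

def score(vendor_ratings):
    """Combine per-criterion ratings into a single weighted total."""
    return sum(WEIGHTS[c] * vendor_ratings[c] for c in WEIGHTS)

vendor_a = {"cloud_alliances": 8, "iac_automation": 7, "rpa_capability": 5, "domain_expertise": 9}
vendor_b = {"cloud_alliances": 6, "iac_automation": 9, "rpa_capability": 8, "domain_expertise": 7}

print(round(score(vendor_a), 2))  # 7.3
print(round(score(vendor_b), 2))  # 7.5
```

The value is less in the arithmetic than in forcing the weights to be agreed explicitly before vendor pitches begin.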

The traditional sourcing engagement models are evolving, which has created new opportunities on both sides. Expect new entrants, without the technical debt and organisational overheads, and with a more technology solution focus, to disrupt the market.

GDPR – The Countdown Conundrum

Posted on : 30-01-2018 | By : Tom Loxley | In : Cloud, compliance, Cyber Security, data security, Finance, GDPR, General News, Uncategorized


Crunch time is just around the corner and yet businesses are not prepared, but why?

“General Data Protection Regulation (GDPR) – a new set of rules set out by the European Union which aims to simplify data protection laws and provide citizens across all member states with more control over their personal data”

It is estimated that just under half of businesses are unaware of incoming data protection laws that they will be subject to in just four months’ time, or how the new legislation affects information security.

Following a government survey, the lack of awareness about the upcoming introduction of GDPR has led the UK government to issue a warning to the public over businesses’ shortfall in preparation for the change. According to the Digital, Culture, Media and Sport Secretary Matt Hancock:

“These figures show many organisations still need to act to make sure the personal data they hold is secure and they are prepared for our Data Protection Bill”

GDPR comes into force on 25 May 2018, and potentially huge fines face those who are found to misuse, exploit, lose or otherwise mishandle personal data. These can be as much as four percent of company turnover. Organisations could also face penalties if they’re hacked and attempt to hide what happened from customers.
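For a rough sense of exposure, the regulation’s headline rule for the most serious infringements is the greater of €20m or 4% of worldwide annual turnover, which is trivial to model:

```python
# GDPR maximum administrative fine for the most serious infringements:
# the greater of EUR 20m or 4% of worldwide annual turnover.

def max_gdpr_fine(annual_turnover_eur):
    return max(20_000_000, 0.04 * annual_turnover_eur)

print(max_gdpr_fine(100_000_000))    # the EUR 20m floor dominates
print(max_gdpr_fine(2_000_000_000))  # 4% of turnover dominates
```

The point of the floor is that small companies cannot assume 4% of a modest turnover caps their risk.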

There is also a very real and emerging risk of a huge loss of business. Specifically, 3rd-party compliance and assurance is common practice now and your clients will want to know that you are compliant with GDPR as part of doing business.

Yet regardless of the risks to reputation, potential loss of business and the fines that come with being non-GDPR compliant, the government survey has found that many organisations aren’t prepared for, or even aware of, the incoming legislation and how it will impact their information and data security strategy.

Not surprisingly, considering the ever-changing landscape of regulatory requirements they have had to adapt to, the finance and insurance sectors are said to have the highest awareness of the incoming security legislation. Conversely, only one in four businesses in the construction sector is said to be aware of GDPR, and awareness in manufacturing is also poor. According to the report, just under half of businesses overall (including a third of charities) have subsequently made changes to their cybersecurity policies as a result of GDPR.

If your organisation is one of those who are unsure of your GDPR compliance strategy, areas to consider may include:

  • Creating or improving new cybersecurity procedures
  • Hiring new staff (or creating new roles and responsibilities for your additional staff)
  • Making concentrated efforts to update security software
  • Mapping your current data state, what you hold, where it’s held and how it’s stored
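For the data-mapping point in particular, even a minimal machine-readable inventory makes the gaps visible. The fields below are an assumed starting shape, not a GDPR-mandated schema:

```python
# A minimal data inventory: record what you hold, where it lives and how
# it is protected, then report anything that obviously needs attention.

inventory = [
    {"asset": "customer emails",   "location": "CRM (cloud)",   "encrypted": True,  "owner": "sales"},
    {"asset": "payroll records",   "location": "on-prem HR db", "encrypted": True,  "owner": "hr"},
    {"asset": "support call logs", "location": "shared drive",  "encrypted": False, "owner": None},
]

# Simple gap report: anything unencrypted or with no named owner needs action.
gaps = [r["asset"] for r in inventory if not r["encrypted"] or r["owner"] is None]
print(gaps)  # ['support call logs']
```

Keeping this as data rather than a one-off spreadsheet means the gap report can be re-run every time the inventory changes.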

In terms of getting help, this article is a great place to start: What is GDPR? Everything you need to know about the new general data protection regulations

However, if you’re worried your organisation is behind the curve, there is still time to ensure that you do everything to be GDPR compliant. There is an abundance of free guidance available from the National Cyber Security Centre and the ICO on how to ensure your corporate cybersecurity policy is correct and up to date.

The ICO suggests that, rather than being fearful of GDPR, organisations should embrace GDPR as a chance to improve how they do business. The Information Commissioner Elizabeth Denham stated:

“The GDPR offers a real opportunity to present themselves on the basis of how they respect the privacy of individuals, and over time this can play more of a role in consumer choice. Enhanced customer trust and more competitive advantage are just two of the benefits of getting it right”

If you require pragmatic advice on the implementation of GDPR data security and management, please feel free to contact us for a chat. We have assessed and guided a number of our clients through the maze of regulations, including GDPR. Please contact Thomas.Loxley@broadgateconsultants.com in the first instance.

 

Be aware of “AI Washing”

Posted on : 26-01-2018 | By : john.vincent | In : Cloud, Data, General News, Innovation


I checked, and it’s almost 5 years now since we wrote about the journey to cloud and mentioned “cloud washing“, the process by which technology providers were re-positioning previous offerings as “cloud enabled”, “cloud ready” and the like.

Of course, the temptation to do this is natural. After all, if the general public can trigger a 200% increase in share price simply by re-branding your iced tea company to “Long Blockchain“, then why not.

And so we enter another “washing” phase, this time in the form of a surge in Artificial Intelligence (AI) powered technologies. As the enterprise interest in AI and machine learning gathers pace, software vendors are falling over each other to meet the market demands.

Indeed, according to Gartner, by 2020:

AI technologies will be virtually pervasive in almost every new software product and service

This is great news and the speed of change is outstanding. However, it does pose some challenges for technology leaders and decision makers as the hype continues.

Firstly, we need to apply the “so what?” test against the claims of AI enablement. The fact that a product has AI capabilities doesn’t propel it automatically to the top of selection criteria. It needs to be coupled with true business value rather than being simply a sales and marketing tool.

Whilst that sounds obvious, before you cry “pass me another egg Vincent”, it does warrant a pause and reflection. Human behaviour, and the pressure to generate business value against a more difficult backdrop, can easily drive a penchant for the latest trend (anyone seen “GDPR compliant” monikers appearing?)

In terms of bandwagon jumping, Gartner says:

Similar to greenwashing, in which companies exaggerate the environmental-friendliness of their products or practices for business benefit, many technology vendors are now “AI washing” by applying the AI label a little too indiscriminately

The second point is to ask the question “Is this really AI, or Automation?”. I’ve sat in a number of vendor presentations through 2017 where I asked exactly that. After much deliberation, pontification and several “well, umms”, we agreed that it was actually the latter we were discussing. Indeed, these terms are often interchanged at will during pitches, which can be somewhat disconcerting.

The thing is, Automation doesn’t have the “Blade Runner-esque” cachet of AI, which conjures up the usual visions that the film industry has imprinted on our minds (of course, to counter this we’ve now got Robotic Process Automation!)

So what’s the difference between AI and Automation? The basic definitions are:

  • Automation is software that follows pre-programmed ‘rules’.
  • Artificial intelligence is designed to simulate human thinking.
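A toy contrast makes the distinction concrete (these are illustrative stand-ins, not real products): the automated filter applies a fixed, pre-programmed rule, while the “learning” filter derives its rule from the data it has seen:

```python
# Automation vs. (very crude) learning, side by side.

def automated_filter(value, threshold=100):
    """Automation: a pre-programmed rule that never changes."""
    return value > threshold

class LearningFilter:
    """A crude stand-in for machine learning: the threshold adapts to data."""
    def __init__(self):
        self.seen = []

    def observe(self, value):
        self.seen.append(value)

    def is_anomaly(self, value):
        mean = sum(self.seen) / len(self.seen)
        return value > 2 * mean  # rule derived from experience, not hard-coded

f = LearningFilter()
for v in [10, 12, 11, 9]:
    f.observe(v)

print(automated_filter(150))  # True: the fixed rule fires
print(f.is_anomaly(150))      # True: 150 is far above the learned baseline
print(f.is_anomaly(15))       # False: within the range it has "learned"
```

Real machine learning replaces the mean-times-two rule with a trained model, but the defining difference is the same: the behaviour comes from the data, not the programmer.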

Automation is everywhere and has been an important part of industry for decades. It enables machines to perform repetitive, monotonous tasks, freeing up time for human beings to focus on the activities that require more reasoning, rationale and a personal touch. This drives a more productive and efficient business and personal life.

The difference with Automation is that it requires manual configuration and set up. It is smart, but it has to follow set instructions and workflows.

AI, however, is not developed simply to follow a set of predefined instructions. It is designed to mimic human behaviour: to continuously seek patterns, learn from its data and “experiences”, and determine the appropriate course of action or response based on these parameters. This all comes under the general heading of “machine learning”.

The common “fuel” that drives both Automation and AI is data. It is the lifeblood of the organisation, and we now live in an environment where “data driven” technologies sit at the centre of the enterprise.

Whilst it’s hard to ignore all the hype around AI, it is important for decision makers to think carefully not only about what they want to achieve, but also about how to filter out the “AI washing”.

2017 – A great year for the hackers

Posted on : 29-12-2017 | By : Tom Loxley | In : Cloud, compliance, Cyber Security, Data, data security, FinTech, GDPR, Uncategorized


This year saw some of the biggest data breaches so far; we saw cover-ups exposed and ransoms reaching new highs.

Of course, it’s no secret that when it comes to cybersecurity this was a pretty bad year, and I’m certain that there are many CIOs, CISOs, CTOs and indeed CEOs wondering what 2018 has to offer from the hackers.

The 2018 threat landscape is sure to be full of yet more sophisticated security attacks. However, the big win for 2017 is that people have woken up to the threat: “not if, but when” has finally been acknowledged, and people are becoming as proactive and creative as the attackers in protecting their companies. The old adage that “offence is the best form of defence” still rings true.

With that in mind, we’re going to look back at some of what 2017 had to offer. The past may not predict the future, but it certainly gives you a good place to start your planning for it.

So let’s take a look at some of the most high profile data breaches of 2017.

Equifax (you guessed it) – No doubt you’ll have heard of this breach, and because of its huge scale it’s very likely that if you weren’t directly affected yourself, you’ll know someone who was. This breach was, and still is being, highly publicised, and for good reason. A plethora of litigation and investigations followed the breach in an effort to deal with the colossal scale of personal information stolen. This includes over 240 individual class-action lawsuits, an investigation opened by the Federal Trade Commission, and more than 60 government investigations from U.S. state attorneys general, federal agencies and the British and Canadian governments. More recently a rare 50-state class-action suit has been served on the company.

Here are some of the facts:

  • 145.5 million people potentially affected (a figure recently revised by Equifax, 2.5 million more than it initially reported).
  • The number of U.K. consumers affected is unknown; Equifax said it is still determining the extent of the breach for U.K. consumers.
  • 8,000 potential Canadian victims (recently revised down from 100,000).
  • High-profile senior leaders have left since the breach: former CEO Richard Smith retired (reportedly banking a $90 million retirement golden handshake), and the chief information officer and chief security officer have also “left”.
  • An unknown number of internal investigations are taking place into board members (including the chief financial officer and general counsel) for selling stock after the breach’s discovery but before its public disclosure.
  • The breach lasted from mid-May through July.
  • The hackers accessed people’s names, Social Security numbers, birth dates, addresses and, in some instances, driver’s license numbers.
  • They also stole credit card numbers for about 209,000 people and dispute documents with personal identifying information for about 182,000 people.

Uber – The big story here wasn’t so much the actual breach, but the attempt to cover it up. The breach itself actually happened in 2016. The hackers stole the personal data of 57 million Uber customers, and Uber paid them $100,000 to cover it up. However, the incident wasn’t revealed to the public until this November, when the breach was made known by the new Uber CEO Dara Khosrowshahi.

Uber has felt the backlash from the cover-up globally and on varying scales: from the big guns in the US, where three senators introduced a bill that could make executives face jail time for knowingly covering up data breaches, right through to the city of York in the UK, which voted against renewing Uber’s licence on December 23 due to concerns about the data breach.

Deloitte – According to a report from the Guardian in September earlier this year, a Deloitte global email server was breached, giving the attackers access to emails to and from the company’s staff, not to mention customer information on some of the company’s most high-profile public and private sector clients. Although the breach was discovered in March 2017, it is thought that the hackers had been in the company’s systems since October or November 2016. During this period, the hackers could have had access to information such as usernames, passwords, IP addresses and architectural design diagrams. Deloitte confirmed the breach, saying that the hack had taken place through an admin account and that only a few clients were impacted by the attack.

Now, if I covered even half of the high-profile cyber-attack cases in detail, this article would look more like a novel. Plus, as much as I love to spend my time delighting you, my dear readers, it is Christmas, which means I have bad TV to watch, family arguments to take part in and copious amounts of calories (alcohol) to consume and feel guilty about for the next 3 months. So, with that in mind, let’s do a short recap of some of the other massive exploits and data breaches of this past year:

  1. Wonga, the payday loan firm suffered a data breach which may have affected up to 245,000 customers in the UK.
  2. WannaCry and Bad Rabbit: these massive ransomware attacks affected millions of computers around the world, including the NHS.
  3. The NSA was breached by a group called The Shadow Brokers. They stole and leaked around 100GB of confidential information and hacking tools.
  4. WikiLeaks’ Vault 7 leak: WikiLeaks exposed the CIA’s secret documentation and user guides for hacking tools targeting the Mac and Linux operating systems.
  5. Due to a vulnerability, Cloudflare unwittingly leaked customer data from Uber, OKCupid and 1Password.
  6. Bell Canada was threatened by hackers with the leak of 9 million customer records. When the company refused to pay, some of the information was published online.
  7. Other hacks include Verizon, Yahoo, Virgin America, Instagram…the list goes on.

So, all in all not a great year. But looking on the bright side, whether or not you were on the wrong end of a cyber-attack this year, there are plenty of lessons to be learnt from the attacks that took place and some easy wins you can get by doing the basics right. We’ll be exploring some of these in our newsletter in 2018, delving into the timelines of some of the more high-profile attacks to help our readers understand and deal with an attack if they’re ever unfortunate enough to be in that situation. But if you can’t wait that long and want some advice now, please feel free to get in touch anytime.

 

Could You Boost Your Cybersecurity With Blockchain?

Posted on : 28-11-2017 | By : Tom Loxley | In : Blockchain, Cloud, compliance, Cyber Security, Data, data security, DLT, GDPR, Innovation


Securing your data, the smart way

 

The implications of Blockchain technology are being felt across many industries; in fact, the disruptive effect it’s having on Financial Services is changing the fundamental ways we bank and trade. Its presence is also impacting Defence, Business Services, Logistics, Retail; you name it, the applications are endless, although not all blockchain applications are practical or worth pursuing. Like all things which have genuine potential and value, it is accompanied by the buzzwords, trends and fads that undermine it, as many try to jump on the bandwagon and cash in on the hype.

However, one area where tangible progress is being made and where blockchain technology can add real value is in the domain of cybersecurity and in particular data security.

Your personal information and data are valuable, and therefore worth stealing and worth protecting, and many criminals are working hard to exploit this. In the late 90s, data collection began to ramp up with the popularity of the internet, and now the hoarding of our personal and professional data has reached fever pitch. We live in the age of information, and information is power. It translates directly to value in the digital world.

However, some organisations, public and private sector alike, have dealt with our information in such a flippant and negligent way that they don’t even know what they hold, how much of it they have, or where and how it is stored.

Lists of our information are emailed to multiple people on spreadsheets, downloaded and saved onto desktops, copied, chopped, pasted, formatted into different document types, uploaded onto cloud storage systems and then duplicated in CRMs (customer relationship management systems), and so on…are you lost yet? Well, so is your information.

This negligence doesn’t happen with any malice or negative intent, but simply through a lack of awareness and a lack of process or procedure around data governance (or a failure to implement what process and procedure do exist).

Human nature dictates that we take the easiest route. Combine this with deadlines needing to be met and a reluctance to delete anything in case we need it later, and we end up with information being continually copied, replicated and stored in every nook and cranny of hard drives, networks and clouds until we don’t know what is where anymore. As if this wasn’t bad enough, it makes the information nearly impossible to secure.
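One basic cleanup technique implied here is finding all the copies of the same content by fingerprinting it, wherever it has been pasted. A minimal sketch (the file paths are illustrative):

```python
# Find duplicate copies of the same content by hashing it: identical
# content produces identical fingerprints regardless of where it lives.
import hashlib

def fingerprint(content):
    return hashlib.sha256(content.encode()).hexdigest()

documents = {
    "desktop/customers.csv": "alice,bob,carol",
    "email/attachment.csv": "alice,bob,carol",    # same data, copied
    "cloud/list_v2.csv": "alice,bob,carol,dave",  # genuinely different
}

by_hash = {}
for path, content in documents.items():
    by_hash.setdefault(fingerprint(content), []).append(path)

duplicates = [paths for paths in by_hash.values() if len(paths) > 1]
print(duplicates)  # the two identical copies are grouped together
```

In practice you would hash file bytes on disk and across shares, but the principle is the same: you cannot secure copies you haven’t found.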

In fact, for most, it’s just easier to buy more space in your cloud or buy a bigger hard drive than it is to maintain a clean, data-efficient network.

Big budgets aren’t the key to securing data either. Equifax is still hurting from an immense cybersecurity breach earlier this year, during which cybercriminals accessed the personal data of approximately 143 million U.S. Equifax consumers. Equifax isn’t the only one; if I were to list all the serious data breaches of the last year or two, you’d end up both scarred and bored by the sheer volume. The scale is hard to comprehend: the money criminals have ransomed out of companies and individuals, the amount of data stolen, and the number of companies breached are all huge and growing.

So it’s no surprise that anything in the tech world that can vastly aid cybersecurity and in particular securing information is going to be in pretty high demand.

Enter blockchain technology

 

The beauty of a blockchain is that it kills two birds with one stone: controlled security and order.

Blockchains provide immense benefits when it comes to securing our data (the blockchain technology that underpins the cryptocurrency Bitcoin has never been breached since its inception over 8 years ago).

Blockchains store their data on an immutable record: once the data is stored, it’s not going anywhere. Each block (or piece of information) is cryptographically chained to the next block in chronological order. Multiple copies of the blockchain are distributed across a number of computers (or nodes); if an attempted change is made anywhere on the blockchain, all the nodes become aware of it.

For a new block of data to be added, there must be a consensus amongst the other nodes (on a private blockchain, the number of nodes is up to you). This means that once information is stored on the blockchain, in order to change or steal it you would have to reverse-engineer near-unbreakable cryptography (perhaps hundreds of times, depending on how many other blocks of information were stored after it), and then do the same on every other node that holds a copy of the blockchain.
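The chaining mechanics can be illustrated with a toy hash chain in a few lines of Python. This is not Bitcoin’s implementation, just the principle that each block commits to the previous block’s hash, so editing an early block invalidates everything after it:

```python
# Toy hash chain: each block stores the previous block's hash, so any
# tampering with earlier data breaks the chain for every later block.
import hashlib

def block_hash(data, prev_hash):
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64  # genesis blocks chain from a zero hash
    for data in records:
        h = block_hash(data, prev)
        chain.append({"data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(block["data"], prev):
            return False
        prev = block["hash"]
    return True

chain = build_chain(["record A", "record B", "record C"])
print(is_valid(chain))         # True
chain[0]["data"] = "record X"  # tamper with an early block...
print(is_valid(chain))         # False: every later block now disagrees
```

A real blockchain adds the consensus step on top: every node runs this kind of validation independently, so a tamperer would have to rewrite the chain on a majority of nodes at once.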

That means that when you store information on a blockchain, it is all transparently monitored and recorded. Another benefit of using blockchains for data security is that private blockchains are permissioned, so accountability and responsibility are enforced by definition; and in my experience, when people become accountable for what they do, they tend to care a lot more about how they do it.

One company that has taken the initiative in this space is Gospel Technology. Gospel Technology has taken the security of data a step further than simply storing information on a blockchain, they have added another clever layer of security that further enables the safe transfer of information to those who do not have access to the blockchain. This makes it perfect for dealing with third parties or those within organisations who don’t hold permissioned access to the blockchain but need certain files.

One of the issues with blockchains is the user interface. It’s not always pretty or intuitive, but Gospel has taken care of this too, with a simple and elegant platform that makes data security easy for the end user. The company describes its product Gospel® as an enterprise-grade security platform, underpinned by blockchain, that enables data to be accessed and tracked with absolute trust and security.

The applications for Gospel are many and it seems that in the current environment this kind of solution is a growing requirement for organisations across many industries, especially with the new regulatory implications of GDPR coming to the fore and the financial penalties for breaching it.

From our point of view as a consultancy in the Cyber Security space, we see the genuine concern and need for clarity, understanding and assurance from our clients and the organisations that we speak to on a daily basis. The realisation that data and cyber security is now something that can’t be taken lightly has begun to hit home. The issue for most businesses is that there are so many solutions out there that it’s hard to know what to choose, and so many threats that trying to stay on top of them without dedicated staff is nearly impossible. However, the good news is that there are good quality solutions out there, and with a little effort, some guidance and a considered approach to your organisation’s security, you can turn back the tide on data security and protect your organisation well.

GDPR & Cyber-threats – How exposed is your business?

Posted on : 28-11-2017 | By : Tom Loxley | In : Cloud, compliance, Cyber Security, Data, data security, GDPR

Tags: , , , , , , , , , , , ,

0

With the looming deadline approaching for the ICO enforcement of GDPR it’s not surprising that we are increasingly being asked by our clients to assist in helping them assess the current threats to their organisation from a data security perspective. Cybersecurity has been a core part of our services portfolio for some years now and it continues to become more prevalent in the current threat landscape, as attacks increase and new legislation (with potentially crippling fines) becomes a reality.

However, the good news is that with some advice, guidance, consideration and a little effort, most organisations will find it easy enough to comply with GDPR and to protect themselves well against the current and emerging threats out there.

The question of measuring an organisation’s threat exposure is not easy. There are many angles and techniques that companies can take, from assessing processes, audit requirements, regulatory posture, perimeter defence mechanisms, end-user computing controls, network access and so on.

The reality is, companies often select the approach that suits their current operating model, or if independent, one which is aligned with their technology or methodology bias. In 99% of cases, what these assessment approaches have in common is that they address a subset of the problem.

At Broadgate, we take a very different approach. It starts with two very simple guiding principles:

  1. What are the most critical data and digital assets that your company needs to protect?
  2. How do your board members assess, measure and quantify security risks?

Our methodology applies a top-down lens over these questions and then looks at the various inputs into them. We also consider the threats in real-world terms, discarding the “FUD” (Fear, Uncertainty and Doubt) that many service providers use to embed solutions and drive revenue, often against the real needs of clients.

Some of the principles of our methodology are:

  • Top Down – we start with the boardroom. As the requirements to understand, act and report on breaches within a company become more robust, it is the board/C-level executives who need the data on which to make informed decisions.

 

  • Traceability – any methodology should have a common grounding to position it and also to allow for comparison against the market. Everything we assess can be traced back to industry terminology from top to bottom whilst maintaining a vocabulary that resonates in the boardroom.

 

  • Risk Driven – to conduct a proper assessment of an organisation’s exposure to security breaches, it is vital that companies accurately understand the various aspects of their business profile and the potential origin of threats, both internal and external. For a thorough assessment, organisations need to consider the likelihood and impact from various data angles, including regulatory position, industry vertical, threat trends and of course, the board members themselves (as attacks are more and more personal by nature). Our methodology takes these, and many other aspects, into consideration and applies a value at risk, which allows for focused remediation plans and development of strategic security roadmaps.

 

  • Maturity Based – we map the key security standards and frameworks, such as GDPR, ISO 27001/2, SANS Top 20, Cyber Essentials etc. from the top level through to the mechanics of implementation. We then present these in a non-technical, business language so that there is a very clear common understanding of where compromises may exist and also the current state maturity level. This is a vital part of our approach which many assessments do not cover, often choosing instead to present a simple black and white picture.

 

  • Technology Best Fit – the commercial success of the technology security market has led to a myriad of vendors plying their wares. Navigating this landscape is very difficult, particularly understanding the different approaches to prevention, detection and response.

At Broadgate, we have spent years looking into the best fit technologies to mitigate the threats of a cyber-attack or data breach, and this experience forms a cornerstone of our methodology. Your business can also benefit from our V-CISO service to ensure you get an executive level of expertise, leadership and management to lead your organisation’s security. Our mantra is “The Business of Technology”. This applies to all of our products and services, and never more so than when it comes to really assessing the risks in the security space.

If you would like to explore our approach in more detail, and how it might benefit your company, please contact me at john.vincent@broadgateconsultants.com.

Digital out of Home – a growth and innovation market

Posted on : 28-09-2017 | By : jo.rose | In : Cloud, Innovation, IoT

Tags: , , , , , , , , ,

0

The acceleration of growth in the digital out of home market (DOOH) is impressive. As providers switch from traditional mediums to digital based technologies and with creative technological advances, such as programmatic and Virtual Reality (VR) and Augmented Reality (AR), it is an exciting time for the sector. Indeed, in 2016 the market was valued at USD 12.52 billion and is forecast to grow to over USD 26 billion by 2023.

According to William Eccleshare, chairman and chief executive of DOOH media provider Clear Channel International:

“Globally, press has collapsed, TV is static and radio has declined. Outdoor, though, has been growing steadily.”

This growth naturally brings opportunities for the large incumbents (such as Clear Channel) as well as new startups, but at the same time there are challenges to switch existing inventory to the new distribution mediums, transform legacy systems and business process, as well as the requirements to design scalable and secure digital networks.

As with all industries, the DOOH ecosystem is shifting to cloud based platforms to allow businesses both to flex with demand and to deploy campaigns to audiences on a global basis. These platforms are capable of processing the increasingly large and complex data used in the delivery of more targeted, audience driven products, which are more cost effective and allow for better integration with external systems. Indeed, as the internet of things (IoT) gathers pace, this data requirement and inter-connectivity will continue to grow.

Let’s look at some of the trends in a bit more detail:

Programmatic: Firstly, there’s a lot of talk about programmatic advertising and its major influence on the overall DOOH market. A programmatic advertising platform is an online auction where media buyers specify their targeting requirements, such as audience demographics, time of day and location, as well as their budgetary constraints. In itself this isn’t particularly innovative; other markets, such as retail auctions and financial services, have offered similar mechanisms for many years. What it will do, though, is put pressure on the players (and margins) in the current value chain, from advertising creative to distribution. It will also put further pressure on the incumbents who carry more legacy technical debt.
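As a rough illustration of the auction mechanics, many ad exchanges run a second-price model: the highest bidder wins the impression but pays the runner-up’s bid (or the floor). A minimal sketch, with hypothetical bidder names and prices:

```python
def run_auction(bids, floor_price=0.0):
    """Second-price auction: highest bidder wins, pays the
    second-highest bid (or the floor if they bid alone)."""
    eligible = {buyer: price for buyer, price in bids.items()
                if price >= floor_price}
    if not eligible:
        return None  # no bid met the publisher's floor
    ranked = sorted(eligible.items(), key=lambda kv: kv[1], reverse=True)
    winner, _top_bid = ranked[0]
    clearing = ranked[1][1] if len(ranked) > 1 else floor_price
    return winner, clearing

# Media buyers' CPM bids for one impression matching their targeting rules
bids = {"buyer_a": 4.20, "buyer_b": 5.10, "buyer_c": 3.80}
print(run_auction(bids, floor_price=2.00))  # buyer_b wins, pays 4.20
```

In a real exchange this happens per impression in milliseconds, with targeting filters applied before the ranking step.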

Data is everything: whilst (within reason) signs themselves remain static, the data regarding audiences and how they interact with their environment does not. It constantly changes based on numerous factors, from the time of day to the weather and external news events. With over 75% of UK consumers owning a smartphone, and checking it c.80 times a day, harnessing and correlating this data as consumers go about their daily lives creates value. This plays naturally into the hands of the tech companies and mobile providers who have the resources, networks and expertise to exploit it. Here the providers of the digital infrastructure have a real challenge to maintain a foothold and become an integral part of the chain, rather than a consumer of more and more costly data.

User Experience enrichment: DOOH is providing more opportunities than ever to touch, interact and engage with valuable consumers; helping to bring brands to life in creative and digitally disruptive ways. In today’s “Experience Economy”, it is estimated that 65% of 18-34 year olds are more fulfilled by live experiences than by possessions. Digital advertising is already interactive in many senses, through simple NFC, QR codes, facial recognition, context awareness etc., and we expect further innovations in a connected context to develop at pace.

Augmented Reality: the first big AR sensation was Pokemon Go. Within a week of its launch last year, more than 28 million people a day were walking around town staring at their screens to catch a Pokemon (much to the bewilderment of many onlookers). Now technology partners and advertisers are rightly excited about the potential. Tim Cook recently said that AR presented:

“broad mainstream applicability across education, entertainment, interactive gaming, enterprise, and categories we probably haven’t even thought of”

Beacon connectivity: to facilitate the consumer personalisation journey and communication, beacons are becoming more prevalent throughout the DOOH infrastructure, with a presence in taxis, retailers, buses, billboards, kiosks etc. We see this further with Google’s Eddystone beacons, an open beacon format for both Android and iOS used to create proximity-based experiences for consumers. These developments have shifted the trend towards the creation of a new personalisation channel based on precision of time and location, and thus context-based digital advertising.
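For a flavour of what an open beacon format looks like on the wire, the Eddystone-UID frame carries a frame-type byte, a calibrated TX power byte, a 10-byte namespace and a 6-byte instance identifier. A minimal parser might look like this (the helper is an illustrative sketch, not part of any SDK):

```python
def parse_eddystone_uid(frame: bytes):
    """Parse an Eddystone-UID service-data frame: 1 byte frame type (0x00),
    1 byte calibrated TX power at 0 m, 10-byte namespace, 6-byte instance."""
    if len(frame) < 18 or frame[0] != 0x00:
        raise ValueError("not an Eddystone-UID frame")
    tx_power = int.from_bytes(frame[1:2], "big", signed=True)  # dBm at 0 m
    return {
        "tx_power_dbm": tx_power,
        "namespace": frame[2:12].hex(),   # identifies the beacon owner/fleet
        "instance": frame[12:18].hex(),   # identifies the individual beacon
    }

# A fabricated frame: type 0x00, TX power 0xE7 (-25 dBm), dummy IDs
frame = bytes([0x00, 0xE7]) + bytes(range(10)) + bytes(range(6))
beacon = parse_eddystone_uid(frame)
print(beacon["namespace"], beacon["instance"])
```

The namespace/instance split is what lets an advertiser resolve "which billboard, which taxi" from a single broadcast packet.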

It’s an exciting time to be involved in DOOH innovation with great potential for media tech disruption, but with some significant risks for traditional players, some of which will struggle to shift their operating model and compete.


Are we addicted to “Digital”?

Posted on : 28-02-2017 | By : john.vincent | In : Cloud, Data, Innovation, IoT, Uncategorized

Tags: , , , , , , , ,

0

There’s no getting away from it. The speed of technology advancement is now a major factor in changing how we interact with the world around us. For the first time, it seems that innovation in technology is being applied across every industry to drive innovation, increase efficiency and open new market possibilities, whilst in our daily lives we rely more and more on a connected existence. This is seen in areas such as the increase in wearable tech and the Internet of Things.

But what is the impact on business and society of this technology revolution regarding human interaction?

Firstly, let’s get the “Digital” word out on the table. Like cloud before it, the industry seems to have adopted a label on which we can pin everything related to advancement in technology. Whilst technically relating to web, mobile, apps etc. it seems every organisation has a “digital agenda”, likely a Chief Digital Officer and often a whole department in which some sort of alchemy takes place to create digital “stuff”. Meanwhile, service providers and consultancies sharpen their marketing pencils to ensure we are all enticed by their “digital capabilities”. Did I miss the big analogue computing cut-over in the last few years?

What “digital” does do (I guess) is position the narrative away from just technology to a business led focus, which is a good thing.

So how is technology changing the way that we interact on a human level? Before we move on to the question of technology dependence, let’s look at some other applications.

Artificial Intelligence (AI) is a big theme today. We’ve discussed the growth of AI here before, and its impact on future jobs. However, one interesting area relating to social interaction is the development of emotionally intelligent AI software. This is most evident in call centres, where some workers can now receive real-time coaching from software that analyses their conversations with customers. During the call the software can recommend changes, such as to style or pace, or warn about the emotional state of the customer.

Clever stuff, and whilst replacing call centre agents with robots is still something that many predict is a way off (if it happens at all), it does offer an insight into the way that humans and AI might interact in the future. By developing AI to understand mental states from facial expressions, vocal nuances, body posture and gesture, software can make decisions such as adapting the way a navigation system works depending on the driver’s mental state (for example, lost or confused), or picking the right moment to sell something based on emotional state. The latter does, however, raise wider ethical issues.

So what about the increase in digital dependency and its social impacts? Anyone who has been in close proximity to “millennial gatherings” will have witnessed the sight of them sitting together, heads bowed, thumbs moving at a speed akin to Bradley Cooper’s character in Limitless, punctuated by the odd murmuring, comment or interjection. It seems that once we drop in a bit of digital tech and a few apps, we stifle the art of conversation.

In 2014 a programmer called Kevin Holesh developed an app called Moment, which measures the time that a user is interacting with a screen (it doesn’t count time on phone calls). The results are interesting: 88% of those that downloaded the app used their phone for more than an hour a day, with the average being three hours. Indeed, over a 24 hour period, the average user checked their phone 39 times. By comparison, just six years earlier in 2008 (before the widespread use of smartphones) people spent just 18 minutes a day on their phone.

It’s the impact on students and the next generation that has raised a few alarm bells. Patricia Greenfield, distinguished professor of psychology and director of the UCLA Children’s Digital Media Center, found in a recent study that college students felt closest (or “bonded”) to their friends when they talked face to face, and most distant from them when they text-messaged. However, the students still most often communicated by text.

“Being able to understand the feelings of other people is extremely important to society,” Greenfield said. “I think we can all see a reduction in that.”

Technology is changing everything about how we interact with each other, how we arrange our lives, what we eat, where and how we travel, how we find a partner, how we exercise etc… It is what makes up the rich fabric of the digitised society and will certainly continue to evolve at a pace. Humans, however, may be going the other way.

A few tips to securing data in the cloud

Posted on : 30-11-2016 | By : john.vincent | In : Cloud, Cyber Security, Data, Uncategorized

Tags: , , , , , , , , , , ,

0

In our view, we’ve finally reached the point where the move from internally built and managed technology to cloud based applications, platforms and compute services is now the norm. There are a few die hard “remainers” but the public has chosen – the only question now is one of pace.

Cloud platform adoption brings a host of benefits, from agility in deployment and cost efficiency to improved productivity and collaboration, amongst others. Of course, the question of security is at the forefront, and quite rightly so. As I write this, the rolling data breach news continues, with today’s story being potentially compromised accounts at the National Lottery.

We are moving to a world where the governance of cloud based services becomes increasingly complex. For years organisations have sought to find, capture or shut down internal pockets of “shadow IT”, seeing them as a source of inefficiency and increasing risk. In today’s world, however, these shadows are more fragmented, with services and data moving towards the end user edge of the corporate domain.

So with more and more data moving to the cloud, how do we protect against malicious activity, breaches, fraud or general internal misuse? Indeed, regarding the last point, the Forrsights Security Survey stated:

“Authorised users inadvertently exposing sensitive information was the most common cause of data breaches in the past 12 months.”

We need to think of the challenge in terms of people, process and technology. Often, we have a tendency to jump straight to an IT solution, so let’s come to that later. Firstly, organisations need to look at a few fundamental pillars of good practice:

  1. Invest in User Training and Awareness – it is important that all users throughout an organisation understand that security is a collective responsibility. The gap between front and back office operations is often too wide, but in the area of security organisations must instil a culture of shared accountability. Understanding and educating users on the risks, in a collaborative way rather than merely enforcing policy, is probably the top priority for many organisations.
  2. Don’t make security a user problem – we need to secure the cloud based data and assets of an organisation in a way that balances protection with the benefits that cloud adoption brings. Often, the tendency can be to raise the bar to a level that constrains both user adoption and productivity. We often hear that IT leads the positioning of this barrier irrespective of the business processes or outcomes, which tends to result in an overly risk averse approach without considering the disruption to business processes. The result? Either a winding back of the original solution, or users taking the path of least resistance, which often increases risk.

On the technology side, there are many approaches to securing data in the cloud. Broadly, these solutions fall into the category of Cloud Access Security Broker (CASB): software or a tool that sits between the internal on-premise infrastructure and the cloud provider, be that software, platform or another kind of as-a-service. The good thing about these solutions is that they can enforce controls and policies without the need to revert to the old approach of managing shadow IT functions, effectively allowing for a more federated model.

Over recent years, vendors have come to market to address the issue through several approaches. One of the techniques is through implementing gateways that either use encryption or tokenisation to ensure secure communication of data between internal users and cloud based services. However, with these the upfront design and scalability can be a challenge given the changing scope and volume of cloud based applications.

Another solution is to use an API based approach, such as that of Cloudlock (recently purchased by Cisco). This platform takes a programmatic approach to cloud security on key SaaS platforms, addressing areas such as Data Loss Prevention, Compliance and Threat Protection with User and Entity Behaviour Analytics (UEBA). The last of these uses machine learning to detect anomalies in cloud activities and access.
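To give a flavour of the UEBA idea, a toy baseline might flag days where a user’s activity sits far outside their own historical pattern. Real platforms use far richer features and learned models; this sketch just applies a z-score to login counts:

```python
from statistics import mean, stdev

def flag_anomalies(daily_logins, threshold=2.0):
    """Return indices of days whose login count deviates from the
    user's own baseline by more than `threshold` standard deviations."""
    if len(daily_logins) < 2:
        return []
    mu, sigma = mean(daily_logins), stdev(daily_logins)
    if sigma == 0:
        return []  # perfectly regular behaviour, nothing to flag
    return [i for i, n in enumerate(daily_logins)
            if abs(n - mu) / sigma > threshold]

# Normal usage hovers around 10 logins/day; day 6 spikes to 90.
history = [9, 11, 10, 12, 8, 10, 90, 11]
print(flag_anomalies(history))  # [6]
```

The point is that the baseline is per-entity: 90 logins is only anomalous because this user normally makes around 10.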

Hopefully some food for thought in the challenge of protecting data in the cloud, whichever path you take.