It’s Time to Take Control of Your Supply Chain Security

Posted on : 31-08-2018 | By : richard.gale | In : Uncategorized

According to the annual Symantec Threat Report, supply chain attacks rose by 200% between 2016 and 2017, confirming the trend for attackers to start small, move up the chain and hit the big time.

Attackers are increasingly hijacking software updates as an entry point to target networks further up the supply chain. Nyetya, a global attack that started this way, affected companies such as FedEx and Maersk and cost them millions.

Although many corporations have wised up to the need to protect their network and their data, have all their suppliers? And their suppliers' suppliers? All it takes is a single vulnerability at one of your trusted vendors for an attacker to gain access to your network, and your own and your customers' sensitive data could be compromised.

Even if your immediate third parties don’t pose a direct risk, their third parties (your fourth parties) might. It is crucial to gain visibility into the flow of sensitive data among all third and fourth parties, and closely monitor every organization in your supply chain. If you have 100 vendors in your supply chain and 60 of them are using a certain provider for a critical service, what will happen if that critical provider experiences downtime or is breached?

The changing nature of the digital supply chain landscape calls for coordinated, efficient and agile defences. Unless the approach to supply chain risk management moves with the times, we will continue to see an increase in third-party attacks.

Organizations need to fundamentally change the way they approach managing third-party risk, and that means more collaboration and automation of the process through the adoption of new technology and procedures. It is no longer sufficient simply to add some clauses to your vendor contract stating that everything that applies to your third-party vendor also applies to that vendor's sub-contractors.

Traditionally, vendor management means carrying out an assessment during the onboarding process and then perhaps an annual review to see if anything has changed since the initial review. This assessment only captures a point-in-time view against a moving threat environment: what looks secure today may not be secure next week!

The solution is to supplement this assessment with an external view of your vendors, using publicly available threat analytics to see what is happening on their networks today. With statistics coming through in real time, you can monitor your suppliers on a continuous basis. It is not possible to prevent every third-party attack on your supply chain, but with up-to-date monitoring, issues can be detected at the earliest possible opportunity, limiting the potential damage to your company's reputation and your clients' data.

Many vendor supply management tools use security ratings as a way of verifying the security of your suppliers, providing data-driven insights into a vendor's security performance by continuously analysing and monitoring companies' cybersecurity from the outside. Security ratings are generated daily, giving organizations continuous visibility into the security posture of key business partners. Using security ratings, an organisation can assess all suppliers in the supply chain at the touch of a button. This is a marked difference from the traditional point-in-time risk assessment.
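As a rough illustration of what this continuous monitoring can look like, the sketch below polls a hypothetical security-ratings API for each vendor and flags any rating that falls below an agreed threshold. The endpoint, field names and threshold are assumptions made for illustration; a real ratings platform will expose its own API.

```python
# Minimal sketch of a daily vendor-rating check.
# The API endpoint, response fields and threshold are hypothetical placeholders,
# not any specific ratings provider's real interface.
import requests

RATINGS_API = "https://ratings.example.com/api/v1/vendors"  # hypothetical endpoint
MIN_ACCEPTABLE_RATING = 650  # assumed threshold for this illustration

def check_vendors(vendor_ids, api_token):
    """Return (vendor_id, rating) pairs whose external rating is below threshold."""
    at_risk = []
    for vendor_id in vendor_ids:
        resp = requests.get(
            f"{RATINGS_API}/{vendor_id}",
            headers={"Authorization": f"Bearer {api_token}"},
            timeout=10,
        )
        resp.raise_for_status()
        rating = resp.json()["current_rating"]  # assumed field name
        if rating < MIN_ACCEPTABLE_RATING:
            at_risk.append((vendor_id, rating))
    return at_risk

if __name__ == "__main__":
    for vendor_id, rating in check_vendors(["vendor-001", "vendor-002"], "REPLACE_ME"):
        print(f"ALERT: {vendor_id} rating has dropped to {rating} - review required")
```

Run on a daily schedule, a check like this turns the annual questionnaire into a rolling view of every supplier in the chain.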

Here at Broadgate we have helped several clients take back control of their supply chain by implementing the right technology solution; together with the right policies and procedures, the security and efficiency of the vendor management process can be vastly improved.

If you are responsible for cyber security risk management in these times, you are certainly facing some overwhelming challenges. Implementing a vendor risk management program that is well-managed, well-controlled, and well-maintained will give you a more secure supply chain. Companies with more secure third parties will in turn have a lower risk of the financial or reputational damage that would result from a third-party breach. Don't fret about your supply chain; invest in it and you will reap the rewards!

M&A – Cyber Security Due Diligence

Posted on : 31-08-2018 | By : richard.gale | In : Cyber Security, data security, Finance

Following the discovery of two data breaches affecting more than 1 billion Yahoo Inc. users, Verizon Communications Inc. reduced its 2017 offer to acquire the company by $350 million. This transaction illustrates how a company's reputation and future are impacted by cybersecurity; failure to investigate these measures during mergers and acquisitions could lead to costly integration, unexpected liability and higher overall enterprise risk.

We can see almost daily the effect a data breach can have, with companies losing millions in terms of direct losses, reputational damage and customer loyalty. A hurried or limited cybersecurity vetting process may miss exposures or key indicators of an existing or prior breach.

It is crucial to understand cybersecurity vulnerabilities, the damage that may occur in the event of a breach, and the effectiveness of the infrastructure that the target business has in place. An appropriate evaluation of these areas could significantly impact the value that the acquirer places on the target company and how the deal is structured. It is therefore essential to perform a security assessment on the to-be-acquired company.

It wasn't that long ago that mergers and acquisitions deals were conducted in a paper-based data room, secured and locked down to only those with permitted access. These days the process has moved on and is now mostly online, with the secure virtual data room being the norm. Awareness of cyber security in the information-gathering part of the deal-making process is well established. It is the cyber security of the target company itself that has traditionally been under-emphasised, with attention focused instead on the technical and practical job of integrating the merged companies' infrastructure.

Deal makers must assess the cyber risk of the organisation they are acquiring in the same way that they would assess overall financial risk. Due diligence is all about establishing the potential liabilities of the company you are taking on. According to the Verizon Data Breach survey, it takes an average of 206 days to discover a breach, and companies are often breached without ever knowing. It is therefore important to look at cyber risk not just in terms of whether they have been breached, but also the likelihood and impact of a breach. An acquisition target that looks good at the time of closing the deal may not look quite so good a few months later.

The main reason the cyber threat is given so little weight is that M&A teams find it hard to quantify cyber risk, particularly given the time pressures involved. A cyber risk assessment at the M&A stage is crucial if the acquiring company wants to protect its investment. The ability to carry out this assessment and to quantify the business impact of a likely cyber breach with a monetary value is invaluable to deal makers. Broadgate's ASSURITY Assessment provides this information in a concise, value-specific way, using business language to measure risks, likelihood and cost of resolution.

A cyber security assessment should be part of every M&A due diligence process. If you don't know what you are acquiring in terms of intellectual property and cyber risk, how can you possibly know the true value of what you are acquiring?

 

Application Performance Management (APM) – Monitor Every Critical Swipe, Tap and Click

Posted on : 30-08-2018 | By : richard.gale | In : App, Consumer behaviour, Innovation

Customers expect your business application to perform consistently and reliably at all times, and for good reason. Many have built their own business systems based on the reliability of your application. This reliability target is your Service Level Objective (SLO), the measurable characteristic of a Service Level Agreement (SLA) between a service provider and its customer.

The SLO sets target values and expectations on how your service(s) will perform over time. It includes Service Level Indicators (SLIs)—quantitative measures of key aspects of the level of service—which may include measurements of availability, frequency, response time, quality, throughput and so on.

If your application goes down for longer than the SLO dictates, fair warning: all hell may break loose, and you may experience frantic pages from customers trying to figure out what's going on. Furthermore, a breach of your SLO error budget—the rate at which service level objectives can be missed—could have serious financial implications as defined in the SLA.
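To make the error budget concrete, here is a small worked sketch. The 99.9% target, 30-day window and observed downtime are illustrative assumptions rather than figures from any particular SLA; the sketch simply converts an availability SLO into the downtime it allows and checks consumption against it.

```python
# Worked example: turning an availability SLO into an error budget.
# The 99.9% target, 30-day window and downtime figure are illustrative assumptions.

SLO_TARGET = 0.999               # 99.9% availability objective
WINDOW_MINUTES = 30 * 24 * 60    # 30-day window = 43,200 minutes

error_budget_minutes = (1 - SLO_TARGET) * WINDOW_MINUTES
print(f"Allowed downtime this window: {error_budget_minutes:.1f} minutes")  # 43.2

observed_downtime_minutes = 12.0  # hypothetical figure from your monitoring
budget_consumed = observed_downtime_minutes / error_budget_minutes
print(f"Error budget consumed: {budget_consumed:.0%}")  # ~28%

if budget_consumed >= 1.0:
    print("SLO breached - the SLA's financial clauses may now apply")
elif budget_consumed >= 0.75:
    print("Budget nearly exhausted - consider pausing risky releases")
```

Tracking the budget this way also gives teams an objective signal for when to slow down feature releases, which ties into the next point.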

Developers are always eager to release new features and functionality. But these upgrades don’t always turn out as expected, and this can result in an SLO violation. Deployments and system upgrades will be needed, but anytime you make changes to applications, you introduce the potential for instability.

There are two companies currently leading the way in Business Service Monitoring: New Relic and AppDynamics. AppDynamics has been named a Leader in the Gartner Magic Quadrant for APM for the last six years. Its suite of application and business performance monitoring solutions ensures that every part of even the most complex, multi-cloud environments—from software to infrastructure to business outcomes—is highly visible, optimized, and primed to drive growth. The need for such a monitoring tool is evidenced by the large number of Tier One banks which have adopted it.

AppDynamics is a tool which enables you to track the numerous metrics behind your SLIs. You can choose which metrics to monitor, with additional tools that deliver deeper insights into areas such as End User Monitoring, Business iQ and Browser Synthetic Monitoring.

The suite can be broken down into the following components:

  • APM: Say your application relies heavily on APIs and automation. Start with a few APIs you want to monitor and ask, “Which of these APIs, if it fails, will impact my application or affect revenue?” These calls usually have a very demanding SLO.
  • End User Monitoring: EUM is the best way to truly understand the customer experience because it automatically captures key metrics, including end-user response time, network requests, crashes, errors, page load details and so on.
  • Business iQ: Monitoring your application is not just about reviewing performance data. Business iQ helps expose application performance from a business perspective, such as whether your app is generating revenue as forecasted or experiencing a high abandon rate due to degraded performance.
  • Browser Synthetic Monitoring: While EUM shows the full user experience, sometimes it’s hard to know if an issue is caused by the application or the user. Generating synthetic traffic allows you to differentiate between the two (a minimal probe of this kind is sketched below).
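The sketch below is a generic synthetic probe of the kind just described: it calls an endpoint on a fixed schedule, records response time and failures, and so produces traffic you control rather than relying on real users. It is an illustration only, not the AppDynamics agent or API; the target URL, interval and timeout are assumed placeholders.

```python
# Generic synthetic-monitoring probe (illustration only; not the AppDynamics agent).
# The target URL, polling interval and timeout are assumed placeholders.
import time
import requests

TARGET_URL = "https://app.example.com/api/health"  # hypothetical critical endpoint
INTERVAL_SECONDS = 60
TIMEOUT_SECONDS = 5

def probe_once():
    """Issue one synthetic request and return (succeeded, response_time_ms)."""
    start = time.monotonic()
    try:
        resp = requests.get(TARGET_URL, timeout=TIMEOUT_SECONDS)
        return resp.ok, (time.monotonic() - start) * 1000
    except requests.RequestException:
        return False, (time.monotonic() - start) * 1000

if __name__ == "__main__":
    while True:
        ok, elapsed_ms = probe_once()
        print(f"{'OK' if ok else 'FAIL'} {TARGET_URL} {elapsed_ms:.0f}ms")
        time.sleep(INTERVAL_SECONDS)
```

Because the probe's timing and frequency are constant, a slowdown here points to the application or infrastructure rather than to user behaviour.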

There is an SRE dashboard where you can view your KPIs (a sketch of how two of these figures can be derived follows the list):

  • SLO violation duration graph, response time (99th percentile) and load for your critical API calls
  • Error rate
  • Database response time
  • End-user response time (99th percentile)
  • Requests per minute
  • Availability
  • Session duration
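As a rough illustration of where two of the dashboard figures come from, the sketch below derives the 99th-percentile response time and the error rate from a batch of request records. The sample data, field names and nearest-rank percentile method are invented for the example.

```python
# Deriving two dashboard KPIs -- p99 response time and error rate --
# from a batch of request records. Sample data and field names are invented.
import math

def percentile(values, pct):
    """Nearest-rank percentile (pct in 0-100) of a non-empty list of numbers."""
    ordered = sorted(values)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

requests_batch = [
    {"duration_ms": 120,  "status": 200},
    {"duration_ms": 95,   "status": 200},
    {"duration_ms": 310,  "status": 500},
    {"duration_ms": 140,  "status": 200},
    {"duration_ms": 2050, "status": 504},
]

durations = [r["duration_ms"] for r in requests_batch]
errors = [r for r in requests_batch if r["status"] >= 500]

print(f"p99 response time: {percentile(durations, 99)} ms")   # 2050 ms
print(f"error rate: {len(errors) / len(requests_batch):.1%}")  # 40.0%
```

An APM tool computes these continuously over millions of requests, but the underlying arithmetic is no more complicated than this.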

SLI, SLO, SLA and error budget aren’t just fancy terms. They’re critical to determining if your system is reliable, available or even useful to your users. You should be able to measure these metrics and tie them to your business objectives, as the ultimate goal of your application is to provide value to your customers.