Why are we still getting caught by the ‘Phisher’men?

Posted on : 26-09-2019 | By : kerry.housley | In : Cyber Security, data security, Finance, Innovation


Phishing attacks have been on the increase and have overtaken malware as the most popular cyber attack method. Attackers are often able to convincingly impersonate users and domains, bait victims with fake cloud storage links, engage in social engineering and craft attachments that look like ones commonly used in the organisation.

Criminal scammers are using increasingly sophisticated methods, employing more complex phishing site infrastructures that look far more legitimate to the target. These include the use of well-known cloud hosting and document sharing services – established brand names that users believe are secure simply because they recognise them. Microsoft, Amazon and Facebook, for example, are at the top of the hackers’ list. Gone are the days when phishing simply involved the scammer sending a rogue email and tricking the user into clicking on a link!

And while we mostly associate phishing with email, attackers are taking advantage of a wide variety of attack methods to trick their victims. Increasingly, employees are being subjected to targeted phishing attacks directly in their browser, via legitimate-looking sites, ads, search results, pop-ups, social media posts, chat apps and instant messages, as well as rogue browser extensions and free web apps.

HTML phishing is a particularly effective means of attack because it can be delivered straight into browsers and apps, bypassing secure email gateways, next-generation antivirus and advanced endpoint protection. These surreptitious methods are also capable of evading URL inspection and domain reputation checking.

To make matters worse, the lifespan of a phishing URL has decreased significantly in recent years: a phishing gang can often gather the personal information it needs in around 45 minutes. The bad guys know how current technologies are trying to catch them, so they have devised imaginative new strategies to evade detection. For instance, they can change domains and URLs faster than blacklist-based engines can keep up. In other cases, malicious URLs are hosted on compromised sites that have good domain reputations. Within minutes of people clicking through, the attackers have collected all the data they need and moved on.

Only the largest firms have automated their detection systems to spot potential cyberattacks. Smaller firms are generally relying on manual processes – or no processes at all. This basic lack of protection is a big reason why phishing for data has become the first choice for the bad actors, who are becoming much more sophisticated. In most cases, employees can’t even spot the fakes, and traditional defences that rely on domain reputation and blacklists are not enough.

By the time the security teams have caught up, those attacks are long gone and hosted somewhere else. Of the tens of thousands of new phishing sites that go live each day, the majority are hosted on compromised but otherwise legitimate domains. These sites would pass a domain reputation test, yet they are still hosting the malicious pages. Given the urgency of this threat, financial institutions should adopt a more modern approach to defending their data: protections that can determine the threat level in real time and block the phishing hook before it draws out valuable information. In the meantime, there are some simple precautions everyone should take:

  • Always check the spelling of the URLs in email links before you click or enter sensitive information (a simple automated version of this check is sketched after this list)
  • Watch out for URL redirects, where you’re subtly sent to a different website with identical design
  • If you receive an email from a source you know but it seems suspicious, contact that source with a new email, rather than just hitting reply
  • Don’t post personal data, like your birthday, vacation plans, or your address or phone number, publicly on social media
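
As an illustration of the first tip, here is a minimal sketch of how that “check the spelling” advice can be automated – flagging hostnames that sit within a couple of character edits of a well-known domain. The trusted list and example URLs are ours, purely for illustration:

```python
# A minimal sketch of a lookalike-domain check: flag hostnames that are
# within a couple of character edits of a well-known, trusted domain.
# The TRUSTED list and example URLs are illustrative only.
from urllib.parse import urlparse

TRUSTED = ["paypal.com", "microsoft.com", "amazon.com", "facebook.com"]

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance, computed row by row."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # delete a char
                            curr[j - 1] + 1,             # insert a char
                            prev[j - 1] + (ca != cb)))   # substitute a char
        prev = curr
    return prev[-1]

def looks_suspicious(url: str) -> bool:
    host = (urlparse(url).hostname or "").lower()
    # Naive "registered domain" = last two labels (ignores .co.uk etc.)
    registered = ".".join(host.split(".")[-2:])
    for good in TRUSTED:
        distance = edit_distance(registered, good)
        if 0 < distance <= 2:   # close to a brand, but not the brand itself
            return True
    return False

print(looks_suspicious("https://secure.paypa1.com/login"))  # True  - one edit from paypal.com
print(looks_suspicious("https://www.paypal.com/login"))     # False - exact match, distance 0
```

Real mail gateways use far richer signals (homoglyphs, punycode, domain age and so on), but even this toy check catches the classic “paypa1.com” trick.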

We have started to work with Ironscales, a company which provides protection using machine learning to understand the normal behaviour of users’ email interactions. It highlights (and can automatically remove) suspect emails from the user’s inbox before they have time to open them. It cross-references this information with a multitude of other sources and with the actions of its other clients’ SOC analysts. This massively reduces the overhead in dealing with phishing or potential phishing emails and ensures that users are aware of the risks. Some great day-to-day examples include the ability to identify that an email has come from a slightly different email address or IP source. The product is being further developed to identify changes in grammar and language, to highlight where a legitimate email address from a known person may have been compromised. We really like the ease of use of the technology and the time saved on investigation and resolution.

If you would like to try Ironscales out, then please let us know.

 

Phishing criminals will continue to devise creative new ways of attacking your networks and your employees. Protecting against such attacks means safeguarding those assets with equal amounts of creativity.

Are we addicted to “Digital”?

Posted on : 28-02-2017 | By : john.vincent | In : Cloud, Data, Innovation, IoT, Uncategorized


There’s no getting away from it: the speed of technological advancement is now a major factor in changing how we interact with the world around us. For the first time, it seems that new technology is being applied across every industry to drive innovation, increase efficiency and open new market possibilities, whilst in our daily lives we rely more and more on a connected existence. This is seen in areas such as the rise of wearable tech and the Internet of Things.

But what is the impact on business and society of this technology revolution regarding human interaction?

Firstly, let’s get the “Digital” word out on the table. Like cloud before it, the industry seems to have adopted a label on which we can pin everything related to advancement in technology. Whilst technically relating to web, mobile, apps etc. it seems every organisation has a “digital agenda”, likely a Chief Digital Officer and often a whole department in which some sort of alchemy takes place to create digital “stuff”. Meanwhile, service providers and consultancies sharpen their marketing pencils to ensure we are all enticed by their “digital capabilities”. Did I miss the big analogue computing cut-over in the last few years?

What “digital” does do (I guess) is position the narrative away from just technology to a business led focus, which is a good thing.

So how is technology changing the way that we interact on a human level? Before we move on to the question of technology dependence, let’s look at some other applications.

Artificial Intelligence (AI) is a big theme today. We’ve discussed the growth of AI here before, and the impact on future jobs. However, one interesting area relating to social interaction is the development of emotionally intelligent AI software. This is most evident in call centres, where some workers can now receive real-time coaching from software which analyses their conversations with customers. During the call, the software can recommend changes in style or pace, or warn about the emotional state of the customer.

Clever stuff, and whilst many predict that replacing call centre agents with robots is still a way off (if it happens at all), it does offer an insight into the way that humans and AI might interact in the future. By developing AI to understand mental states from facial expressions, vocal nuances, body posture and gesture, software can make decisions such as adapting the way a navigation system works depending on the driver’s mental state (for example, lost or confused), or picking the right moment to sell something based on emotional state. The latter does, however, raise wider ethical issues.

So what about the increase in digital dependency and its social impact? Anyone who has been in close proximity to a “millennial gathering” will have witnessed the sight of them sitting together, heads bowed, thumbs moving at a speed akin to Bradley Cooper’s character in Limitless, punctuated by the odd murmur, comment or interjection. It seems that once we drop in a bit of digital tech and a few apps, we stifle the art of conversation.

In 2014 a programmer called Kevin Holesh developed an app called Moment, which measures the time that a user is interacting with a screen (it doesn’t count time on phone calls). The results are interesting: 88% of those who downloaded the app used their phone for more than an hour a day, with the average being three hours, and over a 24-hour period the average user checked their phone 39 times. By comparison, just six years earlier in 2008 (before the widespread use of smartphones) people spent just 18 minutes a day on their phone.

It’s the impact on students and the next generation that has raised a few alarm bells. Patricia Greenfield, distinguished professor of psychology and director of the UCLA Children’s Digital Media Center, found in a recent study that college students felt closest (or “bonded”) to their friends when they talked face to face, and most distant from them when they text-messaged. Yet the students still most often communicated by text.

“Being able to understand the feelings of other people is extremely important to society,” Greenfield said. “I think we can all see a reduction in that.”

Technology is changing everything about how we interact with each other, how we arrange our lives, what we eat, where and how we travel, how we find a partner, how we exercise etc… It is what makes up the rich fabric of the digitised society and will certainly continue to evolve at a pace. Humans, however, may be going the other way.

Why is cyber so popular with today’s criminal?

Posted on : 30-01-2015 | By : richard.gale | In : Cyber Security


In a recent interview, Manhattan District Attorney Cyrus Vance Jr stated that a third of the crimes his office investigates are now related to cybercrime and identity theft. Vance referred to it as a ‘tsunami’, and it has forced significant changes in the way his department works.

Cybercrime in all its forms accounts for 200-300 complaints per month and is rising fast – one of the few areas of crime that is. Most other types of crime are decreasing, a pattern mirrored in the UK.

 

So why is cyber crime on the increase, what sorts of crime are occurring, who are the criminals and how do they operate?

 

Why do criminals carry out cybercrime?

Ease – Carrying out cybercrime is getting easier. There are plenty of tools available, and some of the simplest crimes – such as scam emails purporting to be from your bank, or from someone who has lost their wallet abroad – need no special equipment at all. There is still a perception among some consumers that emails with the correct logos are official and should be taken seriously. More complex frauds using targeted malware and tools are harder to commit, but are becoming widespread as the value of the theft can be far greater. The ‘cost of entry’ to the cyber market is getting lower and the tools more prevalent.

Lower sentencing – Traditional crime, especially where violence or the threat of violence is involved, is usually severely punished. Cybercrime generally comes under the banner of ‘white collar’ crime, and the price criminals pay can be far lower: lighter or suspended sentences, or even just fines. This lower risk attracts criminals. Punishment of cybercrime may change as it matures, but for the moment it is an easy option.

Higher rewards, lower risk – The average ‘take’ for a bank robbery in the U.S. is $1,200, and the sentence for a violent crime can be life. By contrast, the average haul from a cybercrime is $4,600, the likelihood of any custodial sentence is low, and the chance of being caught at all is far smaller than for a bank robbery.

Comfort – Traditional crime is weather dependent: burglary rates go down when it is cold and raining (partially due to the lack of open windows, but also because burglars dislike going out in bad weather as much as the rest of us). A significant amount of cybercrime can be carried out from anywhere, including the comfort of the criminal’s own home.

 

What cybercrimes are popular? How are they carried out?

Hacking: This is a type of crime wherein a computer is broken into so that sensitive, confidential or personal information can be accessed by an unauthorised party. The criminal uses a variety of software to enter a person’s computer, and the victim may not be aware that their machine is being accessed from a remote location.

Theft: This crime occurs when a third party steals credentials to access and reuse or sell unauthorised data. It can include reproducing copyrighted material such as music, movies, games and software. There are many peer-to-peer sharing websites which encourage software piracy; these get shut down on a regular basis but spring up again very quickly.

Cyber Stalking: This is a kind of online harassment wherein the victim is subjected to a barrage of online messages and emails. Typically, these stalkers fall into two groups: those who know their victims and use the Internet to stalk them instead of offline methods, and those with no previous connection to the victim beyond the fact that the victim is in the public eye for some reason.

Identity Theft: This has become a major problem with people using the Internet for cash transactions and banking services. In this cybercrime, a criminal accesses data about a person’s bank account, credit or debit cards and other sensitive information to siphon money or to buy things online in the victim’s name. It can result in major financial losses for the victim and is an increasing overhead for financial services companies.

Malicious Software: These are Internet-based software or programs used to disrupt a network, gain access to a system to steal sensitive information, or damage software present in the system. DDoS (distributed denial of service) attacks and malicious encryption tools (ransomware) are often used for extortion.

Child soliciting and Abuse: This is also a type of cybercrime wherein criminals solicit underage children through a variety of mechanisms for the purpose of child pornography. Government agencies spend a lot of time targeting these crimes and monitor chat rooms frequented by children to prevent this sort of abuse.

 

Who are the cyber criminals?

Professor Marcus Rogers, Director of the Cyber Forensics & Security Program at Purdue University, has produced a taxonomy of offenders:

Script kiddies: who are motivated by “immaturity, ego boosting, and thrill seeking.” Rogers says they tend to be “individuals with limited technical knowledge and abilities who run precompiled software to create mischief, without truly understanding what the software is accomplishing ‘under the hood.’ ”

Cyber-punks: who “have a clear disrespect for authority and its symbols and a disregard for societal norms.” According to Rogers, “they are driven by the need for recognition or notoriety from their peers and society,” and are “characterized by an underdeveloped sense of morality.”

Hacktivists: who, in Rogers’ estimation, might just be “petty criminals” trying to “justify their destructive behaviour, including defacing websites, by labelling [it] civil disobedience and ascribing political and moral correctness to it.”

Thieves: who are “primarily motivated by money and greed” and are “attracted to credit card numbers and bank accounts that can be used for immediate personal gain.”

Virus writers: who tend to be drawn to “the mental challenge and the academic exercise involved in the creation of the viruses.”

Professionals: who are often ex-intelligence operatives “involved in sophisticated swindles or corporate espionage.”

Cyber-terrorists: who are essentially warriors, often members of “the military or paramilitary of a nation state and are viewed as soldiers or freedom fighters in the new cyberspace battlefield.”

 

To conclude, cybercrime is a fast-growing, multi-faceted problem, with new participants entering the arena every day. It will be interesting to see how technology and other commercial organisations approach the problem, and how society and government organisations attack the cyber hordes. We will follow this article with our thoughts on how it can be tackled in the coming months.

“People Analytics” – Can robots replace the recruiters?

Posted on : 28-07-2014 | By : john.vincent | In : Innovation


The recruitment industry has been largely unchanged for many years. Technology has, of course, changed the way that companies and individuals interact in the process – from online job and candidate postings with companies like Jobserve and Monster, to company recruitment portals for engaging with and measuring preferred suppliers, to online screening of candidates prior to onboarding.

However, we are now at a point where technology can really disrupt the industry through the use of big data. The ability not only to hire but – equally important – to retain better talent through what is being called “people analytics” is now a reality. Mining the huge amounts of data that potential candidates leave, willingly or otherwise, in their daily digital lives is allowing companies to assess the value of existing and future employees.

We won’t get into the whole privacy thing…that’s for another day.

According to Prof Peter Capelli at the Centre for Human Resources at Wharton, big data can predict successful hires better than a company’s HR department.

“While HR researchers have been kicking around small and simple sets of data, much of it collected decades ago, the big-data people have fresh information on hundreds of thousands of people — in some cases, millions of people — and the information includes all kinds of performance measures, attributes of the individual employers, their experience and so forth. There are a lot of new things to look at.”

Now, I’m sure there are a lot of HR professionals who would argue with this! However, like all industries where technology advancements have enabled new business practices and efficiencies, recruitment is no different.

Let’s look at the evolution in one specific area: recruitment of technology professionals themselves. During the technology boom years, agencies specialising in finding talent for companies sprang up at a fast pace, armed with a collection of job board subscriptions and an expense account. The game was simple – it was all about speed: how quickly could a CV hit the desk of a hiring manager?

When demand outstripped supply, selecting the absolute best-fit candidate could often be secondary. Get someone quick – in fact, if they’ve only got 50% of the role requirements then get two! Demand was high, margins were high and everybody was happy.

Things have changed dramatically since 2008. As demand tailed off so did margins for recruitment firms, with in-house managed services firms putting the final nail in for many new entrants.

So, now that “people analytics” is in full swing, are we entering a phase where the recruitment industry will fade away completely? Of course not. For certain roles, or levels of seniority, human interaction throughout the whole process – from role requirements through search and selection – is a necessity.

However, for some roles, such as developers, software engineers or analysts, the use of algorithms rather than traditional routes can uncover a whole new talent pool, through techniques such as mining open source code. According to Dr Vivienne Ming of Gild, a specialist tech recruiter:

“There are about 100 times as many qualified but un-credentialed candidates out there, at every level of ability. Organizations are creating their own blind spots, which leads to companies paying too much for their hires and to talent being squandered.”

Indeed, when the University of Minnesota analysed 17 studies evaluating job applicants, it found that human decisions were outperformed by a simple equation by at least 25%.
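
To make the idea of a “simple equation” concrete, here is a toy sketch using scikit-learn. The features, data and outcomes below are entirely invented; a real model would be trained on a firm’s own historical hiring records:

```python
# A toy "simple equation" for screening applicants. All numbers invented.
from sklearn.linear_model import LogisticRegression

# Hypothetical features per past hire: [years_experience, test_score, open_source_commits]
X = [
    [1, 55,  0],
    [3, 70, 40],
    [7, 62,  5],
    [2, 88, 90],
    [10, 45, 10],
    [4, 79, 60],
]
y = [0, 1, 0, 1, 0, 1]   # 1 = rated a successful hire after a year

model = LogisticRegression().fit(X, y)

# The fitted "equation" scores a new applicant as a probability of success
applicant = [[5, 75, 30]]
print(model.predict_proba(applicant)[0][1])
```

The point of the Minnesota finding is not that such a tiny model is good; it is that even a crude, consistently applied formula removes the noise and bias that creep into ad-hoc human judgements.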

So the days of the CV may be numbered. Smart companies are not waiting to advertise a role and harvest applications through their traditional channels; they are sourcing candidates directly by casting the net into the social media waters, looking at blogs and the like. A recent survey showed that some 44% of companies look at these platforms before hiring, and candidates are now much more aware of their social media brand.

The use of people analytics continues post-hire, to further develop, nurture and retain talent. An example comes from the world of recruitment itself, where Social Talent has developed a data tool which it is testing on 2,000 individuals. By analysing their daily activity – emails, phone calls, browsing, candidate keyword searching and so on – it is able to build a profile of the most successful techniques and provide constructive advice through pop-up messages in real time.

So where does that leave the recruiters on both sides of the fence? Well, some of the smart providers are developing their own platforms to provide their customers with advanced people analytics whilst on the client side, we see the focus shifting to a smaller subset of organisational roles.

As for the traditional HR role in the talent process, we’ll leave the last word to Peter Capelli:

“My bet is that the CIO offices in most big companies will soon start using all the data they have (which is virtually everything) to build models of different aspects of employee performance, because that’s where the costs are in companies and it’s also the unexamined turf in business.”

 

Google Eyephone takes on Apple iPhone – what’s next in wearable technology?

Posted on : 31-05-2013 | By : richard.gale | In : Innovation


Phones are everywhere

Obvious statement but true – mobile and smart phones are the first true utility computing for individuals. The growth has been phenomenal and the integration into our lives huge and growing. We no longer have to plan how to get somewhere before we set off, and we have virtually unlimited information on tap, local to our person. This was still a fantasy world just 10 years ago, and the convergence of technology and our lives has only just begun.

Phones are limited

Phones are basically blocks of metal and plastic, and you need at least one of your two hands and both of your eyes to use one effectively. When you are at a desk this is not a major issue, but when you are mobile – driving or walking – it becomes a problem. Using a phone for anything but calls whilst driving is frowned upon and often illegal; using one whilst walking can be annoying for other pavement users, or dangerous when crossing streets.

The fundamentals of phones have not changed much in the last 20 years, from the original ‘bricks’ they shrank down to a size that was virtually unusable and now the trend is to grow them to a size which, again, is virtually unusable. The new 6″ screens being developed may mean you’ll need a bigger head to use the phone correctly.

There have been huge jumps forward in technology and in the number of functions that can be built into a device, but the limitation remains holding the thing in your hand and putting it to your ear to make a call.

The next big trend is wearable technology

When I was at college many years ago, one of the lecturers specialised in human-computer interaction. His classes were often the most entertaining of the course, and I still remember him talking about ‘data gloves’ – used by NASA in training astronauts – gloves with sensors and feedback motors to provide sensation and feeling, enabling the virtual world to become reality. He also made reference to ‘data suits’ and hinted at the ‘fun’ they could provide through full-body sensors – but that’s another story…

Google Glass has been the story of the last few months. Wearable tech that you can use to navigate, find friends, things you want to do and generally give you an amazing amount of information about the world you are travelling through whilst allowing you to keep your hands and most of your vision free.

Whilst laughing off Google Glass, Tim Cook has also hinted that Apple is heavily into wearable technology. There have been many rumours of the iWatch, and Sony already has a watch paired with its Xperia Z smartphone.

We think this is one of the most exciting areas of personal technology growth. Combining clothing, wearables, jewellery and phones/computing must be the way forward for the next stages of human-computer convergence.

Why won’t wearable technology take off?

Overall we think it will, but moving from a separate block of technology you can hide in your pocket to something which is attached to or worn by a person introduces other variables. Wearables have failed in the past – Bluetooth earpieces have not been and never will be fashionable – and trends in fashion may counter some of the aims of the technology companies.

Will people want to wear technology? Most people under 40 now don’t wear watches – if they do, it is more likely as a statement than to tell the time; their phone has the time and is in constant use, so they don’t need another timepiece. Again with glasses, there is a fashion element of ‘geek chic’, but I’m not sure Google Glass quite has that yet. Apple is perhaps the most fashionable technology brand, so it may be able to overcome this.

What’s next?

As technology gets smaller and cheaper, it will become easier to build into more things. There are already head-up displays in high-end cars, and this will become standard, with Google Street View overlaying the road you are driving on with real-time directions. Clothing will have technology built in – maybe shoes with satnav sensors directing you left and right whilst you walk, hoodies with built-in headphones, running tee shirts with heart monitors. And as projection is the next big thing for mobiles, you could have a 3D projection of the world around you from your coat as you walk, cycle or drive along the road, or a film as you sit on a train or plane.

We are looking forward to the next period in the evolution of truly personal computing.

 

 

CIOs bank on trendy technology as a priority for 2013

Posted on : 28-02-2013 | By : jo.rose | In : Cloud


Firstly, thanks to everyone who completed our Broadgate Predictions survey for 2013. We closed it off today and will publish the results in our March BROADSheet.

What is clear as a general theme is that organisations are putting more emphasis on new technology innovation to drive further business value for internal and external clients. CIO budgets have been flat (at best) for several years now so they must look at new ways to deliver improved technology services from the perspective of cost, quality, agility and competitive advantage.

(Note: the exception to this is in financial services where budgets for risk & regulation are swallowing the discretionary spend – in itself a big issue).

So how are CIOs faring against the challenge of exploiting new technology effectively? In another recent survey, by Gartner, CIOs believed they were only realising 43% of technology’s potential. Now, this is a difficult measure, and one which is not really fleshed out. However, the headline number is startling, and putting it out there will cause businesses to question what the world might look like if they tapped into the other 57%…

The problem is that the mantra for years now has been efficiency – doing “more with less”. But this only goes so far and as we wrote in our November article about the big banks, CIOs are running out of their most precious commodity, time.

Therefore, CIOs are starting to think differently about new technology, particularly in the areas of mobile, data science (we don’t like the “big data” tag), cloud services, social media and consumerisation, which have entered or are entering market maturity.

Mark McDonald, group vice president and Gartner Fellow, says:

“CIOs require a new agenda that incorporates hunting for new digital innovations and opportunities, and harvesting value from products, services and operations”.

It is a really good point. These technology advances provide the basis of a new way of delivering technology services. It isn’t really about being an innovator or “pioneer”, but more about changing the objectives, behaviour and culture of the CIO organisation to embrace new ways of delivering value.

We are certainly at a tipping point. Another more worrying finding of the Gartner survey was that around half of the CIOs surveyed (covering some 2000 companies) do not see the role of IT in the enterprise changing over the next 3 years.

Really? As the new digital innovations become more pervasive in society those that do not have a plan to embed these into transformation of applications, infrastructure and operations will, in McDonald’s words “…consign themselves to tending a garden of legacy assets and responsibilities.”

Digital technologies dominate the CIO agenda for 2013. The top 10 global technology priorities reflect a greater emphasis on externally-oriented digital technologies, as opposed to traditional IT/operationally oriented systems.

Over the next 10 years, CIOs see the following technologies fundamentally disrupting business:

  • Mobile technologies (70%)
  • Data Analytics (55%)
  • Social Media (54%)
  • Cloud (51%)

Naturally, many of these have the greatest transformational power when combined rather than used in isolation – a point similar to the one made in our article this month on Software Defined Networks.

What is key is that we also think further about the changing role of the CIO and leadership throughout this journey. Over the last 10 or so years we’ve seen CIOs become more business aligned and commercially aware. The old days of simply being the “engine room” for data processing are long gone. Indeed, there is a much greater awareness from CIOs in recognising (and being recognised) in terms of technology innovation and the value in delivering business solutions.

As this evolves further, CIOs will potentially find themselves in new territory again in leading solutions outside of the traditional technology role, such as acting as the enterprise chief digital officer, being involved in new channels to market, leading real business solutions in a more collaborative way to shape competitive and innovative digital services etc…

This requires a new ethos, and potentially new skills. For example, those CIOs that excel at driving cost savings and efficiency may not necessarily be the same ones to lead the business through the next phase of digital innovation. It is the CIOs that grasp this and change (or step aside) who will offer businesses the most.

 

Broadgate Big Data Dictionary Part One

Posted on : 26-07-2012 | By : richard.gale | In : Data


We have been interested in Big Data concepts and technology for a while. There is a great deal of interest and discussion with our clients and associates on the subject of obtaining additional knowledge & value from data.

As with most emerging ideas there are different interpretations and meanings for some of the terms and technologies (including the thinking that ‘big data’ isn’t new at all but just a new name for existing methods and techniques).

With this in mind we thought it would be useful to put together a few terms and definitions that people have asked us about recently to help frame Big Data.

We would really like to get feedback, useful articles and different views on these to help build a more definitive library of Big Data resources. We’ve started with a few basic terms, and next month we will cover some of the firms developing solutions – this is just a starting point…

Analytics 

Big Data Analytics is the processing and searching through large volumes of unstructured and structured data to find hidden patterns and value. The results can be used to further scientific or commercial research, identify customer spending habits or find exceptions in financial, telemetric or risk data to indicate hidden issues or fraudulent activity.

Big Data Analytics is often carried out with software tools designed to sift and analyse large amounts of diverse information being produced at enormous velocity. Statistical tools used for predictive analysis and data mining are utilised to search and build algorithms.

Big Data

The term Big Data describes amounts of data that are too big for conventional data management systems to handle. The volume, velocity and variety of data overwhelm databases and storage. The result is that either data is discarded or unable to be analysed and mined for value.

Gartner has coined the term ‘Extreme Information Processing’ to describe Big Data – we think that’s a pretty good term to describe the limits of capability of existing infrastructure.

There has always been Big Data in the sense that data volumes have always exceeded the ability for systems to process it. The tool sets to store & analyse and make sense of the data generally lag behind the quantity and diversity of information sources.

The actual amounts and types of data that count as Big Data are constantly being redefined, as database and hardware manufacturers keep moving those limits forward.

Several technologies have emerged to manage the Big Data challenge. Hadoop has become a favourite tool to store and manage the data; traditional database manufacturers have extended their products to deal with the volume, variety and velocity; and new database firms such as ParAccel, Sand and Vectorwise have emerged, offering ultra-fast columnar data management systems. Some firms, such as Hadapt, have a hybrid solution utilising tools from both the relational and unstructured worlds, with an intelligent query optimiser and loader which places data in the optimum storage engine.

Business Intelligence

The term Business Intelligence (BI) has been around for a long time, and the growth of data – and then Big Data – has focused more attention on this space. The essence of BI is to obtain value from data to help build business benefits. Big Data itself could be seen as BI – it is a set of applications, techniques and technologies that are applied to an entity’s data to help produce insight and value from it.

There is a multitude of products that help build Business Intelligence solutions – ranging from the humble Excel to sophisticated (aka expensive) solutions requiring complex and extensive infrastructure to support them. In the last few years a number of user-friendly tools such as Qlikview and Tableau have emerged, allowing tech-savvy business people to exploit and re-cut their data without the need for input from the IT department.

Data Science

This is, perhaps, the most exciting area of Big Data – it is where the big value is extracted from the data. As one data scientist partner of ours put it: “Big Data is plumbing; Data Science is the value driver…”

Data Science is a mixture of scientific research techniques, advanced programming and statistical skills (or hacking), philosophical thinking (perhaps previously known as ‘thinking outside the box’) and business insight. Basically, it is being able to think of new and different questions to ask, to be technically able to interpret them into a machine-based format, to process the result and interpret it, and then to ask new questions based on the results of the previous set…

A diagram by blogger Drew Conway describes some of the skills needed – and maybe explains the lack of skills in this space!

In addition, Pete Warden (creator of the Data Science Toolkit) and others have raised caution over the term Data Science – “anything that needs science in the name is not a real science” – whilst confirming the need for a definition of what data scientists do.

Database

Databases can generally be divided into structured and unstructured.

Structured databases are the traditional relational database management systems, such as Oracle, DB2 and SQL Server, which are fantastic at organising large volumes of transactional and other data, able to load and query the data at speed, with integrity in the transactional process to ensure data quality.

Unstructured stores are technologies that can deal with any form of data thrown at them and distribute it across a highly scalable platform. Hadoop is a good example, and a number of firms now produce, package and support this open-source product.

Feedback Loops

Feedback loops are systems where the output from the system is fed back in to adjust or improve the system’s processing. Feedback loops exist widely in nature and in engineering – think of an oven: heat is applied to warm it to a specific temperature, which is measured by a thermostat; once the correct temperature is reached, the thermostat tells the heating element to shut down, until feedback from the thermostat says it is getting too cold and it turns on again… and so on.
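
The oven example can be written down in a few lines. A toy simulation, with arbitrary heating and cooling rates:

```python
# Toy simulation of the oven/thermostat feedback loop described above.
# Temperatures and rates are arbitrary illustrative numbers.
target, temp = 180.0, 20.0      # degrees C
heater_on = True

for minute in range(30):
    temp += 12.0 if heater_on else -4.0         # heat while on, cool while off
    if heater_on and temp >= target:            # feedback: hot enough, switch off
        heater_on = False
    elif not heater_on and temp <= target - 5:  # feedback: too cold, switch on
        heater_on = True
    print(f"minute {minute:2d}: {temp:6.1f}C  heater {'on' if heater_on else 'off'}")
```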

Feedback loops are an essential part of extracting value from Big Data. Building in feedback and then incorporating machine learning methods starts to allow systems to become semi-autonomous; this lets the data scientists focus on new and more complex questions whilst testing and tweaking the feedback from their previous systems.

Hadoop

Hadoop is one of the key technologies supporting the storage and processing of Big Data. It grew out of Google’s work on the distributed Google File System and MapReduce processing tools. It is an open source product under the Apache banner but, like Linux, is distributed by a number of commercial vendors that add support, consultancy and advice on top of the product.

Hadoop is a framework for running applications on large clusters of commodity hardware. The Hadoop framework transparently provides applications both reliability and data motion. Hadoop implements a computational paradigm named map/reduce, where the application is divided into many small fragments of work, each of which may be executed or re-executed on any node in the cluster. In addition, it provides a distributed file system that stores data on the compute nodes, providing very high aggregate bandwidth across the cluster. Both map/reduce and the distributed file system are designed so that node failures are automatically handled by the framework.

So Hadoop can almost be seen as a (big) bucket into which you can throw any form and quantity of data; it organises the data, knows where it resides, and can retrieve and process it. It also accepts that there may be holes in the bucket, and patches them up by using additional resources – all in all, a very clever bucket!

Hadoop runs on a scheduling basis: when a question is asked, it breaks the query up, shoots the fragments out to different parts of the distributed network in parallel, then waits and collates the answers.
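
The canonical illustration of map/reduce is a word count. Here is a single-machine sketch of the idea in Python – Hadoop’s value is in running the same map and reduce steps, unchanged, across thousands of nodes:

```python
from collections import defaultdict

# Stand-ins for documents spread across a cluster's file system
documents = ["big data is big", "data about data is metadata"]

# Map: process each fragment independently, emitting (word, 1) pairs.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group pairs by key; in Hadoop this happens between the phases.
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce: aggregate each key's values; every key can be reduced on any node.
totals = {word: sum(counts) for word, counts in grouped.items()}
print(totals)  # {'big': 2, 'data': 3, 'is': 2, 'about': 1, 'metadata': 1}
```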

 

We will continue this theme next month and then start discussing some of the technology organisations involved in more detail, covering Hive, Machine Learning, MapReduce, NoSQL and Pig.

 

Feedback Loops: How to maximise value from your Big Data

Posted on : 27-06-2012 | By : richard.gale | In : General News


Closing the feedback loop

With businesses becoming increasingly sensitive to customer opinion of their brand, monitoring consumer feedback is becoming ever more important.  Additionally, the recognition of social media as an important and valid source of customer opinion has brought about a need for new systems and a new approach.

Traditional approaches – reactive responses to press coverage by a PR department, or infrequent customer surveys conducted online or by phone – are all part of extremely slow-cycle feedback loops, no longer adequate to capture the ever-changing shifts in public sentiment.

They represent a huge hindrance to any business looking to improve brand relations; delay in feedback can cost real money.  Inevitably, the manual steps in traditional approaches create huge delays in information reaching its users.  These days we need constant, low-latency feedback – the information needs to be almost real-time.  Wait a few moments too long, and suddenly the intelligence you captured could be stale and useless.

 

A social media listening post

Witness the rise of the “social media listening post”: a new breed of system designed to plug directly into social networks, automatically watching for brand feedback around the clock.  Some forward-thinking companies have already built such systems.  How does yours keep track right now?  If your competitors have it and you don’t, does that give them a competitive advantage over you?

I’d argue that most big brands need such a system these days.  Gone are the days when businesses could wait months for surveys or focus groups to trickle back with a sampled response from a small, select group.  In that time, your brand could have been suffering ongoing damage, and by the time you find out, valuable customers have been lost.  Intelligence is readily available these days on a near-instantaneous basis – can you afford not to use it?

Some emerging “Big Data” platforms offer the perfect tool for monitoring public sentiment toward a company or brand, even in the face of the rapid explosion in data volumes from social media, which could easily overwhelm traditional BI analytics tools.  By implementing a social media “listening post” on cutting-edge Big Data technology, organisations now have the opportunity to unlock a new dimension in customer feedback and insight into public sentiment toward their brands.

Primarily, we must design the platform for low-latency, continuous operation to allow complete closure of the feedback loop – that is to say, events (news, ad campaigns etc.) can be monitored for near-real-time positive/negative/neutral public response, bringing rapid responses and corrections in strategy into the realm of possibility.  Maybe you could pull that new ad campaign early if public reaction to the material took a disastrous and unexpected turn?  It is also about understanding trends and topics of interest to a brand audience, and who the influencers are.  Social media platforms like Twitter offer a rich, granular model for exploring this complex web of social influence.

The three main challenges inherent in implementing a social media listening post are:

  • Data volume
  • Complexity of data integration – e.g. unstructured, semi-structured, evolving schema etc
  • Complexity of analysis – e.g. determining sentiment: is it really a positive or negative statement with respect to the brand?

To gain a complete picture of public opinion towards your brand or organisation through social media, many millions of web sites and data services must be consumed, continuously, around the clock.  They need to be analysed in complex ways, far beyond traditional data warehouse query functionality.  Even a sentiment analysis capability on its own poses a considerable challenge – as a science it is still an emerging discipline – and more advanced machine learning techniques may prove necessary to correctly interpret all the signals in the data.  Data formats vary greatly among social media sources, ranging from regular ‘structured’ data through semi- and unstructured forms, to complex poly-structured data with many dimensions.  This structural complexity poses extreme difficulty for traditional data warehouses and up-front ETL (Extract-Transform-Load) approaches, and demands a far more flexible data consumption platform.
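
To see why sentiment analysis is hard, consider the naive lexicon approach below – a deliberately crude sketch (the word lists are invented) that counts positive and negative words, and is instantly defeated by sarcasm:

```python
# A deliberately naive lexicon-based sentiment scorer. The word lists are
# illustrative; real systems need machine-learned models to handle context.
POSITIVE = {"love", "great", "excellent", "happy", "recommend"}
NEGATIVE = {"hate", "awful", "terrible", "broken", "refund"}

def score(message: str) -> int:
    words = [w.strip(".,!?") for w in message.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(score("Love the new ad campaign, great work"))      #  2 -> positive
print(score("My order arrived broken. I want a refund"))  # -2 -> negative
print(score("Oh great, it broke again"))                  #  1 -> wrong: sarcasm reads as positive
```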

So how do we architect a system like this?  Generally speaking, at its core you will need some kind of distributed data capture and analysis platform.  Big Data platforms were designed to address problems where you have Volume, Variety, or Velocity of data – and most often, all three.  In this particular use-case, we need to look towards the cutting-edge of the technology, and look for a platform which supports near-real time, streaming data capture and analysis, with the capability to implement Machine Learning algorithms for the analytics/sentiment analysis component.

For the back-end, a high-throughput data capture/store/query capability is required, suitable for continuous streaming operation, probably with redundancy/high availability, and a non-rigid schema layer capable of evolving over time as the data sources evolve.  So-called “NoSQL” database systems (the name in fact stands for “Not Only SQL” rather than no SQL at all) such as Cassandra, HBase or MongoDB offer excellent properties for high-volume streaming operation and would be well suited to the challenge; there are also commercial derivatives of some of these platforms on the market, such as the excellent Acunu Data Platform, which commercialises Cassandra.

Additionally, a facility for complex analytics will be required to derive any useful insight from the data you capture – most likely via parallel, shared-nothing computation, given the extreme data volumes.  For this component, paradigms like MapReduce are a natural choice, offering linear scalability and unlimited flexibility in implementing custom algorithms, and machine learning libraries such as the great Apache Mahout project have grown up to provide a toolbox of analytics on top of the MapReduce programming model.  Hadoop is an obvious choice when it comes to exploiting the MapReduce model, but since the objective here is near-real-time streaming capability, it may not always be the best one.  Cassandra and HBase (which in fact runs on Hadoop) can be good choices, since they offer low-latency characteristics coupled with MapReduce analytic capabilities.

Finally, some form of front-end visualisation/analysis layer will be necessary to graph and present results in a usable visual form.  There are some new open-source BI analytics tools around which might do the job, and a variety of commercial offerings in this area.  The exact package to select is strongly dependent on the desired insight and form of visualisation, and so is probably beyond the scope of this article – but it clearly needs to interface with whatever back-end storage layer you choose.

Given the cutting-edge nature of many of the systems required, a solid operational team is really essential to maintain and tune the system for continuous operation.  Many of these products have complex tuning requirements demanding specialist skill with dedicated headcount.  Some of the commercial open-source offerings have support packages that can help mitigate this requirement, but either way, the need for operational resource must never be ignored if the project is to be a success.

The technologies highlighted here are evolving rapidly, with variants or entirely new products appearing frequently; it would not be unreasonable to expect significant advancement in this field within a 6-12 month timeframe.  This will likely translate into advances on two fronts: increased functional capability of the leading distributed data platforms, in areas such as query interfaces and indexing, and reduced operational complexity and maintenance requirements.

Tim Seears – CTO Big Data Partnership.

www.bigdatapartnership.com

Big Data Partnership and Broadgate Consultants are working together to help organisations unlock the value in their big data.  This partnership allows us to provide a full suite of services from thought-leadership, strategic advice and consultancy through delivery and implementation of robust, supportable big data solutions.  We would be delighted to discuss further the use case outlined above should you wish to explore adapting these concepts to your own business.

Technology Empowerment vs. Frustration: A User(s) Guide

Posted on : 30-04-2012 | By : richard.gale | In : General News


One of the most exciting aspects of running an IT consultancy is the variety of views and opinions we get to hear from our clients, teams, suppliers and partners. This month we want to focus on the relationship between business users of technology and the IT departments that supply their solutions.  As with most ‘marriages’ this is a complex, ever-changing interaction, and two factors are key to it: empowerment and frustration.

We think we are on the cusp of a major change in the balance of power between users and IT departments. This happens very rarely, so we are watching with interest how it develops over the next few years. Business users are now digitally aware, often frustrated by tech departments, and confident enough to bypass them. This is a dangerous time for the traditional IT team: trying to control and close down the alternatives would be a mistake, and is probably too late anyway.

The graph below highlights how users’ frustration with IT has increased whilst their ability to control it has diminished. There was a brief (golden?) period in the 1990s when desktop computing and productivity tools helped business users become more self-sufficient, but that receded as IT took back control of the desktop.

 

Business Frustration vs. Empowerment 1970 onwards


The 70s – the decade of opportunity

Obviously computing did not start in 1970 (although Unix time did, on Jan 1st of that year…) but the ’70s was perhaps the decade when IT started making major positive impacts on organisations. Payroll, invoicing, purchasing and accounting functions started to become more widely automated and computerised. The productivity gains from some of these applications transformed businesses, and the suppliers (IBM, IBM, IBM etc.) did phenomenally well out of it. User empowerment was minimal, but frustration was also low, as demand for additional functions and flexibility was limited.

The 80s – growing demands and an awakening workforce

The 1980s saw the rise of the desktop, with Apple and Microsoft fighting for top-dog position. This explosion of functionality was exciting initially for the home user, and quickly grew to be utilised and exploited by organisations. Productivity tools such as spreadsheets, word processing and email allowed business users to create and modify their own working practices and processes. The adoption of desktops accelerated towards the end of the decade, so we mark this decade as: empowerment up (and growing), frustration down.

The 90s – Power to the people (sort of… for a while)

Traditional IT departments recognised the power of the utility PC and adjusted (and grew) to support the business. Networks – and with them file sharing and, as importantly, backups – became the norm. Business departments were becoming more autonomous with the power the PC gave them. Macros and Visual Basic add-ons turned into business-critical applications, and new software was being produced by innovative companies all the time. Business users were free to download and run pretty much anything on their work computer.  The complexity of IT infrastructure and applications was increasing exponentially… so inevitably things began to creak and break. End-user applications (or EUCs as they became known) could be intolerant of change (such as a new version of Excel); they were often put together in an ad-hoc fashion to solve a particular problem and then woven into a complicated business process which became impossible to change. This, with the additional twist of the ‘computer virus’, gave the IT department the opportunity to lock down users’ PCs and force applications to be developed by the new in-house development teams. Result for the 1990s: user frustration rising, demands rising, and empowerment on the way down.

The 00s – Control and process

The dawn of the new millennium brought the first crash of the dot-coms, and the lockdown of user PCs continued at pace. The impacts of the ’90s – unsupportable applications, viruses, desktop complexity – were joined by higher levels of regulation, audit and internal controls. Combined with a focus on saving money in the still-expanding IT departments, these caused a further reduction in users’ ability to ‘do IT’. In large organisations most PCs were constrained to such an extent that they could only be used for basic email, word processing and Excel (now the only spreadsheet in town). Any new application had to go through a lengthy evaluation, purchasing, configuration, security testing, ‘packaging’ and finally installation process before it could be used for business. Inevitably: user frustration rising to dangerous levels, and empowerment further degraded.

The 10s – A digital workforce demands power

The controls and restrictions of the ’00s then ran into significant budgetary restrictions on IT departments. Costs were, and are, being squeezed; fewer and less experienced resources are dealing with increasing demands and pace. Frustration levels peaked to the point where relationships between IT and the business were breaking down. Outsourcing parts of IT organisations made some significant budget savings but did nothing to reduce user concerns around delivery and service (at least in the short term).

Some users started to ‘rebel’. The increasing visibility of software as a service (SaaS) enabled certain functions to implement simple but functionally rich solutions for a team or department relatively easily, and without much (or any) IT involvement. Salesforce.com did amazingly well with an easy-to-use, globally available, infrastructure-free product which did everything a sales team needed and could be purchased on a credit card and expensed…  Internal productivity tools such as SharePoint started being used for complex workflow processes – by the business, without need for IT.

At the same time, personal devices such as smartphones, tablets and laptops (BYOD) became the norm for the business community. Users want – and are demanding – the ability to share business data on these tools.

Public cloud usage by business users is also starting to gather pace and the credit card/utility model means some functions do not use IT for certain areas where quick creation and turnaround of data/processing is needed (whether that is wise or not is a different question).

So what are IT departments doing to ensure they can continue to help business units in the future:

  • Become much more business needs focused (obvious but needs to be addressed as a priority)
  • Encourage the use of BYOD – in the end it will save the firm money through not having to purchase hardware
  • Aggressively addressing traditional structures and costs – ask questions such as
    • “Why can’t we get someone else to run this for us?” – whether outsource, cloud or SaaS
    • “Why don’t you have a SaaS/Cloud enabled product?”
  • Become a service broker to the business – looking ahead and managing service and supplier rather than infrastructure, applications or process.

User empowerment rising but user demands and frustrations still high

The 20s – Business runs business and Utilities run IT

What will happen in the next few years? Who can tell but trends we are seeing include:

  • There will be a small number of large firms with massive computing capacity – most other organisations will just use this power as required.
  • There will be new opportunities for financial engineering such as exchange trading computing & processing power, storage & network capacity.
  • IT infrastructure departments in the majority of organisations will have disappeared
  • IT for business organisations will consist of Strategy, Architecture, Business Design, (small specialised) Development focusing on value-add tooling and integration, Relationship and Supply management of providers, products and  pricing

All these point to more power for the business user, but one emerging trend which may reverse that is the ongoing impact of legislation and regulation. This could limit the business’s capability to be ‘free’, and the lockdown of IT may begin again – this time imposed more by government on the external suppliers of the service, resulting in rising frustration levels and reduced empowerment… it will be interesting to see how this goes.



Education – How can technology help?

Posted on : 28-03-2012 | By : richard.gale | In : General News

Tags: , , , , , , , , , ,

1

The development of the Raspberry Pi (a £30 computer designed to give the next generation of children programming skills) started a few of us at Broadgate thinking about technology and education – are there ways that schools and other organisations could utilise some of the current technology trends?


Background

ICT in the classroom has changed radically over the last 30 years. In the 1980s there was ‘the school computer’, where a select group of students could spend lunchtimes and evenings writing programs in incomprehensible languages, resulting in simple calculators or Battleships-type games. Now computers are embedded in homes, offices and schools – the UK GCSE ICT course now includes a full project management lifecycle study, from initial requirements gathering to system implementation. Outside the classroom, computers are used for all the usual business processes, including pupil records, finance, scheduling and communications.

In the UK, Professor Steve Furber of the Royal Society criticised the skills and teaching of ICT teachers (for example, only 35% have a specific qualification in the subject, contrasting with 74% of maths teachers) and proposed that the standalone subject be scrapped. He said that IT was so important it should be part of the core curriculum, integrated into schools to improve digital literacy alongside reading, writing and arithmetic.


Our Broad Thoughts

Integrating technology into the core of the curriculum is key, and we think the opportunities for technology to improve, accelerate and enhance the educational experience for both pupils and teachers are huge.

A few of our ideas are below and we’d welcome your thoughts on these and other areas.


1. Social Media – collaborative approach

This is an area where the pupils excel and, as a rule, are ahead of the teachers. These digital natives have grown up with technology, and the use of social networks is a natural extension of that. They use them for updating friends, promoting themselves, discussing and arguing, and sharing information. Are there ways schools can utilise this technology and, more importantly, this energy and enthusiasm?

The key element of Twitter, Facebook, Pinterest etc. is socialising and sharing ideas. Discussions started in the classroom can be extended to home or remote working. These often happen informally amongst pupils but could add more value if teachers could interact and assist. Schools could create ecosystems for collaborative working. Initially it may be difficult to attract pupils to school-created areas, so a more successful approach may be for the pupils to create and the teachers to join. Obviously there are risks to this, but the idea is to provide a shared, safe space for thoughts and ideas without negativity.


2. BYOD/Mobility – help or hindrance?

Many pupils now carry smartphones, and some are starting to carry iPads too. These can be viewed negatively from a school’s perspective as, at worst, they can be a distraction in class and potentially a device for cheating and bullying.

So, accepting they are not going away, how can the positive aspects of smartphones be utilised?

Simple techniques could include using calendar facilities to upload class timetables and reminders for homework, coursework etc. Alerts for taking in gym kit could be pushed out to pupils’ (and parents’) devices. Obviously this does not completely remove the ‘the dog ate my BlackBerry’ issue for teachers, but it should help!

Coursework, homework and useful reference material and links can also be pushed to phones to consolidate knowledge and aid pupils.
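
To make the reminder idea concrete, here is a minimal sketch of how a homework reminder could be generated as a standard iCalendar event, which most phone and tablet calendar apps can import or subscribe to. It uses only the Python standard library; the subject, dates and school domain are invented for illustration.

from datetime import datetime, timezone
import uuid

def homework_reminder(subject: str, details: str, due: datetime) -> str:
    """Build a minimal iCalendar (RFC 5545) event with a pop-up alarm."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//School//Homework//EN",
        "BEGIN:VEVENT",
        f"UID:{uuid.uuid4()}@school.example",  # hypothetical school domain
        f"DTSTAMP:{stamp}",
        f"DTSTART:{due.strftime('%Y%m%dT%H%M%SZ')}",
        f"SUMMARY:{subject} homework due",
        f"DESCRIPTION:{details}",
        "BEGIN:VALARM",  # pop-up reminder the evening before
        "ACTION:DISPLAY",
        "DESCRIPTION:Homework due tomorrow",
        "TRIGGER:-PT15H",
        "END:VALARM",
        "END:VEVENT",
        "END:VCALENDAR",
    ])

# Example: a history essay due at 9am, with a reminder at 6pm the night before
due = datetime(2012, 4, 2, 9, 0, tzinfo=timezone.utc)
print(homework_reminder("History", "Essay on the Tudors", due))

In practice a school would publish a feed of such events that pupils’ and parents’ devices subscribe to, rather than emailing individual files.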

Even more useful would be to think about how people actually use their phones and tablets: as well as communication devices, they are great research tools and could be used within the classroom, for instance to help find different viewpoints on historical events (and so help improve children’s critical thinking, as there are so many different and potentially inaccurate ‘facts’ out there – “Always check your sources!” as my history teacher used to say).

Tablets, and iPads in particular, are very exciting tools for learning. They move away from the conformity of keyboards and mice and can make learning truly interactive. They are starting to be adopted in schools, but we think there is great potential to radically change the classroom and learning experience.

Obviously not all pupils can afford smartphones, so to avoid a technology-related poverty trap, less well-off pupils should be provided with the same phones or tablets. Cash-rich technology organisations could be approached to assist, and a needs-based mechanism could be introduced, such as that used for free school dinners. Parents’ wishes also need to be taken into account, as the age at which a child is allowed to use a phone can vary widely.


3. Data Intelligence – Capturing Trends

As with any organisation, there are large amounts of data contained in multiple stores. And, as with any other organisation, that data is often not connected with other relevant sources, so its information value is lost.

One of our colleagues moved from financial services to education and was surprised by the lack of management information available to the teaching team. The data was there, but it was not being translated into meaningful information.

There must be potential to link an individual teacher’s, class’s or subject’s results to identify trends. For example, if interim test results for the Year 8 history class are going down, is it because the coursework has been modified, there is a new teacher, or the pupils’ socio-economic make-up has changed? A good business intelligence application can trawl the data to identify likely causes so that appropriate remedial action can be taken.

Similarly, if maths A level results suddenly improve, what are the reasons, and how can they then be applied elsewhere (internally or externally – see Communications below)?

If an individual pupil’s attainment levels start dropping off, additional attention could be provided to help that student get back on track, and to identify and hopefully resolve the underlying cause of the issue.

Other, more radical, possibilities involve gathering the information and identifying the better performing areas within or across schools, using measurements such as ‘cost per GCSE’ or pupils’ ‘entry/exit attainment improvement’ – a simple sketch of this kind of trend analysis follows.
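
As a concrete illustration, here is a minimal sketch of the term-on-term trend detection described above, written in Python with the pandas library. The class names and scores are invented for illustration; a real school management information system would supply the data.

import pandas as pd

# Interim test averages per class per term (hypothetical figures)
results = pd.DataFrame({
    "class":     ["Y8 History"] * 3 + ["Y8 Maths"] * 3,
    "term":      [1, 2, 3] * 2,
    "avg_score": [68, 63, 57, 70, 71, 74],
})

# Flag classes whose average score falls in every successive term
trend = (
    results.sort_values(["class", "term"])
           .groupby("class")["avg_score"]
           .apply(lambda s: s.diff().dropna().lt(0).all())
)
declining = trend[trend].index.tolist()
print("Classes needing attention:", declining)  # -> ['Y8 History']

A real application would of course join the results with teacher, coursework and cohort data to suggest likely causes, rather than simply flagging the decline.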


4. Communications – sharing

Schools can sometimes be inward looking. Teachers often stay in one school for a considerable time, which is great for continuity and progression but may mean missed opportunities to pick up on the innovation and changes happening in the wider educational community. Some schools encourage visits to other schools, and conferences and courses can help here, but there is a big opportunity to take this further.

Businesses utilise management consultants to help improve their organisations for efficiency or growth, with a view to building revenue and profits.

Could more inter-school communication and the sharing of information, best practice and teaching artefacts help schools and teaching? Information is now available locally, nationally and internationally, so it can be shared and used amongst educational establishments.


5. Cloud Computing – Who needs infrastructure?

Most schools have a room or office housing their computers and servers. As IT requirements grew – finance, pupils’ records, assessments, operational and staff information – the amount and complexity of equipment expanded, often requiring dedicated resources to support and change it. As we have been saying to our clients, with the advent of cloud and Software as a Service the need for this is reducing, to the point where the default should be for someone else to host, manage and support a school’s technology infrastructure.

Obviously, as with any sensitive information, the question of student data privacy and security needs to be addressed. This should already be the case: existing policies should be demonstrably met by any potential vendor and tested regularly by the educational authority.


6. Security – Paramount

The most important aspect of the use of technology is pupil safety and confidentiality. This is obvious, but it needs to be kept at the forefront of any discussion about the introduction of a system, whether IT-based or otherwise.


Final thoughts

The opportunities for technology to help improve schools are both immense and exciting. This is not an area we have worked in, but we are really interested in stimulating a debate and seeing if we can assist in any way. Every time we help people outside our core business area of finance IT, we find that not only do we enjoy it, but we also learn a great deal from different working structures and cultures.

“If we teach today as we taught yesterday, we rob our children of tomorrow.” – John Dewey. Innovation and technology can help us help the next generation.