February 28, 2017 | Technical Podcasts | Author: David Fearne

What are the Hottest Technology Trends of 2017? Part 1

Historically, it could take three to four years before businesses embraced brand new technologies from the wild. That is changing: adoption can now take as little as months, or even weeks. As the competitive edge gets thinner, organisations are always on the lookout for what they can adopt to gain an advantage in the marketplace, and it is now all about having the fastest, latest, most highly automated and most intelligent technology.

In 2017, the reality is that cutting-edge technology is needed for many aspects of business, not least to enhance the consumer experience and engagement with the brand. However, many businesses aren't gaining the differentiation that technology provides because their tech is at least a year old. Companies need to take advantage of these rapid developments and adopt the technology of now. Here lies the opportunity for the channel to take a real thought leadership position within the enterprise. Channel organisations should use their understanding of a customer's business to help them differentiate from their competitors, whether that's through cognitive computing, artificial intelligence or the latest cloud software.

So, which technology is growing the fastest? Here’s my take on the hottest trends for 2017…

The power of the edge

We’re creating more data than ever before in human history: from consumer tech like phones and wearables, to more enterprise-focused edge technologies such as traffic monitors and automated machinery. What will change isn’t just the amount of data, but how that data is used and processed.

As edge devices become more capable of processing raw data into information and communicating that processed information back, two things change. Firstly, the connectivity profile shifts: either the bandwidth required drops dramatically, or the device can return information at a much higher frequency. Secondly, as we move away from systems designed to centrally process raw, unstructured data and towards information-driven systems, we get lower overheads, better and quicker insights - because the time taken to get answers falls - and the ability to interrogate a more diverse set of data sources.
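As a rough sketch of what that shift looks like in practice - the sensor, window size and summary fields below are entirely illustrative, not any real device's API - an edge device might aggregate raw readings locally and send back only a small summary:

```python
# A minimal sketch of edge-side processing. read_sensor() is a stand-in
# for real hardware; every name here is illustrative.
import random
import statistics
import time

def read_sensor() -> float:
    """Stand-in for a raw reading (e.g. vehicles passing a traffic monitor)."""
    return random.uniform(0, 100)

def collect_window(samples: int = 60, interval: float = 0.01) -> list:
    """Gather raw readings locally instead of streaming each one upstream."""
    readings = []
    for _ in range(samples):
        readings.append(read_sensor())
        time.sleep(interval)
    return readings

def summarise(readings: list) -> dict:
    """Reduce a window of raw data to the information the centre needs."""
    return {"count": len(readings),
            "mean": round(statistics.mean(readings), 2),
            "max": round(max(readings), 2)}

# Sending this summary (three numbers) instead of sixty raw readings is
# what lowers the bandwidth profile, or lets the device report more often.
print(summarise(collect_window()))
```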

The year of data divorce

GDPR (the General Data Protection Regulation) will come into force in May 2018 and brings with it some hefty fines for breaches of the rules. How will enterprises prepare for this? As it stands, all the Personally Identifiable Data (PID) an organisation holds will need to be handled in a GDPR-compliant way. However, a lot of companies still need to ask themselves what the value of storing this data is, and whether they need to retain it in its current form.

Many businesses have been storing data on the false premise that one day it may be useful. Hopefully, 2017 will be the year organisations turn that data into usable information and finally get rid of the raw PID they would otherwise have had to apply GDPR to. Customer information such as website transaction logs and captured social data can be processed into anonymised but still valuable information. Companies would then be left with only the genuinely valuable PID, for which a sensible case can be made for the increased overhead of storing it.
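To make that concrete, here is a minimal sketch of the idea - the field names and records are invented purely for illustration - showing raw, identifying transaction logs being reduced to anonymised aggregates:

```python
# A minimal sketch of turning PID-laden transaction logs into anonymised
# but still useful information. The schema and data are illustrative only.
from collections import defaultdict

raw_transactions = [
    {"email": "a.smith@example.com", "postcode": "SW1A 1AA", "spend": 42.50},
    {"email": "a.smith@example.com", "postcode": "SW1A 1AA", "spend": 17.00},
    {"email": "b.jones@example.com", "postcode": "M1 1AE", "spend": 99.99},
]

def anonymise(transactions):
    """Aggregate spend by postcode area and drop the identifying fields."""
    totals = defaultdict(lambda: {"orders": 0, "spend": 0.0})
    for t in transactions:
        area = t["postcode"].split()[0]   # keep only the outward code
        totals[area]["orders"] += 1
        totals[area]["spend"] += t["spend"]
    return dict(totals)

print(anonymise(raw_transactions))
# {'SW1A': {'orders': 2, 'spend': 59.5}, 'M1': {'orders': 1, 'spend': 99.99}}
```

The aggregated view still supports analysis and planning, but there is no longer any raw PID left to bring into scope for GDPR.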

Data-driven decision making 

As it’s become easier to collect and process data, our definition of big data is changing. In 2017, it’s all about increasing the dimensionality of your decision-making. How can you introduce more and more diverse data sources to complement the decisions you’re making?

For instance, the Office for National Statistics has a data set of all planned roadworks. Although you can’t stop the roadworks happening, you can predict in advance when you’re going to see changes in traffic patterns and respond to them. For example, retail stores can predict when they are going to see a reduction in footfall - due to those known traffic issues - and resource the stores with people at appropriate levels, ultimately reducing costs. The retailer can plan the impact of these kinds of events into their resource management and planning systems for the whole country, keeping stores open and operating expenses lower.
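As a rough sketch of the idea - the roadworks records, store codes and the 30% staffing adjustment below are all made up for illustration - folding a planned-roadworks feed into a staffing decision might look like this:

```python
# A hypothetical example of adding an external data dimension (planned
# roadworks) to a staffing decision. Data and rules are illustrative only.
from datetime import date

planned_roadworks = [
    {"postcode": "LS1", "start": date(2017, 3, 6), "end": date(2017, 3, 10)},
]

baseline_staff = {"LS1": 12, "M1": 9}

def staff_for(store_postcode: str, day: date) -> int:
    """Reduce planned staffing on days when nearby roadworks cut footfall."""
    for work in planned_roadworks:
        if work["postcode"] == store_postcode and work["start"] <= day <= work["end"]:
            return max(1, round(baseline_staff[store_postcode] * 0.7))
    return baseline_staff[store_postcode]

print(staff_for("LS1", date(2017, 3, 7)))   # 8  (roadworks week)
print(staff_for("LS1", date(2017, 3, 20)))  # 12 (normal week)
```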

We’ve got to embrace our ability to use lots of dimensions and multiple data sources to achieve the most effective decisions and, ultimately, the right answers.

The machine learning revolution 

Every year since I started making tech predictions, I have in some way or another spoken about the rise of machine learning. It’s only recently, though, that machine learning has become so prolific. This is partly due to the availability of the required computing power, but also down to the market maturity and availability of tooling that allows anyone - with even a modicum of understanding - to get started with building machine learning into their applications, processes and daily lives.
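As a small illustration of how low that barrier has become - using scikit-learn, with a toy footfall-prediction task invented for the example - a working model is only a few lines:

```python
# A toy example of today's off-the-shelf ML tooling (scikit-learn).
# The footfall figures and features are invented purely for illustration.
from sklearn.linear_model import LinearRegression

# features: [day_of_week, roadworks_nearby (0 or 1)]
X = [[0, 0], [1, 0], [2, 1], [3, 1], [4, 0], [5, 0]]
y = [540, 510, 380, 360, 560, 900]   # historical daily footfall

model = LinearRegression().fit(X, y)
print(model.predict([[2, 0]]))       # expected footfall for a clear Wednesday
```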

Its importance is such that parallels have been drawn between the current cognitive computing era and the industrial revolution. The comparison holds because machine learning will have the same profound, disruptive impact: organisations will look at highly repetitive jobs and processes and start to replace them with machine learning approaches, in the same way engines replaced horses.

2017 is the year for the channel to gain an early-mover advantage and start talking to customers about machine learning. From these conversations, customers can identify opportunities where machine learning can be implemented to meet a business need rather than a technological one.

APIs eating the universe 

APIs have evolved into one of the most important tools in computing. Now, if you’re an application developer and you don’t have an API that exposes the functionality of your software, you’ve got a problem.

APIs have the power to link the many other elements within the network, which means you can automate them so they don’t sit there as technology silos. Currently, lots of applications sit alongside each other - pockets of really fantastic potential - yet they’re in isolation and don’t talk to one another. This means a human has to sit in the middle, copying and pasting data such as user records between them.
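A rough sketch of what replaces that person in the middle might be a small sync script. The endpoints, record shape and token handling here are hypothetical, not any particular product's API:

```python
# A hypothetical API-to-API sync replacing manual copy-and-paste.
# The URLs and record fields are illustrative, not a real product's API.
import requests

CRM_API = "https://crm.example.com/api/users"
BILLING_API = "https://billing.example.com/api/customers"

def sync_users(token: str) -> None:
    """Copy user records from the CRM into the billing system."""
    headers = {"Authorization": f"Bearer {token}"}
    users = requests.get(CRM_API, headers=headers, timeout=10).json()
    for user in users:
        record = {"name": user["name"], "email": user["email"]}
        requests.post(BILLING_API, json=record, headers=headers, timeout=10)
```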

A good, solid API can link everything together and keep every system updated, providing a holistic view. Bringing it back to the business opportunity for the channel, integration is going to become more and more critical.

 

Don't miss the latest Arrow Bandwidth episode to hear David and Rich talk about these trends.

I'd love to hear your own thoughts on the top trends for 2017 in the comments below. You can read Part 2 here.
