Top IoT Data Challenges and Management Strategies for Analytics, Tracking, and Growth


IoT is fun, IoT is accessible, and IoT can generate a lot of useful (and useless) data to inform business operations. Properly used, it can drive revenue growth (we'll look at a few examples). But IoT also comes with its own set of data challenges, and it remains the wild west of hardware and software innovation in today's technical landscape. There is a lot of hype in this sector (but for once, we're not going to talk about that). The "big trick" to IoT is:

  • Capturing data at the edge – securely, accurately, and reliably
  • Transporting the data securely (And securing the whole communications layer)
  • Storing the data in a sensible, scalable, and structured form
  • Making it accessible for other applications to combine, analyze and develop insights

That's it. Do all those things, and you're well on your way to IoT success.

There are many well-established, standard mechanisms for achieving all of the above – my first recommendation is to use them.

First, a bit of history.

In the relatively recent past, I worked for a large IoT vendor. One of my roles was triaging new devices; typically this meant listening to pitches from "hot" (read: keen and earnest) new startups who had given birth to a "groundbreaking new device/service concept" – almost none of which were groundbreaking, new, or particularly original.

Still, if you kiss enough frogs, you'll eventually find a prince (or develop a stunning rash).

Almost all of the presented devices broke one of the above golden rules. Sometimes fundamentally, sometimes fixably.

Let's dig deeper into the golden rules:


Capturing data at the edge – securely, accurately, and reliably

A lot of this comes down to hardware design – selecting components with appropriate tolerances, calibrating them well, maintaining robust time synchronization, and taking readings at the proper interval (or on the proper stimulus).
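As a sketch of what "calibrate well and timestamp at capture" can mean in practice, here is a minimal reading pipeline that applies a per-sensor gain/offset correction and attaches a UTC timestamp the moment the value is taken. The field names and calibration values are illustrative, not from any real device:

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Calibration:
    gain: float    # multiplicative correction from factory calibration
    offset: float  # additive correction (zero-point error)


def capture_reading(raw: float, cal: Calibration) -> dict:
    """Apply calibration and timestamp a raw sensor value at the moment of capture."""
    return {
        "value": raw * cal.gain + cal.offset,
        # Timestamp in UTC at capture time, not at transmission time -
        # readings may sit in a buffer for minutes before being sent.
        "ts": datetime.now(timezone.utc).isoformat(),
    }


# Example: a sensor whose raw counts read 2% high with a +0.5 zero offset.
cal = Calibration(gain=1.0 / 1.02, offset=-0.5)
reading = capture_reading(102.0, cal)
```

The key design point is that the timestamp and the calibrated value are bound together at the edge, so later buffering or retransmission cannot skew the record.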

In many applications, nationally mandated standards exist for the accuracy of sensors – for example, in the utility industry, meters need to measure water, gas, or electricity consumption within fine tolerances, particularly if consumers are going to be billed using the acquired data. These standards often mandate a requirement to encrypt data not only when it is being transmitted but as soon as it is captured – a requirement often expressed as protecting "data in motion and at rest."

Edge processing of data is an increasingly frequent requirement, either to extract information from a complex reading or simply to reduce the amount of data passed across a constrained channel.
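To illustrate the second case – reducing traffic over a constrained channel – a common pattern is to collapse a window of raw samples into a single summary record at the edge and transmit only that. A minimal sketch (the window size and summary fields are illustrative):

```python
from statistics import mean


def summarize_window(samples: list[float]) -> dict:
    """Collapse a window of raw samples into one summary record for transmission."""
    return {
        "n": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": mean(samples),
    }


# Sixty one-second temperature samples become a single record per minute,
# cutting channel usage by roughly 60x while keeping the useful statistics.
window = [20.1, 20.3, 19.8, 20.0, 20.2]
summary = summarize_window(window)
```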

I came across an excellent example of edge-processed IoT a couple of years ago at a trade fair. The startup in question had developed an AI camera system that could count people walking past and (pretty accurately) score the emotions of each of those individuals from their facial expressions. The output was a three-value tuple coding the non-linear scale of human emotions (happy-to-sad is one axis – but what about angry, irritated, and so on?). It seemed to work well across ethnicities and genders. They had set up an excellent demo showing that people going into the restaurant area were less happy coming out than when they went in. The time-stamped information presented on their platform showed clearly that the change in emotive state increased with footfall. There's a valuable insight there for the restaurant operator, but the applications across retail are simply enormous.


Transporting the data securely

You could have an extensive argument (and I frequently do) about whether data is most at risk of compromise when it is static on the sensor or in motion across the network. Network security is a huge, huge topic, but a few sound practice principles should be called out:

  • Use standards-based encryption – never invent your own. (For IoT systems, TLS or DTLS are the current standards of choice.)
  • Think HARD about key distribution – again, standards exist for this. Systems in which every device shares the same key, or which expose the private key, are often the subject of some delightful news articles.
  • Limit peer-to-peer communications – a full mesh of N devices means securing on the order of N² links, which is challenging, and limiting them makes it less likely that the compromise of one node will compromise the entire network.

Almost all networks provide their own link-layer security – however, that should be viewed as a bonus, not a substitute. Devices should establish their own end-to-end security relationship with the target platform.

Many networks also distribute time (and timing – these are two different things); care should be taken with network-provided time to understand what guarantees of an accurate timing source are offered across the entire network infrastructure.
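One practical defence is to sanity-check device timestamps against a trusted reference clock on receipt, and flag readings whose clocks have drifted too far. A minimal sketch – the tolerance value is illustrative and would depend on your application's timing requirements:

```python
def timestamp_plausible(device_ts: float, reference_ts: float,
                        max_drift_s: float = 5.0) -> bool:
    """Accept a device timestamp (Unix epoch seconds) only if it lies
    within max_drift_s of a trusted reference clock."""
    return abs(device_ts - reference_ts) <= max_drift_s


# A reading stamped 2 seconds off the reference clock passes;
# one stamped 60 seconds off is flagged for investigation.
```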


Storing the data in a sensible, scalable, and structured form

The first destination for data after receipt and validation should be a secure data store. The type and form of this depend very much on the data being received. It’s worthwhile assessing this against the Big Data 6V principles – there is another post on this very topic.

Time-series data is often best stored in a specialist time-series database. Substantial amounts of data, or data from a massive number of sources, can require specific Big Data frameworks such as Hadoop to store it appropriately.
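Even when a dedicated time-series database is not available, the same discipline applies: a narrow table keyed and indexed by sensor and timestamp so that time-range queries stay cheap. A minimal sketch using SQLite from the Python standard library (the schema and names are illustrative):

```python
import sqlite3


def open_store(path: str = ":memory:") -> sqlite3.Connection:
    """Create a narrow readings table indexed for per-sensor time-range queries."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS readings (
        sensor_id TEXT NOT NULL,
        ts        TEXT NOT NULL,   -- ISO-8601 UTC timestamp
        value     REAL NOT NULL)""")
    # Composite index makes "this sensor, this time window" scans cheap.
    db.execute("CREATE INDEX IF NOT EXISTS ix_readings ON readings (sensor_id, ts)")
    return db


db = open_store()
db.executemany("INSERT INTO readings VALUES (?, ?, ?)", [
    ("meter-1", "2024-01-01T00:00:00Z", 10.5),
    ("meter-1", "2024-01-01T00:01:00Z", 10.7),
])
rows = db.execute(
    "SELECT value FROM readings WHERE sensor_id = ? AND ts >= ? ORDER BY ts",
    ("meter-1", "2024-01-01T00:00:00Z"),
).fetchall()
```

ISO-8601 text timestamps sort lexicographically in time order, which is what makes the indexed range query above work without a dedicated time type.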

Conversely, alarm and event data are not time-series sensor data in the commonly understood form; other products, such as relational or flat-file databases, may provide a better fit.


Making it accessible for follow-on applications to combine, analyze and develop insights

Everything we have talked about so far is primarily infrastructure: capture, transport, store, and secure.

This layer is where the magic happens, and the value is added.

In its simplest form, making the data available via a well-structured API is a really good start. This opens the way to exposing the data to an array of data analytics and dashboarding tools – whether Python, R, Spark, Power BI, or Tableau. Your organization is bound to have a preference for its IoT data management strategy – and if it doesn't, Umbric has extensive experience in this space.
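To make "a well-structured API" concrete, here is a minimal read-only JSON endpoint using only the Python standard library. In practice you would reach for a proper framework with authentication and pagination; the route and payload here are purely illustrative:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Stand-in for the data store described above.
READINGS = [{"sensor": "meter-1", "ts": "2024-01-01T00:00:00Z", "value": 10.5}]


class ReadingsAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/readings":
            body = json.dumps(READINGS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the demo quiet
        pass


def serve(port: int = 0) -> ThreadingHTTPServer:
    """Start the API on a background thread; port 0 picks a free port."""
    srv = ThreadingHTTPServer(("127.0.0.1", port), ReadingsAPI)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv
```

Once data is behind an HTTP/JSON boundary like this, every tool named above can consume it without knowing anything about the sensors or the store underneath.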

You’ve got your data where it’s useful. Let us help you deliver the value.



About Umbric Data Services

Forget knowledge; data is power – especially when hooked up to custom web applications leveraging the latest in big data, machine learning, and AI to deliver profitable results for business.

Umbric Data Services combines the latest in tech with good old-fashioned customer service to deliver innovative, efficient software that drives productive business insight and revenues in the click of a button. Isn’t it time you worked smart, not hard? Find out more about how we help businesses to grow – visit today.
