This is part one of our series on the Internet of Things (IoT). Please find Part 2: The Data Goldmine here.
Traditional centralised databases will always have a role to play in analytics. However, the rise of the Internet of Things suggests that we will soon live in a world that’s anything but centralised. Estimated to be worth somewhere between $3.9 trillion and $11.1 trillion by 2025, IoT technologies are giving physical products the sensors and online connectivity to report on their own status and to communicate with other devices, all over the existing internet. As these IoT initiatives flourish, data management will move from the central data repository towards the edge of the network.
Organisations pursuing IoT are nearly twice as likely as others to have automated their data capture. They embed data management into the devices themselves, so that data is handled as soon as it is generated and flows as a smooth, steady stream through an IoT-ready architecture.
But here’s why traditional centralised databases will never become a thing of the past. All of that streaming data must be aggregated at the edge and delivered to the central databases as averages over manageable time windows, so that the central database only has to handle the influx at a controlled rate.
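To make that edge-side aggregation concrete, here is a minimal Python sketch of the pattern described above: a device buffers its own readings, computes an average per time window, and forwards only that summary towards the centre. The sensor, the window length, and the central endpoint URL are all assumptions for illustration, not part of any particular IoT platform.

```python
import random
import time
from statistics import mean

WINDOW_SECONDS = 60  # length of each aggregation window (assumed for the example)
CENTRAL_ENDPOINT = "https://central.example.com/ingest"  # hypothetical central database endpoint


def read_sensor() -> float:
    """Stand-in for a real device reading, e.g. a temperature sensor."""
    return 20.0 + random.uniform(-0.5, 0.5)


def forward_to_central(summary: dict) -> None:
    """Stand-in for the push (HTTP, MQTT, etc.) that delivers the summary centrally."""
    print(f"sending to {CENTRAL_ENDPOINT}: {summary}")


def run_edge_aggregator(sample_interval: float = 1.0) -> None:
    """Capture readings continuously, but only forward one average per window."""
    window: list[float] = []
    window_start = time.monotonic()
    while True:
        window.append(read_sensor())
        if time.monotonic() - window_start >= WINDOW_SECONDS:
            # Only the window summary leaves the edge, so the central database
            # receives one record per window instead of every raw reading.
            forward_to_central({
                "window_start": window_start,
                "samples": len(window),
                "average": mean(window),
            })
            window.clear()
            window_start = time.monotonic()
        time.sleep(sample_interval)


if __name__ == "__main__":
    run_edge_aggregator()
```

However the details differ from device to device, the effect is the same: the raw stream stays at the edge, and the central database sees a predictable, controlled trickle of summaries it can comfortably absorb.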
Any IoT initiative your business considers must be purpose-specific, and because it introduces new business models and data structures, it will inevitably challenge your normal information governance policies. Organisations must review their current policies to ensure that information is correctly defined, is of useful quality, and has the appropriate permissions and levels of security governing its access.
Much of this is common sense, and all of it builds on existing best practice. The only difference is that IoT advances stress your systems with greater data flow and a wider range of applications, adding more complex requirements to data governance processes and tools. Do you think your organisation can meet those data governance challenges?