“Consider this figure: $136 billion per year. That’s the research firm IDC’s estimate of the size of the big data market, worldwide, in 2016. This figure should surprise no one with an interest in big data.
But here’s another number: $3.1 trillion, IBM’s estimate of the yearly cost of poor quality data, in the US alone, in 2016. While most people who deal in data every day know that bad data is costly, this figure stuns.” — Harvard Business Review
While human input errors account for a significant share of that $3.1 trillion figure, outdated data also plays a major role in what poor-quality data costs companies every year.
Most predictive and forecasting models depend on historical data, which means that the more recent the data and the more frequently you capture it, the more accurate your model can become.
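To make that concrete, here is a minimal sketch in Python, using synthetic data (the drifting signal, window size, and time ranges are all illustrative assumptions, not figures from any study): it applies the same naive forecast to a trending metric, once frozen on stale data and once recomputed from fresh data, and compares the errors.

```python
import numpy as np

# Hypothetical drifting signal: a metric that trends upward over time.
rng = np.random.default_rng(42)
t = np.arange(200)
signal = 0.5 * t + rng.normal(0, 5, size=t.size)  # linear drift + noise

def naive_forecast(history):
    """Forecast the next value as the mean of the last 10 observations."""
    return history[-10:].mean()

# "Stale" model: computed once from data up to t=100, then frozen.
stale_pred = naive_forecast(signal[:100])

# "Fresh" model: recomputed from the most recent data at each step.
errors_stale, errors_fresh = [], []
for now in range(150, 200):
    actual = signal[now]
    fresh_pred = naive_forecast(signal[:now])
    errors_stale.append(abs(actual - stale_pred))
    errors_fresh.append(abs(actual - fresh_pred))

print(f"mean abs error, stale data: {np.mean(errors_stale):.1f}")
print(f"mean abs error, fresh data: {np.mean(errors_fresh):.1f}")
```

Because the signal keeps drifting, the frozen forecast falls further behind reality at every step, while the continuously refreshed one tracks it; that gap is the cost of outdated data in miniature.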
The Internet of Things (IoT) and real-time sensors promise to significantly reduce the challenges associated with poor-quality data by capturing data accurately and consistently over a desired period. Experts predict that in a few short years we will have close to 30 billion IoT devices systematically capturing data.
While IoT is limited in the types of data it can capture and is not a solution to every data quality challenge, it can help organizations process information in near real time, helping them make more informed, data-driven decisions.
What does this mean for you? It means that IoT should be an integral part of your intelligent systems. You don’t have to install IoT sensors everywhere; plenty of APIs already expose IoT data. But you should consider that capturing quality data in real time can enable your system to provide better insights than relying on outdated existing data, as in the sketch below.
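As a minimal sketch of that idea, here is one way a system might prefer a fresh sensor reading and fall back to historical data only when real-time data is unavailable. The endpoint `https://api.example.com/sensors/temperature`, its JSON field, and the fallback value are all hypothetical placeholders, not a real service:

```python
import time
import requests

SENSOR_URL = "https://api.example.com/sensors/temperature"  # hypothetical endpoint
HISTORICAL_FALLBACK = 21.5  # illustrative last known good reading

def latest_reading(timeout_s=2.0):
    """Try to fetch a near-real-time reading; fall back to historical data."""
    try:
        resp = requests.get(SENSOR_URL, timeout=timeout_s)
        resp.raise_for_status()
        return resp.json()["value"], "real-time"
    except (requests.RequestException, KeyError, ValueError):
        return HISTORICAL_FALLBACK, "historical fallback"

# Poll every few seconds and feed the freshest available value downstream.
for _ in range(3):
    value, source = latest_reading()
    print(f"using {source} reading: {value}")
    time.sleep(5)
```

The design choice worth noting is the explicit fallback: real-time sources fail, so a system that degrades gracefully to its historical baseline still benefits from fresh data whenever it is available.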