If you think that the growth of IP traffic has been exponential in the past decade, then this decade is going to be something else. We're only starting to see what the Internet is capable of. If the past decade placed HTTP port 80 traffic – the web – at the centre, then expect this decade to be much more decentralised in the types of devices sending out data, and the types of devices receiving it.
Telemetry has been revolutionised by a common protocol. Where data used to be hard to retrieve and share, it can now be obtained from practically anywhere, thrown somewhere else, and made meaningful. And making it meaningful is the hardest bit. By far.
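To make that concrete, here is a minimal sketch – the collection endpoint, URL and JSON shape are invented purely for illustration – of how little plumbing it now takes to throw a reading at a server over plain HTTP and read it back from anywhere else that speaks the protocol.

```python
# A minimal sketch only: "example.org/readings" and the JSON shape are
# hypothetical, invented to illustrate pushing a telemetry reading over
# plain HTTP and fetching it back from any client on the network.
import json
import urllib.request

reading = {"sensor": "office-temperature", "celsius": 21.4}

# "Throw it somewhere": POST the reading to a (hypothetical) collection endpoint.
request = urllib.request.Request(
    "http://example.org/readings",
    data=json.dumps(reading).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(request)

# "Obtain it from practically anywhere": any HTTP client can now read it back.
latest = json.load(urllib.request.urlopen("http://example.org/readings/latest"))
print(latest["celsius"])
```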
Giving both clarity and context to data is attracting more interest as access to data becomes easier. Data visualisation – dataviz – is starting to interest a much wider group of people, as processing lots of data and making it look visually elegant becomes easier to achieve. Giving context to data is essentially the practice of statistical analysis; the word statistics is of central European origin, meaning "political state". The study and discipline of statistics allowed governments to understand the effectiveness of their functions, and to see where interventions had to be made. As ITO's Chris Osborne observes, the availability of rich data within our environments has transformed statistics into urban arithmetic.
Previously, statisticians and planners were required to model outcomes, based on predictions and assumptions. It was Edward Tufte who transformed these decisions into high-level iconography, making data visualisation accessible to all. Now, with the accessibility of visuals comes the accessibility of data.
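To give a sense of how low that barrier has become, here is a minimal sketch – the CSV file and its column name are assumptions made purely for illustration – of turning a large table of records into a presentable chart in a dozen lines of Python.

```python
# A minimal sketch: "journeys.csv" and its "hour_of_day" column are assumed
# to exist purely for illustration; any similarly shaped dataset would do.
import matplotlib.pyplot as plt
import pandas as pd

journeys = pd.read_csv("journeys.csv")             # one row per recorded journey
per_hour = journeys.groupby("hour_of_day").size()  # count journeys in each hour

per_hour.plot(kind="bar")
plt.xlabel("Hour of day")
plt.ylabel("Journeys")
plt.title("Journeys by hour of day")
plt.tight_layout()
plt.show()
```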