This question springs from the not-so-distant future envisioned by Stafford Masie, a digital content and cloud computing specialist.
Masie, who is said to be associated with SEACOM, said in a statement that the internet is developing into a highly intelligent system called Smurfs. This is "a Sensory Membrane of Ubiquitous Real-time Federated Subsystems (Smurfs) that delivers rich services and applications that were the stuff of science-fiction just a decade ago".
He said a range of technologies are maturing and meshing together into a rich network that already has immense capabilities. Just think about doing a Google search on your phone. The device does none of the processing, yet you have access to a wealth of information within a few seconds of starting the search.
This is just the beginning. "Within the next 18 to 36 months, we can expect to see a range of machine-to-machine applications, as well as the growth of big data, completely change users' expectations of what the global network can do for them".
He said the internet is increasingly driven by machine-to-machine interactions. In time, the majority of traffic on the internet will originate from machines gathering and sharing data rather than from humans sharing and accessing information.
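The machine-to-machine pattern he describes can be illustrated with a minimal sketch. All names, intervals and readings here are invented for illustration: sensor nodes report telemetry to a collector with no human in the loop.

```python
import random

# Illustrative sketch of machine-to-machine traffic (all names and
# values are made up): sensor nodes periodically push readings to a
# collector without any human involvement.

class Collector:
    """Receives machine-generated messages."""
    def __init__(self):
        self.readings = []

    def ingest(self, payload):
        self.readings.append(payload)

class SensorNode:
    """A device that reports on its own schedule."""
    def __init__(self, node_id, collector):
        self.node_id = node_id
        self.collector = collector

    def report(self):
        # A real node would read hardware; we simulate a temperature.
        self.collector.ingest({
            "node": self.node_id,
            "temp_c": round(random.uniform(18.0, 24.0), 1),
        })

collector = Collector()
nodes = [SensorNode(f"node-{i}", collector) for i in range(3)]
for _ in range(5):          # five reporting cycles
    for node in nodes:
        node.report()

print(len(collector.readings))  # 15 messages, zero human input
```

Even in this toy form, every message on the "network" originates from a machine, which is the shift in traffic composition Masie is pointing to.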
In further explaining the Smurfs concept, Masie says the next generation of Internet technologies is:
“Sensory: Inanimate objects – microphones, radio-frequency identification readers, location-aware smartphones, even wearable devices that monitor a user’s health – are increasingly connected to the Internet. They gather and share large volumes of data with the world, feeding systems with plenty of information for decision-making.
Membrane: The whole ecosystem is designed to make it easy to share information and access services across a network; in fact, sharing often happens as organically as osmosis in an automated process.
Ubiquitous: Like electricity, connectivity is always present. Through mobile data connections, apps and devices, people have access to network services, information and applications wherever they are, especially in urban areas. Offline and online are converging, and the result is hyperconnectivity: users are always tethered.
Real-time: Applications harvest data in real-time and act on it. Users will increasingly expect things to happen right now, rather than sometime today or even in the next few minutes.
Federated subsystems: The whole ecosystem is federated, and apps continually call on a range of subsystems to provision services and information. Apps draw together data from a number of sources to deliver rich, powerful services that were difficult to conceive of in the past".
He said one example of Smurfs in action is the Cabsense service in New York City, which analyses tens of millions of GPS data points from NYC taxis, updated continuously in real time, to help users find the best corner at which to catch a cab. The iPhone and Android app draws on data from the New York City Taxi & Limousine Commission and other sources, ranking nearby street corners by the day of the week, the time and the user's current location.
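The ranking idea behind a service like this can be sketched in a few lines. This is not Cabsense's actual algorithm, and the corners, coordinates and pickup counts below are invented: score each corner within walking range by historical pickups for the current weekday and hour, then sort.

```python
from collections import defaultdict
from math import hypot

# Hypothetical Cabsense-style corner ranking (not the real service's
# algorithm). Toy historical data: (corner, weekday, hour) -> pickups.
pickups = defaultdict(int)
pickups[("5th&42nd", "Fri", 18)] = 320
pickups[("6th&42nd", "Fri", 18)] = 210
pickups[("5th&34th", "Fri", 18)] = 95

# Toy corner coordinates (lat/lon-like values)
corners = {
    "5th&42nd": (40.7540, -73.9840),
    "6th&42nd": (40.7546, -73.9849),
    "5th&34th": (40.7486, -73.9857),
}

def best_corners(user_pos, weekday, hour, max_dist=0.01):
    """Rank corners within walking range by historical pickups."""
    nearby = [
        (cid, pickups[(cid, weekday, hour)])
        for cid, pos in corners.items()
        if hypot(pos[0] - user_pos[0], pos[1] - user_pos[1]) <= max_dist
    ]
    return sorted(nearby, key=lambda c: c[1], reverse=True)

ranking = best_corners((40.7542, -73.9845), "Fri", 18)
print(ranking)  # highest-scoring corner first
```

The real service presumably folds in live GPS updates and far richer features, but the core pattern is the same: large volumes of machine-gathered data reduced to a single actionable answer for the user.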
Ease of use and accessibility are characteristic of the new world of computing. Users interact with data through powerful, easy-to-use apps rather than through websites. Rather than owning content, they access it through services such as the music service Spotify and the video-streaming service Netflix. "We are moving away from the world of owning things to the notion of accessing things in real time, wherever we may be. Ownership to access; from today to now, from pages to live streams, from me to we, from items to data".
“We are rapidly moving away from keyboard taps and mouse clicks towards touch and feel.” Motion and location awareness, gesture recognition and touch screens are creating more immersive and intuitive ways for people to interact with Web services and applications.
For example, a company called Runkeeper created an app and website that allows runners to publish data about their exercise activity. When it opened its app up to the market through an application programming interface, it suddenly became possible for third-party devices such as Wi-Fi scales, heart monitors, and wearable activity tracking devices such as the Jawbone UP wristband to share data with the app, providing a stream of real-time health and fitness data about the user.
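The integration pattern described above can be sketched in miniature. This is not Runkeeper's actual API; the class, method names and readings below are hypothetical, standing in for the idea that opening an interface lets third-party devices push data into a shared real-time stream.

```python
import json
from datetime import datetime, timezone

# Hypothetical sketch of the device-to-app pattern (not Runkeeper's
# real API): third-party devices publish readings to a small
# in-process "fitness API"; the app reads the latest values.

class FitnessAPI:
    def __init__(self):
        self.stream = []          # newest readings appended here

    def publish(self, device, metric, value):
        """Called by any device: a scale, heart monitor, wristband."""
        self.stream.append({
            "device": device,
            "metric": metric,
            "value": value,
            "at": datetime.now(timezone.utc).isoformat(),
        })

    def latest(self, metric):
        """App-side read: most recent reading for a given metric."""
        for reading in reversed(self.stream):
            if reading["metric"] == metric:
                return reading
        return None

api = FitnessAPI()
api.publish("wifi-scale", "weight_kg", 72.4)
api.publish("heart-monitor", "heart_rate_bpm", 128)
api.publish("wristband", "steps", 8412)

print(json.dumps(api.latest("heart_rate_bpm"), indent=2))
```

The design point is that the app never needs to know about any particular device in advance; anything that can call `publish` joins the stream, which is what made the ecosystem of scales, monitors and wristbands possible once the interface was opened.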
This emerging computing fabric is transforming network engagement from human-to-machine into machine-to-machine-to-human engagement: a world where vast amounts of sensory, biological, atmospheric, location and climatological data will be captured ambiently and persistently, creating real-time feedback loops and correlations among distributed subsystems.