The growing number of connected devices in our environment and of wearables on our bodies is producing a huge flow of data. We're confronted with informative graphs about our bodies and our activities, such as sleeping and eating, enabling us to decide whether or not to do certain things. Ordinary household objects will start reporting to us directly, expecting us to react to their messages and data. To avoid being overwhelmed by all that data, we need a mechanism that handles most of these tasks automatically. Instead of us looking at the data ourselves, an automated script could absorb and analyse it far more effectively. And if the right rules are defined, it could even act straight away by sending commands to connected objects, or to ourselves, because in some cases we are the ones who need to take action.
One scenario is that we'll slowly get used to wearing augmented reality wearables that instruct us what to do and how to do it. It's highly probable that we're going to get a lot more guidance from cloud-based automated systems that monitor us and our (data) environment through our semi-digital senses. Think of the usual suspects, such as Google and Microsoft. If the cloud is in control, we risk living our lives as human swarm robots.
So it's time to develop a mechanism to remain in control: to be your own robot. The starting point is openness of the logic used to process our data, so you can understand and trace why decisions are taken, and choose or adapt the behavioral rules that apply to you. The proposed system consists of an augmented reality wearable showing you what to do, based on the rules and on continuous analysis of sensor data and cloud data originating from the other people present in the space.
A connected smartphone interface functions as the configuration tool for the rules and the connected sensors:
The basic layer of rules makes use of data coming straight from the sensors: movement, direction, heart rate, noise detection and so on. Because the HoloLens has highly accurate positional tracking, the origin of the data is known and can be used in the rules:
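To make this basic rule layer concrete, here is a minimal sketch of how raw sensor readings, each tagged with the wearer's tracked position, could be matched against simple condition-to-instruction rules. All names, thresholds, and the data shapes are illustrative assumptions, not part of any actual HoloLens API:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Reading:
    sensor: str                 # e.g. "heartrate", "noise", "movement"
    value: float
    position: Tuple[float, float, float]  # (x, y, z) from positional tracking

@dataclass
class Rule:
    sensor: str
    condition: Callable[[float], bool]
    instruction: str            # what the wearable should show the wearer

def apply_rules(rules: List[Rule], readings: List[Reading]) -> List[str]:
    """Return the instructions triggered by the current sensor readings."""
    return [rule.instruction
            for rule in rules
            for reading in readings
            if reading.sensor == rule.sensor and rule.condition(reading.value)]

# Example rules a participant might configure on the smartphone interface
rules = [
    Rule("heartrate", lambda bpm: bpm > 120, "Sit down and breathe slowly"),
    Rule("noise", lambda db: db > 80, "Move to a quieter corner"),
]

readings = [Reading("heartrate", 130.0, (1.2, 0.0, 3.4)),
            Reading("noise", 55.0, (1.2, 0.0, 3.4))]

print(apply_rules(rules, readings))  # → ['Sit down and breathe slowly']
```

Because each reading carries a position, rules could just as easily depend on where in the room a value originates, not only on its magnitude.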
In addition to measuring the values of the hardware sensors, subjective data can also be obtained and mapped precisely throughout the space:
By aggregating all this data, the system will be able to draw conclusions that go beyond the individual measurements. New rules can be specified based on this derivative data, so the humans in the room are effectively enhanced with sensors capable of sensing abstract qualities too.
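A derivative rule might look like the following sketch: raw measurements from everyone in the room are combined into a single abstract score (here called "restlessness"), and a rule fires on that score rather than on any one sensor. The formula, the thresholds, and the instruction are illustrative assumptions:

```python
from statistics import mean

def restlessness(participants):
    """Combine average movement and heart rate into one abstract room-level score.
    The 50/50 weighting and the 200 bpm normalisation are arbitrary choices."""
    avg_movement = mean(p["movement"] for p in participants)   # 0.0–1.0
    avg_heartrate = mean(p["heartrate"] for p in participants)  # bpm
    return 0.5 * avg_movement + 0.5 * (avg_heartrate / 200.0)

def derivative_rule(participants):
    """Fire a room-wide instruction when the derived score crosses a threshold."""
    if restlessness(participants) > 0.6:
        return "Everyone: stand still for ten seconds"
    return None

# Four participants, as in the experimental set-up
participants = [
    {"movement": 0.9, "heartrate": 140},
    {"movement": 0.8, "heartrate": 130},
    {"movement": 0.7, "heartrate": 120},
    {"movement": 0.6, "heartrate": 110},
]

print(derivative_rule(participants))  # → Everyone: stand still for ten seconds
```

The point of the sketch is that participants never see "restlessness" measured directly; it only exists as a derived quantity, yet it can still steer their behaviour through the rules.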
The experimental set-up is a room for four participants, each wearing a HoloLens. There's an ongoing session, and new participants can join whenever a HoloLens becomes available. Before entering, they define their own set of rules. On entering, they agree to follow the instructions, or to do nothing. Below is a sketch of what can be seen through the HoloLens:
Although ongoing developments in machine learning will probably lead to an even less transparent mechanism controlling us, that doesn't make thinking about losing control any less relevant. There are more and more easy-to-use wearables on the market, and sensors are being embedded in ever more devices. Once enough data is aggregated, the majority of people will consider the output of these devices useful, making it harder for critics to call attention to the downsides and risks of uncontrolled data sharing. This thought-provoking experiment is an 'in your face' visualisation of the impact these seemingly innocent sensors might have on us in the future. With just a small set of rules, the project lets participants experience what it is like to act, and to be monitored and controlled, like a robot. With rules that are well defined yet flexible enough, unexpected scenes might surprise participants as well as spectators.