Innovation for Oppression and Freedom
The Commit/ event in Amsterdam hosted a demo market showcasing Big Data innovation. This so-called ‘public-private research community’, consisting mostly of PhD students showing off their graduation projects, gives a clear indication of the direction new data technology is taking.
But I could not help noticing that some of these innovations are dangerous tools in the hands of (future) oppressive regimes. These innovations should scare the daylights out of freedom-loving individuals. They range from technology as seemingly innocent as ‘to which music is the world listening?’, which lists the analysis of behavioral patterns in any consumer product as its alternative application, to openly oppressive technology such as ‘wireless crowd monitoring’ and ‘monitoring group emotions’.
Innovation for Oppression
Here are a few applications of innovation for oppression (emphasis mine). Note that most of these oppressive use cases are casually listed as ‘alternative applications’.
Our technology might be used in the digital humanities or by intelligence and security services.
The type of challenge faced by investigative journalists can be likened to other intelligence activities in which the starting condition is ‘a pile of documents’. Therefore, alternative application areas are business intelligence, police investigation and academic research.
Alternatively, the crowd emotion monitor can also be used to increase safety at mass events. By monitoring the emotion of a crowd and seeing whether the group emotion gets heated up, possible incidents might be detected in an early stage.
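To see how little machinery such a monitor actually needs, here is a minimal sketch of my own (not the project’s actual system; every name and number is a hypothetical illustration): an aggregate crowd ‘arousal’ score per time interval, flagged whenever it spikes far above its recent rolling baseline.

```python
from collections import deque

def detect_heating(scores, window=10, threshold=2.0):
    """Flag time steps where the crowd's aggregate arousal score
    jumps well above its recent rolling average.

    scores: iterable of per-interval mean emotion scores (0..1).
    Returns the indices flagged as possible 'heating up' incidents.
    """
    history = deque(maxlen=window)
    alerts = []
    for i, s in enumerate(scores):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((x - mean) ** 2 for x in history) / window
            std = var ** 0.5 or 1e-9
            if (s - mean) / std > threshold:  # z-score spike
                alerts.append(i)
        history.append(s)
    return alerts

# A calm crowd with one sudden spike in aggregate emotion:
calm = [0.30, 0.31, 0.29, 0.30, 0.32, 0.31, 0.30, 0.29, 0.31, 0.30]
print(detect_heating(calm + [0.90]))  # → [10]
```

That is the whole trick: a rolling average and a threshold. Whether the flagged spike means ‘possible incident’ or ‘subversive agitation’ is entirely up to whoever runs the monitor.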
The Fishualization feedback system can help to reduce stress and increase productivity at work on the basis of an estimation of the workload and the mental and physical fitness of a worker. The initial model relies mostly on computer interaction, identification of tasks, and context switches. In the near future it will be combined with affect and physical aspects. We can also include the analysis of facial expressions or e-mail sentiments.
The algorithm we propose can be applied to other domains as well. For example, it can be used to help companies around the world to find potential business partners. If each company would expose its services in a standardized way, our system would be able to search among hundreds of thousands of companies in order to find meaningful partnerships (e.g. Apple with Nike). It could also be used to sell bundles of products that are related or to match people in social networks.
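A toy version of such a matcher (my own illustration under invented data, not the proposed algorithm itself) shows how simple the core idea is: if every company exposes what it offers and what it needs as standardized capability tags, ‘meaningful partnership’ reduces to scoring how well two companies cover each other’s needs.

```python
from itertools import combinations

# Hypothetical standardized service descriptions: each company lists
# what it offers and what it needs, as plain capability tags.
companies = {
    "Apple":  {"offers": {"consumer electronics", "retail reach"},
               "needs":  {"sports branding"}},
    "Nike":   {"offers": {"sports branding", "apparel"},
               "needs":  {"consumer electronics"}},
    "AcmeCo": {"offers": {"logistics"},
               "needs":  {"apparel"}},
}

def partnership_score(a, b):
    """Count how many of each party's needs the other can supply."""
    return (len(companies[a]["offers"] & companies[b]["needs"])
            + len(companies[b]["offers"] & companies[a]["needs"]))

def best_partnerships(top=3):
    """Rank all company pairs by mutual need coverage."""
    pairs = combinations(companies, 2)
    return sorted(pairs, key=lambda p: -partnership_score(*p))[:top]

print(best_partnerships(1))  # → [('Apple', 'Nike')]
```

The same scoring works unchanged if ‘companies’ become people and ‘services’ become traits, which is exactly why the social-network application comes for free.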
Information networks are everywhere. Our technologies are also applicable for environmental hazards, military operations, and hard-to-deploy sensor networks. Also the monitoring of urban environmental conditions – especially in emergency situations such as explosions or contamination – can profit from a quickly deployable wireless sensor network. These features make us a potential partner for military units and environmental agencies.
This is the perfect tool for human herd control. Intelligent lampposts can signal danger or safety to guide crowd movement.
Our research can also be applied to identify different groups of people visiting a city, such as shoppers, tourists and commuters.
If the technology can differentiate between shoppers and tourists, I’m sure it can also be used to identify obedient masses versus ‘subversive elements’.
Although cleverly marketed for safety, this technology can obviously be used to detect early gatherings of angry mobs, budding uprisings and the formation of mass demonstrations.
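To illustrate how short the distance is from crowd monitoring to gathering detection, here is a crude sketch of my own (not any project’s code; all coordinates are invented): bucket anonymous position pings into grid cells and flag any cell where unusually many people cluster.

```python
from collections import Counter

def detect_gatherings(positions, cell=10.0, min_count=5):
    """Bucket (x, y) position pings into grid cells and flag cells
    whose occupancy reaches min_count -- a crude 'gathering' detector.
    """
    counts = Counter((int(x // cell), int(y // cell)) for x, y in positions)
    return [c for c, n in counts.items() if n >= min_count]

# Scattered individuals plus one dense cluster around (55, 55):
pings = [(3, 80), (91, 12), (40, 40), (70, 5)]
pings += [(55 + dx, 55 + dy) for dx in range(3) for dy in range(2)]
print(detect_gatherings(pings))  # → [(5, 5)]
```

Nothing in the code knows or cares whether the flagged cell is a festival, a queue, or a demonstration forming.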
I am sure that the PhD students and scientists working on these technologies have their hearts in the right place and are genuinely working towards the progress of all mankind. However, good intentions do not rule out oppressive use cases. It is very obvious that when the above technologies are combined and put to use by oppressive regimes, such regimes can (more) effectively control human herds for their own benefit. If the common people aren’t cattle yet, they soon will be.
Therefore we need to ask ourselves: who will protect the people when big funding invests heavily in oppressive herd-control technology? Our freedom and our privacy become matters of organized self-defence: crowd defence. We need decentralized, swarm-like technologies (as in: cannot be shut down) to secure our freedom. And we need to get funded!