Google as God?

Opportunities and Risks of the Information Age

by Dirk Helbing (ETH Zurich)

“You’re already a walking sensor platform… You are aware of the fact that somebody can know where you are at all times because you carry a mobile device, even if that mobile device is turned off. You know this, I hope? Yes? Well, you should… Since you can’t connect dots you don’t have, it drives us into a mode of, we fundamentally try to collect everything and hang on to it forever… It is really very nearly within our grasp to be able to compute on all human generated information.” 

 Ira “Gus” Hunt, CIA Chief Technology Officer [1] 


If God did not exist – people would invent one! The development of human civilization requires mechanisms promoting cooperation and social order. One of these mechanisms is based on the idea that everything we do is seen and judged by God – bad deeds will be punished, while good ones will be rewarded. The Information Age has now fueled the dream that God-like omniscience and omnipotence can be created by man.

For many decades, the processing power of computer chips has increased exponentially – a trend known as “Moore’s Law”. Storage capacity is growing even faster. We are now entering the phase of the “Internet of Things”, where computer chips and measurement sensors will soon be scattered everywhere, producing huge amounts of data (“Big Data”). It is not just cell phones, computers and factories that are more and more connected, but also our coffee machines, fridges, shoes and clothes, among other things.

Gold Rush for the 21st Century Oil

This huge amount of data – including credit card transactions, communication with friends and colleagues, mobility data and more – is already celebrated as the “Oil of the 21st Century”. The gold rush to exploit this valuable resource is just starting. But the more data are generated, stored and interpreted, the more it will become possible for companies and secret services to know us better than our friends and families do. For example, the company “Recorded Future” – apparently a joint initiative between Google and the CIA – seems to investigate people’s social networks and mobility profiles. Furthermore, credit card companies analyze “consumers’ genes” – the factors that determine our consumer behaviour.

Our individual motivations are analyzed in order to understand our decisions and influence our behavior through personalized search, individualized advertisements, and the recommendations or decisions of our Facebook friends. But how many of these “friends” are trustworthy, how many of them are paid to influence us, and how many are software robots?

Humans Controlled by Computers?

Today, computers autonomously perform the majority of financial transactions. They decide how much we have to pay for our loans or insurance, based on our behavioral data and on those of our friends, neighbors and colleagues. People are increasingly discriminated against by obscure “machine learning” algorithms, which are neither transparent nor required to meet particular quality standards. People classified as dangerous are now eliminated by drones, without a chance to prove their innocence, while some countries are discussing robot rights. Soon, Google will drive our cars. And in ten years, supercomputers will exceed the performance of human brains.

Is Privacy Still Needed?

What will the role of privacy be in such an information society? Some companies are already trying to turn privacy into a marketable commodity. This is done by first taking away our privacy and then selling it back to us. The company Acxiom, for example, is said to sell detailed data about more than 500 million people. Would it be possible to know beforehand whether the data will be used for good or bad? Many will pay to have their personal data removed from the Internet and commercial databases. And where data removal is not possible, fake identities and mobility profiles will be offered for sale, to obfuscate our traces.

Information Overload

“Big Data” do not necessarily mean that we will see the world more accurately. Rather, we will have to pay for “digital eyewear” that allows us to maintain an overview of the data deluge. Those not willing to pay (possibly also with personal data) will be blinded by information overload. Already today, we cannot assess the quality of the answers we get online. As the way in which the underlying data are processed remains hidden from the user, it is hard to know how much we are being manipulated by web services and social media. But given the huge economic potential, it is pretty clear that manipulation is happening.

The Knowledge-Is-Power Society

The statement “knowledge is power” seems to imply that “omniscience is omnipotence” – a tempting idea indeed. Therefore, whoever collects all the data in the world, such as the National Security Agency (NSA) in the United States, might hope to become almighty, especially if equipped with suitable manipulation tools. By knowing everything about us, one can always find an Achilles’ heel. Even CIA director General David Petraeus was not immune to this risk: he was brought down by a love affair irrelevant to his duties.

The developments outlined above are not fantasy – they are already taking place behind the scenes or are just around the corner. Yet, our society and legal system are not well prepared for this.

The American dream of omniscience and omnipotence (see the all-seeing eye) is imprinted on each one-dollar bill. Together with the belief in God (“In God We Trust”), it is suggested to be the basis of a new world order (“Novus Ordo Seclorum”). Source: Wikimedia Commons.

A New World Order Based on Information?

Some people may see information and technologies as new tools to create social order. Why should one object to a computer or government or company taking decisions for us, as long as they are in our interest? But who would decide how to use these tools? Can the concept of a ‘caring state’ or a ‘benevolent dictator’ really work? In other words, can supercomputers enabled by Big Data take the decisions that are best for us?

This has always failed in the past, and it will also be unsuccessful in the future. Not only do many systems fail under asymmetric information (when some stakeholders are very well informed and others very badly). The performance of all computers in the world will also never be sufficient to optimize our economy and society in real time. Supercomputers cannot even optimize the traffic lights of a big city in real time, because the computational effort explodes with the size and complexity of the system. Only a very simple society could be optimized top-down – but who would want to live in it?
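The combinatorial explosion behind this argument can be illustrated with a toy calculation (a hypothetical sketch with made-up numbers, not an actual traffic model): if each of n intersections can be in one of k signal phases, a brute-force optimizer must evaluate k**n joint configurations, which quickly exceeds any conceivable computing budget.

```python
# Toy illustration of combinatorial explosion in top-down optimization.
# Assumptions (hypothetical): each intersection has 4 signal phases, and
# evaluating one joint configuration of the whole city takes 1 microsecond.

def brute_force_years(intersections: int, phases: int = 4,
                      eval_seconds: float = 1e-6) -> float:
    """Years needed to enumerate all joint signal configurations."""
    configurations = phases ** intersections
    return configurations * eval_seconds / (3600 * 24 * 365)

for n in (10, 20, 30):
    print(f"{n} intersections: ~{brute_force_years(n):.3g} years")
```

Even at a million evaluations per second, 30 intersections already require tens of thousands of years – while a real city has thousands of intersections, plus variable demand.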

Privacy and Socio-Diversity Need Protection

The aforementioned “omniscient, almighty society” cannot work. If we all did what a super-intelligent institution thinks is right, it would be as if children always did what their parents ask, never becoming adolescents: they would never take autonomous decisions and find their own way. Privacy is a necessary ingredient for the development of individual responsibility and for society. It should not be understood as a concession to the citizens.

“Private” and “public” are two sides of the same coin; neither can exist without the other. People can only cope with the thousands of normative public expectations they face every day if there is a private, protected space where they can be free and relax. Privacy is an invention that reduces mutual interference to a degree that allows us to “live and let live”. If we knew what others think, we would have far more conflicts.

The importance of unobserved opinion formation is demonstrated by the crucial role of anonymous votes in democracies. If we only adjusted ourselves to the expectations of others, many new ideas would never emerge or spread. Social diversity would decrease, and with it the ability of our society to adapt. Innovation requires the protection of minorities and new ideas; it is an engine of our economy. Social diversity also promotes happiness, social well-being, and the ability of our society to recover from shocks (“resilience”).

Social diversity must be protected just as much as biodiversity. Today, however, the Internet recommends opinions to us about books, music, movies and even about friends and partners. This undermines the principle of the “wisdom of crowds” and collective intelligence. Why should a company decide what is good for us? Why can’t we determine the recommendation algorithms ourselves? Why don’t we get access to the relevant data?

An Alternative Vision of the Information Age

Even in an increasingly unstable world, surveillance, combined with the manipulation or suppression of undesired behaviors, is not a sustainable solution. But is there an alternative to the omniscient, almighty state that matches our ethical values – an alternative that can create cultural and economic prosperity? Yes, indeed!

Our society and economy are currently undergoing a fundamental transformation. Global networking creates increasing complexity and instability that cannot be properly managed by planning, optimization and top-down control. A flexible adaptation to local needs works better for complex, variable systems. This means that managing complexity requires a stronger bottom-up component.

In the economy and the organization of the Internet, decentralized self-organization principles have always played a big role. Now they have also spread to intelligent energy networks (“smart grids”) and traffic control. One day, societal decision-making and economic production processes will also be run in a more participatory way to better manage the increase in complexity. It seems the natural course of history. A growing desire of citizens to participate in social, political and economic affairs is already found in many parts of the world.

The Democratic, Participatory Market Society

In connection with a participatory economy, one often speaks of “prosumers”, i.e. co-producing consumers. Advanced collaboration platforms will allow anyone to set up projects with others to create their own products, for example with 3D printers. Thus, classical companies and political parties and institutions might increasingly be replaced by project-based initiatives – a form of organization that I would like to call “democratic, participatory market society”.

To ensure that the participatory market society will work well and create jobs on a large scale, the right decisions will have to be taken. For example, it seems essential that the information systems of the future will be open, transparent and participatory. This requires us to create a participatory information and innovation ecosystem, i.e. to make large amounts of data accessible to everyone.

The Benefit of Opening Data to All

The great advantage of information is that it is (re)producible in a cheap and almost unlimited way, so that the eternal struggle for limited resources might be overcome. It is important that we take advantage of this and open the door to an age of creativity rather than limiting access to information, thereby creating artificial scarcity again. Today, many companies collect data, but lack access to other important data. The situation is as if everyone owned a few words of our language, but had to pay for the use of all the other words. It is pretty clear that, under such conditions, we could not fully capitalize on our communicative potentials.

To overcome this unsatisfactory data exchange situation and achieve “digital literacy”, one could decide to open up data for all. Remember that most countries have also decided to turn the privilege of reading and writing into a public good by establishing public schools. It is well known that this step boosted the development of modern societies. Similarly, “Open Data” could boost the development of the information society – provided the producers of data are adequately compensated.

A New Paradigm to Manage Complexity

Access to data is essential for the successful management of complex dynamical systems, which requires three elements: (i) proper systems design to support predictability and controllability, (ii) probabilistic short-term forecasts of the system dynamics, which need plenty of reliable real-time data, and (iii) suitable adaptive mechanisms (“feedback loops”) that support the desired system behaviour.
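A minimal sketch of element (iii) – an adaptive feedback loop – might look as follows. All parameters here are hypothetical; this illustrates the idea of decentralized, data-driven adaptation, not a real traffic controller: the signal reacts to locally measured queues instead of following a fixed, centrally optimized schedule.

```python
import random

# Hypothetical bottom-up feedback loop: a traffic light serves whichever
# approach currently has the longest queue, using only local real-time data.

random.seed(42)

queues = {"north-south": 0, "east-west": 0}

def step(queues, arrival_prob=0.4, service_rate=2):
    """One control cycle: random local arrivals, then adaptive service."""
    # New vehicles arrive at random (the real-time data being measured).
    for road in queues:
        if random.random() < arrival_prob:
            queues[road] += 1
    # Feedback rule: give green to the longest queue, clear a few vehicles.
    green = max(queues, key=queues.get)
    queues[green] = max(0, queues[green] - service_rate)
    return green

for t in range(20):
    step(queues)

print("queues after 20 steps:", queues)
```

Because the rule responds to the measured state rather than a precomputed plan, queues stay short without any central optimizer – a toy version of the self-organization principle discussed next.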

Managing complexity should build on the natural tendency of complex dynamical systems to self-organize. To enable self-organization, it is crucial to find the right institutional settings and suitable “rules of the game”, while avoiding too much top-down control. Then, complex systems can essentially regulate themselves.

One must be aware, however, that complex systems often behave in counterintuitive ways. Hence, it is easy to choose the wrong rules, thereby ending up with suboptimal results, unwanted side effects, or unstable system behaviors that can lead to man-made disasters. The financial system, which went out of control, might serve as a warning. These problems have traditionally been managed by top-down regulation, which is usually inefficient and expensive.

Loss of Control due to a Wrong Way of Thinking

Whether a system can be adequately managed or is self-organizing in the way we want is a matter of systems design. If the system is designed in the wrong way, then it will get out of control sooner or later, even if all actors involved are highly trained, well equipped and highly motivated to do the right things. “Phantom traffic jams” and crowd disasters are examples of unwanted situations that occur despite all efforts to prevent them from happening. Likewise, financial crises, conflicts and wars can be unintended consequences of systemic instabilities. Even today, we are still not immune to them.

Therefore, we need a much better understanding of our techno-socio-economic-ecological systems and their interdependencies. Appropriate institutions and rules for our highly networked world must still be found. The information age will revolutionize our economy and society in a dramatic way. If we do not pay sufficient attention to these developments, we will suffer the fate of a car driving too fast on a foggy day.

Decisions Needed to Use Opportunities and Avoid Risks

To meet the challenges of the 21st century and benefit from its great opportunities, a Global Systems Science needs to be established in order to fill the current knowledge gaps. It would generate new insights, allowing politics, the economy and society to take better informed, more successful decisions. This could help us to seize the opportunities of the information age and minimize its risks. We must be aware that everything is possible – from a Big Brother society to a participatory economy and society. The choice is ours!


About the author

Dirk Helbing is Professor of Sociology, in particular of Modeling and Simulation, at ETH Zurich, and a member of its Computer Science Department. He earned a PhD in physics and was Managing Director of the Institute of Transport & Economics at Dresden University of Technology in Germany. He is internationally known for his work on pedestrian crowds, vehicle traffic, and agent-based models of social systems. Furthermore, he coordinates the FuturICT Initiative, which focuses on the understanding of techno-socio-economic systems using Big Data. His work is documented by hundreds of scientific articles, dozens of keynote talks, and media reports in all major languages. Helbing is an elected member of the World Economic Forum’s Global Agenda Council on Complex Systems and of the German Academy of Sciences “Leopoldina”. He is also chairman of the Physics of Socio-Economic Systems Division of the German Physical Society and a co-founder of ETH Zurich’s Risk Center.

[1] See: and


One Response to “Google as God?”

  1. Thanks to the opportunities offered by Coursera, I have just signed up for a course offered at Stanford University entitled ‘Social and Economic Networks: Models and Analysis’. My intention is to advance my limited knowledge and skills in this field. I have been prompted to do so by the new social and economic opportunities generated in the information world – of particular recent interest being the field of e-medicine and wellness. Whilst I am well qualified and experienced in the field of medicine and wellness – particularly having researched the key role of subjectivity in the study of psychoneuroimmunology (PNI) – I am still technologically challenged in the ‘e’ aspect.

    Which brings me to the point of the article above. The empowerment of IT with complexity-based thinking raises the Orwellian specter described above. My mentor and author of the concept of holism, Jan Smuts, already warned in 1931: “…A serious lag has already developed between our rapid scientific advance and our stationary ethical development. Science itself must help to close this dangerous gap in our advance that threatens the disruption of our civilization and the decay of our species.”

    Put simply, the more powerful our technology becomes, the more it amplifies the vulnerable and exponentially dangerous side of human nature. A colleague, Richard Barrett, in his recent book ‘Love, Fear and the Destiny of Nations’, points out that the cultural dissonance in our societies, which produces social toxicity, is founded in deficiency needs. Those core human needs that are not met – the necessities for survival, social cohesion, self-recognition, etc. – unconsciously and perniciously permeate all of our thinking and human endeavour. This comes at the expense of our natural inclinations to mutuality, trust and co-creation. Scarily, this includes the appropriation and employment of our most powerful technology – often overtly portrayed as well-intentioned but also potentially covertly nefarious – in economic, social and political management. The case presented above is a case in point.

    I would argue now that the greatest danger facing humankind is the development of this technology when applied from a paradigm of reductionism, mechanistic materialism, and the intrinsic separatism that accompanies such consciousness. The objectification of the human experience into an economic equation which inevitably follows reductionist thinking needs a radical reassessment. It inexorably points to holistic thinking as the requisite paradigm of consciousness that reveals the interrelatedness of all being especially in relation to new intuitions into economic possibility. The commoditization of knowledge and information enabled by the very systems identified by the writer enables the manipulation of humanity as the marketplace. A holistic view of humanity points to economics based on a sharing of resources in an exchange of value that is enriching of human experience in its deepest sense and also life-enhancing – in respect of the entire enabling milieu. The ‘commons’ notion introduced by Elinor Ostrom points to this opportunity.

    Now complexity science and its powerful applications to IT can be employed by both paradigms. Its ethical application calls for a grassroots mobilization – a civil-society empowerment – inspired by a new collective quest for the revelation of a more meaningful human destiny. It calls for people like me, technophobes, to get involved.
