Chris Anderson recently wrote a thought-provoking piece in Wired and Edge called The End of Theory, in which he argues that massive datasets, computational algorithms and correlation (instead of causation) are the next step in the evolution of science. Highly recommended reading.
More importantly, leading thinkers such as Kevin Kelly, Bruce Sterling, Daniel Hillis, George Dyson, Douglas Rushkoff, Jaron Lanier, Stewart Brand and John Horgan respond to Anderson's view in this post on Edge, analyzing the pros and cons of The End of Theory. Below are some quotes by Anderson and his critics.
In my view, correlation might boost useful science where the correlations are robust and realistic. Nonetheless, in most disciplines intuition, creativity, asking good questions (perspectives and frames!), understanding, models and theory still add clear value, both for social and knowledge-sharing reasons and for a deeper understanding of the why behind natural or social phenomena. Anderson's reasoning is also relevant for the rise of the mobile internet and its ubiquitous computing role in the near future. All these real-time mobile sensors might boost correlations and predictive capabilities to a certain degree, while still acknowledging the power of Black Swans. Furthermore, Anderson's view seems to resonate with the Internet Scenario and Digital Gaia Scenario within the Singularity as described by Vernor Vinge, in which the continuing proliferation and advancement of the internet gives rise to a posthuman sense of consciousness that is too complex for us to comprehend. Finally, the role of the Semantic Web/Web 3.0 is interesting in the light of Anderson's reasoning: he seems to dismiss the benefits of the meaning and top-down structures of the Semantic Web. It would be great to see responses by Tim Berners-Lee and Nova Spivack to the Anderson piece.
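To make the correlation-versus-causation point concrete, here is a minimal sketch (my own illustration, not from Anderson's piece) of two variables that correlate strongly only because both are driven by a hidden confounder, the classic trap for purely data-driven science:

```python
import random

random.seed(42)

n = 10_000
# Hidden common cause, e.g. temperature driving both ice-cream
# sales and drowning incidents (a hypothetical example).
confounder = [random.gauss(0, 1) for _ in range(n)]
a = [c + random.gauss(0, 0.5) for c in confounder]
b = [c + random.gauss(0, 0.5) for c in confounder]

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    m = len(x)
    mx, my = sum(x) / m, sum(y) / m
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    vx = sum((xi - mx) ** 2 for xi in x)
    vy = sum((yi - my) ** 2 for yi in y)
    return cov / (vx * vy) ** 0.5

# a and b correlate strongly, yet neither causes the other.
r = pearson(a, b)
print(f"correlation between a and b: {r:.2f}")
```

A petabyte-scale crawler would find this pattern just as readily as a real causal link; only a theory (or an experiment) can tell the two apart.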
"All models are wrong, but some are useful. So proclaimed statistician George Box 30 years ago, and he was right. But what choice did we have? Only models, from cosmological equations to theories of human behavior, seemed to be able to consistently, if imperfectly, explain the world around us. Until now. Today companies like Google, which have grown up in an era of massively abundant data, don't have to settle for wrong models. Indeed, they don't have to settle for models at all.
Sixty years ago, digital computers made information readable. Twenty years ago, the Internet made it reachable. Ten years ago, the first search engine crawlers made it a single database. Now Google and like-minded companies are sifting through the most measured age in history, treating this massive corpus as a laboratory of the human condition. They are the children of the Petabyte Age."
"Chris Anderson seems to think computers will reduce science to pure induction, predicting the future based on the past. This method of course can't predict black swans, anomalous, truly novel events. Theory-laden human experts can't foresee black swans either, but for the foreseeable future, human experts will know how to handle black swans more adeptly when they appear."
"Just because we remove the limits and biases of human narrativity from science, does not mean other biases don't rush in to fill the vacuum."
"It is clear to me that while numerical simulation and computation are welcome tools, they are helpful only when they are used by good scientists to enhance their powers of creative reasoning. One rarely succeeds by “throwing a problem onto a computer”, instead it takes years and even decades of careful development and tuning of a simulation to get it to the point where it yields useful output, and in every case where it has done so it was because of sustained, creative theoretical work of the kind that has been traditionally at the heart of scientific progress."