Special issue with the Sunday Magazine
MILLENNIUM: January 23, 2000
Discovery of the future
Biman B. Nath
The author is an astrophysicist at the Raman Research Institute, Bangalore.
No other metaphor is likely to be more apt than that of an avalanche to describe the growth of the enterprise we call "modern science". Slowly awakened by the subterranean rumbles of the first half of the millennium, it has snowballed into a dominant aspect of our civilisation. Its spectacular and ever increasing growth is evident from the fact that the number of people engaged in "scientific research and development" doubles almost every 14 years. The word "scientist", which we take so much for granted today, has not been with us for as long as we might expect from the influence scientists have on our lives; it was coined only in the 19th century.
The difference that the last few centuries have made can be felt by considering what the scientists of today would have done in the past. If telescoped back in time, they would have spent most of their time studying old texts, interpreting and comparing them, without giving much thought to doing simple experiments to find out the correct answer to a problem. Instead, the answer would have been sought in the texts of those who were judged to be authorities. Of course there was some practical knowledge available, for working in the mines, for example, but it constituted little more than a haphazard collection of isolated facts, without any underlying connecting thread. To understand nature was not a priority; what thrived instead was black magic, the attempt to subjugate nature by making her obey human wishes through esoteric and dubious means.
Curiously, the last millennium can be divided into two almost neat halves, with a black and white contrast between them. If black magic dominated the first half, it was the "white magic" of science that ruled the latter half. While the wish to tame nature against her will, disobeying her own laws, was the driving force behind black magic in the medieval era, the attitude changed (sometime around 1500 A.D.) to trying to dominate nature by understanding her laws and making her obey them. This attitude of looking at nature without the biases of ancient texts influenced not only science (or natural philosophy, as it was then called) but the world of the arts too, giving rise, for example, to the concept of perspective. Madonnas in the hands of painters looked like real women. Then, Gutenberg's invention of the printing press stoked the furnace by making this "magical knowledge" freely available and less esoteric. And of course the beginning of ocean exploration in the late 15th century gave another nudge to the avalanche by demanding ever more accurate timekeeping devices and more accurate astronomical observations for navigation.
The world had suddenly discovered that there were things yet to be discovered. The nectar of the past was no longer enough to dip into; there was a brave new future to be explored. Also, it was not enough to talk about the spiritual world; the Humanists put Man at the centre of nature, as her interpreter. This difference in attitude is echoed, for example, in a famous painting by an Italian around 1500 A.D. The background landscape against which the bewitching smile of Mona Lisa was set was revolutionary, in the sense that no one had thought of painting landscapes before. The painter did not only paint mountains and rivers behind the shoulders of Mona Lisa; he also clambered up and down the mountains of his native northern Italy. Examining the fossils of shells buried there, he argued against the legend of the Biblical flood. Man had never looked at himself and his surroundings with so much enthusiasm before. And there was no stopping him now.
Everything that was considered to be irrelevant detail of life was to be looked at anew. Some of the topics of research may sound silly from the viewpoint of science today. (Consider the experiments listed in the proceedings of the Royal Society, which was founded in 1660, such as "feeding a carp in the air" or "experiments with poysoned Indian Dagger on several animals"; but they were like the excesses of youth.) What is impressive though is the sudden interest in the surrounding nature. The discoveries made by Galileo with his telescope are legendary, but he too did not leave anything around him - gnats, flies, fleas - unobserved when he constructed a crude microscope soon after.
It was not passively observing nature, but quantifying the results of experiments, and comparing them with expectations, that was the hallmark of this new vision. Copernicus did not simply announce his model of the solar system as if it were something final. He added that accurate observations - accurate to about half a degree, according to him - would be able to test his model. It was a bold gesture, to put one's ideas at stake, making them liable to be discarded by experiment, and not seek refuge in the words of ancient authorities. Inspired by the works of Copernicus, Tycho Brahe built his observatory to get such accurate observations, based on which Kepler derived his laws of planetary motion. These two strands, of using the precise language of mathematics, and of testing hypotheses against experiments, culminated with Newton, whose works epitomised the spirit of modern science.
This avalanche probably would not have snowballed this far if the fruits of the new science were not used for the Industrial Revolution in the 18th century. Francis Bacon, whose writings greatly influenced the community of scientists then, put this link of science with its practical applications above other motivations to pursue science. The lack of this vital link, probably, is also the reason why science shrivelled up in many other societies, in the East, for example, although its seeds appeared much earlier there. If Joseph Black, who discovered latent heat (heat shed by matter when condensing from vapour to liquid form), had not met James Watt, who was worried by the outrageous loss of heat in his rudimentary steam engine, then the steam engine would have only been of passing interest to the historians of science later. With Black's knowledge of the physical properties of steam and conduction of heat through metals, Watt was able to design a steam engine that could be really put to practical use (in 1788). This singular invention ushered in the age of the engineers. Just as the progress of science during the Renaissance was reflected in paintings and sculptures, in the 19th century it was mirrored in massive bridges and buildings.
The demand from industry made sure that science became more organised. Laboratories sprang up, and with them, centres for training scientists. (Alfred North Whitehead once remarked that the greatest invention of the 19th century was the invention of the method of invention.) Scientific journals became essential for the storage and communication of knowledge. The effect of this avalanche was such that although there were fewer than ten journals in 1700, the number reached 10,000 in two centuries, and it has since been growing at breakneck speed. For those who love statistics, it has been said that science is growing at such a pace that 90 per cent of all the scientists who have ever lived on this planet are alive today.
What this revolution has achieved for humankind is of course common knowledge. The average lifespan has increased steadily (far fewer people reached the age of 70 in the middle of the last millennium than do today) and our exploratory spaceships have viewed the farthest planets of our solar system from a giddying proximity. Of course, along with the leisure it has given us, science has also bestowed on us the privilege of inconceivably dangerous weapons. Against the backdrop of the machines of mass destruction that our planet is now decked with, and of the bewildering pace of science, it is perhaps not surprising that our society has had occasional hiccups of anxiety over the whole scientific enterprise. Curiously, the final years of the last three centuries have been peppered with thoughts about whether the bandwagon of science was speeding away misguided, or whether it was going to collapse and die soon.
The end of the 18th century saw the rise of such doubts in the minds of the positivists. They maintained that science should content itself with only a limited number of topics and keep away from questions which are "unknowable" (like the cause of gravity). Science should not forget its limits. Of course, many of their claims have been proved wrong, and sociologists in fact argue over whether positivism (with its centre in France) was the cause of the decline of French science in later years.
Around a hundred years later, physicists thought that the study of physics would soon reach its final chapters, and philosophers argued about a new set of "unknowable" questions (many of which, admittedly, have not yet been solved, like the origin of life). Yet physics soon discovered relativity and quantum mechanics, which turned many earlier notions about the universe on their head.
At the dawn of the new millennium, such thoughts have been nudging us again, although in different forms. Many physicists have proclaimed that physics would end soon, after the so-called "theory of everything" is worked out (in which the four basic forces of nature are unified), since the rest of physics would be just details. There have been even more disturbing voices. Science has been alleged to have become "post-modern", in the sense that scientists are increasingly indulging in questions whose answers cannot be determined by experiments in the near future. This makes their research more and more speculative and, like postmodern literature, devoid of any reader-independent truth (just as in literary criticism, deconstructionism holds that there is no truth to be looked for in works of literature, and any interpretation of a work is as valid as any other). Also, if one compares the growth of science with, say, that of population, one is tempted to conclude that accelerated growth cannot be sustained ad infinitum. After all, it is argued, there must be some limits to what can be known.
It is possible that the ability to do experiments will lag behind that of theorising in the future, and that many fields of science will necessarily become speculative. But there is reason to be excited about many other aspects of science. The last few decades have witnessed significant developments in the research on chaos and complexity. It has become clear that knowing the laws of nature correctly does not guarantee that one can predict the outcome accurately. Small deviations at some point of time can result in completely different outcomes later. (The so-called "butterfly effect" conveys the fact that even the flutter of a butterfly's wing can precipitate large changes in the weather, making it impossible to predict any long term behaviour.) Therefore, science will not end just by delineating the underlying laws; there will be a whole lot of things to study, for example, the patterns of long term behaviour.

There is a more interesting aspect to this. Science usually studies a phenomenon by reducing it to its simplest version, devoid of apparently irrelevant details. After having understood this version, one then adds the details, making it more and more complex, as in reality. Scientists have recently found that complex systems do not always behave as one would expect from naive extrapolations from simple systems. As the degree of complexity increases gradually beyond some thresholds, the change in behaviour is often abrupt (and not gradual), taking the system to completely different states. Scientists have barely scratched the surface of "complexity" - the study of complex systems - but it is certain that simply knowing the correct laws of nature is not enough to predict how a real, complex system would behave in the future.
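The sensitivity described above can be seen in a few lines of code. The logistic map is a standard textbook illustration of chaos (it is not mentioned in the article itself); iterated in its chaotic regime, two trajectories that start a millionth apart soon become completely different, which is the essence of the "butterfly effect":

```python
# A minimal sketch of sensitive dependence on initial conditions,
# using the logistic map x -> r*x*(1-x) with r = 4.0 (chaotic regime).
# The map and parameter values are illustrative choices, not from the article.

def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.300000)   # one initial condition
b = logistic_orbit(0.300001)   # a "butterfly-sized" perturbation of it

# The gap between the two trajectories at each step:
divergence = [abs(x - y) for x, y in zip(a, b)]
```

Although the two starting points differ by only one part in a million, the gap between the trajectories grows roughly exponentially, and within a few dozen iterations the two orbits bear no resemblance to each other, even though the governing law is known exactly.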
Bringing their vision closer to the specifics, away from questions on the ultimate limits of science, scientists do foresee a steadily growing activity in science. After the revolution in technology brought about by quantum mechanics in the 20th century, which gave us transistors and lasers and in turn paved the way for the computer revolution, the next century is being seen as the era of biotechnology. The precise knowledge of the DNA sequence of a large number of organisms will help in curing many genetic diseases and in battling hostile microbes. Quantum mechanics will continue to help the progress down to smaller and smaller sizes, making micro-machines possible, even manipulating individual molecules and atoms. In the world of the large, futurists have been debating the possible extent of space exploration. Some scientists have predicted that in a century or two we may become a civilisation which has mastered the art of harnessing all possible energy sources on the planet, and in doing so, become a "planetary civilisation", putting aside nationalistic and religious differences. By the end of the next millennium we may even have fully mastered the energy of the Sun, when the solar system would become our virtual backyard. Long term predictions can be deceptive though, as we have just seen.
What seems almost certain is that this millennium will be shaped by science with more vigour than in the past. The scientific avalanche may change its colour and shape but it does not look ready to end in a whimper any time soon. This millennium will probably be remembered as one in which we looked away disenchanted from the past and discovered the future.
Copyright © 2000, The Hindu.
Republication or redissemination of the contents of this screen is expressly prohibited
without the written consent of The Hindu.