John R. Rice
Department of Computer Sciences
Purdue University
October, 1996
Changes are coming in the way scientific research is funded. Many people are concerned that industry and government will decrease funding levels, and that the emphasis of research funding may shift. The latter fear comes from the emerging focus on funding strategic research -- research that has the clear potential to benefit the nation's economy, health, environment, education, or other important goals. This new focus replaces the goal of simply advancing science, which prevailed for the previous 40 to 50 years. Thus the grand challenges in computational science of the late 1980s, which were meant to advance the frontiers of science, were later enlarged to include ``national challenges'' of benefit to society. Strategic research is the continuation of this shift from science for science's sake to science for society's sake. Computer science is at heart an applied field, and the great majority of computing research has strategic connections. Computational science and engineering applications are one driving force that involves all aspects of computing research.
Strategic research is still fuzzily defined, but it refers to areas of research, not to types. It is orthogonal to concepts like basic, applied, long-term, or short-term. The nature of strategic research is illustrated by the six fundamental and overarching goals defined for all federal science and technology investments (from the National Science and Technology Council [1], which reports to the President):
Examples of important science areas that would not fit into the ``strategic'' scheme include studying the origins of the universe (``big bang'' theory), proving Fermat's last theorem, determining whether P = NP, solving the four-color problem, and discovering the evolutionary origin of birds.
In my view, the current pressure to decrease -- or at least realign -- science research budgets is due almost entirely to the push to control budget deficits. Another intriguing but more nebulous theory also deserves passing mention: perhaps the science establishment is reaching its mature size, measured as a reasonable proportion of human activity. After four centuries of exponential growth, the number of scientists may need to stabilize. In this view the problem is not so much the lack of a reasonable amount of research money, but the ever-increasing flow of researchers competing for it.
Whichever theory is correct, the emphasis on strategic research appears inevitable; but as computer scientists we need not fear it. My thesis is that all subareas of computing research can prosper in an era of funding emphasis on strategic research. I argue for this thesis by examining science and engineering applications, though one could make equally compelling arguments based on other strategic areas. However, prosperity will not come automatically. The computing research community has been described as inward-looking [2], and many of its members -- perhaps the bulk -- have avoided applications entirely. This can be understood and justified by the fact that the young field of computer science needed time to firmly establish its own foundations. Those foundations are laid. Now computer scientists must become more outward-looking, and appreciate that computational science and other applications will involve, and greatly challenge, essentially all subareas of computing research.
Growth in computing power continues to be astounding and shows no signs of abating. That this growth is unprecedented in recorded history is illustrated in Table 1, where quantitative changes in computing are compared to changes in speed of transportation, maximum power of an explosion, construction, and education. The growth of computing power over the next two decades alone -- coming on top of five decades of already explosive growth -- will exceed the growth in transportation speed from the time when everyone walked to the supersonic jets projected for the early 2000s.
Table 1: Quantitative growth in five areas of human activity, from ancient times to 2010 (projected).

| Year | Transportation (miles per day) | Computation (multiplies per second) | Explosive power (tons of TNT) | Construction | Education (avg. years, US) |
|---|---|---|---|---|---|
| Ancient times | 40 | 0.005 | 0.0003 | Great Wall | None |
| 1890 | 200 | 0.04 | 0.5 | Suez Canal | 1 |
| 1950 | 6,000 | 40 | 1,000,000 | Fort Peck Dam | 8 |
| 1970 | 35,000 | 10,000,000 | 100,000,000 | Aswan Dam | 10 |
| 1990 | 35,000 | 5,000,000,000 | 100,000,000 | US highways | 12 |
| 2010 | 150,000 | 20,000,000,000,000 | 100,000,000 | US highways | ? |
The nature of this growth is illustrated by a look at the recent history of a simple application: compute where the cooling water pipes should go in an automobile engine block (see Figure 1). This real-world problem has been ``solved'' for many decades by building engine prototypes, running experiments, and using analog methods -- expensive and time-consuming approaches that do not optimize the design. A computer solution should not be fundamentally difficult. It involves one of the best-understood physical phenomena, heat flow. One just has to solve the Poisson problem for a complicated three-dimensional object. Methods and machines were available in 1940 that could, in principle, solve such a problem. Yet as a practical matter (or rather, an impractical one), I estimate that this computation for just one engine block would have cost the entire wealth of the United States in 1940. When I first encountered the problem in 1963 there had been enormous progress in both computing hardware and algorithms since 1940. Nevertheless, the computation was still not economically feasible. Today the computer time to solve it costs a few tens of dollars. Very significantly, algorithmic progress has been a larger factor in decreasing the cost than progress in the speed of computing hardware.
Figure 1: A typical automotive engine block. Three decades ago it
was economically infeasible to compute the best locations for the
cooling pipes. Today it is quick and cheap.
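The heat-flow computation behind this example reduces to solving the Poisson equation over the engine geometry. The sketch below shows the core of such a solve on an idealized two-dimensional cross-section, using the classic Jacobi iteration; the grid size, boundary temperatures, and geometry are hypothetical stand-ins, far simpler than a real three-dimensional engine block.

```python
import numpy as np

# Idealized 2-D cross-section: a square grid whose boundary rows are
# held at fixed temperatures (hypothetical values). Steady-state heat
# flow with source term f satisfies the Poisson equation.
n = 50                       # grid points per side (illustrative)
h = 1.0 / (n - 1)            # grid spacing
u = np.zeros((n, n))         # temperature field
u[0, :] = 100.0              # hot boundary (combustion side, hypothetical)
u[-1, :] = 20.0              # cool boundary (water jacket side, hypothetical)
f = np.zeros((n, n))         # no internal heat source in this sketch

# Jacobi iteration: replace each interior point by the average of its
# four neighbors plus the source contribution, until changes are tiny.
for _ in range(5000):
    u_new = u.copy()
    u_new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                u[1:-1, :-2] + u[1:-1, 2:] +
                                h * h * f[1:-1, 1:-1])
    if np.max(np.abs(u_new - u)) < 1e-6:
        u = u_new
        break
    u = u_new
```

Jacobi is the 1940s-era method; the ``algorithmic progress'' the text credits -- multigrid and related fast solvers -- cuts the work from many thousands of sweeps like these to the equivalent of a handful, which is a large part of why the computation became cheap.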
Though we can quibble about whether computers will become 1,000 or 5,000 times faster over the next 20 years, as an order-of-magnitude estimate we can expect with confidence that 10 megaflops of power with 10 megabytes of memory will cost $5 in 2015. Moreover, for every computer we are using now, we can expect that by 2015 there can be, for the same cost, 999 other machines working in tandem with it to provide better service.
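The order-of-magnitude reasoning above can be checked with a line of arithmetic: taking the conservative end of the range, a 1,000-fold improvement spread over 20 years implies a price-performance doubling time of about two years.

```python
import math

# Implied growth rate for a 1,000-fold improvement over 20 years
# (the conservative end of the paper's 1,000x-5,000x range).
improvement = 1000.0
years = 20.0
doubling_time = years / math.log2(improvement)   # about 2.0 years
annual_factor = improvement ** (1.0 / years)     # about 1.41x per year
```

At the aggressive end, 5,000-fold over 20 years, the doubling time shortens only to about 1.6 years -- which is why the text treats the two figures as a quibble.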
Three examples illustrate how these ``strategic'' computational science projects will invigorate all aspects of computer science.
The first application is the use of computer simulation for designing simple and complex physical mechanisms, an area also known as electronic prototyping. It is a near-term application that will be pervasive in industry and that involves essentially all of computer science. Figure 2 shows a collection of images from a current system devised to facilitate such work [3]. This is a prototype problem-solving environment [4] for applications based on partial differential equations. A typical problem is illustrated: optimizing the shape of the end of a piston rod. We want to reduce the size of the end piece while maintaining adequate strength.
Figure 2: Screen images from the PDELab system, showing steps in
optimizing the shape of a simple engine part. Improving this type
of design-and-optimization system demands advances in many areas of
computer science.
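The piston-rod task follows a simulate-evaluate-adjust loop: choose a shape parameter, simulate the part under load, check the strength constraint, and shrink the part as far as the constraint allows. The sketch below is only schematic -- the algebraic stress formula is a made-up surrogate for a real PDE solve, bisection stands in for a real optimizer, and nothing here reflects the actual PDELab interfaces.

```python
def max_stress(thickness):
    # Hypothetical surrogate: peak stress grows as the end piece thins.
    # In the real system this would be a full PDE simulation of the part.
    load = 12.0                      # applied load (arbitrary units)
    return load / thickness

def optimize(stress_limit=8.0, lo=0.5, hi=5.0, tol=1e-4):
    # Bisection on thickness: find the thinnest end piece whose
    # simulated peak stress still stays under the strength limit.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if max_stress(mid) <= stress_limit:
            hi = mid                 # feasible: try a thinner part
        else:
            lo = mid                 # infeasible: must stay thicker
    return hi

best = optimize()                    # near 1.5, where 12/t meets the limit 8
```

Each pass through the loop costs one simulation, which is why the text stresses that faster solvers and faster hardware both feed directly into better designs.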
At the top level, this application involves the following subareas of computing:
The substructure of this design application involves an even broader range of the subareas of computing.
As if the above were not enough to keep computing researchers busy, this application is harder yet because it involves:
Finally, this application involves the central, unsolved problem of computer science: how to engineer software effectively. Millions of lines of code will be involved, from multitudes of sources; still, the system must be reliable and efficient. We have a long way to go in understanding how to build such systems well.
Reality is the starting point of our first application. Simulation should compute what would happen in reality. The design/optimization computation is simulation; the second contact with reality comes when the object is manufactured and used.
Virtual reality is a form of simulation that is a direct extension of the design system. Whatever is involved in the virtual reality environment is simulated well enough that the human sensory inputs are (nearly) the same as for reality. As this methodology advances, it will involve all the human senses, not just vision. And, as more senses are involved, the simulation must be more complete. For example, walls in a building now are simulated by idealized planes with color and texture superimposed. When sound and touch are included, then the walls must simulate much of the physical structure of real walls. Virtual environments are similar to virtual reality in that everything must seem ``real'' to the humans in the environment. However, they may combine many actual objects with simulation. For example, in pilot-training simulators the cockpit is real but the motion, the rest of the airplane, and the views out the windows are simulated. A virtual environment may also be completely or partly artificial. For example, one could be that of a ``boat'' navigating through the bloodstream of a person or through the molten materials inside a blast furnace. A completely artificial virtual environment could be based on a pseudophysical representation of the flow of money and other financial instruments in the economy of a city. Then a person in the environment could directly ``observe'' these flows as the economy changes.
Within two decades virtual reality and environments will provide very high levels of realism using accurate and rather complete simulations of the physical (or pseudophysical) environment. This will draw on all the subareas of computing that the design of physical objects involved, and with more demanding performance for most of them.
Finally, consider robots. Robots with reasonable speech and vision capabilities will appear within 20 years. Their movement and touch capabilities will be useful. Their capabilities to access information and do computations will be enormous. None of these capabilities will be anywhere close to those of humans, but that is not necessary for them to be very useful. Recall that a frog's vision responds mainly to small moving objects; in spite of this primitive visual system, frogs get along quite well. Robots will also.
The computational problems for robots are much more difficult than for virtual reality. Compare the requirements for walking down a hall, by a robot and by a human within a virtual reality system. The principal activities are as follows:
These four areas of robot capability are currently in different states of development. The mechanical control and motion problems have been studied for a long time and much progress has been made. While there are still unsolved problems, this does not appear to be a major hurdle. The vision problems also have been studied for a long time, but with less progress. It is plausible that vision requires more computational power than previously expected, and that more rapid progress can be made with continued increases in this power. The auditory problems have also been studied for some time, primarily in the context of speech recognition. This effort has been much less than for vision but it appears that speech recognition, at least, is now feasible. It is no doubt a major challenge to extend this technology to general sound analysis. The problem of touch seems to be much less studied. We can, however, hope that useful robots can be made with primitive touch capabilities.
I have focused on the high-level computational problems of creating robots but, just as in the previous applications, there is a large, complex substructure based on the subareas of computing. A robot will be controlled by a network of powerful processors with a specialized operating system, databases, distributed control, knowledgebases, and semi-autonomous processes.

The focus on ``strategic'' research is not likely to be a passing fad, even though the final definition of the term is still unclear. Fortunately for computer scientists, almost all areas of computing research are applicable to strategic applications, often in the context of the extraordinarily varied universe of computational science and engineering. Many computing researchers may have to put some effort into establishing the connection of their work to the larger world, but this task is just part of living in an ever-changing, dynamic society. This position paper is based on [5].