High Performance Computing, Computer Simulation, and Theoretical Physics: Evolution or Revolution?

By Prof. Giovanni Ciccotti, University of Rome La Sapienza

Numerical physics, i.e. numerical calculation serving the needs of traditional theoretical physics, has existed at least since the time of Galileo, and probably long before. Computer Simulation proper, which began with the solution of problems in Statistical Mechanics, has existed only since the end of the Second World War. It rests on the availability of computation speeds far beyond human capability, beyond even the speeds reachable by exploiting teamwork.

Computer simulation in Theoretical Physics is based on the idea that we can solve, by brute computing power, models of matter built on the exact laws of physics. In the case of Condensed Matter, that amounts to solving the Schrödinger equation, or justified approximations to it (often classical mechanics with a suitable model for the interactions is a sufficient substitute), and to using the mechanical information so obtained to compute the statistical mechanical properties of the system. By the same token, Computer Simulation has established itself as the key tool of Theoretical Physics, of which it has become an integral part. Let us refer to the result as Modern Theoretical Physics. It is no exaggeration to say that this process – still not completely understood by some traditional practitioners – has been much more a revolution than a simple evolution of the discipline. So much so that today the predictive power of Physics has extended far beyond its historical boundaries, invading (and in part being reshaped by) not only Chemistry and its related disciplines, but also Biology, Materials Science, Geosciences, etc.: from simple fluids to the human immune response!
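To fix ideas, here is a minimal sketch of the loop just described, classical Molecular Dynamics in its simplest form: Newton's equations for a Lennard-Jones fluid are integrated with the velocity-Verlet algorithm, and the time average of the kinetic energy yields a statistical-mechanical observable, the temperature, through equipartition. Everything is in reduced units, and the particle number, density and time step are illustrative choices, not values taken from any specific study.

```python
import numpy as np

def lj_forces(pos, box):
    """Lennard-Jones forces (sigma = eps = 1) with minimum-image periodic boundaries."""
    f = np.zeros_like(pos)
    for i in range(len(pos) - 1):
        d = pos[i] - pos[i + 1:]             # displacements to all later particles
        d -= box * np.round(d / box)         # minimum-image convention
        r2 = np.sum(d * d, axis=1)
        s6 = (1.0 / r2) ** 3
        fij = (24.0 * (2.0 * s6 ** 2 - s6) / r2)[:, None] * d
        f[i] += fij.sum(axis=0)              # Newton's third law in action
        f[i + 1:] -= fij
    return f

def run_md(box=4.0, dt=0.005, steps=4000, rng=np.random.default_rng(0)):
    """Velocity-Verlet MD of 27 particles; returns the time-averaged temperature."""
    g = 3                                    # 3 x 3 x 3 simple cubic start
    pos = np.array([[i, j, k] for i in range(g) for j in range(g) for k in range(g)],
                   dtype=float) * box / g
    vel = rng.normal(scale=0.5, size=pos.shape)
    vel -= vel.mean(axis=0)                  # remove centre-of-mass drift
    f = lj_forces(pos, box)
    kin = []
    for step in range(steps):
        vel += 0.5 * dt * f                  # half kick (unit masses)
        pos = (pos + dt * vel) % box         # drift, wrapped into the box
        f = lj_forces(pos, box)
        vel += 0.5 * dt * f                  # second half kick
        if step >= steps // 2:               # crude equilibration discard
            kin.append(0.5 * np.sum(vel * vel))
    # equipartition: <K> = (3N/2) kT, hence kT = 2<K> / 3N in reduced units
    return 2.0 * np.mean(kin) / (3.0 * len(pos))

print("estimated reduced temperature:", run_md())
```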

The fundamental tools of this approach (Monte Carlo and Molecular Dynamics, classical and then ab initio, Path Integrals, etc., together with an entire arsenal of statistical tools) have been progressively developed, but they depend in a truly dramatic way on the availability of potent algorithms and of computer power. Indeed, it is well known that the progress of Computer Simulation is due to the combination of efficient algorithms, introduced to confront specific scientific challenges, and the exponential growth of computer power. The latter, in turn, is the result not only of increased signal speed within processing units, but also of the development of more sophisticated architectures.
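The oldest of these tools, Metropolis Monte Carlo, can be conveyed in a few lines. The sketch below samples the 2D Ising model and estimates the mean absolute magnetisation per spin, a typical statistical-mechanical observable obtained by sampling rather than by integrating equations of motion; the lattice size, inverse temperature and number of sweeps are illustrative assumptions.

```python
import numpy as np

def metropolis_ising(L=16, beta=0.4, sweeps=2000, rng=np.random.default_rng(1)):
    """Metropolis Monte Carlo for the 2D Ising model on an L x L torus."""
    s = rng.choice([-1, 1], size=(L, L))
    mags = []
    for sweep in range(sweeps):
        for _ in range(L * L):               # one sweep = L*L attempted spin flips
            i, j = rng.integers(L, size=2)
            nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                  + s[i, (j + 1) % L] + s[i, (j - 1) % L])
            dE = 2 * s[i, j] * nb            # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[i, j] = -s[i, j]           # accept the move
        if sweep >= sweeps // 2:             # discard burn-in, then measure
            mags.append(abs(s.mean()))
    return float(np.mean(mags))

print("mean |magnetisation| per spin:", metropolis_ising())
```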

From the point of view of scientific progress, it is easy to show that the development of new algorithms has in the past been far more effective in advancing the field than the bare increase in computer power. However, the situation is slowly changing, and new algorithms can no longer be developed without help from people who develop software exploiting the new possibilities offered by modern computers: vectorization and parallelization are two typical aspects of this new trend.
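Vectorization, to take one of the two, means expressing an algorithm as operations on whole arrays rather than on individual numbers, so that the software stack and the hardware can perform many arithmetic operations at once. The sketch below contrasts the two styles on an illustrative pairwise Lennard-Jones energy sum (all parameters are invented for the demonstration); the two functions compute the same quantity, but on typical hardware the vectorized version is faster by one to two orders of magnitude.

```python
import time
import numpy as np

def pair_energy_loop(pos):
    """Naive double loop over particle pairs: scalar-style arithmetic."""
    n, e = len(pos), 0.0
    for i in range(n - 1):
        for j in range(i + 1, n):
            r2 = np.sum((pos[i] - pos[j]) ** 2)
            e += 4.0 * ((1.0 / r2) ** 6 - (1.0 / r2) ** 3)  # LJ, sigma = eps = 1
    return e

def pair_energy_vec(pos):
    """The same sum expressed on whole arrays, so it can be vectorized."""
    d = pos[:, None, :] - pos[None, :, :]    # all pair displacement vectors
    r2 = np.einsum('ijk,ijk->ij', d, d)      # matrix of squared distances
    iu = np.triu_indices(len(pos), k=1)      # count each pair once
    inv6 = (1.0 / r2[iu]) ** 3
    return float(np.sum(4.0 * (inv6 ** 2 - inv6)))

pos = np.random.default_rng(2).random((500, 3)) * 10.0
for f in (pair_energy_loop, pair_energy_vec):
    t0 = time.perf_counter()
    e = f(pos)
    print(f"{f.__name__}: E = {e:.4f}, {time.perf_counter() - t0:.3f} s")
```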

Associated with this is the birth of a new profession: scientific software engineering. Scientific progress in computer simulation will slow down if the help that can come from new software tools is neglected. Awareness of this change is already present in the US but is slow in penetrating European funding agencies. The damage induced by this delay can become very serious indeed.

At the same time, progress in computing power can saturate if not driven by the challenges offered by the computational sciences, be it the fast production of large-scale dynamic data, the retrieval of large masses of stored data, or their handling and high-level analysis. This intricate new field is what we encompass when speaking of High Performance Computing, as proposed and apparently developed today in the European national-level computing centers, and stimulated (although with some bias: too much attention to exascale power and too little to the scientific targets…) by the more than welcome European PRACE project, which distributes computer time and some software assistance to high-level, computationally intensive projects.

It is a serious misfortune that the collaboration between computational and computer scientists is still in its infancy. Only a few national authorities have been able to start a close collaboration between scientists, software engineers and, possibly, fully fledged computer scientists. Jülich is certainly one of these smart enterprises, but much remains to be done at national and European levels, even in Germany. To understand the reasons for this delay in joining forces, let us look at the situation a bit more closely.

Computational scientists are normally under pressure to produce good scientific results, and in particular publications. The details of the way in which these results have been obtained using computers fall well outside their focus. At the same time, the management of computer centers is keen to use scientists as high-level testers of the best computational facilities, largely disregarding the scientific value of their output.

The result is a confused and confusing development which does not help the efficiency of the entire process, with the consequence of wasted investments and progress slower than is desirable and possible.
To remedy this situation, one new element should be added or created: a new generation of scientifically trained software engineers able to interface constructively with computational scientists and computer scientists, not to speak of the technological environment. This profession is already largely accepted in the US and, as far as I know, in Japan, but it faces great difficulties in old Europe.

The new profile, despite being a necessity for the progress of computational science, is ignored in academic circles and, with few exceptions, is not found in computer centers even at high levels. Strong action is needed to reverse this negative trend. UCD and ICHEC in Ireland, as coordinators of the European project E-CAM, together with CECAM, the historical European hub of computational scientists in soft and hard matter, have taken the courageous initiative of opening a European collaboration for software development. This is a very wise step in the right direction, and we can only wish it full success. It is to be hoped that complementary players – PRACE, scientifically oriented HPC centers and academic institutions – will, sooner rather than later, give it due and full credit.
