“Collaboration has never been as close as it is today”
This week, the Swiss National Supercomputing Centre (CSCS) in Lugano celebrates its 25th anniversary. Christoph Schär, ETH professor at the Institute for Atmospheric and Climate Science, was there from the very start. In the following interview, he describes how supercomputers have developed over this period.
The Swiss National Supercomputing Centre (CSCS) is hosting a small party on the 19th of October to celebrate its 25th anniversary. Since it was founded in 1991, CSCS has made its supercomputers available to users working in research and industry who need access to massive computing power to solve complex tasks. CSCS is operated by ETH Zurich and is located in Lugano.
Christoph Schär, ETH professor and climate scientist, was one of the first to use the new centre. That was no coincidence: even at that early stage, climate scientists were working on numerical simulations to predict climate trends.
ETH News: Professor Schär, what memories do you have of the early days of CSCS?
Christoph Schär: Right from the start, we used the computing facilities at CSCS to develop and run our climate models. Prior to that, I used the ETH computing centre while I was working on my doctoral thesis. When CSCS was first established, users were initially a little sceptical. Even today, the decision to locate the centre in the canton of Ticino has to be seen primarily in the light of federal considerations. After all, a computing centre is not built around technology alone; it includes experts with valuable know-how, which should be accessible to the users. Initially, ETH's own computing centre seemed the more convenient way to achieve this.
Do these reservations still exist?
Our collaboration with CSCS as researchers has always been very good. In 1999 we completed our first joint project, the Mesoscale Alpine Programme, which enabled us to produce the first real-time weather forecasts with a resolution of three kilometres. Today we can run climate models over periods of decades at resolutions that back then were only feasible for short-term weather forecasts covering the next 18 hours.
Who used the CSCS infrastructure in those days, apart from climate scientists?
The astronomers were also there from the word go, along with researchers working in computational chemistry, mechanical engineering and solid-state physics.
Since the CSCS was founded 25 years ago, its computing power has increased by a factor of several million. At the same time, climate models have become increasingly complex and informative. What’s been the driving force behind this trend?
Eight years ago I was a member of the Scientific Advisory Committee of the European Centre for Medium-Range Weather Forecasts (ECMWF) in Reading. At that time, I was able to observe how the computer industry and the scientific community work hand in hand to some extent, but on occasion also tend to compete with each other. During the procurement process, for example, the source code of the ECMWF programmes was handed to the biggest hardware manufacturers, who tried to optimise it in a bid to win the contract. In the background, however, there was always the question of how far the programmes should be adapted to a specific computer architecture. After all, we cannot simply rewrite our source code every few years.
But surely that was precisely the aim of the “Platform for Advanced Scientific Computing” (PASC) and, before it, “High Performance and High Productivity Computing” (HP2C): adapting software to a new hardware architecture?
This has to do with the introduction of new computers with graphics processing units (GPUs). It did in fact require us to completely rewrite some of the programmes. We were encouraged to do so by the current Director of CSCS, Thomas Schulthess, who managed to convince us that such a move set a new strategic direction for the decades ahead. Now we have a long-term strategy for keeping the programmes fit to cope with different computer architectures. Collaboration with CSCS has never been as close as it is today.
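To give a flavour of what such a rewrite involves, here is a minimal sketch of a directive-based GPU port in C++. This is a hypothetical illustration, not the actual COSMO or climate-model code; the loop, names and sizes are invented for the example. Compiled with an ordinary compiler, the loop runs serially on the CPU; an OpenACC-capable compiler (e.g. nvc++ -acc) maps the same loop onto GPU threads:

// Hypothetical sketch of a directive-based GPU port (not production code).
// A plain C++ compiler ignores the pragma and runs the loop on the CPU;
// an OpenACC compiler offloads the loop and manages the data transfers.
#include <cstdio>
#include <vector>

int main() {
    const int n = 1 << 20;        // number of grid points (illustrative)
    const float alpha = 0.1f;     // diffusion coefficient (illustrative)
    std::vector<float> in(n, 1.0f), out(n, 0.0f);
    float* pin = in.data();
    float* pout = out.data();

    // One sweep of a simple 1-D diffusion stencil over the grid.
    #pragma acc parallel loop copyin(pin[0:n]) copy(pout[0:n])
    for (int i = 1; i < n - 1; ++i) {
        pout[i] = pin[i] + alpha * (pin[i - 1] - 2.0f * pin[i] + pin[i + 1]);
    }

    printf("stencil sweep completed on %d points\n", n);
    return 0;
}

In reality, porting a full model is far more invasive than annotating one loop: entire components have to be restructured around the GPU's memory, which is what made the multi-year effort described below necessary.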
How have PASC and HP2C influenced your research?
Adapting the programmes involved a huge amount of work; it would have been impossible for a university research group to complete this task on its own. Since then, we have worked very closely with CSCS, MeteoSwiss and the Center for Climate Systems Modeling (C2SM). Oliver Fuhrer from MeteoSwiss took charge of the project. Over a period of around five years, we collectively invested more than 15 person-years of work. We are now also collaborating with ETH Zurich's Department of Computer Science as part of a Sinergia project.
What would climate research look like if there were no supercomputers?
Simulation has become the third main pillar of scientific research, alongside theory and experiment. This is particularly true in the climate sciences, as we are unable to perform real-life experiments. The hazards of “experimenting” with the Earth's climate system, evident in the ozone hole and in climate change, show that such experimentation is not a good idea. So simulations are definitely key to our research.
What have been the most crucial developments over the past 25 years, in your opinion?
One of the quantum leaps for us was when we were able to explicitly resolve storms and rain showers for the first time in our climate models. Another important step was the integration of ocean models into the atmospheric model. Nowadays we talk of Earth system models which integrate the oceans, sea ice and land areas.
… and what about supercomputers?
The shift from what is known as a shared-memory paradigm to a distributed-memory paradigm, and the question of how memory is organised on a chip, show that the arithmetic operations themselves are no longer the biggest problem. The most important aspect now is how computer memory is organised and how data is distributed to and moved between the processors.
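To illustrate what the distributed-memory paradigm means in practice, here is a minimal, hypothetical MPI sketch in C++ (names and sizes are invented for the example; this is not code from any production model). Each process owns one slice of the grid plus a one-cell "halo" on each side, and must explicitly exchange boundary values with its neighbours before it can apply a stencil; on large machines it is this data movement, not the arithmetic, that dominates the cost:

// Hypothetical sketch: a 1-D halo exchange under the distributed-memory
// paradigm. Each MPI rank owns nlocal grid points plus two halo cells.
// Build with an MPI toolchain, e.g.: mpicxx halo.cpp && mpirun -n 4 ./a.out
#include <mpi.h>
#include <cstdio>
#include <vector>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const int nlocal = 1024;                  // interior points per rank
    std::vector<double> u(nlocal + 2, rank);  // index 0 and nlocal+1 are halos

    // Neighbours in the 1-D decomposition; MPI_PROC_NULL at the domain edges.
    int left  = (rank > 0)        ? rank - 1 : MPI_PROC_NULL;
    int right = (rank < size - 1) ? rank + 1 : MPI_PROC_NULL;

    // Send my left boundary value to the left neighbour while receiving the
    // right neighbour's boundary value into my right halo, and vice versa.
    MPI_Sendrecv(&u[1], 1, MPI_DOUBLE, left, 0,
                 &u[nlocal + 1], 1, MPI_DOUBLE, right, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    MPI_Sendrecv(&u[nlocal], 1, MPI_DOUBLE, right, 1,
                 &u[0], 1, MPI_DOUBLE, left, 1,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    // Only now can each rank update its interior points with a stencil.
    if (rank == 0) printf("halo exchange completed on %d ranks\n", size);
    MPI_Finalize();
    return 0;
}

Under the shared-memory paradigm, by contrast, every processor could simply read its neighbour's values directly; the explicit exchange above is the price of scaling across thousands of nodes.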
By the end of this decade the current petaflop era, in which supercomputers perform on the order of a quadrillion (10^15) operations per second, is set to give way to the exascale era – a thousandfold step up, to a billion billion (10^18) calculations per second. What benefits do you hope this will bring?
One of the biggest goals of climate science is to quantify the uncertainties related to heavy rainstorms and clouds by simulating these processes as explicitly as possible. The shift towards exascale computing is essential for this.
And what would you like to see in future from CSCS?
Apart from the service and operation of the supercomputers, we need the expertise to ensure the ongoing development of high-performance computing and to help shape the dominant architectures of the future. This forms the basis of a long-term, constantly evolving strategy that will hopefully serve us well for several decades. Computing centres are vital tools for climate scientists: for us, they play a similar role to the one CERN plays for particle physicists.
Impressions of the 25th anniversary celebration
Former Federal Councillor Flavio Cotti with State Councillors Manuele Bertoli, Christian Vitta and Paolo Beltraminelli (from left). State Councillor Christian Vitta guiding his colleague Manuele Bertoli on a stroll through CSCS.
Former State Councillor Fulvio Caccia. (all photographs: Marco Abram)