TOKYO, Nov 24, 2020 – (JCN Newswire) – A Japanese research group consisting of the National Institute for Environmental Studies, RIKEN, Fujitsu Limited, Metro, Inc., and the University of Tokyo has carried out an unprecedented, large-scale coupled calculation of global weather simulation and data assimilation, on a 3.5-km horizontal mesh with 1,024 ensemble members, using the supercomputer “Fugaku” installed at the RIKEN Center for Computational Science in Kobe, Japan. The scale of the calculation is approximately 500 times larger than the daily ensemble-based data assimilation for weather forecasts performed by meteorological organizations around the world. The result demonstrates the high overall performance of Fugaku and shows that the collaborative development of the latest supercomputer, simulation models, and data assimilation systems can realize a much larger-scale weather forecasting system. It also offers the potential to significantly improve the accuracy of future weather forecasts and climate change predictions.
The research group presented these results at SC20 (held online from November 9 to 19, 2020), an international conference on supercomputing, where the work was selected as a finalist for the ACM Gordon Bell Prize, one of the most prestigious awards in computational science. The ACM Gordon Bell Prize is jointly sponsored by the ACM, an international academic society in computer science, and the IEEE Computer Society, and recognizes the year's most outstanding achievement in applying high-performance parallel computing to science and engineering.
1. Research Background
Meteorological information is indispensable to our daily lives. Almost every year, damage from heavy rain and tropical cyclones (typhoons) is reported in Japan and around the world, and ever more accurate weather forecasts remain crucial to protecting lives and property from such meteorological disasters. Modern weather forecasting rests on meteorological observations from around the world, numerical simulations, and data assimilation, which uses mathematical methods to combine observational data and simulations. To further improve the accuracy of weather forecasts, it is necessary to use observation data more effectively, to run numerical simulations on a finer mesh, and to perform more ensemble calculations. However, all of these require more computation, so the calculations that can be carried out have been constrained by limited computer resources. Beyond the computation itself, the volume of data generated by the simulations is also exploding, making it challenging for researchers to transfer the results within a realistic timeframe.
The design and development of the supercomputer Fugaku (Fig. 1), Japan’s new flagship machine, commenced in 2014 as the successor to the K computer. In June and November 2020, Fugaku won first place, twice in a row, in four categories of the world rankings for a variety of computing performance metrics. Beyond such benchmark recognition, however, the real goal of Fugaku has been to achieve effective performance up to 100 times that of the K computer when executing application programs actually used in various research fields. To attain this goal, collaborative design (co-design) between the supercomputer system and scientific computing software was carried out. Technological trends in supercomputing change rapidly, and the execution speed of existing software does not necessarily increase (and can even decrease) on a new supercomputer. The global high-resolution atmospheric model NICAM and the Local Ensemble Transform Kalman Filter (LETKF) used in this study belong to the group of representative applications selected to advance this co-design. Based on these applications, the research group has been investigating which algorithms and optimization techniques should be selected to achieve high performance on the latest supercomputers.
This research, the culmination of these co-design achievements, realized the largest feasible experiment of high-resolution, large-ensemble data assimilation on Fugaku.
2. Various Initiatives to Make Large-Scale Experiments Possible
In ensemble data assimilation, multiple simulations (an ensemble) with slightly different results are performed, their results are compared with actual meteorological observations, and the trajectory of the simulation is corrected to improve accuracy. This makes it possible to start forecasts from conditions closer to the real atmosphere. In this study, we performed calculations with 1,024 ensemble members, more than 30 times the number currently used by the Japan Meteorological Agency. Increasing the number of ensemble members enables more effective use of the information in observation data, offering the potential to significantly improve the accuracy of actual weather forecasts. Figure 2 shows the flow of the calculation and the amount of data input and output at each step. The first step is to run 256 simulations, repeated four times to obtain 1,024 ensemble results. Each simulation starts from a different initial value, and together they produce a total of 1.4 petabytes of data in one million files. In the data assimilation part that follows, all the output data are read at once and analyzed. The need to read and write this data, and to repeatedly switch between the simulation and data assimilation parts, has a significant impact on the overall computation time.
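As a rough sketch of this forecast-and-analysis cycle, the following toy example reproduces the batched structure of Fig. 2: four rounds of 256 forecasts followed by one analysis over all 1,024 members. The function names, the single scalar state, and the numerical values are purely illustrative stand-ins for NICAM and the LETKF, not the project's actual code.

```python
import random

N_BATCH, BATCH_SIZE = 4, 256          # 4 rounds of 256 runs -> 1,024 members

def forecast(state, seed):
    """Stand-in for one NICAM simulation: evolve a perturbed state."""
    rng = random.Random(seed)
    return state + rng.gauss(0.0, 0.1)

def assimilate(members, observation, obs_err=0.2):
    """Stand-in for the LETKF analysis: nudge every member toward the
    observation, weighted by ensemble spread vs. observation error."""
    mean = sum(members) / len(members)
    var = sum((m - mean) ** 2 for m in members) / (len(members) - 1)
    gain = var / (var + obs_err ** 2)          # scalar Kalman gain
    return [m + gain * (observation - m) for m in members]

# Forecast step: batches of simulations, as in Fig. 2
members = []
for batch in range(N_BATCH):
    members += [forecast(1.0, seed=batch * BATCH_SIZE + i)
                for i in range(BATCH_SIZE)]

# Analysis step: all outputs are read and assimilated at once
analysis = assimilate(members, observation=1.3)
print(len(analysis))   # 1024
```

The analysis pulls every member toward the observation and shrinks the ensemble spread, which is the essence of the accuracy-enhancing correction described above.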
In this study, we first performed a “co-design” between the atmospheric simulation model NICAM and the LETKF data assimilation system, which had been developed separately. We modified the applications so that as many files as possible are read and written simultaneously, minimized the amount of data transferred, and used the fast disks (SSDs) located near each compute node. This optimization dramatically reduced the time needed to read and write files when many nodes are used at the same time. The next step was to reduce the precision of the real numbers. Computers represent numbers in binary, and real numbers with more significant digits require more zeros and ones. By lowering the precision of real numbers, it becomes possible both to shrink the data that must be moved and to increase the number of values the computer processes in a single instruction. On the other hand, since simulation results can deteriorate when precision is reduced, we used idealized experiments to identify the calculation sections where lower precision might degrade the results.
This is also a “co-design” between scientific performance requirements and computational performance requirements.
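The memory-traffic side of this precision trade-off can be shown in a few lines of generic code (unrelated to the NICAM/LETKF implementation): storing the same value in 4 bytes instead of 8 halves the data that must be moved, at the cost of significant digits.

```python
import struct

x = 1.0 / 3.0                       # a double-precision (8-byte) real

# Pack the same value as 8-byte and as 4-byte floating point
as_double = struct.pack("<d", x)    # 8 bytes to move
as_single = struct.pack("<f", x)    # 4 bytes to move: half the traffic
print(len(as_double), len(as_single))   # 8 4

# The cost: a single-precision round trip loses significant digits
x32 = struct.unpack("<f", as_single)[0]
print(0 < abs(x - x32) < 1e-7)          # True: small but nonzero error
```

Deciding where such round-off is tolerable is exactly the scientific-versus-computational negotiation described above.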
In the area of system and application co-design, we worked together with Fujitsu to improve the compiler, the system software that translates program code into machine language, so that some of the faster calculation patterns are applied automatically. We also developed a method called the “Household Account Method” that efficiently finds the many small sections of wasted time hidden in 100,000 lines of program code, and we improved the numerical calculation library to suit the needs of the project. The software and knowledge gained from these R&D efforts can be applied to many other types of software.
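The details of the “Household Account Method” are not given in this release, but its basic idea, keeping a ledger of the time spent in each small section of code so the entries can be ranked by total cost, can be sketched as follows. This is an illustrative toy with invented names, not the tool developed in the project.

```python
import time
from collections import defaultdict

ledger = defaultdict(float)    # section name -> accumulated seconds

class book:
    """Account-book entry: times a named section of code."""
    def __init__(self, name):
        self.name = name
    def __enter__(self):
        self.t0 = time.perf_counter()
    def __exit__(self, *exc):
        ledger[self.name] += time.perf_counter() - self.t0

for _ in range(3):
    with book("setup"):
        sum(range(10_000))
    with book("solve"):
        sum(i * i for i in range(100_000))

# Sections sorted by total cost reveal where small wasted times add up
for name, seconds in sorted(ledger.items(), key=lambda kv: -kv[1]):
    print(name, round(seconds, 4))
```

Ranking the accumulated entries makes many small, individually invisible costs show up as a large total, which is the point of the bookkeeping approach.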
3. Research Outcomes
As an example of the results, the calculation time of the data assimilation part is summarized in Fig. 3. The horizontal axis represents the number of ensemble members and the vertical axis the elapsed time; the dotted lines represent double-precision (8-byte) reals and the solid lines mostly single-precision (4-byte) reals. Each color shows the result of changing the horizontal resolution of the simulation model to 56 km, 14 km, or 3.5 km. The figure shows that increasing the ensemble size increases the calculation time, but the breakdown reveals that the time spent on file input/output hardly grows. Calculations using single-precision real numbers were faster than those using double precision, and the growth in calculation time with increasing ensemble size was suppressed. For the simulation part as well, single-precision calculations ran 1.6 times faster than double-precision ones. The largest calculation, on a 3.5-km mesh with 1,024 members, utilized 131,072 nodes (6,291,456 computational cores), or 82% of the total number of nodes of Fugaku, yielding 29 petaflops for the simulation part and 79 petaflops for the data assimilation part. The entire series of calculations is estimated to complete in less than four hours.
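These headline figures are internally consistent, given that each Fugaku node carries 48 compute cores and the full machine comprises 158,976 nodes:

```python
nodes_used = 131_072
cores_per_node = 48        # A64FX compute cores per Fugaku node
total_nodes = 158_976      # full Fugaku system

print(nodes_used * cores_per_node)              # 6291456 cores
print(round(100 * nodes_used / total_nodes))    # 82 (% of the machine)
```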
The results of this research show that a supercomputer with balanced calculation and data transfer performance, such as Fugaku, can increase the scale and speed of complicated software such as numerical weather prediction systems, which involve far more than simple fluid calculations. Because these findings reflect not only the speed of simple arithmetic operations but Fugaku's true overall performance, they are expected to lead to results in various scientific fields. In the field of meteorology, this calculation is a major accomplishment, paving the way toward more accurate weather forecasts in the future. Data assimilation is used not only for weather forecasting but also to improve the performance of simulation models themselves and to estimate the emissions and removals of greenhouse gases and air pollutants. We plan to further expand the use of the programs improved through this study and apply them in research on weather, climate, and the global environment.
4. Research Support
This work was supported by MEXT as “Program for Promoting Researches on the Supercomputer Fugaku” (Large Ensemble Atmospheric and Environmental Prediction for Disaster Prevention and Mitigation) and used computational resources of the supercomputer Fugaku.
5. Published Paper
H. Yashiro, K. Terasaki, Y. Kawai, S. Kudo, T. Miyoshi, T. Imamura, K. Minami, H. Inoue, T. Nishiki, T. Saji, M. Satoh, and H. Tomita, “A 1024-Member Ensemble Data Assimilation with 3.5-Km Mesh Global Weather Simulations,” in SC20: International Conference for High Performance Computing, Networking, Storage and Analysis (SC), Atlanta, GA, US, 2020, pp. 1-10. doi: 10.1109/SC41405.2020.00005
1. K computer
A supercomputer with a computational speed of 10 petaflops that was jointly developed by RIKEN and Fujitsu and began operations in September 2012 as the core system of the “Building an Innovative High-Performance Computing Infrastructure (HPCI)” program promoted by the Ministry of Education, Culture, Sports, Science and Technology. It was decommissioned in August 2019.
2. Supercomputer Fugaku
Fugaku is the successor to the K computer. It was developed with the aim of contributing to Japan’s growth and producing world-leading results by solving social and scientific problems in the 2020s, and it is designed to deliver the world’s highest level of supercomputer performance in terms of power efficiency, computational capability, user-friendliness and ease of use, the production of groundbreaking results, and comprehensive capabilities for big data and AI acceleration. Full operation is scheduled to begin in fiscal 2021.
“Fugaku” is another name for Mt. Fuji – the height of Mt. Fuji represents the world-class performance of the supercomputer, while the wide base of the famous mountain represents the broad range of applications and users of the supercomputer. In addition, RIKEN selected “Mt. Fuji” because it was well known overseas, and because of the recent trend of naming supercomputers after mountains.
3. NICAM
NICAM stands for Nonhydrostatic ICosahedral Atmospheric Model, a weather and climate model that can simulate the global atmosphere at high resolution. In conventional global climate models, cloud and precipitation processes require assumptions because of insufficient horizontal resolution, which is a major source of uncertainty. NICAM realizes highly accurate global simulations by explicitly representing the generation and behavior of clouds.
4. LETKF
LETKF stands for Local Ensemble Transform Kalman Filter, a practical data assimilation method with excellent parallel computing efficiency. It was first developed at the University of Maryland and has been implemented in numerical weather forecasting systems around the world. Data assimilation is an interdisciplinary science, built on statistics and mathematics, that connects simulations to the real world and plays a fundamental role in determining the accuracy of numerical weather predictions.
5. Ensemble calculation
Ensemble is a French word meaning “together”. In weather simulations, small differences in initial values and boundary conditions amplify over time, producing differences in the forecast results. Therefore, many calculations (an ensemble calculation) are performed in which the initial values and boundary conditions are intentionally varied slightly, yielding statistically more reliable results. In ensemble data assimilation, a higher level of data assimilation is realized by using the ensemble to estimate the uncertainty of the simulation, which fluctuates from moment to moment.
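This amplification of tiny initial differences can be demonstrated with even a minimal chaotic model. The following sketch, purely illustrative and unrelated to NICAM, integrates the classic Lorenz-63 system twice from initial states differing by one part in a million:

```python
def lorenz_step(state, dt=0.005, s=10.0, r=28.0, b=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = state
    return (x + dt * s * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

a = (1.0, 1.0, 1.0)
b_ = (1.0 + 1e-6, 1.0, 1.0)    # perturbed by one part in a million

max_dist = 0.0
for _ in range(4000):          # ~20 model time units
    a, b_ = lorenz_step(a), lorenz_step(b_)
    d = sum((p - q) ** 2 for p, q in zip(a, b_)) ** 0.5
    max_dist = max(max_dist, d)

# The two trajectories become macroscopically different:
print(max_dist > 1.0)   # True
```

Because no single trajectory can be trusted far ahead, many deliberately perturbed runs are averaged and their spread is used as an uncertainty estimate, which is exactly the role of the ensemble.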
7. Research Group
National Institute for Environmental Studies
Satellite Remote Sensing Section, Center for Global Environmental Research
Senior Researcher Hisashi Yashiro
RIKEN Center for Computational Science
Computational Climate Science Research Team
Team Leader Hirofumi Tomita
Postdoctoral Researcher Yuta Kawai
Data Assimilation Research Team
Team Leader Takemasa Miyoshi
Researcher Koji Terasaki
Large-Scale Parallel Numerical Computing Technology Research Team
Team Leader Toshiyuki Imamura
Postdoctoral Researcher Shuhei Kudo
Application Tuning Development Unit
Unit Leader Kazuo Minami
Fujitsu Limited
Technical Computing Business Unit, Computational Science Division
Senior Manager Hikaru Inoue
Technical Service Division, High Performance Computing Solutions Department
Atmosphere and Ocean Research Institute, The University of Tokyo
Professor Masaki Satoh
About National Institute for Environmental Studies (NIES)
NIES was established in 1974 as Japan’s leading institute for comprehensive research in environmental science and technology. Experts in diverse fields, ranging from the pure sciences, engineering, agricultural sciences, medical sciences, pharmaceutical sciences, and fishery sciences to law and economics, cooperate to develop new methodologies and foster pioneering research that will help improve environmental conditions.
About RIKEN Center for Computational Science
RIKEN is Japan’s largest comprehensive research institution renowned for high-quality research in a diverse range of scientific disciplines. Founded in 1917 as a private research foundation in Tokyo, RIKEN has grown rapidly in size and scope, today encompassing a network of world-class research centers and institutes across Japan including the RIKEN Center for Computational Science (R-CCS), the home of the supercomputer Fugaku. As the leadership center of high-performance computing, the R-CCS explores the “Science of computing, by computing, and for computing.” The outcomes of the exploration – the technologies such as open source software – are its core competence. The R-CCS strives to enhance the core competence and to promote the technologies throughout the world.
About Fujitsu
Fujitsu is the leading Japanese information and communication technology (ICT) company offering a full range of technology products, solutions and services. Approximately 130,000 Fujitsu people support customers in more than 100 countries. We use our experience and the power of ICT to shape the future of society with our customers. Fujitsu Limited (TSE:6702) reported consolidated revenues of 3.9 trillion yen (US$35 billion) for the fiscal year ended March 31, 2020. For more information, please see www.fujitsu.com.
About the Atmosphere and Ocean Research Institute
The Atmosphere and Ocean Research Institute (AORI), the University of Tokyo is one of the world’s leading research institutes in oceanic and atmospheric science, and is authorized as a Joint Usage/Research Center for Atmosphere and Ocean Science by the Ministry of Education, Culture, Sports, Science and Technology, Japan. AORI promotes cooperative research in a variety of fields including using the research vessels “Hakuho-maru” and “Shinsei-maru”, onshore laboratory facilities, climate system models and so on.
Copyright 2020 JCN Newswire. All rights reserved. www.jcnnewswire.com