Blue Matter: Scaling of N-body simulations to one atom per node

Cited by: 13
Authors
Fitch, B. G. [1 ]
Rayshubskiy, A. [1 ]
Eleftheriou, M. [1 ]
Ward, T. J. C. [2 ]
Giampapa, M. E. [1 ]
Pitman, M. C. [1 ]
Pitera, J. W. [3 ]
Swope, W. C. [1 ,3 ]
Germain, R. S. [1 ]
Affiliations
[1] IBM Corp, Div Res, Thomas J Watson Res Ctr, Yorktown Hts, NY 10598 USA
[2] IBM United Kingdom Ltd, Winchester SO21 2JN, Hants, England
[3] IBM Corp, Almaden Res Ctr, Div Res, San Jose, CA 95120 USA
DOI
10.1147/rd.521.0145
Chinese Library Classification
TP3 [Computing technology; computer technology]
Discipline code
0812
Abstract
N-body simulations present some of the most interesting challenges in the area of massively parallel computing, especially when the object is to improve the time to solution for a fixed-size problem. The Blue Matter molecular simulation framework was developed specifically to address these challenges, to explore programming models for massively parallel machine architectures in a concrete context, and to support the scientific goals of the IBM Blue Gene® Project. This paper reviews the key issues involved in achieving ultrastrong scaling of methodologically correct biomolecular simulations, particularly the treatment of the long-range electrostatic forces present in simulations of proteins in water and membranes. Blue Matter computes these forces using the particle-particle particle-mesh Ewald (P3ME) method, which breaks the problem up into two pieces, one that requires the use of three-dimensional fast Fourier transforms with global data dependencies and another that involves computing interactions between pairs of particles within a cutoff distance. We summarize our exploration of the parallel decompositions used to compute these finite-ranged interactions, describe some of the implementation details involved in these decompositions, and present the evolution of strong-scaling performance achieved over the course of this exploration, along with evidence for the quality of simulation achieved.
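The abstract describes the P3ME split: the long-range piece is handled on a mesh with 3D FFTs, while the real-space piece sums screened pair interactions inside a cutoff. As a minimal illustrative sketch (not Blue Matter's implementation), the real-space particle-particle term of an Ewald-type sum can be written as pair energies of the form q_i q_j erfc(αr)/r under minimum-image periodic boundaries; the function name, units (Gaussian), and cubic-box assumption here are hypothetical simplifications:

```python
import math

def shortrange_ewald_energy(positions, charges, box, alpha, rcut):
    """Real-space (particle-particle) part of an Ewald/P3ME sum.

    Sums q_i * q_j * erfc(alpha * r) / r over pairs closer than rcut,
    using the minimum-image convention in a cubic box of side `box`.
    The complementary long-range part would be evaluated on a mesh
    via 3D FFTs, as the paper describes.
    """
    n = len(positions)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = [positions[i][k] - positions[j][k] for k in range(3)]
            # Minimum-image convention: wrap each component into [-box/2, box/2).
            d = [x - box * round(x / box) for x in d]
            r = math.sqrt(sum(x * x for x in d))
            if r < rcut:
                energy += charges[i] * charges[j] * math.erfc(alpha * r) / r
    return energy
```

Because erfc(αr) decays rapidly, this sum converges within a short cutoff, which is what makes spatial decompositions of the pair computation (the subject of the paper's exploration) effective.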
Pages: 145-158
Page count: 14