Going through Rough Times: from Non-Equilibrium Surface Growth to Algorithmic Scalability



G. Korniss,1 M.A. Novotny,2 P.A. Rikvold,3 H. Guclu,1 and Z. Toroczkai4

1 Department of Physics, Applied Physics, and Astronomy, Rensselaer Polytechnic Institute, 110 8th Street, Troy, NY 12180-3590 USA
2 Department of Physics and Astronomy and Engineering Research Center, Mississippi State University, Mississippi State, MS 39762-5167 USA
3 School of Computational Science and Information Technology, Department of Physics, and Center for Materials Research and Technology, Florida State University, Tallahassee, FL 32306 USA
4 Theoretical Division and Center for Nonlinear Studies, MS-B258, Los Alamos National Laboratory, Los Alamos, NM 87545 USA

ABSTRACT

Efficient and faithful parallel simulation of large asynchronous systems is a challenging computational problem. It requires using the concept of local simulated times and a synchronization scheme. We study the scalability of massively parallel algorithms for discrete-event simulations which employ conservative synchronization to enforce causality. We do this by looking at the simulated time horizon as a complex evolving system, and we identify its universal characteristics. We find that the time horizon for the conservative parallel discrete-event simulation scheme exhibits Kardar-Parisi-Zhang-like kinetic roughening. This implies that the algorithm is asymptotically scalable in the sense that the average progress rate of the simulation approaches a non-zero constant. It also implies, however, that there are diverging memory requirements associated with such schemes.

INTRODUCTION

Faithful and efficient simulation of complex systems with many interacting degrees of freedom is an important and challenging computational task. In a large class of systems the dynamic evolution is inherently stochastic and changes in the local configuration of the system occur randomly in space and time.
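The conservative scheme described in the abstract can be illustrated with a minimal sketch. Here we assume one site per processing element on a one-dimensional chain with periodic boundaries, and exponentially distributed local time increments; the function and variable names are illustrative, not taken from any particular implementation. Each site carries a local simulated time, and it may advance only if it does not lead its neighbors, which enforces causality.

```python
import random

def evolve_horizon(n_sites=1000, n_steps=100, seed=0):
    """Sketch of the virtual-time horizon of a conservative PDES scheme.

    Each site i carries a local simulated time tau[i]. In one parallel
    step, site i advances by an exponential increment only if its local
    time does not exceed those of its two neighbors (periodic chain).
    Returns the final time horizon.
    """
    rng = random.Random(seed)
    tau = [0.0] * n_sites
    for _ in range(n_steps):
        snapshot = tau[:]  # all sites compare against the same horizon
        for i in range(n_sites):
            left = snapshot[(i - 1) % n_sites]
            right = snapshot[(i + 1) % n_sites]
            if snapshot[i] <= min(left, right):
                tau[i] += rng.expovariate(1.0)  # random local time increment
    return tau

horizon = evolve_horizon()
mean_t = sum(horizon) / len(horizon)                       # average progress
width2 = sum((t - mean_t) ** 2 for t in horizon) / len(horizon)  # roughness
```

In this toy picture, the average progress rate mean_t / n_steps settling at a non-zero value corresponds to the scalability of the computational phase, while the growing horizon width width2 reflects the diverging memory requirements mentioned in the abstract.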
The modeling of these systems can follow a “bottom-up” approach, starting with the definition of the “microscopic” dynamics. Then a master equation can be constructed with the corresponding transition probabilities. In most cases, for systems with many interacting degrees of freedom, even the numerical solution of the master equation (typically involving the numerical diagonalization of huge matrices) becomes intractable. This is when simulation becomes an invaluable tool for complex system modeling. In the physics, chemistry, and biology communities these types of simulations are most commonly referred to as dynamic or kinetic Monte Carlo simulations. In computer science they are called discrete-event simulations. The updates (governed by the microscopic dynamics) in the (typically local) configuration of the system are considered discrete events. The basic notion of discrete-event simulation is that time is continuous and the discrete events occur instantaneously. Between events, the state (configuration) of the system remains unchanged. If the events occur at random instants of time, the dynamics can be
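The notion that discrete events occur instantaneously at random instants of time, with the state unchanged in between, can be made concrete by generating the event times of a Poisson process, in which the waiting times between successive events are exponentially distributed. The following is a minimal sketch; the function name and parameters are illustrative.

```python
import random

def poisson_event_times(rate, t_max, seed=0):
    """Generate the event times of a Poisson process up to t_max.

    Between discrete events the state of the simulated system is
    unchanged; each event occurs instantaneously at a random instant,
    with exponentially distributed waiting times of mean 1/rate.
    """
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)  # draw the next waiting time
        if t > t_max:
            return times
        times.append(t)

times = poisson_event_times(rate=2.0, t_max=50.0)
```

In a kinetic Monte Carlo simulation, each such event time would be paired with an instantaneous update of the local configuration, chosen according to the transition probabilities of the underlying master equation.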
