Center for the Simulation of Advanced Rockets

Founded: 1997
Location: University of Illinois at Urbana-Champaign
Department: Computational Science and Engineering
Goal: Develop accurate computational models of solid rocket propellant systems
Staff: Approx. 80 faculty, staff, and students[1]
Research areas:

  • Fluids and Combustion

  • Structures and Materials

  • Computer Science

  • System Integration

  • Uncertainty Quantification

The Center for Simulation of Advanced Rockets (CSAR) is an interdisciplinary research group at the University of Illinois at Urbana-Champaign, and is part of the United States Department of Energy's Advanced Simulation and Computing Program. CSAR's goal is to accurately predict the performance, reliability, and safety of solid propellant rockets.[2]

CSAR was founded in 1997 as part of the Department of Energy's Advanced Simulation and Computing Program, whose goal is to "enable accurate prediction of the performance, reliability, and safety of complex physical systems through computational simulation." CSAR extends this goal into the realm of solid rocket propellants, specifically those used by the Space Shuttle.[1]

CSAR aims to simulate entire rocket systems under both normal and abnormal operating conditions. This requires highly accurate modeling of components, fuel-flow dynamics, and other environmental factors, and demands large computational resources, on the order of thousands of processors. Developing this computational infrastructure is therefore critical to achieving the center's goal.[1]

Areas of research

CSAR's research spans several fields.[3] The physical simulations are implemented in CSAR's Rocstar software suite.

  • Fluids and combustion - the study of how a rocket's fuel is ignited and directed in such a way as to provide thrust
    • Multiphase flow
    • Turbulence modeling
    • Multiscale acceleration (via "time zooming")
    • Propellant morphology and characterization
    • Propellant combustion modeling
  • Structures and materials - analysis of the physical structure of a rocket
    • Constitutive and damage modeling
    • Crack propagation
    • Multiscale materials modeling
    • Molecular modeling of material interfaces
    • Space-time discontinuous Galerkin methods
  • Computer science - development of advanced simulation and visualization tools
    • Parallel programming environments
    • Parallel I/O
    • Parallel performance modeling and prediction
    • Meshing
    • Hybrid geometric/topological mesh partitioning
    • Visualization
  • System integration - bringing the available tools and resources together in an efficient manner
    • Object-oriented integration framework
    • Flexible parallel orchestration
    • Stable component coupling and time-stepping
    • Accurate and conservative data transfer based on common refinement
    • Stable and efficient surface propagation
  • Uncertainty quantification - determining the accuracy of and confidence in simulation results
    • Clustering techniques for sampling-based uncertainty quantification
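The last item hints at how sampling-based uncertainty quantification can be made affordable: rather than running an expensive simulation on every random input sample, similar samples are grouped and the simulation is run once per group, with results weighted by group size. The following sketch is a generic one-dimensional illustration of that idea only; the polynomial model, the cluster count, and the use of plain k-means are stand-ins, not CSAR's actual methods:

```python
import random
import statistics

def expensive_model(x):
    # Stand-in for a costly rocket simulation: a simple polynomial.
    return 3.0 * x * x + 2.0

def assign(samples, centers):
    # Put each sample in the cluster of its nearest center.
    clusters = [[] for _ in centers]
    for s in samples:
        i = min(range(len(centers)), key=lambda j: abs(s - centers[j]))
        clusters[i].append(s)
    return clusters

def kmeans_1d(samples, k, iters=20):
    # Plain 1-D k-means: enough to group similar input samples.
    centers = random.sample(samples, k)
    for _ in range(iters):
        clusters = assign(samples, centers)
        centers = [statistics.mean(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, assign(samples, centers)

def clustered_mean(samples, k):
    # Evaluate the model only at the k cluster centers, weighting
    # each result by the fraction of samples in that cluster.
    centers, clusters = kmeans_1d(samples, k)
    n = len(samples)
    return sum(len(c) / n * expensive_model(m)
               for m, c in zip(centers, clusters))

random.seed(0)
samples = [random.gauss(1.0, 0.1) for _ in range(1000)]
estimate = clustered_mean(samples, 5)                          # 5 model runs
brute = statistics.mean(expensive_model(s) for s in samples)   # 1000 runs
print(estimate, brute)
```

With five clusters the weighted estimate tracks the brute-force sample mean closely while requiring only five model evaluations instead of a thousand, which is the payoff when each evaluation is a full rocket simulation.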

Computation environment

Physical simulations are performed using CSAR's Rocstar suite of numerical solver applications. Rocstar was built by CSAR and is designed to run efficiently on massively parallel computers. It is implemented using MPI and is fully compatible with Adaptive MPI (AMPI). The current release, Rocstar 3, is documented in a user's guide.

CSAR uses a number of supercomputing resources for its simulations. The National Center for Supercomputing Applications (NCSA) is also located at the University of Illinois at Urbana-Champaign, and CSAR takes advantage of the computing environment NCSA provides for many of its simulations. CSAR also uses Turing, a supercomputing cluster operated by the university's department of Computational Science and Engineering.[4]

CSAR's computation environment builds on work done by the University of Illinois' Parallel Programming Lab, in particular Charm++ and Adaptive MPI.[5] These parallel programming frameworks allow applications to scale easily to thousands of processors, letting highly complex computations finish quickly. The run-time system shared by Charm++ and AMPI has two features central to CSAR's software: load balancing, which improves performance by keeping work distributed evenly across all processors, and checkpointing, which allows a lengthy computation to be saved and restarted rather than begun anew after an interruption.
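Charm++ and AMPI provide these features through processor virtualization and object migration, whose details are beyond a short illustration, but the two underlying ideas can be sketched in a self-contained way. The following Python sketch (illustrative only, and not the Charm++/AMPI API) greedily assigns tasks to the currently least-loaded processor, and saves and restores computation state through a checkpoint file:

```python
import heapq
import os
import pickle
import tempfile

def balance(task_costs, num_procs):
    # Greedy load balancing: assign the heaviest remaining task to the
    # least-loaded processor. A simplification of the measurement-based
    # strategies a parallel run-time system can apply dynamically.
    heap = [(0.0, p) for p in range(num_procs)]   # (current load, proc id)
    heapq.heapify(heap)
    assignment = {}
    for task, cost in sorted(task_costs.items(), key=lambda kv: -kv[1]):
        load, proc = heapq.heappop(heap)
        assignment[task] = proc
        heapq.heappush(heap, (load + cost, proc))
    return assignment

def save_checkpoint(state, path):
    # Checkpointing: serialize solver state so a long run can be
    # resumed after an interruption instead of starting over.
    with open(path, "wb") as f:
        pickle.dump(state, f)

def load_checkpoint(path):
    with open(path, "rb") as f:
        return pickle.load(f)

if __name__ == "__main__":
    costs = {"t0": 5.0, "t1": 3.0, "t2": 2.0, "t3": 4.0}
    mapping = balance(costs, 2)
    print(mapping)  # heavy and light tasks interleave to even out the load

    path = os.path.join(tempfile.gettempdir(), "demo.ckpt")
    save_checkpoint({"step": 42, "mapping": mapping}, path)
    print(load_checkpoint(path)["step"])
```

In an actual AMPI application the developer writes ordinary MPI code; the run-time system migrates the virtualized ranks and writes checkpoints on its own, rather than through explicit calls like those above.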

Using these highly parallel tools, CSAR's developers have built a number of components that simulate various physical phenomena related to rocket propulsion. Combined, they provide a complete simulation environment. Below is a list of the Rocstar modules and links to their respective user's guides.

Rocstar modules[6] (each bracketed number links to the module's user's manual):

  • Combustion: Rocburn [1]
  • Fluids: RocfloMP [2], RocfluMP [3], Rocpart [4], Rocturb [5], Rocrad [6]
  • Solids: Rocfrac [7], Rocsolid [8]
  • Computer science: Rocman [9], Roccom [10], Rocface [11], Rocblas [12], Rocin [13], RocHDF [14], Rocmop [15], Rocrem [16], Rocketeer [17] (visualization tool for complex 2-D and 3-D data sets)
  • Utilities: Rocbuild [18], Roctest [19], Rocdiff [20], Rocprep [21]

References
  1. ^ a b c About CSAR Archived May 13, 2008, at the Wayback Machine. Retrieved October 10, 2008
  2. ^ CSAR Homepage Archived October 6, 2008, at the Wayback Machine. Retrieved October 10, 2008
  3. ^ Basic Research at CSAR Archived May 10, 2008, at the Wayback Machine. Retrieved October 10, 2008
  4. ^ CSAR Computing Archived February 1, 2009, at the Wayback Machine. Retrieved October 11, 2008
  5. ^ Parallel Programming Lab: Rocket Simulation Retrieved October 11, 2008
  6. ^ CSAR Software Documentation Archived May 13, 2008, at the Wayback Machine. Retrieved October 15, 2008
  7. ^ International Symposium on Solid Rocket Modeling and Simulation Archived May 13, 2008, at the Wayback Machine. Retrieved October 10, 2008