Computational Science and Engineering (CSE) Seminars

    Computational Science and Engineering Seminars are an avenue for researchers to share progress on CSE topics. If you are interested in receiving seminar announcements, join our mailing list.

    Schedule

    2023

    All seminars are held in JEC 3117 from 4 to 5 PM unless otherwise noted.

    • 11/29: Strongly correlated quantum chemistry: numerical analysis and application of the TCC method, Fabian Maximilian Faulstich

      Strongly correlated quantum systems pose some of the most challenging problems in science. I will present the first numerical analysis of the coupled cluster method tailored by matrix product states, a promising method for handling strongly correlated systems. I will then discuss recent applications of the coupled cluster method and matrix product states to solving the magic-angle twisted bilayer graphene system at the level of interacting electrons.
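
      As a rough illustration of the tailored coupled cluster (TCC) idea (the notation here is the standard one from the literature, not taken verbatim from the talk), the cluster operator is split into a part extracted from a matrix product state (DMRG) solution on a strongly correlated active space and an external part:

          |\Psi_{\mathrm{TCC}}\rangle = e^{\hat{T}_{\mathrm{ext}}}\, e^{\hat{T}_{\mathrm{CAS}}}\, |\Phi_{\mathrm{HF}}\rangle

      The amplitudes in T_CAS are read off the MPS wavefunction and held fixed, while the external amplitudes T_ext are determined from conventional coupled cluster equations.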

    • 11/15: Multi-Fidelity Error-Estimate Based Model-Management, Tucker Babcock

      We present a novel multi-fidelity optimization algorithm that globalizes the optimization based on the estimated error between the low-fidelity and high-fidelity models. The optimization algorithm is similar to classical multi-fidelity trust-region model-management approaches, but it replaces the trust-radius constraint with a bound on the estimated error between the low- and high-fidelity models. This enables globalization without requiring the user to specify non-intuitive parameters such as the initial trust radius. We demonstrate the framework on a series of analytical optimization problems and a realistic electric-motor optimization. We show that for low-fidelity models that accurately capture the trends of the high-fidelity model, the developed framework can significantly improve the efficiency of obtaining high-fidelity optima compared to a direct high-fidelity optimization.
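
      A minimal sketch of the kind of loop described above is given below; the function and parameter names are hypothetical, not from the talk. The low-fidelity subproblem is solved subject to a bound on the error estimate, and the bound is relaxed or tightened depending on how well the low-fidelity model predicted the actual high-fidelity reduction.

          import numpy as np
          from scipy.optimize import minimize, NonlinearConstraint

          def error_bound_model_management(f_hi, f_lo, err_est, x0, eps0=1.0, max_iter=20):
              """Hypothetical sketch: a trust-region-like loop in which the usual
              radius constraint ||x - x_k|| <= delta is replaced by err_est(x) <= eps."""
              x, eps = np.asarray(x0, dtype=float), eps0
              f_hi_x = f_hi(x)
              for _ in range(max_iter):
                  # Solve the low-fidelity subproblem subject to the error-estimate bound.
                  bound = NonlinearConstraint(err_est, -np.inf, eps)
                  sub = minimize(f_lo, x, constraints=[bound])
                  x_new = sub.x
                  # Classical rho test: actual high-fidelity reduction vs. predicted reduction.
                  f_hi_new = f_hi(x_new)
                  pred = f_lo(x) - f_lo(x_new)
                  rho = (f_hi_x - f_hi_new) / pred if abs(pred) > 1e-14 else 0.0
                  if rho > 0.1:              # accept the step
                      x, f_hi_x = x_new, f_hi_new
                      if rho > 0.75:
                          eps *= 2.0         # low-fidelity model is reliable: relax the bound
                  else:
                      eps *= 0.5             # reject the step and tighten the bound
              return x

      In practice the low-fidelity model would also be corrected (for example, to match the high-fidelity value and gradient at the current iterate) before each subproblem; that step is omitted here for brevity.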

    • 11/1: RPI Platforms - CCI and Quantum Computer, Jay McGlothlin and CCI Team

      Computational science is core to the academic and research enterprise at Rensselaer. We procure and operate leadership-class systems to enable our researchers’ projects. Our presentation will cover three of these systems: the two supercomputers at the Center for Computational Innovations (CCI), AiMOS and AiMOSx, and Rensselaer’s new IBM Quantum System One. It will cover the technical details of each system and explain how you can access and use them for your research. Collaboration with you, our clients, is key to meeting these current and future needs. Manager of Research Computing Jay McGlothlin, CCI Director Chris Carothers, and CCI System Engineer Andrew Damin will be available for questions and feedback at the end of the presentation.

    • 10/18: GraphPAN - ML Preconditioner for Advection-dominated Non-symmetric problems, Soha Yusuf

      Solving large linear systems is a fundamental problem in many scientific and engineering domains. In recent years, there has been growing interest in using machine learning techniques to accelerate the convergence of iterative solvers. My work proposes a novel approach that uses a graph neural network for preconditioning GMRES. We generate data by solving the linear advection equation in MFEM, and then perform an incomplete LU factorization using a message-passing graph neural network. This approach builds on recent advances in deep learning and graph neural networks and shows promising results on a set of benchmark problems, reducing the computational cost of solving large linear systems. The implications of this work extend to the development of efficient and scalable algorithms for scientific computing and data analysis.
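
      For context, the classical (non-learned) version of this pipeline looks roughly like the sketch below, where the incomplete LU factors that the graph neural network is trained to produce are instead computed by SciPy's spilu; the matrix here is a generic non-symmetric stand-in, not the MFEM advection system from the talk.

          import numpy as np
          import scipy.sparse as sp
          import scipy.sparse.linalg as spla

          # Generic non-symmetric sparse system standing in for a discretized advection problem.
          n = 200
          A = sp.diags([-1.0, 2.0, -0.3], [-1, 0, 1], shape=(n, n), format="csc")
          b = np.ones(n)

          # Classical preconditioner: incomplete LU factorization.
          # In the approach described above, a message-passing GNN produces the factors instead.
          ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
          M = spla.LinearOperator((n, n), matvec=ilu.solve)

          x, info = spla.gmres(A, b, M=M, maxiter=200)
          print("converged" if info == 0 else f"gmres info = {info}")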

    • 10/4: Development and Validation Studies in Hybrid Kinetic-Continuum Modeling of Jet Expansion Flows, Ozgur Tumuklu

      During a controlled lunar surface landing, the mission’s success is significantly influenced by the interaction between the plumes generated by the positioning rockets and the lunar surface itself. These plumes not only make contact with the surface but can also dislodge material from it. This ejected material poses multiple risks, including structural damage to the landing vehicle, rendering it unsafe for Earth return, as well as threats to human safety and to the structural integrity of any lunar base. Modeling these plume flows is therefore immensely important and presents considerable challenges. Notably, the lunar atmosphere is exceptionally thin, with molecules experiencing far fewer collisions than on Earth, placing it in the free molecular flow regime. For modeling the core plume, the Navier-Stokes equations are employed, while for the rarefied lunar atmosphere, the most computationally efficient and accurate method is Direct Simulation Monte Carlo (DSMC). The challenge lies in reconciling these two disparate methods within the plume-atmosphere interaction zone. Given the turbulent nature of the plume, the Large Eddy Simulation (LES) methodology with subgrid-scale models emerges as the most accurate and computationally suitable approach for modeling the Navier-Stokes equations in this context. This seminar will present development and validation studies of the LES-DSMC hybrid solver. In particular, a comprehensive explanation of DSMC will be provided, since it may be less familiar to the audience, before turning to the coupling efforts.
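
      One standard way hybrid solvers of this kind decide where the continuum (Navier-Stokes/LES) description breaks down and DSMC must take over is a local breakdown parameter such as the gradient-length-local Knudsen number; the form below is the common one from the literature and is shown only as illustration, not necessarily the criterion used in this work:

          Kn_{\mathrm{GLL}} = \frac{\lambda}{Q}\,\lvert \nabla Q \rvert, \qquad Q \in \{\rho,\ T,\ \lvert \mathbf{u} \rvert\}

      where lambda is the local mean free path; the continuum description is typically considered invalid in regions where this parameter exceeds a threshold on the order of 0.05.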

    • 9/20: Ensemble Methods for the Magnetohydrodynamics Equations, John Carter

      In this presentation, we investigate second-order ensemble methods for fast computation of an ensemble of MHD flows. Computing an ensemble of flow equations with different input parameters is a common procedure for uncertainty quantification in many engineering applications, for which the computational cost can be prohibitively expensive for nonlinear complex systems. We propose multiple ensemble algorithms that reduce the computational cost and simulation time by requiring only the solution of a single linear system with multiple right-hand sides instead of multiple different linear systems. Specifically, we define an ensemble mean for the velocity and electric potential of the reduced MHD equations at small magnetic Reynolds number and present the corresponding algorithm. Then, we extend this approach to the viscosity and magnetic resistivity of the full MHD equations and combine it with a generalized positive auxiliary variable (GPAV) technique to achieve an algorithm that is unconditionally stable with respect to a modified energy. We also present numerical tests that illustrate the theoretical results and demonstrate the efficiency of the resulting algorithms.
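
      The savings come from the fact that, once the ensemble mean is introduced, every ensemble member shares the same coefficient matrix at each time step, so a single factorization can be reused for all right-hand sides. A minimal illustration of that idea on a generic dense system (not the MHD discretization itself):

          import numpy as np
          from scipy.linalg import lu_factor, lu_solve

          rng = np.random.default_rng(0)
          n, J = 500, 20                 # system size and number of ensemble members

          # Shared coefficient matrix (built from the ensemble mean in the actual scheme).
          A = np.eye(n) + 0.01 * rng.standard_normal((n, n))

          # One right-hand side per ensemble member, differing through the perturbations.
          B = rng.standard_normal((n, J))

          # Factor once, then back-substitute for all J members at once.
          lu, piv = lu_factor(A)
          X = lu_solve((lu, piv), B)     # columns of X are the ensemble solutions

          print(np.allclose(A @ X, B))   # True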

    • 9/6: GITRm - A 3D Unstructured Mesh-Based Multi-Species Particle Tracking Code for Global Impurity Transport in Fusion Devices, Dhyanjyoti Nath

      To understand the behavior of impurities in tokamak fusion reactors, the physics of plasma-material interaction needs to be modeled accurately. In particular, a tool is needed to model impurity transport in order to predict the migration pattern, interaction kinetics, and re-deposition of impurities in realistic tokamak geometries. GITRm is a fully 3D unstructured mesh-based Monte Carlo particle (neutral atom and ion) tracking code for global impurity transport in fusion-relevant devices. It is designed to target complex wall geometries, including non-axisymmetric local features such as bumpers and probes, and to account for their interactions with the plasma. GITRm fully resolves the gyrokinetic motion of impurity particles in 3D. It is built on the PUMIPic infrastructure, which is designed to be performant on multi-GPU computers. It uses strongly graded and anisotropic elements to efficiently and accurately represent, in a fully 3D/volumetric fashion, the plasma fields from various edge-plasma codes such as SOLPS, OEDGE, and SOLEDGE. Currently, the capabilities of GITRm are being extended to simulate multiple impurity species simultaneously and their interactions with different material surfaces, including mixed surfaces.

      A high-fidelity impurity transport simulation in fusion reactors involves multiple physics models, each targeting a certain aspect of the simulation. Furthermore, these coupled processes each occur on different temporal and spatial scales. As an example, the physics in the plasma sheath region needs to be modeled at much finer spatial and temporal scales than the impurity transport at the global level. GITRm provides flexibility in the use of fields from various sources (including various sheath models) and supports in-memory coupling with other codes to address the disparity in spatial and temporal scales of the physical processes encountered in plasma-material interaction studies. For example, the sheath electric field can be calculated analytically, or it can be obtained by coupling GITRm with a separate sheath mesh on which the electric field has been computed by a dedicated sheath simulator.
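
      As a rough illustration of the particle-tracking core, the sketch below is a textbook Boris push for a charged particle in prescribed electric and magnetic fields; it is not GITRm's actual integrator, and the parameter values are illustrative only.

          import numpy as np

          def boris_push(x, v, E, B, q, m, dt):
              """Standard Boris rotation: advance the velocity and position of a
              charged particle by one time step in given E and B fields."""
              qmdt2 = 0.5 * q * dt / m
              v_minus = v + qmdt2 * E                    # first half electric kick
              t = qmdt2 * B                              # rotation vector
              s = 2.0 * t / (1.0 + np.dot(t, t))
              v_prime = v_minus + np.cross(v_minus, t)
              v_plus = v_minus + np.cross(v_prime, s)    # magnetic rotation
              v_new = v_plus + qmdt2 * E                 # second half electric kick
              return x + v_new * dt, v_new               # drift, updated velocity

          # Example: a singly charged tungsten ion gyrating in a uniform 2 T field.
          q, m = 1.602e-19, 183.84 * 1.66054e-27
          x, v = np.zeros(3), np.array([1.0e4, 0.0, 0.0])
          E, B = np.zeros(3), np.array([0.0, 0.0, 2.0])
          for _ in range(1000):
              x, v = boris_push(x, v, E, B, q, m, dt=1.0e-9)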