We present a new package for the Mathematica system, called Libra. Its purpose is to provide convenient tools for the transformation of first-order differential systems ∂_i J = M_i J in one or several variables. In particular, Libra is designed for the reduction to ϵ-form of the differential systems which appear in multiloop calculations. The package also contains tools for the construction of the general solution, both via perturbative expansion of the path-ordered exponential and via generalized power series expansion near regular singular points. Libra also has tools to determine the minimal list of coefficients in the asymptotics of the original master integrals that suffices to fix the boundary conditions.
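In generic notation (not Libra-specific), the target of such a reduction and the resulting perturbative solution read:

```latex
% A transformation J = T(x,\epsilon)\,\tilde{J} brings the system
% \partial_x J = M(x,\epsilon)\, J to \epsilon-form if
\partial_x \tilde{J} = \epsilon\, S(x)\, \tilde{J},
\qquad
S(x) = \frac{1}{\epsilon}\, T^{-1}\!\left( M\,T - \partial_x T \right)
\quad \text{independent of } \epsilon .
% The general solution is the path-ordered exponential, whose expansion
% in \epsilon yields iterated integrals order by order:
\tilde{J}(x)
  = \operatorname{P}\!\exp\!\left( \epsilon \int_{x_0}^{x} S(t)\,\mathrm{d}t \right)
    \tilde{J}(x_0)
  = \left[ \mathbb{1}
       + \epsilon \int_{x_0}^{x}\! S(t_1)\,\mathrm{d}t_1
       + \epsilon^{2} \int_{x_0}^{x}\! S(t_1) \int_{x_0}^{t_1}\! S(t_2)\,
         \mathrm{d}t_2\,\mathrm{d}t_1
       + \dots \right] \tilde{J}(x_0).
```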
Contributors: M. Osorno, M. Schirwon, N. Kijanski, R. Sivanesapillai, H. Steeb et al.
Efficient numerical simulations of fluid flow on the pore scale allow for the numerical estimation of effective material properties of porous media, such as effective permeability or tortuosity. In contrast to time-consuming and often expensive laboratory tests, pore scale-resolved numerical simulations further enable the computational quantification of the anisotropy of inherent material properties and the estimation of representative sample domains. Numerically calculated quantities are valuable in several fields, such as carbon dioxide sequestration, geothermal energy production, and groundwater contamination remediation. Our pore scale-resolved simulation method, which operates directly on images obtained from Micro X-Ray Computed Tomography (μXRCT), is based on the weakly compressible Smoothed Particle Hydrodynamics (SPH) approach. SPH is a meshless Lagrangian method, highly suitable for modeling complex geometries and flow at moderate Reynolds numbers. Low-Reynolds-number flow, also denoted as creeping flow, is the typical scenario in the above-mentioned applications. However, SPH is computationally demanding, especially in simulations of large domains. To overcome these difficulties, we have designed a specific SPH module for the highly optimized HOOMD-blue Molecular Dynamics software. Our implementation supports single-phase flow and targets both CPU and GPU clusters. Due to the high computational demands, scalability is essential to make the software practically usable, and our tests indicate that our implementation scales almost ideally. We study a wide variety of test cases, which are representative not only of XRCT-based geometries but of pore scale-resolved flow simulations in general. Additionally, we present a large-scale simulation investigating an unconventional, highly porous volcanic rock sample (Reticulite).
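Two ingredients characterize the weakly compressible SPH approach mentioned above: a kernel-weighted density summation and a stiff Tait equation of state that keeps density fluctuations small. A minimal NumPy sketch of these ingredients, with illustrative parameter values and no connection to the actual HOOMD-blue module:

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard cubic spline SPH kernel in 3D, support radius 2h."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)          # 3D normalization constant
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_density(positions, masses, h):
    """Summation density: rho_i = sum_j m_j W(|r_i - r_j|, h)."""
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return (masses[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)

def tait_pressure(rho, rho0=1000.0, c0=10.0, gamma=7.0):
    """Weakly compressible (Tait) equation of state: a stiff pressure
    response limits density fluctuations to a few percent."""
    b = rho0 * c0**2 / gamma
    return b * ((rho / rho0)**gamma - 1.0)
```

In the weakly compressible setting the artificial sound speed c0 is typically chosen roughly ten times the expected maximum flow speed, so that density fluctuations stay around one percent.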
Contributors: Yuxiang Wang, Alper Kiziltas, Patrick Blanchard, Tiffany R. Walsh
Characterization of structural information at the atomistic level in molecular dynamics (MD) simulations is a necessary task for researchers in the fields of materials modeling and simulation. The density distribution is typically one of the most important properties to visualize in structural characterization. Visual Molecular Dynamics (VMD) is a widely used molecular visualization package that can not only visualize complex molecular systems but also perform analysis, either through special plugins or through user-written Tcl scripts. However, density analysis is still not a built-in feature of VMD. This work presents a flexible and easy-to-use Tcl code for VMD that can perform both 1D and 2D density calculations over any specified local region of a given system. By using the built-in commands of VMD, the code can access and process trajectory files in any format supported by VMD, as produced by mainstream simulation packages such as LAMMPS, GROMACS, NAMD, and CHARMM. This work introduces the calculation method, the code, and its usage in detail to provide a quick start for users in their density analysis work.
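The underlying 1D calculation is independent of VMD: bin the mass-weighted coordinates of the selected atoms along one axis for each frame, then normalize by bin volume and frame count. A Python sketch of this method (names and defaults are illustrative; the actual package is a Tcl script):

```python
import numpy as np

def density_profile_1d(frames, masses, axis=2, nbins=50, bounds=None, area=1.0):
    """Mass density profile along one axis, averaged over trajectory frames.

    frames : list of (N, 3) coordinate arrays, one per frame
    masses : (N,) per-atom masses
    area   : cross-sectional area of the analysed region
    """
    if bounds is None:
        lo = min(f[:, axis].min() for f in frames)
        hi = max(f[:, axis].max() for f in frames)
        bounds = (lo, hi)
    edges = np.linspace(bounds[0], bounds[1], nbins + 1)
    bin_volume = (bounds[1] - bounds[0]) / nbins * area
    hist = np.zeros(nbins)
    for f in frames:
        # mass-weighted histogram of coordinates along the chosen axis
        h, _ = np.histogram(f[:, axis], bins=edges, weights=masses)
        hist += h
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, hist / (len(frames) * bin_volume)
```

A 2D profile follows the same pattern with `np.histogram2d` over two axes and an area element instead of a slab volume.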
Contributors: Vei Wang, Nan Xu, Jin-Cheng Liu, Gang Tang, Wen-Tong Geng
We present VASPKIT, a command-line program that aims to provide a robust and user-friendly interface for high-throughput analysis of a variety of material properties from the raw data produced by the VASP code. It consists mainly of pre- and post-processing modules. The former is designed to prepare and manipulate input files, covering generation of the necessary input files, symmetry analysis, supercell transformation, and k-path generation for a given crystal structure. The latter is designed to extract and analyze the raw data on elastic mechanics, electronic structure, charge density, electrostatic potential, linear optical coefficients, real-space wave function plots, etc. The program can run conveniently in either interactive or command-line mode; the command-line options allow the user to perform high-throughput calculations together with bash scripts. This article gives an overview of the program structure and presents illustrative examples of some of its uses. The program runs on Linux, macOS, and Windows platforms. The executable versions of VASPKIT, together with related examples and tutorials, are available on its official website, vaspkit.com.
The udkm1Dsim toolbox is a collection of Python classes and routines to simulate the thermal, structural, and magnetic dynamics after laser excitation, as well as the corresponding X-ray scattering response, in one-dimensional samples such as multilayers. The toolbox provides the capability to define arbitrary layered structures on the atomic level, including a rich database of element-specific physical properties. The excitation of dynamics is represented by an N-temperature model, which is commonly applied in ultrafast physics. Structural dynamics due to thermal stresses are calculated by a linear-chain model of masses and springs. Specific magnetic dynamics can easily be implemented by the user via a generalized magnetization interface class. The resulting X-ray diffraction response is computed by kinematical or dynamical X-ray theory, which can also include polarization-dependent magnetic scattering. The udkm1Dsim toolbox is highly modular and allows user-defined inputs to be injected at any step of the simulation procedure.
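For N = 2 (electrons and lattice), the N-temperature model reduces to two coupled heat baths. A dimensionless explicit-Euler sketch of this case (parameter values are illustrative and this is not udkm1Dsim's API):

```python
import numpy as np

def two_temperature_model(t, Te0, Tl0, source=None, Ce=1.0, Cl=10.0, g=5.0):
    """Explicit-Euler integration of the two-temperature model
        Ce * dTe/dt = -g * (Te - Tl) + S(t)
        Cl * dTl/dt = +g * (Te - Tl)
    Ce, Cl: electron / lattice heat capacities; g: coupling constant;
    S(t): laser source term. All values are dimensionless and illustrative.
    """
    if source is None:
        source = lambda s: 0.0
    Te = np.empty_like(t)
    Tl = np.empty_like(t)
    Te[0], Tl[0] = Te0, Tl0
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        dT = Te[i - 1] - Tl[i - 1]
        Te[i] = Te[i - 1] + dt * (-g * dT + source(t[i - 1])) / Ce
        Tl[i] = Tl[i - 1] + dt * (g * dT) / Cl
    return Te, Tl
```

Without a source term, the total heat Ce*Te + Cl*Tl is conserved and both baths relax to a common temperature; a short laser pulse enters as a localized S(t) acting on the electron bath.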
The previous version of this program (AERH_v1_0) can be found at https://doi.org/10.1016/j.cpc.2013.10.009.
A Monte Carlo framework for solving optimal control problems governed by kinetic models is presented. The focus is on a kinetic model with a Keilson-Storer linear collision term, where the control mechanism is an external space-dependent force. The purpose of this control is to drive an ensemble of particles to acquire a desired mean velocity and position and to reach a desired final configuration in phase space. For this purpose, a gradient-based computational strategy in the framework of Monte Carlo methods is developed. Results of numerical experiments successfully validate the proposed control framework.
We present a high-energy neutrino event generator, called LeptonInjector, alongside an event weighter, called LeptonWeighter. Both are designed for large-volume Cherenkov neutrino telescopes such as IceCube. The neutrino event generator allows for quick and flexible simulation of neutrino events within and around the detector volume, and implements the leading Standard Model neutrino interaction processes relevant for neutrino observatories: neutrino-nucleon deep-inelastic scattering and neutrino-electron annihilation. In this paper, we discuss the event generation algorithm, the weighting algorithm, and the main functions of the publicly available code, with examples.
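Event weighting of this kind generally follows standard importance sampling: each generated event receives the weight w = p_target(x) / p_gen(x), so that expectation values under any target physics distribution can be estimated from a fixed generated sample. A generic sketch of this principle (not LeptonWeighter's API):

```python
import random

def weighted_mean(sample_gen, pdf_gen, pdf_target, f, n=200000, seed=1):
    """Self-normalized importance sampling: estimate E_target[f(x)]
    from draws of the generation distribution, using per-event weights
    w = pdf_target(x) / pdf_gen(x)."""
    rng = random.Random(seed)
    total_w = 0.0
    total_wf = 0.0
    for _ in range(n):
        x = sample_gen(rng)
        w = pdf_target(x) / pdf_gen(x)
        total_w += w
        total_wf += w * f(x)
    return total_wf / total_w
```

The same sample can thus be reweighted to different flux models or cross-section assumptions without regenerating events, which is the point of separating generation from weighting.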
Circular dichroism spectroscopy is a structural biology technique frequently applied to determine the secondary structure composition of soluble proteins. Our recently introduced computational analysis package SESCA aids the interpretation of protein circular dichroism spectra and enables the validation of proposed corresponding structural models. To further these aims, we present the implementation and characterization of a new Bayesian secondary structure estimation method in SESCA, termed SESCA_bayes. SESCA_bayes samples possible secondary structures using a Monte Carlo scheme, driven by the likelihood of estimated scaling errors and non-secondary-structure contributions of the measured spectrum. It provides an estimated secondary structure composition and separate uncertainties for the fraction of residues in each secondary structure class. It also assists efficient model validation by providing a posterior secondary structure probability distribution based on the measured spectrum. Our study indicates that SESCA_bayes estimates the secondary structure composition with a significantly smaller uncertainty than its predecessor, SESCA_deconv, which is based on spectrum deconvolution. Further, the mean accuracy of the two methods in our analysis is comparable, but SESCA_bayes provides more accurate estimates for circular dichroism spectra that contain considerable non-secondary-structure contributions.
Contributors: Adam Spannaus, Kody J.H. Law, Piotr Luszczek, Farzana Nasrin, Cassie Putman Micucci et al.
Significant progress in many classes of materials could be made with the availability of experimentally derived large datasets composed of atomic identities and three-dimensional coordinates. Methods for visualizing the local atomic structure, such as atom probe tomography (APT), which routinely generate datasets comprising millions of atoms, are an important step in realizing this goal. However, state-of-the-art APT instruments generate noisy and sparse datasets that provide information about elemental type but obscure atomic structures, thus limiting their subsequent value for materials discovery. The application of a materials fingerprinting process, a machine learning algorithm coupled with topological data analysis, provides an avenue by which previously inaccessible structural information can be extracted from an APT dataset. As a proof of concept, the materials fingerprint is applied to high-entropy alloy APT datasets containing body-centered cubic (BCC) and face-centered cubic (FCC) crystal structures. A local atomic configuration centered on an arbitrary atom is assigned a topological descriptor, with which it can be characterized as a BCC or FCC lattice with near-perfect accuracy, despite the inherent noise in the dataset. This successful identification of a fingerprint is a crucial first step in the development of algorithms that can extract more nuanced information, such as chemical ordering, from existing datasets of complex materials.
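The paper's descriptor is topological, but the underlying idea, that an atom's local neighborhood distinguishes BCC from FCC, can be illustrated much more simply on ideal, noise-free lattices by counting first-shell neighbors (8 for BCC, 12 for FCC). The functions below are purely illustrative and are not the paper's method:

```python
import numpy as np

def first_shell_count(points, center_idx, tol=0.1):
    """Count neighbors within (1 + tol) of the nearest-neighbor distance."""
    d = np.linalg.norm(points - points[center_idx], axis=1)
    d = np.sort(d[d > 1e-9])              # drop the atom itself
    return int(np.sum(d < d[0] * (1.0 + tol)))

def classify(points, center_idx):
    """Ideal BCC has 8 first-shell neighbors, ideal FCC has 12."""
    n = first_shell_count(points, center_idx)
    return {8: "BCC", 12: "FCC"}.get(n, "unknown")

def _lattice(base, n, a):
    cells = np.array([[i, j, k] for i in range(n)
                      for j in range(n) for k in range(n)], float)
    return a * (cells[:, None, :] + base[None, :, :]).reshape(-1, 3)

def bcc_lattice(n=3, a=1.0):
    return _lattice(np.array([[0, 0, 0], [0.5, 0.5, 0.5]]), n, a)

def fcc_lattice(n=3, a=1.0):
    return _lattice(np.array([[0, 0, 0], [0.5, 0.5, 0],
                              [0.5, 0, 0.5], [0, 0.5, 0.5]]), n, a)
```

The paper's topological descriptor replaces this brittle neighbor count with persistence-based features that survive the noise and sparsity of real APT data.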
Contributors: Jonas Klappert, Fabian Lange, Philipp Maierhöfer, Johann Usovitsch
We present the new version 2.0 of the Feynman integral reduction program Kira and describe its new features. The primary new feature is the reconstruction of the final coefficients in integration-by-parts reductions by means of finite-field methods with the help of FireFly. This procedure can be parallelized on computer clusters with MPI. Furthermore, the support for user-provided systems of equations has been significantly improved. This mode provides the flexibility to integrate Kira into projects that employ specialized reduction formulas, perform direct reductions of amplitudes, or involve linear systems of equations not limited to relations among standard Feynman integrals. We show examples from state-of-the-art Feynman integral reduction problems and provide benchmarks of the new features, demonstrating significantly reduced main memory usage and improved performance with respect to previous versions of Kira.
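The finite-field reconstruction delegated to FireFly ultimately rests on recovering exact rational numbers from their images modulo large primes. As an illustration of that final step only (not FireFly's code or API), here is the classic rational reconstruction routine based on the extended Euclidean algorithm:

```python
import math

def rational_reconstruct(a, m):
    """Recover a rational n/d from its image a = n * d^{-1} mod m
    (Wang's algorithm). Succeeds when |n|, d <= isqrt(m // 2)
    and gcd(d, m) = 1."""
    bound = math.isqrt(m // 2)
    r0, r1 = m, a % m
    s0, s1 = 0, 1
    while r1 > bound:                 # invariant: r_i == s_i * a (mod m)
        q = r0 // r1
        r0, r1 = r1, r0 - q * r1
        s0, s1 = s1, s0 - q * s1
    if s1 == 0 or abs(s1) > bound:
        raise ValueError("no rational of bounded size reconstructs a")
    n, d = r1, s1
    if d < 0:
        n, d = -n, -d
    return n, d
```

In the full workflow the coefficients are rational functions of kinematic invariants, which are reconstructed from many such modular evaluations; this is what makes the per-prime arithmetic cheap and the parallelization over MPI effective.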