Computer Physics Communications

ISSN: 0010-4655

Datasets associated with articles published in Computer Physics Communications

5926 results
  • ElasTool v3.0: Efficient computational and visualization toolkit for elastic and mechanical properties of materials
    Efficient computation and visualization of elastic and mechanical properties are crucial in the selection of materials and the design of new ones, and the ElasTool v3.0 toolkit marks a significant advance in both. This enhanced version extends beyond standard calculations such as the elastic tensor, Young's modulus, bulk modulus, and Poisson's ratio: it introduces capabilities for computing minimum thermal conductivity, linear compressibility, and elastic energy density, and for rendering the Christoffel equation. Notably, it integrates advanced visualization tools, including compatibility with the Plotly and Elate web platforms for interactive web-based property exploration. A key feature of ElasTool v3.0 is the implementation of second-order elastic constants (SOECs) for tubular 2D-based nanostructures and nanotubes. Leveraging optimized high-efficiency strain-matrix sets (OHESS), the toolkit now facilitates efficient computation of elastic constants and mechanical properties at both zero and finite temperatures for 1D, 2D, and 3D materials. ElasTool is openly accessible on GitHub: https://github.com/elastool.
    • Dataset
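The polycrystalline moduli mentioned in the abstract (bulk, Young's, Poisson's) follow from the elastic tensor via standard homogenization relations. As a minimal illustration of those textbook Voigt averages (our own sketch, not ElasTool's code; the silicon constants are illustrative literature values):

```python
import numpy as np

def voigt_moduli(C):
    """Voigt-average moduli from a 6x6 elastic tensor C (GPa)."""
    K_V = (C[0, 0] + C[1, 1] + C[2, 2]
           + 2 * (C[0, 1] + C[0, 2] + C[1, 2])) / 9.0
    G_V = ((C[0, 0] + C[1, 1] + C[2, 2]) - (C[0, 1] + C[0, 2] + C[1, 2])
           + 3 * (C[3, 3] + C[4, 4] + C[5, 5])) / 15.0
    E = 9 * K_V * G_V / (3 * K_V + G_V)                # Young's modulus
    nu = (3 * K_V - 2 * G_V) / (2 * (3 * K_V + G_V))   # Poisson's ratio
    return K_V, G_V, E, nu

# Cubic silicon, illustrative values: C11 = 165.6, C12 = 63.9, C44 = 79.5 GPa
C = np.zeros((6, 6))
C[0, 0] = C[1, 1] = C[2, 2] = 165.6
C[0, 1] = C[0, 2] = C[1, 2] = 63.9
C[3, 3] = C[4, 4] = C[5, 5] = 79.5

K, G, E, nu = voigt_moduli(C)
```

The Reuss and Hill averages that toolkits like ElasTool also report are obtained analogously from the compliance tensor.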
  • ERCS24: An updated version of the ERCS08 program for calculations of the cross sections for atomic electron removal based on the ECPSSR theory and its variants
    ERCS24, an updated version of the ERCS08 program, calculates atomic electron removal cross sections. It is written in FORTRAN to make it portable and easy to customize by a large community of physicists, but it also comes with a separate Windows graphical user interface control application, ERCS24w, that makes it easy to quickly prepare the input file, run the program, and view and analyze the output. The calculations are based on the ECPSSR theory for direct (Coulomb) ionization and non-radiative electron capture. With versatility in mind, the program allows for selective inclusion or exclusion of individual contributions to the cross sections from effects such as projectile energy loss, Coulomb deflection of the projectile, perturbation of the electron's stationary state (polarization and binding), and relativity. This makes it straightforward to assess the importance of each effect in a given collision regime. The control application also makes it easy to set up calculations in inverse kinematics (i.e. ionization of projectile ions by target atoms or ions). The previous version of this program may be found at https://doi.org/10.1016/j.cpc.2008.12.034.
    • Dataset
  • ePDFpy: A Python-based interactive GUI tool for electron pair distribution function analysis of amorphous materials
    ePDFpy is an interactive analysis program with a graphical user interface (GUI), designed to perform electron pair distribution function (PDF) analysis of diffraction patterns from a transmission electron microscope (TEM) in order to identify the local atomic structure of amorphous materials. The program offers a user-friendly Python-based interface, providing a straightforward and adaptable workflow for PDF analysis. Various optimization and fitting processes were implemented to accurately reduce the electron diffraction data, including center fitting and elliptical correction of the diffraction data. An improved parameter-estimation feature is available to enhance the efficiency of the fitting process, along with the interactive GUI. ePDFpy will be freely distributed for academic purposes, with additional features including a beam-mask drawing module.
    • Dataset
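Center fitting of a diffraction pattern, mentioned above, can in its simplest form be approximated by an intensity-weighted centroid of the pattern; this is a generic sketch, not necessarily the algorithm ePDFpy uses (real data would first need masking of the beam stop and outliers):

```python
import numpy as np

def centroid_center(image):
    """Estimate the pattern center as the intensity-weighted centroid."""
    ys, xs = np.indices(image.shape)
    total = image.sum()
    return (ys * image).sum() / total, (xs * image).sum() / total

# Synthetic centrosymmetric "halo" centered at (60.0, 45.0)
ys, xs = np.indices((128, 128))
r2 = (ys - 60.0) ** 2 + (xs - 45.0) ** 2
pattern = np.exp(-r2 / 400.0)

cy, cx = centroid_center(pattern)
```

More robust center-fitting schemes refine this initial guess by enforcing the radial symmetry of the azimuthally averaged intensity.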
  • Neutrinos from muon-rich ultra high energy electromagnetic cascades: The MUNHECA code
    An ultra-high-energy electromagnetic cascade, a purely leptonic process initiated by either photons or e^±, can be a source of high-energy neutrinos. We present a public Python 3 code, MUNHECA, to compute the neutrino spectrum by taking into account various QED processes, with the cascade developing either along propagation through the cosmic microwave background in the high-redshift universe or in a predefined photon background surrounding the astrophysical source. The user can adjust various settings of MUNHECA, including the spectrum of injected high-energy photons, the background photon field, and the QED processes governing the cascade evolution. We improve the modeling of several processes, provide examples of running MUNHECA, and compare it with earlier, more simplified estimates of the neutrino spectrum from electromagnetic cascades.
    • Dataset
  • BIMBAMBUM: A potential flow solver for single cavitation bubble dynamics
    In the absence of analytical solutions for the dynamics of non-spherical cavitation bubbles, we have implemented a numerical solver based on the boundary integral method (BIM) that models the behavior of a single bubble near an interface between two fluids. The density ratio between the two media can be adjusted to represent different types of boundaries, such as a rigid boundary or a free surface. The solver allows not only the computation of the dynamics of the bubble and of the fluid-fluid interface, but also, in a secondary processing phase, the computation of the surrounding flow field quantities. We present here the detailed implementation of this solver and validate its capabilities against theoretical solutions, experimental observations, and results from other simulation software. The solver is called BIMBAMBUM, which stands for Boundary Integral Method for Bubble Analysis and Modeling in Bounded and Unbounded Media.
    • Dataset
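The theoretical solutions used to validate such solvers exist in the spherical limit, where the dynamics reduce to the Rayleigh-Plesset equation R R̈ + (3/2) Ṙ² = (p_B − p_∞)/ρ. A minimal inviscid integration with an adiabatic gas content (our own illustration, not part of BIMBAMBUM) can be sketched as:

```python
import numpy as np

def rayleigh_plesset(R0=1e-4, p0=1e5, p_inf=2e5, rho=1000.0, gamma=1.4,
                     dt=1e-9, steps=20000):
    """Integrate the inviscid Rayleigh-Plesset equation for a bubble
    released from rest at radius R0, with adiabatic gas pressure
    p_B = p0 (R0/R)^(3*gamma). SI units; semi-implicit Euler stepping."""
    R, Rdot = R0, 0.0
    radii = []
    for _ in range(steps):
        p_gas = p0 * (R0 / R) ** (3 * gamma)       # adiabatic gas pressure
        Rddot = ((p_gas - p_inf) / rho - 1.5 * Rdot ** 2) / R
        Rdot += dt * Rddot                          # update velocity first,
        R += dt * Rdot                              # then radius
        radii.append(R)
    return np.array(radii)

# Ambient pressure twice the internal pressure: the bubble compresses
# and oscillates about a smaller equilibrium radius.
radii = rayleigh_plesset()
```

Since the model is conservative, the radius oscillates between R0 and a minimum set by the gas stiffness; a BIM solver recovers this limit when the bubble is far from any boundary.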
  • The polarimeter vector for τ → 3πν_τ decays
    The polarimeter vector of the τ represents an optimal observable for the measurement of the τ spin. In this paper we present an algorithm for the computation of the τ polarimeter vector for the decay channels τ^- → π^- π^+ π^- ν_τ and τ^- → π^- π^0 π^0 ν_τ. The algorithm is based on a model for the hadronic current in these decay channels, which was fitted to data recorded by the CLEO experiment [1].
    • Dataset
  • Evaluation of classical correlation functions from 2/3D images on CPU and GPU architectures: Introducing CorrelationFunctions.jl
    Correlation functions are becoming one of the major tools for quantifying structural information that is usually represented as 2D or 3D images. In this paper we introduce CorrelationFunctions.jl, an open-source package developed in Julia and capable of computing all classical correlation functions from imaging input data. Images include both binary and multi-phase representations. Our code can evaluate the two-point probability S_2, phase cross-correlation ρ_ij, cluster C_2, lineal-path L_2, surface-surface F_ss, surface-void F_sv, pore-size P, and chord-length p distribution functions on both CPU and GPU architectures. Where possible, we present two types of computation: full correlation maps (correlations of each point with all other points of the image, which also allow obtaining ensemble-averaged CFs) and directional correlation functions (currently along the major orthogonal and diagonal directions). This implementation allowed us, for the first time, to assemble a completely free solution for evaluating correlation functions under any operating system, with a well-documented application programming interface (API). Our package includes automatic tests against analytical solutions that are described in the paper. We measured execution times for all CPU and GPU implementations; as a rule of thumb, full correlation maps on GPU are faster than the other methods. However, full maps require more RAM and are thus limited by available memory. Directional CFs, on the other hand, are memory efficient and can be evaluated for huge datasets, which makes them the first candidates for structural data compression or feature extraction. The package itself is available through the Julia package ecosystem and on GitHub; the latter source also contains documentation and additional helpful resources such as tutorials.
We believe that a single powerful computational tool such as CorrelationFunctions.jl will significantly facilitate the use of correlation functions in numerous areas of structural description and research of porous materials, as well as in machine learning applications. We also present some examples as applied to ceramic, soil composite, and oil-bearing rock samples, based on their 3D X-ray tomography and 2D scanning electron microscope images. Finally, we conclude the paper with a discussion of possible ways to further improve the presented computational framework.
    • Dataset
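The directional variant of the two-point probability function S_2 described above is the probability that two points separated by a lag r along a given direction both lie in the phase of interest. A minimal sketch (in Python rather than Julia, and not the package's own implementation):

```python
import numpy as np

def s2_directional(image, max_shift, axis=0):
    """Directional two-point probability S2(r) for phase 1 of a binary
    image, with non-periodic (cropped) boundary handling."""
    img = np.asarray(image, dtype=float)
    s2 = []
    for r in range(max_shift + 1):
        a = np.take(img, range(0, img.shape[axis] - r), axis=axis)
        b = np.take(img, range(r, img.shape[axis]), axis=axis)
        s2.append((a * b).mean())   # fraction of pairs with both ends in phase 1
    return np.array(s2)

# Alternating stripes: phase fraction 0.5, perfectly anti-correlated at r = 1
stripes = np.tile([1, 0], 32)[None, :].repeat(16, axis=0)
s2 = s2_directional(stripes, max_shift=2, axis=1)
```

S2(0) equals the phase fraction (0.5 here), S2(1) vanishes because adjacent stripe columns never share a phase, and S2(2) returns to 0.5. Full correlation maps, as the abstract notes, compute the same quantity for every lag vector at once, typically via FFTs.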
  • Massively parallel implementation of iterative eigensolvers in large-scale plane-wave density functional theory
    Kohn-Sham density functional theory (DFT) is a powerful method for describing the electronic structure of molecules and solids in condensed matter physics, computational chemistry, and materials science. However, large and accurate plane-wave DFT calculations have cubic-scaling computational complexity and are usually limited by expensive computation and communication costs. The rapid development of high performance computing (HPC) on leadership supercomputers brings new opportunities for applying plane-wave DFT calculations to large-scale systems. Here, we implement parallel iterative eigensolvers for large-scale plane-wave DFT calculations, including the Davidson, locally optimal block preconditioned conjugate gradient (LOBPCG), projected preconditioned conjugate gradient (PPCG), and Chebyshev subspace iteration (CheFSI) algorithms, and analyze the performance of these algorithms in massively parallel plane-wave computing tasks. We adopt a two-level parallelization strategy that combines the message passing interface (MPI) with open multi-processing (OpenMP) to handle the data exchange and matrix operations involved in constructing and diagonalizing the large-scale plane-wave Hamiltonian matrix. Numerical results illustrate that these iterative eigensolvers can scale up to 42,592 processing cores, reaching 30% of peak performance on leadership supercomputers, in studies of the electronic structure of bulk silicon systems containing 10,648 atoms.
    • Dataset
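The Davidson algorithm named above can be illustrated in serial form: expand a subspace, solve the projected eigenproblem, and precondition the residual with the diagonal of the Hamiltonian (which is nearly diagonal in a plane-wave basis). The following numpy sketch is our own illustration, not the paper's MPI/OpenMP implementation:

```python
import numpy as np

def davidson_lowest(H, tol=1e-8, max_iter=100):
    """Davidson iteration for the lowest eigenpair of a symmetric matrix H.
    Effective when H is diagonally dominant, as plane-wave Hamiltonians are."""
    n = H.shape[0]
    diag = np.diag(H)
    V = np.zeros((n, 0))
    t = np.eye(n, 1)[:, 0]                      # initial guess: first unit vector
    for _ in range(max_iter):
        t = t - V @ (V.T @ t)                   # orthogonalize against subspace
        t /= np.linalg.norm(t)
        V = np.column_stack([V, t])
        Hs = V.T @ H @ V                        # Rayleigh-Ritz projection
        theta, s = np.linalg.eigh(Hs)
        theta0, y = theta[0], V @ s[:, 0]
        r = H @ y - theta0 * y                  # residual vector
        if np.linalg.norm(r) < tol:
            break
        t = r / (theta0 - diag + 1e-12)         # diagonal (Jacobi) preconditioner
    return theta0, y

rng = np.random.default_rng(0)
n = 200
A = np.diag(np.arange(1.0, n + 1)) + 1e-2 * rng.standard_normal((n, n))
A = (A + A.T) / 2                               # symmetrize the test matrix
lam, vec = davidson_lowest(A)
```

Parallel implementations such as the paper's distribute the tall matrices V and H@V across MPI ranks; the small projected eigenproblem stays cheap, so the communication-heavy steps are the inner products and the matrix application.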
  • micrOMEGAs 6.0: N-component dark matter
    micrOMEGAs is a numerical code to compute dark matter (DM) observables in generic extensions of the Standard Model (SM) of particle physics. We present a new version of micrOMEGAs that includes a generalization of the Boltzmann equations governing the evolution of the DM cosmic abundance, which can be solved to compute the relic density of N-component DM. The direct and indirect detection rates in such scenarios take into account the relative contribution of each component, so that constraints on the combined signal of all DM components can be imposed. The co-scattering mechanism for DM production is also included, and the routines used to compute the relic density of feebly interacting particles have been improved to take into account the effect of thermal masses of t-channel particles. Finally, the tables for the DM self-annihilation-induced photon spectra have been extended down to DM masses of 110 MeV, and they now include annihilation channels into light mesons.
    • Dataset
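In the single-component limit, the Boltzmann equation that micrOMEGAs generalizes takes the familiar textbook freeze-out form dY/dx = −(λ/x²)(Y² − Y_eq²). A toy backward-Euler integration (illustrative dimensionless values, not micrOMEGAs code) shows the characteristic departure from equilibrium:

```python
import numpy as np

def relic_yield(lam=1e5, x_start=1.0, x_end=100.0, steps=20000):
    """Toy single-component freeze-out: dY/dx = -(lam/x^2) (Y^2 - Yeq^2),
    with Yeq(x) ~ x^(3/2) exp(-x). Backward Euler handles the stiffness:
    each step solves the quadratic a*Y^2 + Y - (Y_n + a*Yeq^2) = 0."""
    xs = np.linspace(x_start, x_end, steps + 1)
    dx = xs[1] - xs[0]
    Yeq = lambda x: x ** 1.5 * np.exp(-x)
    Y = Yeq(x_start)                              # start in equilibrium
    for x in xs[1:]:
        a = dx * lam / x ** 2
        c = Y + a * Yeq(x) ** 2
        Y = (-1.0 + np.sqrt(1.0 + 4.0 * a * c)) / (2.0 * a)   # positive root
    return Y

Y_inf = relic_yield()
```

Y tracks Y_eq until the annihilation rate drops below the expansion rate around x_f ≈ 10-20, after which it freezes out at Y_∞ of order x_f/λ; the N-component generalization couples such equations through conversion and co-annihilation terms.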
  • A SPIRED code for the reconstruction of spin distribution
    In nuclear magnetic resonance (NMR), accurate knowledge of the spin probability distribution corresponding to inhomogeneities of the magnetic fields is of crucial importance. Accurate identification of the sample distribution requires a set of experimental data rich enough to extract all fundamental information. These data depend strongly on the control fields (and their number) used experimentally to perturb the spin system. In this work, we present and analyze a greedy reconstruction algorithm, and provide the corresponding SPIRED code, for computing a set of control functions that generate data appropriate for the accurate reconstruction of a sample probability distribution. In particular, the focus is on NMR and on spin dynamics governed by the Bloch system with inhomogeneities in both the static and radio-frequency magnetic fields applied to the sample. We show numerically that the algorithm is able to reconstruct non-trivial joint probability distributions of the two inhomogeneous Hamiltonian parameters. A rigorous convergence analysis of the algorithm is also provided.
    • Dataset
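The effect of the static-field inhomogeneity above is easiest to see in the free-precession limit of the Bloch system, where the measured transverse signal is the ensemble average of exp(iΔω t) over the offset distribution, so the signal envelope encodes the distribution itself. A small sketch (our own, assuming a Gaussian offset distribution; unrelated to the SPIRED algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample resonance offsets from an assumed Gaussian B0 inhomogeneity (rad/s)
offsets = rng.normal(loc=0.0, scale=1.0, size=100_000)

def fid(t):
    """Ensemble-averaged transverse magnetization after free precession."""
    return np.mean(np.exp(1j * offsets * t))

# For a Gaussian offset distribution of width sigma the envelope decays
# as exp(-sigma^2 t^2 / 2); here sigma = 1.
signal = abs(fid(1.0))
```

The reconstruction problem the abstract addresses is the inverse of this forward map: choosing control fields so that the measured signals determine the (generally non-Gaussian, joint) distribution of both inhomogeneity parameters.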