GPU-accelerated adjoint algorithmic differentiation

Published: 1 March 2016 | Version 1 | DOI: 10.17632/w43rdsfm46.1
Contributors:
Felix Gremse, Andreas Höfter, Lukas Razik, Fabian Kiessling, Uwe Naumann

Description

Abstract

Many scientific problems, such as classifier training or medical image reconstruction, can be expressed as the minimization of differentiable real-valued cost functions and solved with iterative gradient-based methods. Adjoint algorithmic differentiation (AAD) enables automated computation of gradients of such cost functions implemented as computer programs. To backpropagate adjoint derivatives, excessive memory is potentially required to store the intermediate partial derivatives on a dedicated da...

Title of program: AD-GPU
Catalogue Id: AEYX_v1_0

Nature of problem

Gradients are required for many optimization problems, e.g. classifier training or nonlinear image reconstruction. Often, the function of which the gradient is required can be implemented as a computer program. Then, algorithmic differentiation methods can be used to compute the gradient. Depending on the approach, this may result in excessive requirements of computational resources, i.e. memory and arithmetic operations. GPUs provide massive computational resources but require special consid...

Versions of this program held in the CPC repository in Mendeley Data

AEYX_v1_0; AD-GPU; 10.1016/j.cpc.2015.10.027

This program has been imported from the CPC Program Library held at Queen's University Belfast (1969-2018).
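The core mechanism named in the abstract, recording intermediate partial derivatives on a dedicated structure (a "tape") and backpropagating adjoints through it, can be illustrated with a short self-contained sketch. The sketch below is illustrative only; the types (Tape, Entry) and the example function are hypothetical and do not come from the AD-GPU package.

```cpp
// Minimal sketch of tape-based adjoint (reverse-mode) algorithmic
// differentiation in plain C++. Hypothetical names; not the AD-GPU API.
#include <cmath>
#include <cstdio>
#include <vector>

struct Entry {            // one recorded elementary operation
    int lhs;              // index of the result variable
    int args[2];          // indices of up to two inputs (-1 if unused)
    double partials[2];   // local partial derivatives d(lhs)/d(arg)
};

struct Tape {
    std::vector<double> values;   // primal values
    std::vector<Entry> entries;   // recorded operations

    int variable(double v) {      // register an independent variable
        values.push_back(v);
        return (int)values.size() - 1;
    }
    int record(double v, int a, double da, int b = -1, double db = 0.0) {
        int i = variable(v);      // record result plus its local partials
        entries.push_back({i, {a, b}, {da, db}});
        return i;
    }
    // Interpret the tape backwards, accumulating adjoints.
    std::vector<double> adjoints(int output) {
        std::vector<double> adj(values.size(), 0.0);
        adj[output] = 1.0;        // seed d(output)/d(output) = 1
        for (auto it = entries.rbegin(); it != entries.rend(); ++it)
            for (int k = 0; k < 2; ++k)
                if (it->args[k] >= 0)
                    adj[it->args[k]] += it->partials[k] * adj[it->lhs];
        return adj;
    }
};

int main() {
    // f(x, y) = sin(x) * y; expect df/dx = cos(x)*y and df/dy = sin(x)
    Tape t;
    int x = t.variable(1.5), y = t.variable(2.0);
    int s = t.record(std::sin(t.values[x]), x, std::cos(t.values[x]));
    int f = t.record(t.values[s] * t.values[y], s, t.values[y], y, t.values[s]);
    std::vector<double> adj = t.adjoints(f);
    std::printf("df/dx = %f (expected %f)\n", adj[x], std::cos(1.5) * 2.0);
    std::printf("df/dy = %f (expected %f)\n", adj[y], std::sin(1.5));
    return 0;
}
```

Note that the tape grows by one entry per elementary operation of the primal program, which is exactly the memory pressure the abstract alludes to and the kind of resource constraint that makes a GPU implementation non-trivial.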

Categories

Computer Hardware, Software, Programming Languages, Computational Physics, Computational Method
