Journal of Computational Finance

Risk.net

The efficient application of automatic differentiation for computing gradients in financial applications

Wei Xu, Xi Chen and Thomas F. Coleman

  • Efficiently applying automatic differentiation in finance applications.
  • Fast computation of Greeks and gradients in finance applications.
  • Memory-efficient implementation of automatic differentiation for calibration and Monte Carlo applications.

ABSTRACT

Automatic differentiation (AD) is a practical field of computational mathematics that is of growing interest across many industries, including finance. Reverse-mode AD is particularly attractive, since it computes the full gradient in time proportional to a single evaluation of the objective function itself. However, its memory requirements can be excessive. This can make reverse-mode AD infeasible in some cases (depending on the function complexity and the available RAM) and slower than expected in others, owing to the use of secondary memory and nonlocalized memory references. It turns out, though, that many complex (expensive) functions in finance exhibit a natural substitution structure. In this paper, we illustrate this structure as it arises in computational finance, both in calibration and inverse problems and in determining Greeks in a Monte Carlo setting. In these cases, the required memory is a small fraction of that required by reverse-mode AD, while the time complexity is the same. In fact, our results indicate a significant realized speedup compared with straight reverse-mode AD.
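To make the abstract's central claim concrete, the following is a minimal, self-contained sketch of reverse-mode AD: the forward pass records each elementary operation and its local derivatives on a tape, and a single reverse sweep over that tape yields the entire gradient at a cost proportional to one function evaluation. This is an illustrative toy, not the authors' implementation; the names `Var` and `backward` are hypothetical, and the growing tape is exactly the memory cost the paper's substitution structure is designed to reduce.

```python
import math

class Var:
    """One node of a minimal reverse-mode AD tape (illustrative sketch only)."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs (parent_node, local_partial_derivative)
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(uv)/du = v, d(uv)/dv = u
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def exp(self):
        v = math.exp(self.value)
        return Var(v, ((self, v),))  # d(e^u)/du = e^u

def backward(root):
    """Single reverse sweep over the tape: accumulates d(root)/d(node)
    into node.grad for every node, in one pass of the same order of
    cost as the forward evaluation."""
    topo, seen = [], set()
    def visit(v):                      # topological order of the tape
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                visit(p)
            topo.append(v)
    visit(root)
    root.grad = 1.0
    for v in reversed(topo):           # chain rule, output to inputs
        for p, local in v.parents:
            p.grad += local * v.grad

# Example: f(x, y) = x*y + exp(x); then df/dx = y + exp(x), df/dy = x.
x, y = Var(1.0), Var(2.0)
f = x * y + x.exp()
backward(f)
```

Note that every intermediate `Var` must stay alive until the reverse sweep runs: for a long Monte Carlo path or a deep calibration loop, this tape is what makes straight reverse-mode AD memory-hungry.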
