Journal of Computational Finance
ISSN: 1460-1559 (print); 1755-2850 (online)
Editor-in-chief: Christoph Reisinger

Automatic adjoint differentiation for special functions involving expectations
Need to know
- We propose effective AAD algorithms for certain functions involving expectations.
- Rigorous mathematical proofs of the algorithms' convergence are provided.
- The methods are fully implemented, and the technique is applied to the calibration of European options.
Abstract
In this paper we explain how to compute gradients of functions of the form $G = \tfrac{1}{2}\sum_{i=1}^{m}(\mathbb{E}\,y_i - C_i)^2$, which often appear in the calibration of stochastic models, using automatic adjoint differentiation and parallelization. We expand on the work of Goloubentsev and Lakshtanov and give approaches that are faster and easier to implement. We also provide an implementation of our methods and apply the technique to calibrate European options.
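To make the objective concrete, the sketch below shows reverse-mode ("adjoint") differentiation of a function of this form, with each expectation approximated by a Monte Carlo average over simulated paths. It is not the authors' algorithm: the lognormal payoff model, the parameter vector theta, and the strike and target-price values are all illustrative assumptions, and JAX is used here only as a convenient off-the-shelf AAD tool.

```python
# Minimal sketch (assumptions, not the paper's implementation): adjoint
# differentiation of G(theta) = 1/2 * sum_i (E[y_i(theta)] - C_i)^2,
# with E[y_i] estimated by a Monte Carlo average.
import jax
import jax.numpy as jnp

def payoffs(theta, z, strikes):
    # Toy model: S_T = S0 * exp((mu - 0.5*sigma^2) + sigma*z) with S0 = 1,
    # theta = (mu, sigma); y_i is the call payoff at strike K_i.
    mu, sigma = theta
    s_t = jnp.exp((mu - 0.5 * sigma**2) + sigma * z)          # shape (n_paths,)
    return jnp.maximum(s_t[:, None] - strikes[None, :], 0.0)  # shape (n_paths, m)

def G(theta, z, strikes, C):
    # Monte Carlo estimate of each E[y_i], then the least-squares objective.
    Ey = payoffs(theta, z, strikes).mean(axis=0)              # shape (m,)
    return 0.5 * jnp.sum((Ey - C) ** 2)

key = jax.random.PRNGKey(0)
z = jax.random.normal(key, (100_000,))        # simulated standard normals
strikes = jnp.array([0.9, 1.0, 1.1])          # illustrative strikes
C = jnp.array([0.15, 0.08, 0.04])             # illustrative target prices
theta0 = jnp.array([0.05, 0.2])

# Reverse-mode AD returns the full gradient dG/dtheta in a single backward
# sweep, independently of the number of targets m.
value, grad = jax.value_and_grad(G)(theta0, z, strikes, C)
print(value, grad)
```

In a calibration loop, the gradient returned by the backward sweep would be fed to a standard optimizer to drive the model prices toward the targets C_i.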
Copyright Infopro Digital Limited. All rights reserved.
You may share this content using our article tools. Printing this content is for the sole use of the Authorised User (named subscriber), as outlined in our terms and conditions - https://www.infopro-insight.com/terms-conditions/insight-subscriptions/
If you would like to purchase additional rights please email info@risk.net