Journal of Computational Finance
ISSN: 1460-1559 (print), 1755-2850 (online)
Editor-in-chief: Christoph Reisinger
Automatic adjoint differentiation for special functions involving expectations
Need to know
- We propose effective AAD algorithms for certain functions involving expectations.
- Rigorous mathematical proofs for convergence of the algorithms are provided.
- Methods are fully implemented and the technique is applied to calibrate European options.
Abstract
In this paper we explain how to compute gradients of functions of the form $G = \frac{1}{2}\sum_{i=1}^{m}(\mathbb{E}y_i - C_i)^2$, which often appear in the calibration of stochastic models, using automatic adjoint differentiation and parallelization. We expand on the work of Goloubentsev and Lakshtanov and give approaches that are faster and easier to implement. We also provide an implementation of our methods and apply the technique to calibrate European options.
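By the chain rule, the gradient has the structure $\nabla_\theta G = \sum_{i=1}^{m}(\mathbb{E}y_i - C_i)\,\nabla_\theta \mathbb{E}y_i$, so a single reverse (adjoint) sweep seeded with the residuals $\mathbb{E}y_i - C_i$ recovers the full gradient. The sketch below illustrates this idea with Monte Carlo estimates of the expectations and reverse-mode differentiation in JAX; it is not the authors' implementation, and the payoff model and names such as `simulate_payoffs` are hypothetical placeholders.

```python
# Minimal sketch (assumed setup, not the paper's code): E[y_i] is replaced by a
# Monte Carlo average over simulated paths, and the gradient of G with respect
# to the model parameters theta is obtained by reverse-mode differentiation.
import jax
import jax.numpy as jnp

def simulate_payoffs(theta, normals):
    """Hypothetical pathwise payoffs y_i(theta): a toy lognormal model priced
    at m strikes, vectorised over n simulated standard normals."""
    s0, sigma = theta
    strikes = jnp.array([0.9, 1.0, 1.1])                     # m = 3 strikes
    st = s0 * jnp.exp(sigma * normals - 0.5 * sigma**2)      # terminal prices
    return jnp.maximum(st[:, None] - strikes[None, :], 0.0)  # shape (n, m)

def objective(theta, normals, C):
    """G = 1/2 * sum_i (E[y_i] - C_i)^2, with E[.] approximated by a sample mean."""
    Ey = simulate_payoffs(theta, normals).mean(axis=0)       # approx E[y_i]
    return 0.5 * jnp.sum((Ey - C) ** 2)

key = jax.random.PRNGKey(0)
normals = jax.random.normal(key, (100_000,))
C = jnp.array([0.15, 0.08, 0.04])                            # target prices C_i
theta = jnp.array([1.0, 0.2])                                 # (spot, vol)

# One backward sweep yields the full gradient of G with respect to theta.
grad_G = jax.grad(objective)(theta, normals, C)
print(grad_G)
```

In this sketch the adjoints of the sample means are exactly the residuals $\mathbb{E}y_i - C_i$, which the backward sweep then propagates through the simulated payoffs; the paper's contribution concerns doing this efficiently and in parallel for stochastic-model calibration.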
Copyright Infopro Digital Limited. All rights reserved.