Journal of Computational Finance


Non-parametric calibration of jump–diffusion option pricing models

Rama Cont and Peter Tankov

ABSTRACT

We present a non-parametric method for calibrating jump–diffusion and, more generally, exponential Lévy models to a finite set of observed option prices. We show that the usual formulations of the inverse problem via non-linear least squares are ill-posed and propose a regularization method based on relative entropy: we reformulate the calibration problem as that of finding a risk-neutral exponential Lévy model that reproduces the observed option prices and has the smallest possible relative entropy with respect to a chosen prior model. Our approach allows us to reconcile the idea of calibration by relative entropy minimization with the notion of risk-neutral valuation in a continuous-time model. We discuss the numerical implementation of our method using a gradient-based optimization algorithm and show by simulation tests on various examples that the entropy penalty resolves the numerical instability of the calibration problem. Finally, we apply our method to data sets of index options and discuss the empirical results obtained.
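For intuition only, the sketch below illustrates the structure of such a regularized calibration objective: a quadratic pricing-error term plus a relative-entropy penalty with respect to a prior, minimized with a gradient-based optimizer. It is not the authors' implementation and makes several simplifying assumptions: it works within a parametric Merton jump-diffusion family (whereas the paper calibrates the Lévy measure non-parametrically), it keeps the diffusion coefficient fixed at the prior's value, and it retains only the jump part of the relative entropy. All function names, parameter values and the synthetic data are illustrative.

```python
# Minimal sketch (assumptions noted above): regularized calibration of a Merton
# jump-diffusion to call prices by minimizing squared pricing error plus a
# relative-entropy penalty w.r.t. a prior model, via a gradient-based optimizer.
import numpy as np
from math import factorial
from scipy.stats import norm
from scipy.optimize import minimize

def bs_call(S, K, T, r, sigma):
    """Black-Scholes call price."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def merton_call(S, K, T, r, sigma, lam, mu_j, delta_j, n_terms=40):
    """Merton (1976) call price via the classical Poisson-weighted series."""
    k = np.exp(mu_j + 0.5 * delta_j**2) - 1.0      # mean relative jump size
    lam_p = lam * (1.0 + k)
    price = 0.0
    for n in range(n_terms):
        sigma_n = np.sqrt(sigma**2 + n * delta_j**2 / T)
        r_n = r - lam * k + n * (mu_j + 0.5 * delta_j**2) / T
        weight = np.exp(-lam_p * T) * (lam_p * T)**n / factorial(n)
        price += weight * bs_call(S, K, T, r_n, sigma_n)
    return price

def jump_relative_entropy(T, lam, mu_j, delta_j, lam0, mu0, delta0):
    """Jump part of the relative entropy over horizon T between two Merton models
    with the same diffusion coefficient (closed form for Gaussian jump measures)."""
    kl_normal = (np.log(delta0 / delta_j)
                 + (delta_j**2 + (mu_j - mu0)**2) / (2.0 * delta0**2) - 0.5)
    return T * (lam * np.log(lam / lam0) - lam + lam0 + lam * kl_normal)

def calibrate(strikes, maturities, market_prices, S, r,
              prior=(0.2, 0.1, -0.1, 0.15), alpha=1e-2):
    """Least-squares pricing error plus entropy penalty; sigma fixed at the prior's."""
    sigma0, lam0, mu0, delta0 = prior
    T_ref = np.mean(maturities)

    def objective(theta):
        lam, mu_j, delta_j = theta
        model = np.array([merton_call(S, K, T, r, sigma0, lam, mu_j, delta_j)
                          for K, T in zip(strikes, maturities)])
        pricing_error = np.sum((model - market_prices)**2)
        penalty = jump_relative_entropy(T_ref, lam, mu_j, delta_j, lam0, mu0, delta0)
        return pricing_error + alpha * penalty

    return minimize(objective, x0=[lam0, mu0, delta0], method="L-BFGS-B",
                    bounds=[(1e-4, 5.0), (-1.0, 1.0), (1e-3, 1.0)])

if __name__ == "__main__":
    # Synthetic test: recover parameters from prices generated by a "true" model.
    S, r = 100.0, 0.02
    strikes = np.array([80, 90, 100, 110, 120], dtype=float)
    maturities = np.full_like(strikes, 0.5)
    true = dict(sigma=0.2, lam=0.3, mu_j=-0.2, delta_j=0.2)
    prices = np.array([merton_call(S, K, T, r, **true)
                       for K, T in zip(strikes, maturities)])
    res = calibrate(strikes, maturities, prices, S, r)
    print("calibrated (lambda, mu_j, delta_j):", res.x)
```

The penalty weight `alpha` plays the role of the regularization parameter: with `alpha = 0` the problem reduces to the ill-posed non-linear least squares, while larger values pull the solution towards the prior and stabilize the optimization.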
