Journal of Computational Finance

It is a pleasure to introduce the latest issue of The Journal of Computational Finance. In it, we once again highlight a wealth of new developments in the field. These include conceptual improvements to model calibration, optimal hedging in illiquid and incomplete markets, and the optimization of the forecasting performance of neural networks using evolutionary algorithms.

In our first paper, “The Chebyshev method for the implied volatility”, Kathrin Glau, Paul Herold, Dilip B. Madan and Christian Pötz introduce a Chebyshev interpolation method for the implied volatility function. A judicious splitting of the domain in the strike and maturity variables, combined with suitable scaling functions, allows the authors to exploit the subexponential convergence that is proven in the paper. Numerical tests demonstrate the method’s robustness and show that it achieves close to machine-precision accuracy.
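
For readers who wish to experiment with the basic ingredient, the short Python sketch below interpolates a smooth, entirely made-up implied-volatility smile with Chebyshev polynomials on a rescaled domain. The smile function, domain and degree are illustrative assumptions; this is not the authors’ implementation, which additionally splits the strike–maturity domain and applies tailored scalings.

```python
# Illustrative sketch only: Chebyshev interpolation of a smooth implied-volatility
# smile on a rescaled domain. The smile function, domain and degree are assumptions
# for demonstration; they are not taken from the paper.
import numpy as np
from numpy.polynomial import chebyshev as C

def smile(k):
    """Placeholder smooth smile: implied vol as a function of log-moneyness k."""
    return 0.2 + 0.1 * k**2 / (1.0 + k**2)

a, b = -2.0, 2.0   # assumed log-moneyness domain
deg = 40           # interpolation degree

# Chebyshev points of the first kind on [-1, 1], mapped to [a, b]
nodes = np.cos((2 * np.arange(deg + 1) + 1) * np.pi / (2 * (deg + 1)))
k_nodes = 0.5 * (b - a) * nodes + 0.5 * (b + a)

coeffs = C.chebfit(nodes, smile(k_nodes), deg)   # interpolating coefficients

# Evaluate the interpolant on a fine grid and check the maximum error
k = np.linspace(a, b, 1001)
approx = C.chebval(2 * (k - a) / (b - a) - 1, coeffs)
print("max abs error:", np.max(np.abs(approx - smile(k))))
```

For a smooth target, the interpolation error decays (sub)exponentially in the degree, which is the effect the paper exploits after splitting and scaling the domain.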

Xavier Warin proposes a regression-based Monte Carlo algorithm for the construction of mean–variance optimal hedging strategies under illiquidity constraints in “Variance optimal hedging with application to electricity markets”, the issue’s second paper. He applies the dynamic programming principle to determine the value function and the associated optimal cashflows. An application to energy markets with load uncertainty illustrates the efficiency of the computed strategy.
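
The generic regression idea behind such algorithms can be illustrated in a few lines: at a rebalancing date, the locally variance-minimizing hedge ratio is a ratio of conditional (co)variances, which can be estimated by projecting simulated quantities onto a polynomial basis of the state. The sketch below uses assumed lognormal dynamics, a call payoff and a cubic basis; it is a generic illustration, not Warin’s algorithm, which handles illiquidity constraints within a full backward dynamic programming recursion.

```python
# Illustrative sketch only: one rebalancing step of a regression-based
# variance-minimizing hedge. The dynamics, payoff and basis are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_paths, dt, sigma, K = 100_000, 1.0 / 12.0, 0.3, 100.0

S_t = 100.0 * np.exp(sigma * np.sqrt(0.5) * rng.standard_normal(n_paths))  # state at t
dS = S_t * (np.exp(sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
                   - 0.5 * sigma**2 * dt) - 1.0)                           # increment to t + dt
V_next = np.maximum(S_t + dS - K, 0.0)   # stand-in for the value at t + dt (here: payoff)

# Polynomial basis in the (rescaled) state variable: [1, s, s^2, s^3]
X = np.vander(S_t / 100.0, 4, increasing=True)

def cond_mean(y):
    """Conditional expectation E[y | S_t] by least-squares projection on the basis."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta

# Locally variance-optimal hedge ratio: Cov(V_{t+dt}, dS | S_t) / Var(dS | S_t)
cov = cond_mean(V_next * dS) - cond_mean(V_next) * cond_mean(dS)
var = cond_mean(dS**2) - cond_mean(dS)**2
phi = cov / var
print("hedge ratio near S_t = 100:", phi[np.argmin(np.abs(S_t - 100.0))])
```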

In the third paper in the issue, “One-dimensional Markov-functional models driven by a non-Gaussian driver”, Jaka Gogala and Joanne Kennedy provide new insights into Markov-functional models for the calibration of interest rate models. The novelty lies in the extension of classical ideas to non-Gaussian processes and the inclusion of copula-based modeling concepts.
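
At the heart of any one-dimensional Markov-functional model is a distribution match: at each date the rate is a monotone functional of the driver, chosen so that model-implied digital prices reproduce those of the market. The sketch below illustrates that quantile-mapping step with an assumed Student-t driver (to make the non-Gaussian point) and an assumed lognormal market law; it is a toy illustration rather than the authors’ calibration procedure.

```python
# Illustrative sketch only: pinning the functional form of a rate to the market
# by a quantile (distribution) match. The Student-t driver and lognormal "market"
# law are assumptions chosen to show a non-Gaussian driver.
import numpy as np
from scipy import stats

driver = stats.t(df=4)                      # non-Gaussian one-dimensional Markov driver
market = stats.lognorm(s=0.25, scale=0.03)  # assumed market-implied law of the rate

def functional(x):
    """Map driver values to rates so that the rate's model law matches the market law."""
    return market.ppf(driver.cdf(x))

x = driver.rvs(size=200_000, random_state=0)
rates = functional(x)
# Model-implied digital prices P(rate > K) now agree with the market distribution
K = 0.035
print("model digital:", np.mean(rates > K), " market digital:", market.sf(K))
```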

In our fourth and final paper, “Ensemble models in forecasting financial markets”, Andreas Karathanasopoulos, Mitra Sovan, Chia Chun Lo, Adam Zaremba and Mohammed Osman compare different evolutionary algorithms for optimizing the architecture of hybrid neural networks in time series forecasting tasks. Systematic numerical studies show the utility of this approach and highlight the best-performing networks.
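
As a toy illustration of the evolutionary idea, the sketch below runs a small genetic algorithm over the hidden-layer size of a random-feature forecaster on a synthetic AR(1) series. The data, fitness function and genetic operators are all assumptions and are far simpler than the hybrid networks and algorithms compared in the paper.

```python
# Illustrative sketch only: a tiny genetic algorithm searching over the hidden-layer
# size of a random-feature ("extreme learning machine"-style) one-step-ahead
# forecaster on a synthetic AR(1) series.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic AR(1) series and lagged design matrix for one-step-ahead forecasting
y = np.zeros(600)
for t in range(1, 600):
    y[t] = 0.8 * y[t - 1] + 0.1 * rng.standard_normal()
X, target = np.column_stack([y[2:-1], y[1:-2], y[:-3]]), y[3:]
split = 400
X_tr, X_te, y_tr, y_te = X[:split], X[split:], target[:split], target[split:]

def fitness(hidden):
    """Validation MSE of a network with a fixed random hidden layer and least-squares output."""
    W = np.random.default_rng(int(hidden)).standard_normal((X.shape[1], int(hidden)))
    H_tr, H_te = np.tanh(X_tr @ W), np.tanh(X_te @ W)
    beta, *_ = np.linalg.lstsq(H_tr, y_tr, rcond=None)
    return np.mean((H_te @ beta - y_te) ** 2)

# Genetic loop: keep the fittest half, average pairs (crossover), perturb (mutation)
pop = rng.integers(2, 64, size=12)
for _ in range(20):
    scores = np.array([fitness(h) for h in pop])
    parents = pop[np.argsort(scores)[:6]]
    children = (parents[rng.integers(0, 6, 6)] + parents[rng.integers(0, 6, 6)]) // 2
    children = np.clip(children + rng.integers(-3, 4, 6), 2, 64)
    pop = np.concatenate([parents, children])
print("best hidden size:", pop[np.argmin([fitness(h) for h in pop])])
```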

As ever, I hope you will find these contributions interesting and useful.

Christoph Reisinger

University of Oxford
