Authored by Davide Gianatti, Serena Manti and Gianluca Molteni of the Financial Engineering and A.I. team at List.
The aim of this post is to introduce a novel systematic approach to quickly calibrate any model describing interest rates. The core of the algorithm is a Neural Network (NN) that outputs the parameters of the model under study. In particular, we focus on the HJM-FMM model, already described in a previous post.
The target of a calibration process is to find the set of parameters that best fits the market data, such as term structures or volatility matrices. Standard methods, such as the Levenberg-Marquardt (LM) algorithm, often involve a heavy computational burden, which can be bypassed by a Neural Network trained to recognize the optimal parameter values given the market data.
In fact, in this post we will show that, by exploiting a Neural Network, we can reduce the average computational time from 35.4 seconds to 0.05 seconds.
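To make the contrast concrete, here is a minimal sketch of the classical approach the network replaces: an iterative Levenberg-Marquardt fit of model parameters to market quotes. The two-parameter pricing function below is a toy stand-in, not the actual HJM-FMM pricer; the point is that every LM iteration re-prices the full instrument set, which is what makes classical calibration slow for realistic models.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy pricing model: maps parameters (a, sigma) to a vector of model
# prices. A stand-in for the actual HJM-FMM swaption pricer.
def model_prices(params, maturities):
    a, sigma = params
    return sigma * (1.0 - np.exp(-a * maturities)) / a

maturities = np.array([1.0, 2.0, 5.0, 10.0])
# Synthetic "market" quotes generated from known parameters (0.1, 0.02).
market_prices = model_prices([0.1, 0.02], maturities)

# Levenberg-Marquardt: iteratively minimize the pricing residuals,
# re-pricing the whole instrument set at every step.
result = least_squares(
    lambda p: model_prices(p, maturities) - market_prices,
    x0=[0.05, 0.01],
    method="lm",
)
print(result.x)  # recovered parameters, close to (0.1, 0.02)
```

A trained Neural Network replaces this whole loop with a single forward pass over the market data.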
The algorithm we have developed is structured in two main parts: the generation of the training set and the training of the Neural Network.
Regarding the former, it is well known that Machine Learning methods are data hungry: large datasets are often required to achieve satisfactory performance (applications that need billions of samples are countless).
To generate our training set, we have designed an algorithm that simulates, starting from today, the evolution of the model parameters over the next market days. Each sample in our training set consists of a yield curve, a swaption volatility matrix and the optimal parameters of the model at issue, and is created with the algorithm above so that the synthetic data set resembles today's market state.
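The sample structure described above can be sketched as follows. Everything here is illustrative: the random-walk parameter evolution and the `reprice` function are placeholders for the actual evolution algorithm and model pricer, and the dimensions are made up; only the shape of a sample, market data paired with the parameters that generated it, reflects the post.

```python
import numpy as np

rng = np.random.default_rng(0)

N_TENORS, N_EXPIRIES = 12, 6  # illustrative sizes

def evolve_params(params_today, n_days, vol=0.01):
    """Simulate one path of the model parameters over the next market
    days as a small random walk (a stand-in for the actual evolution
    algorithm described in the post)."""
    steps = vol * rng.standard_normal((n_days, params_today.size))
    return params_today + np.cumsum(steps, axis=0)

def reprice(params):
    """Placeholder for the model: maps parameters to a yield curve and
    a swaption volatility matrix."""
    curve = 0.02 + 0.001 * params.sum() * np.linspace(0.5, 1.5, N_TENORS)
    vols = 0.2 + 0.01 * params.mean() * rng.random((N_EXPIRIES, N_TENORS))
    return curve, vols

def make_training_set(params_today, n_samples, horizon_days=20):
    """Each sample: (yield curve, vol matrix) as NN inputs, paired with
    the parameters that generated them as NN targets."""
    X, y = [], []
    for _ in range(n_samples):
        path = evolve_params(params_today, horizon_days)
        params = path[rng.integers(horizon_days)]  # pick a day on the path
        curve, vols = reprice(params)
        X.append(np.concatenate([curve, vols.ravel()]))
        y.append(params)
    return np.array(X), np.array(y)

params_today = np.array([0.1, 0.02, 0.5, 0.03])
X, y = make_training_set(params_today, n_samples=1000)
print(X.shape, y.shape)  # (1000, 84) and (1000, 4)
```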
The generation of the training set and the training of the network can be done when the market is closed. For example, on Saturday we make the market data of Friday evolve for one week multiple times and we train the network on all those possible evolutions. On Monday, when the market is open, we will have our network ready to calibrate the model all week long.
As for the training of the Neural Network, it is useful to imagine an ideal function that maps the market data into the optimal model parameters; the training of the network can then be seen simply as approximating this function through a fitting process.
The Neural Network that was found to best serve our purposes is a multilayer perceptron trained to minimize the mean absolute error of the parameters over the whole training set.
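The training objective can be sketched with a minimal one-hidden-layer perceptron fitted under the mean absolute error, as in the post. The data, layer sizes and learning rate below are all made up for illustration; a production version would use a deep-learning framework rather than hand-written NumPy.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data standing in for (market data -> parameters):
# X are flattened curve/vol features, Y the target model parameters.
X = rng.standard_normal((512, 16))
Y = np.tanh(X @ rng.standard_normal((16, 4)))

# One hidden layer with ReLU; sizes are illustrative.
W1 = 0.1 * rng.standard_normal((16, 32)); b1 = np.zeros(32)
W2 = 0.1 * rng.standard_normal((32, 4));  b2 = np.zeros(4)

def forward(x):
    h = np.maximum(0.0, x @ W1 + b1)
    return h, h @ W2 + b2

def mae(pred, target):
    return np.abs(pred - target).mean()

mae_before = mae(forward(X)[1], Y)

lr = 1e-2
for _ in range(200):
    h, pred = forward(X)
    # Gradient of the mean absolute error is sign(error) / N.
    g = np.sign(pred - Y) / len(X)
    gh = (g @ W2.T) * (h > 0)          # backprop through ReLU
    W2 -= lr * h.T @ g;  b2 -= lr * g.sum(0)
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(0)

mae_after = mae(forward(X)[1], Y)
print(f"MAE before: {mae_before:.3f}, after: {mae_after:.3f}")
```

Once trained, calibration reduces to one call of `forward` on the current market data, which is where the speed-up comes from.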
We selected three dates within our historical data set and, exploiting the past market data, created three corresponding training sets representing different market regimes. Then, on each set, we trained a network capable of reproducing the model parameters over the near-future market evolution.
The three dates are listed below, along with our reasons for choosing them.
- 26th November 2021. The COVID containment policies were not yet over, and overnight interest rates were kept rather stable around -0.5% in order to revitalize the economy.
- 22nd July 2022. After the outbreak of the war in Ukraine and the rise in inflation, central banks decided to raise interest rates; this date falls just a few days before the ECB announced the first interest rate hike in Europe.
- 28th April 2023. The most recent period at our disposal, characterized by high interest rates and another upward adjustment following ECB decisions.
We simulated the market evolution over one month (20 business days), to check whether the algorithm remained stable over periods longer than a single week, and we generated relatively small data sets of 200,000 samples each.
The table below compares the prices obtained through the Neural Network with the market prices on the tenth market day of each time frame.
Figures 2, 3 and 4 show the swaption pricing errors with respect to market prices, and the relative differences between the NN and the LM results, on the same days. The errors produced on the other days of the month are analogous. We also report the computational time of the model parameter calibration using the NN and LM.
In this post we have shown a novel approach to efficiently generate huge synthetic data sets resembling the evolution of the market over the coming days. This procedure opens up the possibility of training a Neural Network to calibrate a model given the current state of the market.
We showed that the performance of the network is practically equivalent – for the HJM model – to that of standard calibration algorithms, but with a considerable gain in computational time: the neural network is, on average, 708 times faster (0.05 secs against 35.4 secs)!
We believe that the algorithm we propose generalizes effectively to most market models, and that the computational gain would increase dramatically for models whose calibration is far more expensive than the HJM one.