The core objective of this challenge is to predict future call and put option prices by implementing a Quantum Machine Learning approach. Given the provided dataset, our task is to develop a model that leverages quantum computing to achieve accurate price forecasting.
We started with an exploratory analysis and found 250 columns. Aside from the date, the rest are configurations of tenor and maturity, which essentially define the volatility surface grid.
We realized we are dealing with a high-dimensional variable. However, we also noted specific technical constraints for our implementation: in simulation we are limited to at most 20 modes and 10 photons.
Note: The full EDA process and data visualization is in the following notebook: eda.ipynb
We realized that call and put prices depend directly on implied volatility through the Black-76 formula.
Call Price:

$$C = e^{-r\tau}\left[F_t\,N(d_1) - K\,N(d_2)\right]$$

Put Price:

$$P = e^{-r\tau}\left[K\,N(-d_2) - F_t\,N(-d_1)\right]$$

with

$$d_1 = \frac{\ln(F_t/K) + \tfrac{1}{2}\hat{\sigma}^2\tau}{\hat{\sigma}\sqrt{\tau}}, \qquad d_2 = d_1 - \hat{\sigma}\sqrt{\tau},$$

where:

- $F_t$ = forward price,
- $K$ = strike,
- $r$ = risk-free rate,
- $\tau$ = time to option maturity,
- $\hat{\sigma}$ = predicted implied volatility,
- $N(\cdot)$ = standard normal CDF.
Although other factors are involved, volatility is the only non-observable variable that ultimately dictates the final price. Therefore, we understood that the core task was not to predict the price itself, but to first solve the forecasting of the volatility surface and then simply apply the financial formula.
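With a predicted $\hat{\sigma}$ in hand, the pricing step is just an evaluation of the formulas above. A minimal, self-contained Black-76 implementation (the standard textbook formulas, not code taken from our notebooks) looks like:

```python
import math


def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function (no SciPy needed)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))


def black76_price(F: float, K: float, r: float, tau: float,
                  sigma: float, call: bool = True) -> float:
    """Black-76 price of a European option on a forward.

    F: forward price, K: strike, r: risk-free rate,
    tau: time to maturity (years), sigma: (predicted) implied volatility.
    """
    s = sigma * math.sqrt(tau)
    d1 = (math.log(F / K) + 0.5 * sigma ** 2 * tau) / s
    d2 = d1 - s
    df = math.exp(-r * tau)  # discount factor e^{-r tau}
    if call:
        return df * (F * norm_cdf(d1) - K * norm_cdf(d2))
    return df * (K * norm_cdf(-d2) - F * norm_cdf(-d1))
```

A quick sanity check is put-call parity, $C - P = e^{-r\tau}(F_t - K)$, which holds exactly for these formulas.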
We define our mathematical problem as the evolution of a discretized implied volatility surface, represented at each date $t$ by a vector of implied volatilities over the 250 tenor-maturity grid points.
Financial markets are inherently time-dependent and influenced by external information, meaning the volatility surface does not evolve as a closed autonomous system. Instead, it is more naturally described as an open dynamical system where future states depend on variables that are not always directly observable.
We model this evolution using the state-space equations

$$s(t+1) = F\big(s(t),\, u(t)\big), \qquad \hat{\sigma}(t) = G\big(s(t)\big)$$

In this framework:

- $s(t)$ is a latent internal market state,
- $u(t)$ represents external inputs,
- $F$ governs the temporal dynamics,
- $G$ maps the latent state to observable implied volatilities.
This formulation directly motivates the use of Quantum Reservoir Computing since reservoir architectures are specifically designed to approximate open dynamical systems with memory.
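To make the state-space picture concrete, here is a purely classical echo-state sketch (an illustrative stand-in with toy sizes and synthetic data, not the quantum reservoir itself): the dynamics $F$ are fixed and random, and only the readout $G$ is fitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_in, n_out, T = 50, 5, 3, 200   # toy sizes, not the real 250-dim surface

# Fixed random dynamics F(s, u) = tanh(W_s s + W_u u); never trained.
W_s = 0.5 * rng.standard_normal((n_state, n_state)) / np.sqrt(n_state)
W_u = rng.standard_normal((n_state, n_in))

u = rng.standard_normal((T, n_in))        # external inputs u(t)
s = np.zeros(n_state)                     # latent state s(t)
states = np.empty((T, n_state))
for t in range(T):
    s = np.tanh(W_s @ s + W_u @ u[t])     # s(t+1) = F(s(t), u(t))
    states[t] = s

# Only the readout G is trained: linear least squares onto the targets.
targets = rng.standard_normal((T, n_out))  # stand-in for observed implied vols
G, *_ = np.linalg.lstsq(states, targets, rcond=None)
pred = states @ G                          # sigma_hat(t) = G(s(t))
```

This is exactly the division of labor a reservoir exploits: rich fixed dynamics with memory, plus a cheap trained readout.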
To model the complex dynamics of the volatility surface, we explored two distinct approaches based on how we handle the input data for the quantum system.
Hybrid Approach A processes the original 250 dimensions to capture global correlations, whereas Hybrid Approach B uses PCA to compress the data for efficient processing.
Note: The full Hybrid Approach A implementation is in the following notebook: qml_regression.ipynb
3.1.1 Quantum Reservoir Computing
To model nonlinear temporal dependencies, we use a Quantum Reservoir Computing framework.
The reduced input $Z_t$ drives the evolution of the reservoir state:

$$\rho(t+1) = U(Z_t)\,\rho(t)\,U^\dagger(Z_t)$$

where:

- $\rho(t)$ is the quantum state of the reservoir,
- $U(Z_t)$ is a unitary transformation dependent on the input,
- $U^\dagger$ is the adjoint operator.
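A toy numerical version of this evolution (a generic small-dimensional unitary as a stand-in, not the photonic Fock-space evolution Merlin actually performs) can be written as:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4  # toy Hilbert-space dimension (stand-in for the photonic state space)

# Fixed Hermitian generator H; the input z only scales the evolution U(z) = exp(-i z H).
A = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
H = (A + A.conj().T) / 2
w, V = np.linalg.eigh(H)  # eigendecomposition lets us exponentiate exactly


def U(z: float) -> np.ndarray:
    """Input-dependent unitary U(z) = V diag(exp(-i z w)) V^dagger."""
    return (V * np.exp(-1j * z * w)) @ V.conj().T


# Start in a pure state |0><0| and drive the reservoir with an input sequence.
rho = np.zeros((d, d), dtype=complex)
rho[0, 0] = 1.0
for z in [0.3, -1.2, 0.7]:
    Uz = U(z)
    rho = Uz @ rho @ Uz.conj().T  # rho(t+1) = U(z) rho(t) U(z)^dagger

# Measurement probabilities (diagonal of rho) serve as the reservoir features.
features = np.real(np.diag(rho))
```

Because each step is unitary, the trace of $\rho$ (total probability) is preserved, which is a useful invariant to check in any implementation.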
In this hackathon, this evolution is implemented using the Merlin photonic quantum framework, which abstracts away the explicit construction of interferometers, unitary operators, state evolution, and measurement processes.
Instead of manually defining the full quantum circuit and its mathematical formalism, we rely on Merlin’s high-level quantum layer to handle the internal photonic transformations and feature extraction, allowing us to focus on the hybrid quantum–classical modeling strategy rather than the low-level quantum implementation details.
Of everything Merlin offers, our implementation uses the QuantumLayer to run the quantum state evolution abstractly. This layer integrates Quandela's photonic hardware directly into PyTorch, letting us transform the classical input features into quantum-derived features inside a standard training pipeline.
Our architecture also implements a QR2 model, leveraging Merlin's capabilities to run multiple quantum reservoirs in parallel. This ensemble configuration, optimized through lexgrouping, processes the temporal dynamics of volatility far more robustly than a single reservoir. By integrating this logic within the QuantumLayer, we delegate the management of photonic interference and high-dimensional feature extraction to Quandela's hardware while maintaining full compatibility with the PyTorch ecosystem.
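The ensemble wiring can be sketched in plain PyTorch. `MockReservoir` below is a hypothetical frozen feature map standing in for Merlin's QuantumLayer (its name, sizes, and `cos` nonlinearity are our illustrative assumptions, not Merlin's API); only the structure reflects the design: several fixed reservoirs in parallel, one trained linear readout.

```python
import torch
import torch.nn as nn


class MockReservoir(nn.Module):
    """Stand-in for a quantum reservoir layer: a frozen random nonlinear map.

    The real layer would evolve a photonic state; here we only mimic the
    interface (a fixed, untrained feature extractor).
    """

    def __init__(self, n_in: int, n_feat: int, seed: int):
        super().__init__()
        g = torch.Generator().manual_seed(seed)
        # Buffer, not Parameter: the reservoir weights are never trained.
        self.register_buffer("W", torch.randn(n_in, n_feat, generator=g))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.cos(x @ self.W)  # frozen nonlinear features


class EnsembleQRC(nn.Module):
    """Several reservoirs in parallel; only the linear readout is trained."""

    def __init__(self, n_in: int, n_feat: int, n_res: int, n_out: int):
        super().__init__()
        self.reservoirs = nn.ModuleList(
            MockReservoir(n_in, n_feat, seed=k) for k in range(n_res)
        )
        self.readout = nn.Linear(n_feat * n_res, n_out)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([r(x) for r in self.reservoirs], dim=-1)
        return self.readout(feats)


model = EnsembleQRC(n_in=8, n_feat=16, n_res=4, n_out=3)
y = model(torch.randn(10, 8))
```

Note that the only trainable parameters live in `readout`, which is what keeps reservoir ensembles cheap to fit.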
Note: The full Hybrid Approach B implementation is in the following notebook: Hybrid_QRC_Model.ipynb
3.2.1 PCA
Directly embedding a 250-dimensional input into a quantum circuit is impractical. We use PCA to compress the volatility surface into a low-dimensional vector that respects the 20-mode, 10-photon constraint of the simulation. This ensures the quantum reservoir processes high-density information without the noise of redundant features, leading to more stable and faster forecasting.
Note: The full PCA implementation is in the following notebook: quantum_feature_engineering.ipynb
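A minimal PCA sketch via SVD (synthetic data; the component count $k = 10$ and the data shapes are our illustrative assumptions, chosen to stay within the photon budget, and the notebook's actual choices may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
surface = rng.standard_normal((500, 250))  # synthetic: 500 dates x 250 grid points

# PCA via SVD of the centered data; keep k components so the reduced
# vector Z_t fits the 20-mode / 10-photon simulation budget.
k = 10
mean = surface.mean(axis=0)
U, S, Vt = np.linalg.svd(surface - mean, full_matrices=False)
Z = (surface - mean) @ Vt[:k].T            # reduced inputs Z_t, shape (500, k)
reconstructed = Z @ Vt[:k] + mean          # inverse transform, for sanity checks
explained = (S[:k] ** 2).sum() / (S ** 2).sum()  # fraction of variance kept
```

The explained-variance ratio is the natural diagnostic here: it quantifies how much surface information survives the compression into the quantum-feasible dimension.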
3.2.2 Quantum Reservoir Computing
The QRC serves as a high-dimensional feature extractor. By processing data through the reservoir's dynamics, the system transforms the input into a complex quantum space, capturing non-linear temporal dependencies in the volatility data.
3.2.3 Linear Regression
The Linear Regression acts as the final readout layer, translating high-dimensional quantum features into financial predictions.
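Such a readout can be fitted in closed form. This sketch assumes a ridge-regularized least-squares readout on synthetic reservoir features (an illustration of the readout step, not the notebook's exact pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)
n, n_feat, n_out = 300, 40, 5
X = rng.standard_normal((n, n_feat))                     # reservoir features
true_W = rng.standard_normal((n_feat, n_out))            # synthetic ground truth
Y = X @ true_W + 0.01 * rng.standard_normal((n, n_out))  # noisy targets

# Ridge readout in closed form: W = (X^T X + lam I)^{-1} X^T Y.
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ Y)
pred = X @ W
```

Keeping the readout linear means training reduces to one linear solve, so all the model's non-linearity comes from the reservoir itself.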
- Rocio Lizeth Valentin Carhuancho
- Ariana Camila Lopez Julcarima
- Jose Alessandro Quispe Caballero
- Dayana Gomez Rodriguez
- GitHub Repository: https://github.com/RocioValentin/QML-Hackathon.git



