Physics-Informed Singular-Value Learning For Cross-Covariance Forecasting In Financial Markets

Seminars

Speaker: Efstratios Manolakis, PhD Candidate

Date: February 17, 2025

Bio:
Efstratios Manolakis originally moved to Germany to compete in professional water polo, playing in the Bundesliga as well as for the youth National team. After finishing school there, he began his studies in Physics, eventually earning both his Bachelor’s and Master’s degrees from the University of Duisburg-Essen, where he specialized in Random Matrix Theory and high-frequency data handling.

He subsequently won a European Scholarship at the University of Catania to pursue a PhD in Complex Systems. Most recently, he spent a year as a Visiting Researcher at CentraleSupélec in Paris. There, he deepened his work on Physics-Informed Neural Networks and assisted in teaching courses on Graph Theory and Python for time-series analysis.


Abstract:
This talk will introduce a cross-covariance estimator that is based on analytical Random Matrix Theory (RMT) [1] and extends it with a physics-informed neural network (PINN) [2] framework. Extending our previous work, we develop a physics-informed architecture that blends denoising and forecasting to address complex, non-stationary dynamics beyond the scope of analytical theory.

In the study of complex systems, characterizing the cross-covariance between distinct sets of variables is a fundamental challenge, one that is exacerbated in high-dimensional regimes by sampling noise. While recent progress in RMT has delivered asymptotically optimal analytical solutions [3, 4] for covariance cleaning, extending these results to rectangular matrices remains an open question. Current research relies on strong stationarity and mesoscopic regularity conditions that are frequently violated in real-world data.
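The sampling-noise problem above is easy to see numerically. The following sketch (an illustration, not taken from the talk) builds two *independent* Gaussian data sets, so the true cross-covariance is exactly zero, yet the empirical cross-covariance has singular values well away from zero when the dimensions are comparable to the sample size:

```python
import numpy as np

# Two independent data sets: true cross-covariance is the zero matrix.
rng = np.random.default_rng(0)
N, M, T = 100, 80, 500            # dimensions comparable to sample size
X = rng.standard_normal((T, N))
Y = rng.standard_normal((T, M))   # independent of X

C_emp = X.T @ Y / T               # empirical cross-covariance (N x M)
s = np.linalg.svd(C_emp, compute_uv=False)

# Pure sampling noise: the top empirical singular value is far from 0.
print(f"largest empirical singular value: {s[0]:.3f}")
```

Every nonzero singular value here is an artifact of finite sampling, which is precisely what a cleaning scheme must suppress.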

Our method operates in the empirical singular-vector basis and defines a nonlinear mapping from empirical singular values to their cleaned counterparts. The architecture recovers the analytical RMT solution as a limiting case while utilizing a PINN to adapt to non-stationary distortions. We demonstrate that this framework achieves systematically lower out-of-sample mean squared errors than purely analytical cleaners across both synthetic benchmarks and decades of real-market equity returns.
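The structure of such an estimator can be sketched as follows. This is a hypothetical interface, not the talk's implementation: the singular vectors of the empirical matrix are kept, and only the singular values are passed through a learned map. In the talk that map is a physics-informed neural network; here a simple soft-threshold stands in as a placeholder.

```python
import numpy as np

def clean_cross_covariance(C_emp, shrink):
    """Keep the empirical singular vectors, replace each empirical
    singular value s_k by a cleaned value shrink(s_k).
    `shrink` is a placeholder for the learned nonlinear map."""
    U, s, Vt = np.linalg.svd(C_emp, full_matrices=False)
    return U @ np.diag(shrink(s)) @ Vt

# Placeholder shrinkage: zero out singular values below a noise level.
def soft_threshold(s, tau=0.5):
    return np.maximum(s - tau, 0.0)

rng = np.random.default_rng(1)
C_emp = rng.standard_normal((50, 40)) / np.sqrt(200)
C_clean = clean_cross_covariance(C_emp, soft_threshold)
```

Because the map acts only on singular values, any cleaned matrix shares the empirical singular-vector basis; the analytical RMT cleaner is recovered whenever `shrink` equals the corresponding analytical formula.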

REFERENCES:
[1] Florent Benaych-Georges, Jean-Philippe Bouchaud, and Marc Potters. Optimal cleaning for singular values of cross-covariance matrices. The Annals of Applied Probability, 33(2):1295–1326, 2023.
[2] Christian Bongiorno, Efstratios Manolakis, and Rosario Nunzio Mantegna. End-to-end large portfolio optimization for variance minimization with neural networks through covariance cleaning. arXiv preprint arXiv:2507.01918, 2025.
[3] Olivier Ledoit and Michael Wolf. Quadratic shrinkage for large covariance matrices. Bernoulli, 28(3):1519–1547, 2022.
[4] Joël Bun, Jean-Philippe Bouchaud, and Marc Potters. Cleaning large correlation matrices: tools from random matrix theory. Physics Reports, 666:1–109, 2017.