Advances in Theoretical and Mathematical Physics
Volume 27 (2023)
Number 4
Machine-learned Calabi–Yau metrics and curvature
Pages: 1107 – 1158
DOI: https://dx.doi.org/10.4310/ATMP.2023.v27.n4.a3
Abstract
Finding Ricci-flat (Calabi–Yau) metrics is a long-standing problem in geometry with deep implications for string theory and phenomenology. A new attack on this problem uses neural networks to engineer approximations to the Calabi–Yau metric within a given Kähler class. In this paper we investigate numerical Ricci-flat metrics over smooth and singular K3 surfaces and Calabi–Yau threefolds. Using these Ricci-flat metric approximations for the Cefalú family of quartic twofolds and the Dwork family of quintic threefolds, we study characteristic forms on these geometries. We observe that the stability of the numerically computed topological characteristic depends heavily on the choice of neural network model; in particular, we briefly discuss spectral networks, a different neural network model that correctly approximates the topological characteristic of a Calabi–Yau. Using persistent homology, we show that high-curvature regions of the manifolds form clusters near the singular points. For our neural network approximations, we observe a Bogomolov–Yau-type inequality $3c_2 \geq c_1^2$, and an identity when our geometries have isolated $A_1$-type singularities. We sketch a proof that $\chi(X \setminus \mathrm{Sing}\,X) + 2 \lvert \mathrm{Sing}\,X \rvert = 24$ also holds for our numerical approximations.
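As a brief illustration of the counting behind the last identity (a sketch of the standard resolution argument, not the authors' proof, assuming each singular point is an isolated $A_1$ node whose minimal resolution $\tilde{X}$ is a smooth K3 with $\chi(\tilde{X}) = 24$), write $k = \lvert \mathrm{Sing}\,X \rvert$. Each node is replaced in $\tilde{X}$ by an exceptional $\mathbb{P}^1$, so

$$\chi(X) = \chi(\tilde{X}) - k\,\chi(\mathbb{P}^1) + k = 24 - 2k + k = 24 - k,$$

$$\chi(X \setminus \mathrm{Sing}\,X) = \chi(X) - k = 24 - 2k, \qquad \text{hence} \qquad \chi(X \setminus \mathrm{Sing}\,X) + 2\,\lvert \mathrm{Sing}\,X \rvert = 24.$$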
Published 6 June 2024