The Topology and Complexity of Deep Computations (Part I)
The goal of this talk is to show how the asymptotic limits of computational models, ranging from deep neural networks to numerical approximation algorithms, can be rigorously classified using techniques from Cp-theory.
Specifically, this two-part presentation aims to demonstrate that the topological closure of certain families of computations forms a Rosenthal compactum, the structure of which dictates the learnability and stability of the underlying algorithms.
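For orientation, the standard background is sketched in LaTeX below; these are classical definitions and results, not the speakers' own formulations.

% Background only; $X$ is a Polish space and $B_1(X)$ denotes the space of
% Baire class 1 functions on $X$ (pointwise limits of sequences of
% continuous functions), with the topology of pointwise convergence.
A compact space $K$ is a \emph{Rosenthal compactum} if it embeds as a
compact subspace of $B_1(X)$ for some Polish space $X$.

% The dichotomy driving this kind of classification, in its pointwise form
% for a compact metric space $X$:
\textbf{Rosenthal's $\ell^1$-dichotomy.} Every uniformly bounded sequence
$(f_n)$ in $C(X)$ has either a pointwise convergent subsequence (whose
limit is then of Baire class 1), or a subsequence equivalent to the unit
vector basis of $\ell^1$.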
Part I focuses on exhibiting a sort of "Rosetta Stone" between Topology, Model Theory, and Statistical Learning, showing that notions such as Baire class 1, NIP, and PAC learnability are not merely analogous but are different manifestations of a single underlying classification phenomenon.
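One classical form of this dictionary is sketched in LaTeX below; it is a hedged rendering of standard results (Laskowski's NIP/VC equivalence, the fundamental theorem of statistical learning, and Rosenthal's dichotomy), not the speakers' formulation, and the precise statements in the talk may differ.

% A hedged sketch. Fix a formula $\varphi(x;y)$ (with $x$ a single
% variable for simplicity) and a model $M$, and consider the
% $\{0,1\}$-valued family
\[
  \mathcal{F}_\varphi \;=\; \{\, a \mapsto \varphi(a;b) \;:\; b \in M \,\}
  \;\subseteq\; \{0,1\}^{M}.
\]
% Then, classically (for definable families, compactness aligns the finite
% and infinite versions of these conditions; for arbitrary families they
% can diverge):
\begin{itemize}
  \item \textbf{Model theory:} $\varphi$ is NIP in $\operatorname{Th}(M)$
        iff $\operatorname{VC}(\mathcal{F}_\varphi) < \infty$ (Laskowski).
  \item \textbf{Learning:} $\operatorname{VC}(\mathcal{F}_\varphi) < \infty$
        iff $\mathcal{F}_\varphi$ is PAC learnable (the fundamental theorem
        of statistical learning).
  \item \textbf{Topology:} $\mathcal{F}_\varphi$ contains no infinite
        Boolean independent sequence iff every sequence in
        $\mathcal{F}_\varphi$ has a pointwise convergent subsequence
        (Rosenthal); when the members of $\mathcal{F}_\varphi$ are
        continuous on a compact metric space, such limits are of Baire
        class 1 and the pointwise closure is a Rosenthal compactum
        (Bourgain--Fremlin--Talagrand).
\end{itemize}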
This is joint work with Eduardo Dueñez, José Iovino, Luciano Salvetti, and Franklin D. Tall.

