UC Berkeley, Department of Statistics.
6 July 2023, 4:00pm - 5:00pm (GST)
Title: Optimal PAC Bounds without Uniform Convergence
In statistical learning theory, determining the sample complexity of realizable binary classification for VC classes was a long-standing open problem. The results of Simon and Hanneke established sharp upper bounds in this setting. However, the reliance of their argument on the uniform convergence principle limits its applicability to more general learning settings such as multiclass classification. In this talk, we will discuss a simple technique that addresses this issue. We will present optimal high probability risk bounds through a framework that surpasses the limitations of uniform convergence arguments.
In addition to binary classification, we will see applications in settings where uniform convergence is provably suboptimal. For multiclass classification, we prove an optimal risk bound scaling with the one-inclusion hypergraph density of the class, addressing the suboptimality of the analysis by Daniely and Shalev-Shwartz.
Based on joint work with Ishaq Aden-Ali, Yeshwanth Cherapanamjeri and Abhishek Shetty.
Nikita Zhivotovskiy is a Tenure-Track Assistant Professor at the University of California, Berkeley. He pursued his undergraduate and doctoral studies at the Moscow Institute of Physics and Technology (MIPT), completing them in 2018. Before his position at UC Berkeley, Nikita worked at several institutions and teams, including the Technion - Israel Institute of Technology, Google Research in Zurich, the Brain Team, and the Swiss Federal Institute of Technology (ETH Zurich).