Dates
Friday, December 8, 2023, 11:00am to 12:00pm
Location
NCS 220
Event Description

Abstract:

Modern machine learning (ML) problems are combinatorial and non-convex, and theoretical guarantees for them are quite limited. Furthermore, while quantitative guarantees (e.g., small test error) have been studied, qualitative guarantees (e.g., outlier robustness) are mostly lacking. My long-term research goal is to uncover the general foundations of ML and optimization that drive the empirical success across many specific combinatorial and non-convex ML problems. I aim to develop a set of optimization-theoretic frameworks and tools that bridge these gaps, furthering our understanding of continuous (possibly non-convex) relaxations of combinatorial problems, as well as our knowledge of non-convexity.

In this talk, I introduce a unifying framework that draws on the power of continuous relaxations (beyond convexity), Karush-Kuhn-Tucker conditions, primal-dual certificates, and concentration inequalities. This framework has produced breakthroughs not only for classical combinatorial and non-convex problems, e.g., Bayes nets, structured prediction, community detection, and sparse PCA, but also in areas of recent interest such as fairness, meta-learning, federated learning, and robustness.

As a warmup example, I present inference in structured prediction, showing that the success of semidefinite programming depends on the algebraic connectivity of the input graph. As a breakthrough, I also provide guarantees for graphs with poor expansion properties (e.g., planar graphs) under noise (cf. smoothed analysis).
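
To make the warmup concrete, here is a minimal sketch of the standard semidefinite relaxation for inference over binary labels with pairwise interactions, written with the cvxpy library. The interaction matrix A and the random-hyperplane rounding are illustrative assumptions, not the exact formulation or guarantees from the talk.

import cvxpy as cp
import numpy as np

# Inference over x in {-1,+1}^n with symmetric pairwise interactions A:
#   maximize x^T A x   -->   maximize <A, X> s.t. diag(X) = 1, X PSD,
# where X plays the role of the lifted variable x x^T.
def sdp_inference(A, seed=0):
    n = A.shape[0]
    X = cp.Variable((n, n), symmetric=True)
    problem = cp.Problem(cp.Maximize(cp.trace(A @ X)),
                         [X >> 0, cp.diag(X) == 1])
    problem.solve()
    # Round the relaxed solution back to {-1,+1}^n via a random hyperplane.
    L = np.linalg.cholesky(X.value + 1e-8 * np.eye(n))
    g = np.random.default_rng(seed).standard_normal(n)
    return np.where(L @ g >= 0, 1, -1)

The talk's result ties the success of relaxations of this kind to the algebraic connectivity of the underlying graph.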

As a second example, I consider the problem of fair sparse regression on a biased dataset, where the bias depends on a hidden binary attribute; one therefore needs to combine sparse regression with clustering. I propose an invex (non-convex) relaxation for this combinatorial problem. Beyond the technical breakthrough, I show that fair sparse regression is possible even without knowing the sensitive attribute(s).
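
As a rough illustration of why this problem is combinatorial, consider an assumed data model y_i = x_i^T beta* + gamma* z_i + noise, where z_i in {-1,+1} is the hidden attribute. The sketch below is a naive alternating heuristic for this model, not the invex relaxation presented in the talk, and sklearn's Lasso stands in for the sparse-regression step.

import numpy as np
from sklearn.linear_model import Lasso

def alternating_fit(X, y, alpha=0.1, n_iters=20):
    # z holds the guessed hidden attributes; gamma the guessed bias scale.
    z, gamma = np.ones(len(y)), 0.0
    for _ in range(n_iters):
        # (1) Sparse regression on the bias-corrected responses.
        model = Lasso(alpha=alpha).fit(X, y - gamma * z)
        residual = y - model.predict(X)
        # (2) Cluster samples by residual sign and re-estimate the bias.
        z = np.where(residual >= 0, 1.0, -1.0)
        gamma = np.abs(residual).mean()
    return model.coef_, gamma, z

Alternating heuristics like this can get stuck in poor local optima, which is precisely the failure mode a relaxation with provable guarantees aims to avoid.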

As a third example, I study the problem of exact partitioning of high-order models, formulated as a tensor optimization problem. This high-order combinatorial problem is relaxed into a convex problem in conic form. To this end, I carefully define the Carathéodory symmetric tensor cone and show that both it and its dual cone are convex.
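
Schematically, and only as an assumed reading of the construction, the pattern mirrors the warmup SDP one order higher: the rank-one lift x x^T of a binary labeling is replaced by a higher-order lift, and the positive semidefinite cone by a convex cone of symmetric tensors. In LaTeX notation:

\[
\max_{x \in \{-1,+1\}^n} \big\langle \mathcal{T},\, x^{\otimes k} \big\rangle
\quad \rightsquigarrow \quad
\max_{\mathcal{X} \in \mathcal{K}} \big\langle \mathcal{T},\, \mathcal{X} \big\rangle,
\]

where \mathcal{T} collects the order-k interactions, x^{\otimes k} is the rank-one symmetric lift of the labeling, and \mathcal{K} is a convex cone of symmetric tensors containing all such lifts (in the talk, the Carathéodory symmetric tensor cone). Convexity of \mathcal{K} and of its dual is what makes the conic program well-posed and enables primal-dual certificates of exact partitioning.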

Bio:

Jean Honorio will soon join the School of Computing and Information Systems at The University of Melbourne as a Senior Lecturer. Jean was an Assistant Professor in the Computer Science Department at Purdue University, as well as in the Statistics Department (by courtesy). Prior to joining Purdue, Jean was a postdoctoral associate at MIT. His Erdős number is 3. His work has been partially funded by the NSF. He is an editorial board reviewer for JMLR and has served as an area chair of NeurIPS and ICML, a senior PC member of AAAI and IJCAI, and a PC member of NeurIPS, ICML, and AIStats, among other conferences and journals.

Event Title
Seminar: 'Computational and Statistical Foundations in Combinatorial and Non-Convex ML'