"Uniform Approximation of VC Classes" (This Week at the Statistics Seminar)
Something I have been meaning to post about is a series of papers by Terry
Adams and Andrew Nobel on the intersection of machine learning theory (in the
form of Vapnik-Chervonenkis dimension) with stochastic processes,
specifically ergodic theory
(arxiv:1010.3162; arxiv:1007.2964; arxiv:1007.4037).
I am very excited by this work, which I think is extremely important for
understanding learning from
dependent data, and so am very pleased to report that, this week, our seminar
speaker is:
- Andrew Nobel, "Uniform Approximation of VC Classes"
- Abstract: The Vapnik-Chervonenkis (VC) dimension measures the ability of a family of sets to separate finite collections of points. The VC dimension plays a foundational role in the theory of machine learning and empirical processes. In this talk we describe new research concerning the structure of families with finite VC dimension. Our principal result states the following: for any family of sets in a probability space, either (i) the family has infinite VC dimension or (ii) every set in the family can be approximated to within a given error by a fixed, finite partition. Immediate corollaries include the fact that families with finite VC dimension have finite bracketing numbers, and satisfy uniform laws of large numbers for ergodic processes. We will briefly discuss analogous results for VC major and VC graph families of functions.
- The talk will include definitions and examples of VC dimension and related quantities, and should be accessible to anyone familiar with theoretical machine learning.
- This is joint work with Terry M. Adams.
- Time and Place: 4:30--5:30 pm on Thursday, 30 September 2010, in the Adamson Wing, Baker Hall
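For readers who want a concrete handle on the "separating finite collections of points" definition in the abstract, here is a small sketch (mine, not from the papers) that computes the VC dimension of the family of closed intervals [a, b] on the real line, restricted to a finite sample of points, by brute-force enumeration of shatterings:

```python
from itertools import combinations

def interval_labels(points, a, b):
    """Label each point 1 if it lies in [a, b], else 0."""
    return tuple(1 if a <= x <= b else 0 for x in points)

def shatters(points):
    """Check whether intervals [a, b] realize all 2^n labelings of `points`."""
    n = len(points)
    # Candidate endpoints: the points themselves plus sentinels outside the
    # range (so the empty labeling and one-sided labelings are reachable).
    cands = sorted(points) + [min(points) - 1, max(points) + 1]
    realized = {interval_labels(points, a, b) for a in cands for b in cands}
    return len(realized) == 2 ** n

def vc_dim_on_sample(sample):
    """Largest k such that some k-subset of `sample` is shattered by intervals."""
    best = 0
    for k in range(1, len(sample) + 1):
        if any(shatters(list(sub)) for sub in combinations(sample, k)):
            best = k
    return best

# Intervals shatter any pair of points, but no triple: the labeling
# "outer two in, middle one out" is never realized by a single interval.
print(vc_dim_on_sample([0.0, 1.0, 2.0, 3.0]))  # -> 2
```

This recovers the textbook fact that intervals on the line have VC dimension 2, which is the kind of finite-VC family to which the talk's approximation-by-finite-partition result applies.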
Through poor planning on my part, I have a
prior and conflicting
engagement, though a very worthwhile one.
Enigmas of Chance
Posted at September 27, 2010 15:30 | permanent link