Tuesday, February 13th
14:30 (room 2014, 'Digiteo Shannon' 660 building)
Zoltan Szabo (CMAP & DSI, École Polytechnique, http://www.cmap.polytechnique.fr/~zoltan.szabo/)
Title: Linear-time Divergence Measures with Applications in Hypothesis Testing
Abstract
Maximum mean discrepancy (MMD) and the Hilbert-Schmidt independence
criterion (HSIC) are among the most popular and successful techniques in
machine learning for measuring the difference between, and the independence of,
random variables, respectively. Their computational complexity, however, is
rather restrictive: quadratic in the number of samples. To mitigate this
serious computational bottleneck, I will present three linear-time
kernel-based alternatives, with illustrations in hypothesis
testing. The power of the new linear-time methods is demonstrated in
natural language processing (distinguishing articles from two
categories), computer vision (differentiating positive and negative
emotions), dependency testing of media annotations (song - year of
release, video - caption) and criminal data analysis.
(Joint work with Wittawat Jitkrittum, Wenkai Xu, Kacper Chwialkowski,
Arthur Gretton and Kenji Fukumizu)
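
The quadratic cost mentioned in the abstract comes from evaluating a kernel on all O(n^2) sample pairs. As background only (a minimal NumPy sketch, not the estimators of the talk itself), the classic linear-time MMD^2 of Gretton et al. (2012) replaces the full Gram-matrix average by an average over disjoint sample pairs; the Gaussian kernel and its bandwidth below are illustrative assumptions.

```python
import numpy as np

def gram(x, y, sigma=1.0):
    """Full Gaussian-kernel Gram matrix: O(n*m) entries."""
    d2 = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kvec(a, b, sigma=1.0):
    """Gaussian kernel on paired rows only: k(a_i, b_i), O(n)."""
    return np.exp(-np.sum((a - b) ** 2, axis=-1) / (2.0 * sigma ** 2))

def mmd2_quadratic(x, y, sigma=1.0):
    """Biased quadratic-time MMD^2 estimate: needs all pairwise kernels."""
    return (gram(x, x, sigma).mean() + gram(y, y, sigma).mean()
            - 2.0 * gram(x, y, sigma).mean())

def mmd2_linear(x, y, sigma=1.0):
    """Linear-time MMD^2 (Gretton et al., 2012): average the h-statistic
    over disjoint sample quadruples (x_{2i}, x_{2i+1}, y_{2i}, y_{2i+1})."""
    n = (min(len(x), len(y)) // 2) * 2
    x1, x2, y1, y2 = x[0:n:2], x[1:n:2], y[0:n:2], y[1:n:2]
    h = (kvec(x1, x2, sigma) + kvec(y1, y2, sigma)
         - kvec(x1, y2, sigma) - kvec(x2, y1, sigma))
    return h.mean()

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(2000, 2))  # samples from P
y = rng.normal(0.5, 1.0, size=(2000, 2))  # samples from Q (mean-shifted)
print("quadratic MMD^2:", mmd2_quadratic(x, y))
print("linear MMD^2:   ", mmd2_linear(x, y))
```

Both estimates should be close and clearly positive on these mean-shifted samples, but the linear-time version touches each sample only once, which is what makes large-scale two-sample testing feasible.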
Links
Information Theoretical Estimators toolbox: https://bitbucket.org/szzoli/ite-in-python/,
Linear-time Two-sample Testing (NIPS-2016, Oral):
http://papers.nips.cc/paper/6148-interpretable-distribution-features-with-maximum-testing-power
(code: https://github.com/wittawatj/interpretable-test),
Linear-time Independence Testing (ICML-2017):
http://proceedings.mlr.press/v70/jitkrittum17a.html
(code: https://github.com/wittawatj/fsic-test),
Linear-time Goodness-of-fit Testing (NIPS-2017, Best Paper Award):
https://papers.nips.cc/paper/6630-a-linear-time-kernel-goodness-of-fit-test
(code: https://github.com/wittawatj/kernel-gof).
Contact: guillaume.charpiat at inria.fr
All TAU seminars: here