Tuesday, February 13th

14:30 (room 2014, 'Digiteo Shannon' building 660)

Zoltan Szabo

(CMAP & DSI, École Polytechnique, http://www.cmap.polytechnique.fr/~zoltan.szabo/)

Title: Linear-time Divergence Measures with Applications in Hypothesis Testing


Maximum mean discrepancy (MMD) and the Hilbert-Schmidt independence
criterion (HSIC) are among the most popular and successful techniques in
machine learning for measuring the difference between, and the
independence of, random variables, respectively. Their computational
complexity, however, is rather restrictive: quadratic in the number of
samples. To mitigate this serious computational bottleneck, I will
present three linear-time kernel-based alternatives, with illustrations
in hypothesis testing. The power of the new linear-time methods is
demonstrated in natural language processing (distinguishing articles
from two categories), computer vision (differentiating positive and
negative emotions), dependency testing of media annotations (song - year
of release, video - caption) and criminal data analysis.
(Joint work with Wittawat Jitkrittum, Wenkai Xu, Kacper Chwialkowski,
Arthur Gretton and Kenji Fukumizu.)
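To give a flavour of how a quadratic-cost kernel statistic becomes linear-time, here is a minimal sketch of the classic linear-time MMD^2 estimator (Gretton et al.), which pairs up consecutive samples and averages a per-pair h-statistic instead of summing over all sample pairs. This is a generic illustration of the idea, not the specific estimators of the talk; the Gaussian kernel, its bandwidth, and the sample sizes below are arbitrary choices for the example.

```python
import numpy as np

def rbf(a, b, sigma=1.0):
    # Gaussian RBF kernel evaluated on matched rows of a and b
    return np.exp(-np.sum((a - b) ** 2, axis=1) / (2.0 * sigma ** 2))

def linear_time_mmd2(x, y, sigma=1.0):
    """Linear-time estimator of MMD^2 between samples x and y.

    Pairs consecutive samples and averages
        h = k(x1, x2) + k(y1, y2) - k(x1, y2) - k(x2, y1),
    giving O(n) cost instead of the O(n^2) quadratic estimator.
    """
    n = (min(len(x), len(y)) // 2) * 2  # use an even number of samples
    x1, x2 = x[0:n:2], x[1:n:2]
    y1, y2 = y[0:n:2], y[1:n:2]
    h = (rbf(x1, x2, sigma) + rbf(y1, y2, sigma)
         - rbf(x1, y2, sigma) - rbf(x2, y1, sigma))
    return h.mean()

rng = np.random.default_rng(0)
# Same distribution: estimate should be close to zero.
same = linear_time_mmd2(rng.normal(size=(2000, 2)),
                        rng.normal(size=(2000, 2)))
# Shifted distribution: estimate should be clearly positive.
diff = linear_time_mmd2(rng.normal(size=(2000, 2)),
                        rng.normal(loc=2.0, size=(2000, 2)))
```

The price of the single O(n) pass is higher variance than the quadratic estimator; the linear-time tests in the papers below recover power by optimizing test locations and kernel parameters.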


Information Theoretical Estimators toolbox:

Linear-time Two-sample Testing (NIPS-2016, Oral):
(code: https://github.com/wittawatj/interpretable-test)

Linear-time Independence Testing (ICML-2017):
(code: https://github.com/wittawatj/fsic-test)

Linear-time Goodness-of-fit Testing (NIPS-2017, Best Paper Award):
(code: https://github.com/wittawatj/kernel-gof)

Contact: guillaume.charpiat at inria.fr