

Wasserstein gradient flows and applications to sampling in machine learning - lecture 1
By Anna Korba


Wasserstein gradient flows and applications to sampling in machine learning - lecture 2
By Anna Korba
Appears in the collection: 2016 - T1 - WS5 - Secrecy and privacy theme
This talk will focus on a class of problems whose goal is to detect the existence of an anomalous object over a network. An anomalous object, if it exists, corresponds to a cluster of nodes in the network that receive data samples generated by an anomalous distribution q, whereas all other nodes in the network receive samples generated by a distinct distribution p. Such a problem models a variety of applications, such as detection of an anomalous intrusion via sensor networks and detection of an anomalous segment in a DNA sequence. All previous studies of this problem assumed parametric models, i.e., the distributions p and q are known. Our work studies the nonparametric model, in which the distributions can be arbitrary and unknown a priori. In this talk, I will first introduce our approach, which is based on the mean embedding of distributions into a reproducing kernel Hilbert space (RKHS). In particular, we adopt the maximum mean discrepancy (MMD) as a distance metric between the mean embeddings of two distributions. I will then present our construction of MMD-based tests for anomaly detection over networks and our analysis of the performance guarantees of the proposed tests. I will also present a number of numerical results that illustrate our analysis. Towards the end of the talk, I will discuss some related problems and conclude with a few future directions.
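To make the MMD quantity concrete, the following is a minimal sketch in Python (assuming NumPy; not the speaker's implementation or test statistic) of the standard unbiased estimator of the squared MMD with a Gaussian kernel between two samples. The function names, bandwidth, and sample sizes are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    # Pairwise Gaussian kernel values k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)).
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * bandwidth**2))

def mmd2_unbiased(X, Y, bandwidth=1.0):
    # Unbiased estimate of the squared MMD between samples X ~ p and Y ~ q,
    # i.e. the RKHS distance between the two kernel mean embeddings.
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    # Drop diagonal (same-sample) terms in the within-sample sums for unbiasedness.
    term_xx = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_yy = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    term_xy = Kxy.sum() / (m * n)
    return term_xx + term_yy - 2 * term_xy

# Illustration: samples from two different Gaussians give a clearly positive MMD^2,
# while two samples from the same distribution give a value close to zero.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 1))   # samples from the nominal distribution p
Y = rng.normal(1.0, 1.0, size=(200, 1))   # samples from the anomalous distribution q
print(mmd2_unbiased(X, Y))
```

In an anomaly-detection setting of the kind described above, such an estimate would be compared against a threshold to decide whether a candidate cluster of nodes receives samples from a distribution different from the rest of the network; the choice of kernel, bandwidth, and threshold is where the analysis of the proposed tests comes in.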