

Wasserstein gradient flows and applications to sampling in machine learning - lecture 1
By Anna Korba


Wasserstein gradient flows and applications to sampling in machine learning - lecture 2
By Anna Korba
Appears in collection: 2016 - T1 - WS1 - Distributed computation and communication theme
We study the communication complexity of computing a conjunctive query on a large database in a massively parallel setting. In contrast to previous work, where upper and lower bounds on the communication were given for particular structures of data, in this work we focus on a worst-case analysis of the communication cost. We also show a surprising connection to the external memory model, which allows us to translate parallel algorithms into external memory algorithms. This technique allows us to recover several recent results on the I/O complexity of computing join queries, and also to obtain optimal algorithms for other classes of queries.
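As a rough illustration of the setting the abstract describes (computing a join query with one round of communication across many servers), the sketch below shows a generic one-round hash-partitioned two-way join. It is only a baseline illustration under assumed names: the relations R(a, b) and S(b, c), the attribute layout, and the server count p are all hypothetical, and this is not the worst-case-optimal algorithm analyzed in the talk.

```python
from collections import defaultdict

def hash_partition(relation, key_index, p):
    """Route each tuple to the server determined by hashing its join key."""
    servers = [[] for _ in range(p)]
    for t in relation:
        servers[hash(t[key_index]) % p].append(t)
    return servers

def parallel_join(R, S, p=4):
    """Compute the join of R(a, b) and S(b, c) on b: one communication round, then local joins."""
    R_parts = hash_partition(R, key_index=1, p=p)  # partition R on attribute b
    S_parts = hash_partition(S, key_index=0, p=p)  # partition S on attribute b
    output = []
    for server in range(p):
        # Each server joins only the tuples routed to it.
        index = defaultdict(list)
        for (a, b) in R_parts[server]:
            index[b].append(a)
        for (b, c) in S_parts[server]:
            for a in index[b]:
                output.append((a, b, c))
    return output

if __name__ == "__main__":
    R = [(1, 10), (2, 10), (3, 20)]
    S = [(10, "x"), (20, "y"), (30, "z")]
    print(sorted(parallel_join(R, S)))  # [(1, 10, 'x'), (2, 10, 'x'), (3, 20, 'y')]
```

Because both relations are partitioned on the shared attribute, matching tuples always land on the same server, so a single communication round suffices for a two-way join; multiway queries and skewed data are where the worst-case analysis in the talk becomes relevant.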