Full Professor, Université de Montréal
Founder and Scientific Director, Mila
Scientific Director, IVADO
Registration is required for this event.
ABSTRACT: Generative Flow Networks (GFlowNets) were introduced as a method for sampling a diverse set of candidates in an active-learning context, with a training objective that makes them sample approximately in proportion to a given reward function. We show a number of additional theoretical properties of GFlowNets. They can be used to estimate joint probability distributions and the corresponding marginal distributions (when some variables are unspecified), and they are particularly interesting for representing distributions over composite objects such as sets and graphs. They amortize, in a single (but trained) generative pass, the work typically done by computationally expensive MCMC methods. They can be used to estimate partition functions and free energies, conditional probabilities of supersets (or supergraphs) given an included subset (or subgraph), as well as marginal distributions over all supersets of a set or all supergraphs of a graph. The talk will highlight the relations and differences between GFlowNets and standard approaches in generative modeling and reinforcement learning, and summarize early experimental results obtained while exploring the space of molecules to discover ones with properties of interest.
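The core idea above, that a trained sampler draws objects with probability proportional to their reward, P(x) = R(x)/Z, can be illustrated with a minimal sketch. The toy state space, the reward function, and all names below are hypothetical choices for illustration, not the setup used in the talk; the trajectory-balance-style objective shown is one of the GFlowNet training objectives from the literature, evaluated here on a tree-shaped DAG where exact flows can be computed by enumeration rather than learned:

```python
import math

# Toy setting: objects are binary strings of length 3, built one bit at a
# time from the empty string. The DAG is a tree, so each object has a
# unique trajectory. Reward is an arbitrary illustrative choice.
LENGTH = 3

def reward(x: str) -> float:
    # Higher reward for strings containing more 1s (illustrative only).
    return 2.0 ** x.count("1")

def leaves_under(prefix: str):
    """All complete strings extending this prefix."""
    pad = LENGTH - len(prefix)
    return [prefix + format(i, f"0{pad}b") for i in range(2 ** pad)] if pad else [prefix]

def flow(prefix: str) -> float:
    """Total reward flowing through a state = sum over reachable terminals."""
    return sum(reward(x) for x in leaves_under(prefix))

Z = flow("")  # partition function: total flow out of the initial state

def forward_policy(prefix: str, bit: str) -> float:
    # The flow-matching policy: route flow in proportion to downstream reward.
    return flow(prefix + bit) / flow(prefix)

def trajectory_balance_loss(x: str) -> float:
    # Trajectory-balance-style objective: (log Z + sum log P_F - log R(x))^2,
    # which is zero exactly when the sampler draws x with probability R(x)/Z.
    log_pf = sum(math.log(forward_policy(x[:i], x[i])) for i in range(LENGTH))
    return (math.log(Z) + log_pf - math.log(reward(x))) ** 2

# With the exact flow-matching policy above, every trajectory satisfies the
# balance condition, so the loss vanishes and P(x) = R(x)/Z.
for x in ("000", "101", "111"):
    assert trajectory_balance_loss(x) < 1e-12
print(Z)  # prints 27.0: the partition function over the 8 strings
```

In practice the forward policy is a neural network and `Z` a learned scalar, trained by minimizing this loss over sampled trajectories; the point of the sketch is only the identity being optimized, and how the same quantity `Z` that makes sampling reward-proportional is exactly the partition function mentioned in the abstract.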
BIO: Yoshua Bengio is a Full Professor in the Department of Computer Science and Operations Research at Université de Montréal, as well as the Founder and Scientific Director of Mila and the Scientific Director of IVADO. Considered one of the world's leaders in artificial intelligence and deep learning, he is a co-recipient, with Geoffrey Hinton and Yann LeCun, of the 2018 A.M. Turing Award, often called the Nobel Prize of computing. He is a Fellow of both the Royal Society of London and the Royal Society of Canada, an Officer of the Order of Canada, and a Canada CIFAR AI Chair.