University of Toronto St. George Campus - Statistics
Google Research
Applied machine vision methods to solve content-based video classification problems. Contributed to the DistBelief framework, a close-to-the-metal distributed deep learning framework.
Google Research
Max Planck Institute for Intelligent Systems
Tübingen, Germany
Visiting Researcher
Doctor of Philosophy (PhD)
Machine Learning
University of Cambridge
French
Master of Science (M.Sc.)
Computer Science
Green College
The University of British Columbia / UBC
Invenia
Co-founded a machine learning research consulting company. Recruited, trained, and supervised five research assistants, plus consultants. Drafted, presented, and was awarded several research funding grants. Led two research contracts applying modern machine learning methods to energy forecasting. These projects led to the successful deployment of several automated forecasting systems for major utilities, forecasting electric load, wind generation, and energy prices.
Invenia
Postdoctoral Fellow
Harvard University
University of Toronto
Assistant Professor
Toronto, Canada Area
Frantic Films
University of Toronto
Frantic Films
Mathematical Modeling
Science
Pattern Recognition
Probabilistic Models
Python
Machine Learning
Data Analysis
Software Engineering
Statistics
Matlab
Research
Consulting
Artificial Intelligence
Computer Science
Algorithms
Solid Presentation Skills
LaTeX
Optimally-Weighted Herding is Bayesian Quadrature
We prove several connections between an efficient procedure for estimating moments (herding) which minimizes a worst-case error, and a model-based way of estimating integrals (Bayesian Quadrature). It turns out that both are optimizing the same criterion, and that Bayesian Quadrature is doing this in an optimal way. This means, among other things, that we can place worst-case error bounds on the optimal Bayesian estimator!
Zoubin Ghahramani
Josh Tenenbaum
Roger Grosse
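Bayesian Quadrature estimates an integral as a weighted sum of function values, with the weights obtained by solving a linear system built from the kernel Gram matrix and the kernel mean of the integration measure. A minimal numerical sketch, assuming a 1D RBF kernel and a standard-normal measure (these choices and the helper name bq_weights are illustrative, not taken from the paper):

    import numpy as np

    def bq_weights(X, lengthscale=1.0):
        """Bayesian Quadrature weights for a 1D RBF kernel under a standard
        normal measure (kernel and measure are assumptions of this sketch)."""
        X = np.asarray(X, dtype=float)
        # Gram matrix of k(x, x') = exp(-(x - x')^2 / (2 * lengthscale^2))
        K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / lengthscale ** 2)
        # Kernel mean z_i = E_{x ~ N(0,1)}[k(x, X_i)], available in closed form
        z = np.sqrt(lengthscale ** 2 / (lengthscale ** 2 + 1.0)) * \
            np.exp(-0.5 * X ** 2 / (lengthscale ** 2 + 1.0))
        # The optimal weights minimize the posterior variance of the integral estimate
        return np.linalg.solve(K + 1e-10 * np.eye(len(X)), z)

    # Estimate E[x^2] under N(0, 1) (true value 1.0) from 25 evenly spaced points
    X = np.linspace(-3.0, 3.0, 25)
    print(bq_weights(X) @ X ** 2)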
To search through an open-ended class of structured, nonparametric regression models, we introduce a simple grammar which specifies composite kernels. These structured models often allow an interpretable decomposition of the function being modeled, as well as long-range extrapolation. Many common regression methods are special cases of this large family of models. We give several example decompositions of time series.
Structure Discovery in Nonparametric Regression through Compositional Kernel Search
Zoubin Ghahramani
Tomoharu Iwata
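The grammar generates composite kernels by repeatedly combining the current kernel expression with a base kernel through addition or multiplication. A simplified, symbolic sketch of one expansion step, assuming a representative set of base kernels (the scoring and greedy recursion of the full search are only described in comments):

    import itertools

    # A representative set of base kernels: squared-exponential, linear,
    # periodic, and rational quadratic.
    BASE_KERNELS = ["SE", "Lin", "Per", "RQ"]

    def expand(expr):
        """One grammar step: combine the current kernel expression with each
        base kernel via + or * (a simplified, symbolic sketch)."""
        for op, base in itertools.product(["+", "*"], BASE_KERNELS):
            yield f"({expr} {op} {base})"

    # A greedy search would score each candidate (e.g. by approximate marginal
    # likelihood) and recurse on the best one; here we enumerate one level.
    for candidate in expand("SE"):
        print(candidate)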
If you fit a mixture of Gaussians to a single cluster that is curved or heavy-tailed, your model will report that the data contains many clusters! To fix this problem, we simply warp a latent mixture of Gaussians to produce nonparametric cluster shapes. The low-dimensional latent mixture model summarizes the properties of the high-dimensional clusters (or density manifolds) describing the data.
Nonparametric Clustering with Warped Mixture Models
Duvenaud
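The warping idea can be illustrated with a toy example: draw points from a simple latent mixture of Gaussians and push them through a smooth nonlinear map, which bends each spherical component into a curved, non-Gaussian cluster in the observed space. The quadratic warp below is a hand-picked stand-in for the warp the model actually learns; all names and numbers are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)

    # Latent space: a mixture of two well-separated spherical Gaussians in 2D
    means = np.array([[-2.0, 0.0], [2.0, 0.0]])
    labels = rng.integers(0, 2, size=500)
    latent = means[labels] + 0.3 * rng.standard_normal((500, 2))

    def warp(z):
        """A hand-picked smooth warp (a stand-in for the learned map);
        it curves each Gaussian component."""
        x, y = z[:, 0], z[:, 1]
        return np.column_stack([x, y + 0.5 * x ** 2])

    # Each latent Gaussian becomes a curved, banana-shaped cluster when observed,
    # while cluster membership stays easy to describe in the latent space.
    observed = warp(latent)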
Harvard University
Max Planck Institute for Intelligent Systems