Seminar abstract


Optimal Reverse Prediction: A Unified Perspective on Supervised, Unsupervised and Semi-supervised Learning

Linli Xu
Department of Computing Science, University of Alberta, Canada

Abstract:

Training principles for unsupervised learning are often derived from motivations that appear to be independent of supervised learning, causing a proliferation of semi-supervised training methods. In this talk we present a simple unification of several supervised and unsupervised training principles through the concept of optimal reverse prediction: predict the inputs from the target labels, optimizing both over model parameters and any missing labels. In particular, we show how supervised least squares, principal components analysis, k-means clustering and normalized graph-cut clustering can all be expressed as instances of the same training principle, differing only in the constraints placed on the target labels. Natural forms of semi-supervised regression and classification are then automatically derived, yielding semi-supervised learning algorithms that, surprisingly, are novel and refine the state of the art. These algorithms can all be combined with standard regularizers and made non-linear via kernels.
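As a minimal sketch of the reverse-prediction idea described above: predict the inputs X from candidate labels Z, solving for the model W in closed form by least squares. When Z is constrained to be a one-hot cluster-indicator matrix, the optimal W rows are the cluster means and the reverse-prediction loss coincides with the k-means within-cluster sum of squares. All names and data below are illustrative, not from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))          # 20 examples, 3 input features

def reverse_prediction_loss(X, Z):
    """min_W ||X - Z W||_F^2, with W solved in closed form by least squares."""
    W, *_ = np.linalg.lstsq(Z, X, rcond=None)
    return np.linalg.norm(X - Z @ W) ** 2

# Unsupervised case: Z constrained to one-hot cluster indicators.
labels = rng.integers(0, 2, size=20)  # an arbitrary 2-cluster assignment
Z = np.eye(2)[labels]                 # one-hot indicator matrix

# The k-means within-cluster sum of squares for the same assignment...
kmeans_sse = sum(
    np.linalg.norm(X[labels == k] - X[labels == k].mean(axis=0)) ** 2
    for k in range(2)
)

# ...equals the reverse-prediction loss: the optimal W rows are cluster means.
assert np.isclose(reverse_prediction_loss(X, Z), kmeans_sse)
```

Optimizing the same loss over the missing labels Z themselves (here held fixed) is what turns this single principle into a clustering or semi-supervised algorithm; changing the constraint set on Z recovers the other instances listed in the abstract.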


Biography:

Linli Xu is currently a postdoctoral fellow in the Department of Computing Science at the University of Alberta, Canada. From 1997 to 2002 she studied for her Bachelor's degree at the University of Science and Technology of China, and she received her Ph.D. degree in Computer Science from the University of Waterloo in 2007. Her research area is machine learning; her interests include unsupervised and semi-supervised learning, clustering, large-margin approaches, support vector machines, optimization, and convex programming.

Contact LAMDA: (email) (tel) +86-025-89681608 © LAMDA, 2016