Seminar abstract

Theory of Dual-sparse Regularized Randomized Reduction

Tianbao Yang
Assistant Professor
University of Iowa

Abstract: In this talk, I will present our recent work on dual-sparse regularized randomized reduction methods, which reduce high-dimensional features into a low-dimensional space by randomized methods (e.g., random projection, random hashing) for large-scale high-dimensional classification. In the first part, I will review some previous theoretical results on randomized reduction methods to motivate our work. In the second part, I will present the proposed dual-sparse regularized randomized reduction methods, which introduce a sparse regularizer into the reduced dual problem. I will also present theoretical guarantees on the resulting dual solution and the recovered model. Finally, I will show experimental results that support the analysis and the proposed methods.
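As a rough illustration of the random projection step mentioned in the abstract, the sketch below (plain NumPy; the dimensions, variable names, and Gaussian choice are illustrative assumptions, not taken from the talk) maps high-dimensional features into a low-dimensional space while approximately preserving their norms:

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, m = 100, 10_000, 200      # samples, original dim, reduced dim (assumed values)
X = rng.standard_normal((n, d))  # high-dimensional feature matrix

# Gaussian random projection: entries drawn N(0, 1/m) so that, in expectation,
# the projection preserves squared norms (a Johnson-Lindenstrauss-style embedding).
A = rng.standard_normal((m, d)) / np.sqrt(m)

X_reduced = X @ A.T              # (n, m) low-dimensional features
print(X_reduced.shape)
```

The classifier is then trained on `X_reduced`; the talk's contribution concerns regularizing the *dual* of that reduced problem so that the original model can be recovered with guarantees.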

Bio: Dr. Tianbao Yang is currently an assistant professor at the University of Iowa (UI). He received his Ph.D. degree in Computer Science from Michigan State University in 2012. Before joining UI, he was a researcher at NEC Laboratories America in Cupertino (2013–2014) and a machine learning researcher at GE Global Research (2012–2013), mainly focusing on developing distributed optimization systems for various classification and regression problems. Dr. Yang has broad interests in machine learning and has focused on several research topics, including large-scale optimization in machine learning, online optimization, and distributed optimization. His recent research interests revolve around randomized algorithms for solving big data problems. He has published over 25 papers in prestigious machine learning conferences and journals, and won the Mark Fulk Best Student Paper Award at the 25th Conference on Learning Theory (COLT) in 2012.

Contact LAMDA: (email) (tel) +86-25-89685926. © LAMDA, 2016