
Bregman Divergence Clustering: A Convex Approach

  • Author / Creator
    Cheng, Hao
  • Due to its wide application across many fields, clustering, as a fundamental unsupervised learning problem, has been intensively investigated over the past few decades. Unfortunately, standard clustering formulations are known to be computationally intractable. Although many convex relaxations of clustering have been proposed to overcome this intractability, current formulations remain largely restricted to spherical Gaussian or discriminative models and are susceptible to imbalanced clusters. To address these shortcomings, we propose a new class of convex relaxations that can be flexibly applied to more general forms of Bregman divergence clustering. By basing these new formulations on a normalized equivalence relation matrix, we retain additional control over relaxation quality, which allows improvement in clustering quality. We furthermore develop optimization methods that improve scalability by exploiting recent implicit matrix norm techniques. We find that the new formulations efficiently produce tighter clusterings that improve on the accuracy of state-of-the-art methods.
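The abstract centers on Bregman divergences, which generalize squared Euclidean distance via a convex generating function φ: D_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩. The sketch below illustrates this definition only (it is not the thesis's clustering method); the function names are illustrative, and the two choices of φ recover squared Euclidean distance and KL divergence.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    # D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# phi(x) = ||x||^2 yields D(x, y) = ||x - y||^2 (squared Euclidean distance)
sq = lambda v: np.dot(v, v)
grad_sq = lambda v: 2.0 * v

# phi(x) = sum x log x (negative entropy) yields KL(x || y)
# when x and y are probability vectors
negent = lambda v: np.sum(v * np.log(v))
grad_negent = lambda v: np.log(v) + 1.0

x = np.array([0.2, 0.8])
y = np.array([0.5, 0.5])

d_euc = bregman_divergence(sq, grad_sq, x, y)          # = ||x - y||^2 = 0.18
d_kl = bregman_divergence(negent, grad_negent, x, y)   # = sum x log(x / y)
```

Clustering with a Bregman divergence replaces the squared-distance term of k-means with D_φ, which is why the choice of φ controls the implicit cluster model.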

  • Subjects / Keywords
  • Graduation date
    Fall 2013
  • Type of Item
    Thesis
  • Degree
    Master of Science
  • DOI
    https://doi.org/10.7939/R3R785W2T
  • License
    This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for non-commercial purposes. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.