Constrained submodular minimization for missing labels and class imbalance in multi-label learning

B. Wu, S. Lyu, B. Ghanem

Research output: Contribution to conference › Paper › peer-review

34 Scopus citations

Abstract

In multi-label learning, there are two main challenges: missing labels and class imbalance (CIB). The former assumes that only a partial set of labels is provided for each training instance while the other labels are missing. CIB is observed from two perspectives: first, the number of negative labels of each instance is much larger than its number of positive labels; second, the rates of positive instances (i.e., the number of positive instances divided by the total number of instances) differ significantly across classes. Both missing labels and CIB lead to significant performance degradation. In this work, we propose a new method to handle these two challenges simultaneously. We formulate the problem as a constrained submodular minimization, composed of a submodular objective function that encourages label consistency and smoothness, together with class cardinality bound constraints that handle class imbalance. We further present a convex approximation based on the Lovász extension of submodular functions, leading to a linear program, which can be efficiently solved by the alternating direction method of multipliers (ADMM). Experimental results on several benchmark datasets demonstrate the improved performance of our method over several state-of-the-art methods. © Copyright 2016, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
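The abstract's key relaxation step is the Lovász extension, which turns a submodular set function into a convex (piecewise-linear) function on the unit cube. The snippet below is a minimal, illustrative Python sketch of that generic construction using the standard greedy formula; the function name `lovasz_extension`, the callable `F`, and the toy set function are assumptions for illustration only, not the authors' implementation, and the ADMM solver for the resulting linear program is omitted.

```python
import numpy as np

def lovasz_extension(F, x):
    """Evaluate the Lovasz extension of a set function F at a point x.

    F: callable mapping a frozenset of indices to a float, assumed
       submodular and normalized so that F(empty set) = 0.
    x: 1-D numpy array of length n.

    Greedy formula: sort the coordinates of x in decreasing order and
    accumulate the marginal gains of F along that order, weighted by x.
    """
    order = np.argsort(-x)                 # indices sorted by decreasing x
    value, prefix = 0.0, set()
    prev_F = F(frozenset())                # value on the empty set
    for i in order:
        prefix.add(i)
        cur_F = F(frozenset(prefix))
        value += x[i] * (cur_F - prev_F)   # marginal gain weighted by x_i
        prev_F = cur_F
    return value

# Toy submodular function on 3 elements (illustrative values only).
weights = {frozenset(): 0.0, frozenset({0}): 1.0, frozenset({1}): 1.0,
           frozenset({2}): 1.0, frozenset({0, 1}): 1.5, frozenset({0, 2}): 2.0,
           frozenset({1, 2}): 2.0, frozenset({0, 1, 2}): 2.5}
F = lambda S: weights[S]
print(lovasz_extension(F, np.array([0.9, 0.2, 0.5])))  # 1.5
```

Because the extension is linear on each cone induced by a coordinate ordering, minimizing it subject to the cardinality bound constraints described above can be written as a linear program, which is the form the abstract says is handed to ADMM.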
Original language: English (US)
Pages: 2229-2236
State: Published - 2016

Keywords

  • Artificial intelligence
  • Benchmarking
  • Linear programming
  • Convex approximation
  • Method of multipliers
  • Multi-label learning
  • Objective functions
  • Performance degradation
  • Positive instances
  • State-of-the-art methods
  • Submodular functions
  • Learning systems
