Fast greedy C-bound minimization with guarantees



Baptiste Bauvin1,3 · Cécile Capponi3 · Jean-Francis Roy2 · François Laviolette3,4

Received: 16 September 2019 / Revised: 21 July 2020 / Accepted: 11 August 2020 / Published online: 23 September 2020
© The Author(s) 2020

Abstract
The C-bound is a tight bound on the true risk of a majority vote classifier that relies on the individual quality and pairwise disagreement of the voters, and provides PAC-Bayesian generalization guarantees. Based on this bound, MinCq is a classification algorithm that minimizes it to return a dense distribution on a finite set of voters. Introduced later and inspired by boosting, CqBoost uses a column generation approach to build a sparse C-bound-optimal distribution on a possibly infinite set of voters. However, both approaches have a high computational learning time because they minimize the C-bound by solving a quadratic program. Yet, one advantage of CqBoost is its experimentally observed ability to provide sparse solutions. In this work, we address the problem of accelerating the C-bound minimization process while preserving the sparsity of the solution and without losing accuracy. We present CB-Boost, a computationally efficient classification algorithm relying on a greedy, boosting-based C-bound optimization. An in-depth analysis proves the optimality of the greedy minimization process and quantifies the decrease of the C-bound achieved by the algorithm. Generalization guarantees are then derived from existing PAC-Bayesian theorems. In addition, we experimentally evaluate the relevance of CB-Boost with respect to the three main properties we expect of it: accuracy, sparsity, and computational efficiency, compared to MinCq, CqBoost, AdaBoost and other ensemble methods. As observed in these experiments, CB-Boost not only achieves results comparable to the state of the art, but also provides C-bound sub-optimal weights with very little computational demand while keeping the sparsity property of CqBoost.

Editors: Ira Assent, Carlotta Domeniconi, Aristides Gionis, Eyke Hüllermeier.

* Baptiste Bauvin
  baptiste.bauvin@lis-lab.fr
  http://qarma.lis-lab.fr

  Jean-Francis Roy
  [email protected]

  François Laviolette
  [email protected]

1 Aix Marseille University, Toulon University, CNRS, LIS (Qarma), Marseille, France
2 Coveo Solutions Inc., Québec, QC, Canada
3 Dép. d'informatique et de génie logiciel, Université Laval, Québec, QC, Canada
4 Element AI, Montréal, QC, Canada




Keywords  PAC-Bayes · Boosting · Ensemble methods · C-bound · Greedy optimization
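Since the abstract leans heavily on the C-bound, a standard statement of it from the PAC-Bayesian literature may help; the following is a sketch in notation of our own choosing ($M_Q$, $\mu_1$, $\mu_2$, $B_Q$ are illustrative names and may differ from the paper's). For labels $y \in \{-1, +1\}$ and a distribution $Q$ over voters $h$, the margin of the $Q$-weighted majority vote on an example $(x, y)$ is

$$M_Q(x, y) = y \, \mathbb{E}_{h \sim Q}\left[h(x)\right].$$

Writing $\mu_1 = \mathbb{E}_{(x,y) \sim D}\left[M_Q(x,y)\right]$ and $\mu_2 = \mathbb{E}_{(x,y) \sim D}\left[M_Q(x,y)^2\right]$ for the first two moments of the margin under the data distribution $D$, and assuming $\mu_1 > 0$, the true risk of the majority vote $B_Q$ satisfies

$$R(B_Q) \le 1 - \frac{\mu_1^2}{\mu_2}.$$

This makes the abstract's two ingredients explicit: $\mu_1$ captures the individual quality of the voters, while $\mu_2 = \mathbb{E}_{h, h' \sim Q^2}\, \mathbb{E}_{x}\left[h(x)\, h'(x)\right]$ captures their pairwise disagreement (the label drops out of $\mu_2$ because $y^2 = 1$).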

1 Introduction

Ensemble-based supervised classification consists in training a combination of many classifiers learnt from various algorithms and/or sub-samples of the initial dataset. Some of the most prominent ensemble learning frameworks include Boosting, with the seminal and elegant AdaBoost (Freund and Schapire 1997), which has inspired numerous other algorithms, and Bagging (Breiman 1996), leading to the suc