
Deep Incremental Boosting

10 pages · Published: September 29, 2016

Abstract

This paper introduces Deep Incremental Boosting, a new technique derived from AdaBoost and specifically adapted to Deep Learning methods, which reduces the required training time and improves generalisation. We draw inspiration from Transfer of Learning approaches to reduce the start-up time for training each incremental Ensemble member. We present a set of experiments outlining preliminary results on common Deep Learning datasets and discuss the potential improvements that Deep Incremental Boosting brings to traditional Ensemble methods in Deep Learning.
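The core idea described in the abstract can be sketched as an AdaBoost-style loop in which each new ensemble member is warm-started from the previous member's parameters instead of being trained from scratch. The following is a minimal illustrative sketch, not the authors' implementation: the "network" is replaced by a simple weighted linear classifier so the example stays self-contained, and the function names (`train_member`, `deep_incremental_boosting`) are hypothetical.

```python
import math

def train_member(data, labels, weights, epochs, init=None):
    # Stand-in for a deep network: a linear model w.x + b trained by
    # weighted perceptron updates. `init` warm-starts training from the
    # previous member's parameters -- the Transfer-of-Learning step that
    # Deep Incremental Boosting uses to cut each member's start-up time.
    w, b = (list(init[0]), init[1]) if init else ([0.0, 0.0], 0.0)
    for _ in range(epochs):
        for x, y, d in zip(data, labels, weights):
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1
            if pred != y:  # update only on mistakes, scaled by sample weight
                w[0] += d * y * x[0]
                w[1] += d * y * x[1]
                b += d * y
    return w, b

def predict(member, x):
    w, b = member
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1

def deep_incremental_boosting(data, labels, rounds=5, epochs=3):
    n = len(data)
    D = [1.0 / n] * n            # AdaBoost sample distribution
    ensemble, prev = [], None
    for _ in range(rounds):
        # Warm start: copy the previous member's parameters, then train
        # for only a few epochs on the re-weighted sample.
        member = train_member(data, labels, D, epochs, init=prev)
        prev = member
        errs = [predict(member, x) != y for x, y in zip(data, labels)]
        eps = sum(d for d, e in zip(D, errs) if e)
        eps = min(max(eps, 1e-10), 1 - 1e-10)  # guard the log below
        alpha = 0.5 * math.log((1 - eps) / eps)
        ensemble.append((alpha, member))
        # Re-weight: emphasise examples the new member got wrong
        D = [d * math.exp(alpha if e else -alpha) for d, e in zip(D, errs)]
        Z = sum(D)
        D = [d / Z for d in D]
    return ensemble

def ensemble_predict(ensemble, x):
    # Weighted vote over all members, as in AdaBoost
    s = sum(a * predict(m, x) for a, m in ensemble)
    return 1 if s > 0 else -1
```

In the paper's setting the warm-started member would be a deep network trained for a reduced number of epochs per round; the sketch only shows how the boosting re-weighting and the parameter hand-off between rounds fit together.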

Keyphrases: deep learning, ensembles, transfer learning

In: Christoph Benzmüller, Geoff Sutcliffe and Raul Rojas (editors). GCAI 2016. 2nd Global Conference on Artificial Intelligence, vol 41, pages 293-302.

BibTeX entry
@inproceedings{GCAI2016:Deep_Incremental_Boosting,
  author    = {Alan Mosca and George Magoulas},
  title     = {Deep Incremental Boosting},
  booktitle = {GCAI 2016. 2nd Global Conference on Artificial Intelligence},
  editor    = {Christoph Benzmüller and Geoff Sutcliffe and Raul Rojas},
  series    = {EPiC Series in Computing},
  volume    = {41},
  publisher = {EasyChair},
  bibsource = {EasyChair, https://easychair.org},
  issn      = {2398-7340},
  url       = {/publications/paper/NR},
  doi       = {10.29007/qlvr},
  pages     = {293-302},
  year      = {2016}}