[1] |
KEARNS M J, VALIANT L G. Learning boolean formulae or finite automata is as hard as factoring[R]. Cambridge, USA: Harvard University, 1988.
|
[2] |
SCHAPIRE R E. The strength of weak learnability[J]. Machine Learning, 1990, 5(2): 197-227.
|
[3] |
FREUND Y, SCHAPIRE R E. A decision-theoretic generalization of on-line learning and an application to boosting[J]. Journal of Computer and System Sciences, 1997, 55(1): 119-139.
|
[4] |
FREUND Y, SCHAPIRE R E. A decision-theoretic generalization of on-line learning and an application to boosting[C]//Proceedings of the Second European Conference on Computational Learning Theory. Berlin: Springer, 1995: 23-37.
|
[5] |
FREUND Y. Boosting a weak learning algorithm by majority[J]. Information and Computation, 1995, 121(2): 256-285.
|
[6] |
FREUND Y, SCHAPIRE R E. Experiments with a new boosting algorithm[C/OL]//Proceedings of the 13th International Conference on Machine Learning, 1996: 148-156[2015-08-12]. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.90.4143&rep=rep1&type=pdf.
|
[7] |
FRIEDMAN J, HASTIE T, TIBSHIRANI R. Additive logistic regression: a statistical view of boosting[J]. The Annals of Statistics, 2000, 28(2): 337-407.
|
[8] |
FRIEDMAN J H. Greedy function approximation: a gradient boosting machine[J]. The Annals of Statistics, 2001, 29(5): 1189-1232.
|
[9] |
NAGHIBI T, PFISTER B. A boosting framework on grounds of online learning[C/OL]//Advances in Neural Information Processing Systems, 2014: 2267-2275[2015-08-12]. http://papers.nips.cc/paper/5512-a-boosting-framework-on-grounds-of-online-learning.pdf.
|
[10] |
FREUND Y, IYER R, SCHAPIRE R E, et al. An efficient boosting algorithm for combining preferences[J]. The Journal of Machine Learning Research, 2003, 4: 933-969.
|
[11] |
FRIEDMAN J H. Stochastic gradient boosting[J]. Computational Statistics & Data Analysis, 2002, 38(4): 367-378.
|
[12] |
ESCUDERO G, MÁRQUEZ L, RIGAU G. Boosting applied to word sense disambiguation[C]//Proceedings of the 12th European Conference on Machine Learning. Berlin: Springer, 2000: 129-141.
|
[13] |
WEBB G I. Multiboosting: a technique for combining boosting and wagging[J]. Machine Learning, 2000, 40(2): 159-196.
|
[14] |
FREUND Y. An adaptive version of the boost by majority algorithm[J]. Machine Learning, 2001, 43(3): 293-318.
|
[15] |
BENNETT K P, DEMIRIZ A, MACLIN R. Exploiting unlabeled data in ensemble methods[C]//Proceedings of the Eighth ACM International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2002: 289-296.
|
[16] |
DEMIRIZ A, BENNETT K P, SHAWE-TAYLOR J. Linear programming boosting via column generation[J]. Machine Learning, 2002, 46(1/2/3): 225-254.
|
[17] |
BÜHLMANN P, YU B. Boosting with the L2-loss: regression and classification[J]. Journal of the American Statistical Association, 2003, 98(462): 324-339.
|
[18] |
SERVEDIO R A. Smooth boosting and learning with malicious noise[J]. The Journal of Machine Learning Research, 2003, 4: 633-648.
|
[19] |
KÉGL B, WANG L. Boosting on manifolds: adaptive regularization of base classifiers[C/OL]//Advances in Neural Information Processing Systems, 2004: 665-672[2015-08-12]. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.64.9620&rep=rep1&type=pdf.
|
[20] |
HERTZ T, BAR-HILLEL A, WEINSHALL D. Boosting margin based distance functions for clustering[C]//Proceedings of the Twenty-first International Conference on Machine Learning. New York: ACM, 2004: 50.
|
[21] |
VEZHNEVETS A, VEZHNEVETS V. Modest AdaBoost: teaching AdaBoost to generalize better[J]. Graphicon, 2005, 12(5): 987-997.
|
[22] |
HATANO K. Smooth boosting using an information-based criterion[C]//Proceedings of the 17th International Conference on Algorithmic Learning Theory. Berlin: Springer, 2006: 304-318.
|
[23] |
WARMUTH M K, LIAO J, RÄTSCH G. Totally corrective boosting algorithms that maximize the margin[C]//Proceedings of the 23rd International Conference on Machine Learning. New York: ACM, 2006: 1001-1008.
|
[24] |
BÜHLMANN P, YU B. Sparse boosting[J]. The Journal of Machine Learning Research, 2006, 7: 1001-1024.
|
[25] |
BRADLEY J K, SCHAPIRE R E. Filterboost: regression and classification on large datasets[C/OL]//Advances in Neural Information Processing Systems, 2007: 185-192[2015-08-12]. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.404.946&rep=rep1&type=pdf.
|
[26] |
RÄTSCH G, WARMUTH M K, GLOCER K A. Boosting algorithms for maximizing the soft margin[C/OL]//Advances in Neural Information Processing Systems, 2007: 1585-1592[2015-08-12]. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.88.6621&rep=rep1&type=pdf.
|
[27] |
MASNADI-SHIRAZI H, VASCONCELOS N. On the design of loss functions for classification: theory, robustness to outliers, and SavageBoost[C/OL]//Advances in Neural Information Processing Systems, 2009: 1049-1056[2015-08-12]. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.163.470&rep=rep1&type=pdf.
|
[28] |
FREUND Y. A more robust boosting algorithm[EB/OL]. (2009-05-13)[2015-08-12]. http://arxiv.org/abs/0905.2138.
|
[29] |
BÜHLMANN P, HOTHORN T. Twin boosting: improved feature selection and prediction[J]. Statistics and Computing, 2010, 20(2): 119-138.
|
[30] |
ZHAI S, XIA T, TAN M, et al. Direct 0-1 loss minimization and margin maximization with boosting[C/OL]//Advances in Neural Information Processing Systems, 2013: 872-880[2015-08-12]. http://machinelearning.wustl.edu/mlpapers/paper_files/NIPS2013_5214.pdf.
|
[31] |
SHEN C, LIN G, VAN DEN HENGEL A. Structboost: boosting methods for predicting structured output variables[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014, 36(10): 2089-2103.
|
[32] |
SCHAPIRE R E, SINGER Y. BoosTexter: a boosting-based system for text categorization[J]. Machine Learning, 2000, 39(2): 135-168.
|
[33] |
BERGSTRA J, CASAGRANDE N, ERHAN D, et al. Aggregate features and AdaBoost for music classification[J]. Machine Learning, 2006, 65(2/3): 473-484.
|
[34] |
LI F F, FERGUS R, TORRALBA A. Recognizing and learning object categories[A]// Tutorial at International Conference on Computer Vision. Lisbon, Portugal: ACM Press, 2009.
|
[35] |
LIU P, HAN S, MENG Z, et al. Facial expression recognition via a boosted deep belief network[C/OL]// IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014: 1805-1812[2015-08-12]. http://www.cv-foundation.org/openaccess/content_cvpr_2014/papers/Liu_Facial_Expression_Recognition_2014_CVPR_paper.pdf.
|
[36] |
VIOLA P, JONES M. Rapid object detection using a boosted cascade of simple features[C]// Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition: vol. 1. Piscataway: IEEE Press, 2001: 511-518.
|
[37] |
SABERIAN M, VASCONCELOS N. Boosting algorithms for detector cascade learning[J]. The Journal of Machine Learning Research, 2014, 15(1): 2569-2605.
|
[38] |
FREUND Y, SCHAPIRE R, ABE N. A short introduction to boosting[J]. Journal of Japanese Society for Artificial Intelligence, 1999, 14(5): 771-780.
|
[39] |
SCHAPIRE R E. A brief introduction to boosting[C]//Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence: vol.2. San Francisco, CA, USA: Morgan Kaufmann Publishers Inc., 1999: 1401-1406.
|
[40] |
SHEN X H, ZHOU Z H, WU J X, et al. Survey of boosting and bagging[J]. Computer Engineering and Application, 2000, 12: 31-32.
|
[41] |
SCHAPIRE R E. The boosting approach to machine learning: an overview[M]//Nonlinear estimation and classification. New York: Springer, 2003: 149-171.
|
[42] |
LIAO H W, ZHOU D L. Review of AdaBoost and its improvement[J]. Computer Systems & Applications, 2012, 21(5): 240-244.
|
[43] |
CAO Y, MIAO Q G, LIU J C, et al. Advance and prospects of AdaBoost algorithm[J]. Acta Automatica Sinica, 2013, 39(6): 745-758.
|
[44] |
BÜHLMANN P. Boosting methods: why they can be useful for high-dimensional data[C/OL]//Proceedings of the 3rd International Workshop on Distributed Statistical Computing (DSC), 2003[2015-08-12]. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.6.2694&rep=rep1&type=pdf.
|
[45] |
SCHAPIRE R E, FREUND Y, BARTLETT P, et al. Boosting the margin: a new explanation for the effectiveness of voting methods[J]. The Annals of Statistics, 1998,26(5): 1651-1686.
|
[46] |
NOCK R, ALI W B H, D'AMBROSIO R, et al. Gentle nearest neighbors boosting over proper scoring rules[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(1): 80-93.
|
[47] |
CHI Y, PORIKLI F. Classification and boosting with multiple collaborative representations[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014, 36(8): 1519-1531.
|
[48] |
BEYGELZIMER A, KALE S, LUO H. Optimal and adaptive algorithms for online boosting[EB/OL]. (2015-02-09)[2015-08-12]. http://arxiv.org/abs/1502.02651.
|
[49] |
VALIANT L G. A theory of the learnable[J]. Communications of the ACM, 1984, 27(11): 1134-1142.
|
[50] |
SHALEV-SHWARTZ S, BEN-DAVID S. Understanding machine learning: from theory to algorithms[M]. Cambridge, UK: Cambridge University Press, 2014.
|
[51] |
ZHANG T, YU B. Boosting with early stopping: convergence and consistency[J]. The Annals of Statistics, 2005,33(4): 1538-1579.
|
[52] |
BAUER E, KOHAVI R. An empirical comparison of voting classification algorithms: bagging, boosting, and variants[J]. Machine Learning, 1999, 36(1): 105-139.
|
[53] |
DIETTERICH T G. An experimental comparison of three methods for constructing ensembles of decision trees: bagging, boosting, and randomization[J]. Machine Learning, 2000, 40(2): 139-157.
|
[54] |
DUBOUT C, FLEURET F. Adaptive sampling for large scale boosting[J]. The Journal of Machine Learning Research, 2014, 15(1): 1431-1453.
|
[55] |
CHI E C, ALLEN G, ZHOU H, et al. Imaging genetics via sparse canonical correlation analysis[C]//2013 IEEE 10th International Symposium on Biomedical Imaging (ISBI). Piscataway: IEEE Press, 2013: 740-743.
|
[56] |
BREIMAN L. Bias, variance, and arcing classifiers[R/OL]. [2015-08-12]. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.30.8572&rep=rep1&type=pdf.
|
[57] |
DRUCKER H, CORTES C. Boosting decision trees[C/OL]//Advances in Neural Information Processing Systems, 1996: 479-485 [2015-08-12]. http://papers.nips.cc/paper/1059-boosting-decision-trees.pdf.
|
[58] |
QUINLAN J R. Bagging, boosting, and C4.5[C]//Proceedings of the Thirteenth National Conference on Artificial Intelligence. Palo Alto: AAAI Press, 1996: 725-730.
|
[59] |
BREIMAN L. Prediction games and arcing algorithms[J]. Neural Computation, 1999, 11(7): 1493-1517.
|
[60] |
SCHAPIRE R E, FREUND Y, BARTLETT P, et al. Boosting the margin: a new explanation for the effectiveness of voting methods[J]. The Annals of Statistics, 1998,26(5): 1651-1686.
|
[61] |
BREIMAN L. Bagging predictors[J]. Machine Learning, 1996, 24(2): 123-140.
|
[62] |
REYZIN L, SCHAPIRE R E. How boosting the margin can also boost classifier complexity[C]//Proceedings of the 23rd International Conference on Machine Learning. New York: ACM, 2006: 753-760.
|
[63] |
BREIMAN L. Prediction games and arcing classifiers: Technical Report 504[R]. Berkeley: University of California, 1997.
|
[64] |
ZHOU Z H. Boosting 25 years[R]. Beijing: Institute of Automation, Chinese Academy of Sciences, 2013.
|
[65] |
GAO W, ZHOU Z H. On the doubt about margin explanation of boosting[J]. Artificial Intelligence, 2013, 203: 1-18.
|
[66] |
HERTZ T. Learning distance functions: algorithms and applications[D]. Jerusalem: Hebrew University of Jerusalem, 2006.
|
[67] |
GARCÍA-PEDRAJAS N, ORTIZ-BOYER D. Boosting k-nearest neighbor classifier by means of input space projection[J]. Expert Systems with Applications, 2009, 36(7): 10570-10582.
|
[68] |
PIRO P, NOCK R, NIELSEN F, et al. Boosting k-NN for categorization of natural scenes[EB/OL]. (2010-01-08)[2015-08-12]. http://arxiv.org/abs/1001.1221.
|
[69] |
CHI Y, PORIKLI F. Connecting the dots in multi-class classification: from nearest subspace to collaborative representation[C]// 2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Piscataway: IEEE Press, 2012: 3602-3609.
|
[70] |
OZA N C, RUSSELL S. Experimental comparisons of online and batch versions of bagging and boosting[C]//Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2001: 359-364.
|
[71] |
OZA N C. Online bagging and boosting[C]//2005 IEEE International Conference on Systems, Man and Cybernetics: Vol. 3. Piscataway: IEEE Press, 2005: 2340-2345.
|
[72] |
WU B, NEVATIA R. Improving part based object detection by unsupervised, online boosting[C]// IEEE Conference on Computer Vision and Pattern Recognition, 2007. Piscataway: IEEE Press, 2007: 1-8.
|
[73] |
LEISTNER C, SAFFARI A, ROTH P M, et al. On robustness of on-line boosting-a competitive study[C]// 2009 IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops). Piscataway: IEEE Press, 2009: 1362-1369.
|
[74] |
BABENKO B, YANG M H, BELONGIE S. A family of online boosting algorithms[C]// 2009 IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops). Piscataway: IEEE Press, 2009: 1346-1353.
|
[75] |
GRABNER H, BISCHOF H. On-line boosting and vision[C]// 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition: Vol. 1. Piscataway: IEEE Press, 2006: 260-267.
|
[76] |
LIU X, YU T. Gradient feature selection for online boosting[C]//IEEE 11th International Conference on Computer Vision. Piscataway: IEEE Press, 2007: 1-8.
|
[77] |
GRABNER H, LEISTNER C, BISCHOF H. Semi-supervised on-line boosting for robust tracking[C]//Proceedings of the 10th European Conference on Computer Vision. Berlin: Springer-Verlag, 2008: 234-247.
|
[78] |
CHEN S T, LIN H T, LU C J. An online boosting algorithm with theoretical justifications[EB/OL]. (2012-06-27)[2015-08-12]. http://arxiv.org/abs/1206.6422.
|
[79] |
LUO H, SCHAPIRE R E. A drifting-games analysis for online learning and applications to boosting[C/OL]//Advances in Neural Information Processing Systems, 2014: 1368-1376[2015-08-12]. http://papers.nips.cc/paper/5469-a-drifting-games-analysis-for-online-learning-and-applications-to-boosting.pdf.
|
[80] |
CHEN S T, LIN H T, LU C J. Boosting with online binary learners for the multiclass bandit problem[C/OL]//Proceedings of the 31st International Conference on Machine Learning (ICML-14), 2014: 342-350[2015-08-12]. http://machinelearning.wustl.edu/mlpapers/paper_files/icml2014c1_chenb14.pdf.
|