Additive Logistic Regression - Essay by Frozen7
Additive Logistic Regression: A Statistical View of Boosting
Jerome Friedman, Trevor Hastie, Robert Tibshirani
July 23, 1998

Boosting (Freund & Schapire 1996, Schapire & Singer 1998) is one of the most important recent developments in classification methodology. The performance of many classification algorithms often can be dramatically improved by sequentially applying them to reweighted versions of the input data, and taking a weighted majority vote of the sequence of classifiers thereby produced. We show that this seemingly mysterious phenomenon can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood. For the two-class problem, boosting can be viewed as an approximation to additive modeling on the logistic scale using maximum Bernoulli likelihood as a criterion. We develop more direct approximations and show that they exhibit nearly identical results to those of boosting. Direct multi-class generalizations based on multinomial likelihood are derived that exhibit performance comparable to other recently proposed multi-class generalizations of boosting in most situations, and far superior in some. We suggest a minor modification to boosting that can reduce computation, often by factors of 10 to 50. Finally, we apply these insights to produce an alternative formulation of boosting decision trees. This approach, based on best-first truncated tree induction, often leads to better performance, and can provide interpretable descriptions of the aggregate decision rule. It is also much faster computationally, making it more suitable for large-scale data mining applications.
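The reweight-and-vote procedure the abstract describes can be sketched in a few lines. The following is a minimal illustration of discrete AdaBoost (Freund & Schapire 1996) using one-dimensional decision stumps as the weak classifier; the toy data, round count, and helper names are invented for this sketch and do not come from the paper.

```python
# Sketch of discrete AdaBoost with 1-D decision stumps.
# Data, rounds, and function names are illustrative only.
import math

def stump_predict(threshold, sign, x):
    """Classify x into {-1, +1} by comparing it to a threshold."""
    return sign if x > threshold else -sign

def fit_stump(xs, ys, weights):
    """Pick the (threshold, sign) pair with minimum weighted error."""
    best = None
    for threshold in xs:
        for sign in (-1, 1):
            err = sum(w for x, y, w in zip(xs, ys, weights)
                      if stump_predict(threshold, sign, x) != y)
            if best is None or err < best[0]:
                best = (err, threshold, sign)
    return best

def adaboost(xs, ys, rounds=10):
    n = len(xs)
    weights = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, sign)
    for _ in range(rounds):
        err, threshold, sign = fit_stump(xs, ys, weights)
        if err >= 0.5:
            break                          # weak learner no better than chance
        err = max(err, 1e-10)              # guard against log(0) on a perfect fit
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, threshold, sign))
        # Reweight: misclassified points gain weight, correct ones lose it.
        weights = [w * math.exp(-alpha * y * stump_predict(threshold, sign, x))
                   for x, y, w in zip(xs, ys, weights)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def predict(ensemble, x):
    """Weighted majority vote of the fitted stumps."""
    score = sum(a * stump_predict(t, s, x) for a, t, s in ensemble)
    return 1 if score > 0 else -1

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [-1, -1, 1, -1, 1, 1]
model = adaboost(xs, ys, rounds=5)
print([predict(model, x) for x in xs])  # -> [-1, -1, 1, -1, 1, 1]
```

No single stump can separate these labels, yet the weighted vote fits them exactly after a few rounds; in the paper's view, each round is one stagewise step of fitting an additive model on the logistic scale.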
Department of Public Health Sciences, and Department of Statistics, University of Toronto ([email protected])

Department of Statistics, Sequoia Hall, Stanford University, Stanford, California 94305 (fjhf,[email protected])


1 Introduction
The starting point for this paper is an interesting procedure called...
