Tree-based methods for statistical learning in R
Author(s)
Greenwell, Brandon M.
Bibliographic Information
Tree-based methods for statistical learning in R
(Chapman & Hall/CRC data science series) (A Chapman & Hall book)
CRC Press, 2022
1st ed
- : hbk
Available at 2 libraries
Note
Includes bibliographical references (p. 359-380) and index
Description and Table of Contents
Description
Thorough coverage, from the ground up, of tree-based methods (e.g., CART, conditional inference trees, bagging, boosting, and random forests).
A companion website containing additional supplementary material and the code to reproduce every example and figure in the book.
A companion R package, called treemisc, which contains several data sets and functions used throughout the book (e.g., there's an implementation of gradient tree boosting with LAD loss that shows how to perform the line search step by updating the terminal node estimates of a fitted rpart tree).
Interesting examples that are of practical use; for example, how to construct partial dependence plots from a fitted model in Spark MLlib (using only Spark operations), or post-processing tree ensembles via the LASSO to reduce the number of trees while maintaining, or even improving, performance.
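The LASSO post-processing idea mentioned above can be sketched in a few lines of R. This is not code from the book or its treemisc package; it is a minimal illustration assuming the randomForest and glmnet packages are installed: each tree's predictions become one column of a design matrix, and an L1-penalized regression assigns nonzero weight to only a subset of the trees.

```r
# Hedged sketch (not from the book): post-process a random forest with
# the LASSO so that only a subset of trees receives nonzero weight.
# Assumes the randomForest and glmnet packages are available.
library(randomForest)
library(glmnet)

set.seed(101)
rf <- randomForest(mpg ~ ., data = mtcars, ntree = 300)

# Per-tree predictions form the design matrix for the LASSO
# (predict.all = TRUE returns an n x ntree matrix in $individual)
preds <- predict(rf, newdata = mtcars, predict.all = TRUE)$individual

# alpha = 1 selects the LASSO penalty; cross-validation picks lambda
cvfit <- cv.glmnet(preds, mtcars$mpg, alpha = 1)
w <- coef(cvfit, s = "lambda.min")

# Trees with nonzero coefficients are the ones worth keeping
n_kept <- sum(w[-1] != 0)
```

In practice the design matrix would be built on held-out data rather than the training set, to avoid selecting trees that merely overfit.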
Table of Contents
1. Introduction
2. Binary recursive partitioning with CART
3. Conditional inference trees
4. "The hitchhiker's GUIDE to modern decision trees"
5. Ensemble algorithms
6. Peeking inside the "black box": post-hoc interpretability
7. Random forests
8. Gradient boosting machines
by "Nielsen BookData"