Statistical learning with sparsity : the lasso and generalizations
Authors
Trevor Hastie, Robert Tibshirani, Martin Wainwright
Bibliographic Details
Statistical learning with sparsity : the lasso and generalizations
(Monographs on statistics and applied probability, 143)
CRC Press, Taylor & Francis Group, c2015
Held by 54 university libraries
Notes
"A Chapman & Hall book"
Bibliography: p. 315-335
Includes indexes
Description and Table of Contents
Description
Discover New Methods for Dealing with High-Dimensional Data
A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data.
Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso.
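The coordinate descent idea mentioned above can be sketched briefly: each coefficient is updated in turn by soft-thresholding a univariate least-squares fit to the partial residual. The following is a minimal illustration in Python, not the book's reference implementation; the function names and the 1/(2n) scaling of the squared-error term are our own conventions.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: S(z, gamma) = sign(z) * max(|z| - gamma, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for the lasso objective
        (1 / (2n)) * ||y - X @ beta||^2 + lam * ||beta||_1,
    cycling through coordinates and soft-thresholding each update."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature (1/n) ||X_j||^2
    resid = y - X @ beta                # full residual, kept up to date
    for _ in range(n_iter):
        for j in range(p):
            # rho_j = (1/n) X_j^T (partial residual excluding coordinate j)
            rho = X[:, j] @ resid / n + col_sq[j] * beta[j]
            new_bj = soft_threshold(rho, lam) / col_sq[j]
            resid += X[:, j] * (beta[j] - new_bj)  # keep residual consistent
            beta[j] = new_bj
    return beta
```

On a toy problem with a few large true coefficients, the fitted vector recovers the signal (with a small shrinkage bias of roughly `lam` per active coordinate), and a sufficiently large `lam` drives every coefficient exactly to zero.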
In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.
Table of Contents
Introduction. The Lasso for Linear Models. Generalized Linear Models. Generalizations of the Lasso Penalty. Optimization Methods. Statistical Inference. Matrix Decompositions, Approximations, and Completion. Sparse Multivariate Methods. Graphs and Model Selection. Signal Approximation and Compressed Sensing. Theoretical Results for the Lasso. Bibliography. Author Index. Index.
Source: "Nielsen BookData"