Information bounds and nonparametric maximum likelihood estimation
Author(s): Piet Groeneboom, Jon Wellner
Bibliographic Information
Information bounds and nonparametric maximum likelihood estimation
(DMV seminar, Bd. 19)
Birkhäuser Verlag, 1992
Available at 35 libraries
Note
Includes bibliographical references (p. [123]-126)
Description and Table of Contents
Description
This book contains the lecture notes for a DMV course presented by the authors at Günzburg, Germany, in September 1990. In the course we sketched the theory of information bounds for nonparametric and semiparametric models, and developed the theory of nonparametric maximum likelihood estimation in several particular inverse problems: interval censoring and deconvolution models. Part I, based on Jon Wellner's lectures, gives a brief sketch of information lower bound theory: Hájek's convolution theorem and extensions, useful minimax bounds for parametric problems due to Ibragimov and Has'minskii, and a recent result characterizing differentiable functionals due to van der Vaart (1991). The differentiability theorem is illustrated with the examples of interval censoring and deconvolution (which are pursued from the estimation perspective in Part II). The differentiability theorem gives a way of clearly distinguishing situations in which the parameter of interest can be estimated at rate n^(1/2) and situations in which this is not the case. However, it says nothing about which rates to expect when the functional is not differentiable. Even the casual reader will notice that several models are introduced but not pursued in any detail; many problems remain. Part II, based on Piet Groeneboom's lectures, focuses on nonparametric maximum likelihood estimators (NPMLEs) for certain inverse problems. The first chapter deals with the interval censoring problem.
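To make the interval censoring problem concrete: in "interval censoring, Case 1" (current status data), one observes pairs (T_i, Δ_i) with Δ_i = 1{X_i ≤ T_i}, and the NPMLE of the distribution function F at the ordered observation times is the isotonic regression of the indicators Δ_i ordered by T_i, computable by the pool-adjacent-violators algorithm. The following is a minimal sketch of that computation; the function names `pava` and `npmle_current_status` are illustrative, not from the book.

```python
def pava(y, w):
    """Weighted isotonic (nondecreasing) regression via pool-adjacent-violators."""
    vals, wts, counts = [], [], []          # current blocks: value, weight, size
    for yi, wi in zip(y, w):
        vals.append(yi); wts.append(wi); counts.append(1)
        # merge adjacent blocks while monotonicity is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            v2, w2, c2 = vals.pop(), wts.pop(), counts.pop()
            v1, w1, c1 = vals.pop(), wts.pop(), counts.pop()
            wt = w1 + w2
            vals.append((w1 * v1 + w2 * v2) / wt)
            wts.append(wt); counts.append(c1 + c2)
    out = []
    for v, c in zip(vals, counts):          # expand blocks back to full length
        out.extend([v] * c)
    return out

def npmle_current_status(times, deltas):
    """NPMLE of F evaluated at the sorted observation times (Case 1)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    t = [times[i] for i in order]
    d = [float(deltas[i]) for i in order]
    return list(zip(t, pava(d, [1.0] * len(d))))
```

For example, observations (1,0), (2,1), (3,0), (4,1) yield the NPMLE values 0, 0.5, 0.5, 1 at the four observation times: the violating pair (1, 0) at times 2 and 3 is pooled to 0.5.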
Table of Contents
I. Information Bounds.- 1 Models, scores, and tangent spaces.- 1.1 Introduction.- 1.2 Models P.- 1.3 Scores: Differentiability of the Model.- 1.4 Tangent Sets P0 and Tangent Spaces P.- 1.5 Score Operators.- 1.6 Exercises.- 2 Convolution and asymptotic minimax theorems.- 2.1 Introduction.- 2.2 Finite-dimensional Parameter Spaces.- 2.3 Infinite-dimensional Parameter Spaces.- 2.4 Exercises.- 3 Van der Vaart's Differentiability Theorem.- 3.1 Differentiability of Implicitly Defined Functions.- 3.2 Some Applications of the Differentiability Theorem.- 3.3 Exercises.- II. Nonparametric Maximum Likelihood Estimation.- 1 The interval censoring problem.- 1.1 Characterization of the non-parametric maximum likelihood estimators.- 1.2 Exercises.- 2 The deconvolution problem.- 2.1 Decreasing densities and non-negative random variables.- 2.2 Convolution with symmetric densities.- 2.3 Exercises.- 3 Algorithms.- 3.1 The EM algorithm.- 3.2 The iterative convex minorant algorithm.- 3.3 Exercises.- 4 Consistency.- 4.1 Interval censoring, Case 1.- 4.2 Convolution with a symmetric density.- 4.3 Interval censoring, Case 2.- 4.4 Exercises.- 5 Distribution theory.- 5.1 Interval censoring, Case 1.- 5.2 Interval censoring, Case 2.- 5.3 Deconvolution with a decreasing density.- 5.4 Estimation of the mean.- 5.5 Exercises.- References.
by "Nielsen BookData"