Performance improvement of the NLMS algorithm and its applications / 学習同定法の性能改善とその応用

Bibliographic details

Title

Performance improvement of the NLMS algorithm and its applications

Alternative title

学習同定法の性能改善とその応用 (Performance improvement of the learning identification method and its applications)

Author

Jirasak Tanpreeyachaya

Author (alternative name)

ジラサック タンパリヤチャヤー

Degree-granting university

Nagoya Institute of Technology (名古屋工業大学)

Degree conferred

Doctor of Engineering (博士(工学))

Degree number

甲第167号 (Kō No. 167)

Date of degree conferral

1996-03-22

Notes / Abstract

Doctoral thesis

Table of contents

  1. Contents / p1
  2. 1 Introduction / p1
  3. 1.1 The Linear Filtering Problem / p1
  4. 1.2 Adaptive Filters / p2
  5. 1.3 Applications / p5
  6. 1.4 Outline of the Thesis / p7
  7. 2 Adaptive Systems and Theory of Adaptation / p11
  8. 2.1 General Description / p11
  9. 2.2 Input Signal and Weight Vectors / p11
  10. 2.3 Desired Response and Error / p13
  11. 2.4 The Performance Function / p15
  12. 2.5 Gradient and Minimum Mean-square Error / p17
  13. 2.6 Alternative Expression of the Gradient / p18
  14. 2.7 Properties of the Quadratic Performance Surface / p20
  15. 2.8 Normal Form of the Input Correlation Matrix / p20
  16. 2.9 Eigenvalues and Eigenvectors of the Input Correlation Matrix / p21
  17. 2.10 Geometrical Significance of Eigenvectors and Eigenvalues / p23
  18. 2.11 Searching the Performance Surface / p26
  19. 2.12 Basic Ideas of Gradient Search Methods / p27
  20. 2.13 Simple Gradient Search and Its Solution / p28
  21. 2.14 Stability and Rate of Convergence / p29
  22. 2.15 The Learning Curve / p30
  23. 2.16 Gradient Search by Newton's Method / p31
  24. 2.17 Newton's Method in Multidimensional Space / p34
  25. 2.18 Gradient Search by the Method of Steepest Descent / p35
  26. 3 The LMS and NLMS Algorithm / p41
  27. 3.1 Derivation of the LMS algorithm / p41
  28. 3.2 Derivation of the NLMS algorithm / p47 (see the sketch after this list)
  29. 4 The Conditioned NLMS method / p51
  30. 4.1 Introduction / p51
  31. 4.2 The Conditioned NLMS Algorithm / p52
  32. 4.3 Steady State Residual Error Analysis / p54
  33. 4.4 Transient State Length Analysis / p59
  34. 4.5 Optimal Selection of the Threshold p / p63
  35. 4.6 Comparison with the ε-NLMS Method / p68
  36. 4.7 Conclusion / p74
  37. 5 The Partial-NLMS method / p75
  38. 5.1 Introduction / p75
  39. 5.2 Partial Normalized LMS Algorithm / p76
  40. 5.3 Convergence Behavior Analysis / p77
  41. 5.4 Performance Comparison between the P-NLMS and NLMS Algorithm / p82
  42. 5.5 Application of P-NLMS to an IIR ADF / p83
  43. 5.6 Conclusion / p87
  44. 6 Performance Improvement of Variable Stepsize NLMS / p89
  45. 6.1 Introduction / p89
  46. 6.2 Optimal Controlled Stepsize NLMS / p92
  47. 6.3 The Noise Variance Estimation Method and VS-NLMS Algorithm / p93
  48. 6.4 Simulation Results / p98
  49. 6.5 Conclusions / p102
  50. 7 Summary / p113
  51. A Derivation of the Correlation Eq.(4.5) / p115
  52. B Derivation of Eq.(4.10) / p117
  53. C Derivation of Eq.(4.11) and (4.12) / p119
  54. D Proof of the Condition for [formula] < 1 and [formula] ≈ 0.5N / p121
  55. D.1 Sufficient Condition for [formula] < 1 / p121
  56. D.2 Consideration for [formula] ≈ 0.5N / p122
  57. E Derivation of Eq.(5.8), (5.9) and (5.10) / p123
  58. Acknowledgments / p125
  59. Reference / p127
  60. Authors' published papers / p130
  61. Biography / p131
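
For orientation, the chapters listed above all build on the normalized LMS (NLMS) recursion, known in Japanese as 学習同定法 (the learning identification method), which Chapter 3 derives: w(n+1) = w(n) + μ e(n) x(n) / (ε + ‖x(n)‖²). The sketch below is a minimal Python illustration of that conventional update applied to FIR system identification; it is not taken from the thesis text, and the function name nlms_identify, the step size, and the toy signals are illustrative assumptions. The conditioned, partial, and variable-stepsize variants studied in Chapters 4 to 6 differ in how the step size and normalization are controlled.

    import numpy as np

    def nlms_identify(x, d, num_taps, mu=0.5, eps=1e-6):
        """Conventional NLMS (learning identification) update for FIR system identification.
        x: input to the unknown system; d: observed (desired) output;
        mu: normalized step size (0 < mu < 2); eps: guard against a near-zero input norm."""
        w = np.zeros(num_taps)                     # adaptive filter weights
        e = np.zeros(len(x))                       # a priori error at each step
        for n in range(num_taps - 1, len(x)):
            u = x[n - num_taps + 1:n + 1][::-1]    # regressor [x(n), x(n-1), ..., x(n-N+1)]
            e[n] = d[n] - w @ u                    # estimation error
            w = w + mu * e[n] * u / (eps + u @ u)  # normalized weight update
        return w, e

    # Toy usage: identify a known 8-tap FIR system from noisy observations.
    rng = np.random.default_rng(0)
    h = rng.standard_normal(8)                     # "unknown" system impulse response
    x = rng.standard_normal(4000)                  # white excitation
    d = np.convolve(x, h)[:len(x)] + 1e-3 * rng.standard_normal(len(x))
    w, e = nlms_identify(x, d, num_taps=8)
    print(np.max(np.abs(w - h)))                   # small weight error after adaptation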

Identifiers

  • NII Article ID (NAID): 500000133214
  • NII Author ID (NRID): 8000000133485
  • NDL Bibliographic ID: 000000297528
  • Data source: NDL ONLINE, NDL Digital Collections