Canonical dependency analysis based on squared-loss mutual information.

Abstract

Canonical correlation analysis (CCA) is a classical dimensionality reduction technique for two sets of variables that iteratively finds projection directions with maximum correlation. Although CCA is still in vital use in many practical application areas, recent real-world data often contain more complicated nonlinear correlations that cannot be properly captured by classical CCA. In this paper, we thus propose an extension of CCA that can effectively capture such complicated nonlinear correlations through statistical dependency maximization. The proposed method, which we call least-squares canonical dependency analysis (LSCDA), is based on a squared-loss variant of mutual information, and it has various useful properties besides its ability to capture higher-order correlations: for example, it can simultaneously find multiple projection directions (i.e., subspaces), it does not involve density estimation, and it is equipped with a model selection strategy. We demonstrate the usefulness of LSCDA through various experiments on artificial and real-world datasets.
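For context, the classical CCA baseline that the abstract contrasts with can be sketched as follows. This is a generic whitened-SVD implementation of linear CCA, a minimal sketch for illustration (the function name `cca` and the small ridge term are my own choices), and it is not the paper's LSCDA method:

```python
import numpy as np

def cca(X, Y, n_components=1):
    """Classical linear CCA via SVD of the whitened cross-covariance."""
    # Center both views
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Sample covariance blocks (tiny ridge added for numerical stability)
    Sxx = X.T @ X / n + 1e-8 * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + 1e-8 * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    # Whiten via Cholesky factors and take the SVD;
    # the singular values are the canonical correlations
    Lx = np.linalg.cholesky(Sxx)
    Ly = np.linalg.cholesky(Syy)
    M = np.linalg.solve(Lx, Sxy) @ np.linalg.inv(Ly).T
    U, s, Vt = np.linalg.svd(M)
    # Map the whitened directions back to the original coordinates
    A = np.linalg.solve(Lx.T, U[:, :n_components])
    B = np.linalg.solve(Ly.T, Vt[:n_components].T)
    return A, B, s[:n_components]
```

Because the objective is plain linear correlation, such a procedure misses the nonlinear dependencies the abstract targets. According to the abstract, LSCDA instead maximizes statistical dependency via a squared-loss variant of mutual information; in the Sugiyama-school literature that quantity is typically defined as SMI(X, Y) = (1/2) E_{p(x)p(y)}[(p(x,y) / (p(x)p(y)) - 1)^2], the Pearson divergence between the joint density and the product of the marginals (stated here from general knowledge of that literature, not from this record).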

Journal

Details

  • CRID
    1050282810714140672
  • NII Article ID
    120004873600
  • NII Bibliographic ID
    AA11540311
  • ISSN
    08936080
  • HANDLE
    2433/159940
  • Text Language
    en
  • Material Type
    journal article
  • Data Source Type
    • IRDB
    • CiNii Articles
