Efficient AUC Maximization by Approximate Reduction of Ranking SVMs (ランキングSVMの近似に基づく効率的なAUC最大化)

Abstract

The Ranking SVM formulation is a popular approach to maximizing AUC scores. More precisely, it is given as a hard/soft-margin optimization over the pn pairs formed from p positive and n negative instances. Solving this problem directly is impractical, since the sample of size pn is quadratically larger than the original sample of size p+n. In this paper, we propose (approximate) reduction methods from the hard/soft-margin optimization over pn pairs to variants of hard/soft-margin optimization over the p+n instances. The classifiers obtained by our methods are guaranteed to achieve a certain amount of margin over the pn pairs.
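For reference, a sketch of the standard pairwise soft-margin Ranking SVM formulation that the abstract refers to (assumed notation, not necessarily the exact one used in the paper: x_i^+ for the p positive instances, x_j^- for the n negative instances, w the weight vector, C a regularization constant, and xi_{ij} the slack variables):

\[
\min_{w,\,\xi}\ \frac{1}{2}\|w\|^2 \;+\; C \sum_{i=1}^{p}\sum_{j=1}^{n} \xi_{ij}
\quad \text{s.t.} \quad
w^\top\!\bigl(x_i^{+} - x_j^{-}\bigr) \;\ge\; 1 - \xi_{ij},
\qquad \xi_{ij} \ge 0,
\quad 1 \le i \le p,\ \ 1 \le j \le n.
\]

Since there is one constraint and one slack variable per positive-negative pair, the problem has pn constraints, which is why solving it directly scales quadratically in the original sample size; the reduction methods proposed in the paper instead work with margin optimizations over the p+n original instances.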

Published in

  • 電子情報通信学会技術研究報告. IBISML, 情報論的学習理論と機械学習 = IEICE technical report. IBISML, Information-based induction sciences and machine learning

    112(279), 243-249, 2012-10-31

    The Institute of Electronics, Information and Communication Engineers (IEICE)

Identifiers

  • NII Article ID (NAID)
    110009642237
  • NII Bibliographic ID (NCID)
    AA12482480
  • Text Language Code
    ENG
  • Material Type
    ART
  • ISSN
    0913-5685
  • NDL Article Registration ID
    024149425
  • NDL Call Number
    Z16-940
  • Data Source
    CJP Bibliography  NDL  NII-ELS