Weighted Feature Integration for Person Re-identification


Abstract

Person re-identification is the task of finding and matching the same individual across different camera views. For robust person re-identification, we propose a weighted feature integration method that adapts to illumination changes and appearance differences caused by different camera views. First, we extract four kinds of local features (color histograms, frequency features, gray-level co-occurrence matrices, and histogram of oriented gradients (HOG) features) from the image of a person as clothing appearance information. Second, in the pre-training phase, we calculate the difference in each local feature between pairs of images of the same person taken by different cameras. The local features are then weighted and integrated based on these differences. We tested three weighting functions: the reciprocal, a probability density function, and the average Bhattacharyya distance. In the experiments, we used four public datasets, iLIDS-VID, GRID, PRID, and VIPeR, to verify the effectiveness of the proposed method. The results demonstrate a general improvement in person re-identification performance when the feature integration is weighted by the average Bhattacharyya distance.
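The following is a minimal sketch of the distance-level weighting described in the abstract, assuming each of the four local features is represented as a normalized histogram. The feature names, the `pretrain_weights` and `integrated_distance` helpers, and the inverse-of-average-distance weighting rule are illustrative assumptions, not the authors' published implementation; only the use of the Bhattacharyya distance and the pre-training over cross-camera pairs follow the abstract.

```python
# Sketch of weighted feature integration for person re-identification.
# Assumption: every local feature (color histogram, frequency feature,
# GLCM statistics, HOG) is given as a non-negative histogram vector.
import numpy as np

FEATURE_NAMES = ["color_hist", "frequency", "glcm", "hog"]  # assumed keys

def bhattacharyya_distance(p, q, eps=1e-12):
    """Bhattacharyya distance between two L1-normalized histograms."""
    p = p / (p.sum() + eps)
    q = q / (q.sum() + eps)
    bc = np.sum(np.sqrt(p * q))      # Bhattacharyya coefficient
    return -np.log(bc + eps)

def pretrain_weights(pairs):
    """Pre-training phase (assumed form): average the Bhattacharyya distance
    of each feature type over cross-camera image pairs of the same person,
    then map the averages to weights (smaller average difference -> larger
    weight), normalized to sum to one."""
    sums = {name: 0.0 for name in FEATURE_NAMES}
    for feats_a, feats_b in pairs:
        for name in FEATURE_NAMES:
            sums[name] += bhattacharyya_distance(feats_a[name], feats_b[name])
    avg = {name: sums[name] / len(pairs) for name in FEATURE_NAMES}
    inv = {name: 1.0 / (avg[name] + 1e-12) for name in FEATURE_NAMES}
    total = sum(inv.values())
    return {name: inv[name] / total for name in FEATURE_NAMES}

def integrated_distance(feats_query, feats_gallery, weights):
    """Matching score: weighted sum of per-feature Bhattacharyya distances."""
    return sum(weights[name] *
               bhattacharyya_distance(feats_query[name], feats_gallery[name])
               for name in FEATURE_NAMES)

if __name__ == "__main__":
    # Synthetic data only, to show the call sequence.
    rng = np.random.default_rng(0)
    rand_feats = lambda: {name: rng.random(32) for name in FEATURE_NAMES}
    train_pairs = [(rand_feats(), rand_feats()) for _ in range(10)]
    w = pretrain_weights(train_pairs)
    print("learned weights:", w)
    print("query-gallery distance:", integrated_distance(rand_feats(), rand_feats(), w))
```

In this sketch the weights are fixed after pre-training and reused for every query-gallery comparison; how the paper maps the average distance to a weight (and how the reciprocal and probability-density variants are defined) is detailed in the full text.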

Journal

  • Journal of the Japan Society for Precision Engineering, 82(12), 1119-1127, 2016

    Published by The Japan Society for Precision Engineering

Codes

  • NII Article ID (NAID)
    130005179277
  • Text Lang
    JPN
  • ISSN
    0912-0289
  • Data Source
    J-STAGE 