A Multiscale Method for HOG-Based Face Recognition

X Wei, H. Wang, Gongde Guo, Huan Wan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

10 Citations (Scopus)


Image representation is an important step in image classification, and many different methods exist for representing images. HOG (Histograms of Oriented Gradients) is a popular one that has been used in many applications, including face recognition, pedestrian detection and palmprint recognition. In this paper, a novel method is presented to improve HOG-based image classification by using multiscale features of images. For each image, multiple HOG feature vectors are extracted under different spatial dimensions (or 'scales'). These multiscale feature vectors are then fused into a distance function that calculates the distance between two images. Experiments have been conducted on the ORL, AR and FERET face databases. The results show that using multiscale HOG features leads to a significant improvement in performance over single-scale HOG features. They also show that a nearest neighbour classifier equipped with our distance function is comparable to a well-known and widely used benchmark classifier.
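The approach described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses simplified per-cell orientation histograms in place of full HOG (no block normalisation), and assumes equal fusion weights and example scales of 4, 8 and 16 pixels per cell, none of which are specified in the abstract.

```python
import numpy as np

def hog_features(img, cell_size, n_bins=9):
    """Simplified HOG: per-cell histograms of unsigned gradient orientations,
    weighted by gradient magnitude. Enough to illustrate the multiscale idea."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)               # unsigned orientation in [0, pi)
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    h, w = img.shape
    feats = []
    for y in range(0, h - cell_size + 1, cell_size):
        for x in range(0, w - cell_size + 1, cell_size):
            hist = np.zeros(n_bins)
            cm = mag[y:y + cell_size, x:x + cell_size].ravel()
            cb = bins[y:y + cell_size, x:x + cell_size].ravel()
            np.add.at(hist, cb, cm)                       # magnitude-weighted vote
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))
    return np.concatenate(feats)

def multiscale_distance(img_a, img_b, scales=(4, 8, 16), weights=None):
    """Fused distance: weighted sum of per-scale HOG feature distances.
    Equal weights are an assumption; the paper's fusion rule may differ."""
    if weights is None:
        weights = [1.0 / len(scales)] * len(scales)
    return sum(w * np.linalg.norm(hog_features(img_a, s) - hog_features(img_b, s))
               for w, s in zip(weights, scales))
```

A nearest neighbour classifier, as used in the experiments, would then label a probe image with the class of the gallery image minimising `multiscale_distance`.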
Original language: English
Title of host publication: International Conference on Intelligent Robotics and Applications
Subtitle of host publication: ICIRA 2015
Number of pages: 11
ISBN (Electronic): 978-3-319-22879-2
ISBN (Print): 978-3-319-22878-5
Publication status: Published (in print/issue) - 20 Aug 2015
Event: International Conference on Intelligent Robotics and Applications (ICIRA)
Duration: 1 Jan 2015 → …

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Nature

Conference: International Conference on Intelligent Robotics and Applications (ICIRA)
Period: 1/01/15 → …


  • Multiscale image descriptor
  • Face recognition
  • HOG


