Memory-efficient NBNN image classification

Abstract
Naive Bayes nearest neighbor (NBNN) is a simple image classifier based on identifying nearest neighbors. NBNN uses original image descriptors (e.g., SIFT) without vector quantization, preserving the discriminative power of the descriptors, and generalizes well. However, it has a distinct disadvantage: its memory requirement can be prohibitively high when processing a large amount of data. To address this problem, we apply a spherical hashing binary code embedding technique to compactly encode data without significantly sacrificing classification accuracy. We also propose using an inverted index to identify nearest neighbors among binarized image descriptors. To demonstrate the benefits of our method, we apply it to two existing NBNN techniques on an image dataset. Using a 64-bit code length, we reduce memory consumption by a factor of 16 while achieving higher runtime performance and no significant loss of classification accuracy. This result is achieved by our compact encoding scheme, which preserves most of the information in the original image descriptors.
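The sketch below illustrates the overall idea described in the abstract: descriptors are embedded into short binary codes and NBNN then sums, per class, the Hamming distance from each query descriptor to its nearest neighbor in that class. It is a minimal illustration, not the paper's implementation: random-hyperplane hashing stands in for spherical hashing, a brute-force Hamming scan stands in for the inverted-index lookup, and all names and dimensions are hypothetical. The reported 16x memory reduction is consistent with replacing a 128-byte SIFT descriptor with an 8-byte (64-bit) code, assuming byte-valued SIFT storage.

```python
# Minimal NBNN sketch over binarized descriptors (illustrative only).
# Assumptions not taken from the paper: random-hyperplane hashing instead of
# spherical hashing, brute-force Hamming search instead of an inverted index.
import numpy as np

BITS = 64        # code length, matching the 64-bit setting discussed above
DESC_DIM = 128   # e.g., SIFT descriptor dimensionality

rng = np.random.default_rng(0)
hyperplanes = rng.standard_normal((DESC_DIM, BITS))  # hypothetical hash functions

def binarize(descriptors):
    """Embed real-valued descriptors into packed BITS-bit binary codes."""
    bits = (descriptors @ hyperplanes) > 0
    return np.packbits(bits, axis=1)          # shape: (n, BITS // 8), uint8

def hamming(code, codes):
    """Hamming distance between one packed code and an array of packed codes."""
    return np.unpackbits(code ^ codes, axis=1).sum(axis=1)

def nbnn_classify(query_descriptors, class_codes):
    """NBNN decision rule: for each class, sum the distance from every query
    descriptor to its nearest neighbor in that class; return the class with
    the smallest total distance."""
    q = binarize(query_descriptors)
    totals = {}
    for label, codes in class_codes.items():
        totals[label] = sum(hamming(code, codes).min() for code in q)
    return min(totals, key=totals.get)

# Toy usage with random "descriptors" for two classes.
class_codes = {
    "cat": binarize(rng.standard_normal((500, DESC_DIM))),
    "dog": binarize(rng.standard_normal((500, DESC_DIM))),
}
print(nbnn_classify(rng.standard_normal((50, DESC_DIM)), class_codes))
```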
Publisher
Korean Institute of Information Scientists and Engineers
Issue Date
2017-03
Language
English
Article Type
Article
Keywords
Bins; Encoding (symbols); Indexing (of information); Classification accuracy; Discriminative power; Embedding technique; Hashing; Memory efficiency; Memory requirements; NBNN; Run-time performance; Image classification
Citation
Journal of Computing Science and Engineering, v.11, no.1, pp. 1-8
ISSN
1976-4677
DOI
10.5626/JCSE.2017.11.1.1
URI
http://hdl.handle.net/10203/225261
Appears in Collection
CS-Journal Papers (Journal Papers)