Cheng, Ming-Ming, Liu, Yun, Lin, Wen-Yan, Zhang, Ziming, Rosin, Paul L. (ORCID: https://orcid.org/0000-0002-4965-3884) and Torr, Philip H. S. 2019. BING: Binarized normed gradients for objectness estimation at 300fps. Computational Visual Media 5(1), pp. 3-20. doi: 10.1007/s41095-018-0120-1
PDF - Accepted Post-Print Version, available under a Creative Commons Attribution license.
Abstract
Training a generic objectness measure to produce object proposals has recently become of significant interest. We observe that generic objects with well-defined closed boundaries can be detected by looking at the norm of gradients, with a suitable resizing of their corresponding image windows to a small fixed size. Based on this observation, and for computational reasons, we propose to resize the window to 8 × 8 and use the norm of the gradients as a simple 64D feature to describe it, for explicitly training a generic objectness measure. We further show how the binarized version of this feature, namely binarized normed gradients (BING), can be used for efficient objectness estimation, which requires only a few atomic operations (e.g., add, bitwise shift, etc.). To improve the localization quality of the proposals while maintaining efficiency, we propose a novel fast segmentation method and demonstrate its effectiveness in improving BING's localization performance when used in multi-thresholding straddling expansion (MTSE) post-processing. On the challenging PASCAL VOC2007 dataset, using 1000 proposals per image and an intersection-over-union threshold of 0.5, our proposal method achieves a 95.6% object detection rate and 78.6% mean average best overlap in less than 0.005 seconds per image.
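The 64D feature described in the abstract can be sketched in a few lines: shrink a candidate window to 8 × 8 and take the gradient norm at each cell. This is an illustrative sketch only — the function name is mine, the resize here is nearest-neighbour sampling, and the gradient uses simple forward differences with min(|gx| + |gy|, 255) as the norm; the paper's exact resize filter, gradient kernel, and the subsequent binarization into BING are not reproduced.

```python
def normed_gradient_feature(gray, size=8):
    """Sketch of a BING-style normed-gradient descriptor for one window.

    `gray` is a 2-D list of grayscale values (rows of equal length).
    The window is downsampled to `size` x `size` by nearest-neighbour
    sampling (assumption: the paper may use a different resize), then
    the gradient norm min(|gx| + |gy|, 255) is computed per cell,
    giving a size*size (= 64) dimensional feature.
    """
    h, w = len(gray), len(gray[0])
    # Nearest-neighbour downsample to a fixed size x size grid.
    small = [[gray[r * h // size][c * w // size] for c in range(size)]
             for r in range(size)]
    feat = []
    for r in range(size):
        for c in range(size):
            # Forward differences, clamped at the right/bottom borders.
            gx = abs(small[r][min(c + 1, size - 1)] - small[r][c])
            gy = abs(small[min(r + 1, size - 1)][c] - small[r][c])
            feat.append(min(gx + gy, 255))
    return feat  # flattened 64-D normed-gradient feature
```

A learned 64D linear model scored against this feature gives the objectness estimate; the paper's speed comes from approximating both the model and the feature with binary vectors so the dot product reduces to adds and bitwise shifts.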
| Item Type: | Article |
|---|---|
| Date Type: | Publication |
| Status: | Published |
| Schools: | Computer Science & Informatics |
| Publisher: | SpringerOpen |
| ISSN: | 2096-0433 |
| Date of First Compliant Deposit: | 1 October 2018 |
| Date of Acceptance: | 27 May 2018 |
| Last Modified: | 29 Nov 2024 03:00 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/115427 |
Citation Data
Cited 52 times in Scopus.