This paper addresses the challenge of reading low-contrast text on tyre sidewall images of vehicles in motion. It presents a first-of-its-kind, full-scale industrial system that can read tyre codes when installed along driveways, such as at gas stations or parking lots, with vehicles driving at under 10 mph. Tyre circularity is first detected using a circular Hough transform with dynamic radius detection. The detected tyre arches are then unwarped into rectangular patches. A cascade of convolutional neural network (CNN) classifiers is then applied for text recognition. First, a novel proposal generator for code localization is introduced by integrating convolutional layers that produce HOG-like (Histogram of Oriented Gradients) features into a CNN. The proposals are then filtered using a deep network. After the code is localized, character detection and recognition are carried out by two separate deep CNNs. The results (accuracy, repeatability, and efficiency) are impressive and show promise for the intended application.
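The unwarping step described in the abstract, mapping a detected circular tyre arch to a rectangular patch, can be sketched as a polar-to-Cartesian resampling. This is a minimal illustration only, assuming nearest-neighbour sampling; the function name and parameters are hypothetical and do not reflect the authors' actual implementation.

```python
import numpy as np

def unwarp_annulus(img, cx, cy, r_inner, r_outer, n_theta=360):
    """Sample an annular band of `img` into a rectangular patch.

    Rows of the output correspond to radius (r_inner..r_outer),
    columns to angle (0..2*pi), using nearest-neighbour lookup.
    """
    n_r = int(r_outer - r_inner)
    radii = np.linspace(r_inner, r_outer, n_r)
    thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    # Grid of polar coordinates -> Cartesian pixel indices.
    r, t = np.meshgrid(radii, thetas, indexing="ij")
    xs = np.clip((cx + r * np.cos(t)).astype(int), 0, img.shape[1] - 1)
    ys = np.clip((cy + r * np.sin(t)).astype(int), 0, img.shape[0] - 1)
    return img[ys, xs]

# Synthetic check: an image whose value equals distance from the centre
# should unwarp into a patch whose rows are (almost) constant.
img = np.fromfunction(lambda y, x: np.hypot(x - 100.0, y - 100.0), (200, 200))
patch = unwarp_annulus(img, 100, 100, 40, 80)
```

In the full pipeline the circle centre and radius would come from the circular Hough transform stage; here they are supplied directly for illustration.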
Pages (from-to): 1264-1275
Number of pages: 12
Journal: IEEE Transactions on Intelligent Transportation Systems
Early online date: 24 Jan 2020
Publication status: Published - Feb 2021
Bibliographical note: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
- Intelligent vehicles
- Optical Character Recognition (OCR)
- computer vision
- deep learning
- tyre (tire) sidewall