An Efficient Industrial System for Vehicle Tyre (Tire) Detection and Text Recognition Using Deep Learning

Wajahat Kazmi, Ian Nabney, George Vogiatzis, Peter Rose, Alex Codd

Research output: Contribution to journal › Article

Abstract

This paper addresses the challenge of reading low-contrast text on tyre-sidewall images of vehicles in motion. It presents a first-of-its-kind, full-scale industrial system that can read tyre codes when installed along driveways, such as at gas stations or parking lots, with vehicles driving at under 10 mph. The tyre's circular boundary is first detected using a circular Hough transform with dynamic radius detection. The detected tyre arches are then unwarped into rectangular patches. A cascade of convolutional neural network (CNN) classifiers is then applied for text recognition. First, a novel proposal generator for code localization is introduced by integrating convolutional layers that produce HOG-like (Histogram of Oriented Gradients) features into a CNN. The proposals are then filtered using a deep network. After the code is localized, character detection and recognition are carried out by two separate deep CNNs. The results (accuracy, repeatability and efficiency) are strong and show promise for the intended application.
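The circle-detection step mentioned in the abstract can be illustrated with a plain circular Hough voting accumulator. This is a hedged, dependency-free sketch, not the authors' implementation: the angular step, bin size and candidate radii are arbitrary assumptions, and a production system would run this on real edge maps with dynamic radius selection.

```python
import math
from collections import defaultdict

def hough_circles(edge_points, radii, bin_size=1):
    """Vote for circle centres (a, b) over a set of candidate radii.

    For each edge point (x, y) and each radius r, the possible centres
    lie on a circle of radius r around that point; we discretise that
    circle and accumulate votes. The bin (a, b, r) with the most votes
    is the best circle hypothesis.
    """
    acc = defaultdict(int)
    for (x, y) in edge_points:
        for r in radii:
            for t in range(0, 360, 5):  # coarse angular sampling (assumption)
                a = round((x - r * math.cos(math.radians(t))) / bin_size)
                b = round((y - r * math.sin(math.radians(t))) / bin_size)
                acc[(a, b, r)] += 1
    return max(acc, key=acc.get)

# Synthetic edge points on a circle of radius 20 centred at (50, 50)
pts = [(50 + 20 * math.cos(math.radians(t)),
        50 + 20 * math.sin(math.radians(t)))
       for t in range(0, 360, 10)]
best = hough_circles(pts, radii=[15, 20, 25])  # -> (50, 50, 20)
```

Searching over a list of candidate radii, as here, is the usual way to extend the transform from circles of known size to the dynamic-radius setting the paper describes.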
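Unwarping a detected tyre arch into a rectangular patch is essentially a polar-to-Cartesian remap: rows of the output index radius across the annulus, columns index angle along the arch. A minimal nearest-neighbour sketch follows; the annulus bounds, angular range and patch size are illustrative assumptions, not values from the paper.

```python
import math

def unwarp_arch(img, cx, cy, r_in, r_out, th0, th1, out_w, out_h):
    """Sample an annular arch of `img` into an out_h x out_w rectangle.

    Each output pixel (u, v) is mapped back to polar coordinates in the
    source: v spans radii r_in..r_out and u spans angles th0..th1.
    Nearest-neighbour sampling keeps the sketch dependency-free.
    """
    out = [[0] * out_w for _ in range(out_h)]
    for v in range(out_h):
        r = r_in + (r_out - r_in) * v / max(out_h - 1, 1)
        for u in range(out_w):
            th = th0 + (th1 - th0) * u / max(out_w - 1, 1)
            x = int(round(cx + r * math.cos(th)))
            y = int(round(cy + r * math.sin(th)))
            if 0 <= y < len(img) and 0 <= x < len(img[0]):
                out[v][u] = img[y][x]
    return out

# Synthetic 100x100 "image" whose pixel value encodes distance from the
# centre, so each unwarped row should be (nearly) constant.
img = [[int(math.hypot(x - 50, y - 50)) for x in range(100)]
       for y in range(100)]
patch = unwarp_arch(img, 50, 50, 20, 40, 0.0, math.pi / 2, 16, 8)
```

After this remap, the curved sidewall text lies roughly on horizontal lines, which is what makes the subsequent CNN-based code localization and character recognition tractable.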
Original language: English
Journal: IEEE Transactions on Intelligent Transportation Systems
Early online date: 24 Jan 2020
Publication status: E-pub ahead of print, 24 Jan 2020

Bibliographical note

© 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

