Fast Segmentation of Industrial Quality Pavement Images using Laws Texture Energy Measures and k-Means Clustering

Senthan Mathavan, Akash Kumar, Khurram Kamal, Michael Nieminen, Hitesh Shah, Mujib Rahman

Research output: Contribution to journal › Article › peer-review

Abstract

Thousands of pavement images are collected daily by road authorities for condition monitoring surveys. These images typically show intensity variations and texture nonuniformities that make their segmentation challenging. Automated segmentation of such pavement images is crucial for accurate, thorough, and expedited health monitoring of roads. In the pavement monitoring area, well-known texture descriptors such as gray-level co-occurrence matrices and local binary patterns are often used for surface segmentation and identification. Despite being established methods for texture discrimination, these are inherently slow. This work evaluates Laws texture energy measures, for the first time, as a viable alternative for pavement images. k-means clustering is used to partition the feature space, limiting human subjectivity in the process. Data classification, and hence image segmentation, is performed by the k-nearest neighbor method. Laws texture energy masks are shown to perform well, with resulting accuracy and precision values above 80%. Implementations of the algorithm, in both MATLAB® and OpenCV/C++, are compared extensively against the state of the art for execution speed, clearly showing the advantages of the proposed method. Furthermore, the OpenCV-based segmentation shows a 100% increase in processing speed over the fastest algorithm available in the literature.
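The paper's exact pipeline is not reproduced here, but as an illustration of the general approach the abstract describes (Laws texture energy filtering followed by k-means clustering of the feature space), a minimal NumPy-only sketch might look like the following. The particular mask set, the 15×15 energy window, and the maximin centroid initialisation are assumptions for this sketch, not the authors' published configuration:

```python
import numpy as np

# Standard Laws 1-D vectors: Level, Edge, Spot, Ripple
L5 = np.array([1, 4, 6, 4, 1], float)
E5 = np.array([-1, -2, 0, 2, 1], float)
S5 = np.array([-1, 0, 2, 0, -1], float)
R5 = np.array([1, -4, 6, -4, 1], float)

def filter2_valid(img, k):
    # 'valid'-mode 2-D correlation via sliding windows (small kernels only)
    win = np.lib.stride_tricks.sliding_window_view(img, k.shape)
    return np.einsum('ijkl,kl->ij', win, k)

def laws_energy_features(img, win=15):
    # 2-D Laws masks are outer products of the 1-D vectors (16 total);
    # L5L5 responds mainly to intensity, so it is commonly dropped.
    masks = [np.outer(a, b) for a in (L5, E5, S5, R5)
                            for b in (L5, E5, S5, R5)][1:]
    box = np.ones((win, win)) / (win * win)
    feats = [filter2_valid(np.abs(filter2_valid(img, m)), box)
             for m in masks]  # local average of |response| = texture energy
    return np.stack(feats, axis=-1)  # (H', W', 15) feature image

def kmeans(X, k=2, iters=20):
    # Deterministic maximin (farthest-first) seeding, then Lloyd iterations
    C = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in C], axis=0)
        C.append(X[d.argmax()])
    C = np.array(C, float)
    for _ in range(iters):
        lab = ((X[:, None, :] - C) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if np.any(lab == j):
                C[j] = X[lab == j].mean(0)
    return lab, C

if __name__ == "__main__":
    # Synthetic two-texture image: near-flat left half, noisy right half
    rng = np.random.default_rng(1)
    img = np.zeros((60, 60))
    img[:, :30] += 0.01 * rng.standard_normal((60, 30))
    img[:, 30:] += rng.standard_normal((60, 30))
    F = laws_energy_features(img)
    labels, _ = kmeans(F.reshape(-1, F.shape[-1]), k=2)
    print(np.bincount(labels))  # rough cluster sizes for the two textures
```

Each pixel is thus described by 15 local energy values, and clustering those vectors assigns every pixel a texture class without hand-tuned thresholds; in the paper, a k-nearest neighbor classifier trained on such clustered data then performs the segmentation on new images.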
Original language: English
Article number: 053010
Journal: Journal of Electronic Imaging
Volume: 25
Issue number: 5
DOIs
Publication status: Published - 16 Sept 2016

