Asphalt road layer detection for automated construction progress monitoring

PhD candidate Steven M. Vick and Laing O'Rourke Reader Dr Ioannis Brilakis are researching methods for automating construction progress monitoring on large civil infrastructure projects, focusing specifically on detecting progress through the layers of asphalt road construction. Current practice for identifying such progress relies on visual surveys and the collation and interpretation of data from a variety of sources, including daily checklists, site photographs, verbal reports, and material receipt ledgers. Studies have shown that this approach is inconsistent, owing to its inherent subjectivity, and arduous, consuming as much as 50% of project management's time. Advances in image sensors, photogrammetry, and laser scanning have made it possible to collect dense and accurate as-built data. Unmanned aerial systems, in particular, provide a promising platform for collecting such data remotely without interfering with ongoing work. However, distilling these data into meaningful progress information remains a predominantly manual task.

One goal of Mr Vick's research is to facilitate the automated interpretation of this spatial data by automatically recognizing 3D design surface layers in the point cloud scene. To accomplish this, a novel hierarchical data structure, the BrickTree, is proposed. This structure allows incremental detection of design layers in discrete regions of as-built point cloud data. Its hierarchical nature ensures that only one layer can be recognized in each discrete region of the road corridor, and enables rapid consensus testing in the local neighborhood to account for random errors and noise in the data. A proof-of-concept software application was developed in C++ and C# on an in-house coding platform named Gygax. The solution was then tested and verified on real-world data collected on two separate days during the construction of a residential road in Cambridge, UK. The experimental results culminated in an overall F1 score of 91.6% across the two datasets.