Vision-based Urban Services


Real-time Communication for Distributed Vision System in Urban Services

Infrastructure-assisted autonomous driving is an emerging paradigm that is expected to significantly improve the driving safety of autonomous vehicles. The key enabler of this vision is enhancing a vehicle’s perception in real time by fusing data from multiple distributed 3D sensors, such as LiDARs mounted on roadside infrastructure and on passing vehicles. In this project, we have proposed new multi-hop wireless networks that achieve high-bandwidth communication between multiple smart lampposts by leveraging advanced network coding. We have also developed new systems that achieve perception fusion between driving vehicles and roadside infrastructure with decimeter-level accuracy and real-time latency (within 100 ms). The key idea behind our systems is highly efficient matching of data structures that encode lean representations of objects as well as their relationships, such as locations, semantics, sizes, and spatial distribution.
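The object-matching step described above could be sketched as follows. This is a minimal illustration, not the project's actual algorithm: the `Obj` record, the greedy nearest-match loop, and the cost function combining location distance and size difference are all assumptions made for clarity (a real system might use relationship graphs or optimal assignment instead).

```python
import math
from dataclasses import dataclass

@dataclass
class Obj:
    """Lean representation of a detected object (illustrative fields)."""
    obj_id: str
    cls: str      # semantic class, e.g. "car"
    x: float      # position in a shared ground frame, meters
    y: float
    size: float   # bounding-box footprint, square meters

def match_objects(vehicle_objs, infra_objs, max_dist=2.0):
    """Greedily pair vehicle-side and infrastructure-side detections.

    Candidates must share a semantic class and lie within max_dist
    meters; among candidates, the lowest combined location/size cost
    wins. Returns a list of (vehicle_id, infra_id) pairs.
    """
    pairs = []
    used = set()
    for v in vehicle_objs:
        best, best_cost = None, float("inf")
        for i in infra_objs:
            if i.obj_id in used or i.cls != v.cls:
                continue
            d = math.hypot(v.x - i.x, v.y - i.y)
            if d > max_dist:
                continue
            cost = d + abs(v.size - i.size)
            if cost < best_cost:
                best, best_cost = i, cost
        if best is not None:
            used.add(best.obj_id)
            pairs.append((v.obj_id, best.obj_id))
    return pairs
```

For example, a vehicle detection near (10.0, 5.0) would match an infrastructure detection of the same class at (10.3, 5.2) while ignoring a nearby detection of a different class; fused pairs can then be merged into a single perception output.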