The future of camera networks: staying smart in a chaotic world

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution






Camera networks become smart when they can interpret video data on board in order to carry out tasks as a collective, such as target tracking and (re-)identification of objects of interest. Unlike today's deployments, which are mainly restricted to lab settings and highly controlled high-value applications, future smart camera networks will be messy and unpredictable. They will operate on a vast scale, drawing on mobile resources connected in networks structured in complex and changing ways. They will comprise heterogeneous and decentralised aggregations of visual sensors, which will come together in temporary alliances, in unforeseen and rapidly unfolding scenarios. The potential to include and harness citizen-contributed mobile streaming, body-worn video, and robot-mounted cameras, alongside more traditional fixed or PTZ cameras, and supported by other non-visual sensors, leads to a number of difficult and important challenges. In this position paper, we discuss a variety of potential uses for such complex smart camera networks, and some of the challenges that arise when staying smart in the presence of such complexity. We present a general discussion on the challenges of heterogeneity, coordination, self-reconfigurability, mobility, and collaboration in camera networks.
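The collective tracking with handoff described above can be illustrated with a minimal sketch. All names here (`Camera`, `track`, the 1-D corridor model) are hypothetical, invented for illustration and not taken from the paper: each node covers a stretch of a corridor, the current tracker keeps the target while it stays in view, and otherwise hands off to any allied camera that can see it.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    """Hypothetical camera node covering an interval [lo, hi] of a 1-D corridor."""
    name: str
    lo: float
    hi: float

    def sees(self, x: float) -> bool:
        # A node reports the target only when it falls inside its field of view.
        return self.lo <= x <= self.hi

def track(cameras, path):
    """Greedy decentralised handoff: the current tracker keeps the target
    while it remains in view; otherwise any camera that sees it takes over."""
    tracker = None
    log = []
    for x in path:
        if tracker is None or not tracker.sees(x):
            # Handoff: query the temporary alliance for a node covering x.
            tracker = next((c for c in cameras if c.sees(x)), None)
        log.append((x, tracker.name if tracker else "lost"))
    return log

# A heterogeneous alliance: a fixed camera, a PTZ camera, a mobile camera.
cams = [Camera("fixed-A", 0, 4), Camera("ptz-B", 3, 8), Camera("mobile-C", 7, 12)]
print(track(cams, [1, 3, 5, 8, 11, 13]))
# → [(1, 'fixed-A'), (3, 'fixed-A'), (5, 'ptz-B'), (8, 'ptz-B'), (11, 'mobile-C'), (13, 'lost')]
```

The greedy rule is deliberately simplistic; the paper's point is precisely that real deployments must handle churn, mobility, and reconfiguration that such a static sketch ignores.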



Publication date: 7 Sep 2017
Publication title: 11th International Conference on Distributed Smart Cameras
Number of pages: 6
ISBN (Electronic): 978-1-4503-5487-5
Original language: English
Event: 11th International Conference on Distributed Smart Cameras - Stanford University, Palo Alto, United States
Duration: 5 Sep 2017 – 7 Oct 2017


Conference: 11th International Conference on Distributed Smart Cameras
Country: United States
City: Palo Alto

Bibliographic note

Copyright: The authors



