Abstract
As a new type of neuromorphic sensor, the event-driven Dynamic Vision Sensor (DVS) offers low latency and a wide dynamic range, overcoming the limitations of traditional cameras in high-speed robotics and autonomous driving. However, the output of a DVS contains abundant unwanted noise events, which severely affect subsequent data processing. To tackle this problem, a novel noise filter for the DVS is proposed in this paper. Firstly, since the noise consists of high-frequency hot-pixel events and random background activity, a time window is constructed to continuously detect hot pixels. Secondly, a spatiotemporal correlation-based denoising step is applied to each incoming event while excluding its surrounding hot pixels, so that their high temporal correlation does not mislead the filter. Finally, the hot pixels are compensated with the events most likely to occur given their adjacent events, preserving the completeness and precision of the output event stream. Experiments in different scenes demonstrate that the proposed noise filter effectively eliminates noise from the DVS event stream and clearly outperforms the baseline method.
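The two filtering stages described in the abstract can be sketched in code. This is a minimal illustration, not the paper's implementation: the window length, hot-pixel threshold, correlation time constant, and 3×3 neighborhood are all assumed values chosen for clarity, and the hot-pixel compensation stage is omitted.

```python
import numpy as np

# Illustrative parameters (assumptions, not the paper's actual settings)
WIDTH, HEIGHT = 128, 128
HOT_WINDOW = 10_000      # µs window length for hot-pixel detection
HOT_THRESHOLD = 20       # events per window marking a pixel as hot
CORR_TAU = 5_000         # µs spatiotemporal correlation threshold

last_ts = np.full((HEIGHT, WIDTH), -np.inf)   # last event time per pixel
win_count = np.zeros((HEIGHT, WIDTH), int)    # event counts in current window
hot = np.zeros((HEIGHT, WIDTH), bool)         # hot-pixel mask
win_start = 0

def filter_event(x, y, t):
    """Return True if the event (x, y, t) survives the filter."""
    global win_start
    # Stage 1: count events per pixel over a time window; pixels firing
    # far above the threshold are flagged as hot for the next window.
    if t - win_start > HOT_WINDOW:
        np.greater(win_count, HOT_THRESHOLD, out=hot)
        win_count[:] = 0
        win_start = t
    win_count[y, x] += 1
    last_ts[y, x] = t
    if hot[y, x]:
        return False
    # Stage 2: keep the event only if some neighbor fired recently,
    # excluding hot pixels so their spurious events lend no support.
    y0, y1 = max(0, y - 1), min(HEIGHT, y + 2)
    x0, x1 = max(0, x - 1), min(WIDTH, x + 2)
    nbh_ts = last_ts[y0:y1, x0:x1].copy()
    nbh_ts[hot[y0:y1, x0:x1]] = -np.inf
    nbh_ts[y - y0, x - x0] = -np.inf          # ignore the pixel itself
    return bool((t - nbh_ts.max()) <= CORR_TAU)
```

An isolated event (no recent non-hot neighbor activity) is rejected, while an event arriving shortly after a neighboring pixel fired is kept, which is the essence of spatiotemporal correlation filtering.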
Acknowledgements
The work was supported by the National Natural Science Foundation of China (61773113) and the Primary Research & Development Plan of Jiangsu Province (BE2018384).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Xu, N., Zhao, J., Ren, Y., Wang, L. (2022). A Noise Filter for Dynamic Vision Sensor Based on Spatiotemporal Correlation and Hot Pixel Detection. In: Wu, M., Niu, Y., Gu, M., Cheng, J. (eds) Proceedings of 2021 International Conference on Autonomous Unmanned Systems (ICAUS 2021). ICAUS 2021. Lecture Notes in Electrical Engineering, vol 861. Springer, Singapore. https://doi.org/10.1007/978-981-16-9492-9_78
Publisher Name: Springer, Singapore
Print ISBN: 978-981-16-9491-2
Online ISBN: 978-981-16-9492-9
eBook Packages: Intelligent Technologies and Robotics (R0)