
A Noise Filter for Dynamic Vision Sensor Based on Spatiotemporal Correlation and Hot Pixel Detection

  • Conference paper
Proceedings of 2021 International Conference on Autonomous Unmanned Systems (ICAUS 2021)

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 861)


Abstract

As a new type of neuromorphic sensor, the event-driven Dynamic Vision Sensor (DVS) offers the advantages of low latency and a wide dynamic range. The development of DVS overcomes the limitations of traditional cameras in applications such as high-speed robotics and autonomous driving. However, the output of a DVS contains abundant unwanted noise events, which severely affect subsequent data processing. To tackle this problem, a novel noise filter for DVS is proposed in this paper. First, since the noise consists of high-frequency hot pixel events and random background activity, a time window is constructed to continuously detect hot pixels. Second, a spatiotemporal correlation-based denoising step is applied to each incoming event while excluding its surrounding hot pixels, so that their high temporal correlation does not degrade the filter. Finally, the hot pixels are compensated with the events most likely to occur given the adjacent events, preserving the completeness and precision of the output event stream. Experiments in different scenes demonstrate that the proposed noise filter effectively eliminates noise from the DVS event stream and clearly outperforms the baseline method.
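The abstract outlines a three-stage pipeline: hot pixel detection over a sliding time window, spatiotemporal correlation filtering that ignores hot neighbours, and compensation of hot pixels. The Python sketch below illustrates only the first two stages under stated assumptions; the class name, event layout (x, y, timestamp in microseconds, polarity), window length, and thresholds are placeholders for illustration, not the authors' implementation.

    # Minimal sketch of hot-pixel detection plus spatiotemporal correlation filtering.
    # All parameter values and names are assumptions, not the paper's implementation.
    import numpy as np

    class HotPixelSpatiotemporalFilter:
        def __init__(self, width, height, window_us=100_000,
                     hot_count_thresh=50, corr_time_us=5_000):
            self.width, self.height = width, height
            self.window_us = window_us                # length of the hot-pixel detection window
            self.hot_count_thresh = hot_count_thresh  # events per window that mark a pixel as hot
            self.corr_time_us = corr_time_us          # correlation time threshold for neighbours
            self.last_ts = np.full((height, width), -np.inf)  # last event timestamp per pixel
            self.window_counts = np.zeros((height, width), dtype=np.int64)
            self.window_start = 0
            self.hot = np.zeros((height, width), dtype=bool)

        def _update_hot_pixels(self, t):
            # Close the current window and flag pixels that fired too often.
            if t - self.window_start >= self.window_us:
                self.hot = self.window_counts >= self.hot_count_thresh
                self.window_counts[:] = 0
                self.window_start = t

        def process(self, x, y, t, p):
            """Return True if the event (x, y, t, p) is kept as signal."""
            self._update_hot_pixels(t)
            self.window_counts[y, x] += 1
            self.last_ts[y, x] = t
            if self.hot[y, x]:
                return False  # hot-pixel events are discarded (compensation not shown here)
            # Keep the event only if a non-hot neighbour fired recently.
            y0, y1 = max(0, y - 1), min(self.height, y + 2)
            x0, x1 = max(0, x - 1), min(self.width, x + 2)
            neigh_ts = self.last_ts[y0:y1, x0:x1]
            neigh_hot = self.hot[y0:y1, x0:x1]
            recent = (t - neigh_ts) <= self.corr_time_us
            recent[y - y0, x - x0] = False  # ignore the event's own pixel
            return bool(np.any(recent & ~neigh_hot))

As a hypothetical usage example, filtering a small event list on a 240 x 180 sensor would keep only events with a recent, non-hot neighbour:

    filt = HotPixelSpatiotemporalFilter(width=240, height=180)
    events = [(120, 90, 1_000, 1), (121, 90, 1_200, 1), (10, 10, 1_500, 0)]
    kept = [e for e in events if filt.process(*e)]  # only the second event survives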



Acknowledgements

This work was supported by the National Natural Science Foundation of China (61773113) and the Primary Research & Development Plan of Jiangsu Province (BE2018384).


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Xu, N., Zhao, J., Ren, Y., Wang, L. (2022). A Noise Filter for Dynamic Vision Sensor Based on Spatiotemporal Correlation and Hot Pixel Detection. In: Wu, M., Niu, Y., Gu, M., Cheng, J. (eds) Proceedings of 2021 International Conference on Autonomous Unmanned Systems (ICAUS 2021). ICAUS 2021. Lecture Notes in Electrical Engineering, vol 861. Springer, Singapore. https://doi.org/10.1007/978-981-16-9492-9_78

