
A Defocus Based Novel Keyboard Design

  • Conference paper
Human-Computer Interaction. Multimodal and Natural Interaction (HCII 2020)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 12182)


Abstract

Defocus-based depth estimation has been widely applied to constructing 3D setups from 2D images, reconstructing 3D scenes, and image refocusing. Defocus makes it possible to infer depth from a single image, using visual cues captured by a monocular camera. In this paper, we propose an application of depth from defocus to a novel, portable keyboard design. Our estimation technique is based on the observation that the depth of the finger with respect to the camera is correlated with its defocus blur, so a map can be obtained to detect the finger position accurately. We operate in the near-focus region, where the closer an object is to the camera, the greater its defocus blur. The proposed keyboard can be integrated with smartphones, tablets and personal computers, and only requires printing on plain paper or projection onto a flat surface. The detection approach tracks the finger's position as the user types, measures its defocus value when a key is pressed, and combines the measured defocus with a pre-calibrated relation between the defocus amount and the keyboard pattern to infer the finger's depth; together with the azimuth position of the stroke, this identifies the pressed key. Our minimalistic design requires only a monocular camera and no external hardware, making the proposed approach a cost-effective and feasible solution for a portable keyboard.
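The key-identification step described above can be sketched in a few lines: a pre-calibrated table relates measured defocus blur to keyboard rows (depth), and the azimuth position of the fingertip selects the column. The row boundaries, frame width, and keyboard layout below are illustrative assumptions, not the authors' actual calibration.

```python
import bisect

# Assumed calibration: blur-sigma boundaries (in pixels) separating the
# four keyboard rows. Under the near-focus assumption, a closer row
# produces a larger defocus blur at the fingertip.
ROW_SIGMA_BOUNDARIES = [1.5, 2.5, 3.5]

# Assumed printed layout: 4 rows x 10 columns.
LAYOUT = [
    list("1234567890"),
    list("qwertyuiop"),
    list("asdfghjkl;"),
    list("zxcvbnm,./"),
]

def identify_key(sigma, azimuth, frame_width=640):
    """Map a measured defocus value and horizontal finger position to a key.

    sigma       -- estimated defocus blur at the fingertip (pixels)
    azimuth     -- horizontal pixel coordinate of the fingertip
    frame_width -- camera frame width in pixels
    """
    # Row: find which calibrated blur interval sigma falls in.
    row = bisect.bisect_left(ROW_SIGMA_BOUNDARIES, sigma)
    # Column: divide the frame width evenly among the columns.
    cols = len(LAYOUT[0])
    col = min(int(azimuth / frame_width * cols), cols - 1)
    return LAYOUT[row][col]

print(identify_key(1.2, 50))   # small blur, left of frame -> "1"
print(identify_key(3.8, 600))  # large blur, right of frame -> "/"
```

In a real system the boundaries in `ROW_SIGMA_BOUNDARIES` would come from the calibration phase the abstract mentions, and `sigma` from a defocus-map estimator applied around the detected fingertip.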



Author information

Correspondence to Tushar Goswamy.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Gupta, P., Goswamy, T., Kumar, H., Venkatesh, K.S. (2020). A Defocus Based Novel Keyboard Design. In: Kurosu, M. (ed.) Human-Computer Interaction. Multimodal and Natural Interaction. HCII 2020. Lecture Notes in Computer Science, vol 12182. Springer, Cham. https://doi.org/10.1007/978-3-030-49062-1_25


  • DOI: https://doi.org/10.1007/978-3-030-49062-1_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-49061-4

  • Online ISBN: 978-3-030-49062-1

  • eBook Packages: Computer Science, Computer Science (R0)
