
Mulsemedia: State of the Art, Perspectives, and Challenges

Published: 01 October 2014

Abstract

Mulsemedia—multiple sensorial media—captures a wide variety of research efforts and applications. This article presents a historic perspective on mulsemedia work and reviews current developments in the area. These take place across the traditional multimedia spectrum—from virtual reality applications to computer games—as well as efforts in the arts, gastronomy, and therapy, to mention a few. We also describe standardization efforts, via the MPEG-V standard, and identify future developments and exciting challenges the community needs to overcome.
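
As a purely illustrative aid, the minimal Python sketch below shows the core idea behind mulsemedia authoring that the article surveys: timed sensory effects (scent, wind, vibration) synchronized against a media timeline, in the spirit of, but not conforming to, MPEG-V sensory effect metadata. All class, function, and parameter names here are hypothetical.

```python
# Illustrative sketch only: a toy, timed sensory-effect timeline in the spirit
# of MPEG-V sensory effect metadata. It does NOT follow the standard's actual
# schema or API; all names are hypothetical.

from dataclasses import dataclass, field


@dataclass(order=True)
class SensoryEffect:
    start_s: float                            # media time at which the effect fires (seconds)
    kind: str = field(compare=False)          # e.g. "scent", "wind", "vibration", "light"
    intensity: float = field(compare=False)   # normalized 0.0 .. 1.0
    duration_s: float = field(compare=False)  # how long the effect lasts


def effects_due(timeline, media_time_s, window_s=0.5):
    """Return effects whose start time falls inside the current sync window.

    A real mulsemedia renderer would also compensate for device-specific
    latency (e.g. scent diffusion delay); a fixed window is used here only
    for illustration.
    """
    return [e for e in sorted(timeline)
            if media_time_s <= e.start_s < media_time_s + window_s]


if __name__ == "__main__":
    timeline = [
        SensoryEffect(12.0, "scent", 0.6, 5.0),      # coffee aroma for a cafe scene
        SensoryEffect(12.2, "vibration", 0.8, 1.5),  # short haptic rumble
        SensoryEffect(40.0, "wind", 0.4, 10.0),      # fan effect for an outdoor shot
    ]
    for effect in effects_due(timeline, media_time_s=12.0):
        print(effect)
```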

• Published in

  ACM Transactions on Multimedia Computing, Communications, and Applications, Volume 11, Issue 1s
  Special Issue on Multiple Sensorial (MulSeMedia) Multimodal Media: Advances and Applications
  September 2014, 260 pages
  ISSN: 1551-6857
  EISSN: 1551-6865
  DOI: 10.1145/2675060

        Copyright © 2014 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        • Published: 1 October 2014
        • Accepted: 1 April 2014
        • Revised: 1 March 2014
        • Received: 1 December 2013
Published in TOMM Volume 11, Issue 1s


        Qualifiers

        • research-article
        • Research
        • Refereed
