
A look at the various uses of VR

  • Patrik Voštinár, Dana Horváthová, Martin Mitter and Martin Bako
From the journal Open Computer Science

Abstract

Virtual, augmented and mixed reality (VR, AR and MR) have infiltrated not only gaming, industry, engineering, live events, entertainment, real estate, retail and the military but also, as surveys indicate, healthcare and education. In all these areas there is a lack of software development experts for VR, AR and MR to meet the needs of practice. Therefore, our intention at the Department of Computer Science, Faculty of Natural Sciences, Matej Bel University in Banská Bystrica, Slovakia, is to focus on education and enlightenment in these areas. The aim of this article is to show the role of interactivity in different VR applications and its impact on users in three different areas: gaming, healthcare and education. For one of the applications, Arachnophobia, we also present the results of a questionnaire-based study.

1 Introduction

VR offers a unique personal experience that alters one’s perception of their surroundings. While VR headsets have many mechanical limitations, the immersion is strong enough that the elusive goal of presence is reached [1]. The terms “presence” and “immersion” are discussed by a wide field of researchers. “Presence” generally refers to the sensation of “being there”, or as a subjective phenomenon such as the sensation of being in a virtual environment. “Immersion” is the extent to which the senses are engaged by the mediated environment, or an objective description of aspects of the system such as field of view and display resolution [2].

In the context of virtual reality, the term immersion is used to describe the user’s emotional reaction to the virtual world in terms of feeling as if they are actually a part of it [3]. A VR headset offers a good immersion experience, as this device not only delivers visually perceived information, but is also able to generate or convey the sound of the simulated environment, which then reinforces the impression of being in the virtual world [4].

In ’Understanding Virtual Reality’, Sherman and Craig describe virtual reality (VR) as a medium consisting of computer simulations that give the user a sense of presence in the simulations [5]. In other words, it is a completely virtual world made by a computer in which the user interacts only with virtual objects. Virtual reality technology has traditionally consisted of cumbersome, purpose-built environments and has often required complex sensors worn on the body for an individual to interact with the environment. The emergence of head-mounted virtual reality devices is shifting the technology into the commercial consumer area [6].

After decades of experimentation with VR software and hardware, beginning in the late 1980s, consumer mind-sets are finally ready for the immersive VR experiences its early visionaries dreamed of [7].

Six years ago, we (at the Department of Computer Science in Banská Bystrica, Slovakia) were inspired by the idea of a psychotherapist to create VR applications that he could use in the treatment of phobias. He was willing to share his experience with us, and together we formulated requirements that we gradually began to incorporate into our projects. As a result, we became more interested in VR. During these six years of research and investigation, we gradually established ourselves in all the areas described in this article. So it is not only the treatment of phobias, but also games, which, thanks to game engines, we also connect with phobia treatment. In our case, both areas are also connected to education, which works best when there is a realistic goal.

The first field we focus on in this paper is games, which has always been, and probably always will be, a pioneer in pushing the boundaries of VR development. We would therefore like to show in our research how a simple game can be useful from the perspective of interactivity. We present an application that demonstrates different possibilities of using interactive elements. Its main function is to show the mutual influence of the user and the virtual reality environment. In four different rooms there are a number of objects that the user can grab and manipulate. There is a living room with furniture and various objects that can be grabbed and carried. From this room the user can enter the next three rooms, which demonstrate the interactivity, physics and credibility of the virtual environment. At the same time, they train movement skills and spatial coordination.

The second field is the usage of VR for healthcare. Experiments from several research studies [8, 9, 10, 11, 12, 13] show the success and effectiveness of treatment of some phobias with VR.

VR has been used extensively for treating phobias (such as fear of heights, closed spaces, flying and public speaking) and post-traumatic stress disorder. This kind of therapy replaces traditional cognitive-behavioural therapy and has shown effective outcomes and remarkable results. Computer-based simulation technology allows users to interact with the virtual environment and has shown numerous advantages over real-world training [14].

We have developed several applications that can be used to treat different phobias. Here we focus on the treatment of one of the most widespread specific phobias, arachnophobia – the irrational fear of spiders. The application is designed to be controlled not only by the patient, but also by the therapist. This way, the therapist has the full process under control. Over the last few months, we have been testing the application and gathering feedback via a questionnaire. In cooperation with the therapist, we also performed the therapy itself, which we consider a great success, because the application has become usable for the purpose for which it was designed.

The third field is education. VR can have a positive impact on education. It makes learning more entertaining, and students interacting with each other in a 3D environment learn new things faster than with the traditional education system. Students can learn by having fun in a VR environment [9]. VR breaks down geographical boundaries. For schools this can be priceless, as it means that students can virtually visit places that are beyond their means in the real world. VR allows students to travel in time, experience the past and the future, and actually see what cannot be seen in the real world (objects of the micro and macro world). Thanks to VR it is possible to move within a virtual space and engage with elements or manipulate various objects. All of this engages learners like never before. VR allows students to break the laws of physics and to do things they cannot or do not know how to do in the real world. It opens up new learning possibilities in the classroom. Using VR to practice and hone skills without fear of failure is incredibly powerful and can help students build confidence in new areas of learning. This could mean, for example, using VirtualSpeech to practice public speaking. The very nature of VR being framed inside a headset means that learners are less prone to distractions from their physical surroundings. For some students this can be immensely beneficial, as they may be prone to distractions leading to loss of focus and ultimately loss of learning. Students using VR can connect with other students and attend lectures and lessons delivered by educators across the globe [15].

Thanks to our research, projects and collaboration with companies, we have the necessary software and hardware with which to develop applications, such as an HTC Vive headset with a powerful PC and a dedicated graphics card. A pair of HTC Vive controllers, whose movement is transmitted to the PC using a couple of Base Station sensors, ensures interactivity.

2 Virtual reality in games

According to [10] a “game” is a system within which players traditionally engage in an artificial conflict, trying to solve a specific problem. A game is defined by rules and measured by a quantifiable outcome. Usually a game has several key elements:

  1. A specific goal that people are willing to work for.

  2. Rules that stimulate creativity.

  3. A feedback system that lets individuals know how they are doing with respect to the goal.

  4. Voluntary acceptance of the goal, rules, and feedback systems.

VR and digital games (DG) are two areas which share many similar characteristics. Both of them have to focus on the human to succeed (the feeling of presence for VR, entertainment for DG). VR and DG exploit the technological breakthroughs of several fields, such as image synthesis and electronics. And finally, sometimes they both act in a VR world like a flight simulator or therapy environment; other times, they allow operating in a fantasy world which does not necessarily respect the usual laws of physics [16]. Though modern technological approaches to VR originated in gaming, its applications extend beyond those platforms [17]. Many principles and techniques in conventional computer game design allow for creating immersive, interesting and fun games [18]. Virtual reality gaming is where a person can experience being in a three-dimensional environment and can interact with that environment during a game; this is an essential part of the game [19]. Since all players want to be immersed when playing a game, in particular when playing in a virtual environment, and since the feeling of immersing oneself can be strongly influenced by being able to interact with the environment and the objects in it, we also focus on ways to convincingly deceive the senses.

2.1 Application VR Interactivity

The first application we want to introduce is the already mentioned “four-room flat”. It is an interactive virtual environment that offers a sport-adventure experience.

There are three doors in the living room. Upon entering the Volleyball door, the user moves to a beach island with Hawaiian music playing in the background. In addition to decorative sculptures, a table and a door, there is also a volleyball court. The player is able to pick up the ball and hit it so that it flies to the other side of the volleyball net. If he succeeds, the system awards points for a successful score, which appear above the board.

The other door – the lightsaber door – takes the user to the Jedi Cave on Tatooine, with action music playing in the background. There is a villain, an enemy soldier, who shoots at the user. The user can take a lightsaber into his hand and use it to deflect the missiles. Positive points are added for each successful deflection. The colour of the blade can be changed, and the player can even hold a double-sided lightsaber. Figure 1 shows the door to the Jedi Cave on Tatooine.

Figure 1: Software Unreal Engine and the Jedi Cave room.

The third and last door, labelled Basketball, teleports the user to a basketball arena. There are basketballs scattered on the floor and a basketball hoop in front of the player. Sports music imitating a match plays in the background. The user can grab a ball using the controller. With the correct swing, direction and release, the ball can be thrown into the basketball hoop. If the throw is successful, the score is credited on the screen. The ball collides with other balls, and after an accidental collision they move too, demonstrating functional and believable physics.

2.2 Creation tools

The software Unreal Engine was used to create the application and its virtual environment, and the project was adapted to the HTC Vive hardware mentioned above. We placed the player’s camera in a new level, which contained the virtual reality and collision controllers. We then applied textured surfaces to 3D models, which we modelled in Cinema 4D and imported into Unreal Engine. Here we set up collisions so that, for example, the ball can fall through the hoop rather than remain stuck at the top of the basket, or bounce away if the shot is not accurate. For the ball, it was necessary to tune the dynamic physics with respect to gravity and the angle of incidence and, of course, to make the ball bounce off surfaces as naturally as possible. We also used Photoshop CC 2019 when creating the VR game. Figure 2 shows the software Unreal Engine with the Basketball room project open.

Figure 2: Software Unreal Engine and the Basketball room project.

2.3 Interactivity

The interactivity in this application focuses on movement around the environment and on grabbing objects. Movement is done using the buttons of the HTC Vive controllers – the user has to press a button, hold it and point towards where he wants to go; after it is released, the player teleports to that location. Three types of motion can be used in the application.

The first moves the user in the viewing direction while a button is pressed. As the user rotates his head, the direction of movement changes. This way of moving in VR can cause nausea, negative feelings, even dizziness or loss of balance.

The second type uses hand movements and is called arm swinging. The user must swing his hands beside his body in order to move, as he naturally does when walking or running. This reduces the chance of nausea, as the movement is more natural than simply standing and heading in some direction.

The third type uses a teleport. The user points at a location within the defined play space using the controller, and the view is then quickly recalculated from the spot to which the user teleported. This movement is the least demanding on the nervous system, but it is also the least realistic.
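To make the teleport mechanic concrete, the following is a minimal Unity-style C# sketch of pointer-based teleportation. The game itself was built in Unreal Engine, so this is not the authors’ code: the class name, the keyboard key standing in for the controller button and the field names are illustrative assumptions.

```csharp
using UnityEngine;

// Sketch: while the button is held, raycast from the controller to find a
// target point; on release, move the player rig there (teleport).
public class TeleportSketch : MonoBehaviour
{
    public Transform playerRig;                  // root object carrying the camera
    public Transform pointerOrigin;              // controller transform (assumed)
    public KeyCode teleportKey = KeyCode.Space;  // stand-in for the VR button

    private Vector3 target;
    private bool hasTarget;

    void Update()
    {
        if (Input.GetKey(teleportKey))
        {
            RaycastHit hit;
            hasTarget = Physics.Raycast(pointerOrigin.position,
                                        pointerOrigin.forward, out hit, 20f);
            if (hasTarget) target = hit.point;
        }
        else if (Input.GetKeyUp(teleportKey) && hasTarget)
        {
            // Keep the rig's current height; only move across the floor plane.
            playerRig.position = new Vector3(target.x, playerRig.position.y, target.z);
            hasTarget = false;
        }
    }
}
```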

In the basketball and volleyball rooms, the ball is grabbed using the trigger button. The player can bounce the ball, and as soon as he gets close to it and pulls the trigger, he grabs it and can throw it into the basket or over the net. The last step was to set up the collisions of the ball with the inner space of the hoop so that points are added and appear on the screen. We also added a button to the environment with which the player can reset the level after a throw. Figure 3 shows interactivity in our VR game; a minimal code sketch of the scoring trigger follows the figure.

Figure 3: Example of the created VR game – living room.
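The scoring described above can be illustrated with a short Unity-style C# sketch: an invisible trigger collider inside the hoop counts a point whenever a ball passes through it. Again, the authors’ game uses Unreal Engine, so the class, the tag and the score display below are assumptions, not their implementation.

```csharp
using UnityEngine;

// Sketch: a collider marked "Is Trigger" placed inside the hoop detects
// balls passing through and increments the score.
public class HoopScoreZone : MonoBehaviour
{
    public int score;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Ball"))      // balls are assumed to carry this tag
        {
            score++;
            Debug.Log($"Score: {score}");  // in the game the score appears above the board
        }
    }
}
```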

3 Phobia treatment with the help of virtual reality

Clinical psychology describes phobias as anxiety disorders, characterized by an intense irrational fear of specific objects or situations. This excessive fear does not correspond to the actual amount of danger posed by the stimulus. Despite this fact, people suffering from phobias experience intense psychological symptoms (anxiety, loss of control, fear) and physiological symptoms (increased heart rate, fainting, sweating, breathing problems), etc. [11, 12, 13].

VR is irreplaceable in medicine for the treatment of various kinds of phobias by the exposure method. In this way, environments or situations can be created that would be difficult or impossible to capture in the real world. Thus, we can adapt the environment and 3D objects exactly to the needs of treating the phobia of a particular patient.

Generally, phobias could be divided into three main categories: agoraphobias, social phobias, and specific phobias [20].

3.1 Application for Arachnophobia

As an example of designing phobia-treatment applications, we introduce a VR application for the HTC Vive that focuses on the treatment of one of the most widespread specific phobias, arachnophobia – the pathological fear of spiders. The purpose of the application is to streamline the course of therapy and provide a tool for tailoring the entire process to specific requirements.

3.2 Creation tools

The application was created on a VR Ready PC connected to an HTC Vive. Although several software products were used to create it, the core of the application runs on the Unity 3D game engine. We used Blender to model additional objects and edited textures in Adobe Photoshop. Graphics and buttons were created in Adobe Illustrator. The whole logic of the application is programmed in the C# programming language using scripts that are assigned to individual objects in the scene. These are scripts for the random movement of the spider, for avoiding obstacles, for touching and catching objects, for killing the spider, for controlling the GUI elements and so on. For digital editing of audio files we used the Audacity 2.3.2 editor.
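As an illustration of the kind of script mentioned above, here is a minimal C# sketch of random spider movement with a simple obstacle check. The speeds, distances and the raycast-based avoidance are our own illustrative assumptions, not the exact logic of the authors’ scripts.

```csharp
using UnityEngine;

// Sketch: every few seconds (or when blocked) the spider picks a new random
// heading on the table and keeps walking forward.
public class SpiderWander : MonoBehaviour
{
    public float speed = 0.15f;          // metres per second (assumed)
    public float redirectInterval = 3f;  // seconds between direction changes
    public float lookAhead = 0.1f;       // obstacle-probe distance

    private float timer;

    void Update()
    {
        timer -= Time.deltaTime;

        // Change direction periodically or when an obstacle is straight ahead.
        bool blocked = Physics.Raycast(transform.position, transform.forward, lookAhead);
        if (timer <= 0f || blocked)
        {
            transform.rotation = Quaternion.Euler(0f, Random.Range(0f, 360f), 0f);
            timer = redirectInterval;
        }

        transform.Translate(Vector3.forward * speed * Time.deltaTime);
    }
}
```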

3.3 Interactivity

When the application is opened, the main menu appears. It offers several options: launching the main part of the application, a description of how it works, settings, and information about the application and its author. The menu is also enriched with an animated spider model, whose task is to bring a smile to the user’s face rather than induce a phobic state.

The application is designed to be controlled not only by the patient, but also by the therapist. This way, the therapist is in control of the entire course of treatment. For the patient, interactivity is ensured by a pair of HTC Vive controllers and a couple of Base Station sensors. The therapist can use the keyboard and mouse to interact with the application. We have programmed the application controls to be as simple and intuitive as possible for the user. The original controller model that comes with the VIVE Input Utility (VIU) library has been replaced with realistic-looking hand models. These include animations for idling, touching, catching and clenching. From a programming point of view, we have turned these movements into functional interaction with the surrounding objects. We used the trackpad button for touching an object, and we chose the grip button for catching and releasing, as it is the most natural button for that task. Killing the spiders and confirming a choice are mapped to the trigger button. The last function button is the menu button, which activates and deactivates the laser pointer.
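The button mapping above can be sketched with VIU’s polling API; we assume the standard ViveInput calls here, and the actions inside each branch are only placeholders for the application’s actual touch, catch, kill and laser-pointer logic.

```csharp
using UnityEngine;
using HTC.UnityPlugin.Vive;  // VIVE Input Utility (VIU)

// Sketch of the controller mapping: trackpad = touch, grip = catch/release,
// trigger = kill/confirm, menu = toggle laser pointer.
public class HandInputSketch : MonoBehaviour
{
    public HandRole hand = HandRole.RightHand;

    void Update()
    {
        if (ViveInput.GetPressDown(hand, ControllerButton.Pad))
            Debug.Log("Touch the object under the hand");

        if (ViveInput.GetPressDown(hand, ControllerButton.Grip))
            Debug.Log("Catch the object");
        if (ViveInput.GetPressUp(hand, ControllerButton.Grip))
            Debug.Log("Release the object");

        if (ViveInput.GetPressDown(hand, ControllerButton.Trigger))
            Debug.Log("Kill the spider / confirm the choice");

        if (ViveInput.GetPressDown(hand, ControllerButton.Menu))
            Debug.Log("Toggle the laser pointer");
    }
}
```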

3.4 Patient and therapist controls

The user interface of the application is divided into a two-dimensional and a three-dimensional part. The two-dimensional part (Non-Diegetic UI) is displayed only on the PC monitor; it allows the therapist to control elements of the environment at his discretion and according to the patient’s response. The three-dimensional interface (Diegetic UI) can be seen by both, but only the patient can modify the environment objects through it. The advantage of such a divided user interface is that patients can also use the application in the comfort of their own home, which can significantly shorten the treatment process, not to mention the money saved on sessions with the therapist.
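One way to realize this split in Unity – an assumption on our part, not a detail given in the paper – is to use two canvases: a Screen Space - Overlay canvas is rendered only on the desktop monitor (so the therapist sees it but the patient in the headset does not), while a World Space canvas is part of the 3D scene and therefore also visible inside the HMD.

```csharp
using UnityEngine;

// Sketch: split UI for therapist (monitor-only) and patient (in-scene whiteboard).
public class SplitUiSetup : MonoBehaviour
{
    public Canvas therapistCanvas;  // 2D, non-diegetic, monitor only
    public Canvas patientCanvas;    // 3D whiteboard in the scene, diegetic

    void Awake()
    {
        therapistCanvas.renderMode = RenderMode.ScreenSpaceOverlay;  // not rendered in the headset
        patientCanvas.renderMode = RenderMode.WorldSpace;            // visible in VR and on the monitor
    }
}
```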

By clicking the start button in the main menu, the patient finds himself in a virtual environment that creates a calm atmosphere and reduces stress. Directly opposite the patient is a TV set that serves for the initial phase of the therapy. On the TV screen, the patient is shown images of spiders, from a small cartoon spider to a real photograph of a tarantula. The individual levels are used to eliminate unwanted or exaggerated patient reactions. By beginning the therapy humorously, we encourage the patient not to be afraid to proceed to the next levels. The advantage is that, if necessary, e.g. when we adapt the app to a different type of phobia, we can easily replace those images with others.

On the left side, the patient can see a whiteboard on which the user interface elements are located. There are buttons, checkboxes and sliders. These elements are used to add spiders to the table, as well as to remove, enlarge, shrink or recolour them, change their speed, move the table towards or away from the patient or, in the case of an impulsive patient response, remove all spiders at once. Spider instances are stored in a stack data structure (LIFO). Each spider model is assigned a random animation using the Mecanim state machine and the corresponding script. Through this random assignment we have ensured that each spider behaves differently at any given moment, thereby achieving a more natural effect. Figure 5 shows interactivity in the Arachnophobia application; a code sketch of this spider management follows the figure.

Figure 4: Therapist’s and patient’s menu.

Figure 5: Interactivity in the Arachnophobia application.
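The stack-based spider management and the random Mecanim animation described above could look roughly like the following C# sketch; the prefab, the animation state names and the spawn area are illustrative assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: spiders are pushed onto a LIFO stack when added and popped when
// removed; each new spider starts in a randomly chosen animation state.
public class SpiderManager : MonoBehaviour
{
    public GameObject spiderPrefab;
    public Transform table;

    private readonly Stack<GameObject> spiders = new Stack<GameObject>();
    private static readonly string[] states = { "Idle", "Walk", "Turn" };  // hypothetical Mecanim states

    public void AddSpider()
    {
        Vector3 pos = table.position +
                      new Vector3(Random.Range(-0.4f, 0.4f), 0.02f, Random.Range(-0.4f, 0.4f));
        GameObject spider = Instantiate(spiderPrefab, pos, Quaternion.identity);

        // Random starting state so no two spiders behave identically at the same moment.
        spider.GetComponent<Animator>().Play(states[Random.Range(0, states.Length)]);
        spiders.Push(spider);
    }

    public void RemoveSpider()        // removes the most recently added spider (LIFO)
    {
        if (spiders.Count > 0) Destroy(spiders.Pop());
    }

    public void RemoveAllSpiders()    // for an impulsive patient reaction
    {
        while (spiders.Count > 0) Destroy(spiders.Pop());
    }
}
```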

The visual aspect of the application is complemented by sound effects that contribute to the greater immersion of the patient into the virtual environment, which in turn helps to increase the effectiveness of the treatment. The user interface in the app is represented by the use case diagram in Figure 6.

Figure 6: Use case diagram of the VR app environment.

The application is also made more attractive by elements of gamification, which move it to a higher level. We decided to incorporate these play elements to make the treatment process more interesting, interactive and engaging. Another reason was to provide a more attractive form of treatment – one that encourages active participation and engagement and allows for more direct and immediate feedback. Information on how the patient is doing during therapy (the number of spiders touched, caught and killed) motivates them to progressively improve, thereby achieving their goal of getting rid of their unjustified fear.
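The feedback counters could be as simple as the following C# sketch, where a UI text element shows how many spiders have been touched, caught and killed; the field names and the Text label are assumptions for illustration.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: gamified progress counters displayed to the patient.
public class TherapyProgress : MonoBehaviour
{
    public Text statsLabel;                 // UI text that shows the counters
    private int touched, caught, killed;

    public void OnSpiderTouched() { touched++; Refresh(); }
    public void OnSpiderCaught()  { caught++;  Refresh(); }
    public void OnSpiderKilled()  { killed++;  Refresh(); }

    private void Refresh()
    {
        statsLabel.text = $"Touched: {touched}  Caught: {caught}  Killed: {killed}";
    }
}
```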

3.5 Testing the application

The testing of the application was carried out in the “Laboratory of virtual reality and exploring the user experience” at the Department of Computer Science, Faculty of Natural Sciences, Matej Bel University in Banská Bystrica on a sample of 39 respondents. However, six did not complete or submit our electronic questionnaire. The questionnaire consisted of the following sections: basic information about the participant, physiological aspect, psychological aspect, application controllability, visual aspect and others. The largest group of respondents were secondary school students who participated in the Department’s Open day, university students taking lectures on a virtual reality subject and students on the Forensic Chemistry study program. A very important respondent was the psychotherapist himself and his 12-year-old daughter, who suffers from arachnophobia. Of these 33 respondents, 24 were men and 9 were women between 12 and 65 years of age, with the highest age share being 21–25 years. This was a relatively well-balanced sample of respondents suffering from arachnophobia (21.21%), claustrophobia (9.09%), social phobia (3.03%), unspecified phobia (6.06%), and totally healthy people (60.61%). Nearly half of them (45.45%) had experienced some type of VR technology before. On average, the duration of testing the application in a virtual environment was approximately 16 minutes (ranging from 8 to 80 minutes).

The hardware kit used in this experiment consisted of an Intel Core i9-9900K processor, 32 GB of RAM and an NVIDIA GeForce GTX 1060 6 GB graphics card, running the Windows 10 Education 64-bit operating system. The computer transmitted the application’s output image to the HTC Vive via a wireless adapter. The HTC Vive kit also included a pair of controllers. For free movement we had a space of 2.5 m × 2.5 m (6.25 square metres).

3.6 Feedback

The majority of users fully (57.58%) or partially (39.39%) agreed that the GUI was clear and intuitive for them. In general, the interaction and manipulation of the virtual environment seemed natural to them. They also positively evaluated the possibilities and functions available in the application.

The feeling of presence reached a relatively high level (48.48% of respondents absolutely agreed and 45.45% partially agreed that they felt engulfed in the virtual environment, which fully involved them in the simulation process). This assumption is also confirmed by the question concerning the visual aspect of the virtual interior environment, which 51.52% of respondents consider realistic and 48.48% average.

VR discomfort is difficult to measure because it does not affect everyone equally and cannot be captured with a single parameter. The standard tool for measuring nausea is the Simulator Sickness Questionnaire (SSQ) [21]. The most common problems were visual (difficulty focusing 48.48%, blurred vision 48.48%, eye strain 39.39%), along with overall discomfort (33.33%). Exposure to the feared object, in this case a spider, triggered different reactions in the participants. These objects did not cause any stress or tension in healthy people.

For people suffering from arachnophobia, using the app meant facing their fear. From the data we can conclude that the 3D models of spiders moving on the table caused more fear (36.36% positive answers) than the 2D images of spiders displayed on the television screen (24.24% positive answers). We also encountered subjects who completely refused any interaction with the spiders (touching or grabbing a spider).

The results obtained from this research point to the tremendous potential of VR in the treatment of specific phobias, namely arachnophobia. Up to 84.85% of respondents would be willing to undergo phobia treatment in this form. We also analyzed whether they would recommend this application to psychotherapists and their patients. Most respondents (90.91%) would recommend it. Most users liked the interaction with objects and the possibility of catching spiders in their hands. Others pointed out the creativity of the idea and the actual elaboration of the project. They were excited about how their simulated environment was able to immerse them into another world.

In the future, we plan to extend the application to other types of phobias. By simply exchanging animal models and their animations, we can create for example an application for the treatment of one of the zoophobias: musophobia, ophidiophobia, cockroachphobia, etc. There is also the possibility of targeting other VR platforms.

4 Virtual reality in education

In recent years, the media have presented VR and AR as technologies that reach the home through various electronic devices such as helmets, goggles, smartphones, etc. This will encourage the implementation of these technologies in educational environments by supporting different learning styles and easing teaching and learning processes [22]. Our goal in this field is to prepare our students to become future developers of VR, AR and MR.

In this article, we present the VR course in LMS Moodle. Here, our students have lessons available not only on developing virtual environments and applications, but also on working with a panoramic camera and on creating MR and AR. They can also find presentations, links to various websites and useful materials. Using video tutorials, they complete assignments to build VR applications in Unity 3D or Unreal Engine, to work with panoramic video and photography in Action Director, to combine 3D models with 360° video, and to create AR applications. Figure 7 shows a small part of the main menu of the LMS Moodle course for VR.

Figure 7: LMS Moodle course for VR.

4.1 Combination of panoramic video and animated model

One of the interesting and challenging tasks that students undertake is to create a mixed reality by combining the real world, captured with a panoramic camera, with animated models. Such a combination brings new possibilities for using virtual graphics not only in games and entertainment, but also in health and well-being, where a realistic-looking environment can recreate the feeling of dreaded situations. Figure 8 shows such a combination: an animated deer model is placed in a real-world forest environment.

Figure 8: Combination of a forest environment with an animated deer model.

4.2 Hardware and software tools

Panoramic videos can be created using special cameras with wide-angle lenses. They can capture a full 360° panorama in both the horizontal and vertical directions. In our case, we use the Samsung Gear 360 for video recording. It is simple, user-friendly and affordable. The camera captures two spherical images into one file, which must then be combined into a so-called “panoramic format”. We use the Gear 360 Action Director software that comes with the panoramic camera. We use the Blender 3D software and its advanced motion-tracking functions to create a mixed reality that combines panoramic video with an animated object.

4.3 The creation process

The process of creating MR in this software consists of several steps. First, markers have to be placed on contrast points in the footage; the number of markers depends on the video-capture technique. After placing enough markers, we track them. An important step is setting the camera parameters: we have to ensure that the virtual camera in the program has the same parameters as the camera with which the video was recorded. The program then calculates the movement of the real camera based on the number and quality of the tracked markers, and reports the size of the resulting deviation. In the next step, we prepare the scene so that a 3D object can be placed in the video, which includes dividing the scene into the foreground and the background. We then import a 3D model into the scene, including its armature and animations. We place the loaded model in the desired position, rotate it correctly and resize it to fit the video exactly. We light the scene using two lighting methods, combined to recreate the same lighting conditions as in reality. The first method is IBL (Image-Based Lighting), which uses image information from real photographs to create the illumination. As the second method, we use the virtual illumination built into the program. Finally, we render the scene into the resulting panoramic video enriched with the artificial 3D object.

4.4 Interactivity

An essential feature of panoramic video is its interactivity – the user decides what he is interested in at any given moment (he chooses the direction and angle of view). There are several ways to interact with the image. We can speak of an “immersive” and a “non-immersive” manner of interaction with, or presentation of, the video. The immersive way is possible with one of the HMDs (HTC Vive, Samsung Gear VR, Google Cardboard, etc.), which responds to head movements or the direction of gaze and automatically adjusts the image rotation. The non-immersive way is represented by software players (mobile or desktop), where we interact with the video using a finger or a mouse.
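In Unity terms, the non-immersive mode can be sketched as a camera placed inside an inverted sphere (or skybox) textured with the 360° video and rotated by dragging the mouse; the class name and sensitivity values below are assumptions for illustration.

```csharp
using UnityEngine;

// Sketch: mouse-drag look-around for a desktop 360° video player.
public class PanoramaMouseLook : MonoBehaviour
{
    public float sensitivity = 3f;  // degrees per mouse unit (assumed)
    private float yaw, pitch;

    void Update()
    {
        if (Input.GetMouseButton(0))  // drag with the left mouse button
        {
            yaw   += Input.GetAxis("Mouse X") * sensitivity;
            pitch -= Input.GetAxis("Mouse Y") * sensitivity;
            pitch  = Mathf.Clamp(pitch, -89f, 89f);   // avoid flipping over the poles
            transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
        }
    }
}
```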

The possibilities of using MR (a combination of panoramic video and an animated 3D object) are seen mainly in the film industry, in the creation of film tricks and visual effects (VFX).

4.5 VR as a subject

A subject’s curriculum should respond as quickly as possible to the changes brought by practice. Although the course is called simply “Virtual Reality”, students are now also introduced to a wider range of modern technologies, such as AR and MR.

There is another concept that students become familiar with during their studies, relating to all real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables. It includes representative forms such as AR, MR and VR [23] and the areas interpolated among them, and it is called eXtended Reality (XR). The levels of virtuality range from partial sensory inputs to immersive virtuality. XR is a superset which includes the entire spectrum from “the complete real” to “the complete virtual” in the concept of the reality-virtuality continuum introduced by Paul Milgram [24]. Still, its connotation lies in the extension of human experience, especially relating to the sense of existence (represented by VR) and the acquisition of cognition (represented by AR). With the continuous development of human-computer interaction, this connotation is still evolving. XR is a rapidly growing field being applied in a wide range of areas, such as entertainment, marketing, real estate, training and remote work [25].

Mixed and virtual realities enable the feeling of being in an immersive environment and engage our sensory apparatus in a fictional space in which our entire body can interact with digital objects or assets [26].

As students learn to create these worlds during their studies, they gain amazing experiences that they can use in their professional careers, and their applications can help other people in various fields.

5 Conclusion

Since we are only at the beginning, our research in all of these fields is still ongoing.

The first field is easy to test thanks to the students in our school environment. The developed games have met with a positive response among students. Whether they are sports games or escape rooms, we strive for interaction with the environment and its objects everywhere.

In the second field, we have taken a small step towards treating phobias. Our therapist has started the treatment with real patients and it is going well. However, it is a very complex and long-term process, so results cannot be expected soon. The use of head-mounted displays and gloves (or controllers) is unpleasant for some users (especially those who suffer from a phobia), but research to date suggests that this feeling is better than touching the real object of their phobia. Adapting to a phobia object via VR is therefore easier and less frustrating.

We use the third field of VR, AR, MR in education mostly because we are an educational institution that wants to keep up with modern emerging technologies and has the ambition to closely link the process of education with practice. It is the field of education that unites our efforts to create various applications for these areas, which we have the opportunity to test in real life.

The aim of our article was to point out the various possibilities of using VR that we focus on in our department. Furthermore, we wanted to present the various possibilities of interactivity offered by current VR equipment. Last but not least, we wanted to make our work visible; it is beginning to be appreciated by the professional therapeutic community, especially in the field of phobia treatment.

Acknowledgement

This contribution has been processed as part of the grant project Interactive Applications for Teaching Mathematics at Primary Schools, project no. 003TTU-4/2018.

The authors would like to thank therapist Mr. Ján Záskalan for his valuable comments and suggestions to improve the quality of our research.

References

[1] Hodgkinson G., Lock up your stories – here comes Virtual Reality, Journal of Arts and Imaging Science, 2016, 3, 10–14, DOI: 10.15323/techart.2016.11.3.4.10

[2] Hodgkinson G., A New Medium for Animation – Stereo Virtual Reality, Journal of Arts and Imaging Science, 2015

[3] Jennett C., Cox A.L., Cairns P., Dhoparee S., Epps A., Tijs T., Walton A., Measuring and Defining the Experience of Immersion in Games, International Journal of Human-Computer Studies, Elsevier, 2008, DOI: 10.1016/j.ijhcs.2008.04.004

[4] Arnaldi B., Guitton P., Moreau G., Virtual Reality and Augmented Reality: Myths and Realities, Computer Engineering, 2018, DOI: 10.1002/9781119341031

[5] Schmalstieg D., Hollerer T., Augmented Reality: Principles and Practice, Addison Wesley, Boston, 2016, DOI: 10.1145/2897826.2927365

[6] Carter L., Potter L.E., Designing Games for Presence in Consumer Virtual Reality, In: Proceedings of the 2016 ACM SIGMIS Conference (2–4 June 2016, Alexandria, Virginia, USA), Association for Computing Machinery, 2016, 141–148, DOI: 10.1145/2890602.2890626

[7] Stein Ch., Virtual reality design: How upcoming head-mounted displays change design paradigms of virtual reality worlds, MediaTropes eJournal, 2016, VI, 52–85

[8] Rizzo A., Shilling R., Clinical Virtual Reality tools to advance the prevention, assessment, and treatment of PTSD, European Journal of Psychotraumatology, 2017, DOI: 10.1080/20008198.2017.1414560

[9] Bernardo A., Virtual Reality and Simulation in Neurosurgical Training, World Neurosurgery, 2017, 106, 1015–1029, DOI: 10.1016/j.wneu.2017.06.140

[10] Stetina B.U., Felnhofer A., Kothgassner O.D., Lehenbauer M., Games for Health: Have Fun with Virtual Reality!, In: Virtual Reality in Psychological, Medical and Pedagogical Applications, 2012, 65–80

[11] Horváthová D., Siládi V., Creating virtual environments for phobia treatment, Open Computer Science, 2016, 6, 138–147, DOI: 10.1515/comp-2016-0012

[12] Heretik A., Anxiety (neurotic) disorders, In: Clinical Psychology, 2007, 217–241

[13] Grenier S., Forget H., Bouchard S., Isere S., Belleville S., Potvin O., et al., Using virtual reality to improve the efficacy of cognitive-behavioral therapy (CBT) in the treatment of late-life anxiety: Preliminary recommendations for future research, International Psychogeriatrics, 2015, 27, 1217–1225, DOI: 10.1017/S1041610214002300

[14] Parshotam G., Rashid B., Asghar S., Urooj N., Phobia Therapy using Virtual Reality (VR) based on KINECT™ Motion Sensor for Pakistan’s Medical Rehabilitation Centers, 2019

[15] Bambury S., 10 Key Benefits of VR in Education, [Online], Available at: https://www.vrfocus.com/2019/03/10-key-benefits-of-vr-in-education

[16] Bouvier P., Sorbier F., Chaudeyrac P., Biri V., Cross benefits between virtual reality and games, In: International Conference and Industry Symposium on Computer Games, Animation, Multimedia, IPTV, Edutainment and Security (CGAT’08), France, 2008, 10 pp., DOI: 10.5176/978-981-08-8227-3_cgat08-26

[17] Rubin P., Oculus Rift, Wired, 2014, 22

[18] Schell J., The Art of Game Design: A Book of Lenses, 2nd ed., A K Peters/CRC Press, Wellesley, 2014, DOI: 10.1201/b17723

[19] Virtual Reality Society, What is virtual reality gaming?, 2017, Last accessed October 2019, Available at: http://www.vrs.org.uk/virtualreality-games/what-is-vr-gaming.html

[20] International classification of diseases, 2013, [Online], Available at: http://www.nczisk.sk/Standardy-v-zdravotnictve/Pages/Medzinarodna-klasifikacia-chorob-MKCH-10.aspx

[21] Kennedy R.S., Lane N.E., Berbaum K.S., Lilienthal M.G., Simulator Sickness Questionnaire, The International Journal of Aviation Psychology, 1993, 3, 203–220, DOI: 10.1037/t04669-000

[22] Martín-Gutiérrez J., Mora C.E., Añorbe-Díaz B., González-Marrero A., Virtual Technologies Trends in Education, EURASIA Journal of Mathematics, Science and Technology Education, 2017, 13, 469–486, DOI: 10.12973/eurasia.2017.00626a

[23] Gownder J.P., Voce Ch., Mai M., Lynch D., Breakout Vendors: Virtual And Augmented Reality, 2016

[24] Milgram P., Takemura H., Utsumi A., Kishino F., [Online], Available at: https://www.trekk.com/sites/default/files/inline-images

[25] Chuah S.H.-W., Why and Who Will Adopt Extended Reality Technology? Literature Review, Synthesis, and Future Research Agenda, 2018

[26] Intel Corporation, Demystifying the Virtual Reality Landscape: The differences between Virtual Reality, Augmented Reality, and Mixed Reality, and how you can get ready to experience a new reality for yourself, [Online], Available at: https://www.intel.com/content/www/us/en/tech-tips-and-tricks/virtual-reality-vs-augmented-reality.html

Received: 2020-03-31
Accepted: 2020-06-14
Published Online: 2021-02-21

© 2021 Patrik Voštinár et al., published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.
