Review

Advances in Materials, Sensors, and Integrated Systems for Monitoring Eye Movements

1 School of Engineering and Computer Science, Washington State University, Vancouver, WA 98686, USA
2 IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
3 School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
4 George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
5 Department of Mechanical Engineering, University of Washington, Seattle, WA 98195, USA
6 Wallace H. Coulter Department of Biomedical Engineering, Georgia Tech and Emory University School of Medicine, Atlanta, GA 30332, USA
7 Neural Engineering Center, Institute for Materials, Institute for Robotics and Intelligent Machines, Georgia Institute of Technology, Atlanta, GA 30332, USA
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Biosensors 2022, 12(11), 1039; https://doi.org/10.3390/bios12111039
Submission received: 20 October 2022 / Revised: 11 November 2022 / Accepted: 13 November 2022 / Published: 17 November 2022
(This article belongs to the Special Issue Biophysical Sensors for Biomedical/Health Monitoring Applications)

Abstract
Eye movements show primary responses that reflect humans’ voluntary intention and conscious selection. Because visual perception is one of the fundamental sensory interactions in the brain, eye movements contain critical information regarding physical/psychological health, perception, intention, and preference. With the advancement of wearable device technologies, the performance of eye movement monitoring has improved significantly, leading to myriad applications for assisting and augmenting human activities. Among them, electrooculograms, measured by skin-mounted electrodes, have been widely used to track eye motions accurately. In addition, eye trackers that detect reflected optical signals offer alternative ways without using wearable sensors. This paper outlines a systematic summary of the latest research on various materials, sensors, and integrated systems for monitoring eye movements and enabling human–machine interfaces. Specifically, we summarize recent developments in soft materials, biocompatible materials, manufacturing methods, sensor functions, system performance, and their applications in eye tracking. Finally, we discuss the remaining challenges and suggest research directions for future studies.

Graphical Abstract

1. Introduction

1.1. Recent Advances in Eye Movement Monitoring

Electrophysiological signals are widely used as indicators of health status across human activities and applications. Recent advances in wearable technologies and video monitoring systems for eye movement have enabled various types of human–machine interface (HMI) [1,2]. Among them, electrooculograms (EOGs), measured by surface-mounted electrodes, have been widely used to track eye movements. Existing devices for EOG measurement cause discomfort because of their bulky, rigid construction, and conventional EOG recordings can only be performed in a stationary laboratory setup. Recent advances in wearable technologies, such as soft materials, manufacturing methods, and electronic chip packaging, compensate for these problems and allow users to interact directly with electronic, mechanical, or computing elements, a practice collectively known as HMI. Moreover, recent enhancements in computing power have made real-time eye tracking possible, so that changes in eye motion can be monitored with different types of cameras. Eye tracking is deployed in various research areas, including psychology, neuroscience, and marketing, to understand human intentions and responses (Figure 1).

1.2. Electrooculogram-Based Approaches for Human–Machine Interfaces

EOG is a technology for tracking eye movements by measuring the standing potential between the positively charged cornea and the negatively charged retina [1]; the measured signal is called the EOG. Generally, measured EOG signals range from 50 μV to 3500 μV depending on the amount of light incident on the retina [16,17]. It is common practice to use generic electrical sensors for EOG detection. Since these conventional EOG devices are rigid, wearable device platforms based on soft electronics and wireless data communication could offer an improved user experience. The concept of a wearable EOG device includes measuring the EOG signal in a wearable environment to provide smart diagnostics and application controllers with embedded signal processing such as machine learning algorithms. Building a wearable EOG system requires electrodes, platforms, and signal processing to analyze the EOG signal. Electrodes are essential for measuring biopotentials. Existing metal-based electrodes are flat and rely on gels for adhesion. Flat electrodes are not well suited to human skin because of skin deformation, and the gel causes several issues such as skin irritation and poor breathability. Because of these problems, research groups have recently studied electrodes with flexible form factors, biocompatible materials, and cost-effective processes. For example, polymer-based electrodes (sponge [18,19,20], textile [21], and hydrogel [22,23,24,25,26,27,28,29]) have been utilized because of advantages such as good mechanical flexibility, low density, ease of processing, and low cost. Recent advances in microfabrication and printing technologies have enabled new ways to design micro-patterned electrodes (gold [1,30] and graphene [11]). Together with 3D printing, these technologies have made it possible to design wearable platforms such as eyeglass types [5,31,32,33,34,35,36,37,38,39], face mask types [7,40,41,42], earplug types [43,44,45,46,47], and headband types [48,49,50,51,52,53,54,55,56,57,58] for various applications. Previously, various controllers for HMI, such as wheelchairs [1,4,51,52], drones [11,59], game interfaces [5,36,47,60,61], and virtual keyboards [34,38,51,62], were created by using only the EOG signal. Recently, various healthcare monitoring systems [7,40,41,44,45,63] and medical health status analyses [64,65,66] have been conducted using both the electroencephalogram (EEG) and EOG with signal processing such as machine learning algorithms [1,5,36,52]. The studies above show that advances in wearable EOG devices make it easier to use HMI in daily life.

1.3. Screen-Based Eye Tracking Technology

Over the past few decades, screen-based eye trackers have been successfully used in several applications to capture involuntary or voluntary human intention by tracking the gaze point on a screen. Intuitive human intention can be delivered to a human–machine interface with the exact coordinates of the gaze point on an object or screen. Eye tracker signal computation can be represented by two types of methods: machine learning [67,68,69,70] and pupil center-corneal reflection (PCCR) [71,72,73]. Each technique requires one or more cameras to create a trace map or to detect gaze points on the screen. These eye movement and gaze analyses provide the basic parameters of heat maps, including area of interest, time to first fixation, dwell time, and integration models. Moreover, recent developments in real-time computing devices have led to the emergence of mobile and stationary eye tracker platforms that are changing daily life. New advancements in optical device-based mobile eye tracking systems provide comprehensive, nonintrusive human gaze points [73,74]. The form factors of recent eye tracking devices include eyeglasses, screen-attached cameras, and screen-mounted goggles. These non-invasive eye tracking platforms allow the collection of comprehensive eye information. Various applications have attempted to analyze human attention and intention algorithmically from eye tracking data. Here, we focus on an all-inclusive review of eye tracking methods (EOG and video monitoring) and wearable systems, including electrodes, platforms, and signal-processing technologies for various applications (Figure 1). We summarize the types of platforms and the characteristics of the electrodes, including biocompatible, mechanical, and electrical properties. In addition, the signal processing strategy is discussed in view of targeted applications and data sets. Moreover, we summarize the principles, platforms, and applications of eye tracking employed throughout many fields of psychology, medical examination, cognitive science, and disease diagnosis. Finally, we discuss future work on next-generation eye tracking technologies, promoting continuous development via cooperation among various technologies.
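To illustrate how such screen-based metrics can be derived from raw gaze samples, the short Python sketch below computes dwell time and time to first fixation for one rectangular area of interest (AOI). It is a minimal, sample-based approximation rather than a fixation-based analysis from any cited system; the 60 Hz sampling rate, the AOI coordinates, and the synthetic gaze trace are illustrative assumptions.

```python
import numpy as np

def aoi_metrics(gaze_xy, aoi, sample_rate_hz=60.0):
    """Compute simple screen-based eye tracking metrics for one AOI.

    gaze_xy : (N, 2) array of gaze points in screen pixels.
    aoi     : (x_min, y_min, x_max, y_max) rectangle in pixels.
    Returns dwell time and time to first fixation (seconds), both
    estimated per sample (a simplification of fixation-based metrics).
    """
    x_min, y_min, x_max, y_max = aoi
    inside = ((gaze_xy[:, 0] >= x_min) & (gaze_xy[:, 0] <= x_max) &
              (gaze_xy[:, 1] >= y_min) & (gaze_xy[:, 1] <= y_max))
    dt = 1.0 / sample_rate_hz
    dwell_time = inside.sum() * dt                       # total time the gaze stayed in the AOI
    hits = np.flatnonzero(inside)
    time_to_first = hits[0] * dt if hits.size else None  # None if the AOI was never entered
    return dwell_time, time_to_first

# Example: 2 s of synthetic gaze data drifting into the AOI
gaze = np.column_stack([np.linspace(100, 600, 120), np.full(120, 400)])
print(aoi_metrics(gaze, aoi=(500, 300, 800, 500)))
```

In practice, eye tracking software typically detects fixations first and aggregates these metrics per fixation rather than per sample; the per-sample version above is only meant to make the definitions concrete.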

2. EOG Signals

The positive potential on the cornea and the negative potential on the retina are illustrated in Figure 2a. The EOG signal is acquired from electrodes around the eyes or on the forehead, as described for the various EOG platforms below. Because electrodes transduce bioelectric activity within the body into electrical currents, they are essential components for obtaining EOG signals. For EOG collection, electrodes are positioned on the user’s face, as shown in Figure 2b. Two electrodes are placed next to the lateral canthus of each eye to detect horizontal (i.e., left and right) eye movements. To detect vertical (i.e., up and down) eye movements, two electrodes are placed above and below one of the eyes. A reference electrode is placed on the middle of the forehead, the chin, or behind the ear. The electrode at the lateral canthus of the left eye is connected to the positive terminal of the horizontal channel, and the electrode at the lateral canthus of the right eye is connected to the negative terminal of the horizontal channel. The vertical channel is wired similarly: the electrode above the left eye is connected to the positive terminal of the vertical channel, and the electrode below the left eye is connected to the negative terminal of the vertical channel. Given these electrode positions, EOG signals are measured as follows. When the eyes move, the electrode toward which the eyes turn detects the positive potential of the cornea, and the opposite electrode detects the negative potential of the retina. For example, when the eyes move to the right, the horizontal electrode pair detects a negative potential; when the eyes move to the left, it detects a positive potential. Similarly, the vertical electrodes measure the potential according to the direction of the eyes: when the eyes look up, the vertical channel detects a positive potential, and when they look down, it detects a negative potential. The blink signal, in contrast, is not an EOG signal; whereas the EOG signal is the electric potential difference between the retina and the cornea, the blink signal is an electromyography (EMG) signal generated by the eye muscles (EMG measures the electrical responses of muscles to nerve stimulation). EOG waveforms show peaks when the eyes move left, right, up, or down from the initial position. This section introduces various types of electrodes, such as hydrogel, fiber, polymer, and micro-patterned types, which can solve problems with existing gel and dry electrodes.
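The differential measurement described above can be sketched in a few lines of Python: the horizontal and vertical channels are formed by subtracting the electrode potentials, and the sign of each channel indicates the gaze direction. This is a hedged illustration only; the electrode naming, the ±400 μV decision threshold, and the single-sample decision rule are assumptions for demonstration, not parameters from a specific cited system.

```python
import numpy as np

def eog_channels(v_left, v_right, v_up, v_down):
    """Form differential EOG channels from single-ended electrode potentials.

    Electrode names follow the placement described above (lateral canthi for the
    horizontal pair, above/below one eye for the vertical pair); all inputs are
    1-D arrays in microvolts referenced to the same amplifier ground (forehead,
    chin, or mastoid reference electrode).
    """
    horizontal = v_left - v_right   # positive when the gaze moves left, per the wiring above
    vertical = v_up - v_down        # positive when the gaze moves up
    return horizontal, vertical

def gaze_direction(h_sample, v_sample, threshold_uv=400.0):
    # Map the instantaneous channel polarity to a coarse direction label.
    if h_sample > threshold_uv:
        return "left"
    if h_sample < -threshold_uv:
        return "right"
    if v_sample > threshold_uv:
        return "up"
    if v_sample < -threshold_uv:
        return "down"
    return "center"
```

The sign convention depends entirely on which electrode is wired to the positive amplifier terminal, so a real system would calibrate the polarity per setup rather than hard-coding it.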

2.1. Existing Electrodes

Wearable EOG devices that require electrodes and wearable platforms can measure changes in eye movements during daily activities [1,75]. Conventional electrodes, wet or dry silver/silver chloride (Ag/AgCl), are generally used to measure EOG signals [76,77,78,79,80]. For example, wet Ag/AgCl electrodes are used for the analysis of various activity recognition fields [65,81,82] or HMI controllers [37,60,62,83]. Dry flat Ag/AgCl electrodes are used on various wearable platforms such as eyeglasses [35], head caps [55], and goggles [34]. From the perspective of wet Ag/AgCl electrodes (Figure 2c), conductive gel dehydration results in electrode performance degradation over time. The conductive gel can cause pain and skin rashes when it is used on human skin [1] and might cause a short circuit if electrodes are placed close to each other [84]. Poor breathability is also one of the gel electrode’s drawbacks. It is hard to use for wearable platforms such as eyeglasses because the gel electrodes are too bulky for mounting around the nose. On the other hand, dry Ag/AgCl electrodes are better for the long-term measurements of the EOG signal than wet electrodes (Figure 2d). However, dry electrodes are thick and stiff, leading to a high electrode–skin impedance and vulnerability to motion artifacts with poor contact on the delicate skin around the eyes [85]. The wearable EOG device systems with existing electrodes, such as conventional wet or dry electrodes, are often bulky and complex, as shown in Figure 2e,f. Therefore, many research groups introduced non-invasive, bio-compatible, and high-quality recording electrode types to address the above issues. This section introduces the various types of electrodes, such as hydrogel, fiber, polymer, and micro-patterned types (Table 1).
Figure 2. EOG detection systems. (a) An anatomical illustration of the eye (cornea being positive and retina being negative). (b) Positions of electrodes for EOG detection (reprinted under terms of the CC-BY license [51]. Copyright 2017, the authors. Published by MDPI). (c) Conventional Ag/AgCl electrodes. (d) Example of a stiff material (metal disc) (Reprinted with permission [86]. Copyright 2013 Institute of Physics and Engineering in Medicine). (e) Existing EOG devices with conventional gel electrodes (Left: Reprinted with permission [81]. Copyright 2009 Elsevier, Middle: reprinted with permission [60]. Copyright 2018 Elsevier, Right: reprinted under terms of the CC-BY license [37]. Copyright 2021, the Authors. Published by MDPI). (f) Existing EOG devices with dry metal electrodes (Reprinted under terms of the CC-BY-NC license [87]. Copyright 2018, the Authors. Published by Springer Nature).

2.1.1. Composite Electrodes

Composite electrodes have been introduced to compensate for the drawbacks of conventional electrodes, such as skin irritation and motion artifacts on human skin. They aim to combine softness and high conductivity to acquire continuous, high-quality biopotentials. Composite electrodes are fabricated with various soft materials, such as polymers, fibers, and hydrogels, for measuring biopotentials. Here, we present the manufacturing methods and characteristics of composite electrodes made with these materials. For elastomeric composite electrodes, Lee et al. [5] reported soft, elastomeric composite elements for biopotential recording, as shown in Figure 3a. The elastomeric composites are made of three different types of carbon nanotubes (CNTs) (HANOS CM-95, CM-250, and CM-280). Considering mechanical endurance, robustness, and deformation, CM-280 is the optimized composite material, and elastomeric composite electrodes based on CM-280 showed the lowest rate of change in electrical resistance among the three types of CNTs. In terms of signal acquisition quality, such as the signal-to-noise ratio (SNR), the CM-280 elastomeric composite electrodes are comparable to commercial gel electrodes. This electrode is representative of efforts to overcome the disadvantages of existing electrodes, such as skin irritation and dehydration. Lin et al. [38,84] designed conductive polymer foam electrodes based on urethane and taffeta materials coated with Ni/Cu on all surfaces (Figure 3b). This polymer electrode can reduce motion artifacts by absorbing motion forces and the rubbing and sliding of the electrode on the skin. Fiber-type electrodes are generally divided into fabric-type and paper-type electrodes. Fiber-based substrates are printable, low-cost, and lightweight, making them attractive for flexible electronics [88]. As shown in Figure 3c, fiber electronics manufacturing is also simple, applying conductive inks to a fiber-based substrate. In previous research, Antti et al. [89] reported accessible silver-coated fiber-type electrodes (20 × 20 mm2). Fiber-based electrodes are affordable but vulnerable to motion artifacts on the forehead depending on facial movements.
To overcome the disadvantages of previous fabric electrodes, Eskandarian et al. [90] introduced 3D-knit fabric-type electrodes based on conductive elastomeric filaments (CEFs), which are flexible, breathable, and washable, as shown in Figure 3d. The conductive elastomeric materials are knitted or woven into electrodes, and the fabric electrodes can also be integrated into ordinary garments. This combination of fabric-type electrodes and garments makes it possible to monitor electrophysiological signals. The fabric-type electrodes have a 3D structure that conforms to human skin. Briefly, the manufacturing process is as follows: (1) CEF fiber is knitted to form the electrode’s surface; (2) polyester yarn is then knitted as a 3D structural filler in the spacer layer, and polyester is knitted on the back layer to support the 3D structure. With these fabric-type electrodes, smart garments can be used for the long-term monitoring of electrophysiological signals without severe motion artifacts. Paper-based electrodes have advantages similar to the fabric-type electrodes above but a simpler manufacturing process; they are fabricated using inkjet printing [91,92], spin coating [93], and screen printing [94]. Golparvar et al. [57,95,96] introduced wearable graphene textiles with a different fabrication process [57]: an ordinary textile is dipped in a graphene oxide (GO) solution, and thermal treatment and chemical reduction are then conducted to obtain reduced graphene oxide (rGO). These graphene textile electrodes promise flexibility, breathability, and usability in everyday garments. Their flexibility matches skin deformation, and their permeability to air and moisture relieves skin irritation. Because of this usability, wearable graphene textile electrodes are likely to be adopted by sportswear companies for smart wearable devices.
Figure 3. Examples of composite electrodes. (a) Carbon nanotubes embedded in a printed eyeglass (Reprinted with permission [5]. Copyright 2020 American Chemical Society). (b) Conductive polymer foam based on urethane and taffeta materials (Reprinted with permission under the terms of the CC-BY license [38]. Copyright 2021, the Authors. Published by MDPI). (c) Silver embroidered electrode and electrode-lead connection (Reprinted with permission under the terms of the CC-BY license [21]. Copyright 2021, the Authors. Published by MDPI). (d) 3D-Knit dry electrodes using conductive elastomeric fibers with CEF (Reprinted with permission [90]. Copyright 2022 Wiley-VCH GmbH). (e) Photographs demonstrating adhesion of the flexible hydrogel (Reprinted with permission under the terms of the CC-BY license [97]. Copyright 2021, the authors. Published by MDPI). (f) Tortuosity of the proposed hydrogel at −115 °C (Reprinted with permission under the terms of the CC-BY license [98]. Copyright 2021, the Authors. Published by IOP). (g) Photographs demonstrating the stretchability of the starch hydrogel (Reprinted with permission [61]. Copyright 2022 Wiley-VCH GmbH).
Fiber-type electrodes have limited stretchability, which is not suitable for uneven skin, and they are vulnerable to temperature and humidity. Some research groups have presented hydrogel electrodes to overcome these limitations of fiber-type electrodes [61,97,98] (Figure 3e). Among them, Wang et al. [98] introduced a conductive nanocomposite network hydrogel fabricated by projection microstereolithography (PμSL)-based 3D printing. This 3D-printed hydrogel shows high stretchability with high conductivity and can capture biopotentials precisely. As shown in Figure 3f, the 3D-printed hydrogel is stretchable and bendable even at low temperatures (−115 °C). Wan et al. presented a starch hydrogel patch made from lotus rhizome. As shown in Figure 3g, this conductive starch hydrogel has high stretchability (790%), good adhesion, and a low Young’s modulus (4.4 kPa), enabling conformal attachment to uneven human skin. To fabricate the starch hydrogel patch, a skeleton material (lotus rhizome) and an electrolyte (NaCl) are combined. These materials allow EOG signals to be captured with biocompatibility and biodegradability. Wang et al. introduced another flexible hydrogel electrode [97], providing exceptional breathability, a low modulus (286 kPa), and adhesion to human skin as a biocompatible biosensor. Compared to conventional gel electrodes, this hydrogel electrode is biocompatible and causes less skin irritation. It is made of a conductive hydroxypropyl cellulose/polyvinyl alcohol (HPC/PVA) hydrogel on a flexible polydimethylsiloxane (PDMS) substrate.

2.1.2. Dry Electrodes

Recent advancements in microfabrication technologies have opened the possibility of micro-patterned electrode designs and facilitated sophisticated micro- or nano-scale electrodes with diverse sizes, shapes, and mechanical and electrical properties. Here, we introduce micro-patterned electrodes on various substrates such as polymer [11], paper [99], and metal (gold [1] and silver [100]). Among the three types of dry electrodes utilized for capturing EOG signals, the polymeric substrate is regarded as an attractive material because its scalable processing enables various electrode forms. As shown in Figure 4a (left), Ameri et al. [11] introduced graphene electronic tattoos with ultrathin, ultrasoft, transparent, and breathable substrates. These electrodes are manufactured with graphene and polymethyl methacrylate (PMMA). Figure 4a (right) shows the manufacturing process: graphene is grown on copper foil, and a 350 nm film of PMMA is spin-coated on the graphene. The copper layer is then etched away and rinsed with deionized (DI) water, and the graphene/PMMA layer is transferred onto commercial tattoo paper. The graphene/PMMA layer on the tattoo paper is then cut with a shadow mask and a mechanical cutter plotter (Silhouette America Inc., Lindon, UT, USA). This electrode is designed as serpentine-shaped ribbons to enable stretchability (50%) [10]. Other electrodes use paper-type substrates to realize a dry electrode [99]. Epidermal paper-based electronic devices (EPEDs, Figure 4b (left)) are manufactured with a benchtop razor printer, which is simple and low-cost. As shown in Figure 4b (right), the paper substrates used for EPEDs are silanized with fluoroalkyl trichlorosilane to obtain inexpensive, water-resistant materials that are mechanically compliant with human skin, and conductive inks or thin films are attached to one side of the paper substrate. The open-mesh serpentine layout of the EPEDs is cut by a programmable razor printer (Silhouette CameoTM, Silhouette America Inc., Lindon, UT, USA). The silanization with fluoroalkyl trichlorosilane renders the paper hydrophobic and prevents the EPEDs from getting wet. Because of their low thickness, EPEDs can conform to skin wrinkles [101]. Moreover, the mechanical reinforcement of EPEDs allows them to withstand accidental stresses of up to 2.5 MPa, and their serpentine pattern enables them to endure stretching up to 58% before mechanical failure. “Skin-like” bioelectrodes made of metal (gold or silver) take advantage of mesh-patterned dry electrode designs (Figure 4c (left)). A representative example utilizing Au was introduced by Mishra et al. [1]. A cleaned glass slide is coated with primer (MicroChem Corp., Westborough, MA, USA) for adhesion; PMMA and PI are then coated on the glass slide and cured. Au is deposited on the PI, and photolithography-defined patterns are designed according to a “skin-like” fractal layout. From the perspective of skin assessment, the fractal bioelectrode is advantageous over the conventional gel electrode: the gel electrode causes skin irritation by raising the skin temperature, whereas the “skin-like” fractal electrode shows a negligible change in skin temperature. The fractal electrode demonstrates mechanical compliance in both stretchability (30%) and bendability (up to 180°).
Another manufacturing method for “skin-like” bioelectrodes is aerosol jet printing (AJP), as shown in Figure 4c (right). As a potentially low-cost and scalable printing method [102], AJP allows the direct printing of an open-mesh structure from silver nanoparticles (AgNPs) (UT Dots Inc., Champaign, IL, USA) onto a soft membrane without an expensive nano/microfabrication facility [103]. The “skin-like” bioelectrode designed by computational modeling showed highly flexible (180° bending with a radius of 1.5 mm) and stretchable (up to 100% biaxial strain) characteristics. Peng et al. [100] proposed a flexible dry electrode with an Ag pad and ten thousand micro-AgCl pads (Figure 4d). This flexible dry electrode is manufactured with parylene C (PC) (Sigma–Aldrich, St. Louis, MO, USA). As shown in Figure 4d (right), a parylene layer is deposited on a glass wafer by chemical vapor deposition (CVD). A positive photoresist (PR) spun on the parylene film is then patterned by ultraviolet (UV) light, followed by sputtering and lift-off processes. Ag is electroplated and partly chloridized by electrochemical methods, and finally the PR is removed. These dry electrodes are based on parylene, which is biocompatible, flexible, and adheres well. Because this electrode is thin and flexible compared with conventional electrodes, it can maintain a stable and low electrode–skin impedance.
Table 1. Summary of electrodes for measuring EOG signals.
| Electrode Type | Conductive Material | Supporting Substrate | Biopotential | Biocompatibility | Stretchability | Bendability | Fabrication | Size | Modulus | Advantages | Refs. |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Polymer | CNT | PDMS | EOG, EEG | O | O | O | Mix and curing | 20 × 5 × 5 mm3 | Elasticity 4 MPa | Less change in electrical resistance against mechanical deformation; high signal-to-noise ratio | [5] |
| Polymer | Ni/Cu | Urethane foam | EOG, EEG | O | X | X | Assembling metal and foam | 14 × 8 × 8 mm3 | Compression set 5% | Low interference from the skin–electrode interface | [38,84] |
| Polymer | Ag/AgCl | Parylene | EOG, EEG, EMG | O | X | O | Microfabrication process | 10 × 10 × 0.05 mm3 | - | Ease of thickness control, ultrathin fabrication; well-fitting skin topology | [100] |
| Polymer | Graphene | PDMS | EOG | O | 50% | O | APCVD and coating | 6 × 20 mm2 | - | Ultrathin, ultrasoft, transparent, and breathable; angular resolution of 4° of eye movement | [11] |
| Fiber | Graphene | Cotton textile fabrics | EOG | O | X | O | Simple pad-dry technique | 35 × 20 mm2 | - | Simple and scalable production method | [104] |
| Fiber | Graphene | Textile fabrics | EOG | O | O | O | Dipping and thermal treatment | 30 × 30 mm2 | - | Possibility and adaptability for mass manufacturing | [42,57,96] |
| Fiber | Silver | Textile fabrics | EOG, EMG | O | X | O | Embroidering | 20 × 20 mm2 | - | Comfort and usability with the measurement head cap | [89] |
| Fiber | CEF | CEF fibers | EOG, ECG | O | 258.12% | O | Industrial knitting machine | 20 × 20 mm2 | Stress 11.99 MPa | Flexible, breathable, and washable dry textile electrodes; unrestricted daily activities | [90] |
| Fiber | Silver polymer | Escalade fabric | EOG, EMG | O | O | O | Screen and stencil printing | 12 × 12 × 1 mm3 | - | Textile compatible; relatively low cost for a production line; smaller-scale manufacturing | [105,106] |
| Fiber | Copper | Omniphobic paper | EOG, ECG, EMG | O | 58% | O | Razor printer | 20 × 15 mm2 | Stress 2.5 MPa | Simple, inexpensive, and scalable fabrication; breathable Ag/AgCl-based EPEDs | [99] |
| Fiber | Silver/polyamide | Fabric | EOG | O | O | O | Mix and coating | 10 × 10 mm2 | - | Reduction in noise by appropriate contact | [40] |
| Hydrogel | PEGDA/AAm | - | EOG, EEG | O | 2500% | O | PμSL-based 3D printer | 15 × 15 mm2 | - | Excellent stability and ultra-stretchability | [98] |
| Hydrogel | Starch | Sodium chloride | EOG | O | 790% | O | Gelation process | 30 × 10 mm2 | 4.4 kPa | Adhesion, low modulus, and stretchability; no need for crosslinker or high pressure/temperature | [61] |
| Hydrogel | HPC/PVA | PDMS | EOG | O | 20% | O | Coating | 30 × 10 mm2 | 286 kPa | Well adhered to the dimpled epidermis | [97] |
| Hydrogel | MXene | Polyimide | EOG, EEG, ECG | O | O | O | Mix and sonicating | 20 × 20 mm2 | - | Low contact impedance and excellent flexibility | [107] |
| Hydrogel | PDMS-CB | - | EOG | O | O | O | Mix and deposition | 15 × 15 mm2 | 2 MPa | Continuous, long-term, stable EOG signal recording | [108] |
| Metal | Silver | Polyimide | EOG | O | 100% | O | Microfabrication process | 10 × 10 mm2 | - | Highly stretchable, skin-like biopotential electrodes | [30] |
| Metal | Gold | Polyimide | EOG | O | 30% | O | Microfabrication process | 15 × 10 mm2 | 78 GPa | Comfortable, easy-to-use, and wireless control | [1] |

2.2. Examples of Platforms for EOG Monitoring

To meet users’ requirements for long-term comfort, research groups have designed various types of wearable EOG device platforms. As shown in Figure 2e, previous EOG devices were bulky and had many attached wires, which limited long-term or continuous monitoring of the user’s daily eye movements and made it inconvenient to incorporate the device into one’s attire. The contact between soft human skin and rigid EOG devices also introduces limitations such as noise during biopotential collection [109,110,111]. With recent advances in wearable technologies, Yeo et al. [1,30] suggest that wearable sensor systems should be soft, compact, and built-in to solve the above problems [112]. In addition, researchers and subjects have indicated that wearable sensor systems should not interrupt daily behavior [112]. Advances in circuit systems enable wireless, real-time, continuous detection of biopotentials [113]. This section introduces four types of wearable platforms: glasses, face masks, headbands, and earplugs (Table 2).

2.2.1. Eyeglass Type

Glasses-type platforms enable convenient and inconspicuous applications and minimize user distraction during autonomous long-term use in daily life. As another advantage, a glasses-type platform can be used with prescription lenses because the EOG device is embedded within a traditional glasses frame [114]. About 92% of the population over 70 already wears corrective glasses [115], so these platforms are likely to be adopted by elderly individuals who already require eyeglasses [115]. Among the various glasses-type platforms, we introduce goggle-based devices, commercial devices, and devices manufactured by 3D printing. Figure 5a shows a goggle-based wearable EOG device aimed at mobile applications such as activity and context recognition. The goggle-based platform is designed to achieve these aims with a user-friendly fit. Compared with the existing bulky devices shown in Figure 2c,d, the weight of the entire device (i.e., including the goggles and circuit boards) is only 150 g, and flat metal electrodes (Figure 2d) are held around the user’s eyes under constant pressure. This comfort allows the device to be worn continuously for more than a few hours. Andreas et al. [116] manufactured goggle-type devices and predicted that mobile applications could use a large TV as the input medium [38]. One commercial device, the JINS MEME eyewear (JINS MEME Inc., Tokyo, Japan), looks similar to a typical pair of glasses. To collect EOG biopotentials together with kinematic motion data, the JINS MEME consists of three metal electrodes, an accelerometer, and a gyroscope. The three metal electrodes are placed on the bridge and nose pads of the glasses to acquire EOG signals in the horizontal and vertical dimensions, and the accelerometer and gyroscope are embedded in one of the arms of the glasses to collect motion data. These embedded sensors and metal electrodes can continuously detect human activity data in real time. The JINS MEME eyewear is shown in Figure 5b [39]. With the recent development of 3D printing technology, Lee et al. [5] and Kosmyna et al. [117] directly manufactured wearable platforms in the form of glasses, as shown in Figure 5c. Here, we introduce multifunctional electronic eyeglasses (E-glasses) made with a 3D printer. These 3D-printed eyeglasses can monitor biopotentials such as EEG and EOG, as well as UV intensity, in wireless, real-time modes. Instead of conventional gel electrodes, soft conductive composite electrodes are placed on the E-glasses for superior electrical and mechanical properties. The device is designed to maintain seamless contact between skin and electrodes through constant pressure for reliable biopotential measurements. Various human motions can also be observed by analyzing the accelerometer data. As one of the advantages of the glasses-type platform, the lenses of this device can be exchanged to meet the user’s needs, such as sunglasses for UV protection [31,118,119] or prescription lenses for eyesight. As shown in Figure 5d, the electrodes for recording biopotential signals such as EOG and EEG are listed (SO: source electrode for EOG; RO: reference electrode for EOG; SE: source electrode for EEG; RE: reference electrode for EEG; G: ground electrode). Constant pressure is applied to the CNT/PDMS electrodes through the E-glasses legs and support fixtures [5].
Figure 5e shows another 3D-printed glasses-type platform, which consists of two printed circuit boards (PCBs), two EEG electrodes, two EOG electrodes, a reference electrode, and a lithium polymer (LiPo) battery. This device is made of nylon plastic, which is a flexible material. Particular parts of the eyeglass frame are made of silver and serve as electrodes to monitor EOG and EEG. The EOG electrodes are located on the nose pads, similar to the E-glasses structure above, and an extra silver electrode is placed on the nose bridge of the glasses to serve as a reference electrode (EOG electrodes (1), reference electrode (2), EEG electrodes (3), PCBs (4), LiPo battery (5), and a small open chamber for a piezoelectric element to deliver bone-conducted sound) [32].
Table 2. Summary of wearable EOG platforms.
| Wearable Platform | Electrode Type | Electrode Material | Electrode Count | Platform Size | Platform Features | Refs. |
| --- | --- | --- | --- | --- | --- | --- |
| Earplug | Foam | Silver | 2 | 2 × 2 × 1 mm3 | Stable and comfortable during sleep | [44,45] |
| Earplug | Foam | Conductive cloth | 2 | 2 × 2 × 1 mm2 | Stable and comfortable during sleep | [43] |
| Eyeglass | Gel | Ag/AgCl | 6 | 15 × 14 × 5 cm3 | Many attached wires | [37] |
| Eyeglass | Metal | Silver | 3 | 15 × 14 × 5 cm3 | Real-time delivery of auditory feedback | [32,33,117] |
| Eyeglass | Metal | Ag/AgCl | 5 | 15 × 14 × 7 cm3, 150 g | Constant pressure for electrodes | [34,116] |
| Eyeglass | Foam | CNT/PDMS | 5 | 15 × 14 × 5 cm3 | UV protection via sunglass lenses | [5] |
| Eyeglass | Foam | Ni/Cu | 5 | 14 × 12 × 7 cm3 | Absorbs motion forces via foam and platform | [38] |
| Facemask | Fiber | Silver/polyamide | 3 | 14 × 7 × 2 cm3 | Wires embedded in the eye mask platform | [40] |
| Facemask | Metal | Silver/carbon | 8 | 20 × 15 cm2 | Tattoo-based platform; stable and comfortable | [41] |
| Facemask | Fiber | Graphene | 5 | 15 × 7 × 2 cm3 | High degree of flexibility and elasticity | [42] |
| Headband | Gel | Ag/AgCl | 4 | 15 × 7 cm2 | Waveforms well measured on the headband platform | [51] |
| Headband | Metal | Ag/AgCl | 4 | 15 × 7 cm2 | Reduced total cost by using disposable Ag/AgCl medical electrodes | [55] |
| Headband | Fiber | Graphene | 3 | 15 × 7 cm2 | Long-term EOG monitoring applications | [21,57,96] |
| Headband | Fiber | Silver | 5 | 15 × 7 cm2 | Reusable, easy-to-use electrodes integrated into the cap | [89] |
| Headband | Fiber | Silver-plated and nylon | 3 | 15 × 7 cm2 | Long-term EOG monitoring applications | [58] |

2.2.2. Facemask Type

For a comfortable and stable fit, face mask-type platforms have been presented. Among the various face mask-type platforms, we introduce different types of eye masks (Figure 6a) as well as tattoo-based and commercial devices. In the eye mask platform [40], the electrodes are made of conductive sponge materials. Three dry sponge electrodes are placed on the eye mask around the user’s eyes: two acquire the EOG signal, and the third serves as a reference electrode. To reduce the pressure applied to the skin, the manufactured eye mask platform conforms to the shape of the skin. The wires are embedded in the eye mask platform to reduce noise from wire movement. Another eye mask-type platform integrates a sleep eye mask with electrodes. This eye mask platform uses EXCELLENT 47 (Moxie Corporation, Taipei, Taiwan) instead of a conventional gel electrode. The proposed dry fabric electrode consists of a high-performance silver/polyamide (20%/80%) compound. The combination of the sleep eye mask and the soft fabric electrode reduces noise by appropriately contacting the user’s skin to acquire the biopotential. Another face mask platform, reported by Shustak et al. [41], is a tattoo-based EOG device, as shown in Figure 6b. This tattoo-based device acquires various biopotentials, such as EEG, EOG, and EMG, using a dedicated electrode layout on the user’s face. The layout is implemented on thin polyurethane films with silver electrodes coated with a biocompatible carbon layer. A double-sided adhesive is used for stable contact between the skin and the tattoo-based platform [120]. The positions of the electrodes and the acquired biopotentials are numbered as follows: EMG electrodes (1 and 2), EOG electrodes (3 and 4), and forehead EEG electrodes (5–8). The Nox A1 portable H-PSG system (Nox Medical, Reykjavík, Iceland), together with an ambulatory electrode set, is a commercial face-type device that can capture EOG and EEG signals. As shown in Figure 6c, EOG electrodes (F8 and F7) and EEG electrodes (Af8, Fp2, Fp1, and Af7) are placed on the forehead [121].

2.2.3. Headband Type

EOG signals can be acquired not only around the eyes but also on the forehead. Heo et al. [51] designed a wearable EOG device based on a headband to acquire forehead EOG signals. Among the various headband-type platforms, we introduce a soft fabric headband and commercial devices. In general, dry electrodes are placed around the forehead on the inside of the headband. Two electrode sets measure horizontal and vertical eye movements, and another electrode is used as a reference. As shown in Figure 7a, the printed circuit board (44 mm × 55 mm) is placed on the back side of the headband. Such a soft fabric headband-type platform provides a comfortable fit and stably secures the electrodes on the skin. One commercial headband-type wearable device (Figure 7b), from NeuroSky (San Jose, CA, USA), is used as brain–computer interface (BCI) equipment [49]. The NeuroSky headband is adjustable and low cost, with an inexpensive dry sensor. Because a single dry electrode on the forehead acquires the biopotential, the EOG signal mixed with the EEG signal contains limited information; however, built-in electrical noise reduction software and hardware make it easy to detect the EOG signals together with the EEG signal. Another commercial headband-type wearable device, the Muse, has four biopotential channels for monitoring eye movements and brain waves, as well as a three-axis accelerometer and gyroscope for detecting head motion. In the Muse device (Figure 7c), the electrodes are located on the forehead and behind the ears (as shown in Figure 7d, two on the forehead (AF7 and AF8) and two behind the ears (TP9 and TP10)), with the reference electrode located at the center of the forehead (Fpz) [122,123].

2.2.4. Earplug Type

The earplug-type platform aims to be a human-centered, compact, non-obtrusive, and ergonomic wearable device. In addition, because it is non-invasive, users can wear it for a long time without fatigue. Figure 7e shows a pair of small, thin passive electrodes attached to the surface of the earplugs [6]. Alternatively, another earplug-type platform uses an electrode made from a small piece of conductive silver cloth layered many times with pure, thin silver leaf. This wearable platform accommodates the delicate structure of the human ear so that users can wear it comfortably inside the ear while sleeping. To ensure a comfortable and snug fit, the substrate material of the earplug-type platform is a memory foam that absorbs artifacts stemming from small and large mechanical deformations of the ear canal walls. The earplug electrodes must be placed properly to acquire the EOG signal together with the EEG signal; the suggested arrangement is the main electrode in one ear and the reference electrode in the other.

2.3. Signal Processing Algorithms and Applications

2.3.1. EOG Signal Processing

Figure 8a shows the detailed pre-processing of EOG signals received through Bluetooth low energy (BLE) embedded in the circuit (sampling rate of 250 Hz). Before classification, noise removal, baseline drift removal, and data averaging are implemented as pre-processing. A band-pass filter is applied to remove noise components [125]. When the EOG signal received from the skin is converted from analog to digital, a DC offset is generated; the first DC offset value is subtracted from all signal values to remove this drift. Noise and trends can interfere with data analysis and should be eliminated. To smooth the EOG waveforms, samples are divided into small sets and averaged, which further suppresses noise. EOG signals are generally classified into five classes (left, right, up, down, and blink). For this classification, thresholds are set to specific values (horizontal channel: right (400 μV) and left (−400 μV); vertical channel: up (400 μV), down (−400 μV), and blink (500 μV)). Alternatively, EOG signals can be classified by comparing the amplitude or width of the peaks, or by whether the peak-to-peak difference is negative or positive, as shown in Figure 8b. However, signal processing alone can distinguish only a limited number of classes, and rule-based signal processing has many limitations for medical analyses. Here, we introduce machine learning for larger class sets and for medical analyses.
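The sketch below illustrates one way such a pre-processing chain might be implemented in Python: DC offset removal, a Butterworth band-pass filter, block averaging for smoothing, and threshold-based classification into the five classes described above. The cutoff frequencies, the block length, and the mapping of channel polarity to direction are illustrative assumptions rather than values taken from a specific cited system (the sign convention depends on the electrode wiring).

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250.0  # sampling rate (Hz), as in the system described above

def preprocess_eog(raw_uv, lowcut=0.5, highcut=30.0, block=5):
    """Remove DC offset, band-pass filter, and block-average one EOG channel (microvolts)."""
    x = raw_uv - raw_uv[0]                               # remove the initial DC offset / baseline drift
    b, a = butter(3, [lowcut / (FS / 2), highcut / (FS / 2)], btype="band")
    x = filtfilt(b, a, x)                                # zero-phase band-pass filtering
    n = len(x) // block * block
    return x[:n].reshape(-1, block).mean(axis=1)         # average small blocks of samples to smooth

def classify(h, v):
    """Threshold-based five-class decision from pre-processed horizontal/vertical channels."""
    if np.max(v) > 500:                                  # blink peak exceeds the 500 uV threshold
        return "blink"
    if np.max(h) > 400:
        return "right"
    if np.min(h) < -400:
        return "left"
    if np.max(v) > 400:
        return "up"
    if np.min(v) < -400:
        return "down"
    return "rest"
```

A practical system would apply `classify` to short sliding windows of the two pre-processed channels rather than to whole recordings.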

2.3.2. Machine Learning

Recently, research groups have introduced machine learning to analyze EOG signals. Machine learning techniques are applied according to the purpose of each study and application, and different techniques can be used for the same purpose. Here, we introduce two of them: a discrete wavelet transform (DWT)-based classifier and a linear discriminant analysis (LDA) classifier. LDA is a common classifier that uses dimensionality reduction and can solve two-class classification problems. Figure 9a is an example of an LDA classifier (targeting EOG from the eye movements “blink” and “down”). To remove noise, a third-order Butterworth band-pass filter is used, and a series of peaks is detected using thresholds. The start and end times detected by the thresholds are factors that increase detection accuracy. Pre-processed EOG signals are divided into test and training data sets, which are passed to the LDA classifier, as shown in Figure 9a. The LDA classification plot includes both correct (o) and incorrect (x) classes. Another technique is the DWT, a wavelet transform in which the wavelets are sampled at discrete intervals. As shown in Figure 9b, the DWT classifier targets EOG from the eye movements “left” and “right”. The acquired EOG signals are classified based on eye movements and the angle of eye rotation. The fifth-level DWT coefficients with a scale of 100 and the “sym8” basis function are selected for the DWT. To remove noise, a third-order Butterworth band-pass filter is used (fc = 0.5–50 Hz).
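As a hedged illustration, the snippet below combines the two techniques mentioned above: fifth-level DWT coefficients (with the "sym8" wavelet) are used as features, and an LDA classifier separates two eye movement classes. The feature choice (approximation coefficients only), the 2 s / 500-sample windows, and the synthetic "blink" and "down" waveforms are assumptions for demonstration only, not the exact pipeline of the cited studies.

```python
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def dwt_features(window, wavelet="sym8", level=5):
    """Fifth-level DWT approximation coefficients as a feature vector."""
    coeffs = pywt.wavedec(window, wavelet, level=level)
    return coeffs[0]  # approximation coefficients capture the slow EOG deflection

# Synthetic two-class data standing in for segmented "blink" vs. "down" EOG windows (250 Hz, 2 s)
rng = np.random.default_rng(0)
t = np.linspace(0, 2, 500)
blink = [np.exp(-((t - 1.0) ** 2) / 0.01) * 500 + rng.normal(0, 20, t.size) for _ in range(40)]
down = [-400.0 * (t > 1.0) + rng.normal(0, 20, t.size) for _ in range(40)]
X = np.array([dwt_features(w) for w in blink + down])
y = np.array([0] * 40 + [1] * 40)

clf = LinearDiscriminantAnalysis().fit(X[::2], y[::2])   # train on half of the windows
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))  # evaluate on the other half
```

With real recordings, the windows would come from the pre-processing step described in Section 2.3.1, and detail coefficients or peak statistics could be appended to the feature vector.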

2.3.3. Applications

With the recent development of wearable EOG device platforms, EOG signals can be easily acquired and applied to HMI applications without the limitations of previous bulky, wired EOG devices, and the use of HMI applications is increasing rapidly. There are two types of applications: controller types and analysis types, as shown in Table 3. As shown in Figure 10a, in controller-type applications such as wheelchairs [1,4,51,52], drones [11,59], game interfaces [5,36,47,60,61], and virtual keyboards [34,38,51,62], a command is sent to the HMI by detecting the direction of the eyes. However, EOG-based HMI has limitations in eye angle and gaze detection: in general, only four or six eye directions can be detected by signal processing, which is limiting for HMI applications that require complex commands. The EOG signal is also sensitive to noise and to users’ small movements, so it is difficult to apply to surgical robots that require accurate movement. To overcome these limitations, research groups analyze multiple biopotentials simultaneously. As shown in Figure 10b, various healthcare monitoring systems [7,40,41,44,45,63] and medical health status analyses [64,65,66] have been conducted using multiple biopotentials, such as EOG, EEG, and EMG, with signal processing. In general, EMG, EOG, and EEG signals are obtained simultaneously from the subject’s face, and information for healthcare analyses is extracted via signal processing with machine learning. Sleep and fatigue monitoring has received particular attention. To monitor sleep stages, Shustak et al. [41] recorded EMG, EOG, and EEG using a wireless system. This sleep monitoring system showed clear differentiation of sleep stages over 6 h, demonstrating the potential of home-based sleep disorder monitoring. Jiao et al. [63] presented a novel model for driver sleepiness detection by simultaneously analyzing EEG and EOG signals. The driver sleepiness detection system based on EEG and EOG uses a long short-term memory (LSTM) classifier and achieves a mean accuracy of 98%. The research group concluded that a wearable sleepiness detection system could be used to reduce traffic accidents by detecting sleepiness. From a healthcare perspective, researchers are also using EOG signals to analyze attention deficit hyperactivity disorder (ADHD) [64,65,66] and to detect emotions [126,127,128]. Soundariya et al. [127] introduced an emotion recognition system based on EOG signals from eye movements, in which the recorded EOG signal is classified as happiness, sadness, anger, fear, or pleasure by a support vector machine (SVM) classifier.

3. Eye Trackers

Recent advances in computing power have become sufficient for real-time eye tracking, enabling video- and screen-based eye trackers [67,129]. Since then, with new technologies in optical tracking cameras and machine learning, eye tracking has been widely deployed with stationary cameras or camera-embedded glasses [67,68]. These cameras can record corneal infrared light reflections for tracking pupil position, mapping the tracked gaze while recording, and calculating other parameters such as tracking rate, dwell time, and pupil dilation [68]. These parameters are used in dynamic stimulus analyses to create markers of eye concentration, which are essential for tracking various human stimuli and applications [68,130]. Recent eye tracking technology has also been integrated into virtual reality (VR) and mixed reality (MR) setups to meet demands in the entertainment domain and in cognitive-function assessment for clinical use [131,132,133,134,135].

3.1. Details of Eye Trackers

3.1.1. Human Eye Movement and Stimuli

All of the main natural eye movements serve to reposition the eye’s visual axis on the fovea [136]. The anatomy of the human eye is presented in Figure 11a. When the eye looks at a target, the visual axis connects the fixation point to the center of the entrance pupil and to the front and rear nodal points [137]. The eye moves when a user looks at an object, even when perceiving stationary objects [136]. In real eyes, the fovea lies slightly inferior and temporally displaced from the point where the optical axis meets the retina, which is used to detect eye movement [137]. In general, the eye has six degrees of freedom inside the eye socket: three rotations and three translations [138,139]. The eye is rotated by pairs of extraocular muscles that provide this freedom of eye movement control [67,136]. Eye movements can be classified into two main categories. The first is saccadic movements. When we attempt to fixate the gaze on a target area of interest, the eye does not stay still but continuously moves [131]. Also known as rapid eye movements, saccades quickly redirect the visual axis of the eye onto the area of interest and can be both highly reflexive and voluntary [135]. This movement shifts the eye’s vision to the object through gaze angle control [131]. Moreover, microsaccades (fixational saccades) are small eye movements that constitute fixation, which is the basis of visual perception [135]. The second category is stabilizing movements, which attempt to hold the eye still or keep the retinal image stable [3,131,135]. Fixations occur when the gaze remains within a particular constrained area for a prolonged time, providing fixational dynamics and statistics [136,138]. Figure 11b shows the foveal angle and the human vision span around the gaze direction; these numbers vary across studies. While looking at an object with each eye’s fixation point remaining on the fovea, drift is an uneven and relatively slow movement of the eye’s axis [135]. The iris controls the amount of light admitted onto the retina by contracting and expanding the pupil [136]. The crystalline lens, a transparent biconvex structure located behind the pupil, receives and focuses the image on the retina [67,136]. The retina then converts the received image, or visual stimuli, into electric signals and transmits them through the optic nerves to the visual cortex, stimulating the occipital lobes of the brain [133,138].
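The distinction between saccades and fixations can be drawn computationally with a simple velocity threshold, as in the hedged sketch below (an I-VT-style per-sample classifier). The 30°/s threshold, the 120 Hz sampling rate, and the input format are illustrative assumptions, not parameters from the cited studies, and fine-grained phenomena such as drift and microsaccades are not separated here.

```python
import numpy as np

def classify_samples(gaze_deg, sample_rate_hz=120.0, velocity_thresh_deg_s=30.0):
    """Label each gaze sample as part of a saccade or a fixation (velocity-threshold method).

    gaze_deg : (N, 2) array of gaze angles in degrees (horizontal, vertical).
    Samples whose angular velocity exceeds the threshold are treated as saccadic;
    the remaining samples are treated as fixational.
    """
    dt = 1.0 / sample_rate_hz
    velocity = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) / dt   # deg/s between samples
    labels = np.where(velocity > velocity_thresh_deg_s, "saccade", "fixation")
    return np.concatenate([labels[:1], labels])                         # repeat first label to keep length N
```

Consecutive "fixation" samples would then be merged into fixation events, from which durations and dispersion statistics such as those mentioned above can be computed.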

3.1.2. Principles of Eye Tracking Technology

When detecting an eye, it is essential to account for the eye’s appearance, which changes depending on the angle at which the user is looking [135]. Non-invasive eye trackers rely on measurements of the eyes’ observable characteristics, including the pupil, the iris–sclera boundary, and the corneal reflection of nearby light sources [135,139]. As shown in Figure 11a,c, techniques based on corneal reflections measure the position of the corneal reflection of infrared (IR) light relative to the pupil, which allows the gaze direction to be tracked accurately [129]. The most widespread method for tracking eye movements is screen-based or video oculography, which uses the reflections of the iris and cornea or of the pupil and cornea [136,140,141]. As Figure 11c illustrates, screen-based gaze tracking technologies are simple to use and set up for various applications [71,131,132,133,135]. The pupil and the limbus are the features used most often for tracking [138]. Tracking the limbus, the boundary between the sclera and the iris, can trace horizontal eye rotations because of its contrast [67]; however, limbus tracking systems have poor vertical precision because the eyelids partially obscure the iris [67,136]. The pupil is more challenging to track because of the lower contrast between the pupil and the iris, but it can be distinguished when illuminated by an infrared light source on the camera axis [3,136,139], which produces a “red-eye” effect image [3,136]. IR light sources are frequently used in eye trackers to increase the contrast between ocular features [138] because IR light is invisible and does not distract or interfere with the user during tracking [136,140]. With this characteristic, eye trackers have been successfully integrated into head-mounted, wearable, and infrared-based gaze tracking systems [132,133,134,139]. Such a system consists of an optical camera, IR light sources, a CPU for data processing, and a screen or monitor on which the subject’s gaze is determined [135,139]. For accurate gaze localization in a video-based system, high-resolution eye images are required [3,71,142]. Image processing is required to calculate the three-dimensional rotation angles of the eye, and these algorithms determine the pupil location, corneal glint positions, and other properties of the eye [67,138,140], as shown in Figure 11d. The point and direction of gaze can be computed instantly by an eye tracker using low-cost cameras and image processing technology [131,138]. Recent developments in machine learning techniques and algorithms have achieved accuracies of under one degree [67,68,143]. Recent studies have attempted to improve gaze data to predict accurate eye motions by presenting end-to-end, user-specific prediction models with convolutional neural network (CNN) architectures [3,131], as shown in Figure 11e. In human–machine interfaces, practical AI applications begin with data collection, data cleaning, and standardization, followed by data interpretation using algorithms [144]. Deep-learning prediction models have overcome limiting factors in real-world conditions [145]. Hence, bioelectrical signals provide a natural and interactive way for humans and machines to connect and are extensively used in clinical diagnosis and rehabilitation with machine learning [144].
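A simplified version of this image-processing step is sketched below using OpenCV: the dark pupil is segmented by intensity thresholding and its center is estimated from image moments. Real PCCR systems additionally localize the corneal glint and map both features to screen coordinates through a calibration procedure; the threshold value, the dark-pupil assumption, and the function name are illustrative, not taken from a cited implementation.

```python
import cv2

def find_pupil_center(eye_gray, dark_thresh=40):
    """Estimate the pupil center (x, y) in a grayscale IR eye image.

    Assumes a dark-pupil setup (off-axis IR illumination) so that the pupil is the
    largest dark blob; a bright-pupil ("red-eye") setup would invert the threshold.
    Returns None if no candidate region is found.
    """
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    _, mask = cv2.threshold(blurred, dark_thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)            # largest dark region ~ pupil
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])     # centroid of the pupil blob
```

In a full gaze tracker, the vector between this pupil center and the detected corneal glint would be mapped to gaze coordinates using a per-user calibration, as described above.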
Figure 11. Eye movements and eye tracking technology. (a) Optical metric for human eye tracking (Reprinted with permission [146]. Copyright 2020, The Psychonomic Society, Inc.). (b) Eye foveal angle and human vision span (Reprinted under terms of the CC-BY license [131]. Copyright 2021, the Authors. Published by Elsevier Ltd.). (c) Eye grid and corneal light reflection in eye tracking systems (Reprinted under terms of the CC-BY license [135]. Copyright 2012, the Author. And Reprinted with permission [141]. Copyright 2014, Elsevier). (d) Illustration of relative cornea location between camera and eye, during eye rotation (Reprinted under terms of the CC-BY license [147]. Copyright 2021, the Authors. Published by MDPI). (e) Eye motion and gaze prediction model with CNN (Reprinted with permission [145]. Copyright 2022, Springer Nature). (f) Eye landmark estimation with image processing used for custom eye tracking solutions (Reprinted under terms of the CC-BY license [148]. Copyright 2021, the Authors. Published by MDPI).

3.1.3. Employment of Eye Tracking Technologies for Applications

Eye tracking is used to determine where and when the user’s eyes are focused [3]. Eye movements such as saccades, smooth pursuit, vergence, and vestibulo-ocular movements indicate human perception and recognition [136,140]. Improved sensor technology expands the possibility of a comprehensive understanding of a user’s visual attention [149]. Recent studies show that viewing emotionally toned visual stimuli is accompanied by an increase in pupil size, along with changes in other features such as fixation duration and saccades [149,150]. In addition to the pupil’s diameter, other variables such as fixation length, saccades, and EOG signals can also be used to identify emotions [3,139]. Because eye tracking signals and information indicate the user’s behaviors, such systems are widely used in human–computer interaction (HCI) and usability research [3,69,141,142]. Moreover, customized and personalized eye tracking systems increase accuracy and enable more applications in the areas of cognitive science, clinical assessment, and content creation with affective information [70,142,149,151]. The development of eye trackers allows accurate eye tracking data to be integrated into conventional clinical measurement systems for higher brain functions, such as cognition, social behavior, and higher-level decision-making, that are reflected in eye movement [70,152]. Eye movement data have been used by several research groups to distinguish patients with mental disorders such as schizophrenia, or to examine eye movement traits with a genetic component in order to identify the risk of autism before the emergence of verbal-behavioral abnormalities [70,152]. Another study proposed a framework for vehicle control that anticipates a driver’s real-time intention over future maneuvers by analyzing the gaze and fixation patterns of the driver [134]. Image processing and eye landmark estimation are the primary eye tracking technologies used for such control, as shown in Figure 11f. That study proposed future work on designing a customizable intention prediction model for vehicle control using strategy synthesis [134,153]. Recent eye tracking advances will significantly impact next-generation application solutions [69]. We discuss these issues and related work in Section 3.4.

3.2. Eye Gaze and Movement Estimation

Many pupil center identification techniques have been presented in recent years using conventional gaze tracking with optical metrics, image processing, and machine learning-based techniques [69,70]. Conventional methods are typically separated into two categories: optical modeling and characteristic modeling [136]. Optical modeling calculates optical information mathematically and examines the angles of the input vectors and their point of intersection, which is computed as the pupil’s center [3,71,136]. Characteristic modeling estimates the pupil’s center by segmenting the pupil’s edge based on features such as contrast, contour, or color [136,142].

3.2.1. Eye Tracking Techniques and Algorithm

Recent appearance-based algorithms [69,71,142] estimate the pupil center and related features from the eye’s appearance when the subject focuses on a specific point in the scene. Because the method relies on a computational approach, it requires substantial computing resources, including an image dataset, processing power, and prior machine learning training [154].
  • PCCR—Pupil Center-Corneal Reflection and the Bright and Dark Pupil Effects
The PCCR method is one of the eye-gaze-tracking techniques used to measure the direction of the eye’s gaze [155]. As shown in Figure 12a, the vector from the corneal reflection to the pupil center within the camera image can be used to calculate the eye’s orientation angle [72]. The line connecting the center of the camera lens and the center of the corneal sphere is used to measure both the vertical and horizontal components of this angle [72]. In the PCCR method, a single corneal reflection is utilized [71,72]. The corneal surface is approximated as a spherical mirror; thus, the vector from the pupil’s center to the corneal reflection within the camera image is closely related to the direction in which the eye is looking [72,73,135]. If the head is kept stationary while the eye rotates, the glint remains nearly stationary. By comparing the corneal reflection and the pupil center, the eye tracking system can identify the direction of gaze [71]. When a user stares directly at the camera, the corneal reflection is visible close to the center of the pupil image [69]. When the user shifts their attention upward, the pupil center moves upward relative to the corneal reflection; similarly, when attention is shifted downward, the pupil center, and thus the glint–pupil vector, moves downward [69]. Figure 12b shows recent work on pupil center detection with a CNN, plotting the proportion of images whose error is below each percentage of the inter-pupillary distance (IPD) and highlighting possible limits on tracking accuracy [148]. Errors are expressed as a percentage of the IPD measured from the accurate eye pupil landmarks, as shown in Figure 12b. Figure 12c demonstrates how the IR light source illuminates the user’s eye and creates two different pupil images and effects: the bright and the dark pupil [136]. Both effects are used for pupil detection and tracking [135,136]. A brighter pupil image is created when the light source is parallel to the axis of the camera [135]. Because most of the light enters the eye along the optical axis and is reflected back from the retina, the pupil appears brighter; this is called the “bright pupil effect” [72]. If the pupil is illuminated by light sources that are not parallel to the optical axis of the eye, it appears darker than its surroundings [141]. Because multiple corneal reflections and various off-axis light sources produce darker pupil images, this is called the “dark pupil effect” [136]. The relative positions of the illumination source and the camera’s optical axis determine which of the two image types is obtained. When the light source is aligned coaxially with the optical path of the camera, the bright pupil image is created [136]; the eye then acts as a retroreflector as the light reflects off the retina. The pupil appears dark if the light source is located off the camera’s optical axis, because the retro-reflection from the retina is directed away from the camera [135,150]. Pupil contour extraction is a primary step in both feature extraction methods [135]. Due to the low contrast at the boundary between the pupil and the iris, the pupil can be difficult to distinguish [72].
Figure 12d shows researchers’ attempts to track the pupil using grey-level imagery and digital overlay indicators of either dark or bright pupils instead of thresholded difference photos [73,148]. By overlapping the pupil between images, directional movements can be detected accurately [136].
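The PCCR idea can also be reduced to a calibrated regression: the glint-to-pupil-center vector is mapped to screen coordinates using a low-order polynomial fitted during a calibration phase with known targets. The sketch below is a generic second-order mapping written under that assumption; it is not the exact model used by any commercial tracker, and the function and variable names are illustrative.

```python
import numpy as np

def design_matrix(v):
    """Second-order polynomial features of pupil-glint vectors, shape (N, 2)."""
    vx, vy = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])

def fit_pccr_mapping(pupil_glint_vectors, screen_points):
    """Least-squares fit from calibration samples to screen coordinates."""
    A = design_matrix(np.asarray(pupil_glint_vectors, dtype=float))
    B = np.asarray(screen_points, dtype=float)   # shape (N, 2): x, y in pixels
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)
    return coeffs                                # shape (6, 2)

def estimate_gaze(coeffs, pupil_center, glint_center):
    """Predict the on-screen gaze point from one pupil/glint measurement."""
    v = np.asarray([[pupil_center[0] - glint_center[0],
                     pupil_center[1] - glint_center[1]]], dtype=float)
    return design_matrix(v) @ coeffs             # predicted (x, y) on screen

# A calibration routine would collect pupil-glint vectors while the user
# looks at ~9 known on-screen targets, then call fit_pccr_mapping().
```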
  • Time to First Fixation and Object of Interest
The time to first fixation (TTFF) measures the speed at which respondents first fix their attention on an area of interest; it is a simple but essential eye tracking metric [156]. Fixations naturally correspond to the intention and desire to keep one’s attention focused on a point of interest [72]. Fixation stabilizes the retina over a still object of interest, and the gaze remains on a certain region for an extended period of time [135,156]. TTFF measures how long it takes a respondent to fixate on a particular area of interest (AOI) after stimulus onset and can indicate the course of stimulus-driven search. Fixations, the still periods that occur between saccades on static scenes, are the major periods during which visual experience and recognition occur. They are characterized by small, high-frequency drifts and oscillating microsaccades, which prevent the retinal image from fading and the scene from becoming perceptually blank [72]. Because respondents initially tend to focus on the center of an image rather than its edges, a bias toward the center occurs. The size and color of the objects in the AOI affect the measured TTFF; objects with more distinguishing characteristics are frequently fixated faster [72].
Figure 12. Eye gaze and movement estimation. (a) Optical metric for pupil center corneal reflection (PCCR) eye-gaze-tracking technology (Reproduced under terms of the CC-BY license [72]. Copyright 2013, the Authors. Published by ProQuest LLC). (b) Pupil images indicating inter-pupillary distance with two different landmark proportion (Reprinted under terms of the CC-BY license [148]. Copyright 2021, the Authors. Published by MDPI). (c) Dark and bright pupil effect and IR light source correlation with eye (Reproduced under terms of the CC-BY license [145]. Copyright 2022, the Authors. Published by Springer Nature). (d) Conversion of grey imagery and digital overlay indicator for pupil. (e) Multiple object eye movement analysis with time to first fixation (TTFF) (Reprinted under terms of the CC-BY license [157]. Copyright 2022, the Authors. Published by MDPI).
TTFF measures how quickly a target is identified and quantifies attention; the shorter the TTFF, the greater the target’s visual significance [73]. Fixation durations typically range between 200 and 600 milliseconds, and the image formed on the retina changes continuously because of the eyes’ involuntary microsaccades. These small fixational eye movements are essential for refreshing the responses of the retinal sensors [135]. A qualitative evaluation of the eye tracking system used to record eye movements is shown in Figure 12e, which includes fixation time, fixation count, and TTFF for each AOI [143]. As seen in Figure 12e, eye fixations and their durations frequently correspond with the aspects of an image that interest the respondent [143,149]. Therefore, by separating such components, quantitative analysis can rank and order them [143]. For more precise eye movement analysis, researchers have compared different eye tracking metrics using quantitative fixation time and recognition [73]. In addition to the TTFF method, the first fixation duration (FFD), total fixation duration (TFD), and fixation count (FC) methods have been used to analyze eye fixations in detail. The FFD measures how long the first fixation lasts once an object is identified; the shorter the period, the more efficiently the information is transmitted. TFD and total fixation count are the time and count metrics used to represent the participant’s distribution of interest in the target area [73]. The larger the TFD and FC, the longer a participant focuses attention on the target object, and the larger the target’s share of interest across the entire scene [73]. The gaze and fixation points are strongly influenced by our own interests and experiences or by a user’s predetermined task, and visual scenes are perceived differently by different individuals. Early psychological research discovered a correlation between eye movements and visual attention [149,154]. This finding allows researchers to establish a foundation for measuring eye movements by observing the point of gaze, fixations, and saccades [143]. Some studies attempt to present visual information and continuous interpretation whenever the user opens their eyes and moves [72].
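The fixation metrics discussed here (TTFF, FFD, TFD, and FC) reduce to simple bookkeeping over a list of detected fixations and an AOI membership test. The sketch below assumes fixations are already available and that AOIs are axis-aligned rectangles; the data structure and field names are illustrative, not taken from any specific tool.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Fixation:
    start_ms: float      # onset relative to stimulus onset
    duration_ms: float
    x: float             # fixation centroid in screen pixels
    y: float

def in_aoi(f: Fixation, aoi) -> bool:
    """AOI given as a (x_min, y_min, x_max, y_max) rectangle."""
    x_min, y_min, x_max, y_max = aoi
    return x_min <= f.x <= x_max and y_min <= f.y <= y_max

def aoi_metrics(fixations: List[Fixation], aoi) -> dict:
    """Compute TTFF, FFD, TFD, and FC for one AOI."""
    hits = [f for f in fixations if in_aoi(f, aoi)]
    ttff: Optional[float] = hits[0].start_ms if hits else None   # time to first fixation
    ffd = hits[0].duration_ms if hits else 0.0                   # first fixation duration
    tfd = sum(f.duration_ms for f in hits)                       # total fixation duration
    fc = len(hits)                                               # fixation count
    return {"TTFF_ms": ttff, "FFD_ms": ffd, "TFD_ms": tfd, "FC": fc}
```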

3.2.2. Visualization and Analysis of Eye Movements

  • Gaze Mapping and IR Technology
Gaze mapping uses time-series plot maps, which show the sequential, step-by-step course of a user’s visual search strategy [72]. A sequence of uniformly sampled raw gaze points is transformed into a series of saccades and fixations with durations using eye gaze and mapping software [158]. Fixations are detected by examining sequences of gaze point measurements that remain relatively consistent [72]. If a new gaze point lies within a circular region around the running average of an ongoing fixation, the fixation is extended to include that point [72,159]. Gaze plot maps can be generated using commercial systems such as the Tobii (Tobii AB, Stockholm, Sweden) eye trackers [160]. As shown in Figure 13a, horizontal and vertical gaze plot maps were generated by detecting microsaccades. The figure shows the individual ongoing fixations at the gaze points and the saccades between them as traces. The system can represent fixation locations as proportional circles, colored according to time, and the sequence of saccades between fixations as line symbols [158]. A gaze plot map shows the eye movements of a single user for a single image trial, thus providing a graphic overview of that user’s visual search strategy [158]. As shown in Figure 13a, microsaccade movements can be detected from the trace line [135]. A saccade is a fast eye movement that rapidly shifts the gaze between fixations. Saccades most frequently span visual angles of 1 to 40 degrees and last 30 ms to 120 ms, with a typical delay of 100 ms to 200 ms between saccades [148]. The point light sources that illuminate the eye are modeled as radiating omnidirectionally, with IR light-emitting sources as the primary sources. Each source consists of an array that is treated as a single point light source at the array’s center, and the directions and positions of the light sources are defined relative to the global coordinate system because they are modeled as point sources [158]. To define the gaze direction vector in the global coordinate system and to relate it to the objects in the scene, the point of gaze (POG) is computed as the intersection of this vector with the screen. A mathematical model based on refraction and reflection is used to calculate the center of corneal curvature [135,136,158]. Studies show that the POG can be effectively approximated by using statistical averages for all eye characteristics with a single camera and a single light source [131,135,150]. Both spherical and non-spherical cornea models are used to obtain gaze estimates and to personalize eye parameters from the corneal surface model [135,150].
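The fixation-grouping rule described above—extend the current fixation while new gaze samples stay within a small circular region around its running centroid—can be written as a short dispersion-style detector. The radius and minimum-duration defaults below are illustrative values chosen for this sketch, not parameters from the cited systems.

```python
import math

def detect_fixations(samples, radius_px=30.0, min_duration_ms=100.0):
    """Group (t_ms, x, y) gaze samples into fixations with a running-centroid rule."""
    fixations, group = [], []

    def centroid(g):
        return (sum(p[1] for p in g) / len(g), sum(p[2] for p in g) / len(g))

    def flush(g):
        # Keep the group as a fixation only if it lasted long enough.
        if g and (g[-1][0] - g[0][0]) >= min_duration_ms:
            cx, cy = centroid(g)
            fixations.append({"start_ms": g[0][0],
                              "duration_ms": g[-1][0] - g[0][0],
                              "x": cx, "y": cy})

    for t, x, y in samples:
        if group:
            cx, cy = centroid(group)
            if math.hypot(x - cx, y - cy) > radius_px:   # sample left the region: saccade
                flush(group)
                group = []
        group.append((t, x, y))
    flush(group)
    return fixations
```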
  • Heatmaps
A heatmap is a visualization technique that displays the spatial distribution of gaze points. Compared with a fixation map, a heatmap is a simple approach for quickly discovering which regions of an image are most interesting or attractive [161]. A fixation heat map, as seen in Figure 13b, presents an overview of a composite image, including fixation locations and durations [158]. Fixation heat maps, and heat maps in general, are influenced by cartographic traditions such as isoline and surface mapping [136,158]. The acquired gaze fixation data are then viewed and evaluated using gaze plots and heat maps [162]. Most cognitive activity occurs during fixations rather than saccades, although some components of the visual scene are perceptually processed during saccades [158]. Studies that employ eye tracking analyses frequently concentrate on heat map data. Commercial tools such as the Tobii software create fixation heat maps, typically rendering heavily fixated areas of the image in red [158,161] and areas where users fixated only briefly in green [161]. A fixation heat map provides a composite graphic showing the locations and lengths of fixations, with variations in color value indicating the intensity over the time period [158]. Heat maps can also be quantified by their center points for custom applications. To obtain a general picture, the user often estimates the length and width of the entire object before estimating the distance from a spot on the object to that dimension [158,161,162]. This analysis enables visual and statistical approaches to attention mapping and spatiotemporal eye tracking.
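A fixation heat map of the kind shown in Figure 13b can be approximated by splatting a Gaussian at each fixation, weighted by its duration. The sketch below assumes screen-pixel coordinates and an arbitrary kernel width; the color mapping and overlay are left to the plotting library, and all parameter values are placeholders.

```python
import numpy as np

def fixation_heatmap(fixations, width, height, sigma_px=40.0):
    """Accumulate duration-weighted Gaussians into a (height, width) intensity map."""
    yy, xx = np.mgrid[0:height, 0:width]
    heat = np.zeros((height, width), dtype=float)
    for f in fixations:                          # f = {"x": ..., "y": ..., "duration_ms": ...}
        d2 = (xx - f["x"]) ** 2 + (yy - f["y"]) ** 2
        heat += f["duration_ms"] * np.exp(-d2 / (2.0 * sigma_px ** 2))
    if heat.max() > 0:
        heat /= heat.max()                       # normalize to [0, 1] for display
    return heat

# Example (matplotlib, hypothetical stimulus image):
# heat = fixation_heatmap(fixations, width=1920, height=1080)
# plt.imshow(stimulus); plt.imshow(heat, cmap="jet", alpha=0.5)
```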
  • Area of Interest (AOI) and Dwell Time
In many eye tracking studies, researchers aim to classify and analyze how a user looks at a specific part of a stimulus, such as an object in a scene, a particular word in a sentence, or another human being [140,142,161]. To fulfill this goal, researchers define areas of interest corresponding to the regions they want to analyze. The target AOI is used to pick out particular parts of a visual stimulus and to extract metrics for those regions. Figure 13c shows specific AOI spots and summary statistics such as fixations, durations, and revisits. Recent eye tracking metrics attempt to connect and find patterns across multiple AOIs to identify the user’s preferences and rankings within the image [143]. The time spent in an AOI and the TTFF statistics reflect what is more interesting, but they require further analysis because they do not indicate whether the assessment of the AOI is negative or positive [143]. Fixation counts, on the other hand, are positively correlated with interest, implying that people tend to pay more attention to the image’s more appealing aspects [143]. Researchers have proposed multiple eye tracking systems that provide an interactive environment for visual analysis [163,164] or analyze eye-movement protocols and object findings [165,166]. Recent eye tracking software is able to process complex eye movement variables, generate personalized eye interactions with objects, and analyze detailed areas of interest [142]. Figure 13d shows the eye tracking analysis steps from eye movement variables (the user’s gaze plot, velocity plot, and fixation plot) to an AOI model. On the rectangular stimulus grid, the model indicates multiple locations of the stimulus map with colored areas, and the AOI model shows a 2D Gaussian distribution of fixations with shadow mapping. This model map gives meaning to eye movements by projecting object overlays and defining semantic localizations on the viewer’s target areas [165,166]. This real-time tracking system provides data alignment and classification between physiological information and eye tracking information.
Figure 13. Eye tracking algorithms: (a) graphical overview of gaze tracking and mapping, overlapping data on user’s eye view (Reproduced under terms of the CC-BY license [167]. Copyright 2020, the Authors. Published by DOAJ). (b) Heatmap and dwell time analysis (Reprinted under terms of the CC-BY license [157]. Copyright 2022, the Authors. Published by MDPI). (c) Analyzed AOI from gaze and fixation data (Reprinted under terms of the CC-BY license [157]. Copyright 2022, the Authors. Published by MDPI). (d) Area of interest (AOI) model computed from ensembled eye tracking gaze, velocity, and fixation data (Reprinted under terms of the CC-BY license [166]. Copyright 2018, the Authors. Published by MDPI).
The amount of time a user spends viewing an AOI is known as the dwell time. Researchers typically determine the average dwell duration, which indicates how long a user spends, on average, viewing an AOI [73,133]. The length of dwell time depends on the size and informational density of the AOI; the complexity of the scene and the user’s situational awareness also affect it, as do the movement properties of the stimulus [168]. Related studies on eye movements indicate that bottom-up and top-down types of attention are strongly integrated [156,168,169]. If bottom-up and top-down influences on attention were independent of one another, they would be expected to act separately [168]. It would then be reasonable to assume that on some trials the eyes move to location A and on others to location B when top-down attention intends to guide the eyes to location A while bottom-up attention is drawn to location B. However, previous results show that the eyes often move to a position between A and B in such situations [169]. As a result, dwell time is a significant statistic because it can reveal information about a user’s cognitive eye movements and intentions [156].
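Dwell time aggregates all time the gaze spends inside an AOI, including brief re-entries, and is typically computed directly from raw gaze samples. The following sketch credits each inter-sample interval to a rectangular AOI when its starting sample lies inside; the sample format and AOI definition are assumptions made for illustration.

```python
def dwell_time_ms(samples, aoi):
    """Total time the gaze stays inside a rectangular AOI.

    samples: iterable of (t_ms, x, y); aoi: (x_min, y_min, x_max, y_max).
    Each inter-sample interval is credited to the AOI if its starting sample is inside.
    """
    x_min, y_min, x_max, y_max = aoi
    total = 0.0
    prev = None
    for t, x, y in samples:
        if prev is not None:
            pt, px, py = prev
            if x_min <= px <= x_max and y_min <= py <= y_max:
                total += t - pt
        prev = (t, x, y)
    return total
```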

3.3. Eye Tracking Platforms

Studies using eye tracking systems have grown significantly in both quantity and variety over the past decade [170]. Many prior studies have attempted to track users and interpret intentional eye movements [170]. The early phase of eye tracking research relied on prior observations of eye movement, perception, seeing, and looking [74]. New advances in optical, device-based mobile eye-tracking systems enable comprehensive, nonintrusive tracking of human gaze points [73,74]. Recent developments in real-time computing devices have led to the emergence of mobile and stationary eye tracker platforms, and these platforms have changed daily life.

3.3.1. Screen-Based

Most contemporary eye-gaze-tracking devices track eye movement by digitally processing visual information about the eyes [135]. Tracking the POG requires high-resolution eye images, which has driven the emergence of screen-based data acquisition systems [135,168]. In screen-based systems, infrared light is used to illuminate the eye and produce glints for gaze direction estimation [74]. The system also analyzes the viewing distance and the experimental setup [168]. This methodology supports a wide range of evaluation methods, including measuring rotation, translation, the pupil’s shape, the location of the limbus, and corneal reflections from IR sources [135]. After calibration for distance and lighting, the eye tracker data usually include the gaze position and its conversion to screen coordinates. In remote eye tracker platforms that use cameras to identify and track the eyes’ features, the spatial accuracy depends on the user’s range of motion [67]. More recent systems achieve an estimated error of approximately 1 degree or less when computing the optical target [67,135]. The eye tracker first calculates the POG on the projected scene that the user is viewing, using inputs from the scene; a correct interpretation of the POG is the second prerequisite [135]. IR light sources are key functional elements that produce the corneal glints many commercial devices compute and track [154,160,170]. As seen in Figure 14a, a remote eye-gaze-tracking system consists of a CPU for data collection, an image camera, infrared light sources, and a screen for determining where the subject’s eyes are focused, without a wired connection. The field of computer vision has long been active in screen-based, real-time eye recognition and tracking, and the market currently offers a wide variety of gaze tracking hardware and software [171]. Recent research has attempted to use eye tracking systems as PC accessories, for example, to replace mouse controllers; however, such systems have encountered limitations due to the placement of the camera and other devices near the screen [72,169,171].
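To relate the accuracy figure of roughly 1 degree to on-screen error, the angular error can be converted to pixels from the viewing distance and the display’s pixel pitch. The sketch below assumes a flat screen viewed head-on; the numbers in the usage comment are placeholders, not measurements from any cited system.

```python
import math

def angular_error_to_pixels(error_deg, viewing_distance_mm, pixel_pitch_mm):
    """On-screen error (pixels) produced by a given angular gaze error."""
    error_mm = viewing_distance_mm * math.tan(math.radians(error_deg))
    return error_mm / pixel_pitch_mm

# Example: 1 degree at 600 mm on a 24-inch 1920x1080 display (~0.277 mm pixel pitch)
# gives roughly 600 * tan(1 deg) / 0.277 ~ 38 pixels of on-screen error.
```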

3.3.2. Glasses Type

Eye tracking glasses are portable devices that integrate infrared sensors and cameras and are easy to wear [172], as shown in Figure 14b. This platform lets the user move freely and allows natural head movement [160]. The system uses light reflected from the pupil and captures eye images with cameras; extensive image processing is then used to determine the position of the pupil [168] in high-performance eye trackers [74]. Wearable eye trackers can record the user’s view as well as their surroundings and background noise [74]. Compared to screen-based systems, wearable eye trackers have the advantage of recording a person’s gaze movement in a real 3D view [168,172]. With these advantages, large datasets can be collected for new clinical findings and interaction studies [173]. Commercialized wearable glasses such as Tobii and SMI (Imotions Inc., Boston, MA, USA) have made research on wearable eye tracking in everyday human circumstances possible [171,174]. Recent experiments used glasses-type eye trackers, enabling the recording of eye movements in natural settings while humans move freely [160]. This platform makes it possible to conduct numerous studies that are not suited to screen-based eye trackers, such as detecting eye contractions [175], tracking 3D gaze behavior to obtain coordinate information [162], developing battery-free trackers for ubiquitous computing platforms [174], and evaluating human–robot interactions with active behavior [176]. Eye tracking has been used in numerous specialized software programs developed for various fields of study.

3.3.3. Virtual Reality (VR)

The use of VR technology is growing across a range of applications, including immersive training, as well as in fundamental research areas such as cognitive science, visual perception, and psychology [177,178]. Eye tracking in VR supports multiple analyses, including perceptual depth changes, vergence, and inter-pupillary changes [177,179]. In addition, virtual reality offers a pre-calculated experimental setup while allowing the subject to move freely in natural settings [179]. Thus, the VR platform can effectively integrate free eye movements with eye tracking methodology, suggesting a human-centered computing approach. Eye trackers in VR have been used to measure vergence eye movements and to analyze depth [180,181]. Vergence, the simultaneous rotation of the eyes while viewing objects, is necessary for distance estimation because perceived depth arises from monocular and binocular depth cues [179]. This depth perception requires fast and precise eye movements along with saccadic and fixation information [136,177]. Researchers have recently used vergence movements in response to depth changes, combined with an eye tracker platform, for precise clinical gaze direction studies, as demonstrated in Figure 14c [182]. In addition, the optical information in VR is more reliable than in the glasses type because the viewing geometry is controlled. It allows the analysis of individuals’ behaviors with respect to the objects they looked at, as well as the locations they looked at in relation to the behaviors they performed in the virtual environment [183]. Using gaze-detection technology, such a device can measure nine gaze directions in both eyes at the same time on a VR platform [182]. This offers a dichoptic separation structure and allows the eyes to be integrated with specially designed screens, such as virtual reality environments. The IPD, the distance between the centers of the left and right pupils [179], is another crucial aspect of the human binocular visual system that changes with the object’s depth. Because the IPD changes with vergence, contour-based eye tracking data can distinguish the vergence state and the gaze distance to an object. The IPD becomes smaller when the eyes are focused at a close distance and larger when they are focused at long distances [179]. The IPD can track changes in perceptual depth in VR because it is recorded independently as eye tracker camera data [179]. By examining behavior, perception, and interest, an eye tracking integrated VR platform offers a chance to understand the human visual system [183]. One study that used an eye tracker in virtual reality to estimate perceived depth change is shown in Figure 14d. The top convergence case shows how the user’s eyes shift their sight from bottom to top and from top to bottom, demonstrating how effective the platform’s angle computation is for measuring angles in 3D maps. With eye tracking and VR, it is feasible to compute a subject’s gaze in 3D space and to see where they are focusing while they are engaged in an activity. Unlike the real world, where it is difficult to identify regions of interest in 3D space and to reconstruct the points at which those regions were looked at, this is simple to do in VR eye tracking [182]. In the future, virtual and augmented reality glasses may widely use this 3D integration to simulate more realistic information [161].
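The vergence and IPD relationships described above can be summarized with simple trigonometry: for two eyes separated by the IPD and converging symmetrically on a point straight ahead, the vergence angle shrinks as the fixation depth grows. The sketch below is a geometric approximation for illustration only, not the estimation model of any cited headset or study.

```python
import math

def vergence_angle_deg(ipd_mm, depth_mm):
    """Total vergence angle (degrees) for symmetric fixation at a given depth."""
    return math.degrees(2.0 * math.atan((ipd_mm / 2.0) / depth_mm))

def depth_from_vergence_mm(ipd_mm, vergence_deg):
    """Invert the relation to recover fixation depth from a measured vergence angle."""
    return (ipd_mm / 2.0) / math.tan(math.radians(vergence_deg) / 2.0)

# Example: a 63 mm IPD converging on a target 500 mm away
# gives a vergence angle of about 7.2 degrees.
```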
Figure 14. Eye tracking platforms. (a) Glass-type eye tracker (Left: Reprinted with permission [184]. Copyright 2013, Springer-Verlag London, Right: Reprinted under terms of the CC-BY license [3]. Copyright 2020, the authors. Published by MDPI). (b) Screen-based eye tracker (Left: Reprinted with permission [174]. Copyright 2022 ACM, Inc., Right: Reprinted under terms of the CC-BY license [160]. Copyright 2020, the Authors. Published by Springer Nature). (c) VR hardware platform (Left: Reprinted under terms of the CC-BY license [182]. Copyright 2020, the Authors. Published by DOAJ., Upper right: Reproduced under terms of the CC-BY license [3]. Copyright 2020, the Authors. Published by MDPI, Lower right: Reproduced under terms of the CC-BY license [183]. Copyright 2020, the Authors. Published by DOAJ). (d) Perceptual depth change with eye trackers in virtual reality (Reprinted under terms of the CC-BY license [177]. Copyright 2021, the authors. Published by Frontiers).

3.4. Applications

Humans look at objects to receive visual information, which is then used to recognize events and objects and to understand the situation [185]. Eye tracking is employed throughout many fields of psychology, medical examination, and cognitive science to study topics including oculomotor system development, attention, perception, disease diagnosis, diverse usability, and neurological findings [186]. Based on these different characteristics, this section introduces examples of cognition-based medical and educational applications and of creative tasks in which systems read human intentions and assist humans directly. Researchers can determine how effectively a person executes a task by examining attentional eye behaviors while the person communicates and works on the task. Additionally, observing the gaze allows the individual’s cognitive state to be estimated and the user’s status to be understood [185]. These approaches play a significant role in an expanding number of applications [171]. Recent technological advances in precise quantification, large datasets, and automated evaluation make eye tracking applications available for disease diagnosis and assistive roles [70,133,153,154,171,187]. Multi-disciplinary research, including driving applications [134], pattern analysis with machine learning [71], human–computer interaction [67], and learning assistance and evaluation [151,178], provides some examples of these innovations.

3.4.1. Cognitive Behavior and Human Recognition

Cognitive behavior can be revealed in a variety of ways, such as changes in eye movement and action, the inability to recognize people and objects, and even the loss of memory [170,185]. Researchers attempt to identify cognitive strategies (e.g., problem-solving) and recognition skills while humans execute tasks in order to quantify and qualify human recognition [185,188]. Recent studies have presented diverse applications of eye movement analysis. Referring to Table 4, researchers measure gaze movements [182], artifacts [186], and wayfinding [189] to assess the user’s intention through real-time eye movement analysis. Additionally, for precise patient diagnosis [133,153,190,191] and behavioral research [160,192], wearable and screen-based tracking devices guide in-depth investigations of human gaze behavior in real-world scenarios. “Individual differences” have gained significant attention in the fields of target content and psychology in recent years. Researchers have emphasized the study of cognition and recognition, influenced by cognitive psychology, in order to identify personal behavior across diverse media [188]. A recent study analyzed the eye movement data of student participants with different individual cognitive styles while they read and recognized content, as shown in Figure 15a. The study explores the differences in visual attention among individuals with different cognitive behaviors by identifying unique eye movements during interaction and communication [188]. This non-invasive eye tracking platform allows easy coordination of test designs and stimulus presentation, making it possible to collect robust eye information data [171]. While eye movements were previously measured and recorded only in the laboratory, advanced eye tracking platforms now support monitoring of human status and cognition, and precise data acquisition mechanisms such as assessment of disease status and monitoring of disease progression are well established [70,153,154].

3.4.2. Contents

Using eye tracking technology, a computer can precisely track the motion of the user’s gaze on a screen in real time. People with physical limitations can use it as a natural and simple interface between themselves and outside technology, giving them a potential means of communication [193]. Recent attempts have been made to combine an eye tracker with traditional input methods, including a mouse, keyboard, controller, and speech recognition. Figure 15b shows one example in which a robotic arm operates as a standalone hand that can be directed to any point in the workspace simply by moving the user’s eyes [193]. This eye tracker integrated system shows HMI compatibility by creating an intelligent space with the support of AR and VR. A system architecture built around an eye tracking interface can create new artistic media and content with an industrial robot, especially for amputees, people with movement disorders, or people who need an assistive device for creative drawing and painting [193]. Researchers have also presented enhanced imagery techniques for virtual web mapping to reproduce eye movements and human strategies [158]. This opens up the prospect of duplicating a person’s visual search approach through picture and strategy enhancements; individual map identification analysis was made possible by precise eye analyses and qualitative and quantitative analytical methodologies [158].
Figure 15. Examples of eye tracker applications. (a) Applications in cognitive learning and human recognition (Left: reprinted under terms of the CC-BY license [194]. Copyright 2020, the authors. Published by MDPI, Right: reprinted with permission [184]. Copyright 2013, Springer-Verlag London). (b) Robot operation with eye tracking (Reproduced under terms of the CC-BY license [193]. Copyright 2021, the authors. Published by MDPI). (c) Eye tracker application in operative simulation and surgical operation (Left: reprinted with permission [195]. Copyright 2012, Springer-Verlag Berlin Heidelberg, Right: reprinted under terms of the CC-BY license [13]. Copyright 1969, the authors. Published by Elsevier Ltd.). (d) Interface for task-related eye tracker application (Left: reprinted with permission [196]. Copyright 2018, International Society of the Learning Sciences, Inc., Right: reproduced under terms of the CC-BY-NC-ND license [197]. Copyright 2019, the authors. Published by Elsevier Ltd.).

3.4.3. Guided Operation

As a result of significant advancements in eye tracking technology in recent years, eye trackers are increasingly used to assist human operational tasks and supportive devices [198]. Studies using wearable eye tracking technologies, such as trackers attached to light eyeglass frames, have been able to help novice surgeons throughout various laparoscopic procedures, as shown in Figure 15c [198]. Additionally, recent research has shown that analyzing gaze and velocity provides prospective surgical risk estimates and can help assess tasks during simulated or real operative procedures. The eye tracker’s integration into surgery identifies the task’s complexity and measures the cognitive change of the user [195]. Research findings, including the current surgery study, have shown the sensitivity of gaze-based metrics as an assessment tool and demonstrate the possibility of assisting surgical education, surgical robotics, and operation assessment in order to produce a more effective and efficient health care system [195]. As shown in Figure 15d, researchers have also attempted to construct tangible interfaces for task-related eye tracker applications [197,198], demonstrating remote construction tasks with mobile eye trackers in which the user assists through remote gaze input. The interface offers expandability toward collaborative tasks, complex processes, and design interventions for safe and remote operation [195]. Research in driver assistance technology utilizes eye gaze and fixation patterns to anticipate the driver’s future maneuvers [134]. Real-time intention tracking enables smart and collaborative advanced driver assistance systems (ADAS) that can help drivers overcome safety-critical situations [134]. Referring to Table 4, researchers use eye tracking to guide operations with respect to surgical skill and training [13] and to driver guidance during driving operations [158]. Applications for supportive guidance provide educational breakthrough opportunities [151,198] and attention evaluation tools for learning purposes [134].
Table 4. Application examples using eye tracking systems.
| Application | Target User | Platform | Device Info. | Gaze Detection | Processing Method | Refs. |
|---|---|---|---|---|---|---|
| Cognitive Recognition | | | | | | |
| Autism | Infant | Screen-based | ISCAN, Inc. | Dark pupil tracking; corneal reflection | Customized (eye position and fixation data identification with MATLAB) | [153] |
| Impact of slippage | Any mobile user | Eyeglasses | Tobii; SMI; Pupil-labs | PCCR; dark pupil tracking; corneal reflection | Commercial (Tobii Pro: two cameras and six glints per eye; iViewETG: three-marker tracking from SMI); Customized (EyeRecToo: open-source pupil Grip algorithm) | [160] |
| The effects of mobile phone use on gaze behavior in stair climbing | Any mobile user | Eyeglasses | Tobii Glasses 2.0 | PCCR; corneal reflection | Customized (frame-by-frame image classification with MATLAB) | [192] |
| Diagnosis and measurement of strabismus | Children | Screen-based | EyeTracker 4C | PCCR; corneal reflection | Customized (EyeSwift: IR and image processing) | [190] |
| Measurement of nine gaze directions | Patients with strabismus | Screen-based | OMD | Pupil and corneal reflection | Customized (Hess screen test) | [182] |
| ADHD | ADHD patients | Screen-based | EyeLink 1000 | Dark pupil tracking; corneal reflection | Customized (MOXO-dCPT stimuli; AOI and relative gaze analysis) | [133] |
| ADHD | ADHD patients | Screen-based | TX300 | PCCR | Customized (logistic regression classification model for pupil analysis) | [191] |
| Measurement of pupil size artefact (PSA) | Any mobile user | Screen-based | EyeLink 1000 Plus; Tobii Pro Spectrum (Glasses 2) | Dark pupil tracking; corneal reflection | Commercial (Tobii Pro: two cameras and six glints per eye) | [186] |
| A comparison study of EXIT designs in a wayfinding system | Any mobile user | Eyeglasses | Tobii Glasses | PCCR | Customized (custom IR marker; AOI analysis) | [73] |
| Contents Creation | | | | | | |
| Artistic drawing | Graphic users, including people with disabilities | Screen-based | Tobii Eye Tracker 4C | PCCR | Commercial (Tobii Pro: two cameras and six glints per eye) | [193] |
| Enhancement of imagery base maps | Map users | Screen-based | Tobii Pro Spectrum | PCCR | Commercial (Tobii Pro: two cameras and six glints per eye); Customized (AOI statistics and heatmap) | [158] |
| Guided Operation/Supportive Guidance | | | | | | |
| Semi-autonomous vehicles | Driver | Screen-based | faceLAB | Pupil tracking | Customized (Markov model; pattern analysis) | [134] |
| Capturing joint visual attention | Co-located collaborative learning groups | Eyeglasses | SMI ETG | Pupil/CR; dark pupil tracking | Commercial (fiducial tracking engine) | [196] |
| Human–robot interaction for laparoscopic surgery | Surgeon | Screen-based | Tobii 1750 | PCCR | Customized (hidden Markov model) | [13] |
| Surgical skills assessment and training in urology | Surgeon | Eyeglasses + VR | Tobii Glasses 2.0 | PCCR | Commercial (UroMentor simulator) | [198] |
| Architectural education | Ordinary users, students, and lecturers | Eyeglasses; screen-based; VR | Tobii; SMI | PCCR; pupil/CR; dark pupil tracking | Commercial (BeGaze software) | [151] |
| Education | Students | VR | Self-made “VR eye tracker” | Pupil condition recorded via infrared LED light | Customized (analysis of regions of interest) | [188] |

4. Discussion

We have summarized device technologies and HMI applications in eye tracking. Previous EOG systems were bulky and used many wires with Ag/AgCl gel electrodes. These electrodes can record signals with high fidelity, but they have issues including poor breathability, skin irritation, and a loss of performance during long-term use. Moreover, EOG signals have difficulty capturing the detailed modality of the input, so the ability to classify various eye angles and movements is limited. Because of these limited capabilities, the HMIs shown in previous studies apply only to simple motion control with a finite number of directions. Screen-based eye trackers have also been used for HMI, but they required complex eye movements that caused severe eye fatigue. Recent advances in electrophysiological signal monitoring and in the manufacturing of wearable platforms and various types of electrodes have enabled comfortable wearable EOG devices that detect eye movements without skin issues. Many research groups have introduced biocompatible electrodes to solve the skin issues of conventional gel electrodes, including hydrogel, fiber, polymer, and micro-patterned types. These biocompatible electrodes enable long-term EOG measurements and multiple types of wearable platforms, such as glasses, face masks, headbands, and earplugs. Video monitoring systems have also been improved for the eye tracking used in HMI applications.
Based on recent technological advances, HMI applications via EOG show potential for healthcare and for the development of virtual worlds. We highlight two potentially influential future uses. One healthcare application is the diagnosis of blepharospasm, the abnormal contraction of the eyelid muscles. Currently, there is no simple quantitative system for the accurate and objective diagnosis of blepharospasm. To diagnose it, EMG and EOG signals could be measured simultaneously with biocompatible electrodes. However, using biopotentials alone, it is not easy to detect all clinical symptoms, such as the frequency of blinking, the duration of eye closures during spasms, and combinations of blinking and spasms. In this case, an accurate diagnosis of blepharospasm becomes possible by combining biopotentials with a camera-based eye tracker. Camera-based eye trackers can capture tiny eye movements that are difficult to detect via biopotentials, while biopotentials can measure hidden eye movements that the camera-based eye tracker cannot record. With this combined mechanism, it is possible to increase the accuracy of blepharospasm diagnosis while the two modalities compensate for each other’s limitations.
The second is the integration of EOG technology and other biopotentials into virtual worlds such as the metaverse. Current virtual world platforms require complex user input to enjoy applications. For example, to move to a new location, users have to press a control and indicate where they want to go, and for interaction choices, the hands are the main input source, requiring clicking and hand movements. To expand the capability of handling complex user inputs, biopotential signals can provide new command inputs. People with disabilities who cannot move muscles such as their hands or mouth could move freely in the virtual world with only EEG and EOG. Recent EEG technologies have proposed reliable selection mechanisms for long words and sentences, so users could communicate with other users without speaking. In addition, EOG and EEG data in specific frequency bands can track the user’s gaze and provide an additional command input mechanism corresponding to the user’s additional behavior. Simple movements such as up, down, left, and right can be performed quickly by analyzing the EOG signal in the virtual world.
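As a minimal illustration of the kind of EOG-based command mapping mentioned here, the sketch below classifies a pair of horizontal and vertical EOG channel deflections into up/down/left/right commands using fixed amplitude thresholds. Real systems require per-user calibration, drift removal, and blink rejection; the sign convention and threshold value are assumptions for this sketch only.

```python
def classify_eog_direction(h_uV, v_uV, threshold_uV=50.0):
    """Map horizontal/vertical EOG deflections (microvolts) to a coarse gaze command.

    Assumed sign convention: positive h = rightward, positive v = upward.
    Returns one of 'left', 'right', 'up', 'down', or 'center'.
    """
    if abs(h_uV) < threshold_uV and abs(v_uV) < threshold_uV:
        return "center"
    if abs(h_uV) >= abs(v_uV):
        return "right" if h_uV > 0 else "left"
    return "up" if v_uV > 0 else "down"

# classify_eog_direction(80.0, 10.0) -> 'right'
```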
Eye tracking is still in its early stages. Many studies and industries show the potential for ultimate HMI applications and next-generation diagnosis via recognition, sensing, and analysis. Researchers propose scrutinizing human intentions and integrating those intentions to actuate tasks, suggest decision guidelines, and assist during operations. However, eye tracking alone is an insufficient data acquisition system for controlling advanced and complex structures such as exoskeletons. Moreover, the limitations of eye tracking measurements using optical devices prevent them from becoming a primary parameter for clinical-level diagnoses. Nevertheless, recent machine learning and advanced computing technology have shown the possibility of designing personalized profile models. Such advances make the eye tracker suitable for various HMI applications, for new medical guidelines, and for understanding a person’s cognitive state. Still, many aspects of eye tracking must be further developed, in terms of usability and opportunities, to realize its applications in everyday life. Moreover, EOG data and eye tracker gaze data can be integrated with machine learning, or with each other, for scalability and performance. We believe that addressing these challenges will provide broad scalability and further develop eye tracking for practical applications.

5. Conclusions

This paper summarizes various wearable EOG devices and eye tracking systems in terms of material properties, sensing performance, and platform technologies. Specifically, we outline recent developments in biocompatible materials, manufacturing technologies, signal-processing strategies, integrated systems, and applications in detecting eye movements. Advances in wearable technologies and video monitoring systems for electrophysiological signal monitoring have enabled various human–machine interfaces. The unique properties of flexible soft electrodes offer enhanced skin compatibility, long-term stability, and increased skin–electrode contact. Overall, soft material-enabled electronics and camera-based high-resolution systems are promising tools for accurately detecting eye movements and for enabling persistent human–machine interfaces.

Author Contributions

S.B., Y.J.L., K.R.K. and W.-H.Y. conceived and designed the materials in the paper; S.B., Y.J.L., K.R.K. and W.-H.Y. conducted reviews of materials, signal processing technologies, and applications; S.B., Y.J.L., K.R.K., J.-H.K. and W.-H.Y. designed figures and reviewed all sections; all authors wrote the paper together. All authors have read and agreed to the published version of the manuscript.

Funding

We acknowledge the support of the IEN Center Grant from the Georgia Tech Institute for Electronics and Nanotechnology, and the support of the Korea Medical Device Development Fund (Ministry of Science and ICT, Ministry of Trade, Industry and Energy, Ministry of Health and Welfare, Ministry of Food and Drug Safety; Project Number: 1711138229, KMDF_PR_20200901_0124). Electronic devices in this work were fabricated at the Institute for Electronics and Nanotechnology, a member of the National Nanotechnology Coordinated Infrastructure, which is supported by the National Science Foundation (grant ECCS-2025462). J.-H.K. acknowledges the partial support of the National Science Foundation (CBET-1707056).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

ADHD: attention deficit hyperactivity disorder
AJP: aerosol jet printing
AOI: area of interest
BCI: brain–computer interface
CEFs: conductive elastomeric filaments
CNN: convolution neural network
CNTs: carbon nanotubes
CVD: chemical vapor deposition
DI: deionized
DWT: discrete wavelet transform
ECG: electrocardiogram
EEG: electroencephalogram
EMG: electromyography
EOG: electrooculogram
EPEDs: epidermal paper-based electronic devices
FC: fixation count
FFD: first fixation duration
GO: graphene oxide
HCI: human–computer interaction
HMI: human–machine interface
HPC: hydroxypropyl cellulose
IPD: inter-pupillary distance
IR: infrared
LDA: linear discriminant analysis
MR: mixed reality
PCBs: printed circuit boards
PCCR: pupil center-corneal reflection
PDMS: polydimethylsiloxane
PMMA: poly(methyl methacrylate)
POG: point of gaze
PVA: polyvinyl alcohol
rGO: reduced graphene oxide
SNR: signal-to-noise ratio
SVM: support vector machine
TFD: total fixation duration
TTFF: time to first fixation
UV: ultraviolet
VR: virtual reality

References

  1. Mishra, S.; Norton, J.J.S.; Lee, Y.; Lee, D.S.; Agee, N.; Chen, Y.; Chun, Y.; Yeo, W.H. Soft, conformal bioelectronics for a wireless human-wheelchair interface. Biosens. Bioelectron. 2017, 91, 796–803. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Mahmood, M.; Kim, N.; Mahmood, M.; Kim, H.; Kim, H.; Rodeheaver, N.; Sang, M.; Yu, K.J.; Yeo, W.-H. VR-enabled portable brain-computer interfaces via wireless soft bioelectronics. Biosens. Bioelectron. 2022, 210, 114333. [Google Scholar] [CrossRef] [PubMed]
  3. Lim, J.Z.; Mountstephens, J.; Teo, J. Emotion recognition using eye-tracking: Taxonomy, review and current challenges. Sensors 2020, 20, 2384. [Google Scholar] [CrossRef]
  4. Choudhari, A.M.; Porwal, P.; Jonnalagedda, V.; Mériaudeau, F. An electrooculography based human machine interface for wheelchair control. Biocybern. Biomed. Eng. 2019, 39, 673–685. [Google Scholar] [CrossRef]
  5. Lee, J.H.; Kim, H.; Hwang, J.-Y.; Chung, J.; Jang, T.-M.; Seo, D.G.; Gao, Y.; Lee, J.; Park, H.; Lee, S. 3D printed, customizable, and multifunctional smart electronic eyeglasses for wearable healthcare systems and human–machine Interfaces. ACS Appl. Mater. Interfaces 2020, 12, 21424–21432. [Google Scholar] [CrossRef] [PubMed]
  6. Goverdovsky, V.; Von Rosenberg, W.; Nakamura, T.; Looney, D.; Sharp, D.J.; Papavassiliou, C.; Morrell, M.J.; Mandic, D.P. Hearables: Multimodal physiological in-ear sensing. Sci. Rep. 2017, 7, 1–10. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Miettinen, T.; Myllymaa, K.; Hukkanen, T.; Töyräs, J.; Sipilä, K.; Myllymaa, S. A new solution to major limitation of HSAT: Wearable printed sensor for sleep quantification and comorbid detection. Sleep Med. 2019, 64, S270–S271. [Google Scholar] [CrossRef]
  8. Yeo, W.H.; Kim, Y.S.; Lee, J.; Ameen, A.; Shi, L.; Li, M.; Wang, S.; Ma, R.; Jin, S.H.; Kang, Z. Multifunctional epidermal electronics printed directly onto the skin. Adv. Mater. 2013, 25, 2773–2778. [Google Scholar] [CrossRef]
  9. Wang, L.; Zhang, M.; Yang, B.; Ding, X.; Tan, J.; Song, S.; Nie, J. Flexible, robust, and durable aramid fiber/CNT composite paper as a multifunctional sensor for wearable applications. ACS Appl. Mater. Interfaces 2021, 13, 5486–5497. [Google Scholar] [CrossRef]
  10. Kabiri Ameri, S.; Ho, R.; Jang, H.; Tao, L.; Wang, Y.; Wang, L.; Schnyer, D.M.; Akinwande, D.; Lu, N. Graphene electronic tattoo sensors. ACS Nano 2017, 11, 7634–7641. [Google Scholar] [CrossRef]
  11. Ameri, S.K.; Kim, M.; Kuang, I.A.; Perera, W.K.; Alshiekh, M.; Jeong, H.; Topcu, U.; Akinwande, D.; Lu, N. Imperceptible electrooculography graphene sensor system for human–robot interface. npj 2D Mater. Appl. 2018, 2, 1–7. [Google Scholar] [CrossRef] [Green Version]
  12. Chen, C.-H.; Monroy, C.; Houston, D.M.; Yu, C. Using head-mounted eye-trackers to study sensory-motor dynamics of coordinated attention. Prog. Brain Res. 2020, 254, 71–88. [Google Scholar] [PubMed]
  13. Fujii, K.; Gras, G.; Salerno, A.; Yang, G.Z. Gaze gesture based human robot interaction for laparoscopic surgery. Med. Image Anal. 2018, 44, 196–214. [Google Scholar] [CrossRef]
  14. Soler-Dominguez, J.L.; Camba, J.D.; Contero, M.; Alcañiz, M. A proposal for the selection of eye-tracking metrics for the implementation of adaptive gameplay in virtual reality based games. In Proceedings of the International Conference on Virtual, Augmented and Mixed Reality, Vancouver, BC, Canada, 9–14 July 2017; pp. 369–380. [Google Scholar]
  15. Ou, W.-L.; Kuo, T.-L.; Chang, C.-C.; Fan, C.-P. Deep-learning-based pupil center detection and tracking technology for visible-light wearable gaze tracking devices. Appl. Sci. 2021, 11, 851. [Google Scholar] [CrossRef]
  16. Barea, R.; Boquete, L.; Mazo, M.; López, E. System for assisted mobility using eye movements based on electrooculography. IEEE Trans. Neural Syst. Rehabil. Eng. 2002, 10, 209–218. [Google Scholar] [CrossRef] [PubMed]
  17. Barea, R.; Boquete, L.; Mazo, M.; López, E. Wheelchair guidance strategies using EOG. J. Intell. Robot. Syst. 2002, 34, 279–299. [Google Scholar] [CrossRef]
  18. Krishnan, A.; Rozylowicz, K.F.; Weigle, H.; Kelly, S.; Grover, P. Hydrophilic Conductive Sponge Electrodes For EEG Monitoring; Sandia National Lab. (SNL-NM): Albuquerque, NM, USA, 2020. [Google Scholar]
  19. Xu, T.; Li, X.; Liang, Z.; Amar, V.S.; Huang, R.; Shende, R.V.; Fong, H. Carbon nanofibrous sponge made from hydrothermally generated biochar and electrospun polymer nanofibers. Adv. Fiber Mater. 2020, 2, 74–84. [Google Scholar] [CrossRef] [Green Version]
  20. Lin, S.; Liu, J.; Li, W.; Wang, D.; Huang, Y.; Jia, C.; Li, Z.; Murtaza, M.; Wang, H.; Song, J. A flexible, robust, and gel-free electroencephalogram electrode for noninvasive brain-computer interfaces. Nano Lett. 2019, 19, 6853–6861. [Google Scholar] [CrossRef]
  21. Acar, G.; Ozturk, O.; Golparvar, A.J.; Elboshra, T.A.; Böhringer, K.; Yapici, M.K. Wearable and flexible textile electrodes for biopotential signal monitoring: A review. Electronics 2019, 8, 479. [Google Scholar] [CrossRef] [Green Version]
  22. Lee, J.; Pyo, S.; Kwon, D.S.; Jo, E.; Kim, W.; Kim, J. Ultrasensitive strain sensor based on separation of overlapped carbon nanotubes. Small 2019, 15, 1805120. [Google Scholar] [CrossRef]
  23. Wan, X.; Zhang, F.; Liu, Y.; Leng, J. CNT-based electro-responsive shape memory functionalized 3D printed nanocomposites for liquid sensors. Carbon 2019, 155, 77–87. [Google Scholar] [CrossRef]
  24. Sang, Z.; Ke, K.; Manas-Zloczower, I. Design strategy for porous composites aimed at pressure sensor application. Small 2019, 15, 1903487. [Google Scholar] [CrossRef] [PubMed]
  25. Zhang, S.; Sun, K.; Liu, H.; Chen, X.; Zheng, Y.; Shi, X.; Zhang, D.; Mi, L.; Liu, C.; Shen, C. Enhanced piezoresistive performance of conductive WPU/CNT composite foam through incorporating brittle cellulose nanocrystal. Chem. Eng. J. 2020, 387, 124045. [Google Scholar] [CrossRef]
  26. Zhou, C.-G.; Sun, W.-J.; Jia, L.-C.; Xu, L.; Dai, K.; Yan, D.-X.; Li, Z.-M. Highly stretchable and sensitive strain sensor with porous segregated conductive network. ACS Appl. Mater. Interfaces 2019, 11, 37094–37102. [Google Scholar] [CrossRef]
  27. Cai, J.-H.; Li, J.; Chen, X.-D.; Wang, M. Multifunctional polydimethylsiloxane foam with multi-walled carbon nanotube and thermo-expandable microsphere for temperature sensing, microwave shielding and piezoresistive sensor. Chem. Eng. J. 2020, 393, 124805. [Google Scholar] [CrossRef]
  28. Lee, S.M.; Byeon, H.J.; Lee, J.H.; Baek, D.H.; Lee, K.H.; Hong, J.S.; Lee, S.-H. Self-adhesive epidermal carbon nanotube electronics for tether-free long-term continuous recording of biosignals. Sci. Rep. 2014, 4, 1–9. [Google Scholar] [CrossRef] [Green Version]
  29. Cai, Y.; Shen, J.; Ge, G.; Zhang, Y.; Jin, W.; Huang, W.; Shao, J.; Yang, J.; Dong, X. Stretchable Ti3C2Tx MXene/carbon nanotube composite based strain sensor with ultrahigh sensitivity and tunable sensing range. ACS Nano 2018, 12, 56–62. [Google Scholar] [CrossRef] [PubMed]
  30. Mishra, S.; Kim, Y.-S.; Intarasirisawat, J.; Kwon, Y.-T.; Lee, Y.; Mahmood, M.; Lim, H.-R.; Herbert, R.; Yu, K.J.; Ang, C.S. Soft, wireless periocular wearable electronics for real-time detection of eye vergence in a virtual reality toward mobile eye therapies. Sci. Adv. 2020, 6, eaay1729. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  31. Kosmyna, N.; Morris, C.; Sarawgi, U.; Nguyen, T.; Maes, P. AttentivU: A wearable pair of EEG and EOG glasses for real-time physiological processing. In Proceedings of the 2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Chicago, IL, USA, 19–22 May 2019; pp. 1–4. [Google Scholar]
  32. Kosmyna, N.; Morris, C.; Nguyen, T.; Zepf, S.; Hernandez, J.; Maes, P. AttentivU: Designing EEG and EOG compatible glasses for physiological sensing and feedback in the car. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Utrecht, The Netherlands, 21–25 September 2019; pp. 355–368. [Google Scholar]
  33. Kosmyna, N.; Sarawgi, U.; Maes, P. AttentivU: Evaluating the feasibility of biofeedback glasses to monitor and improve attention. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, Singapore, 8–12 October 2018; pp. 999–1005. [Google Scholar]
  34. Hosni, S.M.; Shedeed, H.A.; Mabrouk, M.S.; Tolba, M.F. EEG-EOG based virtual keyboard: Toward hybrid brain computer interface. Neuroinformatics 2019, 17, 323–341. [Google Scholar] [CrossRef]
  35. Vourvopoulos, A.; Niforatos, E.; Giannakos, M. EEGlass: An EEG-eyeware prototype for ubiquitous brain-computer interaction. In Proceedings of the Adjunct proceedings of the 2019 ACM international joint conference on pervasive and ubiquitous computing and proceedings of the 2019 ACM international symposium on wearable computers, London, UK, 9–13 September 2019; pp. 647–652. [Google Scholar]
  36. Diaz-Romero, D.J.; Rincón, A.M.R.; Miguel-Cruz, A.; Yee, N.; Stroulia, E. Recognizing emotional states with wearables while playing a serious game. IEEE Trans. Instrum. Meas. 2021, 70, 1–12. [Google Scholar] [CrossRef]
  37. Pérez-Reynoso, F.D.; Rodríguez-Guerrero, L.; Salgado-Ramírez, J.C.; Ortega-Palacios, R. Human–Machine Interface: Multiclass Classification by Machine Learning on 1D EOG Signals for the Control of an Omnidirectional Robot. Sensors 2021, 21, 5882. [Google Scholar] [CrossRef] [PubMed]
  38. Lin, C.T.; Jiang, W.L.; Chen, S.F.; Huang, K.C.; Liao, L.D. Design of a Wearable Eye-Movement Detection System Based on Electrooculography Signals and Its Experimental Validation. Biosensors 2021, 11, 343. [Google Scholar] [CrossRef] [PubMed]
  39. Díaz, D.; Yee, N.; Daum, C.; Stroulia, E.; Liu, L. Activity classification in independent living environment with JINS MEME Eyewear. In Proceedings of the 2018 IEEE International Conference on Pervasive Computing and Communications (PerCom), Athens, Greece, 19–23 March 2018; pp. 1–9. [Google Scholar]
  40. Liang, S.-F.; Kuo, C.-E.; Lee, Y.-C.; Lin, W.-C.; Liu, Y.-C.; Chen, P.-Y.; Cherng, F.-Y.; Shaw, F.-Z. Development of an EOG-based automatic sleep-monitoring eye mask. IEEE Trans. Instrum. Meas. 2015, 64, 2977–2985. [Google Scholar] [CrossRef]
  41. Shustak, S.; Inzelberg, L.; Steinberg, S.; Rand, D.; Pur, M.D.; Hillel, I.; Katzav, S.; Fahoum, F.; De Vos, M.; Mirelman, A. Home monitoring of sleep with a temporary-tattoo EEG, EOG and EMG electrode array: A feasibility study. J. Neural Eng. 2019, 16, 026024. [Google Scholar] [CrossRef] [PubMed]
  42. Garcia, F.; Junior, J.J.A.M.; Freitas, M.L.B.; Stevan Jr, S.L. Wearable Device for EMG and EOG acquisition. J. Appl. Instrum. Control 2019, 6, 30–35. [Google Scholar] [CrossRef]
  43. Nakamura, T.; Alqurashi, Y.D.; Morrell, M.J.; Mandic, D.P. Automatic detection of drowsiness using in-ear EEG. In Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–6. [Google Scholar]
  44. Nguyen, A.; Alqurashi, R.; Raghebi, Z.; Banaei-Kashani, F.; Halbower, A.C.; Dinh, T.; Vu, T. In-ear biosignal recording system: A wearable for automatic whole-night sleep staging. In Proceedings of the 2016 Workshop on Wearable Systems and Applications, Singapore, 30 June 2016; pp. 19–24. [Google Scholar]
  45. Nguyen, A.; Alqurashi, R.; Raghebi, Z.; Banaei-Kashani, F.; Halbower, A.C.; Vu, T. A lightweight and inexpensive in-ear sensing system for automatic whole-night sleep stage monitoring. In Proceedings of the 14th ACM Conference on Embedded Network Sensor Systems CD-ROM, Stanford, CA, USA, 14–16 November 2016; pp. 230–244. [Google Scholar]
  46. Manabe, H.; Fukumoto, M.; Yagi, T. Conductive rubber electrodes for earphone-based eye gesture input interface. Pers. Ubiquitous Comput. 2015, 19, 143–154. [Google Scholar] [CrossRef] [Green Version]
  47. Wang, K.-J.; Zhang, A.; You, K.; Chen, F.; Liu, Q.; Liu, Y.; Li, Z.; Tung, H.-W.; Mao, Z.-H. Ergonomic and Human-Centered Design of Wearable Gaming Controller Using Eye Movements and Facial Expressions. In Proceedings of the 2018 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW), Taichung, Taiwan, 19–21 May 2018; pp. 1–5. [Google Scholar]
  48. English, E.; Hung, A.; Kesten, E.; Latulipe, D.; Jin, Z. EyePhone: A mobile EOG-based human-computer interface for assistive healthcare. In Proceedings of the 2013 6th International IEEE/EMBS Conference on Neural Engineering (NER), San Diego, CA, USA, 6–8 November 2013; pp. 105–108. [Google Scholar]
  49. Jadhav, N.K.; Momin, B.F. An Approach Towards Brain Controlled System Using EEG Headband and Eye Blink Pattern. In Proceedings of the 2018 3rd International Conference for Convergence in Technology (I2CT), Pune, India, 6–8 April 2018; pp. 1–5. [Google Scholar]
  50. Minati, L.; Yoshimura, N.; Koike, Y. Hybrid control of a vision-guided robot arm by EOG, EMG, EEG biosignals and head movement acquired via a consumer-grade wearable device. IEEE Access 2016, 4, 9528–9541. [Google Scholar] [CrossRef]
  51. Heo, J.; Yoon, H.; Park, K.S. A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces. Sensors 2017, 17, 1485. [Google Scholar] [CrossRef] [Green Version]
  52. Wei, L.; Hu, H.; Yuan, K. Use of forehead bio-signals for controlling an intelligent wheelchair. In Proceedings of the 2008 IEEE International Conference on Robotics and Biomimetics, Bangkok, Thailand, 22–25 February 2009; pp. 108–113. [Google Scholar]
  53. Shyamkumar, P.; Oh, S.; Banerjee, N.; Varadan, V.K. A wearable remote brain machine interface using smartphones and the mobile network. Adv. Sci. Technol. 2013, 85, 11–16. [Google Scholar] [CrossRef]
  54. Roh, T.; Song, K.; Cho, H.; Shin, D.; Yoo, H.-J. A wearable neuro-feedback system with EEG-based mental status monitoring and transcranial electrical stimulation. IEEE Trans. Biomed. Circuits Syst. 2014, 8, 755–764. [Google Scholar] [CrossRef]
  55. Tabal, K.M.; Cruz, J.D. Development of low-cost embedded-based electrooculogram blink pulse classifier for drowsiness detection system. In Proceedings of the 2017 IEEE 13th International Colloquium on Signal Processing & its Applications (CSPA), Penang, Malaysia, 10–12 March 2017; pp. 29–34. [Google Scholar]
  56. Ramasamy, M.; Oh, S.; Harbaugh, R.; Varadan, V. Real Time Monitoring of Driver Drowsiness and Alertness by Textile Based Nanosensors and Wireless Communication Platform. 2014. Available online: https://efermat.github.io/articles/Varadan-ART-2014-Vol1-Jan_Feb-004/ (accessed on 19 October 2022).
  57. Golparvar, A.J.; Yapici, M.K. Graphene smart textile-based wearable eye movement sensor for electro-ocular control and interaction with objects. J. Electrochem. Soc. 2019, 166, B3184. [Google Scholar] [CrossRef]
  58. Arnin, J.; Anopas, D.; Horapong, M.; Triponyuwasi, P.; Yamsa-ard, T.; Iampetch, S.; Wongsawat, Y. Wireless-based portable EEG-EOG monitoring for real time drowsiness detection. In Proceedings of the 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 4977–4980. [Google Scholar]
  59. Chen, C.; Zhou, P.; Belkacem, A.N.; Lu, L.; Xu, R.; Wang, X.; Tan, W.; Qiao, Z.; Li, P.; Gao, Q. Quadcopter robot control based on hybrid brain–computer interface system. Sens. Mater. 2020, 32, 991–1004. [Google Scholar] [CrossRef] [Green Version]
  60. López, A.; Fernández, M.; Rodríguez, H.; Ferrero, F.; Postolache, O. Development of an EOG-based system to control a serious game. Measurement 2018, 127, 481–488. [Google Scholar] [CrossRef]
  61. Wan, S.; Wu, N.; Ye, Y.; Li, S.; Huang, H.; Chen, L.; Bi, H.; Sun, L. Highly Stretchable Starch Hydrogel Wearable Patch for Electrooculographic Signal Detection and Human–Machine Interaction. Small Struct. 2021, 2, 2100105. [Google Scholar] [CrossRef]
  62. O’Bard, B.; Larson, A.; Herrera, J.; Nega, D.; George, K. Electrooculography based iOS controller for individuals with quadriplegia or neurodegenerative disease. In Proceedings of the 2017 IEEE International Conference on Healthcare Informatics (ICHI), Park City, UT, USA, 23–26 August 2017; pp. 101–106. [Google Scholar]
  63. Jiao, Y.; Deng, Y.; Luo, Y.; Lu, B.-L. Driver sleepiness detection from EEG and EOG signals using GAN and LSTM networks. Neurocomputing 2020, 408, 100–111. [Google Scholar] [CrossRef]
  64. Sho’ouri, N. EOG biofeedback protocol based on selecting distinctive features to treat or reduce ADHD symptoms. Biomed. Signal Process. Control 2022, 71, 102748. [Google Scholar] [CrossRef]
  65. Latifoğlu, F.; Esas, M.Y.; Demirci, E. Diagnosis of attention-deficit hyperactivity disorder using EOG signals: A new approach. Biomed. Eng. Biomed. Tech. 2020, 65, 149–164. [Google Scholar] [CrossRef]
  66. Ayoubipour, S.; Hekmati, H.; Sho’ouri, N. Analysis of EOG signals related to ADHD and healthy children using wavelet transform. In Proceedings of the 2020 27th National and 5th International Iranian Conference on Biomedical Engineering (ICBME), Tehran, Iran, 26–27 November 2020; pp. 294–297. [Google Scholar]
  67. Singh, H.; Singh, J. Human eye tracking and related issues: A review. Int. J. Sci. Res. Publ. 2012, 2, 1–9. [Google Scholar]
  68. Cognolato, M.; Atzori, M.; Müller, H. Head-mounted eye gaze tracking devices: An overview of modern devices and recent advances. J. Rehabil. Assist. Technol. Eng. 2018, 5, 2055668318773991. [Google Scholar] [CrossRef]
  69. Krafka, K.; Khosla, A.; Kellnhofer, P.; Kannan, H.; Bhandarkar, S.; Matusik, W.; Torralba, A. Eye tracking for everyone. In Proceedings of the IEEE conference on computer vision and pattern recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 2176–2184. [Google Scholar]
  70. Fabio, R.A.; Giannatiempo, S.; Semino, M.; Caprì, T. Longitudinal cognitive rehabilitation applied with eye-tracker for patients with Rett Syndrome. Res. Dev. Disabil. 2021, 111, 103891. [Google Scholar] [CrossRef]
  71. Hansen, D.W.; Ji, Q. In the eye of the beholder: A survey of models for eyes and gaze. IEEE Trans. Pattern Anal. Mach. Intell. 2009, 32, 478–500. [Google Scholar] [CrossRef]
  72. Oyekoya, O. Eye Tracking: A Perceptual Interface for Content Based Image Retrieval; University of London, University College London (United Kingdom): London, UK, 2007. [Google Scholar]
  73. Zhang, Y.; Zheng, X.; Hong, W.; Mou, X. A comparison study of stationary and mobile eye tracking on EXITs design in a wayfinding system. In Proceedings of the 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA), Hong Kong, 16–19 December 2015; pp. 649–653. [Google Scholar]
  74. Płużyczka, M. The first hundred years: A history of eye tracking as a research method. Appl. Linguist. Pap. 2018, 4, 101–116. [Google Scholar] [CrossRef]
  75. Stuart, S.; Hickey, A.; Galna, B.; Lord, S.; Rochester, L.; Godfrey, A. iTrack: Instrumented mobile electrooculography (EOG) eye-tracking in older adults and Parkinson’s disease. Physiol. Meas. 2016, 38, N16. [Google Scholar] [CrossRef] [PubMed]
  76. Boukadoum, A.; Ktonas, P. EOG-Based Recording and Automated Detection of Sleep Rapid Eye Movements: A Critical Review, and Some Recommendations. Psychophysiology 1986, 23, 598–611. [Google Scholar] [CrossRef] [PubMed]
  77. Lam, R.W.; Beattie, C.W.; Buchanan, A.; Remick, R.A.; Zis, A.P. Low electrooculographic ratios in patients with seasonal affective disorder. Am. J. Psychiatry 1991, 148, 1526–1529. [Google Scholar]
  78. Bour, L.; Ongerboer de Visser, B.; Aramideh, M.; Speelman, J. Origin of eye and eyelid movements during blinking. Mov. Disord. 2002, 17, S30–S32. [Google Scholar] [CrossRef]
  79. Yamagishi, K.; Hori, J.; Miyakawa, M. Development of EOG-based communication system controlled by eight-directional eye movements. In Proceedings of the 2006 International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY, USA, 30 August–3 September 2006; pp. 2574–2577. [Google Scholar]
  80. Magosso, E.; Ursino, M.; Zaniboni, A.; Provini, F.; Montagna, P. Visual and computer-based detection of slow eye movements in overnight and 24-h EOG recordings. Clin. Neurophysiol. 2007, 118, 1122–1133. [Google Scholar] [CrossRef]
  81. Yazicioglu, R.F.; Torfs, T.; Merken, P.; Penders, J.; Leonov, V.; Puers, R.; Gyselinckx, B.; Van Hoof, C. Ultra-low-power biopotential interfaces and their applications in wearable and implantable systems. Microelectron. J. 2009, 40, 1313–1321. [Google Scholar] [CrossRef]
  82. Bulling, A.; Ward, J.A.; Gellersen, H.; Tröster, G. Eye movement analysis for activity recognition using electrooculography. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 33, 741–753. [Google Scholar] [CrossRef]
  83. Cruz, A.; Garcia, D.; Pires, G.; Nunes, U. Facial Expression Recognition based on EOG toward Emotion Detection for Human-Robot Interaction. In Proceedings of the Biosignals, Lisbon, Portugal, 12–15 January 2015; pp. 31–37. [Google Scholar]
  84. Lin, C.-T.; Liao, L.-D.; Liu, Y.-H.; Wang, I.-J.; Lin, B.-S.; Chang, J.-Y. Novel dry polymer foam electrodes for long-term EEG measurement. IEEE Trans. Biomed. Eng. 2010, 58, 1200–1207. [Google Scholar]
  85. Searle, A.; Kirkup, L. A direct comparison of wet, dry and insulating bioelectric recording electrodes. Physiol. Meas. 2000, 21, 271. [Google Scholar] [CrossRef]
  86. Meziane, N.; Webster, J.; Attari, M.; Nimunkar, A. Dry electrodes for electrocardiography. Physiol. Meas. 2013, 34, R47. [Google Scholar] [CrossRef] [PubMed]
  87. Marmor, M.F.; Brigell, M.G.; McCulloch, D.L.; Westall, C.A.; Bach, M. ISCEV standard for clinical electro-oculography (2010 update). Doc. Ophthalmol. 2011, 122, 1–7. [Google Scholar] [CrossRef] [PubMed]
  88. Tobjörk, D.; Österbacka, R. Paper electronics. Adv. Mater. 2011, 23, 1935–1961. [Google Scholar] [CrossRef] [PubMed]
  89. Vehkaoja, A.T.; Verho, J.A.; Puurtinen, M.M.; Nojd, N.M.; Lekkala, J.O.; Hyttinen, J.A. Wireless head cap for EOG and facial EMG measurements. In Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, Shanghai, China, 1–4 September 2005; pp. 5865–5868. [Google Scholar]
  90. Eskandarian, L.; Toossi, A.; Nassif, F.; Golmohammadi Rostami, S.; Ni, S.; Mahnam, A.; Alizadeh Meghrazi, M.; Takarada, W.; Kikutani, T.; Naguib, H.E. 3D-Knit Dry Electrodes using Conductive Elastomeric Fibers for Long-Term Continuous Electrophysiological Monitoring. Adv. Mater. Technol. 2022, 7, 2101572. [Google Scholar] [CrossRef]
  91. Calvert, P. Inkjet printing for materials and devices. Chem. Mater. 2001, 13, 3299–3305. [Google Scholar] [CrossRef]
  92. Bollström, R.; Määttänen, A.; Tobjörk, D.; Ihalainen, P.; Kaihovirta, N.; Österbacka, R.; Peltonen, J.; Toivakka, M. A multilayer coated fiber-based substrate suitable for printed functionality. Org. Electron. 2009, 10, 1020–1023. [Google Scholar] [CrossRef]
  93. Kim, D.H.; Kim, Y.S.; Wu, J.; Liu, Z.; Song, J.; Kim, H.S.; Huang, Y.Y.; Hwang, K.C.; Rogers, J.A. Ultrathin silicon circuits with strain-isolation layers and mesh layouts for high-performance electronics on fabric, vinyl, leather, and paper. Adv. Mater. 2009, 21, 3703–3707. [Google Scholar] [CrossRef]
  94. Hyun, W.J.; Secor, E.B.; Hersam, M.C.; Frisbie, C.D.; Francis, L.F. High-resolution patterning of graphene by screen printing with a silicon stencil for highly flexible printed electronics. Adv. Mater. 2015, 27, 109–115. [Google Scholar] [CrossRef]
  95. Golparvar, A.; Ozturk, O.; Yapici, M.K. Gel-Free Wearable Electroencephalography (EEG) with Soft Graphene Textiles. In Proceedings of the 2021 IEEE Sensors, Online, 31 October–4 November 2021; pp. 1–4. [Google Scholar]
  96. Golparvar, A.J.; Yapici, M.K. Graphene-coated wearable textiles for EOG-based human-computer interaction. In Proceedings of the 2018 IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Las Vegas, NV, USA, 4–7 March 2018; pp. 189–192. [Google Scholar]
  97. Wang, X.; Xiao, Y.; Deng, F.; Chen, Y.; Zhang, H. Eye-Movement-Controlled Wheelchair Based on Flexible Hydrogel Biosensor and WT-SVM. Biosensors 2021, 11, 198. [Google Scholar] [CrossRef]
  98. Wang, Z.; Chen, L.; Chen, Y.; Liu, P.; Duan, H.; Cheng, P. 3D printed ultrastretchable, hyper-antifreezing conductive hydrogel for sensitive motion and electrophysiological signal monitoring. Research 2020, 2020, 1–11. [Google Scholar] [CrossRef] [PubMed]
  99. Sadri, B.; Goswami, D.; Martinez, R.V. Rapid fabrication of epidermal paper-based electronic devices using razor printing. Micromachines 2018, 9, 420. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  100. Peng, H.-L.; Liu, J.-Q.; Dong, Y.-Z.; Yang, B.; Chen, X.; Yang, C.-S. Parylene-based flexible dry electrode for bioptential recording. Sens. Actuators B Chem. 2016, 231, 1–11. [Google Scholar] [CrossRef]
  101. Huang, J.; Zhu, H.; Chen, Y.; Preston, C.; Rohrbach, K.; Cumings, J.; Hu, L. Highly transparent and flexible nanopaper transistors. ACS Nano 2013, 7, 2106–2113. [Google Scholar] [CrossRef] [PubMed]
  102. Blumenthal, T.; Fratello, V.; Nino, G.; Ritala, K. Aerosol Jet® Printing Onto 3D and Flexible Substrates. Quest Integr. Inc. 2017. Available online: http://www.qi2.com/wp-content/uploads/2016/12/TP-460-Aerosol-Jet-Printing-onto-3D-and-Flexible-Substrates.pdf (accessed on 19 October 2022).
  103. Saengchairat, N.; Tran, T.; Chua, C.-K. A review: Additive manufacturing for active electronic components. Virtual Phys. Prototyp. 2017, 12, 31–46. [Google Scholar] [CrossRef]
  104. Beach, C.; Karim, N.; Casson, A.J. A Graphene-Based Sleep Mask for Comfortable Wearable Eye Tracking. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 6693–6696. [Google Scholar]
  105. Paul, G.M.; Cao, F.; Torah, R.; Yang, K.; Beeby, S.; Tudor, J. A smart textile based facial EMG and EOG computer interface. IEEE Sens. J. 2013, 14, 393–400. [Google Scholar] [CrossRef]
  106. Paul, G.; Torah, R.; Beeby, S.; Tudor, J. The development of screen printed conductive networks on textiles for biopotential monitoring applications. Sens. Actuators A: Phys. 2014, 206, 35–41. [Google Scholar] [CrossRef]
  107. Peng, H.-L.; Sun, Y.-l.; Bi, C.; Li, Q.-F. Development of a flexible dry electrode based MXene with low contact impedance for biopotential recording. Measurement 2022, 190, 110782. [Google Scholar] [CrossRef]
  108. Cheng, X.; Bao, C.; Dong, W. Soft dry electroophthalmogram electrodes for human machine interaction. Biomed. Microdevices 2019, 21, 1–11. [Google Scholar] [CrossRef]
  109. Tian, L.; Zimmerman, B.; Akhtar, A.; Yu, K.J.; Moore, M.; Wu, J.; Larsen, R.J.; Lee, J.W.; Li, J.; Liu, Y. Large-area MRI-compatible epidermal electronic interfaces for prosthetic control and cognitive monitoring. Nat. Biomed. Eng. 2019, 3, 194–205. [Google Scholar] [CrossRef]
  110. Park, J.; Choi, S.; Janardhan, A.H.; Lee, S.-Y.; Raut, S.; Soares, J.; Shin, K.; Yang, S.; Lee, C.; Kang, K.-W. Electromechanical cardioplasty using a wrapped elasto-conductive epicardial mesh. Sci. Transl. Med. 2016, 8, 344ra86. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  111. Norton, J.J.; Lee, D.S.; Lee, J.W.; Lee, W.; Kwon, O.; Won, P.; Jung, S.-Y.; Cheng, H.; Jeong, J.-W.; Akce, A. Soft, curved electrode systems capable of integration on the auricle as a persistent brain–computer interface. Proc. Natl. Acad. Sci. USA 2015, 112, 3920–3925. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  112. Bergmann, J.; McGregor, A. Body-worn sensor design: What do patients and clinicians want? Ann. Biomed. Eng. 2011, 39, 2299–2312. [Google Scholar] [CrossRef] [PubMed]
  113. Preece, S.J.; Goulermas, J.Y.; Kenney, L.P.; Howard, D.; Meijer, K.; Crompton, R. Activity identification using body-mounted sensors—A review of classification techniques. Physiol. Meas. 2009, 30, R1. [Google Scholar] [CrossRef] [PubMed]
  114. Kanoh, S.; Ichi-nohe, S.; Shioya, S.; Inoue, K.; Kawashima, R. Development of an eyewear to measure eye and body movements. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 2267–2270. [Google Scholar]
  115. Desai, M.; Pratt, L.A.; Lentzner, H.R.; Robinson, K.N. Trends in vision and hearing among older Americans. Aging Trends 2001, 1–8. [Google Scholar] [CrossRef] [Green Version]
  116. Bulling, A.; Roggen, D.; Tröster, G. Wearable EOG goggles: Eye-based interaction in everyday environments. In CHI’09 Extended Abstracts on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2009; pp. 3259–3264. [Google Scholar]
  117. Kosmyna, N. AttentivU: A Wearable Pair of EEG and EOG Glasses for Real-Time Physiological Processing (Conference Presentation). In Proceedings of the Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR), San Francisco, CA, USA, 2 February 2020; p. 113101P. [Google Scholar]
  118. Bulling, A.; Roggen, D.; Tröster, G. Wearable EOG goggles: Seamless sensing and context-awareness in everyday environments. J. Ambient Intell. Smart Environ. 2009, 1, 157–171. [Google Scholar] [CrossRef] [Green Version]
  119. Dhuliawala, M.; Lee, J.; Shimizu, J.; Bulling, A.; Kunze, K.; Starner, T.; Woo, W. Smooth eye movement interaction using EOG glasses. In Proceedings of the 18th ACM International Conference on Multimodal Interaction, Tokyo, Japan, 12–16 November 2016; pp. 307–311. [Google Scholar]
  120. Inzelberg, L.; Pur, M.D.; Schlisske, S.; Rödlmeier, T.; Granoviter, O.; Rand, D.; Steinberg, S.; Hernandez-Sosa, G.; Hanein, Y. Printed facial skin electrodes as sensors of emotional affect. Flex. Print. Electron. 2018, 3, 045001. [Google Scholar] [CrossRef]
  121. Miettinen, T.; Myllymaa, K.; Hukkanen, T.; Töyräs, J.; Sipilä, K.; Myllymaa, S. Home polysomnography reveals a first-night effect in patients with low sleep bruxism activity. J. Clin. Sleep Med. 2018, 14, 1377–1386. [Google Scholar] [CrossRef]
  122. Simar, C.; Petieau, M.; Cebolla, A.; Leroy, A.; Bontempi, G.; Cheron, G. EEG-based brain-computer interface for alpha speed control of a small robot using the MUSE headband. In Proceedings of the 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19–24 July 2020; pp. 1–4. [Google Scholar] [CrossRef]
  123. Balconi, M.; Fronda, G.; Venturella, I.; Crivelli, D. Conscious, pre-conscious and unconscious mechanisms in emotional behaviour. Some applications to the mindfulness approach with wearable devices. Appl. Sci. 2017, 7, 1280. [Google Scholar] [CrossRef] [Green Version]
  124. Asif, A.; Majid, M.; Anwar, S.M. Human stress classification using EEG signals in response to music tracks. Comput. Biol. Med. 2019, 107, 182–196. [Google Scholar] [CrossRef]
  125. Merino, M.; Rivera, O.; Gómez, I.; Molina, A.; Dorronzoro, E. A method of EOG signal processing to detect the direction of eye movements. In Proceedings of the 2010 First International Conference on Sensor Device Technologies and Applications, Washington, DC, USA, 18–25 July 2010; pp. 100–105. [Google Scholar]
  126. Wang, Y.; Lv, Z.; Zheng, Y. Automatic emotion perception using eye movement information for E-healthcare systems. Sensors 2018, 18, 2826. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  127. Soundariya, R.; Renuga, R. Eye movement based emotion recognition using electrooculography. In Proceedings of the 2017 Innovations in Power and Advanced Computing Technologies (i-PACT), Vellore, India, 21–22 April 2017; pp. 1–5. [Google Scholar]
  128. Kose, M.R.; Ahirwal, M.K.; Kumar, A. A new approach for emotions recognition through EOG and EMG signals. Signal Image Video Process. 2021, 15, 1863–1871. [Google Scholar] [CrossRef]
  129. Sánchez-Ferrer, M.L.; Grima-Murcia, M.D.; Sánchez-Ferrer, F.; Hernández-Peñalver, A.I.; Fernández-Jover, E.; Del Campo, F.S. Use of eye tracking as an innovative instructional method in surgical human anatomy. J. Surg. Educ. 2017, 74, 668–673. [Google Scholar] [CrossRef]
  130. Vansteenkiste, P.; Cardon, G.; Philippaerts, R.; Lenoir, M. Measuring dwell time percentage from head-mounted eye-tracking data–comparison of a frame-by-frame and a fixation-by-fixation analysis. Ergonomics 2015, 58, 712–721. [Google Scholar] [CrossRef] [PubMed]
  131. Mohanto, B.; Islam, A.T.; Gobbetti, E.; Staadt, O. An integrative view of foveated rendering. Comput. Graph. 2022, 102, 474–501. [Google Scholar] [CrossRef]
  132. Holmqvist, K.; Örbom, S.L.; Zemblys, R. Small head movements increase and colour noise in data from five video-based P–CR eye trackers. Behav. Res. Methods 2022, 54, 845–863. [Google Scholar] [CrossRef] [PubMed]
  133. Lev, A.; Braw, Y.; Elbaum, T.; Wagner, M.; Rassovsky, Y. Eye tracking during a continuous performance test: Utility for assessing ADHD patients. J. Atten. Disord. 2022, 26, 245–255. [Google Scholar] [CrossRef] [PubMed]
  134. Wu, M.; Louw, T.; Lahijanian, M.; Ruan, W.; Huang, X.; Merat, N.; Kwiatkowska, M. Gaze-based intention anticipation over driving manoeuvres in semi-autonomous vehicles. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 4–8 November 2019; pp. 6210–6216. [Google Scholar]
  135. Taba, I.B. Improving Eye-Gaze Tracking Accuracy through Personalized Calibration of a User’s Aspherical Corneal Model; University of British Columbia: Vancouver, BC, Canada, 2012. [Google Scholar]
  136. Vázquez Romaguera, T.; Vázquez Romaguera, L.; Castro Piñol, D.; Vázquez Seisdedos, C.R. Pupil Center Detection Approaches: A Comparative Analysis. Comput. Y Sist. 2021, 25, 67–81. [Google Scholar] [CrossRef]
  137. Schwiegerling, J.T. Eye axes and their relevance to alignment of corneal refractive procedures. J. Refract. Surg. 2013, 29, 515–516. [Google Scholar] [CrossRef] [Green Version]
  138. Duchowski, A.T. Eye Tracking Methodology: Theory and Practice; Springer: Berlin/Heidelberg, Germany, 2017. [Google Scholar]
  139. Shehu, I.S.; Wang, Y.; Athuman, A.M.; Fu, X. Remote Eye Gaze Tracking Research: A Comparative Evaluation on Past and Recent Progress. Electronics 2021, 10, 3165. [Google Scholar] [CrossRef]
  140. Mantiuk, R. Gaze-dependent tone mapping for HDR video. In High Dynamic Range Video; Elsevier: Amsterdam, The Netherlands, 2017; pp. 189–199. [Google Scholar]
  141. Schall, A.; Bergstrom, J.R. Introduction to eye tracking. In Eye Tracking in User Experience Design; Elsevier: Amsterdam, The Netherlands, 2014; pp. 3–26. [Google Scholar]
  142. Carter, B.T.; Luke, S.G. Best practices in eye tracking research. Int. J. Psychophysiol. 2020, 155, 49–62. [Google Scholar] [CrossRef] [PubMed]
  143. Noland, R.B.; Weiner, M.D.; Gao, D.; Cook, M.P.; Nelessen, A. Eye-tracking technology, visual preference surveys, and urban design: Preliminary evidence of an effective methodology. J. Urban. Int. Res. Placemaking Urban Sustain. 2017, 10, 98–110. [Google Scholar] [CrossRef]
  144. Wang, M.; Wang, T.; Luo, Y.; He, K.; Pan, L.; Li, Z.; Cui, Z.; Liu, Z.; Tu, J.; Chen, X. Fusing stretchable sensing technology with machine learning for human–machine interfaces. Adv. Funct. Mater. 2021, 31, 2008807. [Google Scholar] [CrossRef]
  145. Donuk, K.; Ari, A.; Hanbay, D. A CNN based real-time eye tracker for web mining applications. Multimed. Tools Appl. 2022, 81, 39103–39120. [Google Scholar] [CrossRef]
  146. Hosp, B.; Eivazi, S.; Maurer, M.; Fuhl, W.; Geisler, D.; Kasneci, E. Remoteeye: An open-source high-speed remote eye tracker. Behav. Res. Methods 2020, 52, 1387–1401. [Google Scholar] [CrossRef]
  147. Petersch, B.; Dierkes, K. Gaze-angle dependency of pupil-size measurements in head-mounted eye tracking. Behav. Res. Methods 2022, 54, 763–779. [Google Scholar] [CrossRef]
  148. Larumbe-Bergera, A.; Garde, G.; Porta, S.; Cabeza, R.; Villanueva, A. Accurate pupil center detection in off-the-shelf eye tracking systems using convolutional neural networks. Sensors 2021, 21, 6847. [Google Scholar] [CrossRef]
  149. Hess, E.H.; Polt, J.M. Pupil size as related to interest value of visual stimuli. Science 1960, 132, 349–350. [Google Scholar] [CrossRef]
  150. Punde, P.A.; Jadhav, M.E.; Manza, R.R. A study of eye tracking technology and its applications. In Proceedings of the 2017 1st International Conference on Intelligent Systems and Information Management (ICISIM), Aurangabad, India, 5–6 October 2017; pp. 86–90. [Google Scholar]
  151. Rusnak, M.A.; Rabiega, M. The Potential of Using an Eye Tracker in Architectural Education: Three Perspectives for Ordinary Users, Students and Lecturers. Buildings 2021, 11, 245. [Google Scholar] [CrossRef]
  152. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders, 5th ed.; American Psychiatric Association: Arlington, VA, USA, 2013. [Google Scholar]
  153. Constantino, J.N.; Kennon-McGill, S.; Weichselbaum, C.; Marrus, N.; Haider, A.; Glowinski, A.L.; Gillespie, S.; Klaiman, C.; Klin, A.; Jones, W. Infant viewing of social scenes is under genetic control and is atypical in autism. Nature 2017, 547, 340–344. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  154. Khan, M.Q.; Lee, S. Gaze and eye tracking: Techniques and applications in ADAS. Sensors 2019, 19, 5540. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  155. Cazzato, D.; Leo, M.; Distante, C. An investigation on the feasibility of uncalibrated and unconstrained gaze tracking for human assistive applications by using head pose estimation. Sensors 2014, 14, 8363–8379. [Google Scholar] [CrossRef] [PubMed]
  156. Hessels, R.S.; Niehorster, D.C.; Nyström, M.; Andersson, R.; Hooge, I.T. Is the eye-movement field confused about fixations and saccades? A survey among 124 researchers. R. Soc. Open Sci. 2018, 5, 180502. [Google Scholar] [CrossRef] [Green Version]
  157. González-Mena, G.; Del-Valle-Soto, C.; Corona, V.; Rodríguez, J. Neuromarketing in the Digital Age: The Direct Relation between Facial Expressions and Website Design. Appl. Sci. 2022, 12, 8186. [Google Scholar] [CrossRef]
  158. Dong, W.; Liao, H.; Roth, R.E.; Wang, S. Eye tracking to explore the potential of enhanced imagery basemaps in web mapping. Cartogr. J. 2014, 51, 313–329. [Google Scholar] [CrossRef]
  159. Voßkühler, A.; Nordmeier, V.; Kuchinke, L.; Jacobs, A.M. OGAMA (Open Gaze and Mouse Analyzer): Open-source software designed to analyze eye and mouse movements in slideshow study designs. Behav. Res. Methods 2008, 40, 1150–1162. [Google Scholar] [CrossRef] [Green Version]
  160. Niehorster, D.C.; Santini, T.; Hessels, R.S.; Hooge, I.T.; Kasneci, E.; Nyström, M. The impact of slippage on the data quality of head-worn eye trackers. Behav. Res. Methods 2020, 52, 1140–1160. [Google Scholar] [CrossRef] [Green Version]
  161. Hu, N. Depth Estimation Inside 3D Maps Based on Eye-Tracker. 2020. Available online: https://mediatum.ub.tum.de/doc/1615800/1615800.pdf (accessed on 19 October 2022).
  162. Takahashi, R.; Suzuki, H.; Chew, J.Y.; Ohtake, Y.; Nagai, Y.; Ohtomi, K. A system for three-dimensional gaze fixation analysis using eye tracking glasses. J. Comput. Des. Eng. 2018, 5, 449–457. [Google Scholar] [CrossRef]
  163. Špakov, O.; Miniotas, D. Visualization of eye gaze data using heat maps. Elektron. Ir Elektrotechnika 2007, 74, 55–58. [Google Scholar]
  164. Maurus, M.; Hammer, J.H.; Beyerer, J. Realistic heatmap visualization for interactive analysis of 3D gaze data. In Proceedings of the Symposium on Eye Tracking Research and Applications, Safety Harbor, FL, USA, 26–28 March 2014; pp. 295–298. [Google Scholar]
  165. Pfeiffer, T.; Memili, C. Model-based real-time visualization of realistic three-dimensional heat maps for mobile eye tracking and eye tracking in virtual reality. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, SC, USA, 14–17 March 2016; pp. 95–102. [Google Scholar]
  166. Kar, A.; Corcoran, P. Performance evaluation strategies for eye gaze estimation systems with quantitative metrics and visualizations. Sensors 2018, 18, 3151. [Google Scholar] [CrossRef] [Green Version]
  167. Munz, T.; Chuang, L.; Pannasch, S.; Weiskopf, D. VisME: Visual microsaccades explorer. J. Eye Mov. Res. 2019, 12. [Google Scholar] [CrossRef] [PubMed]
  168. Reingold, E.M. Eye tracking research and technology: Towards objective measurement of data quality. Vis. Cogn. 2014, 22, 635–652. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  169. Godijn, R.; Theeuwes, J. Programming of endogenous and exogenous saccades: Evidence for a competitive integration model. J. Exp. Psychol. Hum. Percept. Perform. 2002, 28, 1039. [Google Scholar] [CrossRef]
  170. Ha, K.; Chen, Z.; Hu, W.; Richter, W.; Pillai, P.; Satyanarayanan, M. Towards wearable cognitive assistance. In Proceedings of the 12th annual international conference on Mobile systems, applications, and services, Bretton Woods, NH, USA, 16–19 June 2014; pp. 68–81. [Google Scholar]
  171. Larrazabal, A.J.; Cena, C.G.; Martínez, C.E. Video-oculography eye tracking towards clinical applications: A review. Comput. Biol. Med. 2019, 108, 57–66. [Google Scholar] [CrossRef]
  172. Mahanama, B.; Jayawardana, Y.; Rengarajan, S.; Jayawardena, G.; Chukoskie, L.; Snider, J.; Jayarathna, S. Eye Movement and Pupil Measures: A Review. Front. Comput. Sci. 2022, 3, 733531. [Google Scholar] [CrossRef]
  173. Hessels, R.S.; Benjamins, J.S.; Niehorster, D.C.; van Doorn, A.J.; Koenderink, J.J.; Holleman, G.A.; de Kloe, Y.J.; Valtakari, N.V.; van Hal, S.; Hooge, I.T. Eye contact avoidance in crowds: A large wearable eye-tracking study. Atten. Percept. Psychophys. 2022, 84, 2623–2640. [Google Scholar] [CrossRef]
  174. Li, T.; Zhou, X. Battery-free eye tracker on glasses. In Proceedings of the 24th Annual International Conference on Mobile Computing and Networking, New Delhi, India, 29 October–2 November 2018; pp. 67–82. [Google Scholar]
  175. Ye, Z.; Li, Y.; Fathi, A.; Han, Y.; Rozga, A.; Abowd, G.D.; Rehg, J.M. Detecting eye contact using wearable eye-tracking glasses. In Proceedings of the 2012 ACM conference on ubiquitous computing, Pittsburgh, PA, USA, 5–8 September 2012; pp. 699–704. [Google Scholar]
  176. Aronson, R.M.; Santini, T.; Kübler, T.C.; Kasneci, E.; Srinivasa, S.; Admoni, H. Eye-hand behavior in human-robot shared manipulation. In Proceedings of the 2018 13th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Chicago, IL, USA, 5–8 March 2018; pp. 4–13. [Google Scholar]
  177. Callahan-Flintoft, C.; Barentine, C.; Touryan, J.; Ries, A.J. A Case for Studying Naturalistic Eye and Head Movements in Virtual Environments. Front. Psychol. 2021, 12, 650693. [Google Scholar] [CrossRef]
  178. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda. Comput. Educ. 2020, 147, 103778. [Google Scholar] [CrossRef]
  179. Arefin, M.S.; Swan II, J.E.; Cohen Hoffing, R.A.; Thurman, S.M. Estimating Perceptual Depth Changes with Eye Vergence and Interpupillary Distance using an Eye Tracker in Virtual Reality. In Proceedings of the 2022 Symposium on Eye Tracking Research and Applications, Seattle, WA, USA, 8–11 June 2022; pp. 1–7. [Google Scholar]
  180. Puig, M.S.; Romeo, A.; Supèr, H. Vergence eye movements during figure-ground perception. Conscious. Cogn. 2021, 92, 103138. [Google Scholar] [CrossRef]
  181. Hooge, I.T.; Hessels, R.S.; Nyström, M. Do pupil-based binocular video eye trackers reliably measure vergence? Vis. Res. 2019, 156, 1–9. [Google Scholar] [CrossRef] [PubMed]
  182. Iwata, Y.; Handa, T.; Ishikawa, H. Objective measurement of nine gaze-directions using an eye-tracking device. J. Eye Mov. Res. 2020, 13. [Google Scholar] [CrossRef] [PubMed]
  183. Clay, V.; König, P.; Koenig, S. Eye tracking in virtual reality. J. Eye Mov. Res. 2019, 12. [Google Scholar] [CrossRef] [PubMed]
  184. Biedert, R.; Buscher, G.; Dengel, A. Gazing the Text for Fun and Profit. In Eye Gaze in Intelligent User Interfaces; Springer: Berlin/Heidelberg, Germany, 2013; pp. 137–160. [Google Scholar]
  185. Nakano, Y.I.; Conati, C.; Bader, T. Eye Gaze in Intelligent User Interfaces: Gaze-Based Analyses, Models and Applications; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
  186. Hooge, I.T.; Niehorster, D.C.; Hessels, R.S.; Cleveland, D.; Nyström, M. The pupil-size artefact (PSA) across time, viewing direction, and different eye trackers. Behav. Res. Methods 2021, 53, 1986–2006. [Google Scholar] [CrossRef]
  187. Shishido, E.; Ogawa, S.; Miyata, S.; Yamamoto, M.; Inada, T.; Ozaki, N. Application of eye trackers for understanding mental disorders: Cases for schizophrenia and autism spectrum disorder. Neuropsychopharmacol. Rep. 2019, 39, 72–77. [Google Scholar] [CrossRef]
  188. Hung, J.C.; Wang, C.-C. The Influence of Cognitive Styles and Gender on Visual Behavior During Program Debugging: A Virtual Reality Eye Tracker Study. Hum.-Cent. Comput. Inf. Sci. 2021, 11, 1–21. [Google Scholar]
  189. Obaidellah, U.; Haek, M.A. Evaluating gender difference on algorithmic problems using eye-tracker. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, Warsaw, Poland, 14–17 June 2018; pp. 1–8. [Google Scholar]
  190. Yehezkel, O.; Belkin, M.; Wygnanski-Jaffe, T. Automated diagnosis and measurement of strabismus in children. Am. J. Ophthalmol. 2020, 213, 226–234. [Google Scholar] [CrossRef] [Green Version]
  191. Nobukawa, S.; Shirama, A.; Takahashi, T.; Takeda, T.; Ohta, H.; Kikuchi, M.; Iwanami, A.; Kato, N.; Toda, S. Identification of attention-deficit hyperactivity disorder based on the complexity and symmetricity of pupil diameter. Sci. Rep. 2021, 11, 1–14. [Google Scholar] [CrossRef]
  192. Ioannidou, F.; Hermens, F.; Hodgson, T.L. Mind your step: The effects of mobile phone use on gaze behavior in stair climbing. J. Technol. Behav. Sci. 2017, 2, 109–120. [Google Scholar] [CrossRef] [Green Version]
  193. Scalera, L.; Seriani, S.; Gallina, P.; Lentini, M.; Gasparetto, A. Human–robot interaction through eye tracking for artistic drawing. Robotics 2021, 10, 54. [Google Scholar] [CrossRef]
  194. Aoyama, T.; Takeno, S.; Takeuchi, M.; Hasegawa, Y. Head-mounted display-based microscopic imaging system with customizable field size and viewpoint. Sensors 2020, 20, 1967. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  195. Mantiuk, R.; Kowalik, M.; Nowosielski, A.; Bazyluk, B. Do-it-yourself eye tracker: Low-cost pupil-based eye tracker for computer graphics applications. In Proceedings of the International Conference on Multimedia Modeling, Klagenfurt, Austria, 4–6 January 2012; pp. 115–125. [Google Scholar]
  196. Schneider, B.; Sharma, K.; Cuendet, S.; Zufferey, G.; Dillenbourg, P.; Pea, R. Leveraging mobile eye-trackers to capture joint visual attention in co-located collaborative learning groups. Int. J. Comput.-Support. Collab. Learn. 2018, 13, 241–261. [Google Scholar] [CrossRef]
  197. Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.-K. AR-based interaction for human-robot collaborative manufacturing. Robot. Comput.-Integr. Manuf. 2020, 63, 101891. [Google Scholar] [CrossRef]
  198. Diaz-Piedra, C.; Sanchez-Carrion, J.M.; Rieiro, H.; Di Stasi, L.L. Gaze-based technology as a tool for surgical skills assessment and training in urology. Urology 2017, 107, 26–30. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Recent advances in eye tracking sensors, systems, and methods (Screen-based type, Eyeglasses type (right), and VR type: Reprinted under terms of the CC-BY license [3]. Copyright 2020, the authors. Published by MDPI), (Headband type: Reprinted with permission [4]. Copyright 2019 Elsevier), (Eyeglasses type (left): Reprinted with permission [5]. Copyright 2020 American Chemical Society), (Ear type: Reprinted under terms of the CC-BY license [6]. Copyright 2017, the Authors. Published by Springer Nature), (Facemask type: Reprinted with permission [7]. Copyright 2019 Elsevier), (Metal membrane: Reprinted with permission [8]. Copyright 2013 John Wiley and Sons), (Composite: Reprinted with permission [9]. Copyright 2021 American Chemical Society), (Tattoo: Reprinted with permission [10]. Copyright 2017 American Chemical Society), (Smart wheelchair: Reprinted with permission [1]. Copyright 2017, Elsevier B.V), (Drone control: Reprinted under terms of the CC-BY license [11]. Copyright 2018, the Authors. Published by Springer Nature), (Infant Analysis: Reprinted with permission [12]. Copyright 2020 Elsevier), (laparoscopic surgery: Reprinted under terms of the CC-BY license [13]. Copyright 1969, the authors. Published by Elsevier Ltd.), (PCCR: Reprinted with permission [14]. Copyright 2017 Springer Nature), (Machine learning: Reprinted under terms of the CC-BY license [15]. Copyright 2021, the Authors. Published by MDPI).
Figure 4. Schematic diagrams and images of micro-patterned electrodes: (a) Graphene electrode fabrication process based on a polymer material (reprinted under terms of the CC-BY license [11]. Copyright 2018, the authors. Published by Springer Nature). (b) Copper electrode fabrication process based on paper substrate (reprinted under terms of the CC-BY license [99]. Copyright 2018, the authors. Published by MDPI). (c) AgNPs electrode fabrication process via aerosol jet printing (reprinted under terms of the CC-BY license [30]. Copyright 2020, the authors. Published by Science). (d) Flexible dry Ag/AgCl electrode fabrication process via screen printing (reprinted with permission [100]. Copyright 2016 Elsevier).
Figure 5. Examples of eyeglasses with electrodes. (a) Goggle type of EOG device (reprinted with permission under the terms of the CC-BY license [38]. Copyright 2021, the authors. Published by MDPI). (b) Eyeglass type of commercial EOG device (reprinted with permission [119]. Copyright 2016 ACM). (c) Eyeglass type of 3D-printed EOG devices (left: reprinted under terms of the CC-BY license [11]. Copyright 2018, the authors. Published by Springer Nature, right: reprinted with permission [32]. Copyright 2019 ACM). (d) Positions of CNTs/PDMS electrodes. (e) Positions of dry metal electrodes (left: reprinted under terms of the CC-BY license [11]. Copyright 2018, the authors. Published by Springer Nature, right: reprinted with permission [32]. Copyright 2019 ACM).
Figure 6. Examples of facemask platforms for EOG monitoring. (a) Face type of EOG device with graphene-coated tissue electrodes (reprinted under terms of the CC-BY license [42]. Copyright 2019, the Authors. Published by JAIC). (b) The electrode array system and a subject wearing a temporary-tattoo eight-electrode array (reprinted with permission [41]. Copyright 2019 IOP). (c) Positions of a screen-printed electrode set and a subject wearing screen-printed electrodes (reprinted with permission [7]. Copyright 2019 Elsevier).
Figure 7. Examples of headband platforms and earplugs with electrodes. (a) Example of headband type of EOG device. (Reproduced under terms of the CC-BY license [51]. Copyright 2017, the Authors. Published by MDPI). (b,c) Headband type of commercial EOG devices. ((b): Reprinted with permission [4]. Copyright 2019 Elsevier, (c): Reprinted under terms of the CC-BY license [123]. Copyright 2017, the Authors. Published by MDPI). (d) Position of embedded dry electrodes with the subject wearing a commercial device. (Left: Reprinted with permission [124]. Copyright 2019 Elsevier, Middle and Right: Reprinted under terms of the CC-BY license [123]. Copyright 2017, the Authors. Published by MDPI). (e) Earplugs type of EOG device. ((e): Reprinted under terms of the CC-BY license [6]. Copyright 2017, the Authors. Published by Springer Nature).
Figure 8. Signal processing and data analysis. (a) Recorded EOG signals depending on eye directions (Left: Reproduced under terms of the CC-BY license [51]. Copyright 2017, the Authors. Published by MDPI, Right: Reproduced with permission [61]. Copyright 2021, Wiley-VCH GmbH). (b) Schematic algorithm diagram using threshold (Reproduced with permission [61]. Copyright 2021, Wiley-VCH GmbH).
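To make the thresholding idea in Figure 8b concrete, the short Python sketch below shows one minimal way such a rule-based eye-movement classifier can be written: filtered horizontal and vertical EOG epochs are reduced to their signed peak deflections and compared against amplitude thresholds. This is only an illustrative sketch, not the implementation of the cited work [61]; the filter band, channel conventions, and threshold values (e.g., 150 µV) are assumptions chosen for demonstration.

```python
# Minimal sketch of a threshold-based EOG direction classifier (illustrative
# only; parameter values are assumptions, not taken from the cited studies).
import numpy as np
from scipy.signal import butter, filtfilt


def bandpass(x, fs, lo=0.1, hi=20.0, order=2):
    """Band-limit a raw EOG trace to the typical sub-20 Hz band."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)


def classify_epoch(h, v, fs, h_thr=150e-6, v_thr=150e-6, blink_thr=400e-6):
    """Label one epoch of horizontal (h) and vertical (v) EOG, in volts."""
    h_f, v_f = bandpass(h, fs), bandpass(v, fs)
    h_pk = h_f[np.argmax(np.abs(h_f))]   # signed horizontal peak deflection
    v_pk = v_f[np.argmax(np.abs(v_f))]   # signed vertical peak deflection
    if v_pk > blink_thr:                 # blinks appear as large positive spikes
        return "blink"
    if abs(h_pk) >= abs(v_pk) and abs(h_pk) > h_thr:
        return "right" if h_pk > 0 else "left"
    if abs(v_pk) > v_thr:
        return "up" if v_pk > 0 else "down"
    return "fixation"
```

In practice, such thresholds are typically calibrated per user and per electrode placement, since EOG amplitudes vary with skin-electrode impedance and gaze angle.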
Figure 9. Machine learning for data analysis (a) Signal processing sequence with an LDA classifier (Reprinted with permission [1]. Copyright 2017, Elsevier B.V.). (b) Signal processing sequence with a DWT classifier (Reprinted with permission [5]. Copyright 2020, American Chemical Society).
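In the same spirit, the following hypothetical sketch illustrates the feature-extraction and classification flow summarized in Figure 9a using a linear discriminant analysis (LDA) model; it is not the pipeline of the cited work [1], and the feature set (peak-to-peak amplitude, maximum slope, epoch energy) and the scikit-learn classifier are assumptions made for illustration.

```python
# Hypothetical EOG feature extraction + LDA training sketch (illustrative only).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def eog_features(epoch):
    """Compute simple scalar features from one 1-D EOG epoch."""
    return np.array([
        epoch.max() - epoch.min(),       # peak-to-peak amplitude
        np.abs(np.diff(epoch)).max(),    # steepest sample-to-sample slope
        np.sum(epoch ** 2),              # epoch energy
    ])


def train_lda(epochs, labels):
    """Fit an LDA classifier on labeled EOG epochs (e.g., 'left', 'right', 'blink')."""
    X = np.vstack([eog_features(e) for e in epochs])
    return LinearDiscriminantAnalysis().fit(X, labels)


# Example use: label = train_lda(train_epochs, train_labels).predict([eog_features(new_epoch)])
```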
Figure 10. (a) Controller-type applications such as wheelchairs, drones, game interfaces, and virtual keyboards (1st: reprinted with permission [1]. Copyright 2017, Elsevier B.V., 2nd: reprinted under terms of the CC-BY license [11]. Copyright 2018, the Authors. Published by Springer Nature, 3rd: reprinted with permission [5]. Copyright 2020, American Chemical Society, 4th). (b) Applications in healthcare monitoring and medical health status analysis (2nd: reprinted with permission [63]. Copyright 2020, Elsevier B.V., 3rd: reprinted with permission [65]. Copyright 2020, Walter de Gruyter GmbH).
Table 3. Summary of applications using EOG signals.

| Purpose | Target User | Signal | Data Processing | Refs |
| --- | --- | --- | --- | --- |
| Wheelchairs | Disabled people | EOG + EEG + EMG | Signal processing | [51] |
| Wheelchairs | Disabled people | EOG | LDA | [1] |
| Wheelchairs | Disabled people | EOG | Signal processing | [4] |
| Wheelchairs | Disabled people | EOG + EEG + EMG | Signal processing | [52] |
| Game controller | Anyone | EOG | DWT | [5] |
| Game controller | Anyone | EOG | SWT | [60] |
| Game controller | Anyone | EOG + EEG + EMG | SVM | [47] |
| Game controller | Anyone | EOG | Signal processing | [61] |
| Drone | Anyone | EOG | Signal processing | [11,59] |
| Virtual keyboard | Anyone | EOG | SVM | [38] |
| Virtual keyboard | Anyone | EOG + EEG + EMG | Signal processing | [51] |
| Virtual keyboard | Anyone | EOG + EEG | SVM | [34] |
| Virtual keyboard | Anyone | EOG | Signal processing | [62] |
| ADHD | Children | EOG | Signal processing | [64] |
| ADHD | Children | EOG | Signal processing | [65] |
| ADHD | Children | EOG | WT | [66] |
| Emotion recognition | Anyone | EOG | SVM | [127] |
| Emotion recognition | Anyone | EOG + EMG | SVM | [128] |
| Emotion recognition | Anyone | EOG + Eye image | STFT | [126] |
| Sleepiness | Driver | EOG + EEG | GAN + LSTM | [63] |
| Drowsiness | Anyone | EOG | Signal processing | [55] |
| Drowsiness | Anyone | EOG + EEG | Signal processing | [58] |
| Sleep monitoring | Anyone | EOG + EEG + EMG | Signal processing | [41] |
| Sleep monitoring | Anyone | EOG | Linear classifier | [40] |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
