Article

Empirical Study on Human Movement Classification Using Insole Footwear Sensor System and Machine Learning

Wolfe Anderson, Zachary Choffin, Nathan Jeong, Michael Callihan, Seongcheol Jeong and Edward Sazonov

1 Department of Electrical and Computer Engineering, The University of Alabama, Tuscaloosa, AL 35487, USA
2 College of Nursing, The University of Alabama, Tuscaloosa, AL 35487, USA
3 Department of Electrical Engineering, Pohang University of Science and Technology, Pohang 37673, Korea
* Author to whom correspondence should be addressed.
Sensors 2022, 22(7), 2743; https://doi.org/10.3390/s22072743
Submission received: 23 February 2022 / Revised: 25 March 2022 / Accepted: 29 March 2022 / Published: 2 April 2022
(This article belongs to the Section Wearables)

Abstract

This paper presents a plantar pressure sensor system (P2S2) integrated into the insoles of shoes to detect thirteen commonly used human movements, including walking, stooping left and right, pulling a cart backward, squatting, descending and ascending stairs, running, and falling (front, back, right, left). Six force-sensitive resistor (FSR) sensors were positioned at critical pressure points on the insoles to capture the electrical signature of pressure changes during the various movements. A total of 34 adult participants were tested with the P2S2. The pressure data were collected and processed using Principal Component Analysis (PCA) for input to multiple machine learning (ML) algorithms, including k-NN, neural network, and Support Vector Machine (SVM) algorithms. The ML models were trained using four-fold cross-validation, with each fold keeping subject data independent of the other folds. The model proved effective with an accuracy of 86%, a promising result for predicting human movements using the P2S2 integrated in shoes.

1. Introduction

Physical activity recognition is quickly becoming one of the most important methods of tracking human health and wellbeing. With the rise of wearable devices such as smartwatches and smartphones, the desire to know more about our bodies has never been stronger. Rapid advances in the detection capabilities of these devices show how far the technology has come: they can detect human activity and movement with reasonable to high accuracy [1,2,3]. Nevertheless, there remains demand for a low-cost, non-invasive device that can detect every type of daily human movement for health tracking, injury prevention, and fall detection.
Many methods have been adopted to capture human movement. Among them, one of the most popular sensors is the inertial measurement unit (IMU) [4,5,6,7]. Despite its high accuracy, the IMU is uncomfortable in daily use because of the number of units that must be attached to the body and the complex setup procedure. Video capture using cameras has proven to be relatively accurate as well [8,9,10,11]. However, camera-based motion sensing is not feasible when the direct line of sight is obstructed [11]. Other methods such as millimeter wave [12] also offer good accuracy but pose the same problems for integration into real life. Acoustic signals [13] and infrared signals [14] have also been used.
Another solution to this problem is a smart shoe that combines pressure-sensing technology with machine learning. Smart shoe sensors are simple to wear and comfortable in use, allowing practical adoption in day-to-day life; compared to other methods, they also offer a low-cost solution. The main areas of investigation in this field have been pressure-sensing materials and the classification of different types of human movement. The use of materials such as sponges, textiles, and rubber for pressure sensing [15,16,17,18] shows promise for low-profile integration into a shoe insole. Pressure sensing for human movement detection has also been investigated for stride counting [19], gait analysis [20], loss of balance and fall detection [21,22], and a variety of other human movements [23,24,25,26,27]. Thirteen frequently used household movements, including lying, sitting, standing, walking, descending and ascending stairs, ergometer cycling, vacuuming, shelving items, washing dishes, sweeping the floor, and driving a car, were classified in [23]. Walking and stair ascent and descent were classified in [28]. Walking up and down stairs was examined and classified in [29]. Two studies also investigated the detection of falling [21,22]. Based on previous research, there is a gap in knowledge related to more diverse movements, and a system that can detect a broader set of human movements is in high demand. Such a system would help workers move properly and could provide an estimate of calories burned. Table 1 gives an overview of different study approaches and their movement-classification accuracies.
A number of footwear systems for detecting plantar pressure are in use today. One solution is an insole equipped with capacitive sensors, with commercial versions made by Moticon (Munich, Germany) [31]. Furthermore, others have been created for research purposes [32,33] in multiple studies of gait tracking and motion analysis [21,34,35,36]. Other solutions, such as the pedar© system designed by Novel (St. Paul, MN, USA) [37], contain more than one hundred sensors to detect the precise pressure distribution across the foot [38,39]. A common approach places force-sensitive resistors at specific locations across the foot; this has been used effectively to detect pressure at a much lower cost than commercial solutions and allows for significant customizability [40,41,42,43,44,45].
This paper aims to detect thirteen different human movements using the P2S2 and machine learning algorithms. The P2S2 was developed in our previous work [46], and its details are provided in Section 2. Detecting this broader set of movements gives the system a more complete view of the user, making it highly useful for injury prevention and health tracking.

2. Materials and Methods

2.1. System Design

The basic concept of the pressure-sensing system relies on the use of force-sensitive resistors (FSRs). These sensors are constructed from a substrate layer, a conductive film, a spacer, and another substrate with a conductive print on top. When a person's foot pushes against the ground, a force is exerted back onto it through the shoe, known as the ground reaction force (GRF). The GRF varies in magnitude and location depending on the point of pressure on the foot while in active motion. When this force is applied to an FSR, the conductive film meets the conductive print on the bottom substrate, and the contact area increases with force. As the contact increases, the resistance decreases and more current flows; the resistance changes logarithmically for a linear increase in force. Because of this property, the amount of applied force can be measured.
Figure 1 shows the block diagram of the proposed human-movement sensor system, which was used in our previous work [27,46]. The FSR sensors (FlexiForce A301) [47] were located at six common pressure points across the foot: the inside (S1) and outside (S2) of the heel; the inside (S3), middle (S4), and outside (S5) of the midfoot; and under the big toe (S6). These sensors were connected to a microcontroller with a Bluetooth Low Energy module (Adafruit Feather M0 Bluefruit LE), a microSD card reader for data recording [48], and a 3.7 V lithium-ion battery. Each sensor was connected to an ADC terminal on the microcontroller to digitize the analog signal from the pressure sensors. Drop-down resistors connected between the ADC terminals and ground provided a threshold voltage for the pressure sensors. The sensors were placed on a flexible plastic substrate, and copper strips were used to create a common power line and to route the six signals to the ADC terminals. Another layer of flexible plastic was placed over the sensors and copper strips to protect them from damage.
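To make the readout concrete, the sketch below models the drop-down-resistor voltage divider described above. The supply voltage and resistor value are illustrative assumptions, not values reported in this paper; the point is the qualitative behavior, namely that a lower FSR resistance yields a higher ADC count.

```python
# Minimal sketch of the FSR + drop-down resistor divider feeding the ADC.
# VCC and R_PULLDOWN are assumed values for illustration only.

VCC = 3.3          # assumed supply voltage (V)
R_PULLDOWN = 10e3  # assumed drop-down resistor between ADC pin and ground (ohms)
ADC_MAX = 1023     # full-scale count of the 10-bit ADC

def adc_count(r_fsr: float) -> int:
    """ADC reading for a given FSR resistance (ohms) in the divider."""
    v_out = VCC * R_PULLDOWN / (R_PULLDOWN + r_fsr)  # divider output voltage
    return round(v_out / VCC * ADC_MAX)

# Pressing harder lowers the FSR resistance, raising the ADC count:
for r_fsr in (1e6, 100e3, 30e3, 10e3, 3e3):
    print(f"R_FSR = {r_fsr / 1e3:6.0f} kOhm -> ADC = {adc_count(r_fsr):4d}")
```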
To accommodate the largest possible participant pool, we used the average shoe sizes in the United States: 10.5 for males and 8.5 for females [49]. The pressure-sensing system was therefore built into two pairs of shoes, one size 10.5 and one size 8.5. The insole sensor system was placed underneath the included insole in each shoe, while the microcontroller was attached to the outside of each shoe with Velcro. A slit was made in the side of each shoe to feed the wiring from the insole sensor system to the microcontroller. Both shoes with their respective sensor systems are shown in Figure 2.

2.2. Movement Description

Data were acquired for thirteen different movements during testing. These movements were chosen as an extensive collection of the motions a typical person performs over an average day. The chosen movements are described in Table 2.

2.3. Experimental Procedure

Testing for this study took place at the University of Alabama College of Nursing. Participants were provided with the shoes equipped with the insole pressure-sensing system and were instructed to perform the series of movements. For start- and end-time verification, participants performed a heel raise before each new motion, because the GRF profile of a heel raise is prominent and easily distinguishable between consecutive movements. This was particularly important for movements such as squatting and stooping, so as to divide a test of multiple squats or stoops into individual movements for input to the machine learning algorithm.
A total of 34 subjects were tested for the study: 12 males and 22 females with an average age of 22.6 years. All subjects provided written informed consent before any data were taken. Table 3 displays the information collected about each of the 34 test subjects.

2.4. Data Collection

Pressure data from the participants were collected from the P2S2 by using the microSD card reader to write to a text file. The written data included the pressure readings from each sensor, as well as a timestamp recorded as the number of samples taken since microcontroller start-up. The data were captured at a sampling rate of 50 Hz, corresponding to one sample every 20 ms ± 2 ms. The pressure data were recorded using a 10-bit ADC, with received values ranging from 0 to 1023; these were then scaled to a relative pressure scale of 0 to 100 for each sensor. Once the raw data were gathered, the text file generated on the SD card was imported onto a computer. Individual movement tests were separated from the text file using a MATLAB script, which detected heel raises and split the recording into individual movements; the script also normalized the time stamps of all samples. These individual tests were then visualized and processed into smaller pieces using the MATLAB Signal Analyzer. The movements of walking, running, stair ascent and descent, and pushing and pulling a cart were broken into segments of two steps each, overlapping by one step. For example, a test of ten steps of walking was split into nine individual datapoints, each consisting of two steps. For all other motions, each individual movement was captured; for example, a test in which the subject squatted ten times was split into ten datapoints of one squat each. The trimmed data were then prepared for feature extraction and input into the k-NN algorithm for training.
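As a rough illustration of this pre-processing, the Python sketch below scales raw 10-bit readings to the relative 0 to 100 scale and splits a step-based recording into two-step segments that overlap by one step. The actual pipeline was implemented in MATLAB; the step boundaries here are synthetic stand-ins for the boundaries found by the heel-raise and step detection.

```python
import numpy as np

def scale_pressure(raw: np.ndarray) -> np.ndarray:
    """Map 10-bit ADC counts (0-1023) onto the relative 0-100 pressure scale."""
    return raw / 1023.0 * 100.0

def two_step_windows(samples: np.ndarray, step_bounds: list[int]) -> list[np.ndarray]:
    """Split a recording into two-step segments overlapping by one step.

    samples:     (n_samples, 6) array of scaled pressure values
    step_bounds: indices marking the start of each detected step
    """
    windows = []
    for i in range(len(step_bounds) - 2):
        windows.append(samples[step_bounds[i]:step_bounds[i + 2]])
    return windows

# A ten-step walking test yields nine overlapping two-step datapoints:
bounds = list(range(0, 550, 50))  # 11 synthetic boundaries -> 10 steps
data = scale_pressure(np.random.randint(0, 1024, size=(550, 6)))
print(len(two_step_windows(data, bounds)))  # -> 9
```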

2.5. Machine Learning Technique

After the raw data were pre-processed, a MATLAB script was used to extract features from each data segment and to normalize the sample numbering so that each segment started at time zero. These features were as follows:
  • Average value of each sensor (Features 1–6). The average relative pressure was distinctive because each motion produced different pressure values over the duration of the movement.
  • Standard deviation of each sensor (Features 7–12). The standard deviation was used for reasons similar to the average value: it varied significantly with the motion being tested but stayed within a margin of error for any given motion.
  • Pressure time integral (PTI) of each sensor (Features 13–18). This is the summation of each pressure value multiplied by the corresponding sample interval, calculated as $\mathrm{PTI}_i = \sum_{t=1}^{N} P_i(t) \times \Delta t$, where N is the total number of samples in a data segment, i is the sensor index (1–6), P_i(t) is the value of sensor i at sample number t, and Δt is the interval between samples within the data segment [21]. The PTI helped differentiate between motions of different lengths: summing the relative pressure values over the sample time gave greater variation between motions of various lengths, which increased the accuracy and allowed more motions to be added to the study and classified accurately. (A sketch of this feature computation follows this list.)
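The sketch below computes this 18-element feature vector for one data segment, assuming a constant sampling interval of 20 ms (50 Hz). The original extraction was done in MATLAB; this is a Python equivalent, not the authors' code.

```python
import numpy as np

def extract_features(segment: np.ndarray, dt: float = 0.02) -> np.ndarray:
    """Build the 18-element feature vector for one (n_samples, 6) segment.

    Features 1-6:   per-sensor mean relative pressure
    Features 7-12:  per-sensor standard deviation
    Features 13-18: per-sensor pressure time integral (PTI),
                    the sum of P_i(t) * dt with dt = 20 ms at 50 Hz
    """
    mean = segment.mean(axis=0)
    std = segment.std(axis=0)
    pti = segment.sum(axis=0) * dt
    return np.concatenate([mean, std, pti])

segment = np.random.rand(120, 6) * 100  # synthetic two-step window
print(extract_features(segment).shape)  # -> (18,)
```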
The data were then fed to a TensorFlow 2 machine learning pipeline, and several algorithms were tested to determine which gave the highest accuracy. Of these, k-NN was selected due to its higher accuracy compared to the other algorithms. The k-NN algorithm is a supervised machine learning algorithm that operates on the assumption that similar things exist near one another. It finds the distances between points in feature space, picks the k entries closest to a query point, and captures their classification labels. The algorithm is tuned by trying different values of k and selecting the value that yields the most homogeneous classification possible while maintaining prediction accuracy as more unknown data are input. For our algorithm, a k value of one was chosen: there was a lot of overlapping data, and a larger k would allow the algorithm to select more than one nearest neighbor, resulting in significant misclassification. The algorithm used a Euclidean distance metric with equal weighting given to distance. Dimensionality reduction was applied to the data using PCA, reducing the original 18 features to 17 dimensions; this increased the separation between classes and decreased the training time. For validation of the k-NN model, four-fold cross-validation was used, with each subject's data limited to a single fold to ensure independence of the data. Four folds were chosen to split the subjects into seven-subject groups, based on the movements with the lowest number of datapoints. Figure 3 shows the progression from raw data to classification for this study.
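A minimal scikit-learn equivalent of this pipeline is sketched below: PCA to 17 components, a 1-NN classifier with a Euclidean metric, and four-fold cross-validation grouped by subject so that no subject appears in both training and test folds. The authors' implementation used MATLAB and TensorFlow 2; the synthetic arrays here only stand in for the real feature matrix, labels, and subject IDs.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

# Synthetic stand-ins: X is (n_datapoints, 18) features, y the movement label
# (13 classes), and groups the subject ID that keeps each subject in one fold.
rng = np.random.default_rng(0)
X = rng.random((680, 18))
y = rng.integers(0, 13, size=680)
groups = rng.integers(0, 34, size=680)

# PCA down to 17 dimensions, then 1-NN with a Euclidean metric, as in the text.
model = make_pipeline(
    PCA(n_components=17),
    KNeighborsClassifier(n_neighbors=1, metric="euclidean"),
)

scores = cross_val_score(model, X, y, groups=groups, cv=GroupKFold(n_splits=4))
print(f"Per-fold accuracy: {scores}, mean = {scores.mean():.3f}")
```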

3. Results

This section describes the data collected using the P2S2 for movement classification. Readings from the six FSR sensors are shown as a function of sample number and are representative of a typical trial for each movement. Thirty-four participants' data were included in the study: 22 female and 12 male participants with an average age of 22.6 years.

3.1. Walking

Walking was chosen as it is the most common motion any person performs and the one that current technology tracks best. Figure 4 shows the progression of the motion. The back inside and outside sensors peaked first, marking the heel strike, and the other four sensors peaked afterwards as the front of the foot contacted the floor. Three steps are shown.

3.2. Running

Running was considered a movement similar to walking but at a faster rate: steps are more rapid, and a larger GRF is produced. Figure 5 shows the motion as seen by the sensors. Compared to walking, the heel strike was much more instantaneous and leaned heavily towards the outside, followed by almost immediate contact of the front half of the foot with the ground. The number of samples from the beginning to the end of the motion was also much smaller, reflecting the short time during which the foot was on the ground. Three steps of running are shown.

3.3. Walking Up and Down the Stairs

Going up and down stairs is a very common motion in a person's average day, so it was also included in the selected motions. Figure 6 illustrates an example of stair ascent: the heel strikes the ground first, followed by the front of the foot. The motion resembled walking, but the front of the foot was under more pressure than the rear. Three steps are presented in the figure. Figure 7 presents two steps of stair descent, showing a brief heel strike followed by a significant strike of the front of the foot.

3.4. Stooping

The stooping motion, or kneeling with one foot forward, was chosen as a repeatable motion that could be predicted using our methods. Figure 8 and Figure 9 show the left-foot sensor data for stooping with the right and left foot forward, respectively; one full stooping motion is shown in each. Figure 8 shows very low sensor readings for the first half of the motion, as the subject put all their weight on the right foot while kneeling with the right foot forward. Then, as they began to rise, their weight shifted to the front inside of the left foot, visible as the large peak of the sensor under the big toe. As the foot returned level with the ground, a small peak appeared on the sensors under the heel. Figure 9 shows a large initial peak of the sensors under the heel as the participant stooped with their left foot forward and put most of their weight on the back of the left foot; very little pressure was placed anywhere but the rear of the left foot during this motion.

3.5. Squatting

Another motion was squatting, shown in Figure 10. The data show that a squat is concentrated almost entirely on the heel of the foot: the rear two sensors were under near-constant load, with rising pressure as the user pushed back up to standing and then balanced the load on the foot once upright. We also noticed in the data that some subjects performed squats with their weight on the front of the foot and their heels entirely off the ground. This did not prove to be an issue for classification, as our data labelling and supervised machine learning scheme allowed two quite different executions to be accurately classified as the same motion.

3.6. Pushing and Pulling a Cart

Pushing and pulling a cart are two other motions chosen for this study. In the sensor data for pushing in Figure 11, heel contact with the ground comes first, followed by contact of the front of the foot at nearly the same amplitude and duration. This motion is similar to walking, but our results show the two can be differentiated. Figure 12 shows two steps of pulling the cart. The sensor data show the front of the foot contacting the ground first, with pressure predominantly on the very front of the foot, followed by light contact at the back of the foot, indicating that the subject was nearly on their toes as they walked backwards.

3.7. Falling

One of the most critical motions to detect was falling, which has important applications in geriatrics and for anyone with disabilities. Falling backwards and forwards was quite simple to detect, as falling backwards produced peaks only on the back of the foot (Figure 13), while falling forward produced a significant shift of weight from the back to the front of the foot (Figure 14). The best indicator for falling left or right was strong maxima on the outside or inside middle of the foot: as a person falls, almost all their weight transfers to the outside of one foot and the inside of the other. The left-foot data in Figure 15 were selected for detecting falling to the left and the right-foot data in Figure 16 for falling to the right. This difference was easily detectable.

3.8. Machine Learning Results

The machine learning algorithm was trained on a labelled dataset consisting of all movement results for every subject in the study. To achieve the highest classification accuracy possible, several different ML algorithms were trained; Table 4 presents these schemes and their respective overall accuracies. All methods were trained using four-fold cross-validation as specified earlier.
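For readers who want to reproduce a comparison like Table 4, the hedged sketch below sets up scikit-learn counterparts of the listed models. The hyperparameters (quadratic-kernel SVM with one-vs-one multiclass, one-hidden-layer networks of width 25 and 100, weighted 10-neighbor k-NN with squared-inverse-distance weights) follow the table's descriptions; everything else, including the synthetic data, is an assumption rather than the authors' setup.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.random((680, 18))               # stand-in feature matrix
y = rng.integers(0, 13, size=680)       # stand-in movement labels
groups = rng.integers(0, 34, size=680)  # stand-in subject IDs

candidates = {
    "SVM (quadratic kernel, 1-vs-1)": SVC(kernel="poly", degree=2,
                                          decision_function_shape="ovo"),
    "Medium NN (hidden layer of 25)": MLPClassifier(hidden_layer_sizes=(25,),
                                                    max_iter=1000),
    "Wide NN (hidden layer of 100)": MLPClassifier(hidden_layer_sizes=(100,),
                                                   max_iter=1000),
    # Squared-inverse-distance weighting, as described in Table 4:
    "k-NN (10 neighbors, weighted)": KNeighborsClassifier(
        n_neighbors=10, metric="euclidean",
        weights=lambda d: 1.0 / (d ** 2 + 1e-12)),
}

for name, clf in candidates.items():
    scores = cross_val_score(clf, X, y, groups=groups, cv=GroupKFold(n_splits=4))
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```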
Given the number of datapoints in the dataset, the algorithm took only a few seconds to train using a TensorFlow 2-based classification algorithm with CUDA acceleration on a GTX 1060 graphics processing unit. This was a significant advantage of the k-NN algorithm compared to more complex machine learning techniques such as deep learning, which can take anywhere from minutes to hours to train. This time did not include pre-processing the data, which involved sorting all the data, segmenting them into usable datapoints, and extracting features. Figure 17 shows a confusion matrix of the predicted and actual movements trained on data from all subjects.
The movement detection results showed high accuracy in predicting all movements, with greater than 83% accuracy for twelve of the thirteen movements and an overall classification accuracy of 90.4%. Confusion occurred between similar movements: the algorithm confused different types of falling, with falling right being misclassified as falling forward. Overall, the classified movements were accurate.

4. Discussion

Of the thirteen movements examined in the study, twelve were classified with an accuracy of 83% or higher. Some difficulty occurred in predicting falls to the right: the model still predicted that a fall occurred but confused the right direction with the forward and left directions. One reason for this misclassification could be the fit of the shoes on participants; not all feet of the same size are shaped identically, and slight differences in shape could cause classification error. Increasing the number of participants could mitigate this issue by training on a more diverse group.
The design of the proposed P2S2 has limitations. First, the number of pressure sensors involves a trade-off: more sensors produce a higher-resolution pressure distribution and better machine learning accuracy at the cost of increased design complexity and monetary cost. For example, the F-Scan 64 developed by Tekscan (South Boston, MA, USA) [50] comprises 64 pressure sensors; with a sensor density of 3.9 sensels/cm², the system costs approximately USD 7000. Secondly, a larger scanning area could be achieved with a bigger FSR sensor [51]; however, the sensor area is limited by the size of the shoe. Lastly, the sensitivity of a sensor could be improved using advanced materials and technologies such as piezoelectric sensors [52], carbon nanotubes [53], and capacitive materials [54].
The machine learning algorithm selected for this study was k-NN based. Many other algorithms suit this application, such as convolutional neural networks (CNN), SVM, decision trees, and linear regression. k-NN is a relatively small model with a quick training time. SVM and decision trees are simple models that can differentiate various human movements. Linear regression can find correlations in the pressure data to identify human movement. A CNN can be accurate for movement classification with optimized hyperparameters, such as the number of neurons, number of layers, and epochs.
The applications of this technology could dramatically improve quality of life through an understanding of daily movements. For example, employees could be properly trained by monitoring their movements and corrected if a movement abnormality is detected, opening the ability to prevent injury and thereby saving both personal and corporate expenses. Additionally, knowing the types and number of human movements allows calorie usage to be reported to the user, encouraging exercise; by tracking movements over time, trends can be developed that indicate progression towards weight loss. The potential for every person to wear a shoe that can help train them to perform tasks with less strain on their bodies, and inform them of what they can do to become the healthiest version of themselves, could be revolutionary in a world where personal health is becoming ever more important. Future work could focus on microcontroller integration, automatic data processing, wireless data export, and wireless charging of the system. These goals all serve the aim of creating a fully integrated system that is mass-marketable and as easy to use as possible.

5. Conclusions

In summary, a low-cost, non-invasive footwear system (P2S2) using six force-sensitive resistors together with machine learning techniques was presented for the prediction of human movements. A total of 34 participants, with an average age of 22.6 years, were tested with the P2S2 at the Capstone College of Nursing at the University of Alabama. Thirteen commonly used human movements, including walking, stooping left and right, pulling a cart backward, squatting, descending and ascending stairs, running, and falling (front, back, right, left), were predicted using the k-NN machine learning algorithm. The model was validated using four-fold cross-validation, which kept training and test data from the same subject separate. The results showed that the proposed P2S2 can predict almost all thirteen human movements with an average accuracy above 86%, while falling right was classified with 78% accuracy.

Author Contributions

Conceptualization, W.A., Z.C., N.J. and M.C.; methodology, W.A., Z.C. and N.J.; software, W.A.; validation, W.A., Z.C., N.J. and S.J.; formal analysis, W.A., N.J. and S.J.; investigation, N.J., M.C. and S.J.; resources, N.J. and M.C.; writing—original draft preparation, W.A., N.J. and M.C.; writing—review and editing, W.A., Z.C., N.J., M.C., S.J. and E.S.; visualization, W.A., Z.C., N.J. and M.C.; supervision, N.J. and M.C.; project administration, N.J. and M.C.; funding acquisition, N.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Department of Electrical and Computer Engineering at The University of Alabama.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board of the University of Alabama (protocol ID 20-02-3356, approved on 5 June 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Del Rosario, M.B.; Redmond, S.J.; Lovell, N.H. Tracking the evolution of smartphone sensing for monitoring human movement. Sensors 2015, 15, 18901–18933.
  2. Bulbul, E.; Cetin, A.; Dogru, I.A. Human activity recognition using smartphones. In Proceedings of the 2018 2nd International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT), Ankara, Turkey, 19–21 October 2018; pp. 1–6.
  3. Shoaib, M.; Bosch, S.; Scholten, H.; Havinga, P.J.; Incel, O.D. Towards detection of bad habits by fusing smartphone and smartwatch sensors. In Proceedings of the 2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), St. Louis, MO, USA, 23–27 March 2015; pp. 591–596.
  4. Guo, L.J.; Xiong, S.P. Accuracy of Base of Support Using an Inertial Sensor Based Motion Capture System. Sensors 2017, 17, 2091.
  5. Crema, C.; Depari, A.; Flammini, A.; Sisinni, E.; Haslwanter, T.; Salzmann, S. IMU-based solution for automatic detection and classification of exercises in the fitness scenario. In Proceedings of the 2017 IEEE Sensors Applications Symposium (SAS), Glassboro, NJ, USA, 13–15 March 2017; pp. 1–6.
  6. Lu, Y.; Velipasalar, S. Human activity classification incorporating egocentric video and inertial measurement unit data. In Proceedings of the 2018 IEEE Global Conference on Signal and Information Processing (GlobalSIP), Anaheim, CA, USA, 26–29 November 2018; pp. 429–433.
  7. Losing, V.; Yoshikawa, T.; Hasenjaeger, M.; Hammer, B.; Wersing, H. Personalized Online Learning of Whole-Body Motion Classes using Multiple Inertial Measurement Units. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 9530–9536.
  8. Ramli, M.S.A.; Zamzuri, H.; Abidin, M.S.Z. Tracking human movement in office environment using video processing. In Proceedings of the 2011 Fourth International Conference on Modeling, Simulation and Applied Optimization, Kuala Lumpur, Malaysia, 19–21 April 2011; pp. 1–6.
  9. Lao, W.; Han, J.; De With, P.H. Automatic video-based human motion analyzer for consumer surveillance system. IEEE Trans. Consum. Electron. 2009, 55, 591–598.
  10. Corazza, S.; Mündermann, L.; Chaudhari, A.M.; Demattio, T.; Cobelli, C.; Andriacchi, T.P. A markerless motion capture system to study musculoskeletal biomechanics: Visual hull and simulated annealing approach. Ann. Biomed. Eng. 2006, 34, 1019–1029.
  11. Zago, M.; Luzzago, M.; Marangoni, T.; De Cecco, M.; Tarabini, M.; Galli, M. 3D Tracking of Human Motion Using Visual Skeletonization and Stereoscopic Vision. Front. Bioeng. Biotechnol. 2020, 8, 181.
  12. Geng, Y.; Chen, J.; Fu, R.; Bao, G.; Pahlavan, K. Enlighten Wearable Physiological Monitoring Systems: On-Body RF Characteristics Based Human Motion Classification Using a Support Vector Machine. IEEE Trans. Mob. Comput. 2016, 15, 656–671.
  13. Wang, T.; Zhang, D.; Zheng, Y.; Gu, T.; Zhou, X.; Dorizzi, B. C-FMCW based contactless respiration detection using acoustic signal. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2018, 1, 1–20.
  14. Yun, J.; Lee, S.-S. Human movement detection and identification using pyroelectric infrared sensors. Sensors 2014, 14, 8057–8081.
  15. Zhang, L.; Li, H.; Lai, X.; Gao, T.; Yang, J.; Zeng, X. Thiolated Graphene@Polyester Fabric-Based Multilayer Piezoresistive Pressure Sensors for Detecting Human Motion. ACS Appl. Mater. Interfaces 2018, 10, 41784–41792.
  16. Nie, B.; Huang, R.; Yao, T.; Zhang, Y.; Miao, Y.; Liu, C.; Liu, J.; Chen, X. Textile-Based Wireless Pressure Sensor Array for Human-Interactive Sensing. Adv. Funct. Mater. 2019, 29, 1808786.
  17. Ding, Y.; Yang, J.; Tolle, C.R.; Zhu, Z. Flexible and Compressible PEDOT:PSS@Melamine Conductive Sponge Prepared via One-Step Dip Coating as Piezoresistive Pressure Sensor for Human Motion Detection. ACS Appl. Mater. Interfaces 2018, 10, 16077–16086.
  18. Motha, L.; Kim, J.; Kim, W.S. Instrumented rubber insole for plantar pressure sensing. Org. Electron. 2015, 23, 82–86.
  19. Truong, P.H.; Lee, J.; Kwon, A.-R.; Jeong, G.-M. Stride Counting in Human Walking and Walking Distance Estimation Using Insole Sensors. Sensors 2016, 16, 823. Available online: https://www.mdpi.com/1424-8220/16/6/823 (accessed on 28 March 2022).
  20. Martini, E.; Fiumalbi, T.; Dell’Agnello, F.; Ivanić, Z.; Munih, M.; Vitiello, N.; Crea, S. Pressure-Sensitive Insoles for Real-Time Gait-Related Applications. Sensors 2020, 20, 1448.
  21. Antwi-Afari, M.F.; Li, H.; Seo, J.; Wong, A.Y.L. Automated detection and classification of construction workers’ loss of balance events using wearable insole pressure sensors. Autom. Constr. 2018, 96, 189–199.
  22. Howcroft, J.; Kofman, J.; Lemaire, E.D. Feature selection for elderly faller classification based on wearable sensors. J. NeuroEng. Rehabil. 2017, 14, 47.
  23. Hegde, N.; Bries, M.; Swibas, T.; Melanson, E.; Sazonov, E. Automatic Recognition of Activities of Daily Living Utilizing Insole-Based and Wrist-Worn Wearable Sensors. IEEE J. Biomed. Health Inform. 2018, 22, 979–988.
  24. Jeong, G.; Truong, P.H.; Choi, S. Classification of Three Types of Walking Activities Regarding Stairs Using Plantar Pressure Sensors. IEEE Sens. J. 2017, 17, 2638–2639.
  25. Moore, S.R.; Kranzinger, C.; Fritz, J.; Stöggl, T.; Kröll, J.; Schwameder, H. Foot Strike Angle Prediction and Pattern Classification Using LoadsolTM Wearable Sensors: A Comparison of Machine Learning Techniques. Sensors 2020, 20, 6737. Available online: https://www.mdpi.com/1424-8220/20/23/6737 (accessed on 28 March 2022).
  26. Antwi-Afari, M.F.; Li, H.; Seo, J.; Wong, A. Automated recognition of construction workers’ activities for productivity measurement using wearable insole pressure system. In Proceedings of the CIB World Building Congress 2019: Constructing Smart Cities, Hong Kong, China, 17–21 June 2019. Available online: http://hdl.handle.net/10397/88030 (accessed on 28 March 2022).
  27. Sazonov, E.S.; Fulk, G.; Hill, J.; Schutz, Y.; Browning, R. Monitoring of Posture Allocations and Activities by a Shoe-Based Wearable Sensor. IEEE Trans. Biomed. Eng. 2011, 58, 983–990.
  28. ISO 13732-1:2006; Ergonomics of the Thermal Environment—Methods for the Assessment of Human Responses to Contact with Surfaces—Part 1: Hot Surfaces. ISO: Geneva, Switzerland, 2006.
  29. Nguyen, N.D.; Bui, D.T.; Truong, P.H.; Jeong, G. Classification of Five Ambulatory Activities Regarding Stair and Incline Walking Using Smart Shoes. IEEE Sens. J. 2018, 18, 5422–5428.
  30. Leu, F.-Y.; Ko, C.-Y.; Lin, Y.-C.; Susanto, H.; Yu, H.-C. Chapter 10—Fall Detection and Motion Classification by Using Decision Tree on Mobile Phone. In Smart Sensors Networks; Xhafa, F., Leu, F.-Y., Hung, L.-L., Eds.; Academic Press: Cambridge, MA, USA, 2017; pp. 205–237.
  31. Moticon. Moticon Science Insole 3. Available online: https://www.moticon.de/insole3-overview/ (accessed on 28 March 2022).
  32. Aqueveque, P.; Osorio, R.; Pastene, F.; Saavedra, F.; Pino, E. Capacitive Sensors Array for Plantar Pressure Measurement Insole fabricated with Flexible PCB. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 4393–4396.
  33. Mazumder, O.; Kundu, A.S.; Bhaumik, S. Development of wireless insole foot pressure data acquisition device. In Proceedings of the 2012 International Conference on Communications, Devices and Intelligent Systems (CODIS), Kolkata, India, 28–29 December 2012; pp. 302–305.
  34. Lehner, S.; Dießl, C.; Chang, D.; Senner, V. Optimization of a Foot Model for the Evaluation of the Injury Risk during Cutting Movements in Football. Procedia Eng. 2013, 60, 325–330.
  35. He, J.; Lippmann, K.; Shakoor, N.; Ferrigno, C.; Wimmer, M.A. Unsupervised gait retraining using a wireless pressure-detecting shoe insole. Gait Posture 2019, 70, 408–413.
  36. Antwi-Afari, M.F.; Li, H. Fall risk assessment of construction workers based on biomechanical gait stability parameters using wearable insole pressure system. Adv. Eng. Inform. 2018, 38, 683–694.
  37. Ramanathan, A.K.; Kiran, P.; Arnold, G.P.; Wang, W.; Abboud, R.J. Repeatability of the Pedar-X® in-shoe pressure measuring system. Foot Ankle Surg. 2010, 16, 70–73.
  38. Kanitthika, K.; Chan, K.S. Pressure sensor positions on insole used for walking analysis. In Proceedings of the 18th IEEE International Symposium on Consumer Electronics (ISCE 2014), Jeju, Korea, 22–25 June 2014; pp. 1–2.
  39. Abu-Faraj, Z.O.; Faraj, Y.T.; Mohtar, K.H.; Rammal, M.M. Characterization of plantar pressures in visually impaired individuals: A pilot study. In Proceedings of the 2013 6th International IEEE/EMBS Conference on Neural Engineering (NER), San Diego, CA, USA, 6–8 November 2013; pp. 1549–1553.
  40. Di Rosa, M.; Hausdorff, J.M.; Stara, V.; Rossi, L.; Glynn, L.; Casey, M.; Burkard, S.; Cherubini, A. Concurrent validation of an index to estimate fall risk in community dwelling seniors through a wireless sensor insole system: A pilot study. Gait Posture 2017, 55, 6–11.
  41. Zhang, Q.; Wang, Y.L.; Xia, Y.; Wu, X.; Kirk, T.V.; Chen, X.D. A low-cost and highly integrated sensing insole for plantar pressure measurement. Sens. Bio-Sens. Res. 2019, 26, 100298.
  42. Aggarwal, A.; Gupta, R.; Agarwal, R. Design and Development of Integrated Insole System for Gait Analysis. In Proceedings of the 2018 Eleventh International Conference on Contemporary Computing (IC3), Noida, India, 2–4 August 2018; pp. 1–5.
  43. Choi, H.S.; Shim, M.; Lee, C.H.; Baek, Y.S. Estimating GRF (Ground Reaction Force) and Calibrating CoP (Center of Pressure) of an Insole Measured by a Low-Cost Sensor with Neural Network. In Proceedings of the 2018 IEEE 18th International Conference on Bioinformatics and Bioengineering (BIBE), Taichung, Taiwan, 29–31 October 2018; pp. 185–188.
  44. Ghaida, H.A.; Mottet, S.; Goujon, J. A real time study of the human equilibrium using an instrumented insole with 3 pressure sensors. In Proceedings of the 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Chicago, IL, USA, 26–30 August 2014; pp. 4968–4971.
  45. Numchaichanakij, A.; Chitsakul, K.; Tretriluxana, S. An insole point pressure monitoring system. In Proceedings of the 4th 2011 Biomedical Engineering International Conference, Chiang Mai, Thailand, 29–31 January 2012; pp. 198–201.
  46. Choffin, Z.; Jeong, N.; Callihan, M.; Olmstead, S.; Sazonov, E.; Thakral, S.; Getchell, C.; Lombardi, V. Ankle Angle Prediction Using a Footwear Pressure Sensor and a Machine Learning Technique. Sensors 2021, 21, 3790. Available online: https://www.mdpi.com/1424-8220/21/11/3790 (accessed on 28 March 2022).
  47. Tekscan. FlexiForce A301 Sensor. Available online: https://www.tekscan.com/products-solutions/force-sensors/a301 (accessed on 3 March 2021).
  48. Adafruit. MicroSD Card Breakout Board+. Available online: https://www.adafruit.com/product/254 (accessed on 28 March 2022).
  49. AAOS. Orthopaedic Surgeon Census, October 2014; American Academy of Orthopaedic Surgeons: Rosemont, IL, USA, 2014.
  50. Tekscan. F-Scan System. Available online: https://www.tekscan.com/products-solutions/systems/f-scan-system (accessed on 25 March 2022).
  51. Tekscan. FlexiForce A401 Sensor. Available online: https://www.tekscan.com/products-solutions/force-sensors/a401 (accessed on 25 March 2022).
  52. Yang, Y.; Pan, H.; Xie, G.; Jiang, Y.; Chen, C.; Su, Y.; Wang, Y.; Tai, H. Flexible piezoelectric pressure sensor based on polydopamine-modified BaTiO3/PVDF composite film for human motion monitoring. Sens. Actuators A Phys. 2020, 301, 111789.
  53. Doshi, S.M.; Thostenson, E.T. Thin and Flexible Carbon Nanotube-Based Pressure Sensors with Ultrawide Sensing Range. ACS Sens. 2018, 3, 1276–1282.
  54. Sharma, S.; Chhetry, A.; Sharifuzzaman, M.; Yoon, H.; Park, J.Y. Wearable Capacitive Pressure Sensor Based on MXene Composite Nanofibrous Scaffolds for Reliable Human Physiological Signal Acquisition. ACS Appl. Mater. Interfaces 2020, 12, 22212–22224.
Figure 1. Schematic of the proposed pressure-sensing system.
Figure 2. (a) Men's size 10.5 shoe with integrated insole pressure system and microcontroller. (b) Women's size 8.5 shoe with insole pressure system and microcontroller.
Figure 3. Diagram showing progression from raw data to classified movement.
Figure 4. Walking motion sensor readings for the left foot.
Figure 5. Running motion sensor readings for the left foot.
Figure 6. Stair ascent sensor readings for the left foot.
Figure 7. Stair descent sensor readings for the left foot.
Figure 8. Stoop with right foot forward: sensor readings for the left foot.
Figure 9. Stoop with left foot forward: sensor readings for the left foot.
Figure 10. Squatting motion sensor data for the left foot.
Figure 11. Pushing a cart: sensor data for the left foot.
Figure 12. Pulling a cart: sensor data for the left foot.
Figure 13. Falling backwards sensor data for the left foot.
Figure 14. Falling forwards sensor data for the left foot.
Figure 15. Falling to the left: sensor data for the left foot.
Figure 16. Falling to the right: sensor data for the right foot.
Figure 17. Movement detection results from 34 subjects' data: True Positive Rate (TPR) and False Negative Rate (FNR).
Table 1. Previous studies on human movement detection.

| Author | Application | Sensor | Number of Sensors Used | Machine-Learning Algorithm | Type of Movements | Accuracy |
|---|---|---|---|---|---|---|
| Crema et al. [5] | Physical movement | IMU | 1 | Linear Discriminant Analysis, Principal Component Analysis | 9 gym exercises (bench press, squats, shoulder press, etc.) | 85% |
| Lu et al. [6] | Physical movement | IMU, Image | 5 | Capsule Networks, Convolutional Long Short-Term Memory (LSTM) | 6 cooking activities (opening fridge, cracking eggs, stirring eggs, pouring oil, pouring bag, stirring big bowl) | 85.8% |
| Lao et al. [10] | Physical movement | Video | 1 | Continuous Hidden Markov Model | Left/right hand pointing, squatting, raising hands overhead, lying | 86% |
| Geng et al. [12] | Physical movement | On-body radio frequency (RF) receivers and transmitters | 5 | SVM | Standing, walking, running, lying, crawling, climbing, and running up stairs | 88.69% |
| Wang et al. [13] | Physical movement | Acoustic | 2 | None | Respiration | None |
| Yun et al. [14] | Physical movement | Infrared | 4 | Bayes Net, Decision Tree, Instance-Based Learning, Multilayer Perceptron, Naïve Bayes, SVM | Walking in different directions | 99.9% |
| Hegde et al. [23] | Physical movement | FSR, accelerometer, and IMU | 13 | Multinomial logistic discrimination | Lying, sitting, standing, walking, driving, stair descent/ascent, cycling, vacuuming, shelving items, dish washing, sweeping, not wearing device | 89% |
| Jeong et al. [24] | Physical movement | FSR | 3 | SVM | Walking, stair ascent/descent | 95.2% |
| Antwi-Afari et al. [26] | Physical movement | Capacitive | 4 | SVM | Lifting, lowering, carrying, standing | 94.4% |
| Sazonov et al. [27] | Physical movement | Accelerometer and FSR | 6 | SVM | Sitting, standing, walking, stair ascent/descent, and cycling | 98% |
| Nguyen et al. [29] | Physical movement | FSR | 5 | SVM | Walking on flat, inclined, or declined surface, stair ascent/descent | 97.8% |
| Leu et al. [30] | Physical movement | Mobile phone | 2 | Decision tree | Six types of falls | 96.57% |
Table 2. Detailed description of all recorded motions.

| Motion Name | Description | Duration (Minutes) |
|---|---|---|
| Falling (split into 4 directions: left, right, forward, backward) | Subject fell a total of eight times onto a full-sized mattress of approximately 7-inch thickness. The falls occurred two times in each of the following directions: forward, backward, on the right side, and on the left side. | 3 |
| Stoop Left | Subject performed a kneeling motion with their left foot forward and then stood back up. This was repeated until ten stoops had been completed. | 2 |
| Pulling a cart backward | Subject walked backwards and pulled the cart five steps. This was repeated once for more data points. | 1 |
| Pushing a cart forward | Subject pushed the cart approximately five steps forward. This was repeated once for more data points. | 1 |
| Stoop Right | Subject performed a kneeling motion with their right foot forward and then stood back up. This was repeated until ten stoops had been completed. | 2 |
| Squatting | Subject stood still, squatted down, and then returned to a standing position. This was repeated ten times. | 2 |
| Descending stairs | Subject naturally descended a standard flight of stairs located at the test site. | 1 |
| Ascending stairs | Subject naturally ascended a standard flight of stairs located at the test site. | 1 |
| Running | Subject jogged down a hallway at the test site (approximately 30 steps), repeating once for more data points. | 1 |
| Walking | Subject walked down the same hallway at the test site. This was repeated once for more data points. | 2 |
Table 3. Participant’s information.
Table 3. Participant’s information.
SubjectSexAgeHeight (Inch)Weight (Lb)Shoe Size (Inch)
1Female215′3″1208.5
2Female215′4″1858.5
3Female215′7″13010.5
4Female215′7″1358.5
5Female415′1″1508.5
6Male215′11″18010.5
7Female215′9″17010.5
8Female215′8″1258.5
9Female215′4″1658.5
10Male216′1″17010.5
11Female205′7″1408.5
12Male245′10″18510.5
13Male215′11″17010.5
14Female205′7″1408.5
15Female295′3″1458.5
16Male235′10″17510.5
17Male216′1″15010.5
18Female215′4″1508.5
19Female235′5″1558.5
20Male196′1″13510.5
21Male215′8″16010.5
22Male446′0″20510.5
23Female225′8″1508.5
24Male225′11″14510.5
25Female225′4″1658.5
26Female215′10″1358.5
27Female205′6″1308.5
28Female215′6″1388.5
29Female205′6″1458.5
30Female215′5″1408.5
31Male226′3″19010.5
32Female205′2″1128.5
33Female215′6″1408.5
34Male216′0″14510.5
Table 4. Machine learning algorithm performance comparison.

| ML Algorithm | Details | Epochs | Training Time | Accuracy |
|---|---|---|---|---|
| SVM | Quadratic kernel function, 1-vs-1 multiclass method | 1000 | 25.1 s | 89.9% |
| Neural Network | Medium NN, one fully connected layer, first layer size of 25 | 1000 | 27.1 s | 89.2% |
| Neural Network | Wide NN, one fully connected layer, first layer size of 100 | 1000 | 34 s | 89.5% |
| k-NN | Weighted, 10 neighbors, Euclidean distance metric, squared inverse distance weight | 1000 | 25.1 s | 90.4% |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
