Document Detail

Inertial sensor-based two feet motion tracking for gait analysis.
MedLine Citation:
PMID:  23628759     Owner:  NLM     Status:  In-Data-Review    
Abstract/OtherAbstract:
Two feet motion is estimated for gait analysis. An inertial sensor is attached on each shoe and an inertial navigation algorithm is used to estimate the movement of both feet. To correct the inter-shoe position error, a camera is installed on the right shoe and infrared LEDs on the left shoe. The proposed system gives key gait analysis parameters such as step length, stride length, foot angle and walking speed. It also gives three dimensional trajectories of the two feet for gait analysis.
Authors:
Tran Nhat Hung; Young Soo Suh
Publication Detail:
Type:  Journal Article     Date:  2013-04-29
Journal Detail:
Title:  Sensors (Basel, Switzerland)     Volume:  13     ISSN:  1424-8220     ISO Abbreviation:  Sensors (Basel)     Publication Date:  2013  
Date Detail:
Created Date:  2013-04-30     Completed Date:  -     Revised Date:  -    
Medline Journal Info:
Nlm Unique ID:  101204366     Medline TA:  Sensors (Basel)     Country:  Switzerland    
Other Details:
Languages:  eng     Pagination:  5614-29     Citation Subset:  IM    
Affiliation:
Department of Electrical Engineering, University of Ulsan, Namgu, Ulsan 680-749, Korea. yssuh@ulsan.ac.kr.

From MEDLINE®/PubMed®, a database of the U.S. National Library of Medicine

Full Text
Journal Information
Journal ID (nlm-ta): Sensors (Basel)
Journal ID (iso-abbrev): Sensors (Basel)
ISSN: 1424-8220
Publisher: Molecular Diversity Preservation International (MDPI)
Article Information
© 2013 by the authors; licensee MDPI, Basel, Switzerland.
Received: 5 March 2013 / Revised: 19 April 2013 / Accepted: 20 April 2013
Collection publication date: May 2013
Electronic publication date: 29 April 2013
Volume: 13 Issue: 5
First Page: 5614 Last Page: 5629
PubMed Id: 23628759
ID: 3690018
DOI: 10.3390/s130505614
Publisher Id: sensors-13-05614

Inertial Sensor-Based Two Feet Motion Tracking for Gait Analysis
Tran Nhat Hung
Young Soo Suh*
Department of Electrical Engineering, University of Ulsan, Namgu, Ulsan 680-749, Korea; E-Mail: hungtn306@gmail.com
* Author to whom correspondence should be addressed; E-Mail: yssuh@ulsan.ac.kr; Tel.: +82-52-259-2186; Fax: +82-52-259-1686.

1.  Introduction

Gait analysis is the systematic study of human walking motion [1]. It is used to evaluate individuals with conditions affecting their ability to walk, for health diagnostics or for rehabilitation.

There are mainly two kinds of systems for gait analysis: outside observation systems and wearable sensor systems. In outside observation systems, a camera [2], sensors on the floor [3] or optical remote sensors [4] are used to observe walking motion. The advantage of outside observation systems is their high accuracy. The disadvantage is that they require a dedicated experiment space, so the walking range is rather limited.

Various wearable sensors [5] are used for gait analysis, including force sensors [6], goniometers [1] and inertial sensors [7,8]. The main advantage of wearable sensor systems is that they do not require a dedicated experiment space. Thus gait analysis can be performed during everyday life, where more natural walking can be observed.

Recently, inertial sensors have received much attention as wearable sensors for gait analysis. There are two types of inertial sensor-based systems. In [7,8], the angles of leg joints are estimated by applying attitude estimation algorithms to inertial sensor data. In [9], an inertial navigation algorithm [10] is used to estimate foot movement. By installing inertial sensors on a shoe, foot motion (position, velocity and attitude) can be estimated quantitatively. Many similar systems [11–13] have also been developed for personal navigation. This paper is closely related to the latter approach, where an inertial navigation algorithm is used to track foot motion.

Inertial navigation algorithm-based foot motion analysis is in most cases [9,11–13] done only for a single foot. However, two feet motion tracking provides more information for gait analysis. In principle, two feet motion tracking can be done by simply attaching an inertial sensor on each shoe instead of on a single shoe. However, the error in the estimated relative position between the left and right foot diverges as time goes by. To maintain an accurate relative position between the two feet, it is necessary to measure the inter-shoe distance.

In [14], two feet motion is estimated in the context of a personal navigation system. In that system, an inertial sensor is attached on each foot and the distance between the two feet is measured using a sonar sensor. Since the system is developed for personal navigation, the main interest is the accurate position estimation of a person.

In this paper, we propose an inertial sensor-based two feet motion tracking system for gait analysis. An inertial sensor unit is installed on each shoe. The relative position and attitude between the two shoes are estimated using a camera on one shoe and infrared LEDs on the other shoe. Using the proposed system, two feet motion (position, velocity and attitude) can be estimated. We note that only the inter-shoe distance (a scalar quantity) is measured in [14], so the relative position between the two feet is only constrained indirectly over many walking steps. This is not a problem in [14], since the goal there is to estimate a person's position. In the proposed system, on the other hand, the relative position and attitude between the two feet can be estimated accurately from the first step, as long as the camera on the right shoe can see the LED landmark on the left shoe.


2.  System Overview

A picture of the proposed system is given in Figure 1. Two IMUs (Xsens MTi) are attached, one on each foot. A USB camera (Point Grey Firefly MV) is attached on the right foot and eight infrared LEDs are attached on the left foot. The movement of the two feet is estimated using an inertial navigation algorithm. The relative position between the two feet is estimated by capturing the LEDs on the left foot with the camera on the right foot.

As can be seen in Figure 1, the sensor unit is rather large, which may affect walking patterns. We note that no particular effort was made to miniaturize the system, since the purpose of this paper is to demonstrate the feasibility of a wearable gait analysis system combining a camera and an inertial sensor unit.

Five coordinate systems are used in the paper (see Figure 2). The three axes of the body 1 (body 2) coordinate system coincide with the three axes of the IMU on the right (left) foot. The origin of the camera coordinate system coincides with the pinhole of the camera. The LED coordinate system is defined as in Figure 2. The navigation coordinate system is used as the reference coordinate system. Its z axis coincides with the local gravity vector and its x axis can be chosen arbitrarily.

A vector p ∈ R3 expressed in the “A” coordinate system is sometimes denoted by [p]A to emphasize that the vector p is expressed in the “A” coordinate system. When there is no danger of confusion, [p]A is simply denoted by p. A symbol CAB is used to denote the rotation matrix from the “A” coordinate system to the “B” coordinate system (for example, [p]B = CAB [p]A). In this paper, the symbols b1, b2, n, c and l denote the body 1, body 2, navigation, camera and LED coordinate systems, respectively.

Let [r1]n ∈ R3 and [r2]n ∈ R3 be the origins of the body 1 and body 2 coordinate systems, respectively; that is, [r1]n and [r2]n denote the positions of the right and left foot in the navigation coordinate system. The objective of this paper is to estimate [r1]n and [r2]n, which are estimated separately using an inertial navigation algorithm. The errors in [r1]n and [r2]n are compensated by computing [r1]n − [r2]n using vision. To compute [r1]n − [r2]n, we introduce some variables in the following.

In Figure 2, [pc]b1 ∈ R3 denotes the origin of the camera coordinate system in the body 1 coordinate system and [pl]b2 denotes the origin of the LED coordinate system in the body 2 coordinate system. Note that pc and Ccb1 are constant since the camera and IMU 1 are rigidly attached on the same shoe. Similarly, pl and Clb2 are also constant.

[λl]b1 and [ρl]c denote the origin of the LED coordinate system in the body 1 and camera coordinate systems, respectively. As a person is walking, λl and ρl are continuously changing. We note that λl and ρl can be estimated when the camera captures the LED image.

From the vector relationship in Figure 2, we have

$$[\lambda_l]_{b_1} = [p_c]_{b_1} + C_c^{b_1}[\rho_l]_c \tag{1}$$

The origin of the LED coordinate system can be expressed in the navigation coordinate system as follows:

$$[r_1]_n + C_{b_1}^{n}[\lambda_l]_{b_1} = [r_2]_n + C_{b_2}^{n}[p_l]_{b_2} \tag{2}$$

Inserting Equation (1) into Equation (2), we have

$$[r_1]_n - [r_2]_n = C_{b_2}^{n}[p_l]_{b_2} - C_{b_1}^{n}[p_c]_{b_1} - C_{b_1}^{n}C_c^{b_1}[\rho_l]_c \tag{3}$$
The relationship (3) is used in Section 3.3.
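For concreteness, Equation (3) is just a few matrix-vector products. Below is a minimal sketch (Python/NumPy; the variable names are ours, not the paper's) of how the inter-foot vector could be computed once the attitude estimates and the calibrated mounting offsets are available:

```python
import numpy as np

def relative_foot_position(C_b2_n, C_b1_n, C_c_b1, p_l_b2, p_c_b1, rho_l_c):
    """Equation (3): [r1]n - [r2]n, the vector between the two feet.

    C_b2_n, C_b1_n: rotations from body 2 / body 1 to navigation frame.
    C_c_b1: constant rotation from the camera frame to body 1.
    p_l_b2, p_c_b1: constant LED / camera mounting offsets.
    rho_l_c: LED frame origin in camera coordinates (from the LED image).
    """
    return C_b2_n @ p_l_b2 - C_b1_n @ p_c_b1 - C_b1_n @ (C_c_b1 @ rho_l_c)
```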


3.  Motion Estimation Algorithm

In this section, an inertial navigation algorithm to estimate two feet motion is given. In Sections 3.1 and 3.2, a basic inertial navigation algorithm using an indirect Kalman filter is given. In Sections 3.3 and 3.4, measurement equations for the Kalman filter are given. In Section 3.5, an implementation issue of the proposed algorithm is discussed.

3.1.  Basic Inertial Navigation Algorithm

Let r̂1 and r̂2 be the estimates of r1 and r2. In this paper, we use the inertial navigation algorithm in [9,15]. We only state the algorithm for r̂1 since the algorithm for r̂2 is exactly the same.

Let υ1 ∈ R3 be the velocity of the right foot and q1 ∈ R4 be the quaternion representing the rotation between the navigation and body 1 coordinate systems. It is standard [10] that q1, r1 and υ1 satisfy the following:

$$\dot{q}_1 = \frac{1}{2}\Omega(\omega_{b_1})q_1, \qquad \dot{\upsilon}_1 = a_{b_1}, \qquad \dot{r}_1 = \upsilon_1 \tag{4}$$
where ωb1 ∈ R3 is the angular rate of the body 1 coordinate system with respect to the navigation coordinate system and ab1 is the external acceleration acting on IMU 1. For a vector ω = [ωx ωy ωz]′ ∈ R3, Ω(ω) is defined by
$$\Omega(\omega) \triangleq \begin{bmatrix} 0 & -\omega_x & -\omega_y & -\omega_z \\ \omega_x & 0 & \omega_z & -\omega_y \\ \omega_y & -\omega_z & 0 & \omega_x \\ \omega_z & \omega_y & -\omega_x & 0 \end{bmatrix}$$

The angular rate ωb1 and external acceleration ab1 are measured using the gyroscopes and accelerometers in IMU 1. Let yg,1 ∈ R3 and ya,1 ∈ R3 be the gyroscope and accelerometer outputs of IMU 1; then yg,1 and ya,1 are given by

$$y_{g,1} = \omega_{b_1} + \upsilon_{g,1} + b_{g,1}, \qquad y_{a,1} = C_n^{b_1}\tilde{g} + a_{b_1} + \upsilon_{a,1} \tag{5}$$
where g̃ ∈ R3 is the earth's gravitational vector and bg,1 ∈ R3 is the gyroscope bias. The measurement noises υg,1 ∈ R3 and υa,1 ∈ R3 are assumed to be white Gaussian noises with covariances Rg,1 and Ra,1, respectively.

Inserting Equation (5) into Equation (4), we obtain the following:

$$\dot{\hat{q}}_1 = \frac{1}{2}\Omega(y_{g,1} - \hat{b}_{g,1})\hat{q}_1, \qquad \dot{\hat{\upsilon}}_1 = (C(\hat{q}_1))'\,y_{a,1} - \tilde{g}, \qquad \dot{\hat{r}}_1 = \hat{\upsilon}_1 \tag{6}$$
where C(q) for a quaternion q = [q0 q1 q2 q3]′ is defined by
$$C(q) = \begin{bmatrix} 2q_0^2+2q_1^2-1 & 2q_1q_2+2q_0q_3 & 2q_1q_3-2q_0q_2 \\ 2q_1q_2-2q_0q_3 & 2q_0^2+2q_2^2-1 & 2q_2q_3+2q_0q_1 \\ 2q_1q_3+2q_0q_2 & 2q_2q_3-2q_0q_1 & 2q_0^2+2q_3^2-1 \end{bmatrix} \tag{7}$$

Variables for the left foot (υ2, q2, ab2, ωb2, yg,2, ya,2, bg,2, υg,2 and υa,2) are defined in the same way as for the right foot. The left foot estimates (q̂2, υ̂2 and r̂2) can be computed using Equation (6) with the left foot variables in place of the right foot variables.
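For readers who prefer code, the following is a minimal Euler-integration sketch of Equations (4)–(7) for one foot, assuming a 100 Hz IMU and g̃ = [0 0 9.81]′; a production implementation would use a more careful discretization:

```python
import numpy as np

def Omega(w):
    """Quaternion rate matrix of the definition following Equation (4)."""
    wx, wy, wz = w
    return np.array([[0.0, -wx, -wy, -wz],
                     [ wx, 0.0,  wz, -wy],
                     [ wy, -wz, 0.0,  wx],
                     [ wz,  wy, -wx, 0.0]])

def C(q):
    """Rotation matrix of Equation (7) (navigation -> body)."""
    q0, q1, q2, q3 = q
    return np.array([
        [2*q0**2 + 2*q1**2 - 1, 2*q1*q2 + 2*q0*q3,     2*q1*q3 - 2*q0*q2],
        [2*q1*q2 - 2*q0*q3,     2*q0**2 + 2*q2**2 - 1, 2*q2*q3 + 2*q0*q1],
        [2*q1*q3 + 2*q0*q2,     2*q2*q3 - 2*q0*q1,     2*q0**2 + 2*q3**2 - 1]])

G_TILDE = np.array([0.0, 0.0, 9.81])  # assumed local gravity vector

def ins_step(q, v, r, y_g, y_a, b_g, T=0.01):
    """One Euler step of Equation (6) for one foot."""
    q = q + 0.5 * (Omega(y_g - b_g) @ q) * T
    q = q / np.linalg.norm(q)              # keep the quaternion unit-norm
    v_next = v + (C(q).T @ y_a - G_TILDE) * T  # remove gravity in nav frame
    r_next = r + v * T
    return q, v_next, r_next
```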

3.2.  Indirect Kalman Filter

Mainly due to measurement noises, r̂1, υ̂1 and q̂1 (the position, velocity and attitude estimates of the right foot) contain errors. These errors are estimated using a Kalman filter. This kind of Kalman filter is called an indirect Kalman filter, since the errors in r̂1, υ̂1 and q̂1 are estimated instead of directly estimating r1, υ1 and q1.

Let re,1, υe,1, qe,1 and be,1 be the errors in r̂1, υ̂1, q̂1 and b̂g,1, which are defined by

$$r_{e,1} = r_1 - \hat{r}_1, \qquad \upsilon_{e,1} = \upsilon_1 - \hat{\upsilon}_1, \qquad q_{e,1} = \hat{q}_1^{*} \otimes q_1, \qquad b_{e,1} = b_{g,1} - \hat{b}_{g,1} \tag{8}$$
where ⊗ is the quaternion multiplication. For a quaternion q, q* denotes the quaternion conjugate of q. Assuming the attitude error is small, qe,1 can be approximated as follows:
$$q_{e,1} = \begin{bmatrix} 1 \\ \bar{q}_{e,1} \end{bmatrix} \in \begin{bmatrix} R \\ R^3 \end{bmatrix} \tag{9}$$
With this assumption, the attitude error can be represented by the three dimensional vector q̄e,1.

The multiplicative attitude error term qe,1 in Equation (8) is commonly used in attitude estimation [16]. Expressing the attitude error equation in (8) in rotation matrix form under assumption (9), we have the following:

$$C(q_1) = C(q_{e,1})C(\hat{q}_1) = (I - 2[\bar{q}_{e,1}\times])C(\hat{q}_1) \tag{10}$$

For the left foot, re,2, υe,2, qe,2 and be,2 are defined similarly. Combining the left and right foot variables, the state of the Kalman filter is defined by

$$x = \begin{bmatrix} \bar{q}_{e,1} \\ r_{e,1} \\ \upsilon_{e,1} \\ \bar{q}_{e,2} \\ r_{e,2} \\ \upsilon_{e,2} \\ b_{e,1} \\ b_{e,2} \end{bmatrix} \in R^{24} \tag{11}$$

The state space equation for one foot is that of a standard inertial navigation algorithm and is given in [9]. The state space equation for two feet is simply the combination of the two and is given by

$$\dot{x}(t) = A(t)x(t) + w(t) \tag{12}$$
where
$$A = \begin{bmatrix} [-(y_{g,1}-\hat{b}_{g,1})\times] & 0 & 0 & 0 & 0 & 0 & -0.5I & 0 \\ 0 & 0 & I & 0 & 0 & 0 & 0 & 0 \\ -2C'(\hat{q}_1)[y_{a,1}\times] & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & [-(y_{g,2}-\hat{b}_{g,2})\times] & 0 & 0 & 0 & -0.5I \\ 0 & 0 & 0 & 0 & 0 & I & 0 & 0 \\ 0 & 0 & 0 & -2C'(\hat{q}_2)[y_{a,2}\times] & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \end{bmatrix}, \qquad w = \begin{bmatrix} -0.5\,\upsilon_{g,1} \\ 0 \\ -C'(\hat{q}_1)\upsilon_{a,1} \\ -0.5\,\upsilon_{g,2} \\ 0 \\ -C'(\hat{q}_2)\upsilon_{a,2} \\ \upsilon_{b,1} \\ \upsilon_{b,2} \end{bmatrix}$$
The noises υb,1 and υb,2 are introduced to represent a slow change in the bias terms. In the definition of A, the symbol [p×] for a vector p = [p1 p2 p3]′ ∈ R3 is defined by
$$[p\times] \triangleq \begin{bmatrix} 0 & -p_3 & p_2 \\ p_3 & 0 & -p_1 \\ -p_2 & p_1 & 0 \end{bmatrix}$$
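The block structure of A(t) is perhaps easiest to see in code. The sketch below fills the nonzero blocks of the 24 × 24 matrix for the state ordering of Equation (11) (the helper names are ours, not the paper's):

```python
import numpy as np

def skew(p):
    """The matrix [p x] defined after Equation (12)."""
    return np.array([[0.0, -p[2], p[1]],
                     [p[2], 0.0, -p[0]],
                     [-p[1], p[0], 0.0]])

def system_matrix(y_g1, b_g1, y_a1, C1, y_g2, b_g2, y_a2, C2):
    """Nonzero blocks of the 24 x 24 matrix A(t) in Equation (12).
    C1 = C(q_hat_1) and C2 = C(q_hat_2) are the current rotation
    matrices from Equation (7)."""
    A = np.zeros((24, 24))
    I3 = np.eye(3)
    A[0:3, 0:3] = -skew(y_g1 - b_g1)        # attitude error, foot 1
    A[0:3, 18:21] = -0.5 * I3               # gyro bias coupling, foot 1
    A[3:6, 6:9] = I3                        # position error <- velocity error
    A[6:9, 0:3] = -2.0 * C1.T @ skew(y_a1)  # velocity <- attitude error
    A[9:12, 9:12] = -skew(y_g2 - b_g2)      # same three blocks for foot 2
    A[9:12, 21:24] = -0.5 * I3
    A[12:15, 15:18] = I3
    A[15:18, 9:12] = -2.0 * C2.T @ skew(y_a2)
    return A
```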

There are two measurement equations for the state x(t). One is from the vision data (Section 3.3). The other (Section 3.4) is derived from the facts that the velocity of a foot is zero and its z axis value stays constant while the foot is on the flat floor.

3.3.  Measurement Equation from the Vision Data

This section explains how the vision data is used in the Kalman filter.

There are eight infrared LEDs on the left foot as in Figure 3. A number is assigned to each LED. These LEDs are captured using the camera on the right foot. To simplify the image processing algorithm, an infrared filter is placed in front of the camera.

The typical infrared LED images during walking are given in Figure 4. A simple image processing algorithm can be used to obtain the center points of infrared LEDs.

Let the coordinates of the LEDs in the LED coordinate system be [ledi]l ∈ R3 (1 ≤ i ≤ 8). Let [ui υi]′ ∈ R2 be the image coordinates of the eight LEDs on the normalized image plane, which are obtained by applying the camera calibration parameters [17] to the pixel coordinates of the eight LEDs. [ledi]l and [ui υi]′ satisfy the following relationship:

$$s_i \begin{bmatrix} u_i \\ \upsilon_i \\ 1 \end{bmatrix} = C_l^{c}[led_i]_l + [\rho_l]_c \tag{13}$$
where si is a scaling factor. It is known that ρl, Clc and si can be computed if four or more LEDs are visible. We used the algorithm in [18] to compute ρl, Clc and si; only ρl is used in the Kalman filter measurement equation.

To use Equation (13), we need to identify the LED numbers from the LED image. In the general case, where the LEDs can rotate freely, it is impossible to identify the LED numbers uniquely. In our case, however, the LEDs are attached on a shoe and the rotation is rather limited by the mechanical structure of the ankle. Thus it is not difficult to identify the LED numbers from images such as those in Figure 4.
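The paper computes the pose with the algorithm of [18]. As a rough stand-in, a generic PnP solver recovers the same quantities; the sketch below uses OpenCV's solvePnP, and the LED layout is purely hypothetical:

```python
import numpy as np
import cv2  # OpenCV; solvePnP used here as a stand-in for the algorithm in [18]

# Hypothetical LED coordinates [led_i]_l in meters; the real layout
# depends on how the eight LEDs are mounted on the left shoe.
LED_POINTS_L = np.array([
    [0.00, 0.00, 0.00], [0.04, 0.00, 0.00], [0.08, 0.00, 0.00],
    [0.08, 0.03, 0.01], [0.08, 0.06, 0.00], [0.04, 0.06, 0.00],
    [0.00, 0.06, 0.00], [0.00, 0.03, 0.01]])

def led_pose(image_points, camera_matrix, dist_coeffs):
    """Solve Equation (13) for C_l^c and [rho_l]_c given the pixel
    coordinates (8 x 2 array, in LED-number order) of the LEDs."""
    ok, rvec, tvec = cv2.solvePnP(LED_POINTS_L, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None, None
    C_l_c, _ = cv2.Rodrigues(rvec)  # rotation: LED frame -> camera frame
    rho_l_c = tvec.reshape(3)       # LED frame origin in camera coordinates
    return C_l_c, rho_l_c
```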

Let ρ̂l denote the estimate of ρl computed by the algorithm in [18]:

$$\rho_l = \hat{\rho}_l + \upsilon_{vision} \tag{14}$$
where υvision denotes the estimation error in ρ̂l.

Inserting Equations (8), (10) and (14) into Equation (3), we have

$$(\hat{r}_1 + r_{e,1}) - (\hat{r}_2 + r_{e,2}) = \big((I - 2[\bar{q}_{e,2}\times])\hat{C}_n^{b_2}\big)' p_l - \big((I - 2[\bar{q}_{e,1}\times])\hat{C}_n^{b_1}\big)' p_c - \big((I - 2[\bar{q}_{e,1}\times])\hat{C}_n^{b_1}\big)' C_c^{b_1}(\hat{\rho}_l + \upsilon_{vision}) \tag{15}$$

Assuming q̄e,1, q̄e,2 and υvision are small, we can ignore the product terms in Equation (15):

$$\hat{r}_1 - \hat{r}_2 - \hat{C}_{b_2}^{n} p_l + \hat{C}_{b_1}^{n} p_c + \hat{C}_{b_1}^{n} C_c^{b_1}\hat{\rho}_l = -2\hat{C}_{b_2}^{n}[p_l\times]\bar{q}_{e,2} + 2\hat{C}_{b_1}^{n}[p_c\times]\bar{q}_{e,1} - \hat{C}_{b_1}^{n} C_c^{b_1}\upsilon_{vision} + 2\hat{C}_{b_1}^{n}[(C_c^{b_1}\hat{\rho}_l)\times]\bar{q}_{e,1} - r_{e,1} + r_{e,2} \tag{16}$$

The left hand side of Equation (16) is denoted by zvision ∈ R3 and is used as the measurement in the Kalman filter:

$$z_{vision} \triangleq \hat{r}_1 - \hat{r}_2 - \hat{C}_{b_2}^{n} p_l + \hat{C}_{b_1}^{n} p_c + \hat{C}_{b_1}^{n} C_c^{b_1}\hat{\rho}_l$$
In matrix form, Equation (16) can be written as follows:
$$z_{vision} = H_{vision}\,x + \upsilon_{vision} \tag{17}$$
where υvision is the measurement noise and
$$H_{vision} = \begin{bmatrix} 2\hat{C}_{b_1}^{n}[((C_c^{b_1}\hat{\rho}_l) + p_c)\times] & -I & 0 & -2\hat{C}_{b_2}^{n}[p_l\times] & I & 0 & 0 & 0 \end{bmatrix} \in R^{3\times 24}$$

Whenever the camera on the right foot captures the LEDs on the left foot, Equation (17) can be used as a measurement equation.
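A sketch of the corresponding measurement construction is given below; the column blocks follow the state ordering of Equation (11), with the two trailing bias blocks left zero:

```python
import numpy as np

def _skew(p):
    return np.array([[0.0, -p[2], p[1]],
                     [p[2], 0.0, -p[0]],
                     [-p[1], p[0], 0.0]])

def vision_measurement(r1_hat, r2_hat, C_b1_n, C_b2_n, C_c_b1,
                       p_c, p_l, rho_l_hat):
    """z_vision and H_vision of Equation (17)."""
    z = (r1_hat - r2_hat - C_b2_n @ p_l + C_b1_n @ p_c
         + C_b1_n @ C_c_b1 @ rho_l_hat)
    H = np.zeros((3, 24))
    H[:, 0:3] = 2.0 * C_b1_n @ _skew(C_c_b1 @ rho_l_hat + p_c)  # q_e,1
    H[:, 3:6] = -np.eye(3)                                      # r_e,1
    H[:, 9:12] = -2.0 * C_b2_n @ _skew(p_l)                     # q_e,2
    H[:, 12:15] = np.eye(3)                                     # r_e,2
    return z, H
```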

3.4.  Measurement Equations from Zero Velocity and Flat Floor Assumptions

During normal walking, a foot touches the floor almost periodically for a short interval. During this short interval the velocity of the foot is zero, and the interval is called a “zero velocity interval”.

The zero velocity interval is detected using accelerometers and gyroscopes [19]. In this paper, the detection method in [9] is used: a foot is assumed to be in a zero velocity interval if the change in the accelerometer output is small and the gyroscope values are small. The zero velocity intervals are detected separately for the left and right foot.
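A minimal detector in this spirit might look as follows; the window length and the two thresholds are hypothetical values that must be tuned to the actual sensors:

```python
import numpy as np

def zero_velocity_intervals(y_a, y_g, acc_var_th=0.5, gyro_th=0.5, win=5):
    """Flag samples where a foot is assumed stationary: small variation
    of the accelerometer magnitude over a short window and small
    angular rate, in the spirit of the rule in [9].
    y_a, y_g: (N, 3) accelerometer / gyroscope logs."""
    N = len(y_a)
    still = np.zeros(N, dtype=bool)
    for k in range(win, N):
        acc_change = np.var(np.linalg.norm(y_a[k - win:k + 1], axis=1))
        gyro_mag = np.linalg.norm(y_g[k])
        still[k] = (acc_change < acc_var_th) and (gyro_mag < gyro_th)
    return still
```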

We assume that a person is walking on a flat floor. Thus, the z axis value of a foot in the navigation coordinate system returns to a constant during the zero velocity interval (when the foot is on the floor). Using both the zero velocity and the flat floor assumptions, the measurement equation for the zero velocity interval of the right foot is given by

$$\begin{bmatrix} 0 - \hat{\upsilon}_1 \\ z_{1,floor} - [0\ 0\ 1]\,\hat{r}_1 \end{bmatrix} = H_1 x + \upsilon_{zero,1} \tag{18}$$
where z1,floor is the z axis value when the right foot is on the floor and
$$H_1 \triangleq \begin{bmatrix} 0_{3\times3} & 0_{3\times3} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{1\times3} & [0\ 0\ 1] & 0_{1\times3} & 0_{1\times3} & 0_{1\times3} & 0_{1\times3} & 0_{1\times3} & 0_{1\times3} \end{bmatrix}$$

The measurement equation for the zero velocity interval of the left foot is given by

$$\begin{bmatrix} 0 - \hat{\upsilon}_2 \\ z_{2,floor} - [0\ 0\ 1]\,\hat{r}_2 \end{bmatrix} = H_2 x + \upsilon_{zero,2} \tag{19}$$
where z2,floor is defined similarly to z1,floor and
$$H_2 \triangleq \begin{bmatrix} 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{1\times3} & 0_{1\times3} & 0_{1\times3} & 0_{1\times3} & [0\ 0\ 1] & 0_{1\times3} & 0_{1\times3} & 0_{1\times3} \end{bmatrix}$$
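Since Equations (18) and (19) share one structure, a single sketch can build either measurement; the column offsets again follow the state ordering of Equation (11):

```python
import numpy as np

def zero_velocity_measurement(v_hat, r_hat, z_floor, foot):
    """Residual and H matrix of Equation (18) (foot=1) or (19) (foot=2)."""
    z = np.concatenate([0.0 - v_hat, [z_floor - r_hat[2]]])
    H = np.zeros((4, 24))
    vel_col = 6 if foot == 1 else 15   # v_e,1 or v_e,2 block
    pos_col = 3 if foot == 1 else 12   # r_e,1 or r_e,2 block
    H[0:3, vel_col:vel_col + 3] = np.eye(3)
    H[3, pos_col + 2] = 1.0            # z component of the position error
    return z, H
```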

3.5.  Kalman Filter Implementation

Here the implementation of the indirect Kalman filter is briefly explained; a detailed explanation for a similar problem can be found in [20]. All computations are done in discrete time with sampling period T = 0.01 s. The discrete time index k is used as usual; for example, r1,k denotes the sampled value of the continuous time signal r1 at time kT.

The procedure to estimate q1,k, υ1,k, r1,k, q2,k, υ2,k and r2,k is as follows (a code sketch of one filter cycle follows the list):

  • q̂1,k, υ̂1,k, r̂1,k, q̂2,k, υ̂2,k and r̂2,k are computed using the discretized form of Equation (6).
  • The time update step [21] of the Kalman filter using Equation (12) is performed.
  • The measurement update step using Equations (17)–(19) is performed to compute the error state estimate x̂.
  • Using x̂, the estimates r̂1,k, υ̂1,k, q̂1,k and b̂g,1 are updated as follows:
    $$\hat{r}_{1,k} = \hat{r}_{1,k} + \hat{r}_{e,1,k}, \qquad \hat{\upsilon}_{1,k} = \hat{\upsilon}_{1,k} + \hat{\upsilon}_{e,1,k}, \qquad \hat{b}_{g,1,k} = \hat{b}_{g,1,k} + \hat{b}_{e,1,k}, \qquad \hat{q}_{1,k} = \hat{q}_{1,k} \otimes \hat{q}_{e,1,k}$$
  • Similarly, r̂2,k, υ̂2,k, q̂2,k and b̂g,2 are updated.
  • After the update, x̂ is set to a zero vector.
  • The discrete time index k is increased and the procedure is repeated.
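The sketch below implements one such cycle (time update plus whichever measurement updates are available); the discretization Φ ≈ I + AT is a common first-order choice, not necessarily the one used by the authors:

```python
import numpy as np

def kalman_cycle(P, A, Q, T, measurements):
    """One cycle of the indirect Kalman filter of Section 3.5.
    measurements: list of (z, H, R) tuples from Equations (17)-(19);
    the list is empty when neither vision nor zero velocity data are
    available. Returns the estimated error state x_hat (to be fed back
    into the navigation states and then reset to zero) and P."""
    Phi = np.eye(24) + A * T            # first-order discretization of (12)
    P = Phi @ P @ Phi.T + Q * T         # time update
    x_hat = np.zeros(24)
    for z, H, R in measurements:        # sequential measurement updates
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x_hat = x_hat + K @ (z - H @ x_hat)
        P = (np.eye(24) - K @ H) @ P
    return x_hat, P
```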


4.  Smoother

In Figure 5, a typical two feet movement during walking is illustrated in the navigation coordinate system. Suppose the right foot is on the floor in the area around (b). As the right foot takes off from the floor (the (b)–(d) area), the left foot touches the floor in the area around (e). From the configuration of the camera, LED images are available in the (c)–(d) interval.

For the left foot, measurement data are available in the area around (a) (zero velocity update) and in the (c)–(d) interval (vision data update). When measurement data are not available, the motion estimation depends on double integration of acceleration, whose error tends to grow quickly even over a short time. To get a smooth motion trajectory, a forward-backward smoother (Section 8.5 in [21]) is applied.

A smoother algorithm is applied to each walking step separately for the left and right foot movements. For example, consider the left foot movement between (a) and (e). After running the forward Kalman filter (that is, the filter in Section 3.2) up to point (e), the backward Kalman filter is computed from (e) to (a) with the final value of the forward Kalman filter as its initial value. Since the final value of the forward filter is used in the backward filter, the two filters become correlated and the smoother is not optimal. However, we found that the smoothed output is good enough for our application.

Note that r̂2,k is the position of the left foot computed by the forward filter in Section 3.2. Let r̂2,b,k be the position of the left foot computed by a backward Kalman filter. The two values r̂2,k and r̂2,b,k are combined using simple weighting functions w2,f,k and w2,b,k as follows:

$$\hat{r}_{2,s,k} = \frac{w_{2,b,k}}{w_{2,f,k} + w_{2,b,k}}\,\hat{r}_{2,k} + \frac{w_{2,f,k}}{w_{2,f,k} + w_{2,b,k}}\,\hat{r}_{2,b,k} \tag{20}$$
The weighting functions w2,f,k and w2,b,k are given by
$$w_{2,f,k} = \alpha\beta^{\,k-M_1}, \qquad w_{2,b,k} = \alpha\beta^{\,M_2-k}, \qquad \beta > 1 \tag{21}$$
where [M1, M2] is the discrete time interval of one walking step. Consider one walking step from (a) to (e) in Figure 5. With the weighting functions in Equation (21), r̂2,s,k ≈ r̂2,k near position (a) (that is, near discrete time M1) and r̂2,s,k ≈ r̂2,b,k near position (e). Thus the weighting functions in Equation (21) provide a simple way to combine the forward and backward filters.

A smoother algorithm can be applied to the velocity and attitude similarly.
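As a sketch, the per-step blending of Equations (20) and (21) reduces to a few lines (the values of α and β are hypothetical tuning choices):

```python
import numpy as np

def smooth_step(r_fwd, r_bwd, M1, M2, alpha=1.0, beta=1.05):
    """Equations (20)-(21): blend forward and backward position
    estimates over one walking step with sample indices M1..M2.
    r_fwd, r_bwd: (M2 - M1 + 1, 3) arrays of positions."""
    k = np.arange(M1, M2 + 1)
    w_f = alpha * beta ** (k - M1)
    w_b = alpha * beta ** (M2 - k)
    w_sum = (w_f + w_b)[:, None]
    # Near M1 the forward estimate dominates; near M2 the backward one.
    return (w_b[:, None] * r_fwd + w_f[:, None] * r_bwd) / w_sum
```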


5.  Experiments

To verify the proposed system, a person walked on the floor and the two feet motion was estimated using the proposed algorithm. The estimated two feet trajectories on the xy plane of the navigation coordinate system are given in Figure 6. Since the x direction of the navigation coordinate system can be chosen arbitrarily, the trajectories are rotated so that the walking direction coincides with the x axis. The left foot trajectory is the upper one and the right foot trajectory is the lower one. Zero velocity intervals are indicated with diamond symbols. Rectangle symbols indicate positions where vision data are available (that is, the LEDs on the left foot can be seen from the camera on the right foot).

In the time domain, the relationship between zero velocity intervals and vision data available intervals is given in Figure 7. As illustrated in Figure 5, vision data are available between the right foot zero velocity intervals and the left foot zero velocity intervals during walking.

Three dimensional trajectories are given in Figure 8. There is a difference between the left and right foot motion patterns. This is due to the difference in the positions of the inertial sensors: the inertial sensor unit is on the front of the right foot and on the back of the left foot (see Figure 1).

In addition to trajectories, attitude and velocity are also available from the inertial navigation algorithm. For example, estimated attitude (in Euler angles) of the left foot is given in Figure 9.

Thus we can obtain key gait analysis parameters such as step length, stride length, foot angle and walking speed using the proposed system.
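For instance, once the footprint positions during the zero velocity intervals are available, the basic parameters reduce to simple geometry. The following hypothetical post-processing sketch assumes walking roughly along the x axis, as in Figure 6:

```python
import numpy as np

def gait_parameters(left_prints, right_prints, duration_s):
    """Basic gait parameters from the estimated footprint positions
    (foot positions during zero velocity intervals, projected on the
    xy plane and ordered in time)."""
    # Stride length: distance between consecutive prints of the same foot.
    stride_left = np.linalg.norm(np.diff(left_prints, axis=0), axis=1)
    stride_right = np.linalg.norm(np.diff(right_prints, axis=0), axis=1)
    # Step length: forward distance between successive opposite-foot
    # prints, obtained by interleaving all prints along the x axis.
    all_x = np.sort(np.concatenate([left_prints[:, 0], right_prints[:, 0]]))
    step = np.diff(all_x)
    # Average walking speed over the recorded walk.
    speed = (all_x[-1] - all_x[0]) / duration_s
    return stride_left, stride_right, step, speed
```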

Now the accuracy of the proposed system is evaluated. First, we test the accuracy of the vision-based position estimation, which is used to estimate the vector between the two feet. The left shoe is placed at different positions on a grid while the right shoe stays at a fixed position. The estimated left shoe position with respect to the right shoe is compared with the true value obtained from the grid. The result is given in Figure 10, where the position represents the origin of the body 2 coordinate system in the body 1 coordinate system. We can see that the position can be accurately estimated using the proposed system (eight infrared LEDs). The mean error distance is 0.4 cm and the maximum error distance is 0.8 cm.

The next task is to evaluate the accuracy of the trajectories. A person walked on long white paper with marker pens attached to both shoes, so that dots are marked on the paper whenever a foot touches the floor. The marked dot positions are measured with a ruler and taken as true values. The estimated positions during zero velocity intervals (when one foot is on the floor) are compared with the marked dots. The result for one step is given in Figure 11.

A person walked 33 steps and the errors between the estimated positions and the marked positions are given in Figure 12 for each step. The estimated step length is given in Figure 13. The mean errors are 2.2 cm for the left foot and 2.1 cm for the right foot. The maximum errors are 3.6 cm for the left foot and 3.89 cm for the right foot. Two more experiments were done; the mean errors are 2.5 cm and 1.2 cm for the left foot and 2.5 cm and 1.7 cm for the right foot, and the maximum errors are 4.1 cm and 2.8 cm for the left foot and 3.9 cm and 5.4 cm for the right foot. An error at the 2 cm level is too large for kinetic calculations. However, the proposed system is suitable for gait analysis requiring basic gait parameters such as step length and walking speed.


6.  Conclusions

Using inertial sensors on shoes, two feet motion is estimated with an inertial navigation algorithm. When two feet motion is estimated, it is necessary to measure the relative position between the two feet. In the proposed system, a vision system is used to measure the relative position and attitude between the two feet.

Using the proposed system, we can obtain quantitative gait analysis parameters such as step length, stride length, foot angle and walking speed. We can also see the three dimensional trajectories of the two feet, which give qualitative information for gait analysis.

The accuracy of the proposed system is evaluated by measuring the position of a foot when a foot touches the floor. The mean position error is 1.2–2.5 cm and the maximum position error is 5.4 cm. For gait analysis, we believe the error is in an acceptable range.

The main contribution of the proposed system is that two feet motion can be observed at any place as long as the floor is flat. Commercial camera-based motion tracking systems such as Vicon require a dedicated experiment space. Thus we believe more natural walking patterns can be observed using the proposed system.


This work was supported by the 2013 Research Fund of University of Ulsan.


Conflict of Interest

The authors declare no conflict of interest.


References
1. Perry, J. Gait Analysis: Normal and Pathological Function; SLACK Incorporated: Thorofare, NJ, USA, 1992.
2. Karaulova, I.A.; Hall, P.M.; Marshall, A.D. Tracking people in three dimensions using a hierarchical model of dynamics. Image Vision Comput. 2002, 20, 691–700.
3. Yun, J. User identification using gait patterns on UbiFloorII. Sensors 2011, 11, 2611–2639.
4. Teixido, M.; Palleja, T.; Tresanchez, M.; Nogues, M.; Palacin, J. Measuring oscillating walking paths with a LIDAR. Sensors 2011, 11, 5071–5086.
5. Zhang, B.; Jiang, S.; Wei, D.; Marschollek, M.; Zhang, W. State of the Art in Gait Analysis Using Wearable Sensors for Healthcare Applications. In Proceedings of the 2012 IEEE/ACIS 11th International Conference on Computer and Information Science (ICIS), Shanghai, China, 30 May 2012; pp. 213–218.
6. Kappel, S.L.; Rathleff, M.S.; Hermann, D.; Simonsen, O.; Karstoft, H.; Ahrendt, P. A novel method for measuring in-shoe navicular drop during gait. Sensors 2012, 12, 11697–11711.
7. Tao, W.; Liu, T.; Zheng, R.; Feng, H. Gait analysis using wearable sensors. Sensors 2012, 12, 2255–2283.
8. Schepers, H.M.; Koopman, H.F.J.M.; Veltink, P.H. Ambulatory assessment of ankle and foot dynamics. IEEE Trans. Biomed. Eng. 2007, 54, 895–902.
9. Do, T.N.; Suh, Y.S. Gait analysis using floor markers and inertial sensors. Sensors 2012, 12, 1594–1611.
10. Titterton, D.H.; Weston, J.L. Strapdown Inertial Navigation Technology; Peter Peregrinus Ltd.: Reston, VA, USA, 1997.
11. Foxlin, E. Pedestrian tracking with shoe-mounted inertial sensors. IEEE Comput. Graph. Appl. 2005, 25, 38–46.
12. Ojeda, L.; Borenstein, J. Non-GPS navigation for security personnel and first responders. J. Navig. 2007, 60, 391–407.
13. Bebek, O.; Suster, M.A.; Rajgopal, S.; Fu, M.J.; Xuemei, H.; Cavusoglu, M.C.; Young, D.J.; Mehregany, M.; van den Bogert, A.J.; Mastrangelo, C.H. Personal navigation via high-resolution gait-corrected inertial measurement units. IEEE Trans. Instrum. Meas. 2010, 59, 3018–3027.
14. Kelly, A. Personal Navigation System Based on Dual Shoe-Mounted IMUs and Intershoe Ranging. In Proceedings of the Precision Personnel Locator Workshop 2011, Worcester, MA, USA, 1–2 August 2011.
15. Suh, Y.S.; Phuong, N.H.Q.; Kang, H.J. Distance estimation using inertial sensor and vision. Int. J. Control Autom. Syst. 2013, 11, 211–215.
16. Markley, F.L. Multiplicative vs. Additive Filtering for Spacecraft Attitude Determination. In Proceedings of the 6th Cranfield Conference on Dynamics and Control of Systems and Structures in Space, Riomaggiore, Italy, 18–22 July 2004; pp. 467–474.
17. Forsyth, D.A.; Ponce, J. Computer Vision: A Modern Approach; Prentice Hall: New York, NY, USA, 2003.
18. Lu, C.P.; Hager, G.D.; Mjolsness, E. Fast and globally convergent pose estimation from video images. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 610–622.
19. Peruzzi, A.; Croce, U.D.; Cereatti, A. Estimation of stride length in level walking using an inertial measurement unit attached to the foot: A validation of the zero velocity assumption during stance. J. Biomech. 2011, 44, 1991–1994.
20. Suh, Y.S. A smoother for attitude and position estimation using inertial sensors with zero velocity intervals. IEEE Sens. J. 2012, 12, 1255–1262.
21. Brown, R.G.; Hwang, P.Y.C. Introduction to Random Signals and Applied Kalman Filtering; John Wiley & Sons: New York, NY, USA, 1997.

Article Categories:
  • Article

Keywords: gait analysis, inertial navigation system, inertial sensors, computer vision, Kalman filter.
