*2.1. Data Collection and Experimental Protocol*

Data collection can significantly affect the accuracy of UIR. The input signals must be informative enough to discriminate accurately among human gait modes. In this paper, we collect the vertical hip position and thigh angle to indicate the state of the residual limb, and the thigh moment to indicate the user–prosthesis interaction. These signals act as an implicit communication channel between the user and the prosthesis and can therefore be used to infer user intent.

To design and evaluate the performance of the UIR system, we collected these signals from three able-bodied subjects (AB01, AB02, and AB03) and three transfemoral amputee subjects (AM01, AM02, and AM03). All the experiments were approved by the Department of Veterans Affairs Institutional Review Board. The above-knee amputees wore an Ottobock prosthesis on the right leg. Data were collected for able-bodied subjects during four different activity modes: (1) standing (ST), (2) normal walking (NW) at user-preferred speed (PS), (3) slow walking (SW), and (4) fast walking (FW). We asked subjects to walk slower and faster than their normal walking speed for the SW and FW modes, respectively, allowing them to choose comfortable velocities for these two modes. Due to physical limitations, we collected data for only three activity modes for the amputee subjects: ST, NW, and SW. Table 1 shows the physical characteristics of the subjects.


**Table 1.** Physical characteristics of the six human test subjects. AB and AM represent able-bodied and amputee subjects, respectively.

The data were collected at the Motion Study Laboratory of the Cleveland Department of Veterans Affairs Medical Center with 47 reflective markers on each subject's body. Subjects were asked to walk on a treadmill with built-in force sensors. A 16-camera Vicon system (Denver, CO, USA) recorded kinematic data at 100 Hz. Ground reaction forces along three axes were collected from the force sensors at 1000 Hz. Data were filtered with a second-order low-pass filter with a cutoff frequency of 6 Hz. A 3D biomechanical rigid body model was constructed from the marker data, and segmental and joint kinematics (joint displacements) and kinetics (joint moments) were computed as inputs for the UIR system. Detailed methods and sample results can be found in [32]. The experimental setup is illustrated in Figure 2. Note that the lower-limb amputee demographic at the Veterans Affairs Medical Center, where data collection was performed, is dominated by males. Over 98% of veterans who underwent amputation in 2011 were male [33]. Our future work will need to include more subjects and wider demographics (for example, ages and genders).
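The low-pass filtering step above can be sketched as follows. This is a minimal illustration using SciPy's Butterworth filter design; the paper specifies only the filter order (second) and cutoff (6 Hz), so the particular filter family and the zero-phase (forward-backward) application shown here are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_filter(signal, fs, cutoff=6.0, order=2):
    """Second-order low-pass filter with a 6 Hz cutoff, as in the protocol.

    fs is the sampling rate in Hz: 100 Hz for the Vicon kinematic data,
    1000 Hz for the ground reaction forces.
    """
    nyquist = 0.5 * fs
    b, a = butter(order, cutoff / nyquist, btype="low")
    # filtfilt runs the filter forward and backward, giving zero phase lag
    return filtfilt(b, a, signal)

# Example: a slow gait-like component corrupted by high-frequency noise
fs = 100.0
t = np.arange(0, 2.0, 1.0 / fs)
clean = np.sin(2 * np.pi * 1.0 * t)               # 1 Hz component (kept)
noisy = clean + 0.2 * np.sin(2 * np.pi * 30.0 * t)  # 30 Hz noise (removed)
filtered = lowpass_filter(noisy, fs)
```

After filtering, the 30 Hz component is strongly attenuated while the 1 Hz gait-band component passes nearly unchanged.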

Note that, in real-world, non-laboratory settings, we would measure the required input signals directly rather than with cameras. For example, we could use piezoelectric sensors or multi-axis load cells for force sensing, and optical encoders or inertial measurement units for accurate position and angle sensing during stance and swing phases [7,11].

The able-bodied subjects were asked to perform four sequences of walking trials, each lasting approximately 60 s. Each sequence consisted of four different gait modes (ST, SW, NW, and FW), and each mode was maintained for several seconds. Figure 3 illustrates a sample walking trial for AB01. The amputee subjects performed six sequences of three different walking modes, each lasting approximately 30 s.
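The trial structure above can be sketched as a per-sample label vector for one able-bodied sequence. The per-mode durations below are hypothetical (the paper says only that each mode was held for several seconds within a roughly 60 s trial); the mode order follows Figure 3's ST/SW/NW/FW sequence.

```python
import numpy as np

fs = 100  # kinematic sampling rate, Hz

# Hypothetical segment durations for one ~60 s able-bodied trial;
# the actual durations varied by subject and trial.
segments = [("ST", 10), ("SW", 17), ("NW", 17), ("FW", 16)]  # (mode, seconds)

# One gait-mode label per kinematic sample
labels = np.concatenate(
    [np.full(int(dur * fs), mode) for mode, dur in segments]
)
```

A label vector like this pairs each filtered signal sample with its ground-truth gait mode, which is what a supervised UIR classifier trains against.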

In summary, we note a few important points. First, the specific walking activities being recognized are not our main focus; rather, our main goal is to assess the proposed methodology for eliminating irrelevant and redundant features for UIR. Second, in human activity mode recognition applications, an entire stride is typically used for non-real-time classification [34]. In UIR, however, we use a small window of measurement signals, mostly within a few milliseconds, to identify the user's intent for real-time prosthesis control.
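The window-based approach described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the window length, the channel layout (hip position, thigh angle, thigh moment), and the flattening into a single feature vector are assumptions for demonstration.

```python
import numpy as np

def latest_window(signals, window_len):
    """Extract the most recent window_len samples from each channel and
    flatten them into one feature vector for a real-time intent classifier.

    signals: array of shape (n_samples, n_channels); here we assume three
    channels: vertical hip position, thigh angle, and thigh moment.
    """
    window = signals[-window_len:, :]  # only the most recent samples
    return window.ravel()              # flatten to a 1-D feature vector

# Example: a 3-channel stream at 100 Hz; a 5-sample window spans 50 ms
rng = np.random.default_rng(0)
stream = rng.standard_normal((200, 3))   # stand-in for measured signals
features = latest_window(stream, 5)
```

In a real-time loop, this extraction would run every time a new sample arrives, so the classifier always sees the freshest short history rather than a full stride.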

**Figure 2.** Experimental setup for data collection. The **left** figure shows an able-bodied subject and the **right** figure shows an amputee subject with an Ottobock prosthesis on the right leg.

**Figure 3.** Sample walking trial with four different gait modes for able-bodied subject AB01. Although data are available for both legs, we require only one side for gait mode recognition. The data from the two legs look similar because of gait symmetry.
