3.2. System Operation and Interface
While the software is running, the system can be accessed through a web interface using a notebook computer or tablet browser. From the system user’s perspective, after logging in, users see the options corresponding to their profile, which can be administrator or session coordinator. It is important to clarify the difference between a system user and a training session participant. Only the former has an account to perform actions in the system, such as running analyses or consulting records. The latter corresponds to each person taking part in a training session who, with consent, is monitored by the system through data collected from wearable sensors.
Less frequently used, the administrator profile is dedicated to maintenance operations and to user account management, which involves creating or removing user accounts and resetting a user’s password. However, each user can also change their own password autonomously. The most common access is performed with the session coordinator profile, a type of user responsible for creating and monitoring training sessions.
After logging in, the home page shows a list of sessions held and a list of participants as illustrated in
Figure 1. For each known participant in a training session, a record is kept with minimal personal data, but enough to distinguish them by name from other participants during monitoring. The participant’s threshold HR values are stored in this record to support monitoring and the calculation of personalized effort zones.
By opening the session details in the first grid, it is possible to consult an information summary, including the session’s start and end times, the participants, and their sensors. Additionally, graphs displaying the participant’s HR and HR zones can be viewed.
Figure 2 illustrates the second option, with a graphical representation of the HR zones over time (in dots) and the desired target zone (if a single line) or the zone thresholds (if two lines, as at the beginning of
Figure 2a, where the goal is an HR zone between 2 and 5). We can see that at 17:40 the first participant was within the target zone, but remained below the minimum intended intensity for almost the entire time. On the other hand,
Figure 2b indicates a good correspondence between the effort level of the second participant and the intended intensity over time.
To create a new session, the coordinator defines the session name. Some fields, such as the session coordinator and the date, are automatically filled in. Afterward, it is necessary to assign sensors to participants as illustrated in
Figure 3. On the left (
Figure 3a), we can see how the system lists the names of the available HR sensors, meaning that they have been automatically detected and have not yet been assigned to anyone. Each sensor is then associated with a participant who does not yet have one, as shown in the middle image (
Figure 3b). The resting and maximum HR values for that participant are pre-filled with default values from the participant’s record but can be adjusted at this point and for this session’s scope. Once all participants are registered, a confirmation grid displays all sensor/participant pairs, including a preview of each sensor’s data as shown in the second column of
Figure 3c.
The coordinator can now start the session, at which point the system enters the exercise monitoring mode and begins collecting and storing data from the sensors. At this stage, the system displays a monitoring dashboard, showing real-time HR data from each participant, along with a bar graph representing the HR trends over the last 30 s as depicted in
Figure 4. This gives the coordinator an accurate and easily readable view of the training intensity for each person in the group. When certain customized, rule-defined circumstances occur, alerts are shown to immediately notify the coordinator, or the group if they are watching the panel. Alerts are visual, with changes that stand out from the regular panel, and can optionally include audible beeps. The typical alert criteria are an observed effort level that is too high, or one that diverges from the target training intensity. At the top right of
Figure 4, we see an example of an alert indicating intensity above the limit for participant Carlos.
In this same example, the time displayed below the participant’s name shows the most recent timestamp of data received from his sensor, helping to confirm data recency and the communication status.
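The alert criteria described above can be sketched as a simple rule check per participant. The function below is a minimal illustration of that kind of rule set; the names, zone scale, and safety threshold are our assumptions, not the system’s actual API.

```python
def check_alerts(hr_zone, target_min, target_max, hard_limit_zone=9.0):
    """Illustrative alert rules for one participant.

    hr_zone: current HR zone value (e.g., 5.1); target_min/target_max:
    the session's intended zone range; hard_limit_zone: a safety ceiling.
    All names and thresholds here are hypothetical examples.
    """
    alerts = []
    if hr_zone > hard_limit_zone:
        # the "intensity above the limit" case shown for participant Carlos
        alerts.append("INTENSITY ABOVE LIMIT")
    elif hr_zone > target_max:
        alerts.append("above target zone")
    elif hr_zone < target_min:
        alerts.append("below target zone")
    return alerts
```

In the real dashboard, such a check would run on every incoming reading and drive the visual (and optional audible) notifications.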
For interval training, or sessions with variable intensity, the coordinator has a panel to adjust the desired intensity levels over time as shown in
Figure 5. The buttons on the right (“Target Zone”) are used to increase or decrease the limits for the ideal HR zone in a given phase of the activity.
In this operation mode, the monitoring dashboard will also display an indicator for the participant’s HR zone, calculated based on their specific parameters. This zone indicator allows us to compare the effort level between participants.
Figure 6 shows the monitoring panel including the HR zone (to the left, above the HR) across three scenarios: (a) the participant’s effort is below the target intensity; (b) the participant’s zone is within the intended range; (c) the HR zone exceeds the intended range.
On the right, the HR zone level 5.1 means that the HR value (102) falls in the fifth decile of the range between 48 and 178, the resting and maximum HR values for that participant.
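The zone indicator in this example can be reconstructed as the decile of the participant’s heart-rate reserve. The sketch below matches the worked example (HR 102 with resting 48 and maximum 178 yields 5.1), but the exact formula used by the system may differ from this reading.

```python
def hr_zone(hr, hr_rest, hr_max):
    """Decile-based HR zone label from the heart-rate reserve.

    The fraction (hr - hr_rest) / (hr_max - hr_rest) is mapped to its
    decile (1-10), with one decimal digit for the position inside it.
    This is a reconstruction of the example in the text, not the
    system's confirmed formula.
    """
    frac = (hr - hr_rest) / (hr_max - hr_rest)
    frac = min(max(frac, 0.0), 0.999)   # clamp so labels stay in 1.0-10.9
    decile = int(frac * 10) + 1          # e.g., frac ~ 0.415 -> 5th decile
    tenth = int(frac * 100) % 10         # first digit inside that decile
    return f"{decile}.{tenth}"
```

Because the zone is normalized by each participant’s own resting and maximum HR, the same zone value corresponds to a comparable relative effort across participants.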
Throughout the session, notes can be entered regarding general occurrences, user-specific remarks, or any complications. These textual records support the subsequent analysis conducted by the session coordinator or other experts.
At the end of the physical activity, the session is marked as closed. The system records the end time and stops storing the data streams coming from the sensors. After that, participants may remove the sensors. The coordinator will see a session summary page that includes the session duration, the coordinator’s name, the participants involved, and which sensors were used as illustrated in
Figure 7.
Coordinator users with an active account can consult previously created session records to analyze a participant’s history and, if needed, adjust the training plan for the next session.
In the participant details view (accessible from the participants list on the home page), we can see which sessions the participant has attended and access graphs showing their corresponding HR data. Optionally, if there is an additional large monitor available for visual monitoring for the entire group, the system offers a “Room View” option, which opens a new browser window that can be placed on this second monitor. The same dashboard is shown but without the operational control buttons. This setup also allows the session coordinator to write notes in a separate control window, ensuring greater privacy. To further increase visibility during session monitoring, it is also possible to enlarge the interface using the browser’s zoom controls.
During a training session, each participant typically wears a single sensor. However, the sensor may need to be replaced midway through; although not expected, this case is accounted for to handle unforeseen events such as battery drain. In this case, there will be two sensor registrations for the participant, the first with a recorded removal time. The HR data for this participant remain fully available and can optionally be separated into two time series, one per sensor.
If a sensor exchange occurs for a participant, data collection (and monitoring in room view) for the remaining participants continues without interruption. The coordinator can complete this sensor reassignment in approximately 15 s, including the physical placement of the wearable.
If a participant moves too far away, the sensor data transmission will stop. Likewise, if the sensor band falls off and its contacts are no longer properly positioned, the sensor may cease transmitting data. In both cases, returning to within communication range or adjusting the sensor placement, respectively, is sufficient for data collection and monitoring from that sensor to resume automatically.
In these special situations, to minimize gaps in sensor data, the system will display a red alert for any sensor/participant whose last data transmission exceeds a configurable maximum period (e.g., 10 s). This allows the session coordinator to intervene if necessary, ensuring that no participant remains unmonitored without detection.
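The staleness rule behind this red alert amounts to comparing each sensor’s last-seen timestamp against a configurable maximum silence period. A minimal sketch, with illustrative names and the 10 s example value from the text:

```python
import time

STALE_AFTER_S = 10.0  # configurable maximum silent period (example value)

def stale_sensors(last_seen, now=None):
    """Return participants whose sensor has been silent too long.

    last_seen maps participant name -> timestamp (in seconds) of the
    last reading received. The structure and names are illustrative,
    not the system's actual data model.
    """
    now = time.time() if now is None else now
    return [name for name, ts in last_seen.items()
            if now - ts > STALE_AFTER_S]
```

Running such a check on a short timer is enough to guarantee that a silent sensor is flagged within the configured period.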
If communication does not resume automatically, the coordinator can use a system option to refresh the sensor association. This operation will attempt (a) device re-bonding and (b) resetting the Bluetooth controller. The first action does not impact the other active sensors. The second action, if applied, could inhibit data collection from other sensors for 5 to 10 s. If this action does not solve the issue, the coordinator can assign a new sensor to that participant using the “Manage Sensors” operation during the session, and without interrupting data collection from the other sensors.
3.3. Infrastructure and Technology
The exercise sessions can take place in a gym room, on a sports court, or, in a healthcare context, at a physical rehabilitation facility. Since participants are spread across the space and may even move to a corridor or switch between equipment (for instance, between a treadmill and an exercise bike), we designed an IoT-based data collection solution.
Figure 8 shows the general architecture of the developed system, from sensors to visualization components. A sensor gateway module (GW) is deployed to collect data from nearby sources and stream them to the control module, the system’s central node responsible for processing, analysis, and storage management. In terms of platform, for the gateway component, to which the sensors are paired, we use a mini-PC equipped with a Bluetooth 5.3 controller and a network interface (wired or wireless) to send data upstream to the control module. Currently, both the gateway and control modules run the Ubuntu Linux operating system and use Java (JRE 17+) to execute the system software. We use BlueZ (
https://www.bluez.org/about, accessed on 6 June 2024) to interact with the Bluetooth controller on Linux, and the blessed-bluez (
https://github.com/weliem/blessed-bluez, accessed on 6 June 2024) library for operations and communication with the sensors.
The control module is a Java application whose web component was developed with Vaadin (
https://vaadin.com, accessed on 6 June 2024), an open-source framework for the development of interactive and responsive web applications.
The received data are used for visualization and passed to the storage module (DB), which relies on a Postgres database for persistence. Alternative databases can also be used, provided they support high-speed data ingestion and complex data analysis capabilities.
From the beginning of this project, HR was chosen as the primary parameter to be measured using sensors. Due to power constraints, the communication between such sensors and the gateway node is based on a short-range and low-power wireless protocol: Bluetooth Low Energy (BLE) [
17]. Other options, such as ANT+ and Zigbee, were also considered. Compared to BLE, ANT+ has a slight disadvantage in battery consumption, and outside the fitness domain, fewer devices support it, which could limit future vendor options if we want to add complementary sensors. Zigbee is more appropriate for communications over distances slightly greater than those of interest in this work. It may be relevant for future extensions of this system, for communication with more distant actuator devices whose messages are less frequent. The BLE protocol proved to be a good balance between data transmission rate, power consumption, and broad hardware compatibility, making it the chosen option. The gateway software implementation prioritized communication with this type of sensor, including those from Movesense (
https://www.movesense.com, accessed on 4 June 2024), which our partners already had. However, to meet our interoperability goal, we used a brand-independent form of communication.
The maximum number of sensors we can connect to a single Bluetooth controller depends on several factors, including the gateway device software and the Bluetooth version. Under typical conditions, a Bluetooth 5.x controller might handle 7 to 10 active BLE devices. Whenever the number of participants exceeds the maximum supported by a single Bluetooth controller, we can activate another gateway component. Therefore, the box on the left side (representing the gateway and a small number of sensors) of
Figure 8 can be multiplied, depending on the number of participants.
The communication between the sensor gateway node(s) and the central control module is based on standard networking, using REST over HTTP. As an alternative, MQTT can also be used. Depending on the number of sensors, the data volume, and the required data recency for monitoring, the sensor gateway can be configured to send batched transmissions of multiple readings for greater efficiency.
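A batched transmission of this kind can be sketched as one JSON document carrying several readings, posted in a single HTTP request. The endpoint and field names below are hypothetical examples, not the system’s actual REST API.

```python
import json
from urllib import request

def build_batch(gateway_id, readings):
    """Serialize a batch of HR readings into one JSON payload.

    readings: list of (sensor_id, timestamp, hr) tuples. Field names
    ("gateway", "readings", "sensor", "ts", "hr") are illustrative.
    """
    payload = {
        "gateway": gateway_id,
        "readings": [
            {"sensor": s, "ts": ts, "hr": hr} for s, ts, hr in readings
        ],
    }
    return json.dumps(payload).encode("utf-8")

def post_batch(url, body):
    """POST one batch to the control module (hypothetical endpoint)."""
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:  # requires a running server
        return resp.status
```

Sending, say, one request per second with all readings accumulated since the last send trades a small amount of data recency for far fewer HTTP round trips.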
In a monitoring system, it is crucial to avoid losing communication with a sensor; at a minimum, the system must be capable of detecting such a loss if it occurs during a training session. In these wireless communication protocols, the signal can be affected by various factors, including the sensor’s battery level, the distance between antennas, obstacles or physical barriers in the signal path, and possible interference from other wireless signals.
3.4. Validation Methods
To assess interoperability with wearable sensors and verify the brand-independent capability, we tested the collection of HR data from distinct devices. In addition to the Movesense sensors, which were available to us in greater number, the system successfully worked with Polar H10 (
https://www.polar.com/pt/sensors/h10, accessed on 4 June 2024) equipment and with sports smartwatches capable of HR transmission during exercise, such as the Garmin Forerunner 255 (
https://www.garmin.com/en-IE/p/780139#specs, accessed on 4 June 2024) and others. We began with association tests to validate sensor detection and effective data communication. Next, we measured the communication range: for each sensor type, we progressively increased the distance from the gateway until data transmission was interrupted. Multiple repetitions were conducted both in a typical indoor environment (a room with people and various devices) and in an open outdoor environment without barriers or interference. From each series of repetitions with the same setup, we recorded the average distance. The stored records, the session notes, and the system logs were subsequently analyzed for complementary information on special cases, such as occasional transmission losses.
To understand how users perceive the system we followed a Technology Acceptance Model (TAM)-based approach, a widely used model in information systems and technology adoption research [
18]. An extended TAM-type questionnaire was prepared, consisting mostly of closed-ended, Likert-scale questions. It includes the two TAM base aspects, perceived usefulness and perceived ease of use, as well as additional aspects such as peer influence, facilitating conditions, user satisfaction, and usage intention. Please refer to
Appendix A for the full questionnaire.
We asked session coordinators, who use this system in partner entities, to respond. To diversify the survey and obtain a reasonable number of responses, we broadened the distribution of the questionnaire to other users, who, until then, were unaware of the existence of this system. The inclusion criterion for this study was having prior experience in using sensors and interpreting physical activity indicators, as well as the ability to monitor activities involving multiple participants. Each questionnaire response was based on two or more training sessions using the system. The first session served to explain the system’s operation, while the subsequent sessions were controlled by the responding user. In each experiment, in addition to the user, one or two researchers from this project were present to demonstrate the procedures and ensure protocol compliance. Additionally, two to six other people were required to wear sensors for 20 min or more during exercises of varying intensity.
TAM questionnaires can reveal specific issues or concerns that may hinder users from adopting a technology, such as perceived complexity or a lack of perceived benefits. That is precisely what we intended to analyze from the data: possible barriers or favorable factors in the users’ perception of the system. We did not aim to assess the impact of the exercise on session participants, nor to differentiate between questionnaire respondents.
We employed common TAM data analysis procedures, including descriptive statistics and reliability analysis. The purpose of descriptive statistics is to understand the data’s main features, such as central tendency and dispersion. The reliability of the scale-type questionnaire items was evaluated using Cronbach’s Alpha [
19], a measure of internal consistency.
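For a questionnaire with k items, Cronbach’s alpha is computed as α = k/(k−1) · (1 − Σσᵢ²/σₜ²), where σᵢ² is the variance of item i and σₜ² is the variance of the per-respondent total scores. A minimal self-contained implementation (the paper’s actual analysis used Pandas and Pingouin):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    items: one list of scores per questionnaire item, all of equal
    length (one entry per respondent). Population variances are used
    consistently for both items and totals.
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sums
    item_var_sum = sum(pvariance(col) for col in items)
    return k / (k - 1) * (1 - item_var_sum / pvariance(totals))
```

Identical item columns give α = 1, while uncorrelated items push α toward 0; values around 0.7 or above are commonly taken to indicate acceptable internal consistency.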
The questionnaire was filled out via browser, using Google Forms. The data were exported to an XLSX spreadsheet format. Statistical analysis was conducted using Python, the Pandas library (
https://pandas.pydata.org/, accessed on 19 June 2024), and the Pingouin open-source statistical package (
https://pingouin-stats.org/, accessed on 19 June 2024) (the files are available in
Supplementary Material). After reviewing the initial results, a
t-test and a Mann–Whitney U test [
20] were applied in an additional experiment to determine if there were relevant differences in the perceived usefulness between two user types.
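Such a two-group comparison can be sketched as follows. The study used Pingouin; the sketch below uses the equivalent SciPy functions as a widely available stand-in, and the scores are invented illustrative data, not the study’s actual responses.

```python
from scipy import stats

# Hypothetical perceived-usefulness scores for two user types
# (illustrative data only).
group_a = [4.2, 4.5, 4.0, 4.8, 4.3, 4.6]
group_b = [3.8, 4.1, 3.9, 4.4, 4.0, 3.7]

# Welch's t-test (does not assume equal variances)
t_stat, t_p = stats.ttest_ind(group_a, group_b, equal_var=False)

# Mann-Whitney U: non-parametric alternative, suited to small
# ordinal (Likert-type) samples
u_stat, u_p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

print(f"t-test p={t_p:.3f}, Mann-Whitney U p={u_p:.3f}")
```

Reporting both tests is a common safeguard with Likert data, since the non-parametric test does not rely on the normality assumption behind the t-test.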