**4. Actuation**

The mechanisms that provide actuation, in the form of feedback to the human, play an important role in creating a complete interaction: from sensing body properties to making the subject aware of them. Our research aims at linking biosensing to body actuation. Actuation is generally provided by mechanical elements that move in response to input signals in order to either control a system or inform about it. We stretch this definition to include feedback mechanisms such as screen-based visuals, even though no moving mechanical element is necessarily implied. In this section, we focus on actuation mechanisms that can be easily controlled and coupled to our body. We take an approach similar to the structure used to describe the biosignals, explaining how, what, where, and when, as well as the actuator limitations and usage precautions, for a selected list of actuators. The range of actuation mechanisms presented draws upon our research on affective technologies and interaction design, as well as inspirational works present in interaction design research, but it should be seen as a non-exhaustive list of possibilities.

#### *4.1. Screen-Based Visual Biofeedback*

**How it works:** Screen-based visual biofeedback is the representation of body signals that informs about body changes happening over time. Its goal is to provide the researcher with a means to assess the dynamics of those changes, helping to understand and track the inner state of a given subject. Examples include ECG feedback, respiration feedback, or movement tracking, usually employed in health metrics or sports performance research. Screen-based biosensing systems for feedback are standard practice in clinical settings and hospitals. Biofeedback has, for instance, been adopted in psychotherapy, as research suggests that the technique provides a mechanism to self-regulate emotions.

**What:** Screen-based visual biofeedback uses a 2D graphical interface and benefits from light, colors, strokes, and visual styles to represent a changing signal that evolves with time. Signal peaks and troughs appear on an axis showing the measurement magnitude in a given range, so that rapid and slow dynamics can be seen as the representation moves along the time axis when updated.

**Where:** Screen-based visual feedback takes place on a display, either a computer screen or a sensing platform display.

**When:** It is important that the represented signals are updated in real time. Doing otherwise, whether due to intentional delays or technology limitations, would compromise the ability of the actuation to convey the tracking meaning attributed to the practice of biofeedback. When sensing requirements make it technically difficult to render a smooth representation through time, approaches such as averaging or undersampled representations are used.
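As a concrete illustration of the averaging approach just mentioned, the following sketch reduces a high-rate biosignal stream to a display-friendly rate by block averaging (the function name, rates, and stand-in data are our own illustrative choices, not taken from any specific system):

```python
def downsample_mean(signal, factor):
    """Reduce a high-rate signal to a display-friendly rate by
    averaging consecutive blocks of `factor` samples."""
    n = len(signal) // factor  # drop any incomplete trailing block
    return [sum(signal[i * factor:(i + 1) * factor]) / factor
            for i in range(n)]

# e.g., one second of a 1000 Hz stream averaged down to 50 values for plotting
raw = [float(i % 10) for i in range(1000)]   # stand-in for one second of samples
display = downsample_mean(raw, 20)           # 50 averaged values per second
```

Block averaging both lowers the rendering load and smooths high-frequency noise, at the cost of hiding fast transients within each averaged block.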

**Limitations:** Screen-based visual biofeedback connects easily with the mathematical properties that underlie the signals under study. However, signal processing procedures such as filtering, scaling, or normalization are crucial to achieving a smooth and flowing representation, and these are tightly dependent on the available computing capabilities. There are situations in which feedback users report difficulties or experience anxiety when engaging in the assessment of body rhythms. Moreover, visual information tends to capture the attention of the user remarkably strongly, so special care is needed when it is used as an element of a broader interaction (movement, performance, exercise), where it could degrade the quality of the experience or distract from the intended activity.
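To make two of these processing procedures concrete, a minimal sketch of min-max normalization onto a fixed plot range and of a simple exponential moving-average low-pass filter could look as follows (function names and parameter defaults are illustrative assumptions on our part):

```python
def normalize(signal, lo=0.0, hi=1.0):
    """Min-max scale a signal into [lo, hi] so it fits a fixed plot axis."""
    s_min, s_max = min(signal), max(signal)
    if s_max == s_min:                 # flat signal: map everything to the lower bound
        return [lo for _ in signal]
    scale = (hi - lo) / (s_max - s_min)
    return [lo + (x - s_min) * scale for x in signal]

def smooth(signal, alpha=0.1):
    """Exponential moving average: a basic low-pass filter for flicker-free display."""
    out, y = [], signal[0]
    for x in signal:
        y = alpha * x + (1 - alpha) * y   # blend new sample into the running value
        out.append(y)
    return out
```

Smaller `alpha` values give smoother but more delayed traces, which illustrates the trade-off between a flowing representation and real-time fidelity noted above.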

#### *4.2. Sound Feedback*

**How it works:** Sound feedback, when applied to biosignals, is the audio representation of body signals, using sound properties to inform about body changes happening over time. Its goal is to exploit our sophisticated, trained sense of hearing to convey meanings linked to body signal features, leading to the understanding and tracking of a given subject's biosignal dynamics.

**What:** Sound feedback uses the properties of sound, that is, volume, pitch or frequency (note), rhythm, harmony, timbre, and transients (attack, sustain, etc.), among others, to represent a signal (or its features) that changes over time. Its generation, often using speakers or headphones, is linked to properties of the signal. Alternative approaches draw upon several transducing paradigms, that is, different ways to convert electrical signals into sound (electromechanical, as in the case of speakers; piezoelectric; or others), often more limited, such as buzzers or beepers made of basic vibrating elements.
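As an example of linking a signal property to a sound property, the sketch below maps a biosignal value, say a heart rate, onto a pitch range, interpolating geometrically so that equal signal steps sound like equal musical intervals (the function name and the 220–880 Hz range are illustrative choices of ours):

```python
def map_to_pitch(value, v_min, v_max, f_lo=220.0, f_hi=880.0):
    """Map a biosignal value linearly in [v_min, v_max] onto a pitch in Hz.
    Values outside the range are clamped to its ends."""
    t = (value - v_min) / (v_max - v_min)
    t = max(0.0, min(1.0, t))
    # geometric interpolation: equal steps in t give equal musical intervals
    return f_lo * (f_hi / f_lo) ** t

# e.g., a heart rate of 75 bpm sonified within an assumed 50-120 bpm range
freq = map_to_pitch(75, 50, 120)
```

A driving program would then synthesize a tone at `freq` through the speaker or headphone output; the mapping itself is independent of the transducer used.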

**Where:** Sound can be generated by speakers, devices that convert electrical pulses into sound (air pressure) waves, allowing users to listen to the feedback without additional equipment. Headphones, working on the same principle, can be used for the same purpose but provide feedback only to the person wearing them.

**When:** The human hearing range typically comprises frequencies between 20 Hz and 20,000 Hz. The oscillating frequency of the generated sound wave is what gives it a particular tone (what we call a note). The different times at which sound waves are generated are what create the meaning of rhythm and articulation.
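The relation between frequency and note can be made explicit with the standard equal-temperament/MIDI convention (A4 = 440 Hz = MIDI note 69); the helper below is an illustrative sketch under that convention, not something prescribed by the text:

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_note(freq, a4=440.0):
    """Return the nearest equal-temperament note name for a frequency in Hz,
    using MIDI numbering (A4 = 440 Hz = MIDI note 69)."""
    midi = round(69 + 12 * math.log2(freq / a4))
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1)

freq_to_note(440.0)   # "A4"
freq_to_note(261.63)  # "C4" (middle C)
```

Doubling the frequency raises the note by one octave, which is why the formula uses a base-2 logarithm over the ratio to the A4 reference.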

**Limitations:** Audio generation and processing techniques are complex. While high-level hardware and software tools can make a complete system more accessible, a relevant learning curve certainly remains. The scenario in which audio feedback is deployed strongly conditions the achieved effect, given that the materials surrounding the sound-generating system impose effects such as reverberation, echoes, or absorption. Exposure to sound feedback for a prolonged period of time has some drawbacks: sound volume can potentially harm our auditory system, and sound feedback that lacks textural richness (e.g., a single sine wave) risks becoming unengaging for the user or even causing irritation.

#### *4.3. Vibrotactile Actuation*

**How it works:** Vibrotactile actuation uses motors to communicate through touch, and more precisely through tactile vibrations. When linked to biosensors, it can use the properties of these vibrations to convey features of the biosignal being tracked.

**What:** Vibrotactile actuation is a technological communication mechanism that uses vibrations to exploit the human sense of touch. It is built upon motors, which can mostly be categorized under two types:

- Eccentric rotating mass (ERM) motors, in which a small off-center mass spins around the motor shaft, producing vibration;
- Linear resonant actuators (LRAs), in which a mass oscillates linearly at a resonant frequency of a few hundred hertz.

These motors usually take the form of small (a few millimeters) enclosures with simple positive and negative (+/−) terminals to be driven, keeping power and mounting (weight) requirements low. The typical supply voltage needed for this kind of micromotor is of the order of 1–5 V.
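In practice, such micromotors are commonly driven with pulse-width modulation (PWM). The sketch below shows one way the mapping from a normalized biosignal intensity to a duty cycle might look; the 30% stall threshold is an illustrative figure of ours, not a datasheet value, and real implementations should consult the motor's specifications:

```python
def intensity_to_duty(intensity, min_duty=30.0, max_duty=100.0):
    """Map a normalized biosignal intensity in [0, 1] to a PWM duty cycle (%).
    Below a certain duty cycle many small DC motors stall, so zero intensity
    maps to 'off' and any nonzero intensity starts at the assumed stall
    threshold (min_duty is illustrative; check the motor's datasheet)."""
    if intensity <= 0.0:
        return 0.0
    intensity = min(1.0, intensity)
    return min_duty + intensity * (max_duty - min_duty)
```

The returned percentage would then be written to whatever PWM peripheral drives the motor terminals, typically through a driver transistor or dedicated haptic driver chip rather than directly from a microcontroller pin.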

**Where:** With weights below 1 g, the small form factor of these motors makes them suitable for body explorations, often relying on patches, elastic bands, or holders. Typical uses also include vibrotactile-equipped wristbands or smartwatches. Besides the traditional game/remote controllers that include vibrotactile feedback and actuate on the hands, the now-ubiquitous role of mobile phones has spread the use of vibration feedback and patterns for notifications, alarms, and other communication wherever a phone can be placed or held.

**When:** Small vibrotactile motors feature fast startup and braking times and can reach rotation speeds of up to 11,000 revolutions per minute (RPM), in the case of ERMs, and oscillations of the order of a few hundred hertz.

**Limitations:** Vibration often comes with undesired noises or sounds. While this is mitigated by rubber absorbing structures often integrated into the motors, use cases need to consider this aspect. While vibrotactile actuation offers the opportunity to explore a particular type of haptic feedback, the use of small motors limits the generated effects in terms of amplitude, duration, and perceived intensity. Creating vibration sequences requires several motors, along with the corresponding software and hardware integration effort. The actuators often require extra driver circuits to widen the operating regime while maintaining electrical safety standards. As generally advised for feedback modalities applied to the body, haptic actuation has to go hand in hand with user experience studies, since prolonged exposure and certain placements can lead to discomfort.

#### *4.4. Temperature Actuation*

**How it works:** Temperature actuation uses heating and cooling elements to communicate through heat, that is, through our sense of touch. When used with biosensors, it can use the heating/cooling dynamics of the material to convey characteristics of the biosignal being tracked.

**What:** Temperature actuation is a type of communication used as feedback that draws upon the human haptic (touch) sense. The properties of the temperature feedback depend on the materials used to convey the features of the information of interest (e.g., biosignals, behavioral data, etc.). The most common approaches rely on converting an electrical current into heat, mainly by using resistive elements that heat up when current flows through them, such as nichrome wires, conductive threads, and conductive fabrics; thermoelectric coolers (Peltier elements) can additionally produce cooling as well as heating.
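The heat generated by such resistive elements follows Joule's law, P = V²/R, with the wire's resistance fixed by its geometry, R = ρL/A. A small illustrative calculation, with example dimensions chosen by us rather than taken from any specific device:

```python
import math

def joule_power(voltage, resistance):
    """Power dissipated as heat in a resistive element: P = V^2 / R."""
    return voltage ** 2 / resistance

def wire_resistance(resistivity, length, area):
    """Resistance of a uniform wire: R = rho * L / A (SI units)."""
    return resistivity * length / area

# e.g., 0.5 m of nichrome wire (rho ~ 1.1e-6 ohm*m), 0.2 mm diameter, driven at 5 V
area = math.pi * (0.1e-3) ** 2          # cross-sectional area in m^2
r = wire_resistance(1.1e-6, 0.5, area)  # roughly 17.5 ohm
p = joule_power(5.0, r)                 # roughly 1.4 W dissipated as heat
```

Such back-of-the-envelope figures help size the power supply and assess whether a given wearable placement can safely dissipate the generated heat.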

**Where:** The nature of the heating elements determines where the temperature actuation can be placed. The flexibility of wires and fabrics has led to many developments that extend wearable capabilities, producing smart garments that lie close to the skin. Implementation possibilities comprise patches and configurations mounted on accessories (bags, caps, etc.), among others. In applying heat or cold, placement plays a key role, given the different perceptual and comfort ranges that exist across the body's skin.

**When:** Heat is typically a slow actuation modality. Although thermoelectric coolers and nichrome wires can be switched on quickly thanks to conduction, there is usually an element that plays a slow, dissipating role in the heat dynamics, through either convection or radiation. Heat transfer, hence, usually involves relatively slow dynamics.
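These slow dynamics can be captured by a first-order thermal model, in which the element relaxes toward the ambient temperature with a characteristic time constant. The sketch below is a toy Euler simulation with illustrative parameter values of our choosing (a 30 s time constant and drive power expressed as degrees of rise per second):

```python
def simulate_temperature(t_env, t0, tau, heat_power, dt, steps):
    """First-order thermal model: the element is heated in proportion to the
    input drive and relaxes toward ambient with time constant `tau` (seconds).
    `heat_power` is expressed as degrees of temperature rise per second."""
    temps, t = [], t0
    for _ in range(steps):
        dT = (heat_power - (t - t_env) / tau) * dt   # drive minus dissipation
        t += dT
        temps.append(t)
    return temps

# a heating patch driven for one minute: it climbs slowly toward its steady state
trace = simulate_temperature(t_env=22.0, t0=22.0, tau=30.0,
                             heat_power=0.5, dt=1.0, steps=60)
```

The steady-state temperature here is `t_env + heat_power * tau` (37 °C with these values), and the trace approaches it asymptotically, illustrating why temperature feedback cannot render fast biosignal transients.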

**Limitations:** As with any actuation acting upon the body, heat/cool feedback has to closely consider user comfort and perceptual thresholds. Material properties (mainly heat conductivity) constrain the achievable response times. While options such as actuation over wide areas or multi-actuator sequences emerge as interesting actuation paradigms, power requirements remain a challenge. Moreover, sensitivity to temperature varies widely from user to user and is affected by ambient temperature conditions.

#### *4.5. Shape-Changing Actuation*

**How it works:** Shape-changing actuation uses interfaces that exhibit changes in size, shape, or texture in order to, when linked to feedback, exploit human visual and tactile perception to convey meanings and information content. By using shape changes that unfold over time, the actuation dynamics are brought forth, letting the user play with concepts such as time (increasingly/decreasingly fast, slow, abrupt) and volume or size to depict the desired information.

**What:** Shape-changing actuation interfaces use changes of physical form that, when linked to feedback, provide an output conveying meanings and properties of the signal they are bound to. Shape-changing exploits a combination of sight and touch to convey meanings intrinsic to the dynamics of the information it is linked to, such as rapid changes, stability, increase, decrease, and steady growth. Despite the parallels with screen-based explorations in visual computing, such interfaces are emerging as an alternative, physical, and tangible way of interacting with technological devices [66]. Three of the most widely used examples of shape-changing actuation elements are:

- Shape-memory (memory) wires, which contract or change form when heated by an electrical current;
- Linear actuators, which extend or retract a moving element such as a piston;
- Inflatables, whose volume changes as a fluid (usually air) is pumped in or released.

**Where:** For the technology to best benefit from the focus on visual and tactile perception, shape-changing actuation implementations must remain within reach (of sight or touch) to convey the meanings embedded in the changes of shape. This can entail direct contact with the body, with the potential to increase the felt meaning of the shape when in contact with a large body area or where touch sensations are more developed, or placement within the user's field of vision.

**When:** The wide range of shape-changing actuation possibilities comes with correspondingly different actuation timings. While it is possible to work with shape-memory wires that are rapidly heated, or with compressed gas or pumps that quickly fill a given inflatable, the time affordances of this kind of actuation do not generalize. Linear actuators, for instance, usually require a system of pistons and damping mechanisms that shape the dynamics of the actuation as it unfolds over time. Moreover, the behavior of shape-changing interfaces is often not symmetrical: although an inflatable can be rapidly fed air by a pump or a reservoir, deflation valves impose their own dynamics.
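This inflation/deflation asymmetry can be illustrated with a toy model in which the pump adds volume at a roughly constant rate while a passive valve vents in proportion to the remaining volume, making deflation exponential and therefore slower toward the end (all names, rates, and time constants below are illustrative assumptions):

```python
def step(v, pump_on, pump_rate=0.5, valve_tau=2.0, dt=0.01):
    """Advance the inflatable's volume by one time step.
    Inflation: constant pump rate. Deflation: passive valve whose outflow
    is proportional to the current volume (exponential decay)."""
    if pump_on:
        return v + pump_rate * dt
    return v - (v / valve_tau) * dt

# inflate from empty to full volume (normalized to 1.0)
v, t_fill = 0.0, 0.0
while v < 1.0:
    v, t_fill = step(v, pump_on=True), t_fill + 0.01

# deflate back below 10% of full volume through the passive valve
t_empty = 0.0
while v > 0.1:
    v, t_empty = step(v, pump_on=False), t_empty + 0.01

# t_empty comes out noticeably longer than t_fill: the behavior is asymmetric
```

With these parameters the fill takes about two seconds while emptying takes more than twice as long, mirroring the asymmetric time affordances discussed above.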

**Limitations:** Memory wires, although visually appealing, imply the use of high temperatures, which challenges haptic shape-changing feedback based on them. In turn, the strains (or pulling forces) achieved are generally weak, often leading to implementations that use several wires, and developments using this kind of actuation commonly include protective heat-insulating layers. In the case of linear shape-changing actuators, movement is often accompanied by undesired noise and relatively slow dynamics. The actuators themselves, made of rigid moving elements, impose a certain rigidity on the overall actuation, and multiple units are often needed to create appealing effects. Shape-changing inflatables often present problems of fluid leaks, as well as asymmetric behaviors for inflation and deflation. These can be tuned through further work on valves and compartments, but this requires significant effort. Besides, the type of pump poses specific fluid requirements and usually produces noise that interferes with the designed actuation.

#### **5. Sensing and Actuation under the Soma Design Approach**

Our affective technology research aims to create personal technologies that enable self-reflection [6]. With a focus on the body, design research is used as a way to engage introspectively in emotional self-reflection and potentially disrupt the way we relate to our mental well-being through technology-mediated interactions. Such interactions, drawing upon ubiquitous computing capabilities (biosensing, wearables, monitoring applications, embodied actuation), could add novelty and be taken further into psychotherapy contexts. In this section, we introduce the design approach taken to accomplish meaningful biosensing–actuation couplings. This comprises the first-person design stance, the somaesthetic design ("design through the body") approach, and the path to orchestration (i.e., the mechanisms for coordination and event recognition in technology-mediated body interactions, and the connections and sequencing for evocative sensing–actuation experience design).

#### *5.1. Designing from a First-Person Perspective*

Often in user-centered design, designers conceive, test, and set requirements for the ultimate users, who are placed at the center of the design efforts. In doing so (using a third-person perspective), users are relegated to the background, with designers probing, testing, and interviewing the target users iteratively in order to modify the design outcome and render it meaningful according to their needs. This approach, however, misses the potential of stepping into the user's shoes. The first-person perspective [67], instead, constitutes a way to highlight the designers' own user experiences, paying honest tribute to the potential end-users and actively engaging in experiencing the design's meanings and effects. The designer who follows the first-person perspective embarks on an iterative process of trying, testing, feeling, and evaluating the designed object or interaction. The design is tried by the designers themselves. This process provides meaningful insight into what the eventual user could get from the resulting design. When taking the first-person perspective, the designer is seen as the user, since eventually, anyone interacting with the technology shapes its meaning and how the technology behaves.

#### *5.2. Somaesthetic Design*

Somaesthetic design underscores the need to place importance on the aesthetic aspect of felt bodily experiences as a fundamental element of the design process. This is, for us, a great first step in attempting to create embodied technologies or interactions. The chosen approach consequently confers a key role on the body in the design of personal technologies for affective health. Somaesthetics, introduced by the philosopher R. Shusterman [10], is the result of efforts to combine the body with aesthetics, with a strong emphasis on how the body plays a major role in how we feel, perceive, and think about the world. With somaesthetic design, an attempt is made to leverage the role of the feeling body when engaging in design experiences. This strategy, which requires certain training to grasp one's sensations, control movement, and perceive what we feel, offers a fresh approach to using the body as the main instrument to feel, assess, and appreciate design affordances. The soma design manifesto [8] highlights, among other aspects, the need to engage slowly in the aesthetic appreciation of the technologies being designed, disrupting the habitual and inspiring users' drive to obtain interactions—biosignal-mediated in our case—that lead to novel ways of embracing technologies in our lives. Body practices that support the process of getting attuned to one's sensations are usually employed, as exemplified in Reference [9]. These often further support the notion of estrangement (or disrupting the habitual), that is, how designers can engage in actions, movements, or performances far from the habitual way of carrying them out. By doing so, the intricacies of a certain interaction become exposed, helping their analysis or revealing novel possibilities for carrying them out.

#### *5.3. First-Person Biosensing*

Having successfully been applied in design workshops that address the effects of actuation-based interaction for embodiment and self-reflection, somaesthetic design offers a unique opportunity to address the challenging goal of bringing together biosensing and actuation in an evocative way, relevant to the user who would utilize personal technologies for emotional awareness and regulation. In taking a somaesthetic design approach, affective technology researchers find a path to make sensing workable, paving the way for discoveries that support the creation of new, relevant technology-mediated experiences as targeted by the AffecTech project [6]. To explore how our internal physiological mechanisms can be revealed via a set of biosignals, a routine for experiencing sensing from a first-person perspective was followed, significantly inspired by previous haptic actuation explorations [48,68] and used in part in Reference [49]. Using the acquired knowledge of biosignal acquisition and the relevant information processing, the designed routine aims to support the learning of students new to the field of biosignals, potentially adding tangible interaction features that make the topic more accessible. This process was originally conceived as having an expert guide on biosignals and a learner who would follow and report on the felt biosensing experiences during a 1–3 h session, depending on the number of biosignals covered, but in practice it moved to a more open exploration scheme based on switching roles, where no expert/novice knowledge hierarchy is sought [49]. The first-person sensing experience goes beyond a theoretical explanation, a lab session with pre-recorded signals, or a biosignal recording session. With this different approach, the students or researchers being initiated follow an introspection process to discover more deeply how the internal mechanisms of the signals and the body work and what can change them. This process is not meant to take place in a formal lab or lecture environment, but to be accompanied by a person experienced in both soma design and biosensing who would revisit the experience and rely on body practices to awaken the somaesthetic appreciation needed for the exercise (see, for instance, the work of Windlin et al. and Tsaknaki et al. [9,48]).
