*5.4. Orchestration*

Orchestration mechanisms address the overarching goal of this work: achieving systems that facilitate the exploration of biosensor data and its meaningful representation through actuation, across many more modalities than visual feedback alone (see Section 4.1). Achieving this facilitation requires the ability to combine and carefully coordinate the relationship between input (biosensors) and output (actuators). In this paper, the term orchestration refers to the process of combining sensing and actuation devices, and coordinating the couplings between them, into a coherent interactive whole.


#### *5.5. A Soma Design Example: The Breathing Light, or How Light Actuation Inspires Design*

Interaction design research by Ståhl et al., such as the Breathing Light [69], preceding the cross-disciplinary efforts presented in this paper, proved inspirational to our research. The goal of the Breathing Light prototype is to help users find a safe place where they can take a break from daily routines, focus on inner body processes, and reflect. The prototype is built from fabric and string curtains, creating a secluded space for the user's upper body. The Breathing Light shifts the user's attention to breathing, focusing on the experience of inhale/exhale cycles. Light was chosen as a modality for its ability to subtly guide the attention of participants inwards. Technology-wise, the Breathing Light is a lamp with a proximity sensor. The sensor measures the distance between the user's chest and the lamp, which in practice turns it into a breathing sensor. The ambient light in the prototype dims in accordance with the breathing pattern: dimming out on the exhale and recovering on the inhale. The intensity of the ambient light is high enough that the light pattern can be followed even with the eyes closed, but not so high as to distract the user. Participants reported that when lying under the Breathing Light module, they felt enclosed and taken care of. Limitations arose, as setting the timing, intensity, and warmth of the light was a demanding task. In turn, however, this interaction facilitated an intimate correspondence between the perception of breathing and the light, so that the light was perceived as an extension of the body, providing a much richer experience of breathing.

#### **6. Results: Designing Biosensing-Actuation Couplings**

In the *Somadata* work [49], soma design sessions and first-person accounts of the user/designer participants are highlighted to understand what constitutes a tangible or "felt encounter" with an otherwise disembodied design material, that is, biosignals. Our design approach is not a "solutionist" method that quantitatively acquires data and formally evaluates how a coupling solves a given problem. This avoidance of a solution is a resource that has been leveraged in design fiction [70]. Our research does not try to recognize given patterns in biodata to, for example, help users optimize behaviors (walking, running or fitness-related activities) or prevent anomalies (heart malfunction, fall detection, stress recognition). We use design topics, or challenges, at most—such as exploring synchrony between peers—but we do not work with a given problem. Rather, a qualitative, explorative stance grounded in the body underlies design discoveries that help us look at biosignals as a design material to be shaped, changed and integrated into interaction design toolkits, instead of taken as a given, unchallenged and immutable.

Where we have failed with other coupling attempts, a selection of carefully crafted sensing-output combinations has succeeded in achieving what we call *soma data*: biodata that is somatically experienced, leads to novel insight, is collectively shareable, and is in line with a design context or goal. Soma data examples in Reference [49] include a mechanism for pairs of people to connect non-verbally through audio and synchronous movements, a way to share muscle activity insight (on the calf muscles) relevant to crossing a balancing pole, and a new way to understand EDA data through a haptic heating effect—which we address in more detail below. In this paper, we want to open up the design space; hence, we bring to discussion the underlying interaction mechanisms of these kinds of experiences. Although the prototypes could be evolved into final products and studied quantitatively, the research presented here is instead driven by the inspiration gathered from works that use technology to connect experientially to our felt bodies [9,48,68]. We aim to incorporate biosensing into soma design toolkits, as a design material, and discern what is needed to support this design. When used in soma design workshops, first-person somaesthetic accounts of users exploring the couplings are taken to assess whether an input-output connection is meaningful with regard to bodily self-awareness. Orchestration decisions are integrated in the organization of wired, programmed effects and input-output mechanisms found in the couplings that successfully led us to what we consider interaction design discoveries [49].

However, our interaction design research seems to suggest that better orchestration mechanisms would render our technology-mediated interactions more aesthetic [9,48]. In this line of thought, we have conceptualized a workflow that integrates sensor devices and actuators into a shared network controlled through a server responsible for data transmission between components. In this scenario, the designer can explore available devices either via a graphical user interface (GUI) or a tangible user interface. Furthermore, the available devices can be connected with one another through the GUI to couple input signal streams (such as biosignals) to output signals (like the intensity of a haptic actuator), allowing the designer to create and fine-tune somatic associations. These associations are designed by defining triggers and responses to signals, or even by training machine learning models that react to input signals and act on the output signals. The system should allow for a flexible use of devices, their signals and behaviors to uncover novel interactions (Figure 2). An early example of an interactive visual programming metaphor can be seen in Figure 6.
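To make this workflow concrete, the sketch below shows a minimal orchestration hub in Python, assuming the `python-osc` package; the addresses, ports, and the linear EDA-to-haptic mapping are hypothetical placeholders rather than the system described here.

```python
# Minimal sketch of an orchestration hub, assuming the python-osc package
# (pip install python-osc). Addresses, ports, and the scaling are
# hypothetical placeholders, not the system described in the paper.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

# Client that forwards mapped values to an actuator endpoint.
actuator = SimpleUDPClient("127.0.0.1", 9001)

def route_eda(address, value):
    """Couple an incoming EDA sample to a haptic intensity (0..1)."""
    # Hypothetical linear mapping; a real coupling would be tuned per soma.
    intensity = max(0.0, min(1.0, (value - 2.0) / 10.0))
    actuator.send_message("/actuator/haptic/intensity", intensity)

dispatcher = Dispatcher()
dispatcher.map("/sensor/eda", route_eda)  # biosensor input stream

# Blocks and routes every incoming packet from sensors to actuators.
server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
server.serve_forever()
```

In this pattern, adding a new coupling amounts to registering another handler on the dispatcher, which is what makes the hub a natural place for a GUI or tangible interface to act on.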

In this section, we describe *Scarfy*, one of the biosensing-actuation coupling prototypes resulting from our research (EDA to temperature), as well as current work on breathing synchrony and EMG couplings with audio. In our view, a potential orchestration platform should let users choose which elements are present in each experience, such as sensing-actuation modalities (haptic, sound, light, heat, cold, airflow, shape-changing); provide visual programming interfaces (as used in the EMG-audio feedback experience); enable running signal processing code snippets (e.g., breathing synchrony assessment and audio feedback); allow interactions that work more implicitly (movement monitoring, wireless/wearable devices); and set the time sequence structure and order in place. This shifts the design focus away from technology constraints and highlights how experiences are enacted and how elements become part of a whole.

**Figure 2.** Mechanisms needed for the orchestration of couplings.

#### *6.1. Scarfy: A Temperature Scarf to Make Electrodermal Activity Perceptible*

Inspired by existing research and commercial work on haptic material actuators on the body [43,71], we started exploring different materials and actuators to communicate biodata. We explored materials that are low-cost and safe to use for near-body applications, taking wearability and comfort into account. After trying out and working with different materials and actuators (thermochromic and heat-resistive materials, vibrotactile motors, shape memory alloys [43]) and sensors (electrodermal activity, heart rate and breathing), we started preparing a temperature-actuated scarf to promote interpersonal synchrony by linking skin conductance data. This coupling was aimed toward a soma design session that explored the concept of interpersonal synchrony [49]. We chose the EDA signal, which has often been used to communicate increases and decreases in physiological arousal, to actuate heating and cooling. We used four 20 × 20 mm Peltier modules in series, spaced 2.5 cm apart, and enclosed them in a scarf. The resulting artifact can easily be worn on the neck and taken off, as shown below (Figure 3).

**Figure 3.** Scarfy: (**a**) EDA heat/cool temperature scarf coupling, (**b**) participants exploring actuation on the neck, (**c**) forehead and (**d**) showing the heating elements.

The Peltier modules are driven by Arduino boards with motor drivers, and their actuation is triggered by an EDA sensor. To mark the increase and decrease of changes in physiological arousal using temperature, we created four different patterns of heating and cooling (see Figure 4). The first pattern, *Appearing/disappearing heat/cool*, actuates heat or cold in all the modules at once and then turns them off simultaneously. The second and third patterns, shown in Figure 4b,c, are *Increasing heat/cool*, in which heat or cold gradually turns up module by module, and *Decreasing heat/cool*, which gently reduces the thermal effect one module at a time. The fourth pattern is *Moving heat/cool* (Figure 4d), in which thermal actuation alternates between the modules one by one, following a spatial direction while keeping the temperature constant.
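To illustrate the temporal structure of these four patterns, the following Python sketch generates them as per-module intensity frames; the timing constants are assumptions, and on the device such frames would be sent to the Arduino motor drivers.

```python
# Illustrative sketch of the four Scarfy temporal patterns as per-module
# intensity frames (4 Peltier modules, values in 0..1). Timing constants
# are hypothetical; on the device these frames would drive motor drivers.

N_MODULES = 4

def appearing_disappearing(steps=4):
    """(a) All modules on at once, then all off simultaneously."""
    on, off = [1.0] * N_MODULES, [0.0] * N_MODULES
    return [on] * steps + [off] * steps

def increasing(steps_per_module=1):
    """(b) Heat/cool turns up one module at a time, accumulating."""
    frames = []
    for i in range(1, N_MODULES + 1):
        frames += [[1.0] * i + [0.0] * (N_MODULES - i)] * steps_per_module
    return frames

def decreasing(steps_per_module=1):
    """(c) The thermal effect is gently reduced module by module."""
    return list(reversed(increasing(steps_per_module)))

def moving(steps_per_module=1):
    """(d) A single active module travels along the scarf."""
    frames = []
    for i in range(N_MODULES):
        frame = [0.0] * N_MODULES
        frame[i] = 1.0
        frames += [frame] * steps_per_module
    return frames

for t, frame in enumerate(moving()):
    print(f"t{t}: {frame}")  # e.g. t0: [1.0, 0.0, 0.0, 0.0]
```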

**Figure 4.** Scarfy EDA-temperature patterns with four Peltier module elements and how they change over time *t<sub>i</sub>*: (**a**) Appearing/disappearing heat/cool, (**b**) Increasing heat/cool, (**c**) Decreasing heat/cool and (**d**) Moving heat/cool.

The purpose of this coupling and its heating and cooling patterns was to communicate increasing and decreasing arousal in interpersonal settings. A set of designers' first-person accounts and insights on using the prototype in a design workshop focused on synchrony can be found in Reference [49]. We wanted to explore how Scarfy can mediate synchrony between people and probe which heating and cooling patterns, intensities and durations would best support this quality. We also wanted to explore feedback around technology *black-boxing* in Scarfy—that is, which design elements should be overtly exposed for customization—how it can be improved, and how it should be used in everyday life settings.

Participants in design workshops approached Scarfy by wearing it around the neck, trying out different patterns and positions to feel the increase and decrease of heat and cold. While exploring different patterns and placements we felt that, although subjective, heat and cold have different scales; that is, the sensation of both heating and cooling feels different depending on the part of the body it is applied to. While exploring the patterns on the body, we discussed how cold feels more pleasant than heat, because the modules are placed inside a thick scarf fabric which is itself warm. Placing the scarf around the neck, we found that the Peltier modules often do not touch the skin and need to be pressed in order to be felt. We did not limit our exploration to the neck area; we also explored several other areas such as the forehead (see Figure 3c), back, shoulder, and wrist. With these other placements, we found that the considerable size of the scarf was harder to manage. We therefore discussed that it would be better to place the Peltier modules in smaller strap-on patches that can be placed and taken off easily, giving us enough freedom to quickly explore patterns on different parts of the body.

Finally, talking about arousal and the overall purpose, we discussed that the exploration should shape what meaning we assign to it, that is, whether you are trying to learn about yourself or trying to calm yourself down. In fact, the ambiguity and interpretability of electrodermal data have been a recent matter of study in human-computer interaction research [40], with some works questioning users' meaning-making processes and the challenges that arise when new representations are appropriated and taken outside the lab [72,73]. With Scarfy, any researcher can explore several patterns and needs to figure out which one fits their personal experience, that is, bodily awareness, calming oneself down, or feeling a peer's arousal. Moreover, the interaction described in Reference [49] invites us to rethink how the choice of signal aspects or features translated into actuation changes constrains the way biodata is perceived.

#### *6.2. Breathing in Synchrony: From Physiological Synchrony to Audio Feedback*

This coupling example, drawing upon the psychology concept of therapeutic alliance [74], takes respiration data from two users in the same physical space, with two BITalino devices streaming data wirelessly to a host computer. The two users participate in a timed breathing exercise together whilst their individual respiratory patterns are measured with piezoelectric (PZT) bands placed around the diaphragm (see Figure 5). The data is aggregated on the host computer, which executes a script that measures the collective breathing activity. From there, we apply shared biofeedback in the form of sound to stimulate synchrony awareness and physiological dialogue between the users over time.

**Figure 5.** Breathing synchrony-audio experiment, based on the analysis of two BITalino piezoelectric abdominal respiration signals (image showing the two BITalino devices streaming simultaneously).

The exploration followed a stage of preliminary research on physiological synchrony features, both in published research and in statistical measurements potentially generalizable to signals other than breathing. We implemented the computation of linear regression coefficients, cosine similarity, and correlations between the filtered signals and their derivatives. The process for mapping the users' activity to audio output can be split into two main components. First, the device data is transmitted to a Python program, which performs statistical analysis on the incoming signals, calculating a "magnitude of synchrony" using the features listed above. After a fifteen-second warm-up period, the system has accumulated enough data to determine mutual behavior, and the resulting values are encoded into Open Sound Control (OSC) messages that are continuously streamed to a local address, enabling the designer to map the data to appropriate parameters for sound feedback. With this generic protocol in place, we aim to embrace modularity and advocate for experimentation with sonic associations. In our tests, we used Cecilia's [75] built-in granular synthesis engine, which manipulates the playback of a pre-recorded soundscape divided into independent samples of 10 to 50 milliseconds [76].
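A minimal sketch of the analysis step might look as follows, assuming `numpy` and `python-osc`; the feature choices follow the text (correlation, cosine similarity), but the window size, the weighting of the features, and the OSC address are illustrative assumptions rather than our exact script.

```python
# Sketch of the synchrony analysis step, assuming numpy and python-osc.
# Window size, feature weights, and the OSC address are assumptions.
import numpy as np
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 57120)  # hypothetical synth address
WINDOW = 15 * 100  # 15 s warm-up at an assumed 100 Hz sampling rate

def synchrony_magnitude(a: np.ndarray, b: np.ndarray) -> float:
    """Combine correlation and cosine similarity into one 0..1 score."""
    corr = float(np.corrcoef(a, b)[0, 1])
    cos = float(np.dot(a, b) /
                (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
    return (abs(corr) + abs(cos)) / 2.0

def on_new_samples(buf_a, buf_b):
    """Called as the two BITalino respiration buffers fill up."""
    if len(buf_a) < WINDOW or len(buf_b) < WINDOW:
        return  # still inside the warm-up period
    m = synchrony_magnitude(np.asarray(buf_a[-WINDOW:]),
                            np.asarray(buf_b[-WINDOW:]))
    client.send_message("/synchrony/magnitude", m)  # mapped to sound
```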

#### *6.3. Orchestrating an EMG-Audio Feedback Coupling*

One instance of an orchestration platform that we did manage to create is the EMG-audio feedback coupling. Through the visual programming environment PureData (Pd) [77,78], we connected a muscle activity signal, via signal processing steps, to an audio tone whose pitch changes with the biosignal dynamics as muscles are contracted. The interface, shown in Figure 6, presents intuitive elements such as sliders and value boxes that facilitate deciding and modifying the coupling properties. The patch receives the EMG signal from a BITalino R-IoT device, a WiFi-enabled sensor platform, via Open Sound Control (OSC) [79,80] data packets. We mapped the EMG signal to sound in three steps: first, we took the absolute value of the signal—a full-wave rectification; second, we smoothed it with a low-pass filter to remove some of the oscillations; finally, the smoothed signal was mapped to the pitch of a sine wave generator. The higher the measured muscle contraction, the higher the pitch the generator produces, and vice versa.
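The three steps translate directly into a few lines of code. The sketch below is a Python equivalent, not the authors' Pd patch; the smoothing coefficient, pitch range, and full-scale EMG amplitude are assumptions.

```python
# Python equivalent of the Pd patch's three steps (a sketch, not the
# original patch): full-wave rectification, one-pole low-pass smoothing,
# and a linear mapping to sine-wave pitch. All constants are assumptions.
ALPHA = 0.05                 # low-pass smoothing coefficient (assumed)
F_MIN, F_MAX = 110.0, 880.0  # pitch range in Hz (assumed)
EMG_MAX = 512.0              # assumed full-scale EMG amplitude

smoothed = 0.0

def emg_to_pitch(sample: float) -> float:
    """Map one raw EMG sample to the frequency of a sine generator."""
    global smoothed
    rectified = abs(sample)                     # 1) full-wave rectify
    smoothed += ALPHA * (rectified - smoothed)  # 2) one-pole low-pass
    level = min(smoothed / EMG_MAX, 1.0)        # normalize to 0..1
    return F_MIN + level * (F_MAX - F_MIN)      # 3) contraction -> pitch
```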

**Figure 6.** Orchestrating an EMG-audio feedback coupling in a PureData interface patch.

#### *6.4. Understanding the Different Input/Output Tradeoffs*

Through soma design sessions, we used first-person accounts of the designers or users of the created technology couplings to better understand the drawbacks and benefits posed by the input and output modalities. This subsection lists the resulting remarks on the couplings and modalities studied (see Table 2). Scarfy, for example (Section 6.1), draws on previous studies [42,43] that propose novel EDA feedback. Although works such as Reference [41] offer interesting, different ways of looking at and engaging with biodata, in our case we avoid visual feedback and design with the body to investigate effects that can be worn and felt. However, our results, which let us envision platforms to support the design of couplings, lack the perspective of longitudinal studies highlighting users' data interpretation [40].



Our claim here is not that a soma design approach is the *best* or most efficient way to design with biosignals. Instead, we argue that soma design provides an interesting way to bridge engineering and interaction design perspectives, and that this bridge in turn yields novel, creative, and relevant design concepts. In our work, it led to the creation of digitally enabled experiences that succeeded in making us aware of sensations and reactions of our own bodies as well as those of our peers, at the same time as these explorations pinpointed technological challenges when sensors and actuators were used in ways they were not intended for. It helped us move away from the predominant health optimization or fitness performance paradigm present in the physical and activity tracking devices that most biosensors are built for. In this sense, it provided a richer space for what biosensing might be used for.

## **7. Discussion**

The coupling prototypes presented in this paper, in line with the reflections presented in Reference [49], led to what are arguably design discoveries in combining biosensing with body actuation. These are used to highlight the role of the body and, ultimately, to make us, the users/designers, connect more intimately with it. Soma design is not a shortcut to circumvent the difficulties present when designing with biosensing, but a way to approach them differently. Interaction designers must face the same challenges that engineers or developers struggle with when evaluating which form factor, sampling rate or placement of a sensor is best for a given input or action of interest. Within our soma design exploration, though, issues such as noise in muscle tracking, electrode misplacement, or sensor undersampling that leads to no data variation are experienced through, for example, distorted sounds, excessive vibrations, or changeless temperature feedback, echoing what Fdili Alaoui writes about artists avoiding a problem-solving approach and turning technology resistance into creativity [81]. We believe that instances of couplings that have succeeded in bringing design insight should be integrated into a design or prototyping toolkit. Furthermore, our design approach offers foundations for successfully integrating different sensing and actuation modalities in a way that is evocative to the body.

With regard to Scarfy (Section 6.1), there is a direct link to the works [42,43] that inspired an EDA coupling felt more closely on the body; those works emerge from an affective awareness goal. That is also the case for breathing, where Miri et al. [46] show haptic actuation examples with a clear affect intervention workflow. Our approach, instead, is that of supporting the design process. We do so by exploring what effects are possible and using the designer's own felt body to assess them. This change of focus is relevant: for instance, instead of using feedback as a breathing pacer [ibid.] or an affect control mechanism, we delve into the experiential properties of the biosignal at hand and how they are shared or understood collectively. Although there is room for development, the paradigms we used depict avenues in which we aim to widen the palette of interactions, refine orchestration mechanisms, and connect to the underlying ethics of our way of designing bodily awareness or affective technologies.

#### *7.1. Orchestration: The Soma Bits Toolkit*

In previous work, we brought forth the Soma Bits, a prototyping toolkit [48]. Acting as accessible "sociodigital materials", the Soma Bits allow designers to pair digital technologies with their whole body and senses as part of an iterative design process. The Soma Bits have a form factor and materiality that allow actuators (heat, vibration, and shape-changing) and sensing (biosensors and pressure sensors) to be placed on and around the body (see Figure 7). They comprise a growing library of three-dimensional physical soft shapes, made of stretchable textile and memory foam. Each shape has at least one pocket, making it possible to insert different sensing or actuating components. By combining several actuators with shapes, one can orchestrate experiences and explore the qualities of the sociodigital material directly on the body, by changing the parameters of the sensors and actuators and placing the shapes on different parts of the body.

**Figure 7.** Elements of the Soma Bits design toolkit: (**a**) shapes, (**b**) temperature actuation, (**c**) vibration actuation.

The Soma Bits are easily (re-)configurable to enable quick and controllable creation of soma experiences, which can be part of a first-person approach as well as shared with others. In a first-person exploration, someone can, for example, experience and reflect on the properties of heat actuation on their foot and gain a bodily understanding of the heat modality, which can later be integrated into the design of interactive systems. In exploring somaesthetic experiences shared with others, one person may, for example, feel on their spine the breathing patterns and rhythm of another person, translated into a shape-changing pattern experienced through the spine soma shape that is part of the Soma toolkit.

We have taken the first steps towards orchestrating collective behaviors of the Soma Bits by combining different Bits and allowing interaction designers to program complex behaviors (e.g., slowly shape-changing materials that heat up when a user presses on them). To achieve this, we aimed at a protocol and an interface for connecting Soma Bits together. In the middle, between sensing and actuation, we provide an "orchestration unit". This unit acts as a hub for the sensor and actuator network and is controllable through OSC (Open Sound Control) [79,80]. This protocol enables the use of common musical interfaces, such as controllers or sequencers, allowing end-users to program the Bits without having to write code. We have also started using supervised learning algorithms to quickly bootstrap interactions. These algorithms allow noisy sensor data to be mapped to actuation, which in turn allows more complex behaviors to be programmed into the Bits, again without writing code.

We tested the Soma toolkit, focusing mainly on combinations of shapes and actuators, during three workshops with interaction design researchers and students from several disciplines. Research purposes were explained to participants, who signed informed consent forms. The first was a one-day workshop at the Amsterdam University of Science in October 2018, in which 30 master's students engaged in a soma design process with the Soma toolkit as the main medium to explore actuation and bodily experiences around the topic of empathy. The second workshop took place in December 2018 at the Mixed Reality Lab, University of Nottingham, UK, and focused on the Soma Bit shapes, addressing the topic of balance. First-person accounts of the designers involved in the study and an elaborate analysis of the workshop's design outcomes can be found in References [49,82]. Together, we explored the Soma Bits for three days in several design contexts, including VR applications and leg prosthetics for dancers. During this workshop, we introduced sensing, in addition to actuation, through the BITalino prototyping platform [25,55,56]. We also initiated the design of couplings between sensing and actuation, for example by translating movement, through acceleration, to sound. The third workshop deploying the Soma Bits was conducted in February 2019 in Milan (see examples in Reference [49]). In this workshop, researchers from several disciplines, including psychology, engineering, and interaction design, experienced different prototypes that were brought to the workshop, in combination with the Soma Bits toolkit. The workshop lasted one day; synchrony was the main topic underlying the prototype demonstrations, as well as the bodily and technological explorations.

As a general reflection, we observed that as soon as the Soma Bits toolkit was introduced to the design process, the workshop participants shifted their attention to experiencing the sensing-actuation technology through their bodies, rather than just on a conceptual or verbal level. On a broader level, the Soma Bits toolkit addresses a difficulty we experienced in past soma design processes—that of articulating the sensations we want to evoke to others, and then maintaining these experiences in memory throughout a design process. Thus, the Soma Bits enable designers to know and experience what a design might feel like, and to share that with others. The Soma Bits have become a living, growing library of shapes, sensors, and actuators, and we continue using them in our design practices, as well as when engaging others in soma design processes.

#### *7.2. Missing Bits: Shape-Changing Actuation*

In creating a Soma Design toolkit, we aim for a wide range of tunable modalities, so that users can explore and create the effects that communicate best for their soma. In this regard, we know that our prototyping efforts fall short of making shape-changing actuation available. HCI research has already shown some of the design potential behind linear actuators and inflatable shape-changing mechanisms, which inspire us and guide our future research perspectives.

### 7.2.1. Linear Actuators

The linear actuator serves as a central piece of what we call the Soma Pixel system. Soma Pixel is a modular, interchangeable sensor-actuator system (see Figure 8a), inspired by shape-changing projects carried out by MIT researchers from the Tangible Media and Senseable groups.


We aim at having a number of smart modules ("pixels"), capable of sensing the human body (pressure by weight, biosignals) and providing certain actuation. The shape-changing actuator (currently a linear one) serves as a skeleton for the "pixel". Sensors and other actuators are intended to be located in the upper part of the device. The modules can easily be rearranged in space while "knowing" their relative positions with respect to each other. At the moment, in the Soma Pixel setup, the linear actuator is coupled with a force sensor, and actuation happens when weight or force is applied to the sensor. This comes with limitations: the linear actuators cannot change length quickly, and the motor inside the linear actuator makes substantial noise when running.
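A control loop for one such "pixel" could look like the sketch below; `read_force()` and `set_length()` are placeholder drivers we introduce for illustration, not APIs from this work, and the threshold and stroke values are assumptions.

```python
# Hypothetical control loop for one Soma "pixel", illustrating the
# force-triggered actuation described above. read_force() and set_length()
# are placeholder drivers, not APIs from the paper or any real library.
import time

def read_force() -> float:
    """Placeholder for the force sensor driver (returns newtons)."""
    return 0.0

def set_length(mm: float) -> None:
    """Placeholder for the linear actuator driver."""
    print(f"actuator length -> {mm:.1f} mm")

FORCE_THRESHOLD_N = 5.0   # assumed trigger level
MAX_TRAVEL_MM = 40.0      # assumed actuator stroke

for _ in range(100):                    # bounded loop for the sketch
    force = read_force()
    if force > FORCE_THRESHOLD_N:
        # Extend proportionally to the applied force, clamped to the stroke.
        set_length(min(force * 2.0, MAX_TRAVEL_MM))
    else:
        set_length(0.0)                 # retract when force is released
    time.sleep(0.05)                    # slow loop; the actuator is slow too
```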

**Figure 8.** Shape change: (**a**) Prototyping with linear actuators (**b**) Inflatable shape.

### 7.2.2. Inflatable Shapes

In an ongoing research line, inflatable shapes are currently being used to construct a "singing corset" prototype, as well as a part of a newer revision of the Soma Bits toolkit. The corset is built upon an Arduino-powered inflatable that will be integrated with the Soma Bits (see Figure 8b). However, the first iteration of inflatable shapes had difficulties with fast deflation. This was partly solved by adding another air pump devoted specifically to exhausting air. Experimenting with separators inside an inflatable shape, or splitting it into multiple inflatable sections, may further improve exhaust performance. Another limitation is unavoidable minor air leakage, which occurs due to imperfections in manufacturing the actuator (sealing of the inflatable shape, valve timing).

#### *7.3. Extending Orchestration: The Role of Biosignal Processing*

A relevant part of our research has focused on extending biosignal feature extraction, processing, and analysis in real time, to enrich real-time feedback possibilities within the lab and in more ecological settings. While the initial sensing-actuation orchestration couplings have successfully made use of basic signal processing, further possibilities lie in improving the current algorithms and processing approaches. Any sort of biosignal acquisition, particularly in psychophysiological sensing, can be seen as a process of dealing with sequential, time-series data. Prior to deploying feature extraction and selection techniques, the data must be properly preprocessed. ECG, EMG, EDA, IMU, respiration, and the other data collected from available sensors are in general sampled at different rates and present different properties, which entail signal noise and instances of data that do not conform to the expected (standard) representations. These can be seen as artifacts or "meaningless" data. Since the data obtained in real-world ambulatory settings is always noisy, presenting inconsistencies or missing values, preprocessing and cleaning are required (see examples in previous studies we conducted [57]).

As is the case for the features listed in Section 3, time-domain and statistical features are accessible in the wild; when designing with breathing and/or heart monitoring (ECG), these include the mean value of the rates, the mean time between events, the standard deviation of those intervals, and the root mean square of successive interval differences. Of particular interest are the less apparent frequency-based features, which, given the computational requirements of spectral analysis, have only recently begun to appear in the now more capable out-of-the-lab devices. While certain features count on solid research support, pointing at the most scientifically validated ones, attention is given to exploring the potential mappings that can be built upon them.

The use of multimodal data (namely, several biosensors at the same time) extends the monitoring capabilities that can be brought to the couplings' design: it widens the perspective and keeps track of body signals that are not the main focus of the interaction. This allows the designer to put validation mechanisms in place, gaining accuracy and supporting the claims made with respect to the targeted biosignal. If orchestration aims at the creation of meaningful technology-mediated interactions, the connections made need to rely on processing capabilities, event detection, and high-level feature extraction to overcome the limitations of overly basic couplings that only put simple (signal) amplitude-to-intensity mappings in place. Moreover, the group somaesthetic design appreciation sessions where the felt experiences are brought together and debated (usually relying on inspirational body-centered exercises, reenactments, body sketching tools, and verbal communication) could be significantly enriched by monitoring signals acquired during the design experience. As design explorations have already started to show, there seems to be room for the creation of machine learning and processing capabilities that could render interactions with the technology more implicit (also embodied or intuitive).
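As an illustration of the time-domain features named above, the following sketch computes them from a sequence of inter-beat (RR) intervals using standard definitions (SDNN, RMSSD); it assumes `numpy`, and the toy data is invented for the example.

```python
# Sketch of the time-domain features named above, computed from a list of
# inter-beat (RR) intervals in seconds; numpy only, standard definitions.
import numpy as np

def time_domain_features(rr: np.ndarray) -> dict:
    """Mean rate, mean interval, SDNN and RMSSD from RR intervals."""
    diffs = np.diff(rr)
    return {
        "mean_rate_bpm": 60.0 / rr.mean(),        # mean heart/breath rate
        "mean_interval_s": float(rr.mean()),      # mean time between events
        "sdnn_s": float(rr.std(ddof=1)),          # std of the intervals
        "rmssd_s": float(np.sqrt(np.mean(diffs ** 2))),  # successive diffs
    }

rr = np.array([0.82, 0.80, 0.85, 0.79, 0.83])  # toy data
print(time_domain_features(rr))
```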

#### *7.4. Extending Orchestration: Machine Learning Capabilities and Personalization*

Standard machine learning workflows are usually multi-step and involve the definition of problem-specific feature extraction methods, as well as in-depth expert knowledge of the problem at hand. Deep Learning (DL) [87] techniques, also part of our signal processing research efforts, shift the ML practitioner's focus from feature extraction to feature learning. In particular, in end-to-end learning settings, DL models are directly fed the raw input signal (i.e., without any form of pre-processing, or at most very mild pre-processing) and use it to automatically extract (deep) features and make predictions based on them [23,88]. These benefits apply as well to machine learning approaches aimed at improving pattern recognition capabilities of relevance to physiological or affective data [89]. In cases in which the input signal is high-dimensional, or when timestamps are a relevant characteristic of the problem at hand (e.g., biosignals, multimodal data), DL methodologies have been shown to outperform traditional pattern recognition techniques.

While research on real-life DL deployments is still in progress, with algorithmic computational resources being one of the main limitations nowadays, DL approaches emerge as mechanisms to overcome the difficulty of devising sensible hand-crafted features for a given classification problem. Provided that advances towards DL in ambulatory scenarios are made, orchestration design (of sensing-to-actuation couplings) can potentially benefit from DL feature learning by lowering the technology expertise burden on the designer/experimenter side. Overall, this would allow experience design to focus more specifically on the affordances of the interaction rather than the processing mechanisms. To this end, work on Convolutional and Recurrent Neural Networks (CRNN) has shown that classification/recognition performance improvements can be obtained through algorithm personalization, that is, without requiring user-specific calibration. Typically, personalized machine learning models embody the idea of learning individual behaviors by training a model on data collected only from the subject. These models are usually customized to the requirements of each individual, imposing the need to consider individual differences, yet still using data collected from the population. To some extent, new deep learning paradigms are circumventing these requirements.
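As a concrete illustration of such an end-to-end architecture, the sketch below defines a minimal CRNN in PyTorch (the framework is our assumption; the text does not prescribe one) that takes a raw single-channel biosignal window and outputs class scores, with no hand-crafted features.

```python
# Minimal end-to-end CRNN sketch in PyTorch (framework choice is an
# assumption): raw single-channel biosignal in, class scores out.
import torch
import torch.nn as nn

class BiosignalCRNN(nn.Module):
    def __init__(self, n_classes: int = 3):
        super().__init__()
        # Convolutional front-end learns local (deep) features from raw data.
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
        )
        # Recurrent layer models the temporal structure of those features.
        self.gru = nn.GRU(input_size=32, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):                  # x: (batch, 1, samples)
        feats = self.conv(x)               # (batch, 32, steps)
        feats = feats.transpose(1, 2)      # (batch, steps, 32)
        _, h = self.gru(feats)             # h: (1, batch, 64)
        return self.head(h[-1])            # (batch, n_classes)

scores = BiosignalCRNN()(torch.randn(8, 1, 1000))  # 8 raw signal windows
print(scores.shape)  # torch.Size([8, 3])
```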

In Section 3, we describe how embodied sensor technologies react differently in accordance with the unique biological characteristics of the body. Similarly, the perceived impact of a given actuation mechanism—such as those described in Section 4—largely depends on the sensitivity to a given stimulus, as well as the natural bodily variations between different users. With this in mind, we recognize the necessity of attuning the system's parameters in order to produce mappings that facilitate meaningful interactions without being overly obtrusive. While auto-calibration mechanisms, which typically define minimum/maximum parameter ranges, have been implemented in the previous examples, we foresee an extended benefit in adopting Interactive Machine Learning (IML) [90] frameworks as a means to foster perspectives that respect body pluralism. Existing frameworks such as *Wekinator* [91] and *Teachable Machine* [92] facilitate the design of classification- and regression-based mappings that are initiated and iteratively adjusted with example data provided by the user (e.g., Reference [93]). Furthermore, we would like to explore the use of Interactive Machine Learning to develop novel coupling relationships that go beyond linear mappings, as well as intuitive mappings between multimodal inputs and multi-dimensional outputs.
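The sketch below illustrates both ideas under stated assumptions: a running min/max auto-calibrator, and a Wekinator-style mapping trained interactively from user-provided examples, with scikit-learn's k-nearest-neighbors regressor standing in for a full IML framework; the features and values are toy data.

```python
# Sketch of the two mechanisms discussed above: running min/max
# auto-calibration, and an interactively trained regression mapping
# (scikit-learn assumed; this is not Wekinator or Teachable Machine).
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

class AutoCalibrator:
    """Track the observed range and rescale samples into 0..1."""
    def __init__(self):
        self.lo, self.hi = float("inf"), float("-inf")

    def __call__(self, x: float) -> float:
        self.lo, self.hi = min(self.lo, x), max(self.hi, x)
        span = self.hi - self.lo
        return 0.5 if span == 0 else (x - self.lo) / span

examples_x, examples_y = [], []   # user-demonstrated input/output pairs

def add_example(features, actuation):
    """The user demonstrates: 'for this input, I want this output'."""
    examples_x.append(features)
    examples_y.append(actuation)

def train():
    model = KNeighborsRegressor(n_neighbors=min(3, len(examples_x)))
    return model.fit(np.asarray(examples_x), np.asarray(examples_y))

# Toy example: map 2-D (EDA, respiration) features to haptic intensity.
cal = AutoCalibrator()
print([round(cal(v), 2) for v in (2.1, 7.9, 5.0)])  # auto-scaled samples
add_example([0.2, 0.1], 0.0)
add_example([0.8, 0.9], 1.0)
mapper = train()
print(mapper.predict([[0.5, 0.4]]))   # interpolated actuation level
```

Retraining after every added example keeps the loop interactive: the user feels the resulting actuation, corrects it with a new demonstration, and the mapping is refit immediately.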

In our sound-based examples, visual programming environments heavily assisted the orchestration process. In both cases, the systems enabled users to visualize a continuous stream of mappable data in real time, clearly exposing any unexpected behavior that might occur (for example, due to the displacement of sensor electrodes). The node-based functionality of the frameworks allowed for a coherent representation of the dataflow and signal processing steps in order of execution, which is less abstract than a code-based script. While developing the system, a user interface is generated in parallel, on the fly, as each node presents a GUI element granting the designer access to parameters such as scaling and smoothing coefficients. In Section 6.2, a basic interface allowed users to test and compare a set of algorithms for sound mappings. This workflow can be beneficial for rapid experimentation with a variety of parameters and signal processing techniques that influence the interactive experience. It also presents a convenient solution for fine-tuning a complete system according to the user's experience.

#### *7.5. Ethical Underpinnings*

We have provided examples of research on sensing and actuation technologies, first-person somaesthetic approaches and experiments to foster the design and research of self-awareness (embodied, wearable) technologies with the potential to support self-reflection, emotion regulation, and affective health. In any design process, and particularly for technologies that may be used in affective or health contexts, it is important to consider the ethical implications of design, use, and research. Ethics simply concerns what is *good*, with a utilitarian perspective dealing with the greatest good for the greatest number of people. To assist in ethical decisions and practices, ethical frameworks outline key concepts such as beneficence and nonmaleficence, justice, responsibility, autonomy, privacy and confidentiality, and respect for the rights and dignity of others [13,14,94–97]. By reflecting on these ethical principles and standards, designers and researchers can ensure that their processes, technological developments, and research are done ethically and for the greater good.

The concept of first-person somaesthetic design reflects qualities of good ethical practice through its focus on experiencing each aspect of the technology to facilitate its appropriate use, integration, and development. This extensive design process explores the benefits and potential adverse effects of the various sensors and actuators and uses this to shape orchestration and future design. Guiding users in interpreting the captured data throughout the felt experience, and exposing or creating the meanings attached to personal sensing, is a deliberate effort to make interactions intuitive yet conspicuous. In Reference [67], designing with the body is seen as a route for designers to harmonize with their felt experiences, an alternative of particular relevance in light of the implicit interaction direction that personal tracking technologies seem to be moving towards. The slow and deliberate soma design methods also reinforce the careful consideration of how technology may impact experience, and therefore the potential effects this may have on future users. Part of our research on orchestration attempts to provide the design exploration ground that would show the capabilities, limitations and roles of the technologies participating in the designed interactions, in line with the first-person perspective. It is important to acknowledge that, when stepping into the user's shoes, designers put themselves in the line of fire: potential effects for the user are experienced firsthand.

Another ethical strength lies in acknowledging the need for variety and customization of these experiences to suit individual end-users and their specific needs, as avoiding blindness to body differences is advocated early on in first-person work [67]. This is important for inclusive and diverse technologies, which must consider not just individual differences across users, but also users with additional needs or impairments who may be disadvantaged or excluded by technologies that focus on only one modality, such as visual interfaces. Ethical somaesthetic design involves the consideration and incorporation of ethical principles and practices from conceptualization and throughout the design and development lifespan. Designers and researchers should be familiar with core ethical principles and should reflect on how these may shape their design practices, research, and technological developments. Research examples such as the work of Balaam et al. [35] point at ways to challenge design practice, especially in emotion work, and to avoid engaging participants by default, instead questioning and justifying their engagement. Autoethnographic and first-person design are seen as promising alternatives.

Future research should explore how to incorporate user-centered design within the first-person design process. This would strengthen the premise that first-person experiences can be used to create devices that serve the diversity of human experiences. This is especially crucial for devices that may be used for emotion regulation or affective health, where different users may have different experiences, needs, and risks based on their unique histories and circumstances. Experiments such as the design explorations and couplings described in this paper can, therefore, be adapted to involve persons with varied lived experiences, to explore how their experience of the soma toolkit or other technologies may differ from that of the designers.
This is further connected to the concepts of beneficence, non-maleficence, and justice, which encompass issues related to benefits, risks, safety, fairness, and equal access for all. Technology offers opportunities to reduce barriers, but to do so, design and development must consider how to deliver innovative and impactful technology while remaining accessible and affordable. Designers also have a responsibility to consider the intended and unintended potential consequences of any new technology, and the need for appropriate design, guidance, and support to ensure safe use. While discussions of impact, effects, and outcomes center on end results, it is crucial for these to be considered at the beginning of the design process and throughout, to ensure the creation of good and safe technology. Also important to consider are issues of data security, privacy, and ownership of personal data, with the safe and secure handling of biosensor data being an important design consideration with ethical and legal implications for its use and misuse. Finally, while somaesthetic couplings may offer opportunities for self-awareness and emotion regulation, designers must consider the balance between optimizing and pathologizing typical human experiences, as well as the potential stigma of encouraging the tracking and monitoring of affective health. As with all ethical issues, this must be considered throughout the design process. Like somaesthetic design itself, ethical design must be a continuous process integrated throughout all aspects of the design experience and production.
