Article

A Smart Modular IoT Sensing Device for Enhancing Sensory Feedbacks in Surgical Robotics

Mafalda Rosa, Rongrong Liu, Giorgio Pitruzzello and Giuseppe Tortora *
1 BioRobotics Institute, Scuola Superiore Sant'Anna, 56127 Pisa, Italy
2 Department of Excellence in Robotics & AI, Scuola Superiore Sant'Anna, 56127 Pisa, Italy
3 Smart Medical Theatre Laboratory, ABzero, 56124 Pisa, Italy
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(16), 8083; https://doi.org/10.3390/app12168083
Submission received: 11 July 2022 / Revised: 5 August 2022 / Accepted: 10 August 2022 / Published: 12 August 2022
(This article belongs to the Special Issue Trends and Challenges in Robotic Applications)

Abstract

This paper proposes a sensing device that can be integrated into the instruments of any surgical robot. Despite advances in robot-assisted laparoscopic surgery, the tools currently supplied with surgical robots offer limited functionality because they lack sensorization. With this motivation, we present a preliminary work covering the design, development, and early experimentation of a smart, multifunctional sensing device for surgical tools. The proposed device includes a proximity sensor, a colorimetric sensor, and a BLE connection that allows different surgical instruments to communicate with each other. The proximity feedback tells the surgeon the distance between the instrument and a particular tissue, allowing operation under safer conditions. The colorimetric feedback, in turn, is intended to identify specific tissue areas with characteristics that are not typical of physiological tissue. The results show that the device is promising and can be further developed to address multiple clinical needs in robotic procedures. This system can effectively extend the functionality of surgical instruments by overcoming the sensing limitations introduced by the use of robots in laparoscopic surgery.

1. Introduction

The past few decades have seen exponential growth in medical technology, particularly in the application of robotics to surgery. Robotic surgery, the latest evolution of minimally invasive surgery, overcomes the limits of traditional surgery; it has broadened therapeutic horizons and represents the gold standard for various clinical applications.
Robotics is central to modern healthcare engineering. The first robot used in a clinical setting was the Puma 560, which obtained neurosurgical biopsies in 1985; increasingly advanced surgical robots have been developed since [1,2]. Overall, the use of robotic surgery grew significantly between 2012 and 2018, from 1.8% to 15.1% of all general surgery procedures, while the use of both laparoscopic and open surgery declined over the same period; for example, open surgery accounted for 42.4% of procedures in 2012 versus 32.4% in 2018 [3]. Robotic surgery has also spread rapidly across numerous procedures in the years following its adoption in hospitals, and most surgeons already consider it a safe and effective approach when clinically feasible.
Current robotic platforms are designed to incorporate advanced features that increase accuracy by making operator tasks easier and safer to execute. Additionally, surgical robots have retained the ability to perform operations through smaller incisions. These characteristics aim to improve outcomes compared to those obtainable with traditional surgical methods. The adoption and diffusion of robotic surgery show a positive trend in some geographical areas, especially in countries with advanced economies, as demonstrated by the widespread use of the da Vinci Surgical System (Intuitive Surgical Inc., Sunnyvale, CA, USA) in a multitude of application areas [4]. Currently, the da Vinci Surgical System is the most widespread surgical system, with over 5000 units installed worldwide that have performed over 7 million procedures in different anatomical areas. The da Vinci Research Kit (dVRK) fits into this context: a research platform developed through a collaboration among academic institutions to lower the barriers to starting research on surgical robotics. It has significantly boosted surgical robotics research over the past decade and has generated new opportunities for collaboration and for linking surgical robots to other technologies.
Among the advantages introduced by robot-assisted surgery are reduced tissue trauma thanks to small incisions; less bleeding and less need for transfusions; shorter hospital stays and less post-operative pain; shorter recovery times and a quicker return to daily activities; and greater ease in executing complex surgical tasks, which entails greater safety for the patient. On the other hand, the disadvantages of robotic surgery are mainly linked to the cost of the robotic system, instrumentation, and maintenance, and to the fact that operating the robot requires very high-level skills from the surgeon and operating-room staff, acquired through specific training [5].
During open surgery, surgeons can use their hands to locate and diagnose abnormal tissue by direct palpation; in laparoscopic surgery, direct palpation is not feasible, due to the limits of the incision [6,7]. For this reason, many studies focus on the lack of haptic feedback during the surgical procedure, in addition to visual feedback [8,9]. Nevertheless, surgeons using robotic technologies could also benefit from other types of feedback, such as color, speed, and proximity feedback, to further broaden the fields of application of surgical robots [10,11].
In this work, an attempt is made to restore the functions that are lost in robotic laparoscopic surgery by sensorizing the surgical instrumentation. The design intention is to develop an intelligent and multifunctional sensing device that improves the performance of surgical robots, interconnects instruments, and enables, in the future, the development of AI algorithms.

2. Materials and Methods

2.1. Device Overview

In this section, the overall design of the proposed sensing device is introduced. For the implementation, the following needs have been considered and analyzed:
- The design of a compact mechanical concept;
- A circuit that integrates the functional components to detect color and proximity;
- A machine learning model that allows the classification of tissues based on their color;
- A BLE (Bluetooth Low Energy) communication to allow the interconnection of the various arms and the introduction of the IoT to surgical robotics.
The main purpose of the first prototype proposed in this manuscript is the sensorization of the surgical instrument used during procedures with surgical robots, to ensure safer interaction with the organs. For this first prototyping phase, an Arduino Nano 33 BLE Sense board is used, due to its compact size and its integrated sensors: both the colorimetric and the proximity sensors are part of the APDS9960 unit built into the board. In particular, the proximity sensor provides feedback on the distance between the tip of the surgical instrument and the organs, to ensure safe interaction. In addition, the colorimetric feedback, combined with a simple neural network, allows the identification of specific tissue areas with characteristics that are not typical of physiological tissue, such as cancerous structures. This kind of recognition paves the way for the future classification of healthy and diseased tissue. A 3D-printed mechanical support, consisting of two main parts held together by a magnetic anchor, is used to mount the electronic board on the robotic instruments. The magnets are integrated into the mechanical structure of the device and allow quick alignment of the two parts and stable anchoring to the instrument. The dimensions of the holder are compatible with conventional da Vinci EndoWrist tools; as the sensing device is designed for robot-assisted laparoscopic surgery, it can be integrated into the instruments of the Patient Side Manipulator (PSM) of the da Vinci Research Kit (dVRK) and teleoperated through the Master Tool Manipulator (MTM) [12] (Figure 1).

2.2. Design and Implementation of the Sensing Device

2.2.1. Mechanical Design

A mechanical support is created to hold the Arduino Nano 33 BLE Sense board in position on the dVRK instrumentation during the execution of the validation tasks. It is conceived in two versions to adapt to different positions along the dVRK instruments. The first version of the prototype, shown in Figure 2, lets the board lie flat along the robot instrument. This arrangement is ideal during BLE communication between the various robot arms, as there is no need to keep the on-board sensor in a specific orientation. Furthermore, this first version is characterized by a reduced size and better flexibility during movements.
The second version, shown in Figure 3, is designed for cases in which the embedded sensors must face the work surface. This is useful in the experimental validation phases, during the detection of the color and proximity of tissues placed on the work surface.
For both versions, the support is made up of two main parts. The first part houses the electronic board, while the second connects to the first to position the device along the EndoWrist tools, as shown in Figure 2 and Figure 3. The two structural portions of the mechanical support are held together by four K&J Magnetics neodymium (NdFeB) magnets per side. The magnetic blocks have dimensions of 0.64 × 0.32 × 0.08 cm and grade N42. The magnetic anchoring enables quick alignment of the two mechanical parts and seamless integration of the sensing device with the dVRK instruments. At the same time, this magnetic mechanism makes it possible to separate the two structural portions of the support after the tasks have been performed, using the tip of the surgical instrument itself [13,14]. This operation not only saves time, but can be performed without the help of an assistant surgeon.

2.2.2. Hardware Components

In this research, we employ the Arduino Nano 33 BLE Sense board, which is equipped with embedded sensors to detect color, proximity, motion, temperature, humidity, audio, and more. The embedded sensors allow the board to support numerous IoT and AI applications without requiring external sensors. The board is built upon the nRF52840 microcontroller and runs Arm® Mbed™ OS. The processor offers other important features, such as Bluetooth® pairing via NFC and ultra-low-power consumption modes. For the sensorization of the device, we rely on the APDS-9960 unit built into the Arduino Nano 33 BLE Sense board, which features advanced gesture sensing, proximity sensing, digital ambient light sensing (ALS), and color sensing (RGBC). This modular unit has dimensions of 3.94 × 2.36 × 1.35 mm, and incorporates an IR LED and a factory-calibrated LED driver.
The proximity detection function measures distance by sensing, on photodiodes, the IR energy emitted by the built-in LED and reflected by the target. Detect/release events are interrupt-driven and occur whenever the proximity result crosses the upper and/or lower threshold settings. The IR LED intensity is factory-trimmed to eliminate the need for end-equipment calibration due to component variations, and the proximity results are further improved by automatic ambient light subtraction. The proximity results are affected by three basic factors: IR LED emission, IR reception, and environmental factors, including the distance to the target and its surface reflectivity. The photodiode signal is combined, amplified, and offset-adjusted to optimize performance. The color and ALS detection feature provides red, green, blue, and clear light intensity data. Each of the R, G, B, and C channels has a UV and IR blocking filter and a dedicated data converter producing 16-bit data simultaneously. This architecture allows applications to accurately measure ambient light and sense color, which enables devices to calculate color temperature and control display backlight.

2.2.3. IoT and Bluetooth Low Energy Connection

In recent years, we have seen a significant advance in digital technologies, which has contributed to the current concept of the Internet of Things (IoT). At the basis of the IoT are "intelligent" objects interconnected to exchange the information they own, collect, and/or process. A smart object must first be identifiable, that is, have a unique identifier in the digital world, and then be connected so that it can transmit and receive information. These smart connected devices process and share all kinds of data with each other and can be controlled via the Internet. Energy-efficient short-range wireless communication technologies such as Bluetooth Low Energy (BLE) fit into this context [15]. This section shows how it is possible to exchange information between two Arduino Nano 33 BLE Sense boards. With this communication, the various arms of the da Vinci surgical robot can be interconnected.
When a Bluetooth® connection is established, the central device scans for surrounding devices and "listens" for those transmitting information, while each peripheral device advertises its data to any nearby device. As soon as the central device collects information from a peripheral device, it attempts to connect to it. Once the connection is established, the central device can interact with the information that the peripheral device makes available. This information exchange takes place through so-called services. By grouping the various device capabilities into services, peripherals allow central devices to quickly find, select, and interact with the desired services. Every service has a unique identifier called a UUID; this code can be 16 or 32 bits long for services defined in the Bluetooth® specifications. One of the two Arduino Nano 33 BLE Sense boards is configured as a central device, and the other as a peripheral. The information shared between the two boards comes from the proximity sensor of the APDS-9960 unit integrated in the Arduino Nano 33 BLE Sense board.
For this purpose, the ArduinoBLE library was used, and a service called proximityService was created with a characteristic called proximity_type, as shown in Figure 4. The central device tries to establish a connection with the peripheral device and to discover the service and the characteristic specified in the code. If the connection is made successfully, the built-in proximity sensor of the central Nano 33 BLE Sense board is activated. When a proximity value is detected by the sensor, the central device gives feedback through the serial monitor on the type of distance detected (FAR, MIDDLE, or CLOSE), and the value is written to the proximity_type characteristic of the proximityService service in the peripheral device. In addition, the on-board LED of the peripheral device lights up according to the detected value: if the detected distance exceeds the FAR threshold, the green LED turns on; at MIDDLE distance, the blue LED is on; finally, if the object is close to the sensor, the red LED lights up.
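To illustrate this flow, the following is a minimal sketch of the central side, which (as in the setup described above) reads the local proximity sensor and writes the distance class into the peripheral's characteristic. It is a reconstruction based only on the public ArduinoBLE and Arduino_APDS9960 APIs, not the authors' exact code; the 128-bit UUIDs are placeholders, since the paper does not report them.

```cpp
// Central side: scans for the peripheral advertising proximityService,
// connects, then streams the locally sensed distance class into the
// peripheral's proximity_type characteristic.
#include <ArduinoBLE.h>
#include <Arduino_APDS9960.h>

const char* kServiceUuid = "19B10000-E8F2-537E-4F6C-D104768A1214"; // placeholder UUID
const char* kCharUuid    = "19B10001-E8F2-537E-4F6C-D104768A1214"; // placeholder UUID

void setup() {
  Serial.begin(9600);
  while (!Serial);
  if (!BLE.begin() || !APDS.begin()) {
    Serial.println("Initialization failed!");
    while (true);
  }
  BLE.scanForUuid(kServiceUuid);  // listen for advertisements of our service
}

void loop() {
  BLEDevice peripheral = BLE.available();
  if (!peripheral) return;

  BLE.stopScan();
  if (peripheral.connect() && peripheral.discoverAttributes()) {
    BLECharacteristic proximityType = peripheral.characteristic(kCharUuid);
    while (peripheral.connected() && proximityType) {
      if (APDS.proximityAvailable()) {
        int p = APDS.readProximity();     // 0 (close) ... 255 (far)
        byte cls = (p > 150) ? 0          // FAR    -> peripheral green LED
                 : (p > 60)  ? 1          // MIDDLE -> peripheral blue LED
                 : 2;                     // CLOSE  -> peripheral red LED
        Serial.println(cls == 0 ? "FAR" : cls == 1 ? "MIDDLE" : "CLOSE");
        proximityType.writeValue(cls);    // peripheral lights its LED accordingly
      }
    }
  }
  BLE.scanForUuid(kServiceUuid);  // resume scanning after disconnect
}
```

On the peripheral side, a matching sketch would register proximityService with BLE.addService(), advertise it, and map each received byte to the on-board RGB LED.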

2.2.4. Complete Prototype

All the mechanical parts making up the sensing device were produced with a 3D printer. The two mechanical parts and the electronic board described above were then assembled using the magnetic anchor and positioned on the EndoWrist tool of the dVRK system. Figure 5 clearly shows how the first version of the prototype guarantees a minimal footprint and better compactness. This support configuration is preferable when there is no need to detect objects placed on the workspace with the integrated sensors; for example, it can be used to demonstrate the potential of the BLE connection between two boards placed on two arms of the dVRK. Furthermore, during the validation of the device on the dVRK instrumentation, it was found that the friction between the surface of the instrument and the inside of the device prevents any movement of the support; the stability of the electronic component is therefore guaranteed.
The second assembled version of the prototype, unlike the first, keeps the board parallel to the work surface. The main limitation of this configuration is the larger footprint, shown in Figure 6, which could obstruct the other tools of the robotic system during task execution. Furthermore, although it may seem less stable than the first configuration, testing showed that, thanks to the magnets, the sensing device remains firmly in position during the entire procedure carried out with the dVRK system.

2.3. Experimental Evaluations

This section presents the experiments conducted to verify the possibility of introducing a proximity and color sensor on the EndoWrist instrument.

2.3.1. Proximity Detection and Experiments with Organs

Proximity calibration was performed using the embedded APDS9960 unit of the Arduino Nano 33 BLE Sense. The IR receive-signal path begins with IR detection on four photodiodes and ends with an 8-bit proximity result (256 values) in the PDATA register. We therefore focused on the proximity readings, which are based on the IR signal reflected by the tissue onto the photodiodes and which we later related to physical distances during calibration. The board was programmed to print out simple proximity detections and to change the color of the RGB LED according to the proximity of a tissue to the sensor. To take advantage of the functions of the APDS9960 detection unit, the <Arduino_APDS9960.h> library was used. In particular, the LED lights up green if the object is far from the sensor (proximity value > 150), blue at intermediate distance (60 < proximity value < 150), and red if the object is very close to the sensor (proximity value < 60); these threshold values were implemented in the code. The first test performed concerns the evaluation of the proximity between the tissues and the sensor. This allows the surgeon to work in a safer environment, with the sensorized instrument giving sound or visual feedback based on the distance between the instrument and the tissue, an aspect that is very important during abdominal surgery, where the various organs are very close to each other. Before getting to the heart of the testing phase, a serial connection was created between the Arduino Nano 33 BLE Sense board and MATLAB to manage the data collected by the proximity sensor directly in the MATLAB environment. We then moved on to the real-time display of the proximity values read by the sensor, which allowed an instant evaluation of the trend of the collected data in relation to the distance in centimeters. Proximity data acquisition was performed with a MELFA RV3-SB industrial robot. For the execution of the tests, a support fixing the Arduino board to the end effector of the robot was designed and 3D-printed. The support was fixed to the robot using two M5 screws, as shown in Figure 7.
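The core of the on-board logic can be summarized as follows. This is a minimal reconstruction based on the thresholds reported above and the public Arduino_APDS9960 API, not the authors' exact code; note that on the Nano 33 BLE Sense, the built-in RGB LED pins (LEDR, LEDG, LEDB) are active-low.

```cpp
// Reads the APDS-9960 proximity value and colors the built-in RGB LED:
// green = far (>150), blue = intermediate (60-150), red = close (<60).
#include <Arduino_APDS9960.h>

void setLed(bool r, bool g, bool b) {
  // LOW turns a channel on, HIGH turns it off (active-low LED)
  digitalWrite(LEDR, r ? LOW : HIGH);
  digitalWrite(LEDG, g ? LOW : HIGH);
  digitalWrite(LEDB, b ? LOW : HIGH);
}

void setup() {
  Serial.begin(9600);
  pinMode(LEDR, OUTPUT);
  pinMode(LEDG, OUTPUT);
  pinMode(LEDB, OUTPUT);
  if (!APDS.begin()) {
    Serial.println("Error initializing APDS-9960 sensor!");
    while (true);
  }
}

void loop() {
  if (APDS.proximityAvailable()) {
    int proximity = APDS.readProximity();  // 0 (close) ... 255 (far)
    Serial.println(proximity);             // streamed over serial (e.g., to MATLAB)
    if (proximity > 150)      setLed(false, true,  false);  // far: green
    else if (proximity > 60)  setLed(false, false, true);   // middle: blue
    else                      setLed(true,  false, false);  // close: red
  }
}
```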
The robot was programmed to perform movements along the z axis, allowing the sensor to move away from and approach the tissue sample. Once the desired initial position (minimum distance from the tissue) was acquired, the robotic arm was moved over 13 cm in vertical steps of 0.5 mm. All the positions necessary for the desired movement were programmed manually in the robot programming language, whose structure is very simple and intuitive; for example, the MOV command was used to move the robot from one position to the next. Slices of tissue with a thickness of about six millimeters were obtained from each organ. Since these are soft and/or spongy organs, a precise and absolute measure of thickness is not possible, as the tissue surfaces have protuberances and depressions. This condition was considered relevant for an optimal and faithful representation of the operational reality; therefore, no corrections were made. The tissue slices were placed on a rigid plate on the workstation below the sensor. As shown in Figure 8, a safety space was left between the sensor and the surface of the tissues to ensure the cleanliness and integrity of the detector. The same environmental lighting conditions were maintained throughout the testing phase. Furthermore, the tests were carried out on the same day the tissues were collected from the slaughterhouse; in this way, the freshness of each sample was guaranteed, preserving its color and consistency to better reflect in vivo conditions.

2.3.2. Classification of Tissues by Color

This subsection presents the implementation of a TensorFlow model to classify tissues based on color detection. The main objective is to distinguish normal tissue from tissue with characteristics that are not typical of physiological tissue, by evaluating the color differences between the two types. For example, it is possible to delineate the area of tissue affected by cancerous manifestations that alter its color compared to that of normal tissue.
To demonstrate this concept, we limited ourselves to verifying that it is possible, based on color, to distinguish two different tissues, i.e., liver and stomach. The procedure for the identification and classification of tissues was based on the TensorFlow Lite Micro library and the colorimetric sensor of the Arduino Nano 33 BLE Sense, which is integrated in the APDS9960 unit. A simple neural network was implemented on the board, combining machine learning with embedded systems to develop an intelligent device. For the first phase, collecting the color data from the two tissue types, the Arduino Nano 33 BLE Sense board was programmed to detect the intensity of the red, green, and blue components of each tissue presented to the sensor. For correct data acquisition, it is advisable to move the sensor around the surface of the tissue to capture the color variations, as shown in Figure 9. The RGB color values were captured as comma-separated data and saved in CSV format. This procedure was repeated for both organs that we decided to classify.
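A minimal sketch of this acquisition step is shown below. The normalization of each channel by the overall intensity is an assumption modeled on Arduino's public color-classification tutorial for this board, not a detail reported in the paper.

```cpp
// Samples the APDS-9960 color sensor and prints normalized R,G,B values
// as CSV lines over serial, ready to be saved to one .csv file per tissue.
#include <Arduino_APDS9960.h>

void setup() {
  Serial.begin(9600);
  while (!Serial);
  if (!APDS.begin()) {
    Serial.println("Error initializing APDS-9960 sensor!");
    while (true);
  }
  Serial.println("Red,Green,Blue");  // CSV header
}

void loop() {
  int r, g, b;
  if (APDS.colorAvailable()) {
    APDS.readColor(r, g, b);
    int sum = r + g + b;
    if (sum > 0) {                   // skip dark/empty readings
      Serial.print((float)r / sum, 3); Serial.print(',');
      Serial.print((float)g / sum, 3); Serial.print(',');
      Serial.println((float)b / sum, 3);
    }
  }
}
```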
Following the data acquisition phase came the model training phase, the process by which a model learns to produce the correct output for a given set of inputs. It involves feeding training data through the model and making small adjustments until the predictions are as accurate as possible. A model is a network of simulated neurons, represented by arrays of numbers arranged in layers; as data are fed into the network, they are transformed by successive mathematical operations involving weights and biases at each layer, and the output of the model is the result of passing the input through these operations. Training stops when the model's performance stops improving. The training phase of the machine learning model was carried out in Google Colaboratory, an interactive environment providing a notebook for writing and executing Python code, using the data collected in the previous phase. Before TensorFlow Lite Micro could run the trained model, the model had to be converted to the TensorFlow Lite format, generating a model.h file to be downloaded and included in the final Arduino code that classifies tissues based on color. Finally, after loading this code onto the Arduino board, it was possible, by bringing the RGB sensor close to an object of one of the classes on which the model was trained, to view the percentages for the two classes implemented in the model.
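For the on-board inference step, a condensed sketch in the style of the classic TensorFlow Lite Micro Arduino examples is shown below. The g_model array (from the generated model.h), the 8 KB tensor arena, the three-ratio input layout, and the class names are all assumptions, not details reported in the paper.

```cpp
// Runs the converted two-class tissue classifier on the board and prints
// the class probabilities for each color reading.
#include <TensorFlowLite.h>
#include "tensorflow/lite/micro/all_ops_resolver.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/schema/schema_generated.h"
#include <Arduino_APDS9960.h>
#include "model.h"  // byte array produced by the TensorFlow Lite converter

namespace {
tflite::MicroErrorReporter error_reporter;
tflite::AllOpsResolver resolver;
tflite::MicroInterpreter* interpreter = nullptr;
constexpr int kArenaSize = 8 * 1024;       // assumption; tune to the model
alignas(16) uint8_t tensor_arena[kArenaSize];
const char* kClasses[] = {"liver", "stomach"};
}

void setup() {
  Serial.begin(9600);
  while (!Serial);
  APDS.begin();
  const tflite::Model* model = tflite::GetModel(g_model);  // from model.h
  static tflite::MicroInterpreter static_interpreter(
      model, resolver, tensor_arena, kArenaSize, &error_reporter);
  interpreter = &static_interpreter;
  interpreter->AllocateTensors();
}

void loop() {
  int r, g, b;
  if (!APDS.colorAvailable()) return;
  APDS.readColor(r, g, b);
  float sum = r + g + b;
  if (sum <= 0) return;
  // Same normalization used during data collection
  TfLiteTensor* input = interpreter->input(0);
  input->data.f[0] = r / sum;
  input->data.f[1] = g / sum;
  input->data.f[2] = b / sum;
  if (interpreter->Invoke() != kTfLiteOk) return;
  TfLiteTensor* output = interpreter->output(0);
  for (int i = 0; i < 2; i++) {
    Serial.print(kClasses[i]); Serial.print(": ");
    Serial.println(output->data.f[i], 3);  // class probability
  }
}
```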

2.3.3. Validation by dVRK

The proposed prototype has been tested on the dVRK system. The experimental protocol followed for the tasks performed with the aid of the dVRK system, and the various phases, tools, and objects used are presented below. As shown in Figure 10, an MTM from the dVRK console was used to teleoperate the instrument mounted on the PSM. The activities were carried out by three users with the support of the visual feedback of the dVRK endoscope, designed with two separate optical channels capable of recreating the most important aspect of stereopsis: binocular disparity. Inside the viewer, therefore, users followed the operations thanks to the stereoscopic vision that provided three-dimensional images of the work area [16].
The Arduino board with the integrated proximity sensor was positioned, using the 3D-printed support, on the EndoWrist instrument, two centimeters from the tip of the tool, as shown in Figure 11. Positioning the board at this distance from the tip is consistent with the results obtained during the sensor calibration phase: from the characterization curve, the operating range of the sensor for the liver begins at two centimeters, the distance at which the sensor returned its first non-null value.
Users were asked to perform a simple transfer task: a suitably sized ball was picked up from the worktable with the ProGrasp tool supplied with the dVRK and brought within eight centimeters of the object simulating the liver. The sequence of images below shows these steps of the task (Figure 12). Finally, the user placed the ball on the object by approaching it slowly (Figure 13). During the execution of the operations, the user estimated the distance of the instrument from the object based on what could be perceived through the viewer. Throughout the execution of the task, the proximity values read by the sensor were recorded and compared with the values visually estimated by the users.

3. Results

This section reports the results of the proximity measurements on the various tissue samples, obtained through the experimental steps described in the previous section. Due to the lack of synchronization between the start signal of the robotic arm and the Arduino board containing the proximity sensor, it was necessary to manually determine the first useful point at which the sensor returns a non-zero proximity value. To do this, the tissue sample was positioned on the plate as described in Section 2.3.1, and the robotic arm was moved manually, acting on the coordinates of the individual joints, to detect the first significant reading. As shown in Figure 14, the sensor returned a series of null proximity values when it was too close to the sample. Starting from the fixed initial position P1 (the position of the robotic arm closest to the sample), the sensor, mounted on the end-effector of the robot, was slowly moved away from the tissue sample in steps of 0.5 mm. In this way, it was possible to identify, on the real-time graph, the first non-zero proximity value detected and the corresponding distance of the sensor from the sample; the real distance was known to the user who programmed the robotic arm. By repeating this experiment for all three types of tissue, it was possible to precisely define the first useful proximity value for each sample. For the liver, the first useful proximity value was detected at two centimeters from the sample, while for the stomach and intestine, the first non-zero proximity value appeared at 3.5 cm.
The characteristic curves for each tissue are shown in Figure 14. In this way, it was possible to relate the real distance, given by the movement of the robotic arm, to the proximity values returned by the sensor. As previously mentioned, the proximity measurement relies on detecting the IR energy reflected by the tissue onto the photodiode. Consequently, since this measurement is influenced by the reflectivity of the sample surface, it is reasonable to expect tissues of different colors to yield different characterization curves. This is clearly shown in Figure 14d, where the stomach and intestine, with very similar colors, returned their first non-zero value at the same distance, while the liver characterization curve follows a different trend. It should also be noted that, for all three types of tissue, the proximity readings became noisier as the robotic arm moved away from the sample.
To assess the repeatability of the experiment, four different tests were carried out for each of the three tissues examined. For each organ, as shown in Figure 15, Figure 16 and Figure 17, the sensor characterization curves were practically identical across the four tests. This allows us to consider the curves obtained as representative of the tissue to which they refer, for the sensor used, and suggests that the approach can be extended to other tissue samples.
In the following analysis, the stomach curve is taken as a reference, but similar results can easily be extended to the other two tissues. Analyzing the curve reveals an initial flat area, corresponding to positions where the sample was too close to the sensor to return proximity values, followed by a linear area highlighted by the green rectangle in Figure 18.
This linear part is well approximated by a first-degree polynomial, as verified with MATLAB's Curve Fitting Toolbox (Figure 19). The linear polynomial model that best approximates this data distribution is:
f(x) = p1 × x + p2
where x was normalized to zero mean and unit variance (mean 7.667, standard deviation 4.378). The coefficients p1 = 16.88 and p2 = 27.9 were estimated by the model with 95% confidence bounds.
In the same way, it is possible to analyze the next portion of the stomach curve, highlighted by the yellow rectangle in Figure 18. In this second case, the data are well approximated by a third-degree polynomial of the form:
f(x) = p1 × x³ + p2 × x² + p3 × x + p4
where x was normalized to zero mean and unit variance (mean 25.17, standard deviation 14.48). The coefficients p1 = 1.116, p2 = −3.359, p3 = 6.919, and p4 = 82.15 were estimated by the model with 95% confidence bounds (Figure 20).
For both highlighted portions of the graph, the goodness of fit was evaluated with appropriate parameters, such as R-squared, which quantifies the predictive power of a regression model by measuring the difference between the values observed in the sample and the values estimated by the model. In the case examined, the discrepancies between expected and observed values are small, indicating that the model fits the data well: the R-squared values for the first and second fits are R² = 0.994 and R² = 0.997, respectively.
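For reference, R-squared compares the residuals of the fit with the total variance of the data (standard definition, stated here for completeness):
R² = 1 − Σᵢ (yᵢ − ŷᵢ)² / Σᵢ (yᵢ − ȳ)²
where yᵢ are the observed proximity values, ŷᵢ are the values predicted by the fitted polynomial, and ȳ is the mean of the observed values; a value close to 1 indicates that the model explains almost all of the variability in the data.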

4. Discussion

This work aims to design and develop a sensing device that can be integrated with the EndoWrist instruments of surgical robots, in order to provide the surgeon with colorimetric feedback and with information on the distance between the tip of the instrument and the organs. This is the first version of our sensing device, and the results are therefore still preliminary. Nevertheless, they allow us to conclude that color can be used to distinguish two different types of tissue, with the final goal of diagnosing tumors in tissue portions hidden from the human eye. At the same time, proximity feedback lets the surgeon work with greater control and safety. Furthermore, thanks to the integration of BLE communication, the various robotic arms can be connected to exchange information during surgical procedures. These capabilities will open surgical robotics to the world of the IoT, and will enable the future development of AI algorithms for automatic or semi-automatic procedures.

Author Contributions

Conceptualization, G.T.; methodology, M.R., R.L., G.P. and G.T.; software, M.R.; validation, M.R.; data analysis, M.R.; investigation, M.R., R.L., G.P. and G.T.; resources, M.R., R.L., G.P. and G.T.; data curation, M.R., R.L., G.P. and G.T.; writing—original draft preparation, M.R.; writing—review and editing, R.L., G.P. and G.T.; visualization, M.R.; supervision, G.T.; project administration, G.T.; funding acquisition, G.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lane, T. A short history of robotic surgery. Ann. R. Coll. Surg. Engl. 2018, 100, 5–7. [Google Scholar] [CrossRef] [PubMed]
  2. Morrell, A.L.G.; Morrell-Junior, A.C.; Morrell, A.G.; Mendes, J.; Freitas, M.; Tustumi, F.; Morrell, A. The history of robotic surgery and its evolution: When illusion becomes reality. Rev. do Col. Bras. de Cir. 2021, 48, 1–9. [Google Scholar] [CrossRef] [PubMed]
  3. Sheetz, K.H.; Claflin, J.; Dimick, J.B. Trends in the adoption of robotic surgery for common surgical procedures. JAMA Netw. Open 2020, 3, e1918911. [Google Scholar] [CrossRef] [PubMed]
  4. Tan, A.; Ashrafian, H.; Scott, A.J.; Mason, S.E.; Harling, L.; Athanasiou, T.; Darzi, A. Robotic surgery: Disruptive innovation or unfulfilled promise? A systematic review and meta-analysis of the first 30 years. Surg. Endosc. 2016, 30, 4330–4352. [Google Scholar] [CrossRef]
  5. Yu, H.Y.; Hevelone, N.D.; Lipsitz, S.R.; Kowalczyk, K.J.; Hu, J.C. Use, costs and comparative effectiveness of robotic assisted, laparoscopic and open urological surgery. J. Urol. 2012, 187, 1392–1399. [Google Scholar] [CrossRef] [PubMed]
  6. Bergeles, C.; Yang, G.Z. From passive tool holders to microsurgeons: Safer, smaller, smarter surgical robots. IEEE Trans. Biomed. Eng. 2013, 61, 1565–1576. [Google Scholar] [CrossRef] [PubMed]
  7. Koh, F.H.; Tan, K.K.; Lieske, B.; Tsang, M.L.; Tsang, C.B.; Koh, D.C. Endowrist versus wrist: A case-controlled study comparing robotic versus hand-assisted laparoscopic surgery for rectal cancer. Surg. Laparosc. Endosc. Percutaneous Tech. 2014, 24, 452–456. [Google Scholar] [CrossRef] [PubMed]
  8. Liu, H.; Selvaggio, M.; Ferrentino, P.; Moccia, R.; Pirozzi, S.; Bracale, U.; Ficuciello, F. The MUSHA hand II: A multifunctional hand for robot-assisted laparoscopic surgery. IEEE/ASME Trans. Mechatron. 2020, 26, 393–404. [Google Scholar]
  9. Puangmali, P.; Liu, H.; Seneviratne, L.D.; Dasgupta, P.; Althoefer, K. Miniature 3-axis distal force sensor for minimally invasive surgical palpation. IEEE/ASME Trans. Mechatron. 2011, 17, 646–656. [Google Scholar] [CrossRef]
  10. Jeong, S.H.; Seo, K.W.; Min, J.S. Intraoperative tumor localization of early gastric cancers. J. Gastric Cancer 2021, 21, 4. [Google Scholar] [CrossRef]
  11. Bințințan, V.; Calborean, A.; Mocan, M.; Macavei, S.; Cordoș, A.; Ciuce, C.; Bințințan, A.; Chira, R.; Nagy, G.; Surlin, V.; et al. New inductive proximity sensor platform for precise localization of small colorectal tumors. Mater. Sci. Eng. C 2020, 106, 110146. [Google Scholar] [CrossRef] [PubMed]
  12. D’Ettorre, C.; Mariani, A.; Stilli, A.; Valdastri, P.; Deguet, A.; Kazanzides, P.; Taylor, R.H.; Fischer, G.S.; DiMaio, S.P.; Menciassi, A.; et al. Accelerating surgical robotics research: Reviewing 10 years of research with the dvrk. arXiv 2021, arXiv:2104.09869. [Google Scholar]
  13. Tognarelli, S.; Salerno, M.; Tortora, G.; Quaglia, C.; Dario, P.; Schurr, M.O.; Menciassi, A. A miniaturized robotic platform for natural orifice transluminal endoscopic surgery: In vivo validation. Surg. Endosc. 2015, 29, 3477–3484. [Google Scholar] [CrossRef] [PubMed]
  14. Tortora, G.; Dario, P.; Menciassi, A. Array of robots augmenting the kinematics of endocavitary surgery. IEEE/ASME Trans. Mechatron. 2014, 19, 1821–1829. [Google Scholar] [CrossRef]
  15. Nikodem, M.; Slabicki, M.; Bawiec, M. Efficient communication scheme for Bluetooth low energy in large scale applications. Sensors 2020, 20, 6371. [Google Scholar] [CrossRef] [PubMed]
  16. Palep, J.H. Robotic assisted minimally invasive surgery. J. Minimal Access Surg. 2009, 5, 1. [Google Scholar] [CrossRef] [PubMed]
Figure 1. The developed sensing device mounted on the PSM of the dVRK system.
Figure 2. The CAD design of the first version of the mechanical support. (a) The housing for the electronic board. (b) The connection part for the positioning of the device along the EndoWrist tools.
Figure 3. The CAD design of the second version of the mechanical support. (a) The housing for the electronic board. (b) The connection part for the positioning of the device along the EndoWrist tools.
Figure 4. BLE communication between two Arduino Nano 33 BLE Sense units.
Figure 5. Complete prototype of the first version of the detection device connected to the EndoWrist tool of the dVRK system.
Figure 6. Complete prototype of the second version of the detection device connected to the EndoWrist tool of the dVRK system.
Figure 7. (a) Support for the Arduino board fixed to the end effector of the MELFA RV3-SB industrial robot by means of two M5 screws. (b) View of the support with an integrated board made with a 3D printer.
Figure 8. Tissue samples from the pig, placed on the plate below the sensor for the testing phase with the robot MELFA RV3-SB. (a) Liver. (b) Gut. (c) Stomach.
Figure 9. Frames concerning the acquisition of color data from the stomach and liver to detect the intensity of the red, green, and blue colors of each tissue subjected to the sensor.
Figure 10. Functionality demonstration. MTM from the dVRK console was used to teleoperate the instrument mounted on the PSM.
Figure 11. Positioning of the sensing device prototype on the PSM tool of the dVRK system.
Figure 12. Three frames concerning the phases of the performed transfer task, taking the ball from the workspace and positioning it about 8 cm from the tissue.
Figure 13. Positioning of the ball on the object placed on the work surface.
Figure 14. Characteristic curves relating to the proximity of the three tissues analyzed. (a) Stomach. (b) Gut. (c) Liver. (d) Superposition of the curves of the intestine and stomach.
Figure 15. The proximity values recorded for the liver during the movement of the robot in four tests.
Figure 16. The proximity values recorded for the stomach during the movement of the robot in four tests.
Figure 17. The proximity values recorded for the gut during the movement of the robot in four tests.
Figure 18. Characteristic curve relative to the stomach, with highlighted areas.
Figure 19. Fitting of the values of the portion of the graph highlighted by the green rectangle with the polynomial model.
Figure 20. Fitting of the values of the portion of the graph highlighted by the yellow rectangle with the polynomial model.
