Article

Tele–Robotic Platform for Dexterous Optical Single-Cell Manipulation

Edison Gerena, Florent Legendre, Akshay Molawade, Youen Vitry, Stéphane Régnier and Sinan Haliyo
1 Institut des Systèmes Intelligents et de Robotique (ISIR), Sorbonne Université, CNRS, F-75005 Paris, France
2 TIPS Laboratory, CP 165/67, Université libre de Bruxelles, 50 Avenue F. Roosevelt, B-1050 Brussels, Belgium
* Author to whom correspondence should be addressed.
Current address: Indian Institute of Technology Madras (IIT Madras), Chennai 600036, India.
Micromachines 2019, 10(10), 677; https://doi.org/10.3390/mi10100677
Submission received: 30 August 2019 / Revised: 25 September 2019 / Accepted: 2 October 2019 / Published: 8 October 2019
(This article belongs to the Special Issue Robotic Micromanipulation)

Abstract

Single-cell manipulation is considered a key technology in biomedical research. However, the lack of intuitive and effective systems makes this technology less accessible. We propose a new tele–robotic solution for dexterous cell manipulation through optical tweezers. The slave device combines robot-assisted stages with a high-speed multi-trap technique. It allows for the manipulation of more than 15 optical traps in a large workspace with nanometric resolution. A master device (6+1 degrees of freedom (DoF)) is employed to control the 3D position of optical traps in different arrangements for specific purposes. Precision and efficiency studies are carried out with trajectory control tasks. Three state-of-the-art experiments were performed to verify the efficiency of the proposed platform. First, the reliable 3D rotation of a cell is demonstrated. Secondly, a six-DoF teleoperated optical robot is used to transport a cluster of cells. Finally, a single cell is dexterously manipulated through an optical robot with a fork end-effector. The results illustrate the capability to perform complex tasks in efficient and intuitive ways, opening possibilities for new biomedical applications.

1. Introduction

Dexterous manipulation of single cells offers many possible applications in cellular surgery, mechanobiology, tissue engineering, and biophysics. Recent breakthroughs in biotechnology are raising the demand for complex single-cell operation techniques such as cell isolation, 3D orientation, and cell injection. Nowadays, these tasks are usually performed using simple three-axis Cartesian robots consisting of motor-driven micromanipulators with prismatic joints, equipped with micro-pipettes or micro-grippers as end-effectors. The operator directly controls a single actuator through buttons or knobs, ignoring the overall kinematics of the robot. Basic tasks, like the rotation of a cell, prove to be quite time-consuming and challenging due to the lack of dexterity of these micromanipulators and their control interfaces. Consequently, these devices have a steep learning curve.
Significant challenges remain for applications related to single-cell manipulation, mainly due to the physics involved (volumetric forces are dominated by surface forces) and the size limitations imposed by the environment. The resolution and precision required at these scales have a cost in terms of degrees of freedom, workspace, grasping strategies, and control schemes. Furthermore, there is an increasing demand to manipulate objects in confined environments such as micro-fluidic devices, in order to decrease flow disturbances, contamination, or evaporation of the culture medium, rendering external actuators unusable. All these constraints call for replacing current techniques with non-contact manipulation methods.
Accordingly, a great effort has been made in the search for solutions for the actuation of mobile microrobots (i.e., untethered robots whose entire body is micrometer-sized) to serve as remote manipulators. A variety of methods have been developed employing chemical reactions, physical fields, or bio-hybrid approaches [1,2]. Remote actuation using external energy fields, such as magnetic, acoustic, or optical fields, has emerged as a very promising solution in applications where high spatial maneuverability and precision are required [3]. Among them, optical trapping [4] (awarded the 2018 Nobel Prize in Physics) offers several advantages for the manipulation of small biological samples [5].
Optical manipulation exploits the radiation pressure of light to noninvasively trap and position suspended micro-objects and cells with nanometer resolution, resulting in a contamination-free, contact-free, and label-free method for manipulating cells in their original culture medium. Its compatibility with other optical techniques, especially microscopy, makes it highly appropriate for lab-on-chip systems and micro-fluidic devices. In addition, it is possible to simultaneously trap several objects using a single laser source, either with active diffractive optical elements (holographic optical tweezers) [6] or with rapid laser deflection (time-shared methods) [7]. Furthermore, optical robots (i.e., 3D-printed micro-structures actuated using laser trapping techniques) can be used to indirectly handle cells in applications where their viability is an important issue. Physical and chemical treatments of optical robots also allow functionalization for more specific tasks such as pH sensing [8], temperature sensing [9], cell puncturing [10], or syringe functions [11].
Although a few recent examples of automated direct cell rotation [12] or automated cell transportation through a microtool [13] have been proposed, automation at micro-scales remains a very challenging issue in most cases, where working conditions are uncertain and samples unstructured. Automating a specific task is a time-consuming operation, often beyond the skills of the end-user. As optical robots can be directly 3D printed according to the needs of a given experiment, each structure has its own characteristics, and the system must be able to adapt to this new tool. Moreover, most tasks require some human know-how, as the operator can determine the optimal method or protocol depending on the use case.
Similar issues in macro-scale robotic manipulation have been addressed using teleoperated schemes, where a master device is handled by the operator to control a slave robot. This approach integrates human intelligence into the robotic control loop, allowing the user's expertise and ability to adapt the manipulation protocol to environmental disturbances and the peculiarities of the task. A very well-known example is the Da Vinci System (Intuitive Surgical, Inc.) for robotic surgery, which is designed to facilitate complex minimally invasive surgery and is controlled by a surgeon from a console. Tele-robotics has also been successfully implemented in a large range of applications where the operator is unable to interact directly with the environment, such as underwater vehicles or nuclear robots.
Existing commercial interfaces for optical manipulation allow the user to control the optical traps using the 2D position of a mouse. To enhance user control, some attempts to incorporate more efficient master devices have been made, such as joysticks [14], gesture recognition (e.g., 2D cameras, Kinect, Leap Motion) [15,16,17], multi-touch tablets [18,19], and 3D robotized interfaces (or haptic interfaces) [20,21,22]. Multi-modal approaches have also been presented, combining gesture recognition, eye-gaze tracking, or speech recognition [23,24]. Despite the improvement brought by these attempts in terms of ergonomics and efficiency, completing complex real-world tasks still remains a challenge. Mice, joysticks, and tablets do not provide a three-dimensional workspace. Tracking sensors suffer from low temporal and spatial resolution. 3D robotized interfaces have so far only been used to manipulate one particle at a time, limiting them in scenarios where the number of objects exceeds the abilities of one operator. In addition, the multi-trap actuation techniques used on most of these platforms are based on spatial light modulators (SLM), which suffer from high latency resulting from the reactivity of their hardware and from a high computational cost for a given trajectory.
Robust 3D real-time micro-manipulation requires high spatial and temporal resolution, as the end-effector interacts with a micro-world governed by highly dynamic effects. Workspace, degrees of freedom (DoF), immersion, and flexibility are also essential characteristics of an efficient micro-manipulator. Based on all these observations, we propose a teleoperated optical-micromanipulation platform for direct and indirect single-cell dexterous manipulation. The content of this paper was partially presented in a previous conference paper [25]. This article presents a more detailed methodology and additional studies on the precision and efficiency of the proposed platform. Furthermore, two new results are reported: the direct and the indirect manipulation of a single erythrocyte, a suspended red blood cell.
The system is based on optical actuation, allowing non-contact manipulation of biological samples or micro-machines. The workspace is optimized by combining a 3D multi-trap time-shared method, a 3D nano-stage, and a 2D micro-stage. To solve the latency issues, the implemented 3D multi-trap technique is based only on high-bandwidth steering mirrors [7]. Teleoperation control is implemented with a master device (Omega.7) providing 6+1 DoF and is ensured by a hard real-time system. Traps can be grouped and controlled in a variety of ways for specific purposes, enlarging the DoF of the slave device. The performance of the system in terms of static and dynamic precision is evaluated using trajectory control tasks. Three state-of-the-art experiments have been performed to verify its efficiency. First, the reliable 3D rotation of a cell has been demonstrated. Secondly, the transport of a cluster of cells has been performed with an optical robot. Finally, a single cell has been dexterously manipulated using an optical robot with a fork end-effector. These results illustrate the kind of complex biomedical applications that can be effectively and intuitively accomplished with the proposed platform.

2. Materials and Methods

2.1. Teleoperation System Design

The platform is composed of three main parts: the optical set-up (laser and passive optical components forming the light path), the slave robot (robot-assisted stages and active optical components for 3D multi-trap actuation), and the master device (a seven-DoF robot manipulator). Figure 1 shows a schematic representation of the platform. The system can be used to manipulate biological samples directly or indirectly (i.e., through trapped inert objects).

2.1.1. Optical System

The system is constructed around a custom-made inverted microscope. The microscope objective (oil immersion, Olympus UPlanFLN 40x, NA 1.3) is used both to visualize the sample and to generate the optical traps. A near-infrared laser (1070 nm) was chosen as the source to minimize biological damage. The beam is expanded in order to overfill the objective entrance by 20%, thus improving the trapping efficiency [26]. The illumination (LED, 3 W) is reflected by a longpass dichroic mirror into a high-speed CMOS camera (Basler, Ahrensburg, Germany, 659 × 494 px) to provide visual feedback and environmental information to the operator.

2.1.2. Slave Robot: Robot-Assisted Stages and High-Speed 3D Multi-Trap Actuation

Two different types of actuation coexist to control the motion of the optical traps. The first is composed of a 3D nano-stage mounted on a 2D micro-stage. These stages move the sample chamber while the trapped objects remain fixed. The micro-stage allows for a large workspace (25 × 25 mm²), while the nano-stage gives finer control within a 200 × 200 × 200 μm³ range.
The second actuation uses high-speed laser deflection generated by a galvanometer (GVS002, Thorlabs, Newton, NJ, USA) and a deformable mirror (PTT111 DM, Iris AO, Berkeley, CA, USA). The 3D motion of the focal spot is obtained by synchronizing the orientation of the galvanometer mirror with the focusing or defocusing of the deformable mirror. Multiple traps are then created by sequentially moving the focal spot between different positions. This time-sharing method is made possible by the short response times of the galvanometer and the deformable mirror, and by a hard real-time control framework implemented in C++ on a real-time kernel (Xenomai). This design allows the creation of numerous independent optical traps within a volume of approximately 70 × 50 × 9 μm³ with a bandwidth of up to 200 Hz. The technique is aberration-free, as it uses only mirrors with high reflectance, and is designed to generate equally efficient and stable traps regardless of their position in the workspace. For further details, see [7].
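The cycling principle can be summarized in a few lines. The sketch below is a simplified, non-real-time illustration of the time-sharing scheme, with hypothetical driver functions setGalvo() and setDefocus(); the actual framework runs as a hard real-time Xenomai task in C++, which this sketch does not reproduce.

```cpp
// Minimal sketch of time-shared multi-trap actuation (illustrative only).
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

struct Trap { double x_um, y_um, z_um; };  // desired 3D trap position

// Stub drivers: the real system maps (x, y) to galvanometer mirror angles
// and z to a defocus term on the deformable mirror.
static void setGalvo(double x_um, double y_um) { std::printf("galvo -> (%.2f, %.2f) um\n", x_um, y_um); }
static void setDefocus(double z_um)            { std::printf("DM defocus -> %.2f um\n", z_um); }

// Visit each trap in turn at the laser deflection frequency f (200 Hz in the
// paper, i.e., a 5 ms dwell per trap), so N traps are each refreshed at f / N.
static void timeSharingCycles(const std::vector<Trap>& traps, double f_hz, int cycles)
{
    const auto dwell = std::chrono::duration<double>(1.0 / f_hz);
    for (int c = 0; c < cycles; ++c)
        for (const Trap& t : traps) {
            setGalvo(t.x_um, t.y_um);
            setDefocus(t.z_um);
            std::this_thread::sleep_for(dwell);  // stand-in for the RT scheduler
        }
}

int main()
{
    std::vector<Trap> traps = {{0, 0, 0}, {5, 0, 0}, {0, 5, 0}, {0, 0, 3}};
    timeSharingCycles(traps, 200.0, 2);  // two full cycles over four traps
}
```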

2.1.3. Master–Slave Coupling

The master device is an Omega.7 (Force Dimension), which provides seven DoF: three for translations, three for rotations, and one for a gripper placed under the index finger of the user. The workspace is 160 × 160 × 110 mm³ for translations, 240 × 140 × 180 deg for rotations, and 25 mm for grasping.
The master device translation is appropriately scaled down (by a factor on the order of 10⁴) and sent to the nano-stage. The orientation of the master device is used to compute a rotation matrix, which determines the trap positions sent to the galvanometer and to the deformable mirror. The rotation matrix is calculated from the Tait–Bryan angles with the zyx convention (yaw, pitch, and roll). Finally, the gripper position is interpreted to determine new trap positions depending on the configuration. The gripper has two operating modes. The radial mode moves the selected traps toward or away from the rotation center; this type of radial motion allows the user to grasp objects using trapped beads as 'fingers'. The scissor mode rotates the selected traps toward the Y-axis in order to give a scissor-like movement to a set of traps; it can be used to actuate a tool such as a clamp.
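As an illustration of this coupling, the sketch below builds the zyx (yaw–pitch–roll) rotation matrix from the master orientation, applies it to trap offsets around a group center, and down-scales the master translation. Function names, the group-center convention, and the scale value are illustrative assumptions, not the actual control framework.

```cpp
// Sketch of the master-to-slave pose mapping (illustrative names).
#include <array>
#include <cmath>
#include <cstdio>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<std::array<double, 3>, 3>;

// R = Rz(yaw) * Ry(pitch) * Rx(roll), Tait-Bryan zyx convention.
Mat3 rotationZYX(double yaw, double pitch, double roll)
{
    const double cy = std::cos(yaw),   sy = std::sin(yaw);
    const double cp = std::cos(pitch), sp = std::sin(pitch);
    const double cr = std::cos(roll),  sr = std::sin(roll);
    Mat3 R;
    R[0] = {cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr};
    R[1] = {sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr};
    R[2] = {-sp,     cp * sr,                cp * cr};
    return R;
}

// New trap position = group center + R * (trap offset from the group center),
// expressed in micro-world coordinates.
Vec3 trapPosition(const Mat3& R, const Vec3& center, const Vec3& offset)
{
    Vec3 p = center;
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            p[i] += R[i][j] * offset[j];
    return p;
}

// Master translation mapped to the nano-stage set-point; the down-scaling
// factor is a configuration value (on the order of 1e-4 in the paper).
Vec3 stageCommand(const Vec3& master, double scale)
{
    return {master[0] * scale, master[1] * scale, master[2] * scale};
}

int main()
{
    const Mat3 R = rotationZYX(0.5, 0.1, 0.0);              // yaw/pitch/roll (rad)
    const Vec3 p = trapPosition(R, {10, 10, 0}, {3, 0, 0}); // one trap of a group
    std::printf("trap at (%.2f, %.2f, %.2f) um\n", p[0], p[1], p[2]);
}
```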
The optical traps can be dynamically created via the control interface. Every trap position can be directly edited in micro-world coordinates. The traps can also be organized into groups. For each group, the 3D rotations, the 3D translations, and the gripper functionality can be independently enabled or disabled. The number of traps is virtually unlimited; however, as the stiffness decreases with the number of traps, the number of stable traps in dynamic and static configurations is around 15 and 30, respectively. Figure 2 shows an example of 3D teleoperation using three groups of traps.
Translations and rotations have two control modes (a minimal sketch of both follows the list):
  • Position Control: This first mode mirrors the master device's position and orientation onto the slave robot, with an appropriate scaling factor. This factor can be chosen according to the task's dimensions and the operator's comfort. This method is suitable for executing precise tasks.
  • Velocity Control: This second mode enables control of the slave robot's velocity. The motion's direction and amplitude are computed from the vector between the center of the master device's workspace and the position of the handle. A scaling factor can also be chosen according to the task's requirements. A maximal handling velocity is defined according to the number of traps, in order to assist the user and help retain the trapped objects. The velocity control mode can be enabled independently for translations and rotations, and is suitable for long displacements, such as sample-chamber exploration, or for the continuous rotation of an object, such as a micro-pump [27] or cell rotation for tomographic imaging [28].
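The sketch below illustrates the two modes. The names (positionCommand, velocityCommand) and the parameters (scale, gain, v_max) are assumptions for illustration; in the platform, the scale factors are chosen per task and the velocity cap depends on the number of traps, as described above.

```cpp
// Illustrative sketch of the two teleoperation coupling modes.
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

// Position control: the slave set-point mirrors the (scaled) master position.
Vec3 positionCommand(const Vec3& master, double scale)
{
    return {master[0] * scale, master[1] * scale, master[2] * scale};
}

// Velocity control: direction and amplitude come from the vector between the
// master workspace center and the handle; the speed is capped so that the
// trapped objects are not lost.
Vec3 velocityCommand(const Vec3& handle, const Vec3& center,
                     double gain, double v_max)
{
    Vec3 v{};
    double norm = 0.0;
    for (int i = 0; i < 3; ++i) {
        v[i] = (handle[i] - center[i]) * gain;
        norm += v[i] * v[i];
    }
    norm = std::sqrt(norm);
    if (norm > v_max) {
        const double k = v_max / norm;   // clamp to the maximal handling velocity
        for (double& c : v) c *= k;
    }
    return v;
}
```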

2.2. Evaluation of Teleoperation Performances

To evaluate the system's performance, several experiments with micro-beads were carried out. Polystyrene beads were chosen, as they are the most common tool used in optical-tweezers manipulation systems.
Data from the master device, the set-points of the actuators, and video images at 64 fps from the CMOS camera are recorded during the different tasks. Each data record contains the current system time (Xenomai timer) for synchronization purposes. The measured positions of the trapped beads are extracted in an off-line process using the circle Hough transform algorithm from the OpenCV library. The image has a resolution of 659 × 494 px and covers a surface of 70 × 50 μm; the theoretical resolution of the tracking algorithm is 2 px, corresponding to about 200 nm. The same laser power (400 mW), trap irradiation time (5 ms), and polystyrene micro-beads (3 μm diameter, refractive index ∼1.59) were used for all tasks.
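The off-line tracking step can be sketched as follows. The file name, blur, and Hough parameters below are illustrative assumptions rather than the exact values used; only the approximate pixel-to-micrometer factor follows from the 659 × 494 px image covering about 70 × 50 μm.

```cpp
// Sketch of off-line bead tracking with OpenCV's circle Hough transform.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main()
{
    const double um_per_px = 70.0 / 659.0;          // ~0.106 um per pixel (horizontal)

    cv::VideoCapture video("beads_64fps.avi");      // hypothetical recorded stream
    cv::Mat frame, gray;
    while (video.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::medianBlur(gray, gray, 5);              // reduce pixel noise

        std::vector<cv::Vec3f> circles;             // (x, y, radius) in pixels
        cv::HoughCircles(gray, circles, cv::HOUGH_GRADIENT,
                         1 /*dp*/, 20 /*min dist px*/,
                         100 /*Canny high*/, 20 /*accumulator*/,
                         10 /*min radius px*/, 20 /*max radius px*/);

        for (const auto& c : circles)
            std::printf("bead at (%.2f, %.2f) um, r = %.2f um\n",
                        c[0] * um_per_px, c[1] * um_per_px, c[2] * um_per_px);
    }
    return 0;
}
```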
For translational motions, the nano-stage (P-562.3CD, Physik Instrumente, Karlsruhe, Germany) and its piezo-controller (E-725.3CD, Physik Instrumente, Karlsruhe, Germany) are used in closed loop with a resolution of 1 nm, a 20 kHz sampling rate, and factory calibration. For the sake of simplicity, only rotational motions are considered here.

2.2.1. Static Precision

The static precision depends on the actuators' accuracy, thermal noise, and trap stiffness. The deformable mirror is controlled in open loop based on its calibration and has nanometer and microradian resolution (manufacturer data: wavefront resolution <15 nm rms). The galvanometer is controlled in closed loop with an angular resolution of 15 μrad.
In time-sharing scenarios, when more than one trap is created, the laser switches from trap to trap. In this case, each bead is only held by the laser during a fraction of the cycle, and for the rest of the time the micro-bead is subjected to Brownian fluctuations and other environmental forces [29]. Consequently, the effective stiffness of each trap is diminished by the reduced duty cycle, and the precision of the bead's position is inversely related to the number of traps created.
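A rough, first-order estimate makes this scaling explicit. Assuming the switching rate is fast compared with the trap relaxation time and neglecting drag during the dark intervals (assumptions not quantified in the paper), a single-trap stiffness $k_1$ shared over $N$ traps gives a time-averaged stiffness per trap of approximately

$$ k_{\mathrm{eff}} \approx \frac{k_1}{N}, $$

and, by equipartition, the r.m.s. Brownian excursion of a trapped bead grows as $\sqrt{\langle x^2 \rangle} = \sqrt{k_B T / k_{\mathrm{eff}}} \propto \sqrt{N}$, consistent with the larger static errors measured below for four and thirteen traps.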
Different static tasks were performed and recorded for 15 s. The static position error is computed as the maximum error between the trap's commanded position and the bead's tracked position. For a single trap, no difference between the trap's commanded position and the bead's measured position is detected, as the stiffness of the trap strongly reduces the Brownian fluctuations. We conclude that the position error is below the resolution of the tracking algorithm, i.e., the maximum position error is less than 200 nm. For four and thirteen micro-beads positioned in a square of 40 × 40 μm, the position error is 400 nm and 1 μm, respectively, and is essentially due to Brownian motion, as the effective stiffness of each trap decreases as the number of traps increases. The experimental results for four and thirteen traps are presented in Figure 3.

2.2.2. Trajectory Control

Three different tasks were performed to evaluate the trajectory control. First, four micro-beads were trapped and rotated in position mode. The master device orientation is used to compute the optical trap positions. The scaling coefficient is set to 5 in order to achieve complete rotations of the trapped particles. The measured position of trap T1 and the master trajectory are shown in Figure 4. The mean error between the commanded and real positions is 0.31 μm, with a standard deviation of 0.23 μm, mainly due to Brownian motion.
The second task consists of trapping a micro-bead at different axial positions and rotating it in velocity mode, increasing the speed in steps of 21 μm/s. The handle orientation of the master device is locked to generate speed steps using an adjustable command gain. The positions of the micro-beads are extracted off-line from the video images using a circle tracking algorithm, and the velocity is computed as the discrete derivative of the position. The results shown in Figure 5 confirm that the micro-bead accurately follows the velocity reference from 21 μm/s to 462 μm/s. Finally, the bead is lost at 483 μm/s.
The same experiment performed with a group of four trapped micro-beads in different axial configurations, shown in Figure 6, also demonstrates proper tracking of the references. The maximal reachable velocity without losing the beads is 105 μm/s. Since the beads are lost at the same speed in the different configurations, these experiments validate the proposed velocity control and corroborate that the system produces equally efficient and stable traps, regardless of the traps' 3D positions.

2.2.3. Velocity Limitation

As expected, the escape velocity required to release a trapped bead is higher for one trap than for four traps. The laser is deflected at a constant frequency of 200 Hz from one trap to another, meaning that the position moves by steps, and greater velocities imply larger steps. The velocity thresholds estimated in Section 2.2.2 are 462 μm/s for one trap and 105 μm/s for four traps.
For both experiments, the velocity threshold corresponds to a position step of approximately 2.3 μm. The same result is found with two, three, five, and nine traps. Hence, this step size defines the limit before risking the release of a trapped object.
The following equation predicts the theoretical maximum reachable velocity as a function of the number of optical traps generated and the deflection frequency of the laser:

V_{max} = \frac{D_{max} \, f}{N_{trap}},

where V_{max} is the velocity threshold, D_{max} the position step size, N_{trap} the number of optical traps, and f the deflection frequency of the laser. As a consequence, if the deflection frequency of the laser is set to a higher value, the velocity threshold increases.
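As a quick numerical check against the thresholds measured in Section 2.2.2 (taking D_{max} ≈ 2.3 μm and f = 200 Hz):

V_{max}^{(1)} = \frac{2.3\,\mu\mathrm{m} \times 200\,\mathrm{Hz}}{1} = 460\,\mu\mathrm{m/s}, \qquad V_{max}^{(4)} = \frac{2.3\,\mu\mathrm{m} \times 200\,\mathrm{Hz}}{4} = 115\,\mu\mathrm{m/s},

close to the measured values of 462 μm/s and 105 μm/s, respectively.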
Regarding the translation actuation, since the whole sample chamber is moved, the velocity threshold when only one trap is generated is limited only by the drag force, as the trap is always active. The escape velocity is measured at 1500 μm/s, corresponding to an escape force of 38 pN. When several traps are generated, the velocity threshold is limited by the deflection frequency and the viscous drag forces. During the time period when the trap is not active, viscous forces shift the object out of the trap. If the motion is small enough, the bead is attracted back to the equilibrium position when the trap is activated again. However, when the displacement of the bead during a period is too large (i.e., if the translation velocity is too high), the bead is no longer in the attractive zone when the trap becomes active again. For two, three, four, and five traps, the measured velocity thresholds are 300 μm/s, 220 μm/s, 110 μm/s, and 80 μm/s, respectively. Note that escape forces are defined by the optical properties at the very edges of the trap, where the restoring force is no longer a linear function of the displacement [30]. Further investigations on the effective stiffness of traps combining time-sharing actuation and stage-based actuation are required for higher precision in force measurements. Table 1 summarizes the principal parameters of the system.
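The reported escape force is consistent with a simple Stokes-drag estimate, assuming water at room temperature (dynamic viscosity η ≈ 8.9 × 10⁻⁴ Pa·s, an assumption not stated in the paper) and a bead radius r = 1.5 μm:

F = 6 \pi \eta r v = 6\pi \times (8.9 \times 10^{-4}\,\mathrm{Pa\,s}) \times (1.5\,\mu\mathrm{m}) \times (1500\,\mu\mathrm{m/s}) \approx 38\,\mathrm{pN}.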
To illustrate the kind of biological applications that can be accomplished with the proposed platform in a real-world scenario, direct and indirect manipulations of mouse erythrocytes, i.e., suspended red blood cells (RBCs), are presented in the next section.

3. Results and Discussion

3.1. Direct Manipulation: 3D Rotation of a Cell

Presently, the 3D orientation of biological samples is gaining much attention due to its role in various single-cell surgeries and cell imaging techniques. Accordingly, significant efforts have been made toward achieving 3D cell orientation control using holographic optical tweezers (HOT) [12,31,32,33]. Although these SLM-based techniques represent elegant solutions for 3D rotations, the intrinsically slow response of liquid crystals and the complexity of the trajectory computation induce significant delays, making their real-time implementation difficult even when reduced to only two traps. One of the main advantages of time-sharing techniques is that the control of the traps is straightforward, requiring only the 3D transformation that produces the desired movement and its direct conversion into actuator coordinates.
The first experiment is dedicated to the 3D orientation of a suspended erythrocyte. A group of eight traps arranged along the perimeter of the cell is created through the user interface. Initially, the erythrocyte is sedimented on a coverslip with a face-on orientation to the optical axis. The eight traps enable the rotation of the cell with respect to the x, y, and z axes, as well as simultaneous translation along all three axes. Thanks to the high bandwidth and efficiency of the proposed system, the user can control the 3D motion of the cell without any noticeable latency.
Figure 7 and Supplementary Video S1 show the six-DoF control of an individual cell. Every DoF can be controlled independently or coupled to another. The rotation scaling factor is set to three in order to achieve a full 360° rotation. The maximum velocity before observing minor overshooting is 188.6 deg/s, or 9.92 μm/s for a cell of 6 μm diameter. The difference between the theoretical maximum velocity for eight micro-beads (57.5 μm/s) and the velocity measured for RBCs may be due to the cell's lower refractive index (RI ∼1.38), the higher viscous drag torque due to its larger surface, and the differences in shape. Factors such as symmetry, size, and non-homogeneity of the manipulated objects will impact the performance of direct manipulation.
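For reference, the linear speed quoted above follows from the angular one, assuming it is evaluated at the cell rim (a 3 μm lever arm for a 6 μm cell):

v = \omega r = \left(188.6\,\tfrac{\mathrm{deg}}{\mathrm{s}} \times \tfrac{\pi}{180\,\mathrm{deg}}\right) \times 3\,\mu\mathrm{m} \approx 9.9\,\mu\mathrm{m/s}.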
Rotation can also be controlled in velocity mode, in order to allow constant-speed rotation of a cell (see Supplementary Video S2). True 360° rotation of a single RBC around all axes is successfully demonstrated. The achievable orientation range depends on the maximum distance separating the traps from the center of rotation and will be limited by the workspace for bigger cells.

3.2. Indirect Manipulation: Six-DoF Teleoperated Optical Robot for Cell Manipulation

Direct optical manipulation is the simplest and most widely used manipulation technique; however, several studies have shown that direct laser exposure can cause considerable photo-damage [34,35]. Furthermore, it is difficult to reliably hold different kinds of biological samples, as the stability of the traps depends on the shape, material, and refractive index of the target. Therefore, indirect manipulation through bead formations [36] or more complex micro-tools [37] has been proposed.
This section demonstrates the capability of the proposed platform for the indirect manipulation of cells through optical robots. This type of manipulation is a good illustration of the capabilities and versatility of the platform, as indirect manipulation through beads is more complex and time-consuming than direct manipulation [38]. Furthermore, controlling the 3D motion of optical robots, where the optical handles are rigidly linked, is more demanding than controlling separate, individual beads. Relative deviations in the position of each trap and synchronization problems in the movements of the groups will affect the stability of the whole structure.

3.2.1. Fabrication and Collection of the Robots

Two optical robots, with three and four spherical handles respectively, were designed. These micro-structures are manufactured by two-photon polymerization (Nanoscribe) using IP-Dip resin (refractive index ∼1.52), following the dimensional specifications shown in Figure 8b and Figure 9b. The first robot has a shovel-shaped end-effector, in order to tow and move several cells at the same time. The second robot has a fork end-effector, in order to dexterously manipulate a single cell. Spacers are attached on both sides of each robot to minimize adhesion forces (see Figure 8a and Figure 9a).
After fabrication, the robots are incubated in a solution of 94.5% distilled water, 5% ethanol, and 0.5% Tween 20 to prevent surface adhesion. For the experiments, the micro-robots are transferred to a sample chamber containing suspended erythrocytes through an actuated microliter syringe (Hamilton, 250 μL). The sample chamber is then sealed with a cover-slip, forming a confined environment.

3.2.2. Teleoperated Optical Robot for Cells Transportation

The second experiment consists of the indirect transportation of a cluster of cells. A group of three traps in a triangle formation is generated using the user interface. Teleoperation is then performed through the master device, allowing six-DoF motions.
Thanks to the low response time and transparency of the system, the operator manages to open a path through a sample heavily loaded with suspended RBCs. Figure 8c shows the trajectory of the teleoperated robot.
Initially, the robot moves along the bottom of the microscope slide (T: 0–19 s). Then, the robot is lifted in the Z-direction (10 μm), as obstacles completely block the path (T: 20 s). Once the target cells are identified, the axial position of the robot is lowered until it reaches the bottom of the slide again (T: 30 s). The robot then moves the target cells indirectly (T: 35–45 s). The user dexterously moves the robot through the sample for more than 100 μm to finally displace the cluster of cells by 80 μm. These results can be observed in Supplementary Video S3.

3.2.3. Teleoperated Optical Robot for Dexterous Single-Cell Manipulation

This experiment is dedicated to the dexterous manipulation of a single cell. A robot with a fork end-effector is designed according to the RBC shape (∼6 μm in diameter) in order to allow full mobility of the manipulated cell. The clamp edges touching the cell are jagged to limit the contact surface. A group of four traps is used to induce the 3D motions.
In the experiments, the cell is seized and manipulated without major difficulties. The experimental results are shown in Figure 9c. In the first task, the optical robot turns around the target, a single RBC located at point A, during an exploration phase (T: 0–20 s). It then seizes the cell and transports it to point B, about 100 μm away, passing between two obstacles (T: 20–39 s). Finally, it deposits the cell at point B by moving backwards (T: 39–42 s). The second task consists of manipulating a single cell while avoiding a large obstacle. Please see Supplementary Video S4 for the whole experiment.
During all manipulations, the rotations of the robot were controlled in position mode. Translations, on the other hand, were executed with a mixed position and velocity control according to the situation: velocity control was preferred for displacements and cell transportation, while position control was chosen for contact, loading, and delivery of cells. The maximal velocity for a four-handle optical robot is measured as 60 μm/s for rotations and 100 μm/s for translations. Design differences between optical robots (e.g., number of optical handles, size and shape of the end-effector) will impact the performance of the manipulation. Further investigations into the optimization of the shape and size of optical robots, e.g., using simulations, are needed to take full advantage of the potential of optical robots in indirect manipulation tasks.
Indirect 2D rotation of the cell can be performed around any center point. To achieve complete indirect 3D rotation of a cell, it is necessary to firmly grasp the cell and move it along the Z-direction without losing it. This could be accomplished, e.g., by using two separate robots to handle the cell, which will be explored in the near future.

4. Conclusions

We proposed and experimentally demonstrated a platform for dexterous cell handling through optical manipulation. As a result of an efficient architecture, the manipulation of more than 15 optical traps in a workspace of (70 × 50 × 9) μm³ for rotations and (200 × 200 × 200) μm³ for translations, both with nanometric resolution, is presented. Complex arrangements of optical traps can be grouped and transformed in a variety of ways to achieve diverse tasks.
The characterization of the system shows a static precision better than 0.4 μm and 1 μm for the manipulation of four and thirteen simultaneously trapped micro-beads, respectively. The evaluation of the stability and efficiency in trajectory tasks shows the suitability of the system for teleoperated control. Concerning dynamic motion, a threshold velocity has been defined according to the number of traps, in order to assist the user and ensure the retention of the trapped micro-beads. The system provides a straightforward human/machine interaction through a tele-robotic solution, allowing dexterous manipulation of synthetic and biological objects in an efficient and intuitive way.
Thanks to its capabilities, and owing to the stable and simple set-up design, relevant tasks have been demonstrated in a real biological environment with red blood cells. Besides multi-DoF 3D cell rotation, indirect optical manipulation with 3D-printed micro-tools was also demonstrated. These experiments illustrate the kind of tasks in which the presented platform could be employed. Possible biological applications that can benefit from the proposed platform include cell sorting, isolation, rotation, stimulation, and 3D tomographic imaging; it can also contribute to more complex tasks such as single-cell surgeries (e.g., nuclear transplantation, embryo micro-injections, and polar-body biopsy [39,40]) or micro-assembly [41].
In contrast to other techniques using magnetic or acoustic fields, the high spatial resolution of optical manipulation allows the straightforward implementation of collaborative tasks by several robots and additional degrees of freedom in individual robots. As future work, we plan to expand the platform in order to allow simultaneous control of several optical robots for collaborative tasks and add 3D force-sensing capabilities to these robots, similar to what was presented in [22]. We believe that teleoperated optical robots with force-feedback will be advantageous for numerous biomedical applications leading micromanipulation to a new level of interaction.

Supplementary Materials

The following are available online at https://www.mdpi.com/2072-666X/10/10/677/s1: Video S1: Teleoperated 3D rotation of a red blood cell in position mode; Video S2: Teleoperated 3D rotation of a red blood cell in velocity mode; Video S3: Indirect manipulation through a teleoperated optical robot for cell transportation; and Video S4: Indirect manipulation through a teleoperated optical robot for dexterous single-cell manipulation.

Author Contributions

Conceptualization, E.G., S.R. and S.H.; methodology, E.G., F.L., A.M. and Y.V.; software, E.G. and F.L.; validation, E.G., F.L. and Y.V.; formal analysis, E.G. and F.L.; investigation, E.G.; resources, A.M. and Y.V.; data curation, E.G. and F.L.; writing—original draft preparation, E.G. and F.L.; writing—review and editing, Y.V., S.R. and S.H.; visualization, E.G. and F.L.; supervision, S.R. and S.H.; project administration, S.H.; funding acquisition, S.R. and S.H.

Funding

This work was supported by the French National Research Agency through the ANR-IOTA project (ANR-16-CE33-0002-01), and partially by the French government research program “Investissements d’avenir” through the Robotex Equipment of Excellence (ANR-10-EQPX-44).

Acknowledgments

The authors would like to thank the team “Migration et différentiation des cellules souches hématopoïétiques” of the Institut de Biologie Paris Seine (IBPS) for their help in the preparation of the biological samples.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Li, J.; de Ávila, B.E.F.; Gao, W.; Zhang, L.; Wang, J. Micro/nanorobots for biomedicine: Delivery, surgery, sensing, and detoxification. Sci. Robot. 2017, 2.
  2. Medina-Sánchez, M.; Magdanz, V.; Guix, M.; Fomin, V.M.; Schmidt, O.G. Swimming microrobots: Soft, reconfigurable, and smart. Adv. Funct. Mater. 2018, 28, 1707228.
  3. Ceylan, H.; Giltinan, J.; Kozielski, K.; Sitti, M. Mobile microrobots for bioengineering applications. Lab Chip 2017, 17, 1705–1724.
  4. Ashkin, A.; Dziedzic, J.; Yamane, T. Optical trapping and manipulation of single cells using infrared laser beams. Nature 1987, 330, 769–771.
  5. Banerjee, A.; Chowdhury, S.; Gupta, S.K. Optical tweezers: Autonomous robots for the manipulation of biological cells. IEEE Robot. Autom. Mag. 2014, 21, 81–88.
  6. Curtis, J.E.; Koss, B.A.; Grier, D.G. Dynamic holographic optical tweezers. Opt. Commun. 2002, 207, 169–175.
  7. Gerena, E.; Regnier, S.; Haliyo, D.S. High-bandwidth 3D Multi-Trap Actuation Technique for 6-DoF Real-Time Control of Optical Robots. IEEE Robot. Autom. Lett. 2019, 4, 647–654.
  8. Maruyama, H.; Fukuda, T.; Arai, F. Functional gel-microbead manipulated by optical tweezers for local environment measurement in microchip. Microfluid. Nanofluid. 2009, 6, 383.
  9. Fukada, S.; Onda, K.; Maruyama, H.; Masuda, T.; Arai, F. 3D fabrication and manipulation of hybrid nanorobots by laser. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 2594–2599.
  10. Hayakawa, T.; Fukada, S.; Arai, F. Fabrication of an on-chip nanorobot integrating functional nanomaterials for single-cell punctures. IEEE Trans. Robot. 2014, 30, 59–67.
  11. Villangca, M.J.; Palima, D.; Bañas, A.R.; Glückstad, J. Light-driven micro-tool equipped with a syringe function. Light Sci. Appl. 2016, 5, e16148.
  12. Xie, M.; Shakoor, A.; Shen, Y.; Mills, J.K.; Sun, D. Out-of-plane rotation control of biological cells with a robot-tweezers manipulation system for orientation-based cell surgery. IEEE Trans. Biomed. Eng. 2018, 66, 199–207.
  13. Hu, S.; Xie, H.; Wei, T.; Chen, S.; Sun, D. Automated Indirect Transportation of Biological Cells with Optical Tweezers and a 3D Printed Microtool. Appl. Sci. 2019, 9, 2883.
  14. Gibson, G.; Barron, L.; Beck, F.; Whyte, G.; Padgett, M. Optically controlled grippers for manipulating micron-sized particles. New J. Phys. 2007, 9, 14.
  15. McDonald, C.; McPherson, M.; McDougall, C.; McGloin, D. HoloHands: Games console interface for controlling holographic optical manipulation. J. Opt. 2013, 15, 035708.
  16. Muhiddin, C.; Phillips, D.; Miles, M.; Picco, L.; Carberry, D. Kinect 4: Holographic optical tweezers. J. Opt. 2013, 15, 075302.
  17. Shaw, L.; Preece, D.; Rubinsztein-Dunlop, H. Kinect the dots: 3D control of optical tweezers. J. Opt. 2013, 15, 075303.
  18. Bowman, R.; Gibson, G.; Carberry, D.; Picco, L.; Miles, M.; Padgett, M. iTweezers: Optical micromanipulation controlled by an Apple iPad. J. Opt. 2011, 13, 044002.
  19. Onda, K.; Arai, F. Parallel teleoperation of holographic optical tweezers using multi-touch user interface. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 1069–1074.
  20. Onda, K.; Arai, F. Robotic approach to multi-beam optical tweezers with computer generated hologram. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 1825–1830.
  21. Pacoret, C.; Régnier, S. Invited article: A review of haptic optical tweezers for an interactive microworld exploration. Rev. Sci. Instrum. 2013, 84, 081301.
  22. Yin, M.; Gerena, E.; Pacoret, C.; Haliyo, S.; Régnier, S. High-bandwidth 3D force feedback optical tweezers for interactive bio-manipulation. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 1889–1894.
  23. Tomori, Z.; Keša, P.; Nikorovič, M.; Kaňka, J.; Jákl, P.; Šerỳ, M.; Bernatová, S.; Valušová, E.; Antalík, M.; Zemánek, P. Holographic Raman tweezers controlled by multi-modal natural user interface. J. Opt. 2015, 18, 015602.
  24. Grammatikopoulou, M.; Yang, G.Z. Gaze contingent control for optical micromanipulation. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 5989–5995.
  25. Gerena, E.; Legendre, F.; Régnier, S.; Haliyo, S. Robotic optical-micromanipulation platform for teleoperated single-cell manipulation. In Proceedings of the MARSS 2019–International Conference on Manipulation Automation and Robotics at Small Scales, Helsinki, Finland, 1–5 July 2019; p. 60.
  26. Neuman, K.C.; Block, S.M. Optical trapping. Rev. Sci. Instrum. 2004, 75, 2787–2809.
  27. Maruo, S.; Inoue, H. Optically driven micropump produced by three-dimensional two-photon microfabrication. Appl. Phys. Lett. 2006, 89, 144101.
  28. Vinoth, B.; Lai, X.J.; Lin, Y.C.; Tu, H.Y.; Cheng, C.J. Integrated dual-tomography for refractive index analysis of free-floating single living cell with isotropic superresolution. Sci. Rep. 2018, 8, 5943.
  29. Ashkin, A.; Dziedzic, J.M.; Bjorkholm, J.; Chu, S. Observation of a single-beam gradient force optical trap for dielectric particles. Opt. Lett. 1986, 11, 288–290.
  30. Visscher, K.; Gross, S.P.; Block, S.M. Construction of multiple-beam optical traps with nanometer-resolution position sensing. IEEE J. Sel. Top. Quantum Electron. 1996, 2, 1066–1076.
  31. Cao, B.; Kelbauskas, L.; Chan, S.; Shetty, R.M.; Smith, D.; Meldrum, D.R. Rotation of single live mammalian cells using dynamic holographic optical tweezers. Opt. Lasers Eng. 2017, 92, 70–75.
  32. Lin, Y.C.; Chen, H.C.; Tu, H.Y.; Liu, C.Y.; Cheng, C.J. Optically driven full-angle sample rotation for tomographic imaging in digital holographic microscopy. Opt. Lett. 2017, 42, 1321–1324.
  33. Kim, K.; Park, Y. Tomographic active optical trapping of arbitrarily shaped objects by exploiting 3D refractive index maps. Nat. Commun. 2017, 8, 15340.
  34. Rasmussen, M.B.; Oddershede, L.B.; Siegumfeldt, H. Optical tweezers cause physiological damage to Escherichia coli and Listeria bacteria. Appl. Environ. Microbiol. 2008, 74, 2441–2446.
  35. Blázquez-Castro, A. Optical Tweezers: Phototoxicity and Thermal Stress in Cells and Biomolecules. Micromachines 2019, 10, 507.
  36. Banerjee, A.G.; Chowdhury, S.; Gupta, S.K.; Losert, W. Survey on indirect optical manipulation of cells, nucleic acids, and motor proteins. J. Biomed. Opt. 2011, 16, 051302.
  37. Aekbote, B.L.; Fekete, T.; Jacak, J.; Vizsnyiczai, G.; Ormos, P.; Kelemen, L. Surface-modified complex SU-8 microstructures for indirect optical manipulation of single cells. Biomed. Opt. Express 2016, 7, 45–56.
  38. Xie, M.; Shakoor, A.; Wu, C. Manipulation of biological cells using a robot-aided optical tweezers system. Micromachines 2018, 9, 245.
  39. Shakoor, A.; Xie, M.; Luo, T.; Hou, J.; Shen, Y.; Mills, J.K.; Sun, D. Achieve automated organelle biopsy on small single cells using a cell surgery robotic system. IEEE Trans. Biomed. Eng. 2018.
  40. Wong, C.Y.; Mills, J.K. Cell extraction automation in single cell surgery using the displacement method. Biomed. Microdevices 2019, 21, 52.
  41. Köhler, J.; Ksouri, S.I.; Esen, C.; Ostendorf, A. Optical screw-wrench for microassembly. Microsyst. Nanoeng. 2017, 3, 16083.
Figure 1. Optical-micromanipulation platform for dexterous single-cell handling.
Figure 2. Example of 3D teleoperation using three groups of traps. The operation is a succession of four different tasks demonstrating the system's capabilities. The first task (1, 2 and 3) is a flip around the y-axis of three micro-beads. The second task (4, 5 and 6) shows the 3D control of four micro-beads. The third task (7, 8 and 9) presents the 'radial' operating mode of the gripper. Finally, the last task (10, 11 and 12) presents the travel of one micro-bead around all the others. Scale bars are 5 μm long.
Figure 3. Static precision tasks. (a) Four micro-beads positioned in a square of 40 × 40 μm. The maximum error between the command and the actual position is 400 nm. (b) Thirteen micro-beads set to cover a 40 × 40 μm square. The maximum position error is up to 1 μm and is essentially due to Brownian motion. Positions are measured for 15 s.
Figure 4. Example of teleoperated rotation in position mode of a group of four micro-beads. (a) The master device orientation and the command sent to the actuators. Z angle is the angle of rotation around the z-axis. The scaling factor is set to 5 in order to allow a complete rotation. The measured angle is estimated from the measured position of trap T1. (b) The 2D trajectory followed by trap T1. (c) The measured position of trap T1 computed from the video and the set-point. Estimated error is 0.31 μm with a standard deviation of 0.23 μm.
Figure 5. Teleoperated rotation in velocity mode of one micro-bead at different Z positions. (a) Pictures of a trapped bead in three different axial configurations. Scale bar is 5 μm long. (b) Velocity reference and measured velocity in three different configurations. (c) X and Y positions of trap 1 in the "Z = 0 μm" configuration at speeds of 21 μm/s and 462 μm/s. The maximal reachable velocity without losing the traps is 462 μm/s, with an error of 4%.
Figure 6. Teleoperated rotation in velocity mode of four micro-beads at different Z positions. (a) Pictures of four trapped beads in different axial configurations. Scale bar is 5 μm long. (b) Velocity reference and measured velocity in four different configurations. (c) X and Y positions of trap 1 in the "Z = 0 μm" configuration at speeds of 21 μm/s and 105 μm/s. The maximal reachable velocity without losing the traps is 105 μm/s, with an error of 5%.
Figure 7. 3D rotation control of an erythrocyte driven by 8 optical traps. (a) Schematic 3D depiction of the erythrocyte and the reference frame used in the 3D control. (b) Schematic 2D depiction of the erythrocyte with dimensions in the y–x and z–x planes. (c) Time-lapse images from Supplementary Videos S1 and S2 demonstrating multi-DoF cell rotational control. Scale bar is 2 μm long.
Figure 8. Robot with a shovel-shaped end-effector for cell transport. (a) Scanning electron microscopy (SEM) image of the robot. (b) Schematic depictions with dimensions of the robot. (c) 2D trajectory of the teleoperated robot with an inset of the 3D trajectory. Note that the frames between T: 20 s and T: 30 s show the erythrocytes out of focus, as the robot is elevated in the Z-direction. A cluster of cells (colored in violet) is then transported for 80 μm. Results are also shown in Supplementary Video S3.
Figure 9. Robot equipped with a fork end-effector for single-cell manipulation. (a) Scanning electron microscopy (SEM) image of the robot. (b) Schematic depictions with dimensions of the optical robot. (c) Two sets of experiments showing the manipulation of a single red blood cell and its 2D trajectory. Please see Supplementary Video S4 for complete footage of these experiments.
Table 1. Summary of the system parameters.

Parameter | Value
Teleoperation loop | 200 Hz
Simultaneous optical traps | >15
Translation range | 200 × 200 × 200 μm³
Rotation range | 70 × 50 × 9 μm³
Max. translation velocity (1T, 4T) * | 1500, 110 μm/s **
Max. rotation velocity (1T, 4T) * | 462, 105 μm/s **
Static error (1T, 4T) * | <200 nm, 400 nm **
* Depends on the trapped object; ** for 3 μm polystyrene beads.
