*2.2. Orientation of Vehicle*

The next consideration for loading is to park the vehicle next to the freight in such a manner that the vehicle's right side is facing the freight, as shown in Figure 3, since the loading mechanism can only operate from the right-hand side of the vehicle. The maximum tolerance for collection of freight between the freight and vehicle is 300 mm [29]. This limitation, along with the width of the vehicle, defines our parking spot's total width.

**Figure 3.** Parking w.r.t freight.

Furthermore, the vehicle is equipped with two separate loading bays, as shown in Figure 4. The freight has to be loaded in the vacant loading bay, i.e., if bay 1 is occupied, then load the freight in bay 2. As the loading bays are not centrally aligned with the freight, the vehicle has to adjust its parking spot accordingly so that the freight can be loaded in the available loading bay. Furthermore, the freight also needs to be aligned with the center of the forklift of the loading bays (as shown in Figure 4) for the forklifts to be inserted into the freight pallet.

**Figure 4.** Vehicle dimensions w.r.t loading bays.

#### *2.3. Defining Parking Spot Reference to Freight*

The vehicle dimensions are given in Table 1. The width of the vehicle plus the tolerance to the freight defines the width of the parking space. To maintain a reasonable amount of parking maneuvers (≤3 for typical cases) [30], the length of the parking space is chosen to be 2 m more than the length of the vehicle.

**Table 1.** Vehicle dimensions in meters.


Taking the loading of the freight and vehicle dimensions into consideration, the overall parking spot defined for the vehicle is a box of 2.1 × 6.0 m, as shown in Figure 3.

#### **3. Software and Control Architecture**

To speed up the development and integration process with the real experimental vehicle, the Robot Operating System (ROS)-based autonomous vehicle software architecture ICARS (software being developed at LS2N (Laboratoire des Sciences du Numérique de Nantes), www.ls2n.fr, accessed on 15 July 2021) will be exploited (Figure 5). Considering the task to be accomplished, particular interest is placed on the Multi-Sensor-Based Predictive Controller (MSBPC) used for parking [31]. As the name suggests, the parking controller combines Model Predictive Control (MPC) and Multi-Sensor-Based Control in order to perform safe (collision-free) parking operations that rely solely on locally perceived sensor feature signals, without needing to explicitly plan any path. Moreover, since the technique exploits locally perceived sensor features at each time instant, no localization system is inherently required. The MSBPC approach is now recalled.

An internal-model-control (IMC) structure [32] is used as a basis for formalizing the MSBPC approach (Figure 6). The robotized vehicle and perception system compose the System block. The input to this block is the control variable **v**<sub>r</sub> = [*v*, *δ*]<sup>T</sup>, where the longitudinal velocity is denoted by *v* and the steering angle by *δ*, while its output is the current value of the sensor features **s** (i.e., corners of the parking spot). The reference signal **s**<sup>∗</sup> is the desired value of the output **s**. General discrepancies between the current sensor features and the values that were predicted from the model (e.g., modeling errors and disturbances) are represented by the error signal *ϵ*:

$$\boldsymbol{\epsilon}(n) = \mathbf{s}(n) - \mathbf{s}\_{\text{mp}}(n) \tag{1}$$

where *n* denotes the current time.

**Figure 6.** Control structure [31].

The difference between the desired value **s**<sup>d</sup> and the predicted model output **s**mp is minimized by an optimization algorithm. Following Figure 6:

$$\mathbf{s}\_{\mathbf{d}}(n) = \mathbf{s}^\*(n) - \boldsymbol{\epsilon}(n) = \mathbf{s}^\*(n) - (\mathbf{s}(n) - \mathbf{s}\_{\text{mp}}(n)),\tag{2}$$

from where one can deduce

$$\mathbf{s}\_{\rm d}(n) - \mathbf{s}\_{\rm mp}(n) = \mathbf{s}^\*(n) - \mathbf{s}(n),\tag{3}$$

thus, tracking **s**<sup>∗</sup> with **s** is equivalent to tracking **s**<sup>d</sup> with **s**mp.

The interaction model described in [31] is used in order to predict the evolution of the sensor features **s**mp over a finite horizon *Np*. The cost function is to be minimized with respect to a control sequence **v**˜*<sup>r</sup>* over *Np* and depends mainly on the difference between **s**<sup>d</sup> and **s**mp. As with any classical Model Predictive Control technique, only the first element **v***r*(*n*) of the optimal control sequence is applied to the system at each iteration.

The controller (implemented in C++) runs online at 10 Hz using the solver NLopt [33] with a Sequential Least Squares Programming (SLSQP) algorithm [34]. Furthermore, it is assumed that both the vehicle's longitudinal velocity and steering angle are controllable; thus, lower-level controllers that directly interface with the actuators are out of the scope of the parking approach.
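The receding-horizon principle described above (optimize over the horizon, apply only the first control, re-plan) can be illustrated with a minimal sketch. This is not the ICARS implementation (which runs in C++ with NLopt's SLSQP solver); it is a toy 1D system with a brute-force search standing in for the solver, purely to show the structure of one MPC iteration:

```python
import itertools

def predict(x, controls, dt=0.1):
    """Roll a trivial 1D model x' = x + v*dt over a candidate control sequence."""
    traj = []
    for v in controls:
        x = x + v * dt
        traj.append(x)
    return traj

def mpc_step(x, x_ref, horizon=5, v_options=(-1.0, 0.0, 1.0)):
    """Pick the control sequence minimizing tracking error over the horizon,
    then return only its FIRST element (receding-horizon principle)."""
    best_seq, best_cost = None, float("inf")
    for seq in itertools.product(v_options, repeat=horizon):
        cost = sum((x_ref - xp) ** 2 for xp in predict(x, seq))
        if cost < best_cost:
            best_cost, best_seq = cost, seq
    return best_seq[0]

# Drive the state from 0 toward the reference 1.0, re-planning each step
x = 0.0
for _ in range(30):
    v = mpc_step(x, 1.0)
    x += v * 0.1  # apply only the first control, then re-plan
```

The real controller replaces the brute-force search with SLSQP and the toy model with the interaction model of [31], but the loop structure is the same.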

The simulation inputs are four points generated based on a parked pose and the type of parking spot. This information is then published as a topic to the control node, which, in turn, publishes a control command as a topic as well. The complete software is constructed to be agnostic as to whether it is running in a simulation environment or on a real vehicle.

Therefore, with the experimental vehicle interfaced with ICARS, two additional ROS nodes would have to be developed: a perception node to extract the freight's pose from sensory data and another node to generate a parking spot next to the freight. Once the parking spot has been successfully generated from the sensory data, the four corners that define it would be sent to the ICARS parking controller to park the vehicle in the desired pose to pick up the freight.

#### **4. Mathematical Modeling and Notation**

#### *4.1. Vehicle Kinematic Model*

The kinematic model for a rear-wheel-driven vehicle, taken from [8], is given by Equation (4). The freight vehicle follows the same kinematic model for its drive.

$$
\begin{bmatrix}
\dot{x} \\
\dot{y} \\
\dot{\theta} \\
\dot{\phi}
\end{bmatrix} = \begin{bmatrix}
\cos\theta \\
\sin\theta \\
\tan\phi/l\_{wb} \\
0
\end{bmatrix} v + \begin{bmatrix}
0 \\
0 \\
0 \\
1
\end{bmatrix} \dot{\phi} \tag{4}
$$

where *v* and *φ̇* are the longitudinal and steering velocities, respectively. Since parking maneuvers are performed at low speed, the kinematic model can be considered accurate enough. Further notation and an elaboration of the model given by Equation (4) are presented in Figure 7a. The vehicle used for the parking evaluation is shown in Figure 7b. There is a designated space for the vehicle operator, as shown in the figure. The battery pack and electronics bay holds the complete electronics for the operation of the vehicle.

**Figure 7.** (**a**) Vehicle kinematic model. (**b**) FURBOT vehicle.
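For low-speed simulation, the kinematic model of Equation (4) can be integrated numerically. A minimal Euler-integration sketch follows; the wheelbase value here is illustrative only (the actual dimensions are those of Table 1):

```python
import math

L_WB = 2.5  # wheelbase in metres (illustrative; the real value is in Table 1)

def kinematic_step(state, v, phi_dot, dt):
    """One Euler step of the rear-wheel-drive kinematic model, Equation (4).
    state = (x, y, theta, phi): position, heading, steering angle."""
    x, y, theta, phi = state
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += v * math.tan(phi) / L_WB * dt
    phi += phi_dot * dt
    return (x, y, theta, phi)

# Straight-line check: zero steering keeps the heading constant
state = (0.0, 0.0, 0.0, 0.0)
for _ in range(100):
    state = kinematic_step(state, v=1.0, phi_dot=0.0, dt=0.01)
# after 1 s at 1 m/s the vehicle has advanced about 1 m along x
```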

### *4.2. Points Acquisition for Parking Spot*

To correctly identify parking spots for the vehicle, the freight corners (highlighted in red in Figure 1) must be acquired through sensor feedback. Using these points, the freight center point is calculated with Equation (5).

$$x\_{fc} = \frac{x\_{f1} + x\_{f2}}{2}, \quad y\_{fc} = \frac{y\_{f1} + y\_{f2}}{2} \tag{5}$$

Furthermore, these two corner points are also used to calculate the inclination angle *θ<sub>f</sub>* of the freight using basic trigonometry (Equation (6)).

$$\theta\_f = \tan^{-1}(\frac{y\_{f2} - y\_{f1}}{x\_{f2} - x\_{f1}}) \tag{6}$$

Using the width of the parking spot *dw* and the freight center point, the vehicle center point is calculated with Equation (7).

$$x\_{rc} = x\_{fc} + \frac{d\_w}{2} \cos\left(\theta\_f + \frac{3\pi}{2}\right), \quad y\_{rc} = y\_{fc} + \frac{d\_w}{2} \sin\left(\theta\_f + \frac{3\pi}{2}\right) \tag{7}$$

The above-mentioned points are further explained in Figure 8. Using these points, freight collection parking spots are calculated. This is further explained in the next subsection.

**Figure 8.** Point definition for parking spot.
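Equations (5)–(7) can be sketched directly in code. The following is a minimal Python illustration (not the ICARS implementation); the freight edge and the 2.1 m parking width used in the usage example are taken from Section 2.3:

```python
import math

def freight_center_and_angle(p1, p2):
    """Equations (5)-(6): midpoint and inclination angle of the freight edge
    defined by the two acquired corner points."""
    (x1, y1), (x2, y2) = p1, p2
    xc, yc = (x1 + x2) / 2, (y1 + y2) / 2
    theta_f = math.atan2(y2 - y1, x2 - x1)
    return (xc, yc), theta_f

def vehicle_center(freight_center, theta_f, d_w):
    """Equation (7): offset the freight center by half the parking width,
    perpendicular to the freight edge (angle theta_f + 3*pi/2)."""
    xfc, yfc = freight_center
    xrc = xfc + d_w / 2 * math.cos(theta_f + 3 * math.pi / 2)
    yrc = yfc + d_w / 2 * math.sin(theta_f + 3 * math.pi / 2)
    return (xrc, yrc)

# Freight edge from (0, 0) to (2, 0), parking width 2.1 m (Section 2.3):
# the vehicle center lands half a width to the freight's right-hand side.
center, theta = freight_center_and_angle((0.0, 0.0), (2.0, 0.0))
rc = vehicle_center(center, theta, d_w=2.1)
```

Note that `atan2` is used rather than a plain arctangent so that the freight angle is recovered correctly in all four quadrants.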

#### *4.3. Parking Area with Respect to Loading Bays*

As the vehicle is equipped with two loading bays, the parking spot w.r.t each loading bay is different. To load cargo into the front loading bay, the vehicle needs to park a certain distance behind its center point, as shown in Figure 4. If the first loading bay is occupied by previously loaded freight, the vehicle instead needs to park slightly ahead of its center point so that the forks are perfectly aligned with the freight, loading it into the second loading bay. The center point of the parking spot thus varies depending upon the availability of the loading bays, as expressed by Equation (8).

$$x\_{pc} = \begin{cases} x\_{rc} + d\_{b1} \cos \theta\_f \\ x\_{rc} - d\_{b2} \cos \theta\_f \end{cases}, \quad y\_{pc} = \begin{cases} y\_{rc} + d\_{b1} \sin \theta\_f & \text{if } bay1 = 1 \\ y\_{rc} - d\_{b2} \sin \theta\_f & \text{otherwise} \end{cases} \tag{8}$$

where *db*<sup>1</sup> and *db*<sup>2</sup> are the positive distances of loading bays 1 and 2 from the vehicle center (measurements shown in Figure 4), and (*xpc*, *ypc*) is the center point of the parking spot. The condition *bay*1 = 1 denotes that loading bay 1 is available. If bay 1 is available to load the freight, the first case applies; otherwise, the parking spot is generated w.r.t the second loading bay.
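The bay selection of Equation (8) reduces to a signed shift along the freight direction. A minimal sketch, where the bay distances are illustrative placeholders (the real values are the ones shown in Figure 4):

```python
import math

def parking_center(rc, theta_f, bay1_free, d_b1=1.0, d_b2=1.0):
    """Equation (8): shift the parking center forwards by d_b1 when loading
    bay 1 is free, backwards by d_b2 otherwise. d_b1 and d_b2 are
    illustrative values, not the vehicle's actual bay distances."""
    xrc, yrc = rc
    d = d_b1 if bay1_free else -d_b2
    return (xrc + d * math.cos(theta_f), yrc + d * math.sin(theta_f))

# Freight at 0 degrees: bay 1 free shifts forwards, otherwise backwards
pc_front = parking_center((0.0, 0.0), 0.0, bay1_free=True)
pc_rear = parking_center((0.0, 0.0), 0.0, bay1_free=False)
```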

After defining the center of the parking spot, the four corners of the parking spot can be defined through Equations (9)–(12) using the parking width (*dw*), parking length (*dl*), angle of the freight (*θf*) and parking center point (*xpc*, *ypc*). Obtaining the first corner of the parking pose requires both the parking pose width and length. The derivation is based on trigonometric relations and a point transformation from the known parking center point to the first corner of the parking pose. Since all four sides of the parking pose are parallel or perpendicular to the freight, each subsequent corner is obtained by adding the respective angle (90°, 180°, 270°) to the freight angle *θf*.

$$x\_{p1} = x\_{pc} + \frac{d\_w}{2} \cos\left(\theta\_f + \frac{\pi}{2}\right) + \frac{d\_l}{2} \cos\left(\theta\_f + \pi\right), \quad y\_{p1} = y\_{pc} + \frac{d\_w}{2} \sin\left(\theta\_f + \frac{\pi}{2}\right) + \frac{d\_l}{2} \sin\left(\theta\_f + \pi\right) \tag{9}$$

$$x\_{p2} = x\_{p1} + d\_w \cos\left(\theta\_f + \frac{3\pi}{2}\right), \quad y\_{p2} = y\_{p1} + d\_w \sin\left(\theta\_f + \frac{3\pi}{2}\right) \tag{10}$$

$$x\_{p3} = x\_{p2} + d\_l \cos\left(\theta\_f\right), \quad y\_{p3} = y\_{p2} + d\_l \sin\left(\theta\_f\right) \tag{11}$$

$$x\_{p4} = x\_{p3} + d\_w \cos\left(\theta\_f + \frac{\pi}{2}\right), \quad y\_{p4} = y\_{p3} + d\_w \sin\left(\theta\_f + \frac{\pi}{2}\right) \tag{12}$$

The points (*xp*1, *yp*1), (*xp*2, *yp*2), (*xp*3, *yp*3) and (*xp*4, *yp*4) are the four consecutive corners of the parking pose. These points are further explained later.
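Equations (9)–(12) amount to walking around the rectangle corner by corner. A minimal Python sketch, using the 2.1 × 6.0 m spot of Section 2.3 in the usage example:

```python
import math

def parking_corners(pc, theta_f, d_w, d_l):
    """Equations (9)-(12): the four corners of the parking pose, obtained by
    stepping around the rectangle, adding multiples of pi/2 to theta_f."""
    xpc, ypc = pc
    # Equation (9): first corner, half a width to one side and half a length back
    x1 = xpc + d_w / 2 * math.cos(theta_f + math.pi / 2) \
             + d_l / 2 * math.cos(theta_f + math.pi)
    y1 = ypc + d_w / 2 * math.sin(theta_f + math.pi / 2) \
             + d_l / 2 * math.sin(theta_f + math.pi)
    # Equations (10)-(12): traverse each side of the rectangle in turn
    x2 = x1 + d_w * math.cos(theta_f + 3 * math.pi / 2)
    y2 = y1 + d_w * math.sin(theta_f + 3 * math.pi / 2)
    x3 = x2 + d_l * math.cos(theta_f)
    y3 = y2 + d_l * math.sin(theta_f)
    x4 = x3 + d_w * math.cos(theta_f + math.pi / 2)
    y4 = y3 + d_w * math.sin(theta_f + math.pi / 2)
    return [(x1, y1), (x2, y2), (x3, y3), (x4, y4)]

# 2.1 x 6.0 m spot (Section 2.3) centered at the origin, freight at 0 degrees
corners = parking_corners((0.0, 0.0), 0.0, d_w=2.1, d_l=6.0)
```

For *θf* = 0 the corners form the axis-aligned rectangle (−3, 1.05), (−3, −1.05), (3, −1.05), (3, 1.05), confirming that corners 1–2 mark one end of the pose and corners 3–4 the other.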

In summary, the whole pose generation solution is as follows. The vehicle is required to detect the freight and acquire its corner points from the side where it can be loaded. Afterwards, Equations (5)–(12) are solved to obtain the correct pose for loading the freight into the vehicle. The complete process is summarized in the flowchart presented in Figure 9.

**Figure 9.** Flowchart for parking pose generation.

#### **5. Results**

Using the mathematical equations and notation discussed in the previous section, the vehicle parking spot node was run and tested to verify that the parking spot is generated according to the loading bays. The results are discussed in the sections below.

#### *5.1. Parking Spot Definitions w.r.t Loading Bays*

The results are first validated for generating the respective parking spot for front and rear loading bays. Figure 10 shows the output of parking pose generated for randomly positioned freight at a 45 degree angle. The blue-colored text represents the points for the parking spot definition concerning the front loading bay, whereas the red-colored text represents the parking spot concerning the rear loading bay. The red line from the parking spot center towards freight represents the right-hand side of the vehicle, thus specifying the heading of the vehicle.

**Figure 10.** Parking spot with respect to loading bays.

From Figure 10, we can observe that the vehicle parks behind its center point when loading freight into the first loading bay, whereas it parks slightly ahead of its center point when loading freight into the second loading bay. Furthermore, points 1 and 2 mark the front end of the generated parking pose, and points 3 and 4 mark its rear end. These results are coherent with the physical vehicle anatomy and the proposed mathematical methodology, hence validating our results.

#### *5.2. Results for Different Freight Placement*

As it cannot be guaranteed that the orientation of the freight is perfectly aligned with any pre-determined reference, the parking spot for the vehicle must be generated automatically, taking the orientation of the freight into account. This condition was already accounted for in the mathematical modeling of the parking spot. The generated code is validated for different orientations of the freight (Figure 11) to verify that the correct parking spot is generated irrespective of the freight's orientation.

Figure 11 shows four different orientations of the freight (225, 0, 90 and 135 degrees, respectively) and the respective parking spot generated for each loading bay. Irrespective of the orientation of the freight, the parking spot is generated correctly for collecting the freight into the available loading bay.

**Figure 11.** Parking spot relative to oriented loading bays.

Furthermore, the results presented validate the purpose of generating a parking pose autonomously irrespective of the orientation of the freight. They also show that the vehicle always parks with the freight on its right-hand side so that it can be loaded into the available loading bay, i.e., the heading of the vehicle and the loading bay location are always taken into consideration during the autonomous generation of a parking pose. The relation of the freight w.r.t the vehicle is shown by the red line drawn from the center of the pose towards the freight, which gives the vehicle the direction of the parking pose and its heading angle. The parking pose points generated with Equations (9)–(12) also convey the heading: points 1 and 2 depict where the front of the vehicle should face, and points 3 and 4 depict the rear end of the vehicle.

#### *5.3. Vehicle Parking in ROS Environment*

As discussed in Section 3, the ROS-based software architecture ICARS is exploited to speed up the actual experiments. Using the parking schemes already developed within the software architecture [8,9,31], we can park the vehicle in the designated parking pose. The existing parking algorithms are sufficient for maneuvering the vehicle into the parking pose generated autonomously for freight collection. Figure 12 shows the time-wise parking maneuver of the FURBOT vehicle into the parking pose generated for the freight. Due to the constrained environment, the vehicle cannot park immediately after detecting the freight, because it must keep its right side towards the freight. Thus, the vehicle has to move ahead of the freight and park while reversing, as shown in Figure 12. These results are generated in the ICARS software architecture with input from simulated sensors mounted on the vehicle.

In Figure 12, the orange box depicts the freight placement and location. The green rectangle represents the parking pose generated w.r.t freight. The red trail left behind the vehicle denotes the performed parking maneuver, and the red lines in front or at the back of the vehicle depict the direction and steering angle of the vehicle.

**Figure 12.** Vehicle parking w.r.t freight at different time intervals.

The virtual experiment is performed on maps already installed in the ICARS software architecture. The environment is constrained by the buildings surrounding the area. Furthermore, the parking algorithm can detect and avoid obstacles, e.g., pedestrians, provided the vehicle can detect them [31]. The results validated in the ICARS environment show that the methodology for generating such a parking pose for autonomous freight collection is valid and resolves the issue of autonomous parking pose generation w.r.t freight.

#### **6. Conclusions and Perspectives**

The aim of this research was to highlight the gap in currently available research and to resolve autonomous parking for freight collection. To address this issue, the orientation and location of the freight are the key inputs to the proposed algorithm. Using these key values, we can identify the parking pose that lets the vehicle load the freight autonomously.

The major contribution of this research is resolving the issue of parking the vehicle next to the freight so that the freight can be loaded into the vehicle. The hurdle of generating a parking spot that leads to successful loading of the freight is addressed. The control architecture and vehicle dynamics were previously built and checked against different parking conditions, i.e., perpendicular, parallel and angled parking. This enabled the research to focus on the parking pose of the vehicle for freight collection. The solution required an algorithm that can define a parking pose for our unique problem of loading the freight autonomously; with this research, this issue is resolved.

To obtain the correct pose for autonomous loading in the respective bays of the vehicle, detection of the freight in the surroundings is required. Once localization of the freight is attained, the proposed mathematical model suffices to generate the correct parking pose for the vehicle. However, the detection and localization of the freight are out of the scope of this research and will be addressed in our future work. The parking pose length and width are defined keeping the minimal required maneuvers and the distance to the freight into consideration.

Consideration is also given to loading the freight into the correct loading bay: separate parking spots are generated depending upon which loading bay is to be used. Simulating the freight with different orientations and generating the parking spot accordingly is also verified. The mathematical model confirms that, provided the freight is correctly identified in the environment, the generated parking spot will be correct irrespective of the orientation or location of the freight.

The main distinctive factor for the vehicle while demonstrating level 4 autonomy is the autonomous collection of freight, which sets an electric freight vehicle apart from conventional vehicles. With the help of the previously designed parking controller, the identification of the freight through sensor feedback and the definition of the parking pose through this research, the vehicle can now align itself with the freight for autonomous collection, increasing its level of autonomy.

This research provides an alternative approach for parking pose generation, especially with reference to an inanimate object. Researchers working on autonomous freight handling can directly benefit from it, and it can further be used to develop parking pose algorithms wherever it is necessary to park against an inanimate object, e.g., a bus stop or a door to a facility or delivery location. To the best of our knowledge, previous research in this field is scarce, and related solutions typically require continuous sensor input to reach their goal. The proposed solution requires comparatively minimal sensor input (for identification and localization); once the pose is attained, it does not require continuous sensor feedback.

Future work involves the use of correct sensors (3D-Lidar in particular) for extracting information from the freight from the environment. This includes extracting data through 3D-Lidar's point-cloud data and reconstructing the freight in a virtual environment. Afterwards, the information from point-cloud data will extract the orientation and freight corner points. This perception module will help us extract the freight pose, which will later be used by the parking pose module for the autonomous generation of a parking spot regarding the freight.

**Author Contributions:** Conceptualization, K.M., V.F. and M.Z.; methodology, K.M., D.P.M. and V.F.; validation, M.Z. and V.F.; formal analysis, K.M. and D.P.M.; investigation, K.M. and V.F.; resources, M.Z. and V.F.; data curation, K.M.; writing—original draft preparation, K.M., D.P.M. and V.F.; writing—review and editing, K.M., D.P.M. and V.F.; visualization, K.M., D.P.M. and V.F.; supervision, R.M., V.F. and M.Z.; project administration, R.M., V.F. and M.Z.; and funding acquisition, R.M., V.F. and M.Z. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research received no external funding.

**Institutional Review Board Statement:** Not applicable.

**Informed Consent Statement:** Not applicable.

**Acknowledgments:** This paper was supported by the European Union's Horizon 2020 research and innovation program under grant agreement No. 875530, project SHOW (SHared automation Operating models for Worldwide adoption).

**Conflicts of Interest:** The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

#### **Abbreviations**

The following abbreviations are used in this manuscript:

FURBOT: Freight Urban Robotic Vehicle
SHOW: SHared automation Operating models for Worldwide adoption

#### **References**

