Article

Marker-Based 3D Position-Prediction Algorithm of Mobile Vertiport for Cabin-Delivery Mechanism of Dual-Mode Flying Car

Korea Electrotechnology Research Institute, Changwon-si 51543, Gyeongsangnam-do, Korea
* Author to whom correspondence should be addressed.
Electronics 2022, 11(12), 1837; https://doi.org/10.3390/electronics11121837
Submission received: 10 March 2022 / Revised: 25 May 2022 / Accepted: 26 May 2022 / Published: 9 June 2022
(This article belongs to the Special Issue Urban Air Mobility)

Abstract
This paper presents an image-processing technique for local localization, docking, and cabin delivery at a mobile station, a mobile vertiport for dual-mode flying cars. A dual-mode flying car, comprising an aerial electric vehicle (AEV), a ground electric vehicle (GEV), and a cabin, is a future means of transportation that can operate both in the air and on the ground. A landing site is necessary for an AEV to land at a specific position; since the proposed AEV uses vertical take-off and landing, a vertiport is required. Because vertical take-off and landing sites demand considerable space and are difficult to operate at multiple positions, we propose a mobile vertiport that fits into a small space. A mobile station is well suited to dual-mode flying cars, since it provides critical functions such as transporting AEVs on the ground and charging, as well as a cabin-delivery system. The mobile station generates a path to a landing AEV by estimating its position and calculating the relative position from markers attached to the AEV. It then detects precision-correction markers to travel to the exact position of the AEV and performs fine position correction for cabin delivery. To increase the success rate of cabin delivery, docking markers are identified, and the angular position error between the mobile station and the cabin is computed and corrected. The experimental results revealed a mechanically correctable error range that encompassed all experimental values. Consequently, this study shows that image processing can be used to realize a mobile station for dual-mode flying cars.

1. Introduction

Urbanization is advancing at a breakneck pace worldwide. According to the United Nations Economic and Social Council, the global urban population has exceeded the rural population since 2010, with an urbanization rate of 55.5% as of 2018. The rural population has been decreasing since 2020, while the urban population is expected to grow continuously; consequently, the global urbanization rate is projected to reach 68.4% by 2050. The UN's analysis found that there were once only 10 megacities with populations of more than 10 million people, but that number had risen to 33 as of 2018 and is expected to reach 43 by 2030. This urbanization has created a slew of problems in sectors such as transportation, housing, the environment, and energy. In particular, cars clogging city streets cause severe traffic congestion and environmental pollution, resulting in significant socioeconomic losses. For example, Americans lose an average of 97 h per year to traffic congestion, equating to a loss of USD 1348 per person. The urban traffic problem is more than just a traffic problem; it leads to additional issues including energy waste, pollution, and noise. Combining electric-powered autonomous cars with sharing-platform technologies is expected to alleviate some of this congestion and these environmental issues; however, given the current population concentration in cities, it is limited as a fundamental solution. On the ground, autonomous driving and sharing platforms are seen as a better way to distribute urban traffic resources that are currently managed inefficiently, through digital disruption [1]. Distributing resources efficiently is necessary, but a fundamental solution requires a new space resource. Owing to the saturation of ground and underground spaces, the urban traffic problem can no longer be managed by utilizing two-dimensional planar space alone.
To accomplish this, the National Aeronautics and Space Administration has designated the city's short-distance air-transport ecosystem as the foundation for a new metropolis that relies on low-altitude airspace for urban air mobility (UAM) [2,3,4,5,6,7,8,9]. UAM is now drawing attention across a variety of industries and markets and has been investigated as a future mode of transportation that uses the air to alleviate traffic congestion on the ground. Since it is difficult to establish a runway in the city center, the personal air vehicle (PAV) [10,11,12,13,14,15,16], a form of air transportation, has been researched as a vehicle that can take off and land vertically. For vertical take-off and landing, the PAV requires a separate vertiport; consequently, vertiports [17,18,19,20] have been investigated alongside PAVs.
Various transportation platforms and construction businesses in Korea and abroad are researching vertiports, the majority of which are fixed. A fixed vertiport is useful for managing and utilizing a large number of PAVs and serving passengers, as well as for business expansion through value creation via connections to other modes of transportation. However, because of its size, it requires a significant investment and a considerable amount of space with dedicated buildings. Meanwhile, the dual-mode flying car, a hybrid of aerial electric vehicle (AEV) and ground electric vehicle (GEV) platforms, is coming closer to being a viable concept for moving people or goods between and within cities in the future. A dual-mode flying car is a mode of transportation in which AEVs travel between distant cities and GEVs deliver passengers to their destinations within complex cities; it can serve as a transportation system for passenger transport, logistics, and other services. In particular, in densely populated places such as many Asian regions, a vertiport that requires a large space may not be feasible. This paper suggests a mobile vertiport for dual-mode flying cars as a solution to this challenge. A special vertiport is required for docking and undocking passenger cabins on AEVs and GEVs; in a megacity with dense building environments, however, dedicated vertiport structures face severe space constraints. Since a mobile vertiport does not require a great amount of room and can fit into a small space, it can be deployed at a variety of positions around a city. Consequently, the number of urban stations available to passengers grows, allowing passengers to travel closer to their chosen destinations. A mobile vertiport, or mobile station, is proposed in this paper. To travel to the AEV and GEV, the mobile station should be able to recognize their locations.
A target's position should be recognized using cameras or sensors; here, the positions of the AEV and GEV are detected using three-dimensional (3D) depth cameras so that the preferred object can be located accurately among many objects, barriers, and similar items. We investigated a mobile station for dual-mode flying cars that recognizes the preferred platform using markers attached to each platform, calculates relative coordinates from the markers, and uses docking-image processing for precise position correction and cabin delivery when moving a cabin from the mobile platform to the AEV or GEV. With a fixed vertiport, the flying AEV must move to the exact vertiport position; the vertiport must be enlarged to accommodate landing inaccuracy, and it is difficult to control the AEV to land at the correct position under adverse weather conditions. Moreover, depending on the vertiport's accuracy, the AEV may well have to land elsewhere, whereas the proposed mobile vertiport reduces the burden of controlling the AEV's accuracy because it can move to a large vacant lot or other landing area. To this end, we propose the mobile station, a mobile vertiport, and present a location-detection and docking system based on the image data available in the proposed system and the image-processing algorithm of the mobile station for dual-mode flying cars as a future means of transportation. This image-processing algorithm provides marker-based precise position correction to estimate the position of the platform and deliver a cabin.

2. Position-Prediction Technology for Cabin Delivery of the Mobile Station

2.1. Marker-Based AEV’s Position-Information Acquisition (Local Localization)

To utilize various functions such as cabin delivery and recharging, the mobile station should move to the spot where an AEV has landed. It is important to precisely detect the position of the AEV for the mobile station to move. Sensor data can be used to check the AEV position, but sensor errors occur; consequently, a mechanism for obtaining precise position data is required. This can be accomplished using the suggested image-processing algorithm, which obtains position information by detecting markers attached to the AEV's body via the mobile station's cameras.
The red circles in Figure 1 show the markers attached to the body, and these markers are recognized to obtain the position information of the AEV. The AEV’s markers are recognized using a depth camera attached to the mobile station, and a relative position between the mobile station and AEV is calculated by computing a distance and angle to the marker.
Figure 2 shows the mobile station model and the positions of the cameras to recognize the AEV markers. The mobile station employs a total of six cameras to estimate the position. Here, Cam1 and Cam2 are used to estimate the AEV’s position.
The distance and angle to the marker are measured by the two depth cameras, Cam1 and Cam2, and the distance and angle at the center point between the cameras are then calculated using trigonometric functions.
Figure 3 shows how to calculate the distance and angle to the marker from the center point of the camera. Table 1 describes the parameters used in Figure 3. A distance between the camera and object can be calculated using triangulation as follows:
$$\alpha^2 = p_1^2 + F^2$$
$$F^2 = \alpha^2 - p_1^2 = p_1^2 + \alpha^2 - 2 p_1 \alpha \cos\theta_1$$
$\theta_1$ can be obtained by rearranging as follows:
$$\theta_1 = \cos^{-1}\!\left(\frac{p_1}{\sqrt{p_1^2 + F^2}}\right)$$
$\theta_2$ can also be calculated using the same method.
By rearranging the angles $\theta_1$ and $\theta_2$, the following equations can be obtained:
$$\tan\theta_1 = \frac{d}{l}, \qquad \tan\theta_2 = \frac{d}{I - l}$$
By solving for $d$, the following equation can be obtained:
$$d = \frac{I \tan\theta_1 \tan\theta_2}{\tan\theta_1 + \tan\theta_2}$$
Using the d and θ obtained above, the AEV’s position can be calculated in the mobile station, and the mobile station can move to the AEV.
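The triangulation formula above can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' implementation; the function name is ours, and the angles are assumed to be measured from the camera baseline of length I, as in Figure 3:

```python
import math

def marker_position(theta1_deg, theta2_deg, baseline_m):
    """Perpendicular distance d to the marker and lateral offset l
    from the left camera, given the angles theta1 and theta2 measured
    from a camera baseline of length I:
        d = I * tan(t1) * tan(t2) / (tan(t1) + tan(t2))
    """
    t1 = math.tan(math.radians(theta1_deg))
    t2 = math.tan(math.radians(theta2_deg))
    d = baseline_m * t1 * t2 / (t1 + t2)  # distance from the baseline
    l = d / t1                            # offset along the baseline
    return d, l
```

For a marker centered between two cameras 1 m apart at a depth of 4 m, both baseline angles are atan(4/0.5), and the function recovers d = 4 m with the marker at the baseline midpoint.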

2.2. Precision Correction of the Mobile Station Based on Markers

As described in Section 2.1, the mobile station that moves to the AEV should be precisely aligned for docking with the AEV. To this end, the markers attached to the AEV's legs, shown in the red circles in Figure 4, are used to precisely align the position. Cams 3–6 on the mobile station in Figure 2 recognize the markers that indicate the position of each AEV leg; each camera measures the distance and angle to its leg marker, and the AEV's position within the mobile station is calculated and corrected.
Figure 5 depicts how the mobile station approaches the AEV based on the AEV's position. Figure 5a shows the mobile station moving toward the AEV's position using the front camera. Figure 5b shows the mobile station approaching the AEV while recognizing the red markers on the AEV's rear side using Cams 3 and 5. Attaching four markers to the AEV and four cameras to the mobile station allows the mobile station to see the AEV's markers continuously, without blind sections, when approaching. The mobile station approaches while recognizing the markers on the rear side of the AEV and avoiding a collision with it. Figure 5c shows the mobile station checking the markers with the cameras at all times as it enters between the legs. Figure 5d shows the mobile station docking with the AEV while checking the markers.
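The four-phase approach sequence of Figure 5 can be summarized as a small state machine. The phase names and marker-count thresholds below are illustrative assumptions, not values from the paper:

```python
from enum import Enum, auto

class Phase(Enum):
    APPROACH = auto()    # (a) move toward the AEV using the front camera
    ALIGN_REAR = auto()  # (b) track the rear-leg markers with Cams 3 and 5
    ENTER = auto()       # (c) drive in between the legs, markers always in view
    DOCK = auto()        # (d) final docking while checking all four markers

def next_phase(phase, markers_visible):
    """Advance the approach sequence when enough leg markers are visible.
    Thresholds (1, 2, 4 markers) are assumptions for illustration."""
    order = [Phase.APPROACH, Phase.ALIGN_REAR, Phase.ENTER, Phase.DOCK]
    need = {Phase.APPROACH: 1, Phase.ALIGN_REAR: 2, Phase.ENTER: 4}
    if phase is Phase.DOCK:
        return phase  # terminal phase
    if markers_visible >= need[phase]:
        return order[order.index(phase) + 1]
    return phase  # hold the current phase until enough markers are seen
```

Keeping the transition logic explicit like this makes the "no blank sections" requirement checkable: the station only advances when the expected markers are in view.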
As shown in Figure 6, precise correction is complete once the position, distance, and angle of the four markers and depth cameras are precisely aligned with each other. Moreover, if there is a misalignment, the mobile station corrects the position.
If there is a position error between the mobile station and the AEV, it means that the AEV has been rotated or translated based on the mobile station. The position errors between the mobile station and the AEV are depicted in Figure 7. The left-hand figure depicts the translation error between the mobile station and the AEV, while the right-hand figure depicts the rotation error.
For the translation error, the mobile station moves in parallel while matching the distance and angle between the four cameras and markers. For the rotation error, each of the four points’ distance and angle is measured, and a rotation equal to the calculated angle is performed, assuming that the four points are connected by a rigid body. The rotated mobile station then checks the position error again and matches the position via translation movement.
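Treating the four marker points as a rigid body, the rotation and translation corrections can be estimated in closed form. The sketch below uses the standard 2D Procrustes (Kabsch-style) solution; the function and variable names are ours, not from the paper:

```python
import math

def correction_from_markers(measured, target):
    """Estimate the rotation angle (rad) and translation that map the
    measured 2D marker positions onto their target positions, assuming
    the four markers move together as a rigid body."""
    n = len(measured)
    mx = sum(p[0] for p in measured) / n
    my = sum(p[1] for p in measured) / n
    tx = sum(q[0] for q in target) / n
    ty = sum(q[1] for q in target) / n
    # Closed-form 2D rotation: atan2 of the summed cross and dot products
    num = den = 0.0
    for (px, py), (qx, qy) in zip(measured, target):
        ax, ay = px - mx, py - my
        bx, by = qx - tx, qy - ty
        num += ax * by - ay * bx   # cross product term
        den += ax * bx + ay * by   # dot product term
    theta = math.atan2(num, den)   # rotate measured -> target
    dx, dy = tx - mx, ty - my      # then translate centroid to centroid
    return theta, (dx, dy)
```

Rotating first and then translating matches the two-step correction described above: the station rotates by the calculated angle, rechecks the position error, and finishes with a translation move.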

2.3. Docking between the AEV and Cabin Using the Marker-Based Mobile Station

The mobile station is used for docking between the cabin and the AEV. The cabin is precisely moved to the AEV's docking connection area using the mobile station. A circular marker, shown in Figure 8, is used to determine whether the cabin is properly positioned relative to the AEV's docking area. Circular markers are used because the position or angle of the camera is highly likely to be misaligned during docking, and circles are least affected by angle and position and are easy to correct for. The red, green, and blue (RGB) areas of the marker are classified by color, a circle is detected for each color, and the center point of each circle is calculated. The colors are classified to secure an accurate reference point and origin from the color positions, and to distinguish the front from the back of the marker so that the correct position and angle can be computed. The angle between two line segments is then calculated by drawing the line segment formed by the center points of the circles and the rectangle connecting the outermost points of the circles.
Figure 9 shows an angle θ made between the yellow line segment created using the center point of the circle when the docking markers are rotated and the black rectangle made using the outermost points.
As the angular error occurs as much as θ based on the marker, the cabin is rotated to be docked. This process is iterated until it converges to a small value that is within the error range, and when the error converges, docking is performed.
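The angular-error computation can be sketched as follows, assuming the circle centers have already been extracted from the RGB color segmentation (e.g., with a Hough-circle detector). The function name is illustrative; it fits a line through the centers by least squares and reports its angle relative to the horizontal edge of the bounding rectangle:

```python
import math

def docking_angle(centers):
    """Angle (deg) between the least-squares line through the circle
    centers and the horizontal rectangle edge; 0 deg means no angular
    error between the cabin and the docking marker."""
    n = len(centers)
    mx = sum(x for x, _ in centers) / n
    my = sum(y for _, y in centers) / n
    sxx = sum((x - mx) ** 2 for x, _ in centers)
    sxy = sum((x - mx) * (y - my) for x, y in centers)
    return math.degrees(math.atan2(sxy, sxx))
```

In the iterative correction loop described above, the cabin would be rotated by this angle, the marker re-detected, and the process repeated until the angle converges below the error threshold.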

3. Experiment

3.1. Experiment Environment

The experiment was conducted with the urban personal mobility vehicle, which included an AEV, a GEV, a cabin, and a mobile vertiport. The AEV, GEV, cabin, and mobile station used in the experiment are depicted in Figure 10. The cameras used for image processing were an Intel D455 (Intel, USA) for AEV marker recognition and an Intel D435i (Intel, USA) for precision-control-marker recognition. An Nvidia Jetson AGX Xavier (Nvidia, USA) was used as the image-processing board, and an Intel NUC7CJYH (Intel, USA) was used for docking-image processing. Iterative experiments were conducted to compare the image-processing error with the real error: each time the experiment was run, the error was measured and compared to determine whether it was a consistent, repeatable error.

3.2. Marker-Based AEV’s Position Information

The mobile station recognizes the marker attached to the AEV's rear side to estimate the AEV's initial position. The position of the AEV's marker is estimated using the two internal cameras in the mobile station, and the relative position between the mobile station and the AEV is calculated from the coordinates estimated by the two cameras. In this manner, the mobile station can plot a path to the AEV's position. In the recognition experiment, AEV markers 4 m, 7 m, and 11 m away from the mobile station were recognized, and the recognition results and errors were recorded.
Figure 11 shows the mobile station’s recognized results for the rear marker. This figure is the result of recognition from an RGB color image and a binarized image. There was no difference in recognition results between color and binarized images. Consequently, users can select image formats based on their needs, such as a greater representation of environmental information or computation capability.
Figure 12 depicts the recognized result of the AEV marker, which is 4 m from the camera’s center point in the mobile station. The position of the marker recognized by the left-side camera in the mobile station was 4.03 m at 5.4° from the center point of the two cameras, whereas the position of the marker recognized by the right-side camera was 4.11 m at 13.9°. This means that the marker identified by the two cameras can be calculated to be approximately 4.01 m from the center point.
Figure 13 and Figure 14 show the recognized AEV position at 7 m and 11 m distances from the mobile station, respectively. Figure 13 shows the recognized results at a distance of 7 m: 7.04 m at an angle of 6.4° from the left-side camera and 7.26 m at an angle of 16.5° from the right-side camera. Using the above results, the AEV’s position was calculated to be 6.96 m from the center point. Figure 14 shows that a distance of 11.35 m was measured at an angle of 15.3° from the left-side camera and 11.21 m was measured at an angle of 10.1° from the right-side camera, which was then calculated to be around 11.13 m from the center point. This result showed a similar distance value when compared to the actual position.
Table 2 shows the measurement results of the preceding experiment after 30 iterative measurements at each position. The errors generated by measuring a different angle while the AEV’s distance remained constant were also included. The average and maximum errors are displayed in the results. In the experiment, the AEV was set to not deviate from the camera by more than 30°. When the angle exceeds 30°, the camera barely sees the marker, making recognition impossible. Consequently, the experiment was limited to angles of less than 30°. All of the average recognition errors were less than 2%, and the maximum errors were also less than 3.2%.

3.3. Precision Correction of the Mobile Station Based on Markers

To recognize a marker, the position is estimated using images captured simultaneously. If images taken at different times were used, the mobile station might move based on outdated computed results; consequently, simultaneously captured (global-shutter) images are used in the computation. Figure 15 depicts images recognized by a total of four cameras (left, right, front, and rear) with a translation error in the marker for precise recognition. The angular error at the target point, measured from the center point of the marker recognized by each camera, was minimal, but a distance error occurred: the marker was found about 2 cm away on the left and 5 cm away on the right. Consequently, the mobile station had to move to the left.
Figure 16 shows the angular error between the AEV and the mobile station. To correct this angular error, the mobile station should be rotated based on the position of the AEV. The angular error in Figure 16's image indicates that the angle between the mobile station and the AEV was tilted to the left. Because the angular error was approximately 1.3°, the mobile station was rotated to reduce it. After completing the rotation, the mobile station checks the distance error and moves again to reduce the translation error.
Figure 17 shows the marker for precise recognition after the precision move was completed: the angular error was less than 0.1°, and all markers were within the error range (0.5 cm) at a distance of 3.4 cm. Thus, the precision move was deemed complete.
The error measured for marker recognition was the mean value over a total of 50 iterated measurements. The angle from the marker was within 1°, and the distance error was within 1 cm.

3.4. Docking between the AEV and Cabin Using the Marker-Based Mobile Station

The marker, which comprises three circles, is recognized to perform docking. Figure 18 depicts the image recognizing the center of a circle in a marker made up of circles without an angular error.
Figure 19 depicts the recognition of the marker where position and angular errors occur. Once the marker has calculated the position and angular errors, the cabin should be moved in the direction of the error to reduce it. Since an error may occur between the mobile station and the AEV when moving the cabin to reduce the error, the error should be corrected using docking, and the precision position-recognition error mentioned in Section 3.3 should be rechecked.
Docking between the cabin and AEV begins once the error is reduced using the marker shown in Figure 13. After 20 iterative experiments, the error between the image processing and real positions of the marker at the normal position was within 0.3 cm with an angular error of 0.5° or less. Consequently, this result demonstrated that docking can be accomplished using markers.

4. Conclusions

Future transportation systems capable of moving through the air are being researched to relieve the congested ground transportation system. In such a system, for large cities with dense populations, transportation will be divided into two types: movement between cities and movement within cities. AEVs are used to move quickly between cities, and by using a vertiport in a city, users can reach their destinations via GEVs within cities. This study proposes a mobile vertiport station that can be used with dual-mode flying cars. The mobile station must move to the AEV's landing position after identifying it, for various tasks such as moving landed AEVs, recharging, and cabin delivery. To accomplish this, the mobile station must localize itself with respect to the AEV's position and correct its position; it also requires omnidirectional operation for smooth translation and rotation movements, even in confined spaces. To enable this movement, the AEV's relative position is calculated by estimating its position using markers attached to the AEV and depth cameras in the mobile station. To improve position accuracy between the AEV and the mobile station, the position is precisely corrected using precision-correction markers attached to the AEV and cameras in the mobile station, eliminating blind spots and reducing the possibility of collision with the AEV when the mobile station moves. The system also includes docking for cabin delivery, an important function of dual-mode flying cars. Cabin delivery between the AEV and GEV is accomplished using an algorithm that calculates and corrects position and angular errors between the mobile station and cabin from docking markers attached to the AEV. Over 50 repeated experiments in which the mobile station moved to the AEV, the average error was kept within the error range of 1° and 1 cm, and accurate position control was achieved.
When the cabin was repeatedly docked to the AEV 20 times, the docking, marker-recognition, and position-correction errors were within 0.5° and 0.3 cm, respectively. Consequently, both results were within the previously defined error ranges, demonstrating that the algorithm can serve as a system for precise position correction and cabin delivery for the dual-mode flying car's mobile vertiport. In the future, we plan to conduct a study under the real external environments and weather conditions in which the AEV can fly, using camera filters and correction algorithms to extend the range of recognizable outdoor environments. In addition, we will conduct research on effective system management by linking the mobile station algorithm with the control systems.

Author Contributions

Conceptualization, K.L. and H.B.; methodology, K.L.; software, H.B. and J.L.; validation, H.B., J.L., and K.L.; investigation, K.L.; writing—original draft preparation, H.B., J.L., and K.L.; writing—review and editing, H.B. and K.L.; supervision, K.L.; project administration, K.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by Korea Electrotechnology Research Institute (KERI) Primary research program through the National Research Council of Science & Technology (NST) funded by the Ministry of Science and ICT (MSIT) (No. 21A01067). This research was also supported by Unmanned Vehicles Core Technology Research and Development Program through the National Research Foundation of Korea (NRF) and Unmanned Vehicle Advanced Research Center (UVARC) funded by the Ministry of Science and ICT, the Republic of Korea (NRF-2020M3C1C1A01086539).

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

1. Geissinger, A.; Laurell, C.; Sandström, C. Digital Disruption beyond Uber and Airbnb—Tracking the long tail of the sharing economy. Technol. Forecast. Soc. Chang. 2020, 155, 119323.
2. Straubinger, A.; Rothfeld, R.; Shamiyeh, M.; Büchter, K.-D.; Kaiser, J.; Plötner, K.O. An overview of current research and developments in urban air mobility—Setting the scene for UAM introduction. J. Air Transp. Manag. 2020, 87, 101852.
3. Thipphavong, D.P.; Apaza, R.; Barmore, B.; Battiste, V.; Burian, B.; Dao, Q.; Verma, S.A. Urban air mobility airspace integration concepts and considerations. In Proceedings of the 2018 Aviation Technology, Integration, and Operations Conference, Atlanta, GA, USA, 25 June 2018; p. 3676.
4. Silva, C.; Johnson, W.R.; Solis, E.; Patterson, M.D.; Antcliff, K.R. VTOL urban air mobility concept vehicles for technology development. In Proceedings of the 2018 Aviation Technology, Integration, and Operations Conference, Atlanta, GA, USA, 25 June 2018; p. 3847.
5. Fu, M.; Rothfeld, R.; Antoniou, C. Exploring preferences for transportation modes in an urban air mobility environment: Munich case study. Transp. Res. Rec. 2019, 2673, 427–442.
6. Vascik, P.D.; Hansman, R.J. Scaling constraints for urban air mobility operations: Air traffic control, ground infrastructure, and noise. In Proceedings of the 2018 Aviation Technology, Integration, and Operations Conference, Atlanta, GA, USA, 25 June 2018; p. 3849.
7. Brown, A.; Harris, W.L. Vehicle design and optimization model for urban air mobility. J. Aircr. 2020, 57, 1003–1013.
8. Balac, M.; Rothfeld, R.L.; Hörl, S. The prospects of on-demand urban air mobility in Zurich, Switzerland. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019.
9. Shamiyeh, M.; Rothfeld, R.; Hornung, M. A performance benchmark of recent personal air vehicle concepts for urban air mobility. In Proceedings of the 31st Congress of the International Council of the Aeronautical Sciences, Belo Horizonte, Brazil, 9–14 September 2018.
10. Moore, M. 21st century personal air vehicle research. In Proceedings of the AIAA International Air and Space Symposium and Exposition, The Next 100 Years, Dayton, OH, USA, 14–17 July 2003; p. 2646.
11. Lewe, J.H.; Ahn, B.; DeLaurentis, D.; Mavris, D.; Schrage, D. An integrated decision-making method to identify design requirements through agent-based simulation for personal air vehicle system. In AIAA's Aircraft Technology, Integration, and Operations (ATIO) 2002 Technical Forum; The Ohio State University: Columbus, OH, USA, 2002; p. 5876.
12. Lee, J.-H.; Cho, G.-H.; Lee, J.-W. Development and prospect of personal air vehicle as next generation transportation. J. Korean Soc. Aeronaut. Space Sci. 2006, 34, 101–108.
13. Kohout, L.; Schmitz, P. Fuel cell propulsion systems for an all-electric personal air vehicle. In Proceedings of the AIAA International Air and Space Symposium and Exposition, The Next 100 Years, Dayton, OH, USA, 14–17 July 2003; p. 2867.
14. Cha, J.; Yun, J.; Hwang, H.-Y. Initial sizing of a roadable personal air vehicle using design of experiments for various engine types. Aircr. Eng. Aerosp. Technol. 2021, 93, 1–14.
15. Yi, T.H.; Kim, K.T.; Ahn, S.M.; Lee, D.S. Technical development trend and analysis of futuristic personal air vehicle. Curr. Ind. Technol. Trends Aerosp. 2011, 9, 64–76.
16. Ahn, B.; DeLaurentis, D.; Mavris, D. Advanced personal air vehicle concept development using powered rotor and autogyro configurations. In AIAA's Aircraft Technology, Integration, and Operations (ATIO) 2002 Technical Forum; The Ohio State University: Columbus, OH, USA, 2002; p. 5878.
17. Rice, S.; Winter, S.R.; Crouse, S.; Ruskin, K.J. Vertiport and air taxi features valued by consumers in the United States and India. Case Stud. Transp. Policy 2022, 10, 500–506.
18. Shao, Q.; Shao, M.; Lu, Y. Terminal area control rules and eVTOL adaptive scheduling model for multi-vertiport system in urban air mobility. Transp. Res. Part C Emerg. Technol. 2021, 132, 103385.
19. Daskilewicz, M.; German, B.; Warren, M.; Garrow, L.A.; Boddupalli, S.-S.; Douthat, T.H. Progress in vertiport placement and estimating aircraft range requirements for eVTOL daily commuting. In Proceedings of the 2018 Aviation Technology, Integration, and Operations Conference, Atlanta, GA, USA, 25 June 2018; p. 2884.
20. Lim, E.; Hwang, H. The selection of vertiport location for on-demand mobility and its application to Seoul metro area. Int. J. Aeronaut. Space Sci. 2019, 20, 260–272.
Figure 1. Aerial electric vehicle’s (AEV) marker and position.
Figure 1. Aerial electric vehicle’s (AEV) marker and position.
Electronics 11 01837 g001
Figure 2. Mobile station and cameras. (a) The structure of the proposed mobile station; (b) the location of the camera attached to and withdrawn from the mobile station.
Figure 3. Distance and angle between the marker and the camera’s center point.
Figure 4. AEV markers for precise position correction of the mobile station.
Figure 5. How the mobile station moves for precise position correction. (a) The mobile station starts docking with the AEV; (b) the mobile station recognizes the rear legs of the AEV; (c) the mobile station enters between the legs of the AEV; (d) the cameras on the mobile station recognize all of the AEV’s legs.
Figure 6. Camera positions of the mobile station and the AEV marker positions. (a) The positions of the marker and camera viewed from the front; (b) the positions of the marker and camera viewed from the inside.
Figure 7. Position error between the mobile station and the AEV. (a) The mobile station docked to the AEV in the correct position; (b) the mobile station docked to the AEV with a position error.
Figure 8. Markers for the docking system. (a) Circular markers for cabin docking; (b) lines connecting the center points of the circular markers.
Figure 9. The angular error between the center points of the circles and the outermost rectangle.
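The angular error illustrated in Figure 9 can be computed from the detected circle centers: the slope of the line joining two marker centers, relative to the horizontal axis of the outermost rectangle, gives the docking rotation error. The following is a minimal illustrative sketch; the function name and pixel coordinates are assumptions, not the authors’ implementation.

```python
import math

def angular_error_deg(center_a, center_b):
    """Angle (degrees) between the line joining two marker center points
    and the horizontal axis of the image/outer rectangle."""
    dx = center_b[0] - center_a[0]
    dy = center_b[1] - center_a[1]
    return math.degrees(math.atan2(dy, dx))

# Assumed pixel coordinates of two detected circle centers:
err = angular_error_deg((100.0, 200.0), (300.0, 205.0))  # slight tilt of ~1.4 degrees
```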
Figure 10. AEV, GEV, cabin, and mobile station used in the experiment.
Figure 11. AEV’s marker recognized from the mobile station. (a) RGB image; (b) grayscale image.
Figure 12. AEV’s marker recognized at a distance of 4 m from the mobile station. (a) Left camera image; (b) right camera image.
Figure 13. AEV’s marker recognized at a distance of 7 m from the mobile station. (a) Left camera image; (b) right camera image.
Figure 14. AEV’s marker recognized at a distance of 11 m from the mobile station. (a) Left camera image; (b) right camera image.
Figure 15. Precision-control marker where a translation error occurs. (a) Image from the left front camera (angular error: 0.02°; distance: 2 cm); (b) image from the left rear camera (angular error: 0.01°; distance: 1.94 cm); (c) image from the right front camera (angular error: 0.02°; distance: 5 cm); (d) image from the right rear camera (angular error: 0.03°; distance: 5.06 cm).
Figure 16. Precision-control marker where a rotation error occurs. (a) Image from the left front camera (angular error: 1.24°; distance: 1.3 cm); (b) image from the left rear camera (angular error: 1.29°; distance: 3.7 cm); (c) image from the right front camera (angular error: 1.48°; distance: 3.1 cm); (d) image from the right rear camera (angular error: 1.44°; distance: 1.21 cm).
Figure 17. Camera images of the mobile station after completing the precision move (angular error: 0.03°; distance: 3.4 cm).
Figure 18. Marker recognition for docking.
Figure 19. Marker recognition of position and angular errors for docking. (a) Position error during docking with the marker; (b) angular error during docking with the marker.
Table 1. Definition of parameters in the stereo algorithm.

Parameter   Description
θ           Angle between the camera and the target object
d           Distance between the camera and the object
l           Distance between the cameras
P           Pixel distance on the camera image
α           Hypotenuse of the focal-length triangle
F           Camera’s focal length
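The parameters in Table 1 correspond to standard stereo triangulation: the distance d follows from the baseline l, the focal length F, and the pixel disparity P between the marker’s position in the two camera images. The sketch below illustrates this relation under assumed values; the function names and numbers are not taken from the paper.

```python
import math

def stereo_distance(focal_px: float, baseline_m: float,
                    x_left_px: float, x_right_px: float) -> float:
    """Estimate distance d from stereo disparity: d = F * l / P,
    with F in pixels, baseline l in meters, and disparity P in pixels."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("Disparity must be positive for a target in front of both cameras")
    return focal_px * baseline_m / disparity

def bearing_angle_deg(focal_px: float, x_px: float, cx_px: float) -> float:
    """Angle theta (degrees) between the optical axis and the target,
    from the target's horizontal pixel offset relative to the principal point."""
    return math.degrees(math.atan2(x_px - cx_px, focal_px))

# Assumed values: F = 800 px, baseline l = 0.5 m, marker observed at
# x = 420 px (left image) and x = 380 px (right image), principal point cx = 400 px.
d = stereo_distance(800.0, 0.5, 420.0, 380.0)       # 800 * 0.5 / 40 = 10.0 m
theta = bearing_angle_deg(800.0, 420.0, 400.0)      # small off-axis angle
```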
Table 2. Results of the AEV’s distance estimation from the mobile station.

Distance (m)   Avg. Error (m)   Max Error (m)
2.5            0.05             0.08
4              0.07             0.08
7              0.11             0.13
11             0.16             0.19
Bae, H.; Lee, J.; Lee, K. Marker-Based 3D Position-Prediction Algorithm of Mobile Vertiport for Cabin-Delivery Mechanism of Dual-Mode Flying Car. Electronics 2022, 11, 1837. https://doi.org/10.3390/electronics11121837