Figure 1.
The system flowchart.
Figure 2.
Facial and gender recognition flowchart.
Figure 3.
Example of depth map vs. standard camera image.
Figure 4.
(a) Camera image, (b) depth map, (c) depth map threshold of 1.5 m.
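The 1.5 m cutoff in Figure 4c amounts to binarizing the depth map. A minimal sketch of that step, assuming depths in millimetres with 0 marking an invalid reading (the function name and the zero-is-invalid convention are illustrative assumptions, not the paper's code):

```python
import numpy as np

def threshold_depth(depth_mm, limit_mm=1500):
    """Binarize a depth map: 1 where an object lies within the limit.

    depth_mm: 2-D array of depths in millimetres (0 = no reading).
    Returns a uint8 obstacle mask, as in Figure 4c's 1.5 m threshold.
    """
    depth = np.asarray(depth_mm)
    # Treat zero (invalid) readings as "no obstacle".
    return ((depth > 0) & (depth <= limit_mm)).astype(np.uint8)

# Tiny synthetic depth map (mm): one near object, one far, one invalid pixel.
demo = np.array([[800, 2500],
                 [0,   1200]])
mask = threshold_depth(demo)  # near pixels -> 1, far/invalid -> 0
```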
Figure 5.
Correa’s eight obstacle scenarios. (a) No obstacle; (b) One obstacle at the far right; (c) Two obstacles at the right, separated by a gap; (d) One obstacle at the far left; (e) Two obstacles at the left, separated by a gap; (f) An obstacle in the middle; (g) Two obstacles at the far right and far left; (h) Three obstacles separated by two gaps.
Figure 6.
Threshold depth maps and detected gaps.
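Gap detection of the kind shown in Figure 6 can be sketched as a scan over one row of the thresholded map for runs of free columns; this is an illustrative reconstruction, not the paper's implementation (the padding trick and the `min_width` parameter are assumptions):

```python
import numpy as np

def find_gaps(obstacle_mask_row, min_width):
    """Find free-space gaps in one row of a thresholded depth map
    (cf. Figure 6). obstacle_mask_row: 1 = obstacle, 0 = free.
    Returns (start, end) column-index pairs of gaps at least
    min_width columns wide; end is exclusive.
    """
    row = np.asarray(obstacle_mask_row)
    # Pad with virtual obstacles so gaps touching the edges are closed off.
    padded = np.concatenate(([1], row, [1]))
    # Indices where the mask flips: even entries start a gap, odd ones end it.
    changes = np.flatnonzero(np.diff(padded))
    gaps = []
    for start, end in zip(changes[::2], changes[1::2]):
        if end - start >= min_width:
            gaps.append((int(start), int(end)))
    return gaps
```

For example, `find_gaps([0, 0, 1, 0], 2)` keeps only the two-column gap on the left and discards the single free column on the right.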
Figure 7.
The flowchart for the robot’s obstacle avoidance.
Figure 8.
The ANFIS-based fuzzy system for two rules.
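The two-rule structure depicted in Figure 8 follows the standard five ANFIS layers of a first-order Sugeno system. A minimal sketch under assumed membership and consequent parameters (all the numeric constants below are illustrative, not trained values from the paper):

```python
import math

def gaussmf(x, c, sigma):
    """Gaussian membership function with centre c and width sigma."""
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

def sugeno_two_rules(x, y):
    """First-order Sugeno inference with two rules (ANFIS layer structure).

    Rule 1: if x is A1 and y is B1 then f1 = p1*x + q1*y + r1
    Rule 2: if x is A2 and y is B2 then f2 = p2*x + q2*y + r2
    """
    # Layer 1: membership grades (assumed Gaussian parameters).
    a1, b1 = gaussmf(x, 0.0, 1.0), gaussmf(y, 0.0, 1.0)
    a2, b2 = gaussmf(x, 1.0, 1.0), gaussmf(y, 1.0, 1.0)
    # Layer 2: firing strengths via the product T-norm.
    w1, w2 = a1 * b1, a2 * b2
    # Layer 3: normalized firing strengths.
    w1n, w2n = w1 / (w1 + w2), w2 / (w1 + w2)
    # Layer 4: linear rule consequents (assumed p, q, r coefficients).
    f1 = 0.5 * x + 0.5 * y + 0.0
    f2 = 1.0 * x + 1.0 * y + 1.0
    # Layer 5: weighted-sum output.
    return w1n * f1 + w2n * f2
```

In ANFIS training, the membership parameters and the consequent coefficients are the quantities adapted from data; the layer computation itself stays fixed.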
Figure 9.
An example of fuzzy-rules-based decision.
Figure 10.
Placement of sonar sensors near our robot’s base.
Figure 11.
An example of using fuzzy rules to combine sensor inputs.
Figure 12.
The path our robot takes when using sonars alone vs. combining sensors.
Figure 13.
The experimental robot.
Figure 14.
Examples of facial recognition and gender classification.
Figure 15.
Static obstacles for the robot.
Figure 16.
The path the robot took, with positions recorded at 10 s intervals.
Figure 17.
The seventeen scenarios (T1–T17) in which the robot’s decisions were recorded.
Figure 18.
Fuzzy inputs and decisions for combining sensor maps at T1–T17.
Figure 19.
The paths of the moving obstacles (blue, orange) and the robot (red) for the 11 scenarios.
Table 1.
Comparison of Gender Classification Accuracy Using Different Features.
| Feature(s) | Classification Accuracy | Feature Points |
|---|---|---|
| LBP | 91.8862% | 2891 |
| HOG | 90.0464% | 1548 |
| LBP + HOG + p-value | 92.6829% | 1365 |
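To make the LBP feature in Table 1 concrete, here is a simplified 8-neighbour LBP and its 256-bin histogram (the classifier's feature vector). This sketch omits the uniform-pattern handling and the p-value feature selection used in the paper, and the function names are illustrative:

```python
import numpy as np

def lbp_3x3(img):
    """Basic 8-neighbour LBP codes for the interior pixels of a
    grayscale image (a simplified version of the LBP feature in Table 1)."""
    img = np.asarray(img, dtype=np.int32)
    h, w = img.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # Clockwise neighbour offsets starting at the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    centre = img[1:h - 1, 1:w - 1]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        # Set the bit when the neighbour is at least as bright as the centre.
        codes |= (neighbour >= centre).astype(np.uint8) << bit
    return codes

def lbp_histogram(img):
    """256-bin LBP histogram: the feature vector fed to the classifier."""
    return np.bincount(lbp_3x3(img).ravel(), minlength=256)
```

The p-value screening reported in the table then keeps only the histogram bins that discriminate significantly between the two gender classes, which is why the combined feature vector (1365 points) is shorter than either raw descriptor.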
Table 2.
Fuzzy rules for using depth map alone.
| # | Rule |
|---|---|
| 1 | When the gap width is less than the robot width, and the absolute depth difference between neighbors is less than 400, then output LOW. |
| 2 | When the gap width is less than the robot width, and the absolute depth difference between neighbors is between 400 and 700, then output LOW. |
| 3 | When the gap width is less than the robot width, and the absolute depth difference between neighbors is greater than 700, then output LOW. |
| 4 | When the gap width is around the robot width, and the absolute depth difference between neighbors is less than 400, then output LOW. |
| 5 | When the gap width is around the robot width, and the absolute depth difference between neighbors is between 400 and 700, then output LOW. |
| 6 | When the gap width is around the robot width, and the absolute depth difference between neighbors is greater than 700, then output HIGH. |
| 7 | When the gap width is greater than the robot width, and the absolute depth difference between neighbors is less than 400, then output HIGH. |
| 8 | When the gap width is greater than the robot width, and the absolute depth difference between neighbors is between 400 and 700, then output HIGH. |
| 9 | When the gap width is greater than the robot width, and the absolute depth difference between neighbors is greater than 700, then output HIGH. |
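Read crisply, the nine rules above collapse to a small decision: narrow gaps are rejected, wide gaps accepted, and borderline gaps accepted only when a large depth difference confirms a real opening. A sketch with hard boundaries standing in for the paper's fuzzy membership grades (the 10% band for "around robot width" is an assumed illustrative margin):

```python
def depth_map_decision(gap_width, robot_width, depth_diff):
    """Crisp reading of Table 2's nine rules.

    gap_width, robot_width: same units; depth_diff: absolute depth
    difference between the gap's neighboring obstacles.
    Returns 'HIGH' (gap passable) or 'LOW'.
    """
    # Bucket the gap width relative to the robot width; the 10% band
    # for "around robot width" is an assumed margin, not from the paper.
    if gap_width < 0.9 * robot_width:
        gap = 'less'
    elif gap_width <= 1.1 * robot_width:
        gap = 'around'
    else:
        gap = 'greater'

    if gap == 'less':
        return 'LOW'                                  # rules 1-3
    if gap == 'around':
        return 'HIGH' if depth_diff > 700 else 'LOW'  # rules 4-6
    return 'HIGH'                                     # rules 7-9
```

In the actual system the three gap categories and the three depth-difference bands overlap as fuzzy sets, so the output is a graded confidence rather than a binary label.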
Table 3.
Fuzzy rules for combining sensors.
| # | Rule |
|---|---|
| 1 | If the gap within the depth map is greater than the robot width, and the left sonar input is high, and the right sonar input is high, then Wd = 1.0, Ws = 0.0. |
| 2 | If the gap within the depth map is greater than the robot width, and the left sonar input is high, and the right sonar input is low, then Wd = Ws = 0.5. |
| 3 | If the gap within the depth map is greater than the robot width, and the left sonar input is low, and the right sonar input is high, then Wd = Ws = 0.5. |
| 4 | If the gap within the depth map is greater than the robot width, and the left sonar input is low, and the right sonar input is low, then Wd = Ws = 0.5. |
| 5 | If the gap within the depth map is less than the robot width, and the left sonar input is low, and the right sonar input is low, then Wd = 0.0, Ws = 0.5. |
| 6 | If the gap within the depth map is less than the robot width, and the left sonar input is high, and the right sonar input is low, then Wd = Ws = 0.5. |
| 7 | If the gap within the depth map is less than the robot width, and the left sonar input is low, and the right sonar input is high, then Wd = Ws = 0.5. |
| 8 | If the gap within the depth map is less than the robot width, and the left sonar input is high, and the right sonar input is high, then Wd = Ws = 0.5. |
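The eight rules cover every combination of gap verdict and sonar levels, so a crisp reading reduces to a small lookup; a sketch (the paper evaluates these fuzzily, and the left-low/right-high case when the gap is too narrow is filled in by symmetry with the mirrored rule):

```python
def sensor_weights(gap_gt_robot, left_high, right_high):
    """Depth-map weight Wd and sonar weight Ws, following Table 3.

    gap_gt_robot: True when the depth-map gap exceeds the robot width.
    left_high / right_high: booleans for the two sonar input levels.
    Returns (Wd, Ws).
    """
    if gap_gt_robot:
        if left_high and right_high:
            return 1.0, 0.0     # rule 1: trust the depth map fully
        return 0.5, 0.5         # remaining wide-gap rules: split evenly
    # Gap narrower than the robot:
    if not left_high and not right_high:
        return 0.0, 0.5         # depth map ignored, sonars half-weighted
    return 0.5, 0.5             # otherwise split evenly
```

The only asymmetric cases are full trust in the depth map when both sonars agree the way is clear, and discarding the depth map when it reports no passable gap and both sonars read low.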
Table 4.
Sensor Inputs and Decisions for Scenarios T1–T17.
| Scenario | Left Obstacle Depth Value | Right Obstacle Depth Value | Gap Normalized to Robot Width | Left Sonar Normalized | Right Sonar Normalized | Final Decision |
|---|---|---|---|---|---|---|
| T1 | N/A | 317 | 1.41 | 1 | 1 | Forward |
| T2 | 97.6 | 10,000 | 1.67 | 0.2 | 1 | Right Turn |
| T3 | N/A | 106.3 | 1.20 | 0.8 | 0.44 | Left Turn |
| T4 | N/A | N/A | 0.0 | 0.22 | 0.22 | Forward |
| T5 | 128.9 | N/A | 0.0 | 0.2 | 0.62 | Right Turn |
| T6 | 122.1 | 125.6 | 1.64 | 0.4 | 0.4 | Forward |
| T7 | 139.9 | 176.3 | 1.56 | 0.58 | 0.66 | Forward |
| T8 | 272.2 | N/A | 2.24 | 0.4 | 0.36 | Forward |
| T9 | N/A | N/A | 0.0 | 0.3 | 0.22 | Forward |
| T10 | N/A | 188.8 | 1.67 | 1 | 0.5 | Forward |
| T11 | 163.0 | 160.3 | 1.05 | 0.5 | 0.38 | Forward |
| T12 | 100.1 | N/A | 1.33 | 0.08 | 0.13 | Right Turn |
| T13 | 193.1 | 1000.0 | 0.84 | 0.4 | 1 | Right Turn |
| T14 | 163.5 | 196.4 | 0.83 | 0.54 | 0.4 | Forward |
| T15 | N/A | N/A | 0.0 | 0.04 | 0.22 | Forward |
| T16 | N/A | N/A | 0.0 | 0.22 | 0.22 | Forward |
| T17 | 115.9 | N/A | 1.13 | 0.22 | 0.42 | Right Turn |
Table 5.
Comparison between Correa’s, Csaba’s, and our proposed method on the T1–T17 scenarios.
| Scenario | Correa | Csaba | Our Method |
|---|---|---|---|
| T1 | Fail! | Success | Success |
| T2 | Fail! | Fail! | Success |
| T3 | Fail! | Success | Success |
| T4 | Success | Success | Success |
| T5 | Fail! | Fail! | Success |
| T6 | Fail! | Fail! | Success |
| T7 | Success | Success | Success |
| T8 | Success | Success | Success |
| T9 | Success | Success | Success |
| T10 | Fail! | Success | Success |
| T11 | Success | Success | Success |
| T12 | Fail! | Success | Success |
| T13 | Fail! | Fail! | Success |
| T14 | Fail! | Success | Success |
| T15 | Fail! | Success | Success |
| T16 | Success | Success | Success |
Table 6.
Eleven scenarios for dynamic moving obstacles.
| Scenario # | Scenario |
|---|---|
| 1 | Single Obstacle Moving Towards the Robot |
| 2 | Single Obstacle Moving Fast from the Right of the Robot |
| 3 | Single Obstacle Moving Slowly from the Right of the Robot |
| 4 | Single Obstacle Moving Fast from the Left of the Robot |
| 5 | Single Obstacle Moving Slowly from the Left of the Robot |
| 6 | Dual Obstacles Moving Towards the Robot, Then Separating |
| 7 | Dual Obstacles Moving from the Left and Right of the Robot, Then Crisscrossing |
| 8 | Dual Obstacles Moving from the Left of the Robot at the Same Speed |
| 9 | Dual Obstacles Moving from the Left of the Robot at Different Speeds |
| 10 | Dual Obstacles Moving from the Right of the Robot at the Same Speed |
| 11 | Dual Obstacles Moving from the Right of the Robot at Different Speeds |