INTERNATIONAL JOURNAL OF COMPUTERS COMMUNICATIONS & CONTROL
ISSN 1841-9836, 12(6), 813-823, December 2017.

A Rapid Recognition of Impassable Terrain for Mobile Robots with Low Cost Range Finder Based on Hypotheses Testing Theory

Y. Gao, X.Y. Wu, Y. Liu, J. M. Li, J. H. Liu

Yang Gao*
School of Automobile, Chang'an University, Xi'an, Shaanxi Province, 710061, China
*Corresponding author: nchygy@chd.edu.cn

Xueyi Wu, Yu Liu, Jian Ming Li, Jia Hao Liu
School of Automobile, Chang'an University, Xi'an, Shaanxi Province, 710061, China

Abstract: The recognition of impassable terrain plays a vital role in the motion planning of a mobile robot, yet it generally relies on expensive sensors such as stereo vision cameras. This paper proposes a rapid impassable-terrain recognition algorithm based on hypothesis testing theory, using low-cost range finders with different diffusion angles. A slope estimation model using two range finders mounted at different heights is first established, in which the influence of the diffusion angle of each range sensor is taken into account. To cope with the inaccurate measurements of low-cost range finders, hypothesis testing theory is then applied to judge whether impassable terrain is being approached: the historical slope estimates are treated as a sample set for the same slope, and the judgement is made on this sample set rather than on the current estimate alone. The robot therefore only has to count the number of slope estimates that classify the terrain as impassable, and the judgement is confirmed only when that count exceeds a carefully designed threshold. In this way a stable recognition of impassable terrain is obtained while the risk of a wrong judgement remains bounded. Experimental results indicate that the algorithm provides reliable recognition of impassable terrain with minimal computation, using low-cost range finders with different diffusion angles.
Keywords: Impassable terrain, range finder, slope, hypothesis testing.

1 Introduction

Recognition of impassable terrain, which generally involves rapid vertical variations, is a critical and fundamental issue for mobile robots. Over the past few decades, techniques and theories for terrain recognition by intelligent vehicles and robots have achieved many successes. [8] presented a three-dimensional (3-D) vision technique for incrementally building an accurate 3-D representation of rugged terrain using multiple sensors; it achieves the accuracy required for self-localization but is too complex for rapid recognition of impassable terrain. [17] proposed a method for retrieving 3-D surface models of the earth using optical and radar imaging sensors; however, it requires 3-D sensors and cannot easily meet the efficiency requirement. [2] proposed a method for building a 3-D world model for a mobile robot from sensory data derived from outdoor camera images. [21] proposed a terrain recognition method based on 3-D reconstruction, which requires 3-D sensors and shows poor efficiency. [22] presented an overview of stereo vision and stereo image processing, as well as a method of obstacle avoidance and navigation based on stereo vision for a mobile robot.
[4] proposed a novel navigation framework for autonomous vehicles in which a stereo camera is used for both slope estimation and obstacle detection. [12] proposed an approach for obstacle detection in a greenhouse environment, where the depth data provided by the Kinect 3-D sensor are used for slope estimation. [16] proposed a novel method for analyzing a mobile robot's ability to traverse rough terrain, in which the roughness and the slope are calculated from an elevation map built by a laser range finder. [9] proposed a hierarchical, decentralized and networked control system for mobile robots in which the slope of the terrain is detected using a laser range finder; this approach shows good robustness in various environments but requires considerable computational resources. [15] proposed a terrain recognizer in which a database of five terrain classes is generated and depth images are enhanced using confidence-map-based filtering; it classifies terrain into five types but requires expensive depth sensing and considerable computation. [10] presented a survey of the distribution of slope angles in the Alishan area using airborne Lidar and an aerial digital camera. [20] proposed a novel approach for obstacle detection using optical flow without recovering range information; it is efficient but not reliable, precisely because no range information is recovered. [18] proposed a collision-free area detection algorithm for an autonomous mobile robot using a depth camera; however, depth cameras are expensive, demand considerable computational capacity, and their accuracy gradually decreases with distance. [7] proposed a multisensory data-fusion system for driving an autonomous earthwork vehicle operating in a sanitary landfill, using ultrasonic sensors, a laser range finder, and several charge-coupled device cameras. [13] proposed a method for recognizing uneven terrain using a 3-D laser scanner. [23] proposed a novel but complex method for real-time obstacle detection and recognition, in which several key obstacle characteristics are identified from image and geometric information using a support vector machine (SVM). [14] proposed a novel algorithm for qualitatively categorizing the terrain type in real time using high-fidelity sensors. [11] proposed an application of an image registration method for a mobile manipulator, where the range image may be interpreted for object recognition using ultrasonic range finders. [5] proposed a real-time terrain recognition method for a mobile robot moving on uneven terrain that captures a terrain depth image with an RGB-D sensor; its performance may degrade as the environment changes. [1] proposed a probabilistic framework in which visual information and mechanical supervision interact to learn terrain types.
It may be concluded that most existing research relies on two kinds of sensors: 2-D sensors such as vision sensors, range finders, laser range finders and ultrasonic range finders; and 3-D sensors such as 3-D vision sensors and 3-D laser range finders. 3-D sensors are generally expensive but provide abundant information for the task. Among the 2-D sensors, the laser range finder provides precise angle measurement but is much more expensive.
The ultrasonic range finder, on the other hand, is widely used because of its lower cost. It measures distance along the propagation direction of the ultrasonic wave, which spreads over a diffusion angle, and therefore provides somewhat less accurate measurements of both distance and angle. So, for these lower-cost 2-D range finders with different diffusion angles, such as ultrasonic and infrared range finders, it is important to develop an algorithm for reliable recognition of impassable terrain, providing good conditions for path planning [19], [6], [3].

2 Mathematical model of recognition

In this paper, descending terrain is not discussed because of the limited sensing ability. We define impassable terrain as terrain with a slope angle larger than a threshold value θ_T, which is set according to the travel capability of the robot. Impassable terrain may therefore be recognized from the slope estimation result.

2.1 Slope estimation based on range finders with various diffusion angles

As the slope estimation algorithm will be presented in detail in another paper, only a brief introduction is given here. A robot coordinate system is attached to the mobile robot as shown in Figure 1: the X axis points along the orientation of the robot, the origin is placed on the floor under the wheels, and the Y axis points along the height of the robot. Assume two range finders are mounted on the robot at different heights, as shown in Figure 1, at positions O_A(X_A, Y_A) and O_B(X_B, Y_B) respectively, with vertical separation O_AO_B = h. Let the measurement of sensor A be D_A (mm) with diffusion angle Φ_A, and the measurement of sensor B be D_B (mm) with diffusion angle Φ_B. Sensor A, for instance, detects a sector-shaped spreading beam as shown in Figure 1. Let points a and b denote the two boundary contact points between the spreading beam and the obstacle, where a is the lower one and b the upper one. For an obstacle with slope θ, the beams of the two sensors are shown in Figure 2. For sensor A, the horizontal and vertical distances between the sensor and contact point a, denoted D_{XA} (mm) and D_{YA} (mm), are expressed by formula (1):

D_{XA} = |D_A| \cos\left(\frac{\Phi_A}{2}\right), \qquad D_{YA} = |D_A| \sin\left(\frac{\Phi_A}{2}\right)    (1)

Similarly, sensor B forms two boundary contact points c and d, and the horizontal and vertical distances between sensor B and the lower contact point c are expressed by formula (2):

D_{XB} = |D_B| \cos\left(\frac{\Phi_B}{2}\right), \qquad D_{YB} = |D_B| \sin\left(\frac{\Phi_B}{2}\right)    (2)

Let Δx and Δy denote the differences between the two boundary contact points a and c along the X axis and Y axis respectively. Formula (3) then follows, where X_a, X_c, Y_a, Y_c are the X and Y coordinates of the contact points a and c:

\Delta x = X_a - X_c = X_A + D_{XA} - X_B - D_{XB}, \qquad \Delta y = Y_a - Y_c = Y_A - D_{YA} - Y_B + D_{YB}    (3)

Accordingly, the slope θ of the obstacle is obtained from formula (4):

\theta = \arctan\left(\frac{\Delta y}{\Delta x}\right)    (4)

Figure 1: One sensor diffusion angle
Figure 2: Two sensors diffusion angles
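To make the geometry of formulas (1) to (4) concrete, the following minimal Python sketch (our own illustration, not the authors' implementation; the function name, the example readings and the absolute sensor heights are assumptions, only the 60 mm height difference follows the later experimental setup) computes the slope estimate from two simultaneous range readings.

```python
import math

def estimate_slope_deg(d_a, d_b, phi_a, phi_b, x_a, y_a, x_b, y_b):
    """Slope estimate from two range readings, following formulas (1)-(4).

    d_a, d_b     : readings of sensors A and B (mm)
    phi_a, phi_b : diffusion angles of sensors A and B (rad)
    (x_a, y_a), (x_b, y_b) : sensor positions in the robot frame (mm)
    """
    # Formula (1): offsets from sensor A to its lower boundary contact point a
    dx_a = abs(d_a) * math.cos(phi_a / 2.0)
    dy_a = abs(d_a) * math.sin(phi_a / 2.0)
    # Formula (2): offsets from sensor B to its lower boundary contact point c
    dx_b = abs(d_b) * math.cos(phi_b / 2.0)
    dy_b = abs(d_b) * math.sin(phi_b / 2.0)
    # Formula (3): coordinate differences between contact points a and c
    delta_x = (x_a + dx_a) - (x_b + dx_b)
    delta_y = (y_a - dy_a) - (y_b - dy_b)
    # Formula (4): slope angle of the surface through the two contact points
    return math.degrees(math.atan(delta_y / delta_x))

# Assumed setup: sensor A at 300 mm, sensor B at 360 mm (60 mm apart),
# diffusion angles of 4.6 deg and ~0 deg; the readings below are illustrative.
theta = estimate_slope_deg(d_a=2030.0, d_b=2694.0,
                           phi_a=math.radians(4.6), phi_b=0.0,
                           x_a=0.0, y_a=300.0, x_b=0.0, y_b=360.0)
print(round(theta, 1))  # about 12.0 degrees for these illustrative readings
```

Note that the diffusion angle enters only through the Φ/2 terms of formulas (1) and (2); a sensor with a negligible diffusion angle, such as the laser range finder used later, simply has Φ ≈ 0.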
2.2 Impassable terrain recognition based on hypothesis testing theory using unreliable slope estimation

Sensors generally provide only inaccurate measurements, which in turn yield inaccurate slope estimates. To address this problem, the inaccuracy is described by Gaussian noise, whose distribution is given by formula (5), where µ and σ denote the mean and the standard deviation respectively. In this paper, µ is taken to be the slope estimate obtained by the algorithm described above, implying that the estimate most likely reflects the actual slope, while σ describes the spread of the possible actual slope around it.

N(\mu, \sigma) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{(X-\mu)^2}{2\sigma^2}}    (5)

P_{hit}(r, r^*) = \begin{cases} \eta_1 N(r, r^*, \delta_{hit}) & 0 \le r \le r_{max} \\ 0 & \text{otherwise} \end{cases}    (6)

\eta_1 = \left( \int_0^{r_{max}} N(r, r^*, \delta_{hit})\, dr \right)^{-1}    (7)

N(r, r^*, \delta_{hit}) = \frac{1}{\sqrt{2\pi\delta_{hit}^2}} e^{-\frac{(r-r^*)^2}{2\delta_{hit}^2}}    (8)

So, the inaccurate sensor information can be modeled by adding Gaussian noise. Let P_hit(r, r*) denote the probability that r is a noisy observation of r*, where r* is the actual value, r_max is the maximum measurable value of the sensor, and r* and δ_hit are the mean and standard deviation of the measurement; P_hit is given by formula (6), with the normalizer η_1 given by formula (7). The Gaussian observation noise N(r, r*, δ_hit) of formula (8) shows that the further r is from r*, the smaller P_hit becomes.
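As a concrete reading of formulas (5) to (8), the short Python sketch below (our own illustration under assumed parameter values, not the authors' code) evaluates the Gaussian likelihood P_hit of a noisy range reading, truncated to the sensor's measurable interval and renormalized.

```python
import math

def gaussian_pdf(x, mu, sigma):
    # Gaussian density of formulas (5)/(8) with mean mu and standard deviation sigma
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) / math.sqrt(2.0 * math.pi * sigma ** 2)

def gaussian_cdf(x, mu, sigma):
    # Cumulative Gaussian via the error function; used only for the normalizer
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def p_hit(r, r_star, sigma_hit, r_max):
    """Formula (6): likelihood that r is a noisy observation of the true range r*,
    restricted to [0, r_max] and renormalized by eta_1 as in formula (7)."""
    if not 0.0 <= r <= r_max:
        return 0.0
    eta_1 = 1.0 / (gaussian_cdf(r_max, r_star, sigma_hit)
                   - gaussian_cdf(0.0, r_star, sigma_hit))      # formula (7)
    return eta_1 * gaussian_pdf(r, r_star, sigma_hit)           # formulas (6) and (8)

# Assumed numbers: true range 1500 mm, 50 mm noise, 5000 mm maximum range.
for r in (1400.0, 1500.0, 1600.0):
    print(r, round(p_hit(r, r_star=1500.0, sigma_hit=50.0, r_max=5000.0), 6))
# The likelihood peaks at r = r* and decays as the reading moves away from it.
```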
Let a tilde denote a Gaussian-distributed random variable, and let the subscript k denote the time step. Then θ̃_k ∼ N(θ_k, σ) describes the possible distribution of the true slope at time k under the Gaussian noise of formula (5), with the slope estimate θ_k as its mean. Assume there are n independent historical slope estimates of the same terrain, the terrain being assumed static. These estimates can be treated as n samples {θ̃_k | k = 1, ..., n} of the single terrain confronting the robot. For each observation of this terrain, applying formulas (1) to (4) to estimate the slope yields one of only two opposite results: the terrain is passable or impassable. Let p denote the probability that an estimate θ̃_k indicates impassable terrain, and let M denote the number of estimates that indicate impassable terrain; M then follows the binomial distribution B(n, p). Considering the inaccuracy of the estimation process, passable terrain can be confirmed if p is smaller than a threshold value P_u, so formula (9) is used to judge whether impassable terrain is present; a value of P_u close to 0 implies a low likelihood of misrecognition. Hypothesis testing theory can then be applied to recognize the terrain type reliably from the n estimation samples, as follows.

p \le P_u    (9)

Figure 3: Relationship between K and P_{0.05}(A)

Let hypothesis H_0 be: passable terrain is detected (p ≤ P_u). Let event A be: the number of estimation samples found to indicate impassable terrain (θ̃_k > θ_T) is larger than a threshold value K; let Ā be the complement of A. Let P_p(A) denote the probability of A for a given p. Formulas (10) to (12) then follow, where C_n^j denotes the number of ways of choosing j samples out of n. According to hypothesis testing theory, P_p(A) is the probability of rejecting H_0 (event A occurring) for a given p, and P_p(Ā) is the probability of accepting H_0 (event Ā occurring) for a given p.

P_p(A) = \sum_{j=K+1}^{n} C_n^j\, (1-p)^{n-j} p^j    (10)

P_p(\bar{A}) = \sum_{j=0}^{K} C_n^j\, (1-p)^{n-j} p^j    (11)

\sup_{p \le P_u} P_p(A) \le \alpha    (12)

Two types of error may occur. The first type: H_0 is rejected (A occurs) although H_0 is in fact true (p ≤ P_u). The second type: H_0 is accepted (Ā occurs) although H_0 is in fact false (p > P_u). Hypothesis testing theory shows that the probabilities of the two error types cannot be reduced simultaneously, so priority is given to bounding the probability of the first type. We therefore confine the probability of the first type of error below a manually selected significance level α, as expressed by formula (12). Since P_p(A) increases with p, setting P_{P_u}(A) = α fulfils formula (12); letting p = P_u, K can then be obtained from formula (10). It follows that, for the given α, if M ≤ K a passable terrain is confirmed; otherwise an impassable terrain is confirmed. For instance, if n = 100, P_u = 0.05 and α = 0.01, then K = 10 results (Figure 3 shows the relationship between K and P_{0.05}(A)). So an impassable terrain is confirmed only if M is larger than K.
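The threshold K can be obtained numerically from the binomial tail of formula (10): K is the smallest integer for which P_{P_u}(A) ≤ α. The sketch below is a minimal illustration of that search (our own code, not the authors'); with the parameter values used later in the experiments of Section 4.2 (n = 50, P_u = 0.03, α = 0.05) it returns K = 4.

```python
from math import comb

def rejection_probability(n, k, p):
    """P_p(A) of formula (10): probability that more than k of the n samples
    indicate impassable terrain when each sample does so with probability p."""
    return sum(comb(n, j) * (1.0 - p) ** (n - j) * p ** j for j in range(k + 1, n + 1))

def threshold_k(n, p_u, alpha):
    """Smallest K satisfying formula (12). Since P_p(A) grows with p,
    checking p = Pu is enough to bound the supremum over p <= Pu."""
    for k in range(n + 1):
        if rejection_probability(n, k, p_u) <= alpha:
            return k
    return n

print(threshold_k(n=50, p_u=0.03, alpha=0.05))  # 4, matching the value used in Section 4.2
```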
3 Experimental study

To validate the algorithm, several experiments are described in this section. All experiments were carried out in our laboratory, a typical indoor environment. A Pioneer 3-DX mobile robot (Figure 4), equipped with six ultrasonic range finders (Figure 5) and a HOKUYO UTM-30LX laser range finder (Figure 6), is used. The laser range finder is mounted on top of the robot and provides a long maximum measuring range (about 30 m) with very limited deviation (up to 50 mm). The maximum sensing range of the ultrasonic range finder is about 5 m. The diffusion angle of the laser range finder is so small that it can be considered zero, while that of the ultrasonic range finder is 4.6°. In the following experiments, the number 3 ultrasonic range finder, shown in Figure 5, and the laser range finder measurements along the orientation of that ultrasonic sensor are used. The two range finders are mounted along the Y axis with a height difference of 60 mm. The principle of laser and ultrasonic ranging is expressed by formula (13):

D = \frac{vt}{2}    (13)

where D is the measured distance, v is the propagation speed of light or sound, and t is the propagation time.

Figure 4: Mobile robot
Figure 5: Ultrasonic range finder
Figure 6: Laser range finder

4 Results and discussion

4.1 Slope estimation experiments

Figure 7 shows an example of the inaccurate measurements of the ultrasonic range finder when an object is located in front of the robot at a distance of about 500 mm. The horizontal axis of Figure 7 is the elapsed time and the vertical axis is the range measurement. The blue line shows the ultrasonic sensor measurements, in which a large deviation appears.

Figure 7: Ultrasonic sensor measuring value
Figure 8: Deviation between the estimated slopes and the actual slopes

Table 1: Objects with different slopes
Object index   1   2   3   4   5   6   7   8   9
Slope (°)      6   7   8   9  10  11  12  13  14

Table 2: Distances between the robot and the objects
Distance index   1     2     3     4     5     6     7     8     9
Distance (mm)   500  1000  1500  2000  2500  3000  3500  4000  4500

This section describes several groups of experiments. In the first group, nine objects with different slopes (Table 1) were placed for estimation, to show the large deviation of the estimation results (Figure 8). In the second group, objects with slopes close to the threshold were used to show the performance of the proposed algorithm, with 60 historical samples adopted for each object (Figures 9, 10, 11). In general, the robot has to distinguish impassable terrain from passable terrain with θ_T set to 10°. Four objects with slopes less than 10° were therefore chosen to represent passable terrain, with the slope increased in 1° steps, and four objects with slopes larger than 10° were chosen to represent impassable terrain, again with a 1° slope increment. The measurement accuracy is expected to worsen with growing distance, so a group of increasing distances was set for the objects, as shown in Table 2. The ultrasonic range finder can only operate within 5 m, so the largest distance is set to 4.5 m and the distance increment to 500 mm (Table 2).
The nine objects with different slopes (Table 1) were set in front of the robot at the different distances of Table 2. The deviations between the estimated and actual slopes are shown in Figure 8, in which the horizontal axis is the actual horizontal distance (mm) between the object and the sensors and the vertical axis is the deviation (°). The nine lines in Figure 8 show the deviations for the different object slopes. Overall, Figure 8 shows that larger distances result in larger errors, owing to the larger deviation in the sensor measurements. Moreover, poor performance occurs when the slope of the object is less than 10°. This is mainly caused by misreadings of the ultrasonic sensor due to specular reflection of the ultrasonic beam away from the sensor. In contrast, better performance was observed when the slope is larger than 10°, in which case the ultrasonic sensor is less likely to produce a misreading. An even better performance was observed when the distance is less than 2000 mm, even for slopes of less than 10°, mainly because the diffusely reflected part of the ultrasonic beam is still received by the sensor although most of the beam is specularly reflected in other directions. Fortunately, obstacles and impassable terrain generally correspond to surfaces whose slope is larger than 10°.

Figure 9: Estimation result for 8° slope
Figure 10: Estimation result for 10° slope
Figure 11: Estimation result for 12° slope

4.2 Terrain type recognition experiments

Figures 9, 10 and 11 show three objects set in front of the robot with slopes close to the threshold, a situation which is more likely to result in a wrong judgement. In each experiment, 60 historical slope estimates were acquired for each object, of which 50 were used in the hypothesis test. So, for each recognition, with n = 50, P_u = 0.03 and α = 0.05, the threshold K = 4 is obtained.
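For the recognition step itself, the decision reduces to counting M among the stored slope estimates and comparing it with K. The sketch below (our own illustration; the sample values are made up, while n = 50, θ_T = 10° and K = 4 follow the setup described above) shows this rule.

```python
def recognize_terrain(slope_samples_deg, theta_t_deg=10.0, k_threshold=4):
    """Hypothesis-test decision: declare impassable terrain only when M > K.

    slope_samples_deg : historical slope estimates (degrees) of the same terrain
    theta_t_deg       : travel-capability threshold theta_T
    k_threshold       : K computed from n, Pu and alpha (K = 4 for n = 50,
                        Pu = 0.03, alpha = 0.05)
    """
    m = sum(1 for theta in slope_samples_deg if theta > theta_t_deg)  # the count M
    return ("Impassable" if m > k_threshold else "Passable"), m

# Made-up sample set: 50 noisy estimates of a roughly 12 degree surface,
# ten of which fall below theta_T because of measurement noise.
samples = [12.3, 11.8, 9.6, 12.9, 13.1] * 10
decision, m = recognize_terrain(samples)
print(decision, m)  # "Impassable", M = 40 for this made-up sample set
```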
Table 3, Table 4 and Table 5 show the results of the three groups of experiments, one group per object, where each group consists of three experiments.
In Table 3, an object with an 8° slope is used; only one result in the sample set of each experiment was larger than θ_T. As a result, this object was confirmed to be passable terrain in all three experiments. In Table 4, an object with a 10° slope is used instead, which resulted in more than 10 wrong slope estimates (estimates below θ_T) in each experiment. Nevertheless, with the help of hypothesis testing theory, the algorithm made the correct recognition of the terrain type in all three experiments. In Table 5, an object with a 12° slope is used, which resulted in only one wrong slope estimate (below θ_T) in the first and the third experiments. The proposed approach again confirms that this is impassable terrain.

Table 3: Recognition results for the object with actual slope 8°
Experiment index   M    Result
1                  1    Passable
2                  1    Passable
3                  1    Passable

Table 4: Recognition results for the object with actual slope 10°
Experiment index   M    Result
1                  32   Impassable
2                  36   Impassable
3                  34   Impassable

Table 5: Recognition results for the object with actual slope 12°
Experiment index   M    Result
1                  49   Impassable
2                  50   Impassable
3                  49   Impassable

5 Conclusion

This paper proposes an impassable-terrain recognition algorithm for a mobile robot. In this algorithm, historical slope estimation results are used, according to hypothesis testing theory, to lessen the influence of inaccurate estimation. This allows different types of low-cost range finders, with different diffusion angles and inaccurate measurements, to be used for recognizing impassable terrain. Because a simple slope estimation model is employed and only the estimates indicating impassable terrain are counted, the algorithm is fast and easily implemented. Experiments have demonstrated that the proposed algorithm provides, in most cases, a reliable and inexpensive means for robots to recognize impassable terrain.

Acknowledgments

This work is supported by the National Natural Science Foundation of China (61503043); the Natural Science Foundation of Shaanxi Province (2015JQ6214); the Shaanxi Natural Science Foundation (2017JM7016); and the Fundamental Research Funds for the Central Universities (310822172204).

Bibliography

[1] Angelova A., Matthies L., Helmick D., Perona P. (2008); Dimensionality Reduction Using Automatic Supervision for Vision-Based Terrain Learning, Science and Systems, 3, 225-232, 2008.

[2] Asada M. (1990); Map building for a mobile robot from sensory data, IEEE Transactions on Systems Man and Cybernetics, 20(6), 1326-1336, 1990.

[3] Awasthy M. (2015); Automatic Obstacle Detection for a Star Algorithm Using Digital Image Processing, International Journal of Heat and Technology, 58(1), 84-93, 2015.

[4] Cao T., Xiang Z.Y., Liu J.L. (2015); Perception in disparity: An efficient navigation framework for autonomous vehicles with stereo cameras, IEEE Transactions on Intelligent Transportation Systems, 16(5), 2935-2948, 2015.

[5] Chang J.W., Wang R.J., Wang W.J. (2016); A real time terrain recognition method for mobile robot moving, ICSSE, Puli, 1-4, 2016.

[6] Curkovic P., Jerbic B. (2007); Honey-bees Optimization Algorithm Applied to Path Planning Problem, International Journal of Simulation Modelling, 6(3), 154-164, 2007.
[7] Foresti G.L., Regazzoni C.S. (2012); Multisensor data fusion for autonomous vehicle navigation in risky environments, IEEE Transactions on Vehicular Technology, 51(5), 1165-1185, 2012.

[8] Kweon I.S., Kanade T. (1992); High resolution terrain map from multiple sensor data, IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(2), 278-292, 1992.

[9] Lee M.F.R., Chiu F.H.S. (2013); A networked intelligent control system for the mobile robot navigation, IEEE/SICE International Symposium on System Integration (SII), Kobe, 42-47, 2013.

[10] Liu J.K., Shih T.Y., Liao Z.Y. (2008); The Geomorphometry of Rainfall-Induced Landslides in Alishan Area Obtained by Airborne Lidar and Digital Photography, IGARSS, Boston, 1220-1223, 2008.

[11] Najjaran H., Kircansk N. (2001); Map building for a terrain scanning robot, IEEE International Conference on Robotics and Automation, Seoul, 3728-3733, 2001.

[12] Nissimov S., Goldberger J., Alchanati V. (2015); Obstacle detection in a greenhouse environment using the Kinect sensor, Computers and Electronics in Agriculture, 16, 104-115, 2015.

[13] Reddy S.K., Pal P.K. (2016); Computing an unevenness field from 3D laser range data to obtain traversable region around a mobile robot, Robotics and Autonomous Systems, 84, 48-63, 2016.

[14] Sadhukhan D., Moore C., Collins E. (2004); Terrain estimation using internal sensors, 10th IASTED International Conference on Robotics and Applications, Honolulu, 195-199, 2004.

[15] Saudabayev A., Kungozhin F., Nurseitor D. (2015); Locomotion Strategy Selection for a Hybrid Mobile Robot Using Time of Flight Depth Sensor, Journal of Sensors, 1-14, 2015.

[16] Tanaka Y., Ji Y., Yamashita A. (2015); Fuzzy based traversability analysis for a mobile robot on rough terrain, ICRA, Seattle, 3965-3970, 2015.

[17] Tison C., Tupin F., Maitre H. (2007); A fusion scheme for joint retrieval of urban height map and classification from high-resolution interferometric SAR images, IEEE Transactions on Geoscience and Remote Sensing, 45(2), 496-505, 2007.

[18] Tokuda S., Kinoshita T., Kobayashi K. (2014); Development of collision-free-area detection algorithm for mobile robot, Joint 7th International Conference on Soft Computing and Intelligent Systems (SCIS) and 15th International Symposium on Advanced Intelligent Systems, 565-568, 2014.

[19] Wang C., Mao Y.S., Du K.J. (2016); Simulation On Local Obstacle Avoidance Algorithm For Unmanned Surface Vehicle, International Journal of Simulation Modelling, 15(3), 460-472, 2016.

[20] Young G.S., Hong T.H., Herman M. (1992); Obstacle detection for a vehicle using optical flow, Intelligent Vehicles '92 Symposium, Detroit, 185-190, 1992.

[21] Zhao Y., Cheng W. (2012); The navigation of mobile robot based on stereo vision, ICICTA, Zhangjiajie, 670-673, 2012.

[22] Zhao Y., Cheng W., Jia L., Ma S. (2010); The obstacle avoidance and navigation based on stereo vision for mobile robot, ICOIP, Haikou, 565-568, 2010.

[23] Zhu J., Wang Y., Yu H. (2010); Obstacle detection and recognition in natural terrain for field mobile robot navigation, WCICA, Jinan, 6567-6572, 2010.