
Journal of Electrical Systems and Information Technology xxx (2017) xxx-xxx

3D environment mapping and self-position estimation by a small flying robot mounted with a movable ultrasonic range sensor

Kazuya Nakajima, Chinthaka Premachandra *, Kiyotaka Kato

Dept. of Electrical Engineering, Graduate School of Engineering, Tokyo University of Science, 6-3-1 Niijuku, Katsushika-ku, Tokyo 125-8585, Japan

Received 13 October 2015; accepted 10 January 2017

Abstract

The light weight of ultrasonic sensors makes them useful for collecting environment information from mobile robots. Ultrasonic sensors are generally used in a circular formation in surface-moving robots, but this is not suitable for small flying robots, which require small size and light weight. Here we created a movable ultrasonic range sensor by combining a small, lightweight servomotor and a single ultrasonic range sensor. This sensor could perform 360° measurements of the distance between objects and the robot. We furthermore constructed a measurement system to perform 3D environment mapping and self-localization by equipping a small flying robot with this movable ultrasonic range sensor and a ground-facing ultrasonic range sensor for altitude measurements. We verified the system by means of a flight test and found that 3D environment mapping and self-localization were realized in real time. © 2017 Electronics Research Institute (ERI). Production and hosting by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

Keywords: Movable ultrasonic range sensor; 3D environment mapping; Self-localization; Small flying robot

1. Introduction

Sensors that detect surrounding objects are essential for the autonomous flight of flying robots. Image sensors, laser rangefinders, and ultrasonic sensors are examples of such sensors. There has been extensive research on image processing with image sensors and on comprehension of surroundings with laser rangefinders (Daisuke and Satoru, 2010; Fei et al., 2013; Schmid et al., 2013; Bastian et al., 2008; Chunrong et al., 2009), and recently advanced object recognition has also been performed using range image sensors (Salado et al., 2010; John et al., 2011; Ganganath and Leung, 2012). While these sensors have high precision, they also have the drawback of being large, heavy, and expensive. Image processing also tends to be slow due to the amount of data handled.

* Corresponding author at: Department of Electronic Engineering, Graduate School of Engineering, Shibaura Institute of Technology, 3-7-5 Toyosu, Koto-ku, Tokyo 135-8548, Japan. Fax: +81 3 5859 8308.

E-mail addresses: nakajima@ee.kagu.tus.ac.jp (K. Nakajima), cpremachandra@yahoo.co.jp, chintaka@sic.shibaura-it.ac.jp (C. Premachandra). Peer review under the responsibility of Electronics Research Institute (ERI).

http://dx.doi.org/10.1016/j.jesit.2017.01.007

2314-7172/© 2017 Electronics Research Institute (ERI). Production and hosting by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).


Fig. 1. System configuration.

In contrast, small, cheap ultrasonic sensors with a low processing load are commonplace, although they have lower precision. Ultrasonic sensors are thus suited to small flying robots, which require lightweight components.

When used in surface-moving robots, multiple ultrasonic sensors are often deployed in a circular formation (Ho et al., 2007; Toledo et al., 2000), necessitating many ultrasonic sensors to cover the entire surroundings of the robot and thus making it larger and heavier. Moreover, improving positional accuracy with respect to objects requires increasing the number of ultrasonic sensors, but crosstalk between adjacent sensors becomes a limiting factor. In sum, circular formations of ultrasonic sensors are not suited to small flying robots.

With respect to this circular formation, a previous study (Sonali et al., 2010) used a movable ultrasonic range sensor in a surface-moving robot. This method uses a single ultrasonic range sensor rotated by a motor, and resolution can be increased more easily than with the circular formation because measurements can be performed with the sensor pointed in an arbitrary direction.

We created a movable ultrasonic range sensor by combining a small, lightweight servomotor with a single ultrasonic range sensor. This sensor measures the distances between the robot and objects through 360° around it, and we have previously used it for 2D environment mapping (Kazuya et al., 2010). We mounted this sensor and an ultrasonic range sensor for altitude measurements on a small flying robot to realize 3D environment mapping and self-localization.

Fig. 1 shows the constructed measurement system. A grid-form environment map is used, and points are assigned to the grid blocks corresponding to object locations detected as the movable ultrasonic range sensor rotates through preset angles. Coordinates with a high score are identified as those where an obstacle exists.

The remainder of this paper is organized as follows. Section 2 explains the movable ultrasonic range sensor and the measurement system that uses it. Section 3 reports experiment results, and Section 4 gives our conclusions and describes possibilities for future development.

2. Measurement system using a movable ultrasonic range sensor

2.1. Movable ultrasonic range sensor

Fig. 2(a) shows the movable ultrasonic range sensor. The measurement system comprises an ultrasonic range sensor, a servomotor, and a microprocessor. The ultrasonic range sensor emits an ultrasonic wave according to a trigger pulse output by the microprocessor and detects the reflected wave. The time between emission and detection of this ultrasonic wave is output to the microprocessor as pulse-width data.


Fig. 2. Flying robot with rotatable ultrasonic ranging sensor.

Fig. 3. Positional relationship (the flying robot at (x1, y1), facing the Y-axis direction; the object at (x2, y2)).

The servomotor sets its angle according to a servo signal output by the microprocessor. Measurement of the surroundings is performed by combining this ultrasonic range sensor and the servomotor. A gear was incorporated to enable 360° measurements because the servomotor's range of motion was initially limited to 180°. The microprocessor estimates the distance to an object from the pulse received from the ultrasonic range sensor. It then creates an environment map (described below) and carries out self-localization. Fig. 2(b) shows a small flying robot equipped with this movable ultrasonic range sensor. A wireless communications device is also connected to the microprocessor because it is necessary to verify the environment mapping data during the flight test.
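As a concrete illustration of this range estimation, the minimal sketch below converts an echo pulse width into a one-way distance. The speed-of-sound constant, the function name, and the example pulse width are assumptions for illustration; the paper does not specify the sensor model or its timing interface.

```python
# Minimal sketch: pulse-width data -> range estimate (assumed interface).
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate speed of sound in air at 20 °C

def pulse_width_to_range_m(pulse_width_s: float) -> float:
    # The pulse width encodes the round-trip time of the ultrasonic wave,
    # so the one-way range is half of the total distance traveled.
    return pulse_width_s * SPEED_OF_SOUND_M_PER_S / 2.0

print(pulse_width_to_range_m(0.0035))  # a 3.5 ms echo gives roughly 0.60 m
```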

2.2. Environment mapping using a movable ultrasonic range sensor

The environment map is a grid whose individual grid blocks are represented as coordinates (X, Y). Fig. 3 shows the positional relationship, within the environment map, between the flying robot and an object detected by the ultrasonic sensor. For servomotor angle θ with respect to the front of the fuselage, range r to an object acquired from the ultrasonic range sensor, block side length d, and fuselage location (x_1, y_1) with respect to the origin O (Fig. 3), the coordinates (x_2, y_2) of an object are given by the following equation.

$$(x_2,\, y_2) = \left( \frac{r \sin\theta}{d} + x_1,\; \frac{r \cos\theta}{d} + y_1 \right) \quad (1)$$

In this way, the flying robot estimates its positional relationship with the object, but because the ultrasonic wave has directional spread (Fig. 4(a)), a single measurement is insufficient to determine the object's location. Furthermore, incorrect measurements can arise when the ultrasonic wave is indirectly reflected (Fig. 4(b)). Accordingly, multiple measurements are made, points are added to coordinates in which an object is detected, and coordinates with more points are considered more likely to be object locations (Fig. 5).


Fig. 4. Examples of false detections (a: ultrasonic wave spreading; b: incorrect reflection of the ultrasonic wave).

Fig. 5. Position estimation using scores.

Fig. 6. World map.

Coordinate scoring is weighted according to range, with closer coordinates receiving higher scores. This is because data become less reliable as the range increases, owing to the spread and reflection characteristics of ultrasonic waves, and because it is more important for the flying robot to evade closer objects.
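The sketch below illustrates this mapping and scoring step under stated assumptions: Eq. (1) is applied with truncation to integer grid blocks, and a simple linear distance weight stands in for the unspecified weighting curve; the 4 m maximum range is likewise a placeholder.

```python
import math

def object_grid_coords(r_m, theta_rad, d_m, x1, y1):
    """Eq. (1): map a range reading r at servo angle theta (measured from
    the fuselage front, i.e., the Y axis) to grid coordinates (x2, y2).
    Truncating to an integer grid block is an assumption."""
    x2 = int(r_m * math.sin(theta_rad) / d_m) + x1
    y2 = int(r_m * math.cos(theta_rad) / d_m) + y1
    return x2, y2

def add_detection(scores, cell, r_m, max_range_m=4.0):
    """Add a weighted point to a detected cell; closer readings get higher
    weight because distant readings are less reliable. The linear weight
    and the 4 m maximum range are placeholders, not values from the paper."""
    weight = max(0.1, (max_range_m - r_m) / max_range_m)
    scores[cell] = scores.get(cell, 0.0) + weight

# Example: a 1.2 m reading at 90 degrees from a robot at block (5, 5).
scores = {}
cell = object_grid_coords(1.2, math.pi / 2, 0.6, 5, 5)  # -> (7, 5)
add_detection(scores, cell, 1.2)
```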

2.3. Self-localization with respect to the environment map

A world map of the entire environment (Fig. 6) is prepared along with a separate local map of the environment seen by the flying robot (Fig. 7). A similar pattern of points is likely to be obtained for maps created in the same location. Accordingly, the location of the flying robot on the world map is estimated by performing matching while shifting the coordinates of the local map over the world map, as shown in Fig. 8.
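A minimal sketch of this matching step follows, assuming both maps are dictionaries from grid coordinates to scores. The exhaustive search window and the sum-of-products matching criterion are assumptions; the paper does not detail the matching algorithm.

```python
def estimate_offset(world, local, search=2):
    """Slide the local map over the world map and keep the offset with the
    highest score correlation. Returns the robot's estimated displacement
    (dx, dy) on the world map."""
    best_offset, best_score = (0, 0), float("-inf")
    for dx in range(-search, search + 1):
        for dy in range(-search, search + 1):
            # World cell (x, y) corresponds to local cell (x - dx, y - dy)
            # when the robot has moved by (dx, dy) on the world map.
            s = sum(w * local.get((x - dx, y - dy), 0.0)
                    for (x, y), w in world.items())
            if s > best_score:
                best_score, best_offset = s, (dx, dy)
    return best_offset
```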


Fig. 7. Local map.

Fig. 8. Localization using matching algorithm.

2.4. World map update and filter processing

After the robot self-localizes by matching the local map with the world map as detailed in Section 2.3, the world map data are updated by adding the local map data. The world map is assembled by repeating this process of local map acquisition, self-localization by matching, and world map updating. However, when raw local map data are added, unnecessary data are incorporated into the world map and accumulate with every update, making it difficult to perform self-localization by matching. This issue is resolved by incorporating the filter processing represented by Eq. (2), such that unnecessary information is discarded when updating the world map.

$$W(x, y) = \begin{cases} W(x, y) & \text{if } \Delta W > th_1 \text{ and } W(x, y) > th_2 \\ 0 & \text{otherwise} \end{cases} \quad (2)$$

where $W(x, y)$ is the score of the world map, $\Delta W$ is the amount of change of $W(x, y)$, and $th_1$, $th_2$ are thresholds.
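A sketch of this update-and-filter step is shown below. The threshold values, and restricting the filter to cells covered by the current local map, are assumptions made for illustration.

```python
def update_world_map(world, local, th1=2.0, th2=3.0):
    """Eq. (2) filter sketch: after adding the local-map scores, a cell is
    kept only if its score changed enough (dW > th1) and its total score is
    high enough (W > th2); otherwise it is reset to zero."""
    for cell, added in local.items():
        old = world.get(cell, 0.0)
        new = old + added          # update with the local-map data
        delta = new - old          # dW: amount of change of W(x, y)
        world[cell] = new if (delta > th1 and new > th2) else 0.0
    return world
```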

2.5. Expansion to a 3D environment map

Expansion of the environment map into three dimensions is realized by mounting an ultrasonic sensor on the underside of the flying robot (Fig. 9) and acquiring data in the vertical direction. Although the altitude of the flying robot may change while preparing the local map, in such a case the self-localization described in Section 2.3 is performed by taking the altitude layer with the largest amount of data as the current layer (Fig. 10).
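The layer selection might look like the following sketch, where a 3D local map is assumed to be a dictionary from layer index to per-layer score maps, and "amount of data" is approximated by the number of occupied cells.

```python
def current_layer(local_3d):
    """Pick the altitude layer with the largest amount of data as the
    'current layer' used for the 2D matching of Section 2.3. Here local_3d
    is assumed to map a layer index z to that layer's {(x, y): score}
    dictionary; counting occupied cells is one possible measure."""
    return max(local_3d, key=lambda z: len(local_3d[z]))

# Example: layer 1 holds the most detections, so matching uses layer 1.
layers = {0: {(3, 5): 4.0, (5, 7): 3.5, (6, 3): 3.0},
          1: {(3, 5): 5.0, (5, 7): 4.0, (4, 5): 1.0, (3, 6): 1.5}}
print(current_layer(layers))  # -> 1
```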


Fig. 9. Downward-directed ultrasonic sensor.

Fig. 10. Current layer.

3. Experiment

3.1. Experimental method

A movable ultrasonic range sensor was mounted on the flying robot, which was flown manually to create an environment map. The test system comprised the flying robot, which took the measurements, and a ground-station personal computer that displayed the environment map; the two were connected via a wireless device.

The flying robot prepared an environment map and transmitted its data to the ground-station computer via the wireless device. The ground-station computer received the transmitted environment map data and displayed them in visual form via an application that renders the map data in real time (Fig. 11), and a comparison with the actual positional relationship of the objects was performed. The application displays grid coordinates with a higher probability of containing an object (a higher point score) as grid blocks of a darker color. Fig. 11(a) shows information related to the environment map in the layer with the largest amount of data during local map preparation, together with the result of self-localization. Fig. 11(b) shows a 3D image of the world map.

The measurement environment comprised Obstacles 1 and 2 (0.30 m × 0.43 m × 1.32 m) and Obstacle 3 (0.30 m × 0.43 m × 0.66 m), arranged in the positional relationship shown in Fig. 12. The grid block width d in these measurements was 0.60 m.

3.2. Experimental results

Fig. 13 shows the results of the flight test. Fig. 13(a)-(e) show the map data for a continuous flight, together with the scene of the experiment. Of these figures, (a) and (b) provided the map data forming the basis for each vertical layer at the initial position, while the remaining (c)-(e) show the map data obtained while moving from the initial position, for which self-localization was carried out.


Fig. 11. Map application.

Fig. 12. Experiment environment.


Fig. 13. Robot's position and maps.

We consider each of the data sets (a)-(e). Taking the initial position of the flying robot as (X, Y, Z) = (5, 5, 0), Obstacle 1 occupied grid blocks (3, 5, 0) and (3, 5, 1) in the test environment. Similarly, Obstacle 2 occupied (5, 7, 0) and (5, 7, 1), and Obstacle 3 occupied (6, 3, 0). Fig. 13(a) shows the map data for the vicinity of the ground surface. Looking at the world map, we see that the scores are higher at the locations of Obstacles 1-3.



Fig. 14. Resulting 3D map.

The data of Fig. 13(b) are a map for an altitude of 0.6-1.2 m above the ground surface. The world map shows that scores are higher in the locations corresponding to Obstacles 1 and 2. At this altitude, there is no score at the location that corresponds to Obstacle 3.

Map data forming a reference for each layer were prepared as described above. Self-localization processing was added thereafter.

Fig. 13(c) shows the map data acquired when the flying robot had moved 0.6 m from the location in (b) toward Obstacle 2. Comparing the world and 3D maps, the estimated location of the flying robot has moved one block from the location in (b) and is on the grid block that corresponds to the location of Obstacle 2. Figs. 13(d) and (e) respectively show map data for when the flying robot had moved one block diagonally toward Obstacle 1 and for when it had returned to the same position as in (b). As with the data in (c), we can see that self-localization was achieved.

Fig. 14 shows the final 3D map. Comparing Fig. 14 with the test environment shown in Fig. 12, it can be seen that the 3D map contains high-scoring coordinates at the locations corresponding to Obstacles 1-3.

3.3. Discussion

The test results indicate the feasibility of creating an environment map of a 3D space and performing self-localization using a movable ultrasonic range sensor together with an ultrasonic range sensor for altitude measurement.

An increase in score was confirmed for some grid blocks in which no obstacle existed in the real environment, in the data for each local map in Fig. 13. This was probably due to misdetections, as illustrated in Fig. 4 and described in Section 2.2. The scores arising from these misdetections were lower than those at locations where an actual obstacle existed, and the world map data (displayed to the right of the local map) show that they were removed by filtering when the world map was updated.

4. Conclusions and future development

We created a movable ultrasonic range sensor that enables 360° measurement of a robot's surroundings by combining an ultrasonic range sensor and a servomotor. By equipping a flying robot with this movable ultrasonic range sensor and an ultrasonic range sensor for altitude measurements, flying it, creating a grid-form environment map, and using an application to display the created map in a visual format, we showed that real-time 3D environment mapping and self-localization are feasible with the constructed measurement system.

In future work, we aim to realize fully autonomous flight with obstacle avoidance by a standalone flying robot by incorporating this measurement system into its flight control system.


References

Bastian, S., Giorgio, G., Cyrill, S., Wolfram, B., 2008. Visual SLAM for flying vehicles. IEEE Trans. Rob. 24 (5), 1088-1093.

Chunrong, Y., Fabian, R., Hanspeter, A.M., 2009. Visual steering of UAV in unknown environments. Proc. of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 3906-3911.

Daisuke, N., Satoru, T., 2010. Mobile robot control based on information of the scanning laser range sensor. Proc. of the 11th IEEE International Workshop on Advanced Motion Control, 258-261.

Fei, W., Jinqiang, C., Swee, K.P., Chen, B.M., Lee, T.H., 2013. A mono-camera and scanning laser range finder based UAV indoor navigation system. Proc. of the International Conference on Unmanned Aircraft Systems (ICUAS), 694-701.

Ganganath, N., Leung, H., 2012. Mobile robot localization using odometry and Kinect sensor. Proc. of the International Conference on Emerging Signal Processing Applications (ESPA), 91-94.

Ho, K.-D., Sang, S.-W., In, J.-H., Kwee, S.-B., 2007. SLAM of mobile robot in the indoor environment with digital magnetic compass and ultrasonic sensors. Proc. of the International Conference on Control, Automation and Systems (ICCAS), 87-90.

John, S., Michael, H., Andrew, B.S., 2011. Altitude control of a quadrotor helicopter using depth map from Microsoft Kinect sensor. Proc. of the 2011 IEEE International Conference on Mechatronics, 358-362.

Kazuya, N., Premachandra, C., Kiyotaka, K., 2010. Localization and 3D-mapping for small flight robot using movable ultrasonic ranging sensor. Proc. of SSI 2013.

Salado, A.M., Vandeportaele, B., Lacroix, S., Hattenberger, G., 2010. Flight autonomy of micro-drone in indoor environments using LiDAR flash camera. Proc. of the International Micro Air Vehicle Conference and Flight Competition (IMAV), Braunschweig, Germany.

Schmid, K., Tomic, T., Ruess, F., Hirschmuller, H., Suppa, M., 2013. Stereo vision based indoor/outdoor navigation for flying robots. Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 3955-3962.

Sonali, K.K., Dharmesh, H.S., Nishant, M.R., 2010. Obstacle avoidance for a mobile exploration robot using a single ultrasonic range sensor. Proc. of the 2010 International Conference on Emerging Trends in Robotics and Communication Technologies (INTERACT), 8-11.

Toledo, F.J., Luis, J.D., Tomas, L.M., Zamora, M.A., Martinez, H., 2000. Map building with ultrasonic sensors of indoor environments using neural networks. Proc. of the 2000 IEEE International Conference on Systems, Man, and Cybernetics, vol. 2, 920-925.