Faculty of Engineering and Applied Science, Ontario Tech University, 2000 Simcoe Street North, Oshawa, Canada.
DOI: 10.1088/1742-6596/3058/1/012004
Abstract
Recent trends in the transportation and automotive industries show vehicles progressing toward fully autonomous navigation. Many studies in this area implement algorithms on simulated models and physical experimental platforms. Most of these platforms lack the "car-like" features of traditional vehicles, such as suspension or actuated steering, and most rely on differential-drive robots that steer by varying the rotational speeds of their wheels. Within autonomous navigation, filtering techniques are widely used to track robot position. In this work, a robot referred to as the Scaled Electric Combat Vehicle (SECV), capable of assigning a unique steering angle and a unique wheel speed to each of its eight wheels, is used to implement a vision-based navigation system. The custom-built robot is inspired by armored personnel carriers in current use. The steering system actuates all eight wheels, with their steering angles generated according to the Ackermann steering condition. A particle filter combined with SLAM serves as the core system providing the robot with a map of the environment. In the experiments, the SECV navigates an obstacle-ridden environment while generating a map of it. Two trials are performed: the first uses the onboard laser scanning (LIDAR) sensor as the basis for mapping, while the second follows the same procedure in an identical environment but uses a stereo depth camera to construct a three-dimensional map. The mapping performance of the two implementations is compared. Certain obstacles were not easily recognized when using LIDAR SLAM alone, whereas the camera-based SLAM successfully added visual data to the robot's map, providing key imaging information about obstacles. In summary, few investigations in the literature combine vision-based navigation with steerable platforms; other studies rely on multiple laser scanners for mapping and navigation, whereas the proposed system uses a single laser scanner and a camera. Furthermore, most platforms in the literature have four or fewer wheels without actuated steering, and this work aims to reduce the gap in this area.
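The Ackermann steering condition referenced in the abstract requires every wheel's rotation axis to pass through a common instantaneous turning center so that the wheels roll without scrubbing. As a rough illustration only, the Python sketch below computes per-axle inner and outer steering angles for a generic 8x8 all-wheel-steering layout; the track width, axle offsets, and the assumption that the turning center lies on the lateral line through the vehicle's longitudinal midpoint are placeholders and do not reflect the SECV's actual geometry or control code.

```python
import math

# Illustrative Ackermann geometry for a generic 8x8 all-wheel-steering
# platform. All dimensions below are placeholders, NOT the SECV's actual
# geometry. The instantaneous turning center is assumed to lie on the
# lateral line through the vehicle's longitudinal midpoint, so the front
# axle pair steers opposite to the rear axle pair.

TRACK_WIDTH = 0.40                          # lateral wheel spacing (m), assumed
AXLE_OFFSETS = [0.45, 0.15, -0.15, -0.45]   # longitudinal distance of each axle
                                            # from the turning-center line (m), assumed

def ackermann_angles(turn_radius):
    """Per-axle (inner, outer) steering angles in radians.

    turn_radius: distance from the vehicle centerline to the turning
    center (m). Each wheel is steered so that its rotation axis passes
    through the common turning center, i.e. the Ackermann condition.
    """
    angles = []
    for x in AXLE_OFFSETS:
        inner = math.atan2(x, turn_radius - TRACK_WIDTH / 2.0)
        outer = math.atan2(x, turn_radius + TRACK_WIDTH / 2.0)
        angles.append((inner, outer))
    return angles

if __name__ == "__main__":
    for i, (inner, outer) in enumerate(ackermann_angles(turn_radius=2.0), start=1):
        print(f"axle {i}: inner {math.degrees(inner):6.2f} deg, "
              f"outer {math.degrees(outer):6.2f} deg")
```

With the sign convention above, the two rear axles counter-steer relative to the front pair, which is one common way an eight-wheel platform can tighten its turning radius; the paper's actual steering law may differ.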
Peiris, M., Tse, J., Lang, H., El-Gindy, M., & Kishawy, H. (2025). Vision Based Navigation System for 8x8 Scaled Combat Vehicle. The International Conference on Applied Mechanics and Mechanical Engineering, 22(22), 1-10. doi: 10.1088/1742-6596/3058/1/012004