Wheel-terrain interaction plays a critical role in vehicle mobility on natural terrain, such as in agricultural, planetary-exploration and off-road settings. Estimating terrain characteristics and the way they affect traversability is essential for the vehicle to plan a safe and energy-efficient path. This work proposes a novel approach to learn and predict from a distance the motion resistance encountered by a robotic vehicle while traversing natural soil, using visual information from a stereovision device. To this end, terrain appearance and geometry information are first correlated with resistance torque measurements during a learning phase via two alternative regression approaches, namely Least-Squares Boosting and a Long Short-Term Memory recurrent neural network. This learned relationship is then exploited to predict motion resistance remotely, based on visual data only. Results of preliminary experimental tests on ploughed and compact terrain are presented to show the feasibility of the proposed method.
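As a rough illustration of the first regression approach mentioned above, the following sketch implements least-squares boosting from scratch with regression stumps: each weak learner is fitted to the residuals of the current ensemble. The visual features (e.g., roughness, color index, slope) and the linear ground-truth torque model are purely illustrative assumptions, not the paper's data.

```python
import numpy as np

def fit_stump(X, r):
    """Fit a one-split regression stump to residuals r by least squares."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[1:]:
            left = X[:, j] < t
            lm, rm = r[left].mean(), r[~left].mean()
            err = ((r[left] - lm) ** 2).sum() + ((r[~left] - rm) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, lm, rm)
    return best[1:]  # (feature index, threshold, left value, right value)

def predict_stump(stump, X):
    j, t, lm, rm = stump
    return np.where(X[:, j] < t, lm, rm)

def ls_boost(X, y, n_rounds=50, lr=0.3):
    """Least-squares boosting: each new stump fits the current residuals."""
    pred = np.zeros_like(y)
    stumps = []
    for _ in range(n_rounds):
        s = fit_stump(X, y - pred)
        stumps.append(s)
        pred += lr * predict_stump(s, X)
    return stumps

def boost_predict(stumps, X, lr=0.3):
    return lr * sum(predict_stump(s, X) for s in stumps)

rng = np.random.default_rng(0)
# Hypothetical visual terrain features: [roughness, color index, slope]
X_train = rng.uniform(0.0, 1.0, size=(200, 3))
# Assumed ground truth: resistance torque grows with roughness and slope
y_train = 5.0 * X_train[:, 0] + 2.0 * X_train[:, 2] + rng.normal(0, 0.1, 200)

stumps = ls_boost(X_train, y_train)
preds = boost_predict(stumps, X_train)
rmse = float(np.sqrt(((preds - y_train) ** 2).mean()))
print(f"training RMSE: {rmse:.3f}")
```

In the paper's setting the inputs would be appearance and geometry descriptors extracted from stereo imagery, and the targets the measured resistance torques; the boosting mechanism itself is unchanged.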
In this paper, an approach for estimating the relative positions and orientations of multiple rigidly coupled RGB-D sensors is presented. The proposed methodology imposes no constraint on the relative pose among the cameras other than rigidity: the relative poses do not change over time. We assume that each camera captures an individual sequence, which is afterward processed by a Simultaneous Localization And Mapping (SLAM) algorithm, yielding the positions and orientations describing the trajectory followed by each sensor. Several simulations have been run on a two-camera set with known input trajectories, affected by increasing levels of white noise. The results demonstrate the algorithm's capability to estimate the relative translations and rotations between the two cameras. Furthermore, experimental laboratory tests corroborate the effectiveness of the proposed method.
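The rigidity assumption above can be sketched in a simplified, noiseless form: if camera 2's world pose at frame k is T2_k = T1_k @ X for a constant offset X, then inv(T1_k) @ T2_k is the same at every frame, and X can be recovered by averaging the per-frame estimates (mean translation; rotation projected back onto SO(3) via SVD). The trajectories and offset below are synthetic placeholders, and this averaging scheme is an illustrative simplification, not the paper's estimator.

```python
import numpy as np

def rot_z(theta):
    """Homogeneous 4x4 transform: rotation about z by theta."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

# Illustrative fixed rigid offset X between the two cameras
X_true = rot_z(0.4)
X_true[:3, 3] = [0.10, 0.05, 0.0]

# Simulated SLAM trajectories: camera 2 pose = camera 1 pose @ X
traj1 = []
for k in range(20):
    T = rot_z(0.1 * k)
    T[:3, 3] = [0.2 * k, 0.1 * k, 0.0]
    traj1.append(T)
traj2 = [T @ X_true for T in traj1]

# Rigidity: inv(T1_k) @ T2_k is constant across frames
Xs = [np.linalg.inv(T1) @ T2 for T1, T2 in zip(traj1, traj2)]

# Average the per-frame estimates; project the mean rotation onto SO(3)
R_mean = np.mean([Xk[:3, :3] for Xk in Xs], axis=0)
U, _, Vt = np.linalg.svd(R_mean)
R_est = U @ Vt
t_est = np.mean([Xk[:3, 3] for Xk in Xs], axis=0)
```

With noisy trajectories, as in the paper's simulations, a robust estimator (e.g., proper rotation averaging over all frames) would replace the simple mean used here.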