AURO 2018: Autonomous Flight with Robust Visual Odometry under Dynamic Lighting Conditions

Sensitivity to lighting conditions poses a challenge when using visual odometry (VO) for autonomous navigation of small aerial vehicles in various applications. We present an illumination-robust direct visual odometry method for stable autonomous flight of an aerial robot under unpredictable lighting conditions. The proposed stereo VO achieves robustness to light-changing environments by employing a patch-based affine illumination model to compensate for abrupt, irregular illumination c...
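
A minimal sketch of how a patch-based affine illumination model can enter a direct-VO photometric cost, assuming per-patch gain/offset parameters (a_k, b_k) and a hypothetical `warp` helper that reprojects reference pixels under the candidate camera motion (no interpolation or robust loss, unlike a full implementation):

```python
import numpy as np

def photometric_cost(I_ref, I_cur, patches, warp, params):
    """Direct-VO photometric cost with a per-patch affine illumination model.

    For each patch k with affine gain/offset (a_k, b_k), the residual is
        r = I_cur(warp(u, v)) - (a_k * I_ref(v, u) + b_k),
    so abrupt, local lighting changes are absorbed by (a_k, b_k) instead of
    corrupting the motion estimate.
    """
    cost = 0.0
    for k, pixels in enumerate(patches):       # pixels: iterable of (u, v)
        a_k, b_k = params[k]
        for (u, v) in pixels:
            u2, v2 = warp(u, v)                # reproject into current frame
            r = I_cur[int(v2), int(u2)] - (a_k * I_ref[v, u] + b_k)
            cost += r * r
    return cost
```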

ECCV 2018: Linear RGB-D SLAM for Planar Environments

We propose a new formulation that incorporates orthogonal planar features as a global model into a linear SLAM approach based on sequential Bayesian filtering. Previous planar SLAM algorithms estimate the camera poses and multiple landmark planes via pose-graph optimization. However, since this is formulated as a high-dimensional nonlinear optimization problem, there is no guarantee the algorithm will converge to the global optimum. To overcome these limitations, we present a new SLAM method that ...
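
To illustrate why the formulation can be linear, here is a small Kalman-filter update sketch under an assumed state layout (camera position p plus per-plane offsets d_1..d_M): once each Manhattan plane's axis normal n_j is fixed, the measured camera-to-plane distance z = d_j - n_j · p is linear in the state, so no nonlinear optimization is needed. This is an illustrative formulation, not the paper's exact filter:

```python
import numpy as np

def kf_update(mu, P, z, j, n_j, sigma2):
    """Linear Kalman update for one plane-distance measurement.

    mu: state [p (3,), d_0..d_{M-1}];  P: covariance;  z: measured distance
    to plane j along its known Manhattan-axis normal n_j (3,).
    """
    M = mu.size
    H = np.zeros((1, M))
    H[0, :3] = -n_j                      # d z / d p
    H[0, 3 + j] = 1.0                    # d z / d d_j
    S = H @ P @ H.T + sigma2             # innovation covariance
    K = (P @ H.T) / S                    # Kalman gain
    r = z - (mu[3 + j] - n_j @ mu[:3])   # innovation
    mu = mu + (K * r).ravel()
    P = (np.eye(M) - K @ H) @ P
    return mu, P
```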

IJCAS 2018: Visual Inertial Odometry with Pentafocal Geometric Constraints

We present a sliding-window monocular visual-inertial odometry method that is accurate and robust to outliers by employing a new observation model grounded in pentafocal geometric constraints. Previous approaches depend on the unknown 3D coordinates of features to estimate the ego-motion. However, inaccurate 3D feature positions can lead to poor motion estimation. To overcome these limitations, we utilize the pentafocal geometric relationship between fiv...
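
The full pentafocal constraint spans five views; as a minimal depth-free analog (explicitly not the paper's observation model), the familiar two-view epipolar constraint below shows the key property such constraints share: the residual involves only the relative pose and the image observations, never the feature's 3D coordinates:

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]_x such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def epipolar_residual(x1, x2, R, t):
    """Depth-free residual x2^T E x1 with E = [t]_x R.

    x1, x2: normalized image coordinates (2,) of one feature in two views;
    R, t: relative pose mapping view-1 points into view 2.
    """
    E = skew(t) @ R
    x1h = np.append(x1, 1.0)
    x2h = np.append(x2, 1.0)
    return x2h @ E @ x1h
```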

CVPR 2018: Indoor RGB-D Compass from a Single Line and Plane

We propose a novel approach to estimate the three degrees of freedom (DoF) drift-free rotational motion of an RGB-D camera from only a single line and plane in the Manhattan world (MW). Previous approaches exploit structural regularities to achieve accurate 3-DoF rotation estimation by using the distribution of surface normal vectors and points at infinity, i.e., vanishing points (VPs). However, they require multiple orthogonal planes or many consistent lines to be visible throughout t...
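
A simplified sketch of the underlying geometry, assuming the observed plane normal and line direction correspond to two orthogonal Manhattan axes (the paper's full method tracks rotation on SO(3); this only shows how one line plus one plane pins down a full rotation):

```python
import numpy as np

def rotation_from_plane_and_line(n, d):
    """Build a Manhattan-frame rotation from one plane normal n and one
    line direction d (unit 3-vectors in camera coordinates)."""
    r3 = n / np.linalg.norm(n)
    d_perp = d - (d @ r3) * r3           # project line direction off the normal
    r1 = d_perp / np.linalg.norm(d_perp)
    r2 = np.cross(r3, r1)                # complete the right-handed frame
    return np.column_stack([r1, r2, r3]) # Manhattan axes as columns
```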

ICRA 2018: Low-Drift Visual Odometry in Structured Environments by Decoupling Rotational and Translational Motion

We present a low-drift visual odometry algorithm that separately estimates rotational and translational motion from lines, planes, and points found in RGB-D images. Previous methods estimate drift-free rotational motion from structural regularities to reduce drift in the rotation estimate, which is the primary source of positioning inaccuracy in visual odometry. However, multiple orthogonal planes must remain visible throughout the entire motion estimation process; otherwise, these VO a...
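
The payoff of decoupling is that once a drift-free rotation R is fixed, the translation problem becomes linear with a closed-form answer. A sketch, assuming matched 3D points from RGB-D depth (illustrative of the decoupling idea, not the paper's full point/line/plane pipeline):

```python
import numpy as np

def translation_given_rotation(P_ref, P_cur, R):
    """Closed-form translation with rotation fixed.

    Minimizing sum_i || R p_i + t - q_i ||^2 over t alone gives
    t* = mean(q_i - R p_i).  P_ref, P_cur: (N, 3) matched 3D points.
    """
    return np.mean(P_cur - P_ref @ R.T, axis=0)
```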

BMVC 2017: Visual Odometry with Drift-Free Rotation Estimation Using Indoor Scene Regularities

We propose a hybrid visual odometry algorithm that achieves accurate, low-drift state estimation by separately estimating the rotational and translational camera motion. Previous methods usually estimate the six-DoF camera motion jointly, without distinguishing between rotational and translational motion. However, inaccuracy in the rotation estimate is a primary source of drift in visual odometry. We design a hybrid visual odometry algorithm which separately estimates the rotational a...
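
One common way to get a drift-free rotation from indoor scene regularities is to align measured surface normals with the Manhattan axes; the Kabsch/SVD sketch below illustrates this, though the paper's exact estimator may differ:

```python
import numpy as np

def manhattan_rotation_from_normals(normals, R_prev):
    """Rotation from aligning surface normals to Manhattan axes.

    normals: (N, 3) unit normals in the camera frame.
    R_prev:  previous world-to-camera rotation, used for data association.
    """
    axes = np.vstack([np.eye(3), -np.eye(3)])   # +-x, +-y, +-z world axes
    # associate each normal with the closest predicted Manhattan direction
    predicted = (R_prev @ axes.T).T             # axes as seen in camera frame
    targets = axes[np.argmax(normals @ predicted.T, axis=1)]
    # Kabsch/SVD: find R maximizing sum_i n_i^T R a_i
    H = targets.T @ normals                     # H = sum_i a_i n_i^T
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    return Vt.T @ D @ U.T                       # world-to-camera rotation
```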

ICRA 2017: Robust Visual Localization in Changing Lighting Conditions

We present an illumination-robust visual localization algorithm for Astrobee, a free-flying robot designed to autonomously navigate on the International Space Station (ISS). Astrobee localizes with a monocular camera and a prebuilt sparse map composed of natural visual features. Astrobee must perform tasks not only during the day but also at night, when the ISS lights are dimmed. However, localization performance degrades when the observed lighting conditions differ from the conditions wh...

IROS 2015: Robust Visual Odometry to Irregular Illumination Changes with RGB-D camera

Sensitivity to illumination conditions poses a challenge when utilizing visual odometry (VO) in various applications. To make VO robust to illumination conditions, they must be modeled explicitly. In this paper, we propose a direct visual odometry method that handles illumination changes by using an affine illumination model to compensate for abrupt, local light variations during the direct motion estimation process. The core of our proposed method is to estimate the rela...
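
For a given pair of corresponding patches, the affine model I_cur ≈ a · I_ref + b has a closed-form least-squares fit; a minimal sketch (function name illustrative):

```python
import numpy as np

def fit_affine_illumination(patch_ref, patch_cur):
    """Least-squares gain/offset (a, b) with I_cur ~ a * I_ref + b."""
    x = patch_ref.ravel().astype(np.float64)
    y = patch_cur.ravel().astype(np.float64)
    A = np.stack([x, np.ones_like(x)], axis=1)   # columns: [I_ref, 1]
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    return a, b
```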

SMC 2014: 6-DoF Velocity Estimation Using RGB-D Camera Based on Optical Flow

In this paper, we propose a new 6-DoF velocity estimation algorithm using RGB and depth images. Autonomous control of mobile robots requires velocity information. There is extensive research on estimating and measuring velocity; however, approaches based on vision sensors and depth images remain underexplored. In this work, we propose an algorithm for velocity estimation with an RGB-D sensor based on the image Jacobian (interaction) matrix commonly used in image-based visual servoing. We validate ...
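
A sketch of the idea using the classic point-feature interaction matrix from image-based visual servoing, which maps 6-DoF camera velocity to image-plane feature velocity; depth Z comes from the RGB-D sensor, and stacking the equations for several features gives a least-squares estimate (helper names illustrative):

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """2x6 image Jacobian of a point at normalized coordinates (x, y), depth Z.

    Maps camera velocity (vx, vy, vz, wx, wy, wz) to the feature's
    image velocity (classic IBVS formulation).
    """
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def estimate_velocity(points, depths, flows):
    """Least-squares 6-DoF camera velocity from optical flow and depth."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    s_dot = np.concatenate([np.asarray(f, dtype=float) for f in flows])
    v, *_ = np.linalg.lstsq(L, s_dot, rcond=None)
    return v  # (vx, vy, vz, wx, wy, wz)
```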