Open-Source VINS Codebases
rpng / OpenVINS - Link
rpng / R-VIO - Link
ethz-asl / okvis - Link
ethz-asl / maplab - Link
TUM / basalt - Link
HKUST-Aerial-Robotics / VINS-Fusion - Link
HKUST-Aerial-Robotics / VINS-Mono - Link
MIT-SPARK / Kimera-VIO - Link
ucla-vision / xivo - Link
KumarRobotics / msckf_vio - Link
Continuous-Time Trajectory Estimation
Unified temporal and spatial calibration for multi-sensor systems - Link
Spline Fusion: A continuous-time representation for visual-inertial fusion with application to rolling shutter cameras - Link
Continuous-time visual-inertial odometry for event cameras - Link
Efficient Visual-Inertial Navigation using a Rolling-Shutter Camera with Inaccurate Timestamps - Link
Alternating-stereo VINS: Observability analysis and performance evaluation - Link
Multi-camera visual-inertial navigation with online intrinsic and extrinsic calibration - Link
Decoupled Representation of the Error and Trajectory Estimates for Efficient Pose Estimation - Link
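
To make the core idea behind these references concrete, here is a minimal sketch of cumulative cubic B-spline pose interpolation on SE(3), the continuous-time trajectory representation popularized by Spline Fusion. The control poses, function names, and the use of scipy's matrix exp/log are my own illustrative choices, not code from any of the papers above.

```python
import numpy as np
from scipy.linalg import expm, logm

# Cumulative basis matrix of a uniform cubic B-spline (as in Spline Fusion).
C = (1.0 / 6.0) * np.array([[6, 0, 0, 0],
                            [5, 3, -3, 1],
                            [1, 3, 3, -2],
                            [0, 0, 0, 1]], dtype=float)

def spline_pose(T_ctrl, i, u):
    """Pose at normalized time u in [0,1) within segment i.

    T_ctrl : list of 4x4 SE(3) control poses (needs T_ctrl[i-1] .. T_ctrl[i+2]).
    """
    B = C @ np.array([1.0, u, u**2, u**3])      # cumulative basis values
    T = T_ctrl[i - 1].copy()
    for j in range(1, 4):
        # Relative twist between consecutive control poses, scaled by the basis.
        Omega = np.real(logm(np.linalg.inv(T_ctrl[i + j - 2]) @ T_ctrl[i + j - 1]))
        T = T @ expm(B[j] * Omega)
    return T

# Toy example: four control poses translating along x with a slight yaw.
def pose(x, yaw):
    T = np.eye(4)
    T[:3, :3] = [[np.cos(yaw), -np.sin(yaw), 0],
                 [np.sin(yaw),  np.cos(yaw), 0],
                 [0, 0, 1]]
    T[0, 3] = x
    return T

ctrl = [pose(0, 0.0), pose(1, 0.1), pose(2, 0.2), pose(3, 0.3)]
print(spline_pose(ctrl, 1, 0.5))   # interpolated pose mid-segment
```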
Machine Learning Uncertainty
Uncertainty in Deep Learning - Link
Geometry and uncertainty in deep learning for computer vision - Link
Modelling uncertainty in deep learning for camera relocalization - Link
Dropout as a Bayesian approximation: Representing model uncertainty in deep learning - Link
What uncertainties do we need in Bayesian deep learning for computer vision? - Link
Multi-task learning using uncertainty to weigh losses for scene geometry and semantics - Link
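
A minimal sketch of the Monte Carlo dropout idea from "Dropout as a Bayesian approximation": keep dropout active at test time and treat repeated stochastic forward passes as approximate posterior samples, whose spread estimates epistemic uncertainty. The tiny untrained network and all names below are illustrative assumptions, not code from the papers above.

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 1),
)

def mc_dropout_predict(model, x, n_samples=50):
    """Predictive mean and variance from stochastic forward passes."""
    model.train()                  # keep dropout layers sampling at "test" time
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])  # (T, N, 1)
    return samples.mean(dim=0), samples.var(dim=0)

x = torch.linspace(-3, 3, 100).unsqueeze(1)
mean, var = mc_dropout_predict(model, x)
print(mean.shape, var.shape)       # per-point predictive mean and variance
```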
Resource-Constrained Extended Kalman Filtering
A provably consistent method for imposing sparsity in feature-based SLAM information filters - Link
Optimization-based estimator design for vision-aided inertial navigation - Link
Vision-aided inertial navigation for resource-constrained systems - Link
Power-SLAM: a linear-complexity, anytime algorithm for SLAM - Link
A resource-aware vision-aided inertial navigation system for wearable and portable computers - Link
An iterative Kalman smoother for robust 3D localization and mapping - Link
Inverse Schmidt Estimators - Link
Consistent map-based 3D localization on mobile devices - Link
RISE-SLAM: A Resource-aware Inverse Schmidt Estimator for SLAM - Link
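
A minimal sketch of the Schmidt-Kalman ("consider") update that the Schmidt-style estimators above build on: nuisance states get zero gain, so their estimates are never corrected, but their cross-correlations are still tracked (via the Joseph form) so the filter stays consistent. The block sizes, names, and toy measurement model are assumptions for illustration only.

```python
import numpy as np

def schmidt_update(x, P, r, H, R, n_active):
    """EKF update that only corrects the first n_active states.

    x : state (n,), P : covariance (n,n), r : residual (m,),
    H : measurement Jacobian (m,n), R : measurement noise covariance (m,m).
    """
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # optimal Kalman gain
    K[n_active:, :] = 0.0                    # Schmidt: zero the nuisance gain
    x = x + K @ r
    I_KH = np.eye(len(x)) - K @ H
    # Joseph form keeps the covariance valid for the (suboptimal) zeroed gain.
    P = I_KH @ P @ I_KH.T + K @ R @ K.T
    return x, P

# Toy example: 3 active states observed directly, 2 nuisance (e.g. map) states.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
P = A @ A.T + 5 * np.eye(5)
x = np.zeros(5)
H = np.hstack([np.eye(3), 0.1 * np.ones((3, 2))])
x_new, P_new = schmidt_update(x, P, r=np.ones(3), H=H, R=0.01 * np.eye(3), n_active=3)
print(x_new)                                 # nuisance entries stay exactly zero
```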
Event-based Cameras
Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera - Link
Continuous-Time Trajectory Estimation for Event-based Vision Sensors - Link
Continuous-Time Visual-Inertial Trajectory Estimation with Event Cameras - Link
EVO: A Geometric Approach to Event-Based 6-DOF Parallel Tracking and Mapping in Real-time - Link
Event-based Visual Inertial Odometry - Link
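
A minimal sketch of how asynchronous event-camera output is commonly handled before estimation: events (t, x, y, polarity) over a short window are collected into an image-like representation, here a signed count image and a per-pixel time surface. The resolution, window length, and randomly generated events are illustrative assumptions, not data or code from the papers above.

```python
import numpy as np

H, W = 180, 240                                  # e.g. a DAVIS240-like resolution
rng = np.random.default_rng(1)
n = 10000
events = np.column_stack([
    np.sort(rng.uniform(0.0, 0.05, n)),          # timestamps [s]
    rng.integers(0, W, n),                       # pixel x
    rng.integers(0, H, n),                       # pixel y
    rng.choice([-1.0, 1.0], n),                  # polarity of brightness change
])

def accumulate(events, t0, t1):
    """Signed event-count image and time surface for events in [t0, t1)."""
    count = np.zeros((H, W))
    tsurf = np.zeros((H, W))
    mask = (events[:, 0] >= t0) & (events[:, 0] < t1)
    for t, x, y, p in events[mask]:
        count[int(y), int(x)] += p               # accumulate signed changes
        tsurf[int(y), int(x)] = t                # most recent event time per pixel
    return count, tsurf

count, tsurf = accumulate(events, 0.0, 0.01)
print(count.sum(), tsurf.max())
```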
Rolling Shutter Cameras
Vision-aided inertial navigation with rolling-shutter cameras - Link
Efficient Visual-Inertial Navigation using a Rolling-Shutter Camera with Inaccurate Timestamps - Link
Real-time Motion Tracking on a Cellphone using Inertial Sensing and a Rolling-Shutter Camera - Link
3-D Motion Estimation and Online Temporal Calibration for Camera-IMU Systems - Link
High-fidelity Sensor Modeling and Self-Calibration in Vision-aided Inertial Navigation - Link
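
A minimal sketch of the rolling-shutter timing model these papers compensate for: each image row is read out at a slightly different time, so the camera pose must be evaluated per row (here with simple linear interpolation of position). The readout time, time offset, and motion values are made up for illustration.

```python
import numpy as np

def row_time(t_frame, row, num_rows, readout_time, t_offset=0.0):
    """Capture time of a row: frame time + per-row readout + camera-IMU offset."""
    return t_frame + t_offset + readout_time * (row / num_rows)

def interp_position(t, t0, p0, t1, p1):
    """Linear interpolation of camera position between two known poses."""
    a = (t - t0) / (t1 - t0)
    return (1.0 - a) * p0 + a * p1

# Example: 480-row image, 30 ms readout, camera moving at 1 m/s along x.
t0, t1 = 0.0, 0.1
p0, p1 = np.array([0.0, 0.0, 0.0]), np.array([0.1, 0.0, 0.0])
for row in (0, 240, 479):
    t_r = row_time(t_frame=0.04, row=row, num_rows=480, readout_time=0.03)
    print(row, t_r, interp_position(t_r, t0, p0, t1, p1))
```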
Visual-Inertial Navigation Systems (VINS)
A Robust and Modular Multi-Sensor Fusion Approach Applied to MAV Navigation - Link
Determining the Time Delay Between Inertial and Visual Sensor Measurements - Link
A Multi-State Constraint Kalman Filter for Vision-aided Inertial Navigation - Link
Vision-Aided Inertial Navigation for Spacecraft Entry, Descent, and Landing - Link
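
A minimal sketch of the state-augmentation ("stochastic cloning") step at the core of the MSCKF: when an image arrives, the current pose is cloned into the state and the covariance is augmented using the Jacobian of the clone with respect to the existing state. The dimensions and the identity-selection Jacobian below are illustrative assumptions, not the actual MSCKF state definition.

```python
import numpy as np

def augment(x, P, clone_idx):
    """Append a copy of x[clone_idx] to the state and augment P accordingly."""
    J = np.zeros((len(clone_idx), len(x)))
    J[np.arange(len(clone_idx)), clone_idx] = 1.0        # clone = J @ x
    x_aug = np.concatenate([x, x[clone_idx]])
    P_aug = np.block([[P,           P @ J.T],
                      [J @ P,   J @ P @ J.T]])           # standard cloning formula
    return x_aug, P_aug

# Toy example: 5-dim "IMU" state, clone the first 3 entries as a "camera pose".
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
P = A @ A.T
x = rng.standard_normal(5)
x_aug, P_aug = augment(x, P, clone_idx=np.array([0, 1, 2]))
print(x_aug.shape, P_aug.shape)                          # (8,), (8, 8)
```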
ORB-SLAM
ORB: an efficient alternative to SIFT or SURF - Link
Bags of Binary Words for Fast Place Recognition in Image Sequences - Link
ORB-SLAM: Tracking and Mapping Recognizable Features - Link
ORB-SLAM: a Versatile and Accurate Monocular SLAM System - Link
Parallel Tracking and Mapping for Small AR Workspaces - Link
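
A minimal sketch of the ORB front end that ORB-SLAM is built on, using OpenCV: detect ORB keypoints, compute their binary descriptors, and match them with brute-force Hamming matching. The synthetic image pair is an illustrative stand-in for two real overlapping frames.

```python
import cv2
import numpy as np

# Two synthetic images: random texture and a horizontally shifted copy of it.
rng = np.random.default_rng(3)
img1 = rng.integers(0, 255, (480, 640), dtype=np.uint8)
img2 = np.roll(img1, 20, axis=1)                 # simulate a small camera motion

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Hamming distance is the appropriate metric for ORB's binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(len(kp1), len(kp2), len(matches), "matches")
```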