Dataset generation for Visual SLAM and Machine Learning

by Adam Kalisz

Slides, Links: http://BCon19.Kalisz.co

Overview

  • Introduction
  • Visual SLAM
  • Sensor Data Fusion
  • Machine Learning
  • Conclusions for SciViz

Introduction

  • Adam Kalisz
  • Media Engineering (Bachelor)
  • Computer Science (Master)
  • BFCT (Blender Foundation Certified Trainer) since May 2016
  • PhD Student: Monocular Visual SLAM / Sensor data fusion

Our chair at the university

BCon17 Talk

Talk from 2017
See: https://www.youtube.com/watch?v=PMYET5LA-Kg

Visual SLAM

My research

Motion Tracking (libMV)

Motion Tracking
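
Blender's Motion Tracker follows small image patches (features) from frame to frame, with libmv doing the heavy lifting under the hood. As a rough illustration of the idea only (libmv has no Python API, so this uses OpenCV's KLT tracker as a stand-in; the file names are placeholders):

```python
# Illustration only, not Blender/libmv code: pyramidal Lucas-Kanade
# tracking of corner features between two consecutive frames.
import cv2

prev = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)  # placeholder paths
curr = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

# Pick corner-like patches that are easy to follow.
pts = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=10)

# Track them into the next frame; status flags mark successful tracks.
nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, pts, None)
tracks = [(p.ravel(), q.ravel()) for p, q, ok in zip(pts, nxt, status) if ok]
```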

The Indirect Method (libMV)

Damaged Downtown
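
The "indirect" in the slide title refers to how the optimization is posed. Roughly (the notation K, T_j, X_i and the Huber norm are mine, not from the slides): indirect methods like libmv first extract 2D feature tracks x_ij and minimize reprojection error, while direct methods such as DSO (see the papers below) minimize photometric error on the pixel intensities themselves:

```latex
% Indirect (feature-based): optimize poses T_j and 3D points X_i
% against the 2D feature tracks x_{ij} from the motion tracker.
E_{\mathrm{indirect}} = \sum_{i,j} \bigl\| \mathbf{x}_{ij} - \pi(\mathbf{K}\,\mathbf{T}_j\,\mathbf{X}_i) \bigr\|^2

% Direct (e.g. DSO): compare image intensities I directly,
% with a robust (Huber) norm instead of explicit feature matching.
E_{\mathrm{direct}} = \sum_{i,j} \bigl\| I_j\bigl(\pi(\mathbf{K}\,\mathbf{T}_j\,\mathbf{X}_i)\bigr) - I_i(\mathbf{x}_{ij}) \bigr\|_{\gamma}
```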

3D Reconstruction (libMV)

Damaged Downtown
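
Given feature tracks and solved camera poses, the 3D points are recovered by triangulation. A minimal sketch of the standard linear (DLT) two-view triangulation; this is the textbook method, not libmv's actual implementation:

```python
# Minimal sketch: linear two-view triangulation as used in
# reconstruction pipelines like libmv. P1, P2 are 3x4 camera
# projection matrices; x1, x2 are the same feature's pixel
# coordinates in both frames (from the tracker).
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Direct Linear Transform: solve A X = 0 for the 3D point X."""
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest
    # singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```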

Dataset in Blender

Damaged Downtown
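
The appeal of Blender here is that the renderer gives perfect ground truth for free. A minimal sketch of the idea, run from within Blender (the output paths and the pose file layout are placeholders of mine):

```python
# Sketch: render each animation frame and dump the ground-truth
# camera pose alongside it. Paths are placeholders.
import bpy

scene = bpy.context.scene
cam = scene.camera

with open("/tmp/groundtruth.txt", "w") as f:
    for frame in range(scene.frame_start, scene.frame_end + 1):
        scene.frame_set(frame)
        # Render the current frame to an image file.
        scene.render.filepath = f"/tmp/frames/{frame:04d}.png"
        bpy.ops.render.render(write_still=True)
        # Ground-truth pose: world-space translation + rotation quaternion.
        t = cam.matrix_world.translation
        q = cam.matrix_world.to_quaternion()
        f.write(f"{frame} {t.x} {t.y} {t.z} {q.x} {q.y} {q.z} {q.w}\n")
```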

Without camera rotation

Damaged Downtown. Paper: "Systematic Analysis of Direct Sparse Odometry", DICTA 2018

With camera rotation

Damaged Downtown. Paper: "Systematic Analysis of Direct Sparse Odometry", DICTA 2018

Sensor Data Fusion

Sensor Fusion in Blender

Paper: "B-SLAM-SIM: A novel approach to evaluate the fusion of Visual SLAM and GPS by example of Direct Sparse Odometry and Blender", VISAPP 2019

Visual SLAM and GPS Plots

Fusion. Paper: "B-SLAM-SIM: A novel approach to evaluate the fusion of Visual SLAM and GPS by example of Direct Sparse Odometry and Blender", VISAPP 2019
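
To plot a monocular SLAM trajectory against GPS at all, the SLAM track first has to be brought into the same frame, since monocular SLAM is only defined up to scale. A common tool is least-squares similarity (Umeyama) alignment; this sketch is my illustration, not necessarily the paper's exact procedure:

```python
# Sketch: align an up-to-scale SLAM trajectory to a reference
# (GPS or ground truth) with a least-squares similarity transform.
import numpy as np

def align_umeyama(slam, ref):
    """Find scale s, rotation R, translation t with ref ~ s * R @ slam + t.
    slam, ref: (N, 3) matched trajectory positions."""
    mu_s, mu_r = slam.mean(0), ref.mean(0)
    ss, rr = slam - mu_s, ref - mu_r
    U, D, Vt = np.linalg.svd(rr.T @ ss / len(slam))
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1  # avoid mirror solutions
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / ss.var(0).sum()
    t = mu_r - s * R @ mu_s
    return s, R, t
```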

Machine Learning

Dataset generator

Dataset generator output

Dataset Generator Output
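
On the consuming side, such generator output plugs straight into a training pipeline. A sketch assuming the frames/ plus groundtruth.txt layout from the earlier Blender snippet (the real generator's format may differ), written as a PyTorch dataset:

```python
# Sketch: load rendered frames with their ground-truth poses for
# training. Assumes the placeholder layout from the Blender snippet.
from pathlib import Path
import numpy as np
from PIL import Image
import torch
from torch.utils.data import Dataset

class SyntheticSlamDataset(Dataset):
    def __init__(self, root):
        self.root = Path(root)
        # One line per frame: frame tx ty tz qx qy qz qw
        self.poses = np.loadtxt(self.root / "groundtruth.txt")

    def __len__(self):
        return len(self.poses)

    def __getitem__(self, i):
        frame = int(self.poses[i, 0])
        img = Image.open(self.root / "frames" / f"{frame:04d}.png").convert("RGB")
        img = torch.from_numpy(np.asarray(img)).permute(2, 0, 1).float() / 255.0
        pose = torch.from_numpy(self.poses[i, 1:]).float()  # tx..qw
        return img, pose
```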

Conclusions

  • Think about bringing direct methods into Blender's Motion Tracker
  • Combine ideas from both rendering and computer vision
  • Use synthetic data to train and evaluate Artificial Intelligence

Blender User Group Nuremberg (NuremBUG)

NuremBUG Banner

https://www.nurembug.org/

Thank you!