TraDaG 1.0
Training Data Generator
Documentation

TraDaG (short for Training Data Generator) is a framework for dropping objects into scenes that are given only by an RGB image, a depth image, and a labeling of the scene contents in the form of a label image. The goal is to insert objects into these scenes in a physically plausible way, i.e. without ending up with floating objects or impossible resting poses where an object would normally tilt over. This is achieved by using a physics engine to drop the objects onto a plane extracted from the depth and label images. The resulting scene, with the objects inside, can then be rendered into new RGB and depth images. In addition, the framework provides the following information for each object:
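Conceptually, the plane extraction mentioned above amounts to robustly fitting a plane model to the 3D points that carry a given label. The following is only a minimal sketch of such a RANSAC-style plane fit, not TraDaG's actual implementation; all names and parameters are illustrative assumptions.

```cpp
#include <cmath>
#include <cstdlib>
#include <vector>

// Minimal 3D vector type for this sketch.
struct Vec3 { double x, y, z; };

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Plane in Hessian normal form: dot(n, p) == d for points p on the plane.
struct Plane { Vec3 n; double d; };

// RANSAC-style plane fit (illustrative): repeatedly sample three points,
// build the plane through them, and keep the plane supported by the most
// inliers. Assumes pts.size() >= 3; inlierDist is the point-to-plane
// distance threshold.
Plane fitPlaneRansac(const std::vector<Vec3>& pts, double inlierDist, int iterations) {
    Plane best{{0.0, 0.0, 1.0}, 0.0};
    std::size_t bestInliers = 0;
    for (int it = 0; it < iterations; ++it) {
        std::size_t i = std::rand() % pts.size();
        std::size_t j = std::rand() % pts.size();
        std::size_t k = std::rand() % pts.size();
        if (i == j || j == k || i == k) continue;   // need three distinct points
        Vec3 n = cross(sub(pts[j], pts[i]), sub(pts[k], pts[i]));
        double len = std::sqrt(dot(n, n));
        if (len < 1e-9) continue;                   // degenerate (collinear) sample
        n = {n.x / len, n.y / len, n.z / len};
        double d = dot(n, pts[i]);
        std::size_t inliers = 0;
        for (const Vec3& p : pts)
            if (std::fabs(dot(n, p) - d) < inlierDist) ++inliers;
        if (inliers > bestInliers) { bestInliers = inliers; best = {n, d}; }
    }
    return best;
}
```

In a pipeline like TraDaG's, the candidate points would come from the depth-image pixels carrying the label of interest (e.g. "table"), back-projected into 3D using the camera parameters.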

The rendered images and the additional information can then be used in whatever way the user wishes. The main purpose, at the time of writing, is to provide physically plausible training and testing data for machine learning algorithms that are meant to robustly recognize objects in images and, optionally, estimate the 6D pose (rotation and translation) of each object.

The framework allows fine-grained control over the whole process and can easily be used with large data sets of RGB-D scenes. It supports reading in whole directories of RGB, depth, and label images and searching them for scenes with a suitable ground plane (e.g. scenes containing a table viewed by the camera from a bird's-eye view).
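One way to picture the "suitable ground plane" test: after a plane has been fitted for a labeled region, a scene qualifies if the plane's normal points in a desired direction within some angular tolerance. The sketch below illustrates that idea only; the function name and tolerance semantics are assumptions, not TraDaG's actual API.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// Returns true if the angle between the fitted plane normal and the desired
// orientation is at most maxAngleDeg. The absolute dot product is used so
// that a flipped normal (pointing "down" instead of "up") still matches.
bool planeOrientationMatches(Vec3 normal, Vec3 desired, double maxAngleDeg) {
    const double pi = std::acos(-1.0);
    double nLen = std::sqrt(normal.x * normal.x + normal.y * normal.y + normal.z * normal.z);
    double dLen = std::sqrt(desired.x * desired.x + desired.y * desired.y + desired.z * desired.z);
    double cosAngle = std::fabs(normal.x * desired.x + normal.y * desired.y + normal.z * desired.z)
                      / (nLen * dLen);
    if (cosAngle > 1.0) cosAngle = 1.0;  // guard against floating-point rounding
    double angleDeg = std::acos(cosAngle) * 180.0 / pi;
    return angleDeg <= maxAngleDeg;
}
```

For the bird's-eye table example, `desired` would be the "up" direction in camera coordinates, and scenes whose table plane is tilted more than, say, 15 degrees would be skipped.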


Below is a list of the features that TraDaG provides. It is not exhaustive, but it should give a good overview of the framework's capabilities.

For more detailed information, read the Getting Started chapter and consult the API reference pages of this documentation.

Further Reading


TraDaG was written in 2015/2016 by Julian Harttung for the Computer Vision Lab Dresden (CVLD). It is part of a research project on 6D pose estimation at the Faculty of Computer Science of the Dresden University of Technology.


Not yet determined.