Support Surface Prediction in Indoor Scenes

Ruiqi Guo, Derek Hoiem; Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2013, pp. 2144-2151

Abstract
In this paper, we present an approach to predict the extent and height of supporting surfaces such as tables, chairs, and cabinet tops from a single RGBD image. We define support surfaces to be horizontal, planar surfaces that can physically support objects and humans. Given an RGBD image, our goal is to localize the height and full extent of such surfaces in 3D space. To achieve this, we created a labeling tool and annotated the 1449 images of the NYU dataset with rich, complete 3D scene models. We extract ground truth from the annotated dataset and develop a pipeline for predicting the floor space, the walls, and the height and full extent of support surfaces. Finally, we match the predicted extents against annotated training scenes and transfer their support surface configurations. We evaluate the proposed approach on our dataset and demonstrate its effectiveness in understanding scenes in 3D space.
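The abstract's definition of support surfaces (horizontal, planar, able to support objects) can be illustrated with a minimal NumPy sketch that bins roughly horizontal points from an RGBD-derived point cloud by height; peaks in the histogram suggest candidate support-surface heights. This is purely an illustrative sketch, not the authors' pipeline, and all function and parameter names (`find_support_heights`, `normal_tol`, `height_bin`) are hypothetical.

```python
import numpy as np

def find_support_heights(points, normals, up=(0.0, 0.0, 1.0),
                         normal_tol=0.9, height_bin=0.05):
    """Illustrative sketch: group roughly horizontal points by height.

    points  : (N, 3) array of 3D points from an RGBD frame
    normals : (N, 3) unit surface normals at those points
    Returns candidate support-surface heights (bin centers), ordered by
    how many horizontal points fall in each height bin.
    """
    up = np.asarray(up, dtype=float)
    # keep points whose normal is nearly parallel to the up direction,
    # i.e. points lying on horizontal surfaces (floor, table tops, ...)
    horizontal = np.abs(normals @ up) > normal_tol
    heights = points[horizontal, 2]
    if heights.size == 0:
        return []
    # quantize heights into bins; well-populated bins suggest a
    # horizontal planar surface at that height
    bins = np.round(heights / height_bin).astype(int)
    ids, counts = np.unique(bins, return_counts=True)
    order = np.argsort(-counts)
    return [ids[i] * height_bin for i in order]
```

For example, a point cloud containing a floor plane at height 0 and a larger table top at 0.7 m would yield 0.7 as the top-ranked candidate height, with 0.0 second; the full pipeline in the paper additionally predicts each surface's extent and handles occluded regions, which this sketch does not attempt.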

Related Material
[pdf]
[bibtex]
@InProceedings{Guo_2013_ICCV,
author = {Guo, Ruiqi and Hoiem, Derek},
title = {Support Surface Prediction in Indoor Scenes},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
month = {December},
year = {2013}
}