Visual and Tactile 3D Point Cloud Data from Real Robots for Shape Modeling and Completion

Published: 17 March 2020 | Version 1 | DOI: 10.17632/ztkctgvgw6.1
Contributors:
Yasemin Bekiroglu, Gabriela Zarzar Gandler, Johannes Exner

Description

If you use this data, please cite "Y. Bekiroglu, M. Björkman, G. Zarzar Gandler, J. Exner, C. H. Ek, D. Kragic. Visual and Tactile 3D Point Cloud Data from Real Robots for Shape Modeling and Completion, Data in Brief (2020), https://doi.org/10.1016/j.dib.2020.105335". The data were used for shape completion and modeling via implicit surface representations and Gaussian-process regression in "G. Zarzar Gandler, C. H. Ek, M. Björkman, R. Stolkin, Y. Bekiroglu. Object shape estimation and modeling, based on sparse Gaussian process implicit surfaces, combining visual data and tactile exploration, Robotics and Autonomous Systems (2020), https://doi.org/10.1016/j.robot.2020.103433", and were also used partially in "M. Björkman, Y. Bekiroglu, V. Högman, D. Kragic. Enhancing visual perception of shape through tactile glances, IEEE/RSJ International Conference on Intelligent Robots and Systems (2013)".
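To illustrate the idea behind Gaussian-process implicit surfaces (GPIS) mentioned above, the following is a minimal 2D sketch in NumPy. It is a hypothetical illustration only, not the pipeline from the cited papers (which use sparse GPs on 3D visual and tactile point clouds): surface points are given implicit value 0, one interior point -1, and one exterior point +1, and a standard GP posterior mean interpolates the implicit function, whose zero level set estimates the shape.

```python
# Minimal GPIS sketch in 2D with NumPy (illustrative only; the cited work
# applies sparse GPs to real 3D visual/tactile point clouds).
import numpy as np

def rbf_kernel(A, B, lengthscale=0.4, variance=1.0):
    """Squared-exponential kernel between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Training data: points on the unit circle carry implicit value 0 (surface),
# one interior point carries -1, one exterior point carries +1.
theta = np.linspace(0, 2 * np.pi, 16, endpoint=False)
X = np.vstack([np.c_[np.cos(theta), np.sin(theta)],  # on-surface samples
               [[0.0, 0.0]],                         # inside
               [[2.0, 2.0]]])                        # outside
y = np.concatenate([np.zeros(16), [-1.0], [1.0]])

# GP regression posterior mean: f* = K(X*, X) (K + eps * I)^{-1} y
K = rbf_kernel(X, X) + 1e-6 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

def implicit_value(points):
    """Posterior mean of the implicit function; zero level set = surface."""
    return rbf_kernel(np.atleast_2d(points), X) @ alpha

print(implicit_value([[0.0, 0.0]]))  # negative (inside the shape)
print(implicit_value([[1.0, 0.0]]))  # near zero (on the surface)
```

In the cited work, tactile contact points and visual depth points both contribute on-surface observations, which is what makes the combined visual/tactile data in this set useful for shape completion.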

Institutions

Kungliga Tekniska Högskolan

Categories

Robotics, Tactile Perception, Depth Perception, Point Cloud
