Authors
Yew Cheong Hou, Khairul Salleh Mohamed Sahari*
Department of Mechanical Engineering, Universiti Tenaga Nasional, Selangor,
Malaysia
*Corresponding author. Email: [email protected]
Received 24 July 2018, Accepted 16 November 2018, Available Online 8 April
2019.
DOI
https://doi.org/10.2991/jrnal.k.190220.001
Keywords
Deformable object; robotic manipulation; computer vision; particle-based
model
Abstract
This work considers the problem of garment handling by a general household
robot, focusing on the classification and pose estimation of a hanging
garment during the unfolding procedure. Classification and pose estimation
of deformable objects such as garments are considered a challenging problem
in autonomous robotic manipulation because these objects come in different
sizes and can deform into different poses when manipulated. Hence, we
propose a self-generated synthetic dataset for classifying the category and
estimating the pose of a garment using a single manipulator. We approach
this problem by first constructing a garment mesh model of a piece of
garment crudely spread out on a flat platform using particle-based
modeling, from which parameters such as landmarks and robotic grasping
points can be estimated. The spread-out garment is then picked up by a
single robotic manipulator, and the 2D garment mesh model is simulated in a
3D virtual environment. A dataset of hanging garments is generated by
capturing depth images of real garments on the robotic platform and images
of the garment mesh model from offline simulation, respectively. The
synthetic dataset collected from simulation shows that the approach
performs well and is applicable to a variety of similar garments. Thus, the
category and pose recognition of the garment can be further developed.
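The abstract does not detail the particle-based model; as an illustration only, the following is a minimal mass-spring cloth sketch of the general idea: a grid of particles connected by springs is integrated with Verlet steps under gravity while one particle is held fixed, standing in for the grasp point of a hanging garment. All names and parameter values here are assumptions, not the authors' implementation.

```python
import numpy as np

def simulate_hanging_cloth(rows=10, cols=10, spacing=0.05,
                           steps=200, dt=0.01, pinned=(0, 0)):
    """Toy mass-spring cloth: Verlet integration plus constraint relaxation.

    One particle (`pinned`) is held fixed, mimicking a single-manipulator
    grasp of a hanging garment. Returns the final particle positions.
    """
    # Particles start on a flat grid (garment spread out on a platform).
    pos = np.zeros((rows, cols, 3))
    for i in range(rows):
        for j in range(cols):
            pos[i, j] = (j * spacing, 0.0, -i * spacing)
    prev = pos.copy()
    gravity = np.array([0.0, -9.81, 0.0])

    # Structural springs: each particle linked to its right and lower neighbour.
    springs = []
    for i in range(rows):
        for j in range(cols):
            if j + 1 < cols:
                springs.append(((i, j), (i, j + 1)))
            if i + 1 < rows:
                springs.append(((i, j), (i + 1, j)))

    for _ in range(steps):
        # Verlet step with mild velocity damping.
        new = pos + (pos - prev) * 0.99 + gravity * dt * dt
        prev, pos = pos, new
        # A few relaxation passes enforce the spring rest length `spacing`.
        for _ in range(5):
            for a, b in springs:
                d = pos[b] - pos[a]
                length = np.linalg.norm(d)
                if length == 0.0:
                    continue
                corr = d * (length - spacing) / length * 0.5
                pos[a] += corr
                pos[b] -= corr
            pos[pinned] = (0.0, 0.0, 0.0)  # grasped particle stays fixed
    return pos
```

In a pipeline like the one the abstract describes, the resulting particle positions could be rendered from a virtual depth camera to produce synthetic depth images of the hanging mesh; that rendering step is omitted here.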
Copyright
© 2019 The Authors. Published by ALife Robotics Corp. Ltd.
Open Access
This is an open access article distributed under the CC BY-NC 4.0 license
(http://creativecommons.org/licenses/by-nc/4.0/).