Please use this identifier to cite or link to this item: https://repositorio.ufba.br/handle/ri/39253
metadata.dc.type: Tese
Title: Grasping and identifying objects in unstructured environments with deep learning methods.
metadata.dc.creator: Viturino, Caio Cristiano Barros
metadata.dc.contributor.advisor1: Conceição, André Gustavo Scolari
metadata.dc.contributor.referee1: Conceição, André Gustavo Scolari
metadata.dc.contributor.referee2: Ribeiro, Tiago Trindade
metadata.dc.contributor.referee3: Simas Filho, Eduardo Furtado de
metadata.dc.contributor.referee4: Grassi Junior, Valdir
metadata.dc.contributor.referee5: Santos, Eduardo Telmo Fonseca
metadata.dc.description.resumo: In recent years, robotic grasping methods based on deep learning have outperformed traditional methods. However, most of these methods use planar grasps due to the high computational cost associated with 6D grasps. Planar grasps, despite having a lower computational cost, have spatial limitations that restrict their applicability in complex environments, such as grasping objects inside 3D printers. Some robotic grasping techniques generate only one viable grasp per object. However, it is essential to obtain multiple possible grasps per object, as not all generated grasps have a viable kinematic solution or avoid collisions with nearby obstacles. To overcome these limitations, a robotic grasping method is proposed that is capable of generating multiple selective 6D grasps per object while avoiding collisions with adjacent obstacles. Grasping tests were carried out in an Additive Manufacturing Unit, which presents a high level of complexity due to the possibility of collisions between the end effector and the interior of the printer. Experimental results indicate that a success rate of 62% is achievable in the 6D grasping of manufactured parts in confined environments. In addition, a success rate of 68% and a throughput of 177 MPPH (Mean Picks Per Hour) were achieved in the selective planar grasping of objects positioned on flat surfaces. The UR5 robotic arm, the Intel RealSense D435 camera, and the Robotiq 2F-140 end effector were used to validate the proposed method in real experiments.
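The abstract describes a pipeline in which multiple candidate grasps are generated per object and then filtered, since not every candidate has a viable kinematic solution or avoids collisions with nearby obstacles. A minimal sketch of that selection step is shown below; it is not the thesis code, and `solve_ik` and `in_collision` are hypothetical stand-ins for an inverse-kinematics solver (e.g. for the UR5) and a collision checker.

```python
from typing import Callable, List, Optional, Sequence

def select_grasp(
    candidates: Sequence[Sequence[float]],  # 6D poses: (x, y, z, roll, pitch, yaw)
    solve_ik: Callable[[Sequence[float]], Optional[List[float]]],
    in_collision: Callable[[List[float]], bool],
) -> Optional[Sequence[float]]:
    """Return the first candidate grasp that is reachable and collision-free."""
    for pose in candidates:
        joints = solve_ik(pose)   # returns None when the pose is unreachable
        if joints is None:
            continue
        if in_collision(joints):  # e.g. gripper vs. the printer's interior
            continue
        return pose
    return None                   # no feasible grasp among the candidates
```

Generating several candidates before this filtering step is what makes the method robust in confined environments: a single proposed grasp that fails either check would leave no fallback.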
Keywords: robotic grasping
convolutional neural networks
robotic manipulators
metadata.dc.subject.cnpq: CNPQ::ENGENHARIAS::ENGENHARIA ELETRICA
metadata.dc.language: eng
metadata.dc.publisher.country: Brasil
Publisher: Universidade Federal da Bahia
metadata.dc.publisher.initials: UFBA
metadata.dc.publisher.department: Escola Politécnica
metadata.dc.publisher.program: Programa de Pós-Graduação em Engenharia Elétrica (PPGEE) 
metadata.dc.rights: Acesso Aberto
URI: https://repositorio.ufba.br/handle/ri/39253
Issue Date: 1-Dec-2023
Appears in Collections:Tese (PPGEE)

Files in This Item:
File: Caio Cristiano Barros Viturino. TCC - tese.pdf | Size: 74,66 MB | Format: Adobe PDF

