Interactive Perception in Robotics
Title: Interactive Perception in Robotics
Name(s):
- Baghbahari Baghdadabad, Masoud (Author)
- Behal, Aman (Committee Chair)
- Haralambous, Michael (Committee Member)
- Lin, Mingjie (Committee Member)
- Sukthankar, Gita (Committee Member)
- Xu, Yunjun (Committee Member)
- University of Central Florida (Degree Grantor)
Type of Resource: text
Date Issued: 2019
Publisher: University of Central Florida
Language(s): English
Abstract/Description: Interactive perception is a significant and unique characteristic of embodied agents. An agent can discover a wealth of knowledge through active interaction with its surrounding environment. Recently, deep learning architectures have introduced new possibilities for interactive perception in robotics. The advantage of deep learning lies in acquiring self-organizing features from gathered data; however, it is computationally impractical to implement in real-time interaction applications. Moreover, it can be difficult to attach a physical interpretation to the learned features. An alternative framework suggested in such cases is integrated perception-action. In this dissertation, we propose two integrated interactive perception-action algorithms for real-time automated grasping of novel objects using pure tactile sensing. While visual sensing and processing are necessary for gross reaching movements, they can slow down the grasping process when vision is the only sensing modality utilized. To overcome this issue, humans primarily rely on tactile perception once the hand is in contact with the object. Inspired by this, we first propose an algorithm that gives a robot a similar ability by formulating the required grasping steps. Next, we develop the algorithm to achieve the force-closure constraint by suggesting a human-like behavior through which the robot interactively identifies the object. During this process, the robot adjusts the hand through an interactive exploration of the object's local surface normal vector. After the robot finds the surface normal vector, it then tries to find the object's edges to achieve a graspable final rendezvous with the object. Finding the object's edges before fully grasping is especially important for rectangular objects. We implement the proposed approaches on an assistive robot to demonstrate the performance of interactive perception-action strategies in accomplishing the grasping task automatically.
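The tactile alignment step described in the abstract — estimating the object's local surface normal from contact data and adjusting the hand toward it — could be sketched roughly as follows. This is a minimal illustration, not the dissertation's actual method: the function names, the least-squares plane fit, and the proportional correction loop are all assumptions made for this sketch.

```python
import numpy as np

def estimate_surface_normal(contact_points):
    """Estimate the local surface normal from a small patch of tactile
    contact points by fitting a plane with a least-squares SVD."""
    pts = np.asarray(contact_points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # The right-singular vector with the smallest singular value is the
    # direction of least variance, i.e. the fitted plane's normal.
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    return normal / np.linalg.norm(normal)

def align_hand_with_normal(hand_axis, normal, step=0.5, tol=1e-3, max_iters=100):
    """Iteratively rotate the hand's approach axis toward the estimated
    surface normal, mimicking an interactive adjustment loop."""
    axis = np.asarray(hand_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    # Pick the normal orientation facing the same side as the hand.
    n = normal if np.dot(axis, normal) >= 0 else -normal
    for _ in range(max_iters):
        error = n - axis
        if np.linalg.norm(error) < tol:
            break
        # Proportional correction toward the normal, re-normalized to
        # stay a unit direction vector.
        axis = axis + step * error
        axis /= np.linalg.norm(axis)
    return axis
```

For example, contact points sampled on a flat tabletop patch yield a normal along the z-axis, and a tilted hand axis converges to it within a few iterations. In a real system the contact points would come from tactile sensor readings, and edge detection would follow once the approach axis is aligned.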
Identifier: CFE0007780 (IID), ucf:52361 (fedora)
Note(s):
- 2019-12-01
- Ph.D.
- Engineering and Computer Science, Electrical and Computer Engineering
- Doctoral
- This record was generated from author-submitted information.
Subject(s): Interactive Perception -- Robotics -- Grasping -- Control
Persistent Link to This Record: http://purl.flvc.org/ucf/fd/CFE0007780
Restrictions on Access: public, 2019-12-15
Host Institution: UCF