REVOLUTION ROBOTICS

ABOUT

Robots have been used in industry since the early 20th century, when the first humanoid robot was developed. Modern robots arrived later, initially deployed in factories as industrial robots. The field of robotics has come a long way since then, producing digitally programmed and AI-powered robots. These modern robots are used in many cutting-edge fields beyond manufacturing, such as surgery, transportation, and even space exploration. Robots have moved from replacing humans in repetitive, monotonous tasks to an assisting role, working alongside humans. Yet while great advances in robotics and AI have been made over the past decade to make robots ‘smart’, far less emphasis has been placed on making them safe, or in other words, collaborative.

According to ISO/TS 15066, a collaborative robot is a robot capable of being used in a collaborative workspace: a space within the operating space (including the workpiece) where the robot system and a human can perform tasks concurrently during production operation. Most modern industrial robots can be digitally controlled, and many are powered by AI, but most lack the safety features required to be fully collaborative. Some companies do make collaborative robots, but these are typically too expensive and are rated only for small payloads (up to 5 kg in most cases), making them infeasible for regular industrial operation.

The main goal of this project is to design a low-cost, universal collaborative robotics solution that can be easily integrated with existing industrial robotic manipulators, making them collaborative, or in other words, safe enough for humans to work alongside. The proposed solution uses computer vision for object detection, combining depth and RGB information from Intel RealSense depth cameras. Collision meshes are built from this information, and instructions are sent to the robot manipulator to avoid collisions without degrading its performance. A Jetson TX1 development board was chosen for the implementation: it has the processing power needed to run computer vision algorithms, and it runs ROS (Robot Operating System) on a customized Linux OS, which provides most of the computer vision and dynamic path planning libraries needed for collision avoidance. It can also ingest feeds from multiple RealSense depth cameras, and libraries are available for communicating with the FANUC robot controller chosen for this project.
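The core perception step described above (turning a depth image into 3D points and checking them against a volume around the robot) can be sketched as follows. This is a minimal illustration using the standard pinhole camera model on a synthetic depth frame; the camera intrinsics, the spherical keep-out volume standing in for a collision mesh, and the safety threshold are all illustrative assumptions, not values from the project.

```python
import numpy as np

def deproject_depth(depth_m, fx, fy, cx, cy):
    """Convert a depth image (in meters) into an Nx3 point cloud using the
    pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    valid = z > 0  # RealSense cameras report 0 for pixels with no depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=1)

def min_clearance(points, robot_center, radius):
    """Smallest distance from any observed point to a spherical keep-out
    volume around the robot (negative means an intrusion)."""
    d = np.linalg.norm(points - robot_center, axis=1) - radius
    return float(d.min())

# Synthetic 4x4 depth frame: background 2 m away, one 'obstacle' at 0.5 m.
depth = np.full((4, 4), 2.0)
depth[1, 1] = 0.5
pts = deproject_depth(depth, fx=600.0, fy=600.0, cx=2.0, cy=2.0)

# Hypothetical robot keep-out sphere and slow-down threshold.
clearance = min_clearance(pts, robot_center=np.array([0.0, 0.0, 0.8]),
                          radius=0.3)
slow_down = clearance < 0.2  # trigger an avoidance command to the controller
```

In the real pipeline, the point cloud would come from the RealSense SDK, the keep-out geometry would be the collision mesh, and `slow_down` would instead feed a replanning or speed-scaling command to the FANUC controller.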

PROJECT SOLUTION

PROJECT MEMBERS

CONTACT INFO