
Capturing Speed

AI-Powered 3D Vision System for Robotic Pick&Place Applications

VCV-Cortex is an AI-powered stereovision system developed for robotic applications. It works with any camera and optical system to provide the desired range, FoV and precision. Unlike structured light cameras, it can handle dark and/or shiny objects at high speed.

Image: Computer Vision Ltd.
Image 1 | VCV-Cortex pose estimation viewer, showing a bin full of dark, shiny rivets. The found rivet is highlighted in green and its 6D pose is shown in the bottom-left corner. Being a pre-trained AI system, it needs less than one minute to learn the geometry.

Stereovision is an imitation of the human visual system and was one of the first attempts at building a 3D vision system for robots and machines. However, traditional algorithms have very low accuracy and break down when facing texture-less surfaces.
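
To make that limitation concrete, the sketch below shows a classical block-matching stereo pipeline in Python with OpenCV, the kind of traditional algorithm the article refers to; the image files, calibration values and matcher parameters are illustrative assumptions, not part of VCV-Cortex. The matcher correlates small pixel windows along epipolar lines, and on texture-less surfaces every window looks alike, which is exactly where such algorithms break down.

```python
# Minimal sketch of classical block-matching stereo; this is the kind of
# traditional algorithm that struggles on texture-less surfaces.
# File names, calibration values and matcher parameters are assumptions.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # rectified left image
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # rectified right image

# Block matching correlates small pixel windows along epipolar lines.
# On texture-less regions every window looks the same, so matching is ambiguous.
matcher = cv2.StereoBM_create(numDisparities=128, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Depth from disparity: Z = f * B / d
f_px, baseline_m = 1200.0, 0.12        # assumed focal length (px) and baseline (m)
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = f_px * baseline_m / disparity[valid]
```

Because depth follows Z = f*B/d, even a one-pixel disparity error grows into a large depth error at longer range, which is why traditional stereo is described as having very low accuracy.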

Image: Computer Vision Ltd.
Image 2 | Point cloud of a dark, shiny PVC part sitting on top of a cardboard box: left, generated by a structured light system; right, created by VCV-Cortex.

As a result, active systems, especially structured light cameras, have become the standard 3D vision systems of the industrial world. Structured light cameras are perfect for metrology and quality inspection. However, for lack of a better solution, they are also being fitted into robotic pick&place applications, where they leave a lot to be desired: the process of projecting the light pattern and capturing it is time-consuming. The same process prevents them from capturing dark and/or shiny objects in a single shot; they require HDR imaging, which wastes further time. They cannot work next to each other because of cross-talk between their projected patterns. They can be blinded by intense light, such as that of welding or the sun. They have low resolution and offer only a limited FoV and range. Because of these issues, industries still prefer to use humans or 2D vision to solve manufacturing challenges.
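
As a rough illustration of the HDR point: covering both dark, low-reflectance surfaces and bright specular highlights in the same view forces the camera to bracket several exposures and fuse them, for example with OpenCV's exposure fusion as sketched below. The file names and exposure times are hypothetical placeholders; every additional exposure is another projection-and-capture cycle, which is the time overhead described above.

```python
# Illustrative multi-exposure (HDR-style) fusion: dark and shiny parts need
# several exposures of the same view, and each extra exposure adds capture time.
# File names and exposure times are hypothetical placeholders.
import cv2
import numpy as np

exposures_ms = [1.0, 4.0, 16.0, 64.0]   # assumed exposure bracket
frames = [cv2.imread(f"scan_exp_{int(t):02d}ms.png") for t in exposures_ms]

# Short exposures keep specular highlights from saturating;
# long exposures lift dark, low-reflectance areas above the noise floor.
# Mertens exposure fusion merges the stack without camera response calibration.
fused = cv2.createMergeMertens().process(frames)            # float image, roughly in [0, 1]
fused_8bit = np.clip(fused * 255, 0, 255).astype(np.uint8)
cv2.imwrite("fused.png", fused_8bit)
```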

Computer Vision Ltd.

This article appeared in inVISION 4 2019 - 09.09.19.
For further articles, visit www.invision-news.de