Modular, Efficient, Versatile

Platform Strategy for Automated Material Handling

  • The operator panel for the depalletizing of containers
  • The software recognizes the position of the brake disks
  • A robot depalletizes small load carriers automatically
  • Brake disks are supplied in transport containers

Part unloading and robot-aided depalletizing are sought-after industrial applications for automating production processes, increasing capacity and cutting costs. When the parts to be handled are unknown and their position within the container is random, the machine vision system must identify the parts and localize their positions in 3D.

Unloading parts from containers, e.g. before assembly processes, is a simple exercise for humans. It is more difficult when a robot has to undertake this task. If the form and position of the parts in the container are known, it is a pick-and-place task, a standard in robotics. But if the robot has no detailed information about the objects in the container, and these objects are positioned randomly, the technological challenge is huge and the environmental conditions are complex. In the most challenging scenario, the vision system must be capable of recognizing a hitherto unknown position scheme in order to subsequently identify a hitherto unknown part. This problem has been known in the industry for years as "bin picking".

To solve this task, specialized know-how is needed from diverse areas of metrology: optics, illumination, sensors and algorithms. Particularly important in practice are operability aspects such as re-calibration and re-installation, traceability, failure analysis options, process documentation, system handling, availability, and the prevention of pseudo defects. In addition, the sensor should be easy and quick to exchange in case of failure.
In bin picking, the concrete task of the sensor system is to determine the 3D position of the parts to be handled with adequate accuracy and speed. From this, position correction data are calculated and transmitted to the handling device.
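The position correction step can be illustrated with a minimal sketch: the offset between a taught reference pose and the pose measured by the vision system is what gets sent to the robot. The pose representation, field names and units below are assumptions for illustration, not VMT's actual interface:

```python
def position_correction(reference, measured):
    """Offset the robot must apply so that its taught grip motion
    fits the part at the measured pose (sketch; mm and degrees assumed)."""
    return {
        "dx": measured["x"] - reference["x"],
        "dy": measured["y"] - reference["y"],
        "dz": measured["z"] - reference["z"],
        "drz": measured["rz"] - reference["rz"],  # rotation about the vertical axis
    }

# Taught reference pose vs. pose reported by the vision system
ref = {"x": 100.0, "y": 50.0, "z": 0.0, "rz": 0.0}
meas = {"x": 103.5, "y": 48.0, "z": 1.0, "rz": 2.5}
print(position_correction(ref, meas))  # {'dx': 3.5, 'dy': -2.0, 'dz': 1.0, 'drz': 2.5}
```

In practice the correction is a full 6-DOF transform, but the principle is the same: the robot path is taught once and only shifted by the measured offset.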

Uniform System Platform
Thanks to the dynamic development of sensor technology, a multitude of highly efficient sensor technologies is available today. There are the classic gray-value and color cameras, which capture images or image matrices. Furthermore, triangulation sensors detect distances and height contours and determine 3D height profiles.

Finally, TOF (time-of-flight) sensors allow lidar-type measurements on which the calculation of distances, height contours and 3D height profiles can be based, in effect similar to the data obtained from laser triangulation sensors. Moreover, the availability of high-performance hardware allows the application of compute-intensive algorithms, so that even complex mathematical methods can be implemented.

So far, existing systems have focused on solving only individual task classes. Any change in the task specification or environmental conditions (ambient light, new containers, range of part characteristics such as geometry or color) usually requires extensive interventions, modifications and expansions in order to adapt the task-specific sensor system to the new setting. The system approach of the company Vision Machine Technic Bildverarbeitungssysteme (VMT), however, pursues a modular strategy. It is based on the industry-proven functions of the current VMT system solutions on the one hand, and on the integration of new applications into the open system architecture on the other. The second important aspect of this platform is the end-to-end use of as few different, but carefully selected, hardware components from renowned manufacturers as possible. This helps to avoid compatibility problems during later system adjustments.
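The time-of-flight principle mentioned above reduces to a one-line formula: the light pulse travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the numbers are illustrative only):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_seconds):
    """Distance to the target from the measured round-trip time of the
    light pulse; the factor 1/2 accounts for the out-and-back path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A round trip of 10 nanoseconds corresponds to roughly 1.5 m
print(tof_distance(10e-9))
```

The tiny times involved explain why TOF sensors need fast electronics: resolving a few millimetres of height requires timing on the order of tens of picoseconds.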

Put into Practice
To solve the most diverse tasks, VMT developed multi-sensor systems with a number of highly effective functions, while keeping the system transparent and easy to use for customers.
Bin-picking tasks can be divided into different classes (see table). For these classes, two fundamentally different recognition models can be formulated: the part to be handled is known (case 1) or it is not known (case 2).
In case 1, the known part is described mathematically. For this, the object's geometry and properties, for example its color, are modeled. This model is used, e.g., for the unloading of door hinges from SLCs (small load carriers). The sensor signals are provided by a TOF sensor mounted on a linear axis or on the robot's hand.
In case 2, where the part to be handled is unknown, the recognition model describes the object's properties, for instance "the part contains flat areas of a certain minimum size". Based on this information, the system calculates where the robot's suction gripper needs to be positioned. Another object property might be "the object is round with a hole in the middle"; a gripper designed for this operation can then reach into this hole.
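A property rule such as "contains flat areas of a minimum size" can be evaluated directly on a height map. The following is a minimal sketch under assumed conventions (heights in millimetres, window size in grid cells); the function name and flatness tolerance are illustrative, not VMT's algorithm:

```python
def find_suction_spot(height_map, win, flatness_mm=1.0):
    """Scan a height map (list of rows, values in mm) for a win x win
    window flat enough for a suction gripper to seal against."""
    rows, cols = len(height_map), len(height_map[0])
    for r in range(rows - win + 1):
        for c in range(cols - win + 1):
            patch = [height_map[r + i][c + j]
                     for i in range(win) for j in range(win)]
            if max(patch) - min(patch) <= flatness_mm:
                return (r, c)  # top-left cell of the first flat area found
    return None  # no area of the required size is flat enough

hm = [[5.0, 5.2, 9.0],
      [5.1, 5.0, 8.7],
      [2.0, 2.1, 8.8]]
print(find_suction_spot(hm, 2))  # → (0, 0)
```

A production system would of course search the full point cloud and rank candidate areas, but the rule-to-grip-point idea is the same.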
An example of this class would be the unloading of bulk material from an SLC, where the bulk material consists of differently shaped parts. For this task, a triangulation sensor is mounted on the robot's hand. The exact 3D position of the parts to be gripped, as well as the part type, is then determined by the connected 3D camera system.
Another sought-after application is the depalletizing of small load carriers (SLCs) or VDA boxes. The SLCs to be depalletized arrive on a pallet, stacked in a random order. There are up to four different SLC types in a multitude of colors. First, a TOF sensor on a linear axis is used to determine the stacking scheme, to recognize the type of the SLC to be gripped next, and to establish its approximate 3D position. The gripper which the robot moves over the SLC contains four cameras; with a single image acquisition, the 3D gripping position is calculated within a fraction of a second. The storage of the SLCs in the high-rack warehouse of the goods-receiving area can thus be executed fully automatically. The only manpower still needed is for transporting the pallets of SLCs from the shipping truck into the small-parts warehouse with a stacker and for removing the empty pallets.
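The "which SLC next" decision from the TOF scan reduces to picking the topmost carrier so the stack is unloaded from the top down. A minimal sketch with a hypothetical record layout (the type designations and field names are invented for illustration):

```python
def next_slc_to_grip(detected_slcs):
    """From the TOF scan of the stack, select the carrier whose top
    surface is highest; it must be removed first."""
    return max(detected_slcs, key=lambda slc: slc["top_mm"])

# Hypothetical scan result: type, approximate position, top height in mm
stack = [
    {"type": "SLC-A", "x": 0.2, "y": 0.1, "top_mm": 560.0},
    {"type": "SLC-B", "x": 0.6, "y": 0.1, "top_mm": 840.0},
]
print(next_slc_to_grip(stack)["type"])  # → SLC-B
```

The approximate pose from this step is then refined by the four gripper cameras before the actual grip.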
In order to make these solutions industrially applicable, a number of additional basic functions are provided: gripping feasibility analysis, "container empty" control, collision prevention, checking the container for damage, definition of container and/or unloading positions, automatic calibration, etc. The sensor system can also assume additional functions such as type recognition, inspection and reading (plain text, barcode/matrix code). Thanks to the system's uniform platform, the integration of these tasks into the VMT system is easily feasible.
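One of these basic functions, the "container empty" control, can be sketched in a few lines: if every measured height lies within tolerance of the known container floor, nothing is left to grip. The floor reference and tolerance below are assumed values, not VMT parameters:

```python
def container_empty(height_map, floor_mm, tol_mm=3.0):
    """'Container empty' check: True when all heights in the scan
    lie within tol_mm of the known container floor level."""
    return all(abs(h - floor_mm) <= tol_mm
               for row in height_map for h in row)

print(container_empty([[2.1, 1.8], [2.0, 2.4]], floor_mm=2.0))  # → True
```

The same scan data also feed the damage check and the collision prevention, which is why a uniform platform pays off: one measurement serves several safeguards.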

Conclusion
Different task classes can be derived from the versatility of practical applications in the area of material handling. Solutions within the individual classes require special functionalities of the sensor system (algorithms, functions, sensors). VMT offers a platform of available solutions whose functionality may be expanded for each new application, on the one hand, or combined with other functions available in the VMT system, on the other, depending on the complexity of the given project. Since the platform is fundamental, application-independent and expandable, it can also be used - in addition to or in combination with the robot guidance tasks described above - for applications in other areas, such as inspection, type recognition and code reading.

Task Classification
Class 1     Position scheme is known     Part type is known
Class 2a    Position scheme is unknown   Part type is known
Class 2b    Position scheme is known     Part type is unknown
Class 2c    Position scheme is unknown   Part type is unknown
Position scheme: Arrangement of parts within transport carrier
Part type: Form and color of the part to be gripped
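The table's two yes/no criteria combine into exactly four classes; as a trivial cross-check, the mapping can be written out directly (labels follow the table):

```python
def task_class(scheme_known, part_known):
    """Map the two criteria from the task classification table
    to the corresponding class label."""
    if scheme_known and part_known:
        return "Class 1"
    if not scheme_known and part_known:
        return "Class 2a"
    if scheme_known and not part_known:
        return "Class 2b"
    return "Class 2c"

print(task_class(scheme_known=False, part_known=False))  # → Class 2c
```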

Authors

Contact

VMT Vision Machine Technic Bildverarbeitungssysteme GmbH
Mallustr. 50-56
68219 Mannheim
Germany
Phone: +49 (0)621 84250-0
Telefax: +49 (0)621 84250-290
