Mobile human-computer interaction for commissioning and control of automated guided vehicles

Theme: Automated guided vehicles, Artificial Intelligence, Industry 4.0
Project title: Mobile human-computer interaction for commissioning and control of automated guided vehicles (MobiMMI)
Project duration: 01.07.2018 – 30.07.2020

Discontinuous conveying is currently performed either manually, e.g. with forklift trucks, or fully automatically with automated guided vehicles (AGVs). Compared to manually operated discontinuous conveyors, AGVs are characterized by a lower accident rate, faster order processing and less wear caused by operating errors. They thus make a decisive contribution to lowering operating costs for SMEs.

The disadvantage of using AGVs is their behavior in critical operating situations, such as storage at undefined pallet positions. While a forklift operator can react to such situations individually, an AGV cannot resolve them on its own.

The goal of this research project was the development of a mobile system for speech- and gesture-based human-computer interaction that enables intervention in critical operating situations. In this way, the economic advantages of AGV operation can be exploited while unwanted behavior is avoided.
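To illustrate the basic idea of such an intervention interface, the following Python sketch maps recognized speech or gesture commands to vehicle actions. It is a minimal, hypothetical example: the command labels, the AGVClient class and its methods are placeholders and are not taken from the project.

# Illustrative sketch only: dispatch recognized speech/gesture commands to AGV actions.
# AGVClient and its methods are hypothetical placeholders, not project code.

class AGVClient:
    """Stand-in for a vehicle or fleet interface (hypothetical)."""

    def stop(self):
        print("AGV: emergency stop")

    def resume(self):
        print("AGV: resuming mission")

    def adjust_drop_position(self, dx_m, dy_m):
        print(f"AGV: shifting drop-off position by ({dx_m} m, {dy_m} m)")

# Mapping from a recognized command label to an intervention action.
COMMANDS = {
    "stop": lambda agv: agv.stop(),
    "resume": lambda agv: agv.resume(),
    "shift_left": lambda agv: agv.adjust_drop_position(0.0, -0.1),
    "shift_right": lambda agv: agv.adjust_drop_position(0.0, 0.1),
}

def dispatch(command_label, agv):
    """Execute the intervention associated with a recognized command, if any."""
    action = COMMANDS.get(command_label)
    if action is None:
        print(f"Unrecognized command ignored: {command_label!r}")
        return
    action(agv)

if __name__ == "__main__":
    agv = AGVClient()
    dispatch("stop", agv)        # e.g. triggered by a spoken "stop"
    dispatch("shift_left", agv)  # e.g. triggered by a pointing gesture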

Publications about the project

Automated guided vehicles are a crucial component of more efficient production systems in intralogistics, but they have weaknesses in human-machine interaction. Scientists at IPH are developing a gesture-based control system to make this interaction more intuitive and increase its acceptance.

Driverless transport vehicles, guidance control, gesture-based control

This article shows how the human abilities to act flexibly and to adapt to changing environmental conditions, which are rooted in human cognitive characteristics, can be transferred to industrial trucks in intralogistics. As examples of the implementation of Industry 4.0 in intralogistics, technologies are presented that enable industrial trucks to perceive their environment, communicate information, draw conclusions, act autonomously, make decisions, learn and plan. These capabilities are realized by an optical positioning system for position determination, camera-based storage and retrieval assistance, sensor technology integrated into the tires, and novel forms of interaction with industrial trucks in the form of speech and gestures.

automated guided vehicle, augmented reality, smart glasses
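The publication above does not detail the algorithms behind the optical positioning system it mentions. As a generic illustration of position determination from reference points with known coordinates, the following Python sketch estimates a 2D position from measured distances by least-squares trilateration; the marker layout and range values are invented for the example and do not represent the project's actual method.

import numpy as np

# Illustrative only: 2D position estimate from distances to reference markers with
# known coordinates (generic least-squares trilateration, not the project's method).

def estimate_position(anchors, distances):
    """anchors: (n, 2) marker coordinates in metres; distances: (n,) measured ranges."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x_n, y_n = anchors[-1]
    # Subtracting the last anchor's circle equation from the others linearizes the problem.
    A = 2.0 * (anchors[:-1] - anchors[-1])
    b = (d[-1] ** 2 - d[:-1] ** 2
         + np.sum(anchors[:-1] ** 2, axis=1) - (x_n ** 2 + y_n ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos  # estimated (x, y)

if __name__ == "__main__":
    markers = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]  # assumed marker layout in metres
    ranges = [5.0, 8.06, 5.0]                        # assumed measured distances
    print(estimate_position(markers, ranges))        # approximately [3. 4.]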

Driverless transport systems are a building block of more efficient production systems in intralogistics, but they have weaknesses in human-machine interaction. In an extensive research project, among other things a voice-based task assignment is being developed that is intended to make human-machine interaction more intuitive and increase its acceptance.

automated guided vehicle, augmented reality, smart glasses, voice control

Driverless transport systems (AGV systems) are an established and effective instrument for increasing the profitability of modern production plants and making intralogistics processes more efficient. In addition to a master control system and a communication system, driverless transport vehicles (AGVs) are among the main components of an AGV system. Compared to manually controlled industrial trucks, automated AGVs are characterized by higher efficiency. The disadvantage of AGV systems is that they are not able to resolve critical operating situations independently; in such cases, extensive intervention by specialist personnel is required.
With the aim of overcoming these obstacles, the project "Mobile human-machine interaction for commissioning and control of AGV systems (MobiMMI)" was initiated. In this project, the human-machine interaction between an operator and an AGV is to be extended by a speech- and gesture-based system in order to make interventions by the operator easier and more intuitive and thus significantly reduce the acquisition and operating costs of AGV systems.
Taking safety, ergonomics, user-friendliness and integrability into account, a mobile system will be developed for this purpose and equipped with various sensors for 3D environment detection, indoor positioning and multimodal communication. The recorded data are evaluated by means of computer vision and machine learning, enabling the operator to react quickly and easily to critical operating situations.

automated guided vehicle, human-machine interface
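As a hedged illustration of how recorded sensor data could be evaluated with machine learning for gesture recognition, the following Python sketch trains a simple k-nearest-neighbour classifier on synthetic keypoint features. The feature layout, gesture labels and training data are invented for the example; the project's actual computer-vision and machine-learning pipeline is not shown here.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Illustrative sketch only: classify operator gestures from keypoint feature vectors.
# Gesture labels, feature layout and data are synthetic placeholders.

GESTURES = ["stop", "come_here", "lower_forks"]  # assumed command gestures

def make_dummy_samples(n_per_class=20, n_features=63, seed=0):
    """Synthetic training data: one 63-value vector per sample (21 keypoints x 3 coords)."""
    rng = np.random.default_rng(seed)
    X, y = [], []
    for gesture in GESTURES:
        center = rng.normal(size=n_features)  # one cluster centre per gesture class
        X.append(center + 0.1 * rng.normal(size=(n_per_class, n_features)))
        y.extend([gesture] * n_per_class)
    return np.vstack(X), np.array(y)

X_train, y_train = make_dummy_samples()
model = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

# At run time, a depth/RGB pipeline would produce one keypoint vector per frame;
# here a perturbed training sample stands in for such a frame.
frame_features = X_train[0] + 0.05 * np.random.default_rng(1).normal(size=63)
print(model.predict(frame_features.reshape(1, -1)))  # e.g. ['stop']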

Sponsor

Partner

Your contact person

Dr.-Ing. Benjamin Küster
Manager production automation