Andreas Seel

Graduation:
M.Sc.
Function:
Project engineer
Practice Areas:
Image processing, automated guided vehicles (AGVs) / industrial trucks
Phone:
+49 (0)511 279 76-234
E-Mail:
seel@iph-hannover.de

Publications

This article shows how the human abilities to act flexibly and to adapt to changing environmental conditions, which are rooted in human cognitive characteristics, can be transferred to industrial trucks in intralogistics. As examples of the implementation of Industry 4.0 in intralogistics, technologies are presented that enable industrial trucks to perceive their environment, communicate information, draw conclusions, act autonomously, make decisions, learn and plan. These capabilities are realized by an optical positioning system for position determination, camera-based storage and retrieval support, sensor technology integrated into tires, and novel forms of interaction with industrial trucks via speech and gestures.

automated guided vehicle, augmented reality, smart glasses

Driverless transport systems are a building block for more efficient production systems in intralogistics, but they have weaknesses in human-machine interaction. As part of a complex research project, a voice-based task assignment system is being developed, among other components, with the aim of making human-machine interaction more intuitive and increasing its acceptance.

automated guided vehicle, augmented reality, smart glasses, voice control

Driverless transport systems (AGV systems) are an established and effective means of increasing the profitability of modern production plants and making intralogistics processes more efficient. Alongside a master control system and a communication system, automated guided vehicles (AGVs) are among the main components of an AGV system. Compared with manually controlled industrial trucks, automated AGVs are characterised by higher efficiency. The disadvantage of AGV systems is that they are not able to resolve critical operating situations on their own; in such cases, extensive intervention by specialist personnel is required.
With the aim of overcoming these obstacles, the project "Mobile Human-Machine Interaction for commissioning and control of AGV-Systems (MobiMMI)" was initiated. In this project, the human-machine interaction between an operator and an AGV is extended by a speech- and gesture-based system in order to make operator intervention easier and more intuitive and thus significantly reduce the acquisition and operating costs of AGV systems.
Against the background of safety, ergonomics, user-friendliness and integrability, a mobile system is being developed for this purpose and equipped with various sensors for 3D detection of the environment, indoor positioning and multimodal communication. The recorded data are evaluated using computer vision and machine learning, enabling the operator to react quickly and easily to critical operating situations.

automated guided vehicle, human-machine-interface

Research projects