Maintenance robot motion control based on Kinect gesture recognition

Maintenance robots are widely used in industry because of their convenience. This paper introduces a maintenance robot system and details its hardware framework. The robot's movement is controlled through gesture recognition: gesture information is obtained from human skeleton data acquired by a Kinect sensor, and a gesture library is constructed. A maintenance robot control system is set up to drive an omnidirectional mobile robot based on Mecanum wheels. Finally, the maintenance robot is built and the designed functions are verified.


Introduction
When a machine tool fails in the factory [1], the maintenance engineer must reach the damaged machine promptly, locate the fault in time, and complete the repair as quickly as possible so that the machine tool can return to normal operation. During the repair, it is difficult for the engineer to locate the faulty parts while also carrying a large number of maintenance tools. It is therefore necessary to design maintenance robots to help engineers solve this problem. This article focuses on controlling the movements of a maintenance robot through gesture recognition to achieve human-computer interaction.
Human skeleton information is obtained from the Kinect 2.0 sensor, and gestures are recognised from that skeleton data. A maintenance robot control system is then established: instructions are sent to the robot through the serial port to control its movement, completing the interaction with the robot so that engineers have a better experience while solving problems.

Maintenance robot system design
The maintenance robot can move freely around the factory and can carry common maintenance spare parts such as wrenches and screwdrivers. At the same time, a maintenance robot equipped with an intelligent operation and maintenance system [2] can help repair engineers solve difficulties encountered during maintenance, for example by pointing out damaged parts and providing maintenance solutions (as shown in Fig. 1).

Hardware frame design of maintenance robot
The hardware framework of the maintenance robot system includes a power module, a subordinate-machine motion control module, a servo movement system, a host computer control system, and the body structure (as shown in Fig. 2).
(i) Power module: The robot body drive motors use a 36 V DC supply, so a 36 V lithium battery is selected as the power unit. The Kinect sensor requires an external 12 V DC supply, so a step-down module converts the 36 V battery output to 12 V DC for the sensor.
(ii) Subordinate machine motion control module: A Nuvoton NUC120 chip is adopted as the control unit. The chip has eight pulse-width modulation (PWM) interfaces, three serial ports, and several general-purpose input/output (GPIO) interfaces, which satisfy the main control requirements of the hardware.
(iii) Servo control system: A closed-loop stepper motor is used as the drive unit. It combines the characteristics of a stepping motor with a servo control mode, so it is easy to debug, avoids the overshoot seen in brushless DC motors, and solves the step-loss problem of open-loop stepping motors.
(iv) Host computer control system: A common notebook serves as the master computer, configured with an Intel Core i5, 8 GB RAM, and the Windows operating system.
(v) Body structure: The robot body adopts a four-wheel omnidirectional mobile chassis, and normal motion in the prescribed environment is achieved by the Mecanum wheel drive.

Body structure design:
The maintenance robot is divided into the walking mechanism, oil storage mechanism, tool storage mechanism, and robot arm.
Wheeled mobile robots usually use two-wheel differential drive: two motors drive the robot and a follower universal wheel keeps the body balanced. The maintenance robot instead adopts a Mecanum wheel structure. With four Mecanum wheels as the drive components, it has three degrees of freedom in the plane, which is well suited to factories where space is narrow and mobility requirements are high, and it adapts well to complex and compact working environments.
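As an illustration of how such a chassis is commanded, the sketch below computes the four wheel speeds from a desired planar motion using the standard Mecanum inverse kinematics. The paper does not give its kinematic parameters, so the wheel radius, chassis half-dimensions, and roller convention here are assumptions.

```python
def mecanum_wheel_speeds(vx, vy, omega, r=0.05, lx=0.2, ly=0.25):
    """Inverse kinematics for a 4-Mecanum-wheel chassis.

    vx, vy : desired body velocities (m/s), x forward, y left
    omega  : desired yaw rate (rad/s)
    r      : wheel radius (m); lx, ly : half wheelbase / half track (m)
    Returns angular speeds (rad/s) for (front-left, front-right,
    rear-left, rear-right) under a common 45-degree roller convention.
    """
    k = lx + ly
    w_fl = (vx - vy - k * omega) / r
    w_fr = (vx + vy + k * omega) / r
    w_rl = (vx + vy - k * omega) / r
    w_rr = (vx - vy + k * omega) / r
    return w_fl, w_fr, w_rl, w_rr
```

For pure forward motion all four wheels turn at the same speed; for pure sideways (strafing) motion the diagonal wheel pairs turn in opposite directions, which is what gives the chassis its third degree of freedom.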
The oil storage mechanism mainly stores a consumable needed by the machine tools: lubricant. Lubricant is a routine maintenance supply and is generally replenished every 1-1.5 months to keep the guide rails lubricated.
Tool storage mechanism is used to store spare parts needed for daily maintenance, as well as standard parts such as bolts and nuts.
The robotic arm grabs maintenance spare parts from the spare parts library and places them into the tool storage mechanism. It is also used to hand repaired spare parts over to the maintenance engineers, reducing their workload.

Vision-based gesture recognition system
Gesture recognition generally follows three basic approaches: algorithm-based, neural-network-based, and gesture-sample-library-based. The Kinect for Windows Software Development Kit (SDK) provides the Visual Gesture Builder (VGB) tool. VGB generates gesture databases that can be used at runtime to detect gestures. The tool encapsulates the machine-learning algorithms and trains on recorded video clips imported into the software to generate the gesture database. Developing gesture recognition with VGB greatly shortens the software development cycle and makes it simple and efficient.

Introduction to Kinect 2.0:
In 2014 Microsoft launched Kinect 2.0 together with the Kinect for Windows SDK v2.0, enabling natural human-computer interaction applications driven by simple hand and voice features. The Kinect for Windows sensor uses depth-sensing technology, a built-in colour camera, an infrared transmitter, and an array microphone to perceive human location, movement, and sound (as shown in Fig. 3).
The Kinect for Windows SDK provides developers with drivers, tools, application interfaces, and device interfaces (as shown in Fig. 4).

Gesture library building:
Kinect Studio, included with the Kinect for Windows SDK 2.0, is a tool that records and plays back the depth and colour streams from the Kinect. It reads and writes data streams to help debug features, create repeatable test scenarios, and analyse performance.
Seven batches of clips were recorded in Kinect Studio, with five sets of clips per batch and three to four gesture [3,4] fragments per set; in each batch, four sets are used for training and one for analysis, to test the completeness and fitness of the recorded clips. Each gesture is assigned to a movement state of the robot. The allocation principle follows the everyday intuition of controlling a robot's movement, and the gestures and limb coordination are designed to be easy for engineers to remember, for example "move left: left arm lifted and straightened" (as shown in Table 1).
Open the VGB software, create a gesture recognition project with the gesture recognition wizard, import the recorded clips into the project, and select four groups of gestures for training. Intercept the valid clips, label them, and mark the valid clips as True. After the gesture data is fully calibrated, build the project to generate the classifier (.gba file). Then open the VGB Viewer, import the .gba file, observe the tree diagram, and test whether the gesture training results meet the requirements (as shown in Fig. 5).
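At runtime, a VGB discrete gesture reports a name and a per-frame confidence value. The sketch below shows one way the recognised gesture could be mapped to a robot motion command while suppressing frame-to-frame flicker; the gesture names, command labels, threshold, and frame count are hypothetical, not taken from the trained classifier described above.

```python
# Hypothetical gesture names and command labels for illustration only.
GESTURE_TO_COMMAND = {
    "MoveLeft":  "LEFT",
    "MoveRight": "RIGHT",
    "Advance":   "FORWARD",
    "Retreat":   "BACK",
    "TurnLeft":  "TURN_L",
    "TurnRight": "TURN_R",
    "Stop":      "STOP",
}

class GestureDebouncer:
    """Accept a gesture only after it exceeds a confidence threshold
    for several consecutive frames, to suppress recognition flicker."""

    def __init__(self, threshold=0.6, frames_required=5):
        self.threshold = threshold
        self.frames_required = frames_required
        self._candidate = None
        self._count = 0

    def update(self, name, confidence):
        """Feed one frame's top gesture result; return a command once
        the gesture has been stable long enough, otherwise None."""
        if confidence < self.threshold or name not in GESTURE_TO_COMMAND:
            self._candidate, self._count = None, 0
            return None
        if name == self._candidate:
            self._count += 1
        else:
            self._candidate, self._count = name, 1
        if self._count == self.frames_required:
            return GESTURE_TO_COMMAND[name]
        return None
```

The command is emitted exactly once when the required frame count is reached, so a held gesture does not flood the robot with repeated instructions.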

Design of visual interface of host computer:
Taking advantage of computer visualisation, the host computer control software is built with the Microsoft Foundation Classes (MFC) [5,6] using an object-oriented design. Through OpenCV [7], the images and skeleton information collected by the Kinect sensor are displayed in real time to recognise gesture information. The MFC program interface is divided into an image area, a control area, and a test area. The image area shows the depth image and skeleton information transmitted by the sensor. The control area holds the buttons that operate the software. The test area shows the feedback data, which is used to determine whether the robot is moving and to verify that the function is implemented successfully (as shown in Fig. 6).

Program frame of host computer:
The host computer program mainly realises data display, manages the man-machine interaction function, and handles the target data collected by the Kinect sensor. The program provides data processing, decision-algorithm computation, communication between the upper and lower machines, and other functions (as shown in Figs. 7 and 8).
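The paper does not specify the serial protocol between the host computer and the subordinate control board, so the frame layout below (a header byte, a command byte, two signed speed bytes, and an XOR checksum) is purely illustrative of how motion commands might be packed for the serial link.

```python
import struct

# Hypothetical frame layout for the host -> controller serial link.
COMMANDS = {"STOP": 0x00, "FORWARD": 0x01, "BACK": 0x02,
            "LEFT": 0x03, "RIGHT": 0x04, "TURN_L": 0x05, "TURN_R": 0x06}

def encode_frame(command, speed_x=0, speed_y=0):
    """Pack one motion command into a 5-byte frame:
    0xAA header, command byte, two signed speed bytes, XOR checksum."""
    body = struct.pack("<Bbb", COMMANDS[command], speed_x, speed_y)
    checksum = 0
    for b in body:
        checksum ^= b
    return b"\xaa" + body + bytes([checksum])

def decode_frame(frame):
    """Validate a frame and unpack it to (command_code, sx, sy)."""
    if len(frame) != 5 or frame[0] != 0xAA:
        raise ValueError("bad frame")
    cmd, sx, sy = struct.unpack("<Bbb", frame[1:4])
    checksum = 0
    for b in frame[1:4]:
        checksum ^= b
    if checksum != frame[4]:
        raise ValueError("checksum mismatch")
    return cmd, sx, sy
```

With a serial library such as pySerial, the host would write the bytes returned by `encode_frame(...)` to the configured port, and the subordinate machine would validate the checksum before acting on the command.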

Robot motion test
A maintenance robot motion platform was built according to the experimental requirements. The experimental results show that the human skeleton information can be displayed and the robot can be controlled (as shown in Figs. 9 and 10).

Conclusion
The operator can perform somatosensory control of the robot at a distance of about 3 m from the Kinect sensor. At present, the robot supports seven motion states, each triggered by one of seven gestures: turn left, turn right, move left, move right, advance, retreat, and stop. With the VGB tool, the acquisition and processing of complex gestures can be completed, the learning cost is low, and the development cycle is shortened.
Robots play an increasingly important role in people's lives. The system provides a new type of control, somatosensory control, which makes the robot's operation more flexible and diverse and enables more natural human-computer interaction, adding theoretical research value and advanced control capability. On this basis, more types of automated robots can be developed for industrial production and other needs, providing active support and assistance for building automated robot systems.

Acknowledgments
This work was supported by the Major Science and Technology Projects of China (no. 2015ZX04001002).