Abstract
Manual welding remains widely used because of its flexibility, yet it suffers from low efficiency and a strong dependence on operator skill and physical condition. Hand fatigue during extended operation often degrades weld accuracy and stability. To address this, we propose a human-machine collaborative welding system with a head-mounted control interface that enables hands-free adjustment of welding parameters. Instead of relying on a traditional handheld controller, the system interprets head movement from multi-source time-series data, including Euler angles, three-axis angular velocity, motion-state labels, and temporal information. A Multi-Branch Fusion Transformer (MBFT) is introduced to model the temporal dynamics and interdependencies among these heterogeneous kinematic features: each data source is processed in a dedicated branch before fusion, allowing the network to learn rich posture representations and predict the operator's neck orientation in real time. These predictions drive precise, adaptive parameter adjustments during welding. The proposed method improves weld quality and consistency while reducing operator strain through posture-aware, intelligent human-machine interaction.
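For illustration only, the sketch below shows one plausible realization of the per-branch encoding and feature-level fusion described above: each kinematic stream gets its own projection and Transformer encoder, and branch outputs are concatenated before a fusion encoder regresses neck orientation. The branch dimensions, layer sizes, and mean-pooling strategy are our assumptions, not the paper's reported MBFT configuration.

```python
import torch
import torch.nn as nn

class MBFTSketch(nn.Module):
    """Minimal multi-branch fusion sketch (not the paper's exact MBFT).

    Hypothetical streams: Euler angles (3), three-axis angular velocity (3),
    one-hot motion-state labels (4), temporal features (2). All dimensions
    and layer counts below are illustrative assumptions.
    """

    def __init__(self, branch_dims=(3, 3, 4, 2), d_model=64,
                 n_heads=4, n_layers=2, n_out=3):
        super().__init__()
        # One projection + Transformer encoder per data source (dedicated branch)
        self.proj = nn.ModuleList(nn.Linear(d, d_model) for d in branch_dims)
        self.branches = nn.ModuleList(
            nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True),
                n_layers)
            for _ in branch_dims)
        # Fusion encoder over branch outputs concatenated along the feature axis
        d_fused = d_model * len(branch_dims)
        self.fusion = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_fused, n_heads, batch_first=True), 1)
        self.head = nn.Linear(d_fused, n_out)  # e.g. predicted neck Euler angles

    def forward(self, streams):
        # streams: list of (batch, seq_len, dim_i) tensors, one per data source
        encoded = [enc(p(x))
                   for p, enc, x in zip(self.proj, self.branches, streams)]
        fused = self.fusion(torch.cat(encoded, dim=-1))  # feature-level fusion
        return self.head(fused.mean(dim=1))              # pool over time steps

# Usage: four hypothetical streams over a 50-step window
if __name__ == "__main__":
    b, t = 8, 50
    x = [torch.randn(b, t, d) for d in (3, 3, 4, 2)]
    print(MBFTSketch()(x).shape)  # torch.Size([8, 3])
```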