Gas Tungsten Arc Welding (GTAW), one of the major welding methods, is widely used in industrial manufacturing, especially for aerospace components and pressure vessels. For these critical applications, sensing and control of the penetration state are required to produce a high-quality weld joint, and the backside width is a key indicator of the penetration state. In practical manufacturing, however, the backside of the workpiece is often inaccessible due to space limitations. Therefore, how to sense welding-process information plays a key role in the automatic and intelligent control of welding processes. Many sensing methods exist, including voltage sensing, spectrum sensing, ultrasonic sensing, infrared sensing, weld pool oscillation sensing, and vision sensing. In the traditional welding process, welders observe the surface of the weld pool and adjust the welding speed and current to achieve the desired penetration state based on their professional experience; hence, the topside information of the weld pool is strongly tied to the backside width. We therefore select vision sensing to emulate the welder's observation. Vision sensing can be divided into passive sensing and active sensing. In passive sensing, intense disturbances from the welding process, such as high-intensity arc brightness, spatter, and smoke, can reduce prediction accuracy, so active vision is used here to monitor the penetration state. Its simplified, clear information supports efficient identification; accordingly, an active sensing method was developed at the University of Kentucky that projects a single-stripe laser pattern to extract topside information of the weld pool. The features in the reflected image are difficult to extract with traditional approaches. Deep learning, with its strong capability for non-linear fitting and generalization, is a powerful tool to solve this problem.
A convolutional neural network (CNN) model is designed to train on the topside images and predict the penetration state and backside width of the weld pool.
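The exact network layers are not specified here, so the following is a minimal sketch of such a CNN regressor in PyTorch; the layer sizes, the 64×64 grayscale input, and the single scalar output (predicted backside width in mm) are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class BacksideWidthCNN(nn.Module):
    """Hypothetical CNN mapping a topside laser-reflection image
    to a scalar backside-width prediction (layer sizes are assumed)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # 1x64x64 -> 8x64x64
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 8x32x32
            nn.Conv2d(8, 16, kernel_size=3, padding=1),  # -> 16x32x32
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 16x16x16
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),                                # -> 4096
            nn.Linear(16 * 16 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, 1),                            # predicted width (mm)
        )

    def forward(self, x):
        return self.regressor(self.features(x))

# One training step with an MSE regression loss (data is synthetic here).
model = BacksideWidthCNN()
images = torch.randn(4, 1, 64, 64)   # batch of topside images
widths = torch.rand(4, 1) * 6.0      # fake backside-width labels in mm
loss = nn.functional.mse_loss(model(images), widths)
loss.backward()
```

Treating the backside width as a continuous regression target (MSE loss) matches the paper's goal of predicting a width in millimeters rather than a discrete penetration class.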
In this research, a series of GTAW experiments with welding current and speed randomized within ranges that produce full penetration were conducted to acquire the dataset. The welding experimental platform consists of the welding system, the sensing system, and the control system. In the sensing system, a laser generator, an imaging plane, and two cameras (topside and backside) are installed around the weld torch in the same plane. The laser pattern generated by the laser generator is projected onto the topside surface of the weld pool; the reflected laser is received by the imaging plane and captured by the camera. The topside camera acquires the characteristics of the topside weld pool surface as the feature, and the backside camera senses the brightness of the backside weld pool as the label for the CNN model. On this experimental platform, the welding current control, welding speed control, and two-side image acquisition are carried out by a personal computer running a Python programming environment.
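Because the feature (topside frame) and label (backside frame) come from two separate cameras, each training pair must be aligned in time. A minimal sketch of nearest-timestamp pairing is given below; the function name, the `(timestamp, frame)` tuple layout, and the matching tolerance are assumptions for illustration, not details from the paper.

```python
def pair_by_timestamp(topside, backside, tol=0.05):
    """Match each topside capture to the nearest backside capture in time.

    topside, backside: lists of (timestamp_seconds, frame) tuples,
    sorted by timestamp. Pairs farther apart than `tol` are dropped.
    """
    pairs = []
    j = 0
    for t_time, t_frame in topside:
        # Advance j while the next backside frame is at least as close in time.
        while j + 1 < len(backside) and \
                abs(backside[j + 1][0] - t_time) <= abs(backside[j][0] - t_time):
            j += 1
        b_time, b_frame = backside[j]
        if abs(b_time - t_time) <= tol:
            pairs.append((t_frame, b_frame))
    return pairs

top = [(0.00, "top0"), (0.10, "top1"), (0.20, "top2")]
back = [(0.01, "back0"), (0.11, "back1"), (0.30, "back2")]
print(pair_by_timestamp(top, back))
# [('top0', 'back0'), ('top1', 'back1')]
```

The third topside frame is dropped because no backside frame falls within the tolerance, which avoids mislabeled pairs in the dataset.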
The dynamic penetration status of the weld pool in GTAW is studied and divided into stages such as partial penetration, critical penetration, full penetration, and excessive penetration. Analysis of the experiments and the physical model shows that active vision images have a clear relationship with the penetration state and backside width of the weld pool, establishing a theoretical foundation for real-time intelligent control. Deep learning (CNN) can extract all the relevant characteristics from the vision sensing data. The designed CNN achieved a test error of 0.497 mm after smoothing the predicted width with a moving-average filter, and the model processes 61.79 data pairs per second. Therefore, this research realizes an innovative application of information sensing for real-time, high-quality penetration control.
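The moving-average post-processing and the error metric quoted above can be sketched as follows; the window length and the use of mean absolute error are assumptions for illustration, since only the final 0.497 mm figure is reported.

```python
import numpy as np

def smooth_predictions(widths, window=5):
    """Moving-average filter over a sequence of predicted backside widths."""
    kernel = np.ones(window) / window
    return np.convolve(widths, kernel, mode="valid")

def mean_abs_error(predicted, actual):
    """Mean absolute error between predicted and measured widths (mm)."""
    return float(np.mean(np.abs(np.asarray(predicted) - np.asarray(actual))))

raw = np.array([4.9, 5.3, 5.0, 5.2, 5.1, 4.8, 5.0])  # per-frame CNN outputs (mm)
smoothed = smooth_predictions(raw, window=3)          # 3-point moving average
print(smoothed)                                       # 5 smoothed values
print(mean_abs_error(smoothed, [5.0] * len(smoothed)))
```

Smoothing suppresses frame-to-frame jitter in the per-image predictions, which is why the error is reported after, not before, the moving-average step.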
Keywords: Intelligent Welding, Active Vision, Penetration Width, Prediction, Deep Learning