ACTA IMEKO
ISSN: 2221-870X
December 2016, Volume 5, Number 4, 4-11

Optical system for on-line monitoring of welding: a machine learning approach for optimal set up

Giulio D'Emilia, David Di Gasbarro, Emanuela Natale
Università degli Studi dell'Aquila, Via Giovanni Gronchi 18, 67100 L'Aquila, Italy

Section: RESEARCH PAPER
Keywords: p-value; machine learning; vision system; uncertainty; online control; welding
Citation: Giulio D'Emilia, David Di Gasbarro, Emanuela Natale, Optical system for on-line monitoring of welding: a machine learning approach for optimal set up, Acta IMEKO, vol. 5, no. 4, article 2, December 2016, identifier: IMEKO-ACTA-05 (2016)-04-02
Section Editor: Lorenzo Ciani, University of Florence, Italy
Received September 24, 2016; In final form December 14, 2016; Published December 2016
Copyright: © 2016 IMEKO. This is an open-access article distributed under the terms of the Creative Commons Attribution 3.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited
Corresponding author: Giulio D'Emilia, e-mail: giulio.demilia@univaq.it

ABSTRACT
In this paper a methodology is described for the continuous checking of the settings of a low-cost vision system for the automatic geometrical measurement of welds on components of complicated shape. The measurement system is based on a laser sheet. The measuring conditions and the corresponding uncertainty are analysed by evaluating their p-value and its closeness to that of an optimal measurement configuration, even when the working conditions change. The method aims to check that optimal measuring conditions hold by using a machine learning approach for the vision system: based on such a methodology, single images can be used to check the settings, therefore allowing continuous, on-line monitoring of the capabilities of the optical measuring system. According to this procedure, the optical measuring system is able to reach and to hold uncertainty levels adequate for the automatic dimensional checking of welds and of defects, taking into account the effects of incorrect hardware/software settings of the system and environmental effects, such as varying lighting conditions. The paper also studies the effects of process variability on the quantitative evaluation method, in order to propose on-line solutions for this system.

1. INTRODUCTION
On-line monitoring of laser welding can be carried out by many inspection techniques, based on acoustic emission, optical detection, image analysis, radiographic testing and ultrasonic testing [1]. In particular, non-contact optical systems are widely used for the on-line monitoring of the quality, geometry and position of welds, based on different optical measurement methods, such as spectroscopic studies of the welding plasma plume [2], triangulation systems [3] and other types of solutions [4]. Remarkable measuring performance can be achieved in terms of measurement uncertainty, even though the cost of these solutions is often high, so that their use is limited to specific applications. Furthermore, if reliable and accurate information is needed, further actions have to be taken in order to improve the effectiveness of the results for geometry and defect classification, in particular as far as image processing is concerned. In fact, careful image analysis algorithms have been developed for the classification of defects in images obtained by radiographic systems [5], [6] or by measurement sensors of the eddy current type [7]. Further approaches appear in the literature, often based on neural networks and genetic algorithms, with different objectives: the identification of welding parameters for the optimization of both welding characteristics and cost [8], [9], and the classification of defects, in general for components of the automotive industry [10] or for welds in particular [11]. Furthermore, considerable attention has been paid to the use of neural networks for image analysis, for quick and effective classification based on a machine learning method [12].
An optical laser sheet is a simple and economical way to carry out non-contact geometrical measurements in different types of applications [3], [4], [13], [14]; a sheet of laser light overcomes the problem of low-contrast images and provides, in a simple manner, information on the shape of surfaces. This is particularly important in the case of untreated metal surfaces.
In fact, at the intersection of the laser sheet with the object to be analysed, the laser line appears well contrasted with respect to the object surface. In this way, the intersection points (circled in red in Figure 1) between different surfaces of an object are easily identified.
Usually the accuracy of these systems has to be optimized in order to achieve a satisfactory uncertainty level [15]-[17], especially for the control of complicated geometries, as in the case of automotive applications. Furthermore, the best quality level of the measurements has to be maintained, and this should be assessed on-line with a method that is not too onerous in terms of time, operations and data processing requirements. Therefore, it is important to develop both a vision system that works well under given environmental conditions and a methodology able to automatically maintain good metrological characteristics, also in complex applications and under varying operating conditions.
In this paper the measurement capability of a measurement system based on a laser sheet is described, with reference to the monitoring of welds of complicated geometry in the realization of large mechanical components. Furthermore, an experimental methodology is presented that improves the possibility of using this simple measuring solution also in difficult situations, aiming to automatically check and maintain good settings of the vision system, with particular attention to variations of the lighting level and characteristics. This work aims to identify the main parameters affecting uncertainty and to evaluate their effect from both a theoretical and an experimental point of view.
A previous work of the authors [18] made a first attempt at a method, based on a laser sheet system, that is robust and simple to use for the continuous on-line checking of the correct optical and procedural settings. The p-value parameter [19] was used to automatically identify the best settings for the vision system, taking into account the main causes of uncertainty. The method proved able to detect the effects of small variations of the settings, thus strongly supporting the setup of the vision system.
The effect of process variability was also studied, in particular the dimensional variability of the pieces to be welded together, showing that the method is very sensitive to it. The methodology proved to be robust and reliable in practical applications.
According to the previous considerations, this paper describes an improved methodology that increases the accuracy of the dimensional measurements and reduces the processing time of the images of the measured pieces, for more efficient use in on-line applications. The method is based on the use of a machine learning approach for optimal set-up and uncertainty reduction, despite variations of the procedural and environmental conditions. A specific section will present the improved methodology, explaining its theoretical and experimental motivations. The experimental results and the way the data have been processed will then be described, in order to validate the methodology for a class of practical applications. Some remarks on its use in the field will conclude the paper.

2. MATERIALS AND METHODOLOGY
The position and dimensions of the weld are checked by a vision system based on a laser sheet, whose architecture is shown in Figure 2 and Figure 3. A detailed description of the components and the complete procedure for the setup of the system are given in a previous work [20]. In the present paper, Matlab with the Machine Learning Toolbox and the Computer Vision System Toolbox has been used to analyse the illumination conditions in the acquired images.

Figure 1. Example of identification of the intersection point between two lines generated by the laser sheet.
Figure 2. Scheme of the system for the welding control.
Figure 3. Picture of the system for the welding control.

The proposed system uses the same hardware components as the triangulation systems [3], but it provides a different software elaboration; in fact, this approach does not identify the geometrical surface of the analysed object, but evaluates, in a simple manner, the distance of characteristic points from a reference. This possibility can be useful in all phases of a welding process: the checking of the line of contact of the two pieces to be welded (seam tracking); the driving of the robotic arm that deposits the welding bead; the validation of the position and of the dimensional parameters of the realized welding (positioning and shape of the weld).
Measurements have been carried out on a piece constituted by two perpendicular steel plates, to be welded together by a linear bead, i.e. the typical T-joint. The weld seam will have a convex shape and a throat thickness of about 5 mm. In this case, the position of interest is that of the intersection point between the two lines generated by the laser sheet (Figure 1), with respect to a reference point of the piece (seam tracking). It should be noted that the method is also suitable for other configurations, allowing the measurement of any dimensional parameter based on the evaluation of the distance between two points. The method is potentially not influenced by the welding parameters. A minimal sketch of this distance evaluation is given below.
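The paper does not report the image-processing code; the evaluation can be pictured as fitting a straight line to each of the two laser-line segments and intersecting them. The following Matlab sketch illustrates one plausible implementation, not the authors' actual code: the pixel coordinates are dummy data, and the pixel/mm factor and the reference point are hypothetical placeholders.

```matlab
% Sketch (assumptions): pixel coordinates of the two laser-line segments
% have already been extracted by grey thresholding; values below are dummies.
x1 = 1:60;      y1 = 0.02*x1 + 120 + 0.3*randn(1,60);    % segment on plate 1
x2 = 60:110;    y2 = -1.1*x2 + 186 + 0.3*randn(1,51);    % segment on plate 2
pxmm = 0.12;                  % mm per pixel, from the gauge-block calibration
ref  = [10, 120.2];           % reference point of the piece [px] (placeholder)

p1 = polyfit(x1, y1, 1);      % least-squares line fit: y = p(1)*x + p(2)
p2 = polyfit(x2, y2, 1);

xi = (p2(2) - p1(2)) / (p1(1) - p2(1));          % intersection abscissa [px]
yi = polyval(p1, xi);                            % intersection ordinate [px]

d_mm = pxmm * hypot(xi - ref(1), yi - ref(2));   % distance to reference [mm]
fprintf('Measured distance: %.2f mm\n', d_mm);
```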
It is important to ensure that the conditions of correct setting of the system are maintained over time, so that the measurement results are reliable and accurate. Having a correct setting physically means that all variability causes act in a random and reduced way, so that no systematic and no remarkable effects of any specific variability cause appear. Therefore, if the system is properly set, the measurements follow a Gaussian distribution with a fixed mean and a small standard deviation [21]. The methodology described hereinafter is based on this assumption. This condition should be guaranteed both at the installation of the vision system and during on-line operation.
In the previous step [18] the Gaussian distribution of repeated measurements of the quantities of interest was verified by the p-value [19]. The p-value is widely used in statistical hypothesis testing. A model (the null hypothesis) and a threshold value for p, called the significance level of the test and denoted as α, have to be chosen. If the p-value is less than or equal to the significance level α, the test suggests that the observed data are inconsistent with the null hypothesis, so the null hypothesis is rejected [19]. In this case the null hypothesis is the Gaussian distribution of the measurement results; hence, if the p-value, ranging between 0 and 1, is greater than a pre-set threshold, the Gaussian distribution of the measurements is accepted, with the confidence level indicated in the calculation of the p-value itself.
Of course, before the production activity begins, the time available for checking that the measurement distribution is effectively Gaussian is much longer than the time available during production, due to the need to avoid bottlenecks in the production rhythm. Therefore, automatically checking these conditions imposes different requirements, because of the different time available for control before the start of production and during production itself; this influences the number of repeated measurements which are possible. The need to ensure that the probability distributions of possible measurements remain unchanged (according to the measurement performance) over time requires that the closeness of the two distributions be evaluated in a very short time for on-line control.
In the former approach [18] measurements were carried out on each production workpiece, to check that the optimal setup conditions were maintained. Each measurement was repeated n times (typically n = 30); this takes about 2 s to 3 s, depending on the acquisition and image processing rate of the system. The p-value was calculated on the basis of these n measurements. Then the p-value estimate is compared with a threshold value: if the p-value is above the threshold level, the system is correctly set. The threshold level can be defined in relation to the requested accuracy and to the specific environmental conditions by means of a statistical analysis (mean and standard deviation of the p-values) based on a sufficient number of p-values (at least 30), calculated on the first batch of production. Since it is necessary to acquire 30 measurements in static conditions and then to compute the p-value, in some cases the production process can be too fast for this method.
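The paper does not specify which goodness-of-fit test yields the p-value. One plausible choice, consistent with the dependence of the p-value on the set standard uncertainty discussed in Section 3.1, is a one-sample Kolmogorov-Smirnov test of the n = 30 measurements against the reference Gaussian; the sketch below assumes this choice, and the reference values and threshold are placeholders.

```matlab
% Sketch (assumptions): the reference distribution N(mu0, sigma0) comes from
% the setup phase; the vector d stands in for n repeated on-line measurements.
mu0    = 60.77;                  % reference distance [mm]
sigma0 = 0.10;                   % set standard uncertainty [mm] (placeholder)
n      = 30;
d      = mu0 + sigma0*randn(n,1);            % stand-in for the n measurements

pd     = makedist('Normal', 'mu', mu0, 'sigma', sigma0);
[~, p] = kstest(d, 'CDF', pd);               % p-value of the Gaussian hypothesis

p_threshold = 0.5;               % placeholder; set from first-batch statistics
if p > p_threshold
    fprintf('p = %.3f: settings OK, measurement accepted\n', p);
else
    fprintf('p = %.3f: settings should be adjusted\n', p);
end
```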
For these reasons, in this paper the procedure has been improved according to the following actions:
1. "Optical" setup of the vision system and spatial calibration: the optical setup is carried out using a simple calibration pattern. The pattern has to be framed by the camera, and a program for the identification of edges and intersection points is implemented, using the image processing software. Focusing and exposure time are manually set until all elements are recognized. A gauge block of known size is used for the evaluation of the pixel/mm ratio.
2. Preliminary tests: an experimental, parametric study of the effect of changing the settings of the optical system and of varying the environmental conditions is carried out. Of course, the most relevant aspects are taken into account, such as the grey level settings and the variation of the environmental light intensity and typology, but other variability sources of the welding process could be considered, for example differences in surface reflection. For each test condition the p-value and its statistics are evaluated, in order to correlate estimated ranges of the p-value with the different operating conditions, depending on the settings and the environmental scenario. This step is intended to establish a correspondence between the p-value level, the illumination conditions and the settings of the vision system. The estimated uncertainty level is also of concern.
3. Training and testing of the machine learning classifier: for each test condition 150 images are acquired to train and test a machine learning classifier. The aim of the classification is to quickly identify the operating illumination condition and then the corresponding optimal settings of the vision system.
4. Validation: repeated measurements for all the considered working conditions are carried out, in order to check the reproducibility of the p-value measurements. This step is also used to define the p-value threshold more effectively, taking into account the variability in reproducibility conditions.
5. On-line monitoring of the optical system: one image is recorded and classified by the machine learning classifier, to identify the actual operating conditions (illumination condition and settings of the vision system). In this way, image classification allows the p-value of the system to be estimated indirectly: in case a predefined threshold is not reached, the system settings should be automatically modified in order to improve the p-value towards the optimal setting. When the p-value is greater than the threshold limit, the measurement of the distance of interest can be accepted with a known uncertainty level (a minimal sketch of this decision logic is given below, after the list).
A synthesis of the methodology is represented in the flow diagram of Figure 4.
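As an illustration of step 5 only, the following sketch maps the illumination condition predicted from a single image to the corresponding optimal grey thresholds of Table 1; the variable classifier and the frame img are hypothetical, so the predict call is commented out to keep the sketch runnable standalone.

```matlab
% Sketch (assumptions): 'classifier' would be a trained image category
% classifier and 'img' a single acquired frame; the lookup table below
% collects the optimal grey thresholds of Table 1.
optimal = containers.Map( ...
    {'ic_1','ic_2','ic_3','ic_4','ic_5','ic_6'}, ...
    {[45 80],[75 40],[55 50],[50 90],[55 60],[85 54]});

% [labelIdx, ~] = predict(classifier, img);   % classify the single image
% ic = classifier.Labels{labelIdx};
ic = 'ic_3';                                  % stand-in for the predicted label

thr = optimal(ic);
fprintf('Condition %s: apply grey thresholds [%d %d]\n', ic, thr(1), thr(2));
% If the currently applied settings differ from 'thr', the expected p-value
% falls below the threshold and the system is re-set before measuring.
```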
In the next section the methodology will be discussed with reference to some practical applications concerning pieces to be welded.

3. RESULTS
In this section the effects of varying the settings of the vision system with respect to the best ones, under different lighting conditions, and the application of the machine learning procedure are evaluated. Measurements have been carried out, as an example, on a piece constituted by two perpendicular steel plates, to be welded together by a linear bead (Figure 1).
The grey threshold has been identified as the most relevant setting parameter to be studied, together with the lighting conditions as the environmental parameter. Changing the setting of the vision system by varying the grey thresholds (the grey level of the transition from the dark area to the light one, ranging from 0 to 255) modifies the two boundaries of the laser line, which is a broken line because of the object shape (Figure 1). These data are processed in order to find the distance to be measured.
Six different conditions of illumination (ic) have been compared. For each lighting condition, different settings of the grey threshold have been examined:
- ic 1: a neon light system at a 2 m distance from the object;
- ic 2: the same as ic 1, but the neon light system is partially obscured on the left side, at 1.5 m from the steel piece;
- ic 3: the same as ic 1, but the neon light system is partially obscured at a 0.3 m distance;
- ic 4: the same as ic 3, but the neon light system is partially obscured at a 0.5 m distance;
- ic 5: the same neon system as ic 1, but a led lamp is added at a 0.3 m distance;
- ic 6: the same as ic 2, but the led lamp is placed at a 0.5 m distance.
The system setting can be optimal or not, depending on the selected grey threshold values. The conditions are optimal when the measurement result corresponds to the reference one and the standard deviation is due to random effects only, so that the calculated p-value is very high (near to 1). For each illumination condition four cases are analysed: three cases have different grey threshold values, and the last one is a reproduction of the best case among the previous ones, based on the p-value. Table 1 shows the test plan, comprising 24 different cases.
According to the procedure described in the previous section, a large amount of data has been acquired: in the order of 3000 measurements and 150 bitmap images for each case. The results of the "preliminary tests" are discussed in the following subsections.

3.1. P-value analysis
The results of the "preliminary tests" step are described in the graphs of Figure 5. For all tests n = 30, n being the number of repeated on-line measurements. In particular, the diagrams of Figure 5 show the behaviour of the mean p-value (averaging 100 p-values) with respect to the set measurement uncertainty (the standard deviation of the normal reference distribution of the vision system measurements) for all the examined cases. The reference distance value is (60.77 ± 0.01) mm.
For the cases with the optimal settings, even when the set uncertainty is reduced, the mean p-value remains practically unchanged, maintaining high values; for the other cases, the p-value drops quickly when the requested uncertainty of the measurements is reduced. This is true for all the illumination conditions (except for ic 5). This result is very reasonable and it also suggests that a threshold value could be set to separate the best setting case from the other ones. The diagrams of Figure 5 show that separated p-value ranges occur, between optimal and non-optimal settings, if a reduced standard uncertainty is considered (standard uncertainty less than 0.23 mm). This result confirms the ability of this method to distinguish the correct setting from slightly different configurations of the vision system. In all the lighting conditions considered, when the right setting of the grey threshold is fixed, the estimated distance between the reference points is in the range (60.77 ± 0.01) mm.
The diagrams of Figure 5 also show that the p-value is strongly dependent on the grey setting for all the lighting conditions which have been examined. Given the observed differences of p-value, even when the variability of the p-value is taken into account, threshold values could easily be set in order to quickly identify a condition of right setting (one plausible derivation of such a threshold is sketched at the end of this subsection). It is interesting to compare, for each illumination condition, the p-values of the two cases with the same optimal settings: the p-value estimates are very repeatable for all conditions.
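Following the batch statistics described in Section 2, one plausible way to set such a threshold is sketched below; the p-value populations are placeholder data, and the separation rule (k standard deviations) is an assumption, not the authors' stated criterion.

```matlab
% Sketch (assumptions): p-values from the first batch, collected with the
% optimal settings (p_opt) and with slightly wrong settings (p_bad).
p_opt = min(1, 0.9 + 0.05*randn(30,1));   % stand-ins for at least 30 p-values
p_bad = max(0, 0.1 + 0.05*randn(30,1));

% Threshold separating the two populations, e.g. k standard deviations
% below the mean p-value of the optimal configuration:
k = 3;
p_threshold = mean(p_opt) - k*std(p_opt);
assert(p_threshold > mean(p_bad) + k*std(p_bad), ...
    'p-value ranges overlap: reduce the set uncertainty');
fprintf('p-value threshold: %.2f\n', p_threshold);
```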
Figure 4. Flow diagram of the methodology.
Figure 5. Mean p-values and their standard deviation ranges vs. standard uncertainty.

Table 1. Test plan.
Illumination condition | Case | Grey threshold 1 | Grey threshold 2 | Optimal
ic 1 | Case 1 | 40 | 60 |
ic 1 | Case 2 | 45 | 80 | optimal
ic 1 | Case 3 | 55 | 100 |
ic 1 | Case 4 | 45 | 80 | optimal
ic 2 | Case 5 | 45 | 80 |
ic 2 | Case 6 | 75 | 40 | optimal
ic 2 | Case 7 | 55 | 60 |
ic 2 | Case 8 | 75 | 40 | optimal
ic 3 | Case 9 | 45 | 80 |
ic 3 | Case 10 | 55 | 50 | optimal
ic 3 | Case 11 | 65 | 65 |
ic 3 | Case 12 | 55 | 50 | optimal
ic 4 | Case 13 | 45 | 80 |
ic 4 | Case 14 | 50 | 90 | optimal
ic 4 | Case 15 | 55 | 60 |
ic 4 | Case 16 | 50 | 90 | optimal
ic 5 | Case 17 | 45 | 80 |
ic 5 | Case 18 | 50 | 75 |
ic 5 | Case 19 | 55 | 60 | optimal
ic 5 | Case 20 | 55 | 60 | optimal
ic 6 | Case 21 | 45 | 80 |
ic 6 | Case 22 | 85 | 54 | optimal
ic 6 | Case 23 | 65 | 65 |
ic 6 | Case 24 | 85 | 54 | optimal

3.2. Machine learning classification
The results of the "training and testing of the machine learning classifier" phase of the methodology described in Section 2 are now discussed. 150 bitmap images, randomly selected from cases 1 to 3 of each illumination condition, have been used for the training and the testing of the machine learning algorithm. The steps of the machine learning procedure are the following:
1. Loading the images into the system: 900 images are loaded (150 for each illumination condition).
2. Extracting features from the images and keeping the 80 % strongest features from each image set. A feature is a measurable property or a characteristic of the image, for instance a geometrical point of discontinuity or an area at a specific grey level. In total the Matlab script automatically finds 24491 features for each illumination condition.
3. Using K-means clustering to create a 200-word visual vocabulary from the 24491 features.
4. Encoding all the 900 images with the features and the clustering method described in points 2 and 3.
5. Selecting and training the supervised machine learning classifier: the images used for the training are 50 % of the available ones for each illumination condition; the other ones are used for testing the system. A large number of algorithms have been compared for the training of the system; the results are described in Table 2 (a sketch of this bag-of-visual-words pipeline is given at the end of this subsection).
Most of the algorithms classify the images with 100 % accuracy. This result confirms the high quality of the images and their uniformity in each training set. This is a good result for the training phase, but a problem for on-line monitoring: in fact, an algorithm trained on a dataset with low variability is generally unable to work with real data, which are more variable than the original ones.
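The paper reports that the Matlab Computer Vision System Toolbox was used; the sketch below shows how steps 1 to 5 could plausibly be realized with that toolbox. The folder layout, with one subfolder per illumination condition, is an assumption.

```matlab
% Sketch (assumptions): images stored in one folder per illumination
% condition, e.g. images/ic_1, ..., images/ic_6 (hypothetical layout).
imds = imageDatastore('images', 'IncludeSubfolders', true, ...
                      'LabelSource', 'foldernames');            % step 1
[trainSet, testSet] = splitEachLabel(imds, 0.5, 'randomized');   % 50/50 split

bag = bagOfFeatures(trainSet, ...                                % steps 2-3
    'StrongestFeatures', 0.8, ...    % keep the 80 % strongest features
    'VocabularySize', 200);          % 200-word visual vocabulary (K-means)

classifier = trainImageCategoryClassifier(trainSet, bag);        % steps 4-5
confMat = evaluate(classifier, testSet);                         % test accuracy
fprintf('Mean accuracy: %.1f %%\n', 100*mean(diag(confMat)));
```

Note that trainImageCategoryClassifier fits a multiclass linear SVM; to compare the various classifiers of Table 2, the encoded feature vectors obtained with encode(bag, ...) can instead be exported, for example, to the Classification Learner app.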
Table 2. Training results.
Classifier | Accuracy
Decision Trees, Complex | 99.3 %
Decision Trees, Medium | 99.3 %
Decision Trees, Simple | 83.3 %
Discriminant Analysis, Linear | 100 %
Discriminant Analysis, Quadratic | 100 %
SVM, Linear | 100 %
SVM, Quadratic | 100 %
SVM, Cubic | 100 %
KNN, Fine | 100 %
KNN, Medium | 100 %
KNN, Coarse | 100 %
KNN, Cosine | 100 %
KNN, Cubic | 100 %
KNN, Weighted | 100 %
Ensembles, Boosted Trees | 17 %
Ensembles, Bagged Trees | 100 %
Ensembles, Subspace Discriminant | 100 %
Ensembles, Subspace KNN | 100 %

3.3. Validation of the machine learning classifier
The results of the "validation" phase of the methodology described in Section 2 are now discussed: the images of the fourth case of each illumination condition are used to test the classifier. Five images are taken from each of the following cases: 4, 8, 12, 16, 20 and 24. These cases are representative of all the illumination conditions (ic 1 to ic 6). These images were taken in the same conditions as the other cases, after a delay of a few hours. Even though a few hours passed between the acquisitions, the images to be compared are very similar (Figure 6).

Figure 6. Comparison of images with the same illumination condition: first image of case 1 vs. last image of case 4.

The classifiers with 100 % accuracy have been used in the validation step. Among these, the best results have been reached by the Linear Discriminant classifier, reported in the second column of Table 3: all the conditions are correctly identified. Based on the successful classification, the working situation is identified and the standard deviation of the measurements can be estimated, depending on the actual setting of the grey threshold levels, according to the results of Figure 5.
In order to check the reproducibility of the measurements, the mechanical and the optical systems have been dismounted and reassembled; all the different illumination conditions were also recreated as before, with the aim of obtaining the same working conditions as cases 4, 8, 12, 16, 20 and 24. A preliminary p-value evaluation has been made, to check that optimal working conditions hold for all the illumination conditions to be analysed. A test concerning the classification of the images according to the illumination condition has then been carried out; the results are shown in the third column of Table 3.

Table 3. Response of the Linear Discriminant classifier.
Illumination condition of the input image | Response of the classifier | Response of the classifier with new data
ic_4 | ic_4 | ic_3
ic_2 | ic_2 | ic_2
ic_3 | ic_3 | ic_2
ic_2 | ic_2 | ic_2
ic_1 | ic_1 | ic_1
ic_6 | ic_6 | ic_6
ic_5 | ic_5 | ic_5
ic_4 | ic_4 | ic_3
ic_6 | ic_6 | ic_6
ic_5 | ic_5 | ic_5
ic_5 | ic_5 | ic_5
ic_4 | ic_4 | ic_3
ic_6 | ic_6 | ic_6
ic_3 | ic_3 | ic_2
ic_1 | ic_1 | ic_1
ic_3 | ic_3 | ic_2
ic_2 | ic_2 | ic_2
ic_1 | ic_1 | ic_1
ic_4 | ic_4 | ic_3
ic_1 | ic_1 | ic_1
ic_3 | ic_3 | ic_2
ic_2 | ic_2 | ic_2
ic_6 | ic_6 | ic_6
ic_1 | ic_1 | ic_1
ic_6 | ic_6 | ic_6
ic_4 | ic_4 | ic_3
ic_3 | ic_3 | ic_2
ic_5 | ic_5 | ic_5
ic_2 | ic_2 | ic_2
ic_5 | ic_5 | ic_5

Figure 7. Example images from ic_2, ic_3 and ic_4.

The classification capability of the classifier is satisfactory: only a few conditions are confused, and these are very similar with respect to the illumination condition (Figure 7). In case of confusion, the uncertainty corresponding to the poorer illumination condition can be considered (one plausible implementation of this fallback is sketched below).
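The fallback rule can be implemented, for instance, by inspecting the classification scores and, when the two best candidates are close, adopting the larger of the two associated uncertainties. This is an assumed implementation, not taken from the paper, and all numerical values below are placeholders.

```matlab
% Sketch (assumptions): per-condition standard uncertainties [mm] estimated
% from Figure 5 (placeholder values), and classification scores for one
% image, e.g. from [~, scores] = predict(classifier, img).
labels = {'ic_1','ic_2','ic_3','ic_4','ic_5','ic_6'};
u_ic   = containers.Map(labels, {0.10, 0.15, 0.20, 0.12, 0.10, 0.14});
scores = [0.02 0.45 0.43 0.04 0.03 0.03];      % stand-in classifier scores

[s, idx] = sort(scores, 'descend');
if s(1) - s(2) < 0.05                          % two conditions confused
    u = max(u_ic(labels{idx(1)}), u_ic(labels{idx(2)}));  % poorer condition
else
    u = u_ic(labels{idx(1)});
end
fprintf('Assigned standard uncertainty: %.2f mm\n', u);
```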
4. CONCLUSION
A method has been discussed for the on-line validation of the position measurements of a monitoring system for welding. The device is based on a vision system and a laser sheet. The p-value parameter has been used to automatically identify the best settings for the vision system, taking into account the main causes of uncertainty. The method is able to detect the effects of small variations of the settings, thus strongly supporting the setup of the vision system. A procedure has been proposed with the purpose of applying the method on-line; it has been validated for different conditions of lighting.
The results proved that the procedure is able to check the holding of optimal measuring conditions by using a machine learning approach for the vision system: based on such a methodology, single images can be used to check the settings, enabling on-line use. Once the working situation is identified, the standard deviation of the measurements can be estimated, depending on the actual setting of the grey threshold levels. Achievable uncertainty values are in the order of some tenths of a millimetre. As an important improvement over the previous version of the method, the presented solution makes the measurement procedure unaffected by the process variability, with a remarkable reduction of the measurement uncertainty.

ACKNOWLEDGEMENT
The courtesy of Vision Device Srl, Torrevecchia Teatina, Italy, for making available the vision system and the processing software, is gratefully acknowledged.

REFERENCES
[1] J. Shao, Y. Yan, "Review of techniques for on-line monitoring and inspection of laser welding", Journal of Physics: Conference Series 15 (2005) pp. 101-107.
[2] M. Ferrara, A. Ancona, P. M. Lugara, M. Sibilano, "Online quality monitoring of welding processes by means of plasma optical spectroscopy", Proc. SPIE (2000) pp. 750-758.
[3] D. Acosta, O. Garcia, J. Aponte, "Laser triangulation for shape acquisition in a 3D scanner plus scan electronics", Proc. Robotics and Automotive Mechanics Conference, 26-29 Sept. 2006, Cuernavaca, Morelos, Mexico, pp. 14-19.
[4] F. Kong, J. Ma, B. Carlson, R. Kovacevic, "Real-time monitoring of laser welding of galvanized high strength steel in lap joint configuration", Opt. Laser Technol. 44 (2012) pp. 2186-2196.
[5] G. Wang, T. Warren Liao, "Automatic identification of different types of welding defects in radiographic images", NDT&E International 35 (2002) pp. 519-528.
[6] K. Aoki, Y. Suga, "Application of artificial neural network to discrimination of defect type in automatic radiographic testing of welds", ISIJ International 39 (1999) pp. 1081-1087.
[7] O. Postolache, A. Lopes Ribeiro, H. Ramos, "Weld testing using eddy current probes and image processing", Proc. of XIX IMEKO World Congress "Fundamental and Applied Metrology", September 6-11, 2009, Lisbon, Portugal, pp. 438-442.
[8] J. Gunther, P. M. Pilarski, G. Helfrich, H. Shen, K. Diepold, "Intelligent laser welding through representation, prediction and control learning: An architecture with deep neural networks and reinforcement learning", Mechatronics 34 (2016) pp. 1-11.
[9] H. Y. Tseng, "Welding parameters optimization for economic design using neural approximation and genetic algorithm", Int. J. Adv. Manuf. Technol. 27 (2006) pp. 897-901.
[10] A. Sumesh, K. Rameshkumar, K. Mohandas, R. Shyam Babu, "Use of machine learning algorithms for weld quality monitoring using acoustic signature", Procedia Computer Science 50 (2015) pp. 316-322.
[11] S. Ravikumar, K. L. Ramachandran, V. Sugumaran, "Machine learning approach for automated visual inspection of machine components", Expert Systems with Applications 38 (2011) pp. 3260-3266.
[12] A. Krizhevsky, I. Sutskever, G. E. Hinton, "ImageNet classification with deep convolutional neural networks", Advances in Neural Information Processing Systems 25 (2012) pp. 1097-1105.
[13] T. Qing-bin, J. Chao-qun, H. Hui, L. Gui-bin, D. Zhen-liang, Y. Feng, "An automatic measuring method and system using laser triangulation scanning for the parameters of a screw thread", Meas. Sci. Technol. 25 (2014) pp. 035202-035211.
[14] P. Bellandi, F. Docchio, G. Sansoni, "Roboscan: a combined 2D and 3D vision system for improved speed and flexibility in pick-and-place operation", Int. J. Adv. Manuf. Technol. 69 (2013) pp. 1873-1886.
[15] I. Bešić, N. Van Gestel, J. P. Kruth, P. Bleys, J. Hodolič, "Accuracy improvement of laser line scanning for feature measurements on CMM", Opt. Laser Eng. 49 (2011) pp. 1274-1280.
[16] M. Mahmud, D. Joannic, M. Roy, A. Isheil, J. F. Fontaine, "3D part inspection path planning of a laser scanner with control on the uncertainty", Computer-Aided Des. 43 (2011) pp. 345-355.
[17] F. Xi, Y. Liu, H.-Y. Feng, "Error compensation for three-dimensional line laser scanning data", Int. J. Adv. Manuf. Technol. 18 (2001) pp. 211-216.
[18] G. D'Emilia, D. Di Gasbarro, E. Natale, "On line control of optimal set-up of a laser sheet system for real time monitoring of welding", Proc. of 14th IMEKO TC10 Workshop on Technical Diagnostics, June 27-28, 2016, Milan, Italy, pp. 158-163.
[19] D. C. Montgomery, "Statistical Quality Control", McGraw-Hill, 2006, ISBN 978-0470169926, pp. 116-117.
[20] G. D'Emilia, D. Di Gasbarro, E. Natale, "A simple and accurate solution based on a laser sheet system for position and size on line monitoring of weldings", J. Phys. Conf. Ser. 658 (2015).
[21] S. A. Alfaro, G. C. Carvalho, F. R. Da Cunha, "A statistical approach for monitoring stochastic welding processes", J. Mater. Process. Technol. 175 (2006) pp. 4-14.