CHEMICAL ENGINEERING TRANSACTIONS VOL. 58, 2017
A publication of The Italian Association of Chemical Engineering
Online at www.aidic.it/cet
Guest Editors: Remigio Berruto, Pietro Catania, Mariangela Vallone
Copyright © 2017, AIDIC Servizi S.r.l.
ISBN 978-88-95608-52-5; ISSN 2283-9216

Performance Optimization of Neural-Network Based Colour Measurement Tools for Food Applications

Paolo Barge a, Lorenzo Comba b, Paolo Gay a, Davide Ricauda Aimonino a, Cristina Tortia a,*, Nahid Aghilinategh c, Mohammad Jafar Dalvand c

a DI.S.A.F.A. – Università degli Studi di Torino, 2 Largo Paolo Braccini, 10095 Grugliasco (TO), Italy
b DENERG – Politecnico di Torino, 24 Corso Duca degli Abruzzi, 10129 Torino, Italy
c D.A.M.E. – University of Tehran, Karaj, Iran
* cristina.tortia@unito.it

Colour is the first attribute perceived by consumers in assessing food quality and, in many cases, it is the only means available to qualify a product at purchase. For this reason, describing colour by analytical methods is fundamental in food processing control. Computer vision systems acquire RGB data, which are device-dependent and sensitive to differences in lighting; they are therefore not directly suitable for colour evaluation that mimics human vision. Traditional colorimeters, by contrast, adopt CIELab coordinates and work in a human-oriented colour space in which the Euclidean distance between two colours (∆E) correlates well with the difference perceived by human sight. Nevertheless, vision systems offer many advantages, such as the capability of acquiring larger areas of the food surface and ease of implementation in automated plants at low cost. Neural networks, trained on a set of selected colour samples, can approximate the RGB-to-L*a*b* relationship to characterise the colour of food samples under test.
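The colour difference ∆E mentioned above is, in its simplest (CIE76) form, just the Euclidean distance between two CIELab triples. A minimal sketch:

```python
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two colours in CIELab (CIE76 delta E*ab)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Example with two arbitrary greenish colours; a delta E below ~2.3 is
# commonly taken as a just-noticeable difference for a human observer.
print(delta_e_cie76((52.0, -40.0, 30.0), (54.0, -38.0, 31.0)))  # → 3.0
```

Later ∆E formulas (CIE94, CIEDE2000) weight the L*, a*, b* axes non-uniformly, but the Euclidean CIE76 form is the one implied by "distance in CIELab space" here.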
The aim of this paper is to present a rapid method, based on neural networks, for the calibration of a CCD (Charge-Coupled Device) camera colour acquisition system to obtain reliable L*a*b* information. Preliminary results concerning the influence of the composition of the training set and of the camera settings (aperture and exposure time) on the reliability and accuracy of the colour measurement system are also discussed.

Please cite this article as: Barge P., Comba L., Gay P., Ricauda D., Tortia C., Aghilinategh N., Dalvand M.J., 2017, Performance optimization of neural-network based colour measurement tools for food applications, Chemical Engineering Transactions, 58, 589-594, DOI: 10.3303/CET1758099

1. Introduction

In food processing chains, colour is often checked on production lines to verify product quality. Traditionally, colour inspection is performed by the human eye or by colorimeters, which usually measure CIELab coordinates; as a consequence, quality control is highly expensive in terms of manpower, the assessment is affected by subjectivity or, in the case of colorimeters, only a very small area is considered. The acquisition of RGB values by camera sensors has the disadvantage of depending on the electronic sensor and on the illuminant. Methods to convert RGB-encoded colours into the CIELab colour space have been proposed by different authors (Leon et al., 2006; Valous et al., 2009; Kılıç et al., 2007; Taghadomi-Saberi et al., 2015). Among these, Artificial Neural Networks gave good results. This paper investigates the effect of the following factors on the performance of Artificial Neural Network (ANN) colour calibration: 1) the number of hidden neurons in the ANN; 2) the camera aperture and shutter speed; 3) the colours used for training. The performance was tested separately for different colours using a commercial palette employed by photographers for camera calibration.
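To make the calibration idea concrete, the sketch below trains a small one-hidden-layer regression network to map RGB triples to Lab-like triples. It is a hypothetical NumPy illustration, not the authors' Matlab implementation: the training data are synthetic (an arbitrary smooth mapping standing in for colorimeter readings), and the network size, learning rate and epoch count are placeholder choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the training set: RGB in [0, 1] mapped to
# "Lab-like" targets by an arbitrary linear rule (NOT real colorimetry).
X = rng.random((200, 3))
Y = np.column_stack([100 * X.mean(axis=1),      # L*-like lightness
                     60 * (X[:, 0] - X[:, 1]),  # a*-like red-green axis
                     60 * (X[:, 1] - X[:, 2])])  # b*-like yellow-blue axis

# Standardise targets so gradient descent behaves well.
Ym, Ys = Y.mean(axis=0), Y.std(axis=0)
Yn = (Y - Ym) / Ys

# One hidden layer (tanh), linear output: 3 -> 10 -> 3.
n_hidden = 10
W1 = rng.standard_normal((3, n_hidden)) * 0.5
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, 3)) * 0.5
b2 = np.zeros(3)

def forward(X):
    H = np.tanh(X @ W1 + b1)      # hidden activations
    return H, H @ W2 + b2         # linear output layer

lr = 0.1
for epoch in range(3000):
    H, P = forward(X)
    err = P - Yn                  # residuals, shape (N, 3)
    # Backpropagation for the two-layer network (mean squared error).
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)   # tanh derivative
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, P = forward(X)
rmse = np.sqrt(((P - Yn) ** 2).mean())   # RMSE in standardised units
print(f"training RMSE (standardised): {rmse:.3f}")
```

Predictions are recovered in original units as `P * Ys + Ym`; in the paper the targets come from colorimeter measurements of the ColorChecker patches rather than a synthetic rule.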
2. Methods

A Nikon D7000 colour DSLR (Digital Single-Lens Reflex) camera with a CMOS (Complementary Metal-Oxide Semiconductor) image sensor was mounted on a stand and connected with a dome lighting system, which ensured uniform illumination of the food samples. A white plastic hemisphere (350 mm diameter) reflected the light provided by two concentric circular LED crowns at the edge of the dome, obtaining a 5500 K colour temperature (Figure 1). The electric power of the lighting system was 8.45 W. The camera and the illumination system were placed in a wooden box (500 mm × 600 mm × 900 mm), internally painted black to minimise external light and reflections (Valous et al., 2009). This system is an improvement of the equipment used in a previous work (Ricauda et al., 2015).

Figure 1: Picture of the computer vision system.
Figure 2: X-rite Digital ColorChecker SG. Zone A and zone B are contoured in red (upper rectangle) and yellow (lower rectangle), respectively.

The white balance was obtained using an X-rite white balance card. For each colour, 195 pictures of a ColorChecker Color Rendition chart were taken in all combinations of 13 apertures (f/5.6, 6.3, 7.1, 8, 9, 10, 11, 13, 14, 16, 18, 20 and 22) and 15 exposure times (1/15 s, 1/20 s, 1/25 s, 1/30 s, 1/40 s, 1/50 s, 1/60 s, 1/80 s, 1/100 s, 1/125 s, 1/160 s, 1/200 s, 1/250 s, 1/320 s and 1/400 s). Camera ISO sensitivity was set to 200 and the image resolution to 300 × 300 dpi. The shooting sequence, shutter and aperture were automatically controlled by a PC running the Digicam Control software and a customised Matlab® program. The ColorChecker Color Rendition chart contains 24 patches of different colours (McCamy et al., 1976) and is widely used in the photographic and video sectors for calibrating imaging devices. Another ColorChecker version (Digital SG) adds further colours (e.g. similar to skin, sky and grass) and more repetitions of grey-scale colours, for a total of 140 patches.
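The full factorial shooting plan (13 apertures × 15 exposure times = 195 pictures) can be enumerated as below. This only reproduces the parameter grid; the actual camera control, handled in the paper by Digicam Control and a Matlab program, is omitted.

```python
from itertools import product
from fractions import Fraction

# Aperture f-numbers and exposure times as listed in the text.
apertures = [5.6, 6.3, 7.1, 8, 9, 10, 11, 13, 14, 16, 18, 20, 22]
shutter_s = [Fraction(1, d) for d in
             (15, 20, 25, 30, 40, 50, 60, 80, 100, 125, 160, 200,
              250, 320, 400)]

# One shot per aperture/shutter combination: 13 × 15 = 195 pictures.
shots = list(product(apertures, shutter_s))
print(len(shots))  # → 195
```

Enumerating the grid up front is also convenient for naming the output files so that each image can later be matched to its camera settings.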
The manufacturer provides CIE L*a*b* coordinates (illuminant D50, 2° observer) of the patches for the two different ColorChecker versions because, owing to the toxicity of some pigments, in 2014 the firm had to change the pigment formulations. Pictures of two zones of the two versions (February 2011 and July 2016) of the Munsell Digital ColorChecker SG (X-rite) were taken separately: set A, a 6 × 4-patch zone containing the colours of the classic ColorChecker, i.e. the rectangle defined by patches E2 (upper left corner) and J5 (lower right corner); and set B, the zone defined by patches E6 (upper left corner) and J9 (lower right corner) (Figure 2). RGB values were obtained by post-processing the acquired images in Matlab®: from each colour patch, 200 random 10 × 10 pixel square zones were sampled, and their mean RGB values were used to train the neural network, for a total of 4800 RGB data used as input for each photo. CIE L*a*b* values (2° standard observer, D65 illuminant) were acquired with a Konica Minolta Chroma Meter CR-400, fitted with a CR-A33f glass light-protection tube, on the ColorChecker. Five repetitions of each measurement were performed. The Artificial Neural Network (ANN) was composed of an input layer (Ni) containing the three RGB coordinates (Figure 3), a hidden layer (Nh) composed of a variable number of n neurons (with 7