CHEMICAL ENGINEERING TRANSACTIONS VOL. 57, 2017
A publication of The Italian Association of Chemical Engineering
Online at www.aidic.it/cet
Guest Editors: Sauro Pierucci, Jiří Jaromír Klemeš, Laura Piazza, Serafim Bakalis
Copyright © 2017, AIDIC Servizi S.r.l. ISBN 978-88-95608-48-8; ISSN 2283-9216

Machine Vision-Guided Food Processing Automation

Yang Tao*, John Lin, Robert Vinson, Maxwell Holmes, Gary Seibel
Bioimaging and Machine Vision Laboratories, Fischell Department of Bioengineering, 1426 Ani Sci Bldg, University of Maryland, College Park, MD 20742, USA
ytao@umd.edu

This paper presents machine vision-guided systems for food processing automation. The goal of these equipment designs is to improve process efficiency and reduce labour intensity. An automated vision-guided intelligent strawberry de-calyxing machine and an automated 3-D imaging system for seafood sorting lines are introduced and discussed. These automated systems offer labour savings of 10- to 120-fold.

1. Introduction
Each year, vast quantities of agricultural raw products are processed before reaching the market. In food processing facilities, the raw materials of some commodities are still processed by hand, and these tedious, labor-intensive operations recur every season. The following sections introduce the research and development of automated processing techniques that improve efficiency.

2. Automated Vision-Guided Strawberry Calyx Removal System
In 2015, processed strawberries in the U.S. reached an annual valuation of over $241 million, growing 30 percent over the previous year (USDA, 2015). This represents over 250 million kilograms of strawberries harvested for processing. These processed strawberries end up in foods such as ice cream, yogurt, juices, jams, jellies, and baked goods. Before these strawberries reach the consumer, however, the calyx must be removed before each strawberry is individually quick-frozen to preserve taste and quality. Strawberries harvested for processing as frozen fruit are currently de-calyxed manually. This process requires the removal of the stem cap with its green leaves (i.e., the calyx) and has many disadvantages when performed by hand. An automated system would reduce processing labor and ease the labor shortage currently present in strawberry harvesting and processing.

2.1 The Approach
An overview of the vision-guided strawberry de-calyxing system is shown in Figure 1. It consists of four sections: 1) the loading section that loads strawberries onto the machine, 2) the strawberry material handling section that orients the strawberries, 3) the machine vision chamber that identifies the cut location on each strawberry and guides the calyx removal, and 4) the multi-waterjet knives that remove the calyxes. The machine is made of stainless steel and is capable of wash-down for sanitation. The system is designed for conical strawberries to be first loaded into a water tank. The strawberries are immersed in a sanitation bath and float to the water's surface to form a single layer. An elevating conveyor consisting of specially designed parallel roller rods then lifts the floating strawberries from the water in the valleys between adjacent rods. This common technique ensures that a single layer of fruit is processed.
The roller rods then begin rotating; due to the strawberries' conical, polarized shape, the berries orient themselves so that their axes of symmetry align parallel to the axes of the roller rods. In addition, the roller rods are molded to cup individual strawberries, creating an evenly spaced grid of oriented strawberries along the conveyor. The automated calyx removal process is illustrated in Figure 2.

When the fruit enters the optical section, an industrial camera takes an image of the fruit's surface. By the time the strawberries exit the viewing area, all calyx positions have been precisely located. The strawberries remain in fixed positions on the conveyor as they move to the calyx removal section.

Figure 1. An overview of the system. (1) Water tank loading section. (2) Strawberry material handling section. (3) Vision chamber with guards removed. (4) Multi-waterjet knife calyx removal actuation system. The dimensions of the AVID machine were 132 cm wide by 259 cm long by 210 cm high. The frame material was 11-gauge 304 stainless steel.

Figure 2. The automated strawberry calyx removal flow process. (1) Strawberries are first loaded onto the custom-designed material handling conveyor, (2) the conveyor singulates and orients the strawberries, (3) the appropriate cutting line for calyx removal is identified through computer-vision techniques, (4) waterjets remove the calyx, and finally (5) the de-calyxed strawberries are ready for processing and packaging.

The strawberry position on the conveyor is known through the use of a synchronized conveyor encoder. The computer registers each calyx's position as coordinates with respect to the conveyor, thereby locating the calyx at any given time with sub-millimeter precision. While there are several calyx removal mechanisms and options, the system uses a non-metal, blade-free waterjet knife approach. Using this machine vision-guided waterjet cutting technique, the automated calyx removal machine takes the coordinates from the vision system and removes each calyx with millimeter precision. The interactions of all three components are depicted in Figure 3.

Figure 3. A flow diagram illustrating the interactions between the main components of the AVID machine.

2.2 Image analysis and understanding
The image analysis was performed on a PC and implemented in C++. The image analysis and understanding components utilized a knowledge-based system. This consisted of a human-defined database of strawberry fruit and leaf colors that allowed their respective binary blobs to be isolated from the background image. Blob characteristics including bounding box coordinates, pixel area, x-axis projection, and vicinity to neighboring blobs are then used to identify strawberry shape and orientation. These features determine the cut-line position or the decision to reject and avoid cutting the strawberry altogether. A flow chart of the image analysis and image understanding components is shown in Figure 4.

Figure 4. Flow chart of how the calyx removal cut-line is determined.
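As a rough illustration of this knowledge-based step, the following C++ sketch classifies pixels against a small human-defined colour table and accumulates basic blob features (bounding box, area, centre of gravity). The rule structure, thresholds, and helper names are assumptions for illustration only, not the authors' implementation, and a production system would additionally split the mask into connected blobs before measuring each one.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Pixel classes used by the colour knowledge base (illustrative labels only).
enum class PixelClass { Background, Fruit, Leaf };

// One rule in the human-defined colour database: an RGB box mapped to a class.
// Real thresholds would be tuned from labelled samples at the processing plant.
struct ColourRule {
    uint8_t rMin, rMax, gMin, gMax, bMin, bMax;
    PixelClass label;
};

// Basic blob features used downstream: bounding box, pixel area,
// and centre of gravity, all in image coordinates.
struct BlobFeatures {
    int xMin = 1 << 30, xMax = -1, yMin = 1 << 30, yMax = -1;
    long area = 0;
    double xCog = 0.0, yCog = 0.0;
};

// Return the class of the first matching colour rule, else Background.
PixelClass classifyPixel(uint8_t r, uint8_t g, uint8_t b,
                         const std::vector<ColourRule>& rules) {
    for (const auto& rule : rules) {
        if (r >= rule.rMin && r <= rule.rMax &&
            g >= rule.gMin && g <= rule.gMax &&
            b >= rule.bMin && b <= rule.bMax)
            return rule.label;
    }
    return PixelClass::Background;
}

// Accumulate features of all pixels of one class in a classified mask
// (simplification: treats all pixels of the class as a single blob).
BlobFeatures extractFeatures(const std::vector<PixelClass>& mask,
                             int width, int height, PixelClass target) {
    BlobFeatures f;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            if (mask[y * width + x] != target) continue;
            f.xMin = std::min(f.xMin, x);  f.xMax = std::max(f.xMax, x);
            f.yMin = std::min(f.yMin, y);  f.yMax = std::max(f.yMax, y);
            f.xCog += x;  f.yCog += y;  ++f.area;
        }
    }
    if (f.area > 0) { f.xCog /= f.area; f.yCog /= f.area; }
    return f;
}
```

In practice the colour rules would be tuned from labelled plant samples, and the extracted fruit and leaf blob features feed the orientation and cut-line logic described next.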
The orientation of a strawberry was classified as aligned, diagonal, perpendicular, calyx-up, or tip-up; the aligned case was considered ideally oriented. The tip-up case was identified as having few or no calyx blob pixels near the fruit blob. The other orientations were identified by the average vector between the calyx blob pixels and the fruit blob's center of gravity. This vector $\vec{O}$ is calculated as follows:

$$(x_O, y_O) = \left( \frac{\sum_{x=0}^{M-1} \sum_{y=0}^{N-1} (x - x_{cog}) \, I(x,y)}{\sum_{x=0}^{M-1} \sum_{y=0}^{N-1} I(x,y)}, \;\; \frac{\sum_{x=0}^{M-1} \sum_{y=0}^{N-1} (y - y_{cog}) \, I(x,y)}{\sum_{x=0}^{M-1} \sum_{y=0}^{N-1} I(x,y)} \right)$$

where $x_O$ and $y_O$ are the end coordinates of the $\vec{O}$ vector, $M$ is the column dimension of the image, $N$ is the row dimension of the image, and $I(x,y)$ is 1 for a white (leaf) pixel and 0 for a black pixel. The magnitude and direction of the $\vec{O}$ vector determine the orientation of the strawberry based on user-defined thresholds.

To improve the sliced-fruit weight yield, a rejection algorithm was implemented. This allowed strawberries that were not ideally aligned, or were otherwise defective, to be actively avoided by the waterjet cutting stream. These fruit are sent into a return cycle for a second chance at orientation and calyx removal; in terms of output product, such strawberries are considered Return Fruit.

A strawberry's cut-line for calyx removal was assigned as a horizontal distance between the fruit blob's center of gravity (cog) and the cut-line position. This distance ($C_L$) was calculated using the following equations:

$$G_r = \frac{\sum_{y=0}^{N-1} I(x_{cog}, y)}{2}$$

$$C_L = K \left( \frac{H}{W} \right) G_r$$

where $K$ is a scalar constant that adjusts depending on strawberry orientation and shape; $H$, $W$, and $G_r$ are the fruit blob's bounding-box height, width, and gravity radius, respectively; $N$ is the row dimension of the image; and $I(x,y)$ is 1 for a white pixel and 0 for a black pixel.
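For concreteness, the orientation vector and cut-line calculations above can be written out directly in code. The following is a minimal C++ sketch under the same definitions (row-major binary masks of $M$ columns by $N$ rows); the type and function names are illustrative and not drawn from the authors' implementation.

```cpp
#include <vector>

// Binary mask: 1 = white pixel, 0 = black pixel, stored row-major
// with M columns and N rows.
struct Mask {
    std::vector<int> I;
    int M = 0, N = 0;
    int at(int x, int y) const { return I[y * M + x]; }
};

// Orientation vector O = (xO, yO): the mean offset of the calyx (leaf)
// pixels from the fruit blob's centre of gravity (xCog, yCog).
// Returns false when the leaf mask is empty (the tip-up case).
bool orientationVector(const Mask& leaf, double xCog, double yCog,
                       double& xO, double& yO) {
    double sumX = 0.0, sumY = 0.0, count = 0.0;
    for (int x = 0; x < leaf.M; ++x)
        for (int y = 0; y < leaf.N; ++y)
            if (leaf.at(x, y)) {
                sumX += x - xCog;
                sumY += y - yCog;
                count += 1.0;
            }
    if (count == 0.0) return false;
    xO = sumX / count;
    yO = sumY / count;
    return true;
}

// Gravity radius Gr: half the count of white fruit pixels in the image
// column passing through the fruit centre of gravity.
double gravityRadius(const Mask& fruit, int xCog) {
    double columnSum = 0.0;
    for (int y = 0; y < fruit.N; ++y)
        columnSum += fruit.at(xCog, y);
    return columnSum / 2.0;
}

// Cut-line offset CL = K * (H / W) * Gr, measured horizontally from the
// fruit centre of gravity; K depends on the detected orientation and shape.
double cutLineOffset(double K, double boxHeight, double boxWidth, double Gr) {
    return K * (boxHeight / boxWidth) * Gr;
}
```

A caller would first compute the fruit blob's centre of gravity, evaluate the orientation vector on the leaf mask, select $K$ from the resulting orientation class, and then place the cut-line $C_L$ pixels from the centre of gravity toward the calyx.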
2.3 Test Results
Tests were conducted during the 2016 spring and fall sessions on about 450,000 kilograms of strawberries. Figure 5 shows the system setup. The throughput of the system reached about 4,535 kg of strawberries per hour, equivalent to the productivity of about 120 manual laborers. Figure 6 shows the results of live demonstrations to growers and processors during the spring tests; these random samples were taken from the output conveyor. The residuals are to be removed by the automated detection system later on. The uncut, rejected strawberries are identified by the vision system and diverted to a return flow for a second chance at being cut.

Figure 5. The prototype vision-guided strawberry de-calyxing system setup.

Table 1: Machine-vision detection results of independent test samples in terms of sensitivity and specificity

Detection feature     No. of samples   Sensitivity (%)   Specificity (%)   Accuracy (%)
Calyx                 440              92.0              99.7              97.3
Fruit                 440              95.2              100               98.4
Rejection scenario    200              93.0              74.1              85.0

The sensitivity and specificity of the machine-vision system are shown in Table 1. The accuracies of identifying strawberry calyx and fruit regions are 97.3% and 98.4%, respectively, as verified against manually labelled ground truth. However, the accuracy of correctly deciding whether a strawberry should be rejected was lower, at 85.0 percent. This error is attributed to the larger range of strawberry shapes, sizes, and colors encountered in a processing plant compared with a fresh-market setting. Methods to improve this accuracy include adding features and rules to the detection algorithm's knowledge base, and modifying the material handling system to increase the uniform orientation of the strawberries before they enter the vision section.

Figure 6. Initial test results. Left: photo of the spring live demonstration; random samples were taken from the output conveyor and manually sorted for analysis. Right: de-calyxed output from the fall trial, with calyx and residual elements minimized.

3. Automated 3D Oyster Imaging, Grading and Sorting System
Cultured oyster aquaculture is a fast-growing business. This paper presents an on-line 3D oyster imaging, grading, and sorting system that separates oysters into quality grades for marketing. The technology enables high-volume sorting and offers greater opportunities for oyster aquaculture by delivering quality-graded oysters to the marketplace. A fully automated machine that meets USDA wash-down sanitation standards is presented.

3.1 Introduction
One of the quality attributes of an oyster is its size, which indicates the meat it potentially contains. 2D imaging is limited because it cannot capture the volume of an irregularly shaped object. Oysters harvested from the wild or farm-raised vary in size and volume, and different size and volume grades can have different market values. Although 2D imaging systems have been used for fruit and vegetable sorting (Jiang et al., 2009; Tao et al., 2016, 2002; Chao et al., 2010; Qin and Lu, 2005; Zhu et al., 2007), 3D imaging for oyster sorting is lacking.

Figure 7. Oysters are to be graded and sized before shipping quality products to consumers.

3.2 The 3D Oyster Imaging Sorter
Figure 8 shows the automated 3D imaging system we developed for sorting oysters. It has a 3D voxel resolution of 0.4 x 0.4 x 0.2 mm3/voxel and is capable of running up to 22,000 oysters per hour. After scanning by the 3D camera and electronic processors, the oysters are automatically separated into grades per user definitions. Figure 9 shows the images of an oyster: point cloud data are generated first, and the 3D image is then reconstructed for the 3D measurements. Users can define grade thresholds based on length, width, height, and volume. The machine provides oyster growers with modern tools for 1) enhanced trading of quality products, 2) consistency of oysters, 3) inventory monitoring, and 4) maximized economic returns. The machine meets the USDA sanitation standard and is IP66 wash-down capable for indoor, outdoor, and on-board operations. Videos of the dynamic conveyor operation will be shown during the presentation.

Figure 8. Automated 3D imaging system for oyster grading and sorting.

Figure 9. Image results of the 3D oyster imaging sorting system. Left: the point cloud image with height information in colour. Middle and right: the 3D reconstruction of the oyster image. The image was scanned at 1 m/s at 0.4 x 0.4 x 0.2 mm3 resolution.
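As an illustration of how such user-defined grade thresholds might be applied to the 3D measurements, the C++ sketch below derives length, width, height, and volume from a per-oyster height map at the stated 0.4 x 0.4 x 0.2 mm resolution and assigns a grade. The data layout, grade labels, and threshold values are assumptions for illustration, not the system's actual implementation.

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Height map of one oyster after background removal: heights[r][c] is the
// surface height in mm above the conveyor, or 0 where there is no oyster.
struct HeightMap {
    std::vector<std::vector<double>> heights;
    double pixelDxMm = 0.4;   // lateral resolution per pixel, as stated in the paper
    double pixelDyMm = 0.4;
};

struct OysterMeasurements {
    double lengthMm = 0, widthMm = 0, heightMm = 0, volumeMm3 = 0;
};

// Derive length/width/height/volume from the height map. Length and width
// are taken from the occupied bounding box; volume integrates the heights.
OysterMeasurements measureOyster(const HeightMap& map) {
    OysterMeasurements m;
    int rMin = 1 << 30, rMax = -1, cMin = 1 << 30, cMax = -1;
    for (int r = 0; r < (int)map.heights.size(); ++r) {
        for (int c = 0; c < (int)map.heights[r].size(); ++c) {
            double h = map.heights[r][c];
            if (h <= 0.0) continue;
            rMin = std::min(rMin, r);  rMax = std::max(rMax, r);
            cMin = std::min(cMin, c);  cMax = std::max(cMax, c);
            m.heightMm = std::max(m.heightMm, h);
            m.volumeMm3 += h * map.pixelDxMm * map.pixelDyMm;
        }
    }
    if (rMax >= 0) {
        m.lengthMm = (rMax - rMin + 1) * map.pixelDyMm;
        m.widthMm  = (cMax - cMin + 1) * map.pixelDxMm;
    }
    return m;
}

// Apply user-defined thresholds (hypothetical values) to assign a grade.
std::string gradeOyster(const OysterMeasurements& m) {
    if (m.lengthMm >= 90.0 && m.volumeMm3 >= 60000.0) return "Large";
    if (m.lengthMm >= 70.0 && m.volumeMm3 >= 35000.0) return "Medium";
    return "Small";
}
```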
4. Conclusions
Two automated machine vision-based systems are introduced in this paper. An automated strawberry calyx removal machine was developed and pilot tested. The machine's material handling system has shown that strawberries can be oriented and positioned appropriately for efficient processing, and the vision system can identify calyx removal cutting locations with relatively high accuracy in real time at about 4,500 kg of strawberries per hour. Further advancements could include a more robust strawberry orientation method or a multi-axis calyx removal system that would accommodate a wider range of shapes and sizes. Even so, this approach shows potential as a feasible option for automated, industrial, high-volume, cost-effective post-harvest strawberry de-calyxing.

For seafood processing, a fully automated 3D imaging system for oyster grading and sorting was developed. It provides high-throughput oyster sorting at up to 22,000 oysters per hour. Together, these machines enable dramatic processing efficiency gains and labor savings for the food industries.

References
Chao, K., Yang, C.-C., and Kim, M.S., 2010. Spectral line-scan imaging system for high-speed non-destructive wholesomeness inspection of broilers. Trends in Food Science & Technology, 21(3), 129-137.
Jiang, L., Zhu, B., Luo, Y., Tao, Y., and Cheng, X., 2009. 3D surface reconstruction and analysis in automated apple stem-end/calyx identification. Transactions of the ASABE, 52(5), 1775-1784.
Qin, J., and Lu, R., 2005. Detection of pits in tart cherries by hyperspectral transmission imaging. Transactions of the ASAE, 48(5), 1963-1970.
Tao, Y., Lin, J., Xin, C., and Seibel, G.E., 2016. U.S. Patent No. 9,364,020.
Tao, Y., and Wen, Z., 2002. High-speed machine vision for on-line sorting of fresh fruit and vegetables. In: Advances in Bioprocessing Engineering. ISBN 981-02-4696-x.
USDA, 2015. Noncitrus Fruits and Nuts 2015 Summary. Washington, DC: USDA, National Agricultural Statistics Service.
Zhu, B., Jiang, L., and Tao, Y., 2007. Three-dimensional shape enhanced transform for automatic apple stem-end/calyx identification. Optical Engineering, 46(1), 017201.