CHEMICAL ENGINEERING TRANSACTIONS VOL. 58, 2017
A publication of The Italian Association of Chemical Engineering
Online at www.aidic.it/cet
Guest Editors: Remigio Berruto, Pietro Catania, Mariangela Vallone
Copyright © 2017, AIDIC Servizi S.r.l.
ISBN 978-88-95608-52-5; ISSN 2283-9216
DOI: 10.3303/CET1758111

A Mobile Laboratory for Orchard Health Status Monitoring in Precision Farming

Gianluca Ristorto a,*, Raimondo Gallo a, Alessandro Gasparetto b, Lorenzo Scalera b, Renato Vidoni a, Fabrizio Mazzetto a
a Free University of Bozen-Bolzano, Faculty of Science and Technology, Piazza Università 5, 39100 Bolzano (BZ), Italy
b Università di Udine, Dipartimento Politecnico di Ingegneria e Architettura, Via delle Scienze 206, 33100 Udine (UD), Italy
gianluca.ristorto@unibz.it

Nowadays, Precision Farming (PF) is fully recognized for its potential to increase field yields, reduce costs and minimize the environmental impact of agricultural activities. The first stage of the PF management strategy is the collection of field and environmental data, used to obtain information about the crop health status within a field or among different fields and to operate suitably in each of these partitions (e.g., distribution of fertilizers, pesticide treatments, differential harvesting) according to a site-specific approach, depending on their actual needs. A mobile vehicle (Bionic eYe Laboratory, ByeLab) has been developed to monitor and sense the health status of orchards and vineyards. The ByeLab is a tracked bins carrier equipped with different sensors: two LiDARs to evaluate the shape and the volume of the canopy and six optical sensors to evaluate its chlorophyll content, thus monitoring the health state of the crops. Moreover, an RTK GNSS (Real Time Kinematic Global Navigation Satellite System) receiver is used to geo-reference the acquired data and an IMU (Inertial Measurement Unit) to record the orientation of the ByeLab during the surveys. To reproduce three-dimensional maps of the shape and the health of the canopy, data-processing algorithms have been developed and customized for the application. The combined use of heterogeneous sensors and a deep analysis of the results make it possible to define an efficient crop-monitoring methodology. A Matlab® routine has been implemented for the canopy volume reconstruction and the health status mapping. Moreover, a preliminary diagnostic algorithm exploiting both the LiDAR and the optical sensor information to detect early situations of plant stress has been conceived and implemented. The aim of this paper is to disclose the data-processing algorithms exploited to obtain a precise canopy thickness reconstruction and an accurate vegetative-state mapping. The methodology used to validate the canopy thickness measurements and the vegetative-state mapping is also presented. In particular, to validate the thickness results obtained with the mobile ByeLab, they have been compared with acquisitions made with a fixed Terrestrial Laser Scanner, showing a good correlation. A representation of the parcels monitored during the in-field surveys and of an entire orchard field is finally provided.

1. Introduction
Precision Farming (PF), also known as Precision Agriculture (PA) or Smart Farming, is the most modern management strategy used in crop production. Its methodologies can be applied to the crop vegetative stage and to agricultural operations and activities, from tillage to the application of fertilizers (Pahlmann et al., 2016), from soil characterization to crop monitoring.
PA activities can be grouped into four sequential phases involving several technologies, ranging from electronics and informatics to mechanics: first, collecting field and environmental data; second, elaborating the data and integrating them into an information system (automatic operational monitoring system) tailored to the farm requirements (Calcante and Mazzetto, 2014; Kruize et al., 2016); third, taking decisions on the crop management; and finally, driving the machines and/or adapting the action of the actuators in all the agricultural operations. Indeed, thanks to a continuous adjustment of the settings of the implements used for a crop (principle of "variable-rate application"), PA gives many opportunities to increase field yields and decrease production costs, or, in other words, to make agriculture more efficient and sustainable (Gavioli et al., 2016).
The purpose of this work is to present a crop-monitoring mobile laboratory that makes combined use of two different types of optical sensors. The paper is organized as follows. Section 2 describes the mobile platform and the hardware and software developed, whereas the algorithm workflow developed to manage the sensor data is illustrated in Section 3. Section 4 presents the experimental tests carried out to validate the thickness evaluation, and the results of the validation are presented in Section 5. A simplified representation of an entire orchard is then presented in Section 6. Finally, the paper ends with the conclusions and future work.

2. The ByeLab
The mobile laboratory (Bionic eYe Laboratory, ByeLab) is equipped with different types of sensors. Two SICK LMS 111 LiDAR sensors determine the shape and the volume of the plants. Six AgLeader OptRx ACS430 crop sensors obtain information about the plant health. An LMRK 10 AHRS Inertial Measurement Unit (IMU) by Gladiator Technologies measures the mobile lab orientation in terms of roll, pitch and yaw angles. An RTK GNSS receiver provides the mobile lab position inside the crop field. Furthermore, a sonar sensor (MaxBotix MB1023 HRLV-MaxSonar®-EZ2) is used for the preliminary indoor laboratory tests, where the GNSS sensor cannot provide the position of the ByeLab during the surveys. This set of sensors is connected to a laptop, which runs an acquisition program developed in the LabVIEW environment to record the sensor data. A simple graphical interface has been implemented to allow also unskilled operators to work with the ByeLab.
The set of sensors has been installed on a vertically adjustable frame of steel joints and aluminium tubular beams designed for this application. The frame can be extended and adapted to the height of the orchard row, according to the best height for the sensors. The adjustable frame is fixed to a tracked bins carrier (the NEO Alpin by Windegger S.r.l., Lana, Bolzano, Italy). The tracked configuration has been chosen for its high versatility in moving nimbly within the orchard even on off-road and steep terrain, its easy transport to the investigation site and its very high carrying capacity (500 kg) in comparison with its mass (250 kg).
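Although the acquisition software runs in LabVIEW, the following Matlab-style fragment is a minimal sketch of how a time-stamped, geo-referenced record grouping the heterogeneous sensor streams could be organized for the downstream processing; all field names and the numerical values are illustrative assumptions and do not reproduce the actual data format of the ByeLab.

% Minimal sketch (illustrative, not the ByeLab acquisition format):
% one time-stamped, geo-referenced record grouping the sensor readings.
record.time      = 0.02;                    % acquisition time [s] (hypothetical)
record.gnss      = [46.4983, 11.3548, 262]; % latitude, longitude, altitude (hypothetical)
record.imu       = [0.5, -1.2, 87.3];       % roll, pitch, yaw [deg] (hypothetical)
record.lidarUp   = rand(1, 541);            % upper LiDAR ranges [m] (placeholder scan)
record.lidarDown = rand(1, 541);            % lower LiDAR ranges [m] (placeholder scan)
record.ndvi      = [0.61, 0.58, 0.64, 0.55, 0.60, 0.62]; % six OptRx readings (hypothetical)

surveyLog = repmat(record, 1, 0);   % empty array of records with the same fields
surveyLog(end+1) = record;          % append each new acquisition during the survey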
Figure 1: a) the ByeLab during in-field surveys; b) scheme of the sensor placement on the ByeLab frame. For the sake of simplicity, only 3 out of the 6 OptRx sensors are represented.

3. Data-Processing Algorithm
The data-processing algorithm has been developed in the Matlab® environment. Figure 2 shows the algorithm flowchart. After importing the data, the algorithm converts the point cloud coordinates from a cylindrical reference system to a Cartesian one, as follows:

x_i = ρ_i · cos(θ_i),  z_i = ρ_i · sin(θ_i),  y_i = v · Δt    (1)

where ρ_i is the distance measured by the LiDAR at the scanning angle θ_i, v is the robot speed and Δt is the time interval of the considered acquisition.

Figure 2: Workflow of the developed algorithm.

The coordinates, which are referred to the sensor reference system, are then shifted and rotated into the robot reference system by taking into account the LiDAR mounting position and rotation (the upper LiDAR is rotated by 180 degrees with respect to the lower LiDAR, as shown in Figure 1b). Furthermore, the IMU information is used to correct possible deviations of the ByeLab from the trajectory path and oscillations due to the terrain conformation. At this step, the point cloud is organized on a non-uniform y-z grid. To properly merge the upper and lower LiDAR data, the two data series are interpolated on a uniformly distributed y-z grid. A merging algorithm has then been implemented to obtain the final 3D surface map, as follows:

x_i = (x_high,i + x_low,i)/2   if |x_high,i − x_low,i| ≤ tol
x_i = x_high,i                 if x_high,i ≠ 0 ∧ x_low,i = 0      (2)
x_i = x_low,i                  if x_high,i = 0 ∧ x_low,i ≠ 0

where tol is a tolerance parameter expressed in mm and the subscripts i, high,i and low,i refer to the i-th point and to the top and bottom sensor data, respectively. Finally, a 3D mesh surface (Figure 3a) is obtained by interpolating the point cloud. The object area and volume (half of the object) can be computed by using previously developed algorithms (Bietresato et al., 2016a; Bietresato et al., 2016b).
The OptRx sensors provide information about the plant health status. It is possible to identify differences in vegetative development among plant groups using the Normalized Difference Vegetation Index (NDVI) (Povh et al., 2014): high NDVI values mean dense and healthy vegetation, whereas low NDVI values mean sparse or unhealthy vegetation. The kriging algorithm with a spherical model is applied to the NDVI data from the six sensors (three per side) to obtain a highly detailed NDVI map that shows the differences in vegetative development among plant groups. This map uses the same grid as the LiDAR y-z interpolation and is merged and overlaid onto the previous 3D mesh surface (Figure 3b).
Moreover, by correlating the vegetation thickness (e.g., dividing it into ranges) with the NDVI, it is possible to create additional simplified maps that better highlight the healthy and unhealthy regions of the vegetation. For example, an early disease diagnostic algorithm has been developed and implemented. The algorithm integrates the NDVI and the vegetation thickness as shown in Figure 3c. The matrix rows discriminate the vegetation thickness (t) in three ranges:
• t < 0.1 m
• 0.1 m ≤ t < 0.2 m
• t ≥ 0.2 m
whereas the matrix columns discriminate the NDVI in three ranges:
• NDVI < 0.4
• 0.4 ≤ NDVI < 0.6
• NDVI ≥ 0.6
As a result, the black cell encompasses the areas without foliage, the green areas of the matrix characterize healthy vegetation, the yellow cells correspond to an early situation of stress that requires further investigation, and the red cells coincide with very unhealthy vegetation.
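As an illustration of this classification step, the following Matlab fragment gives a minimal sketch of the diagnostic matrix, assuming that the vegetation thickness and the NDVI are already interpolated on the same y-z grid; the function name and the cell-by-cell colour assignment are only indicative, the actual pattern being the one shown in Figure 3c.

% Minimal sketch of the early disease diagnostic matrix (indicative only).
% t and ndvi are matrices of vegetation thickness [m] and NDVI defined on
% the same y-z grid used for the LiDAR/NDVI interpolation.
function healthClass = diagnosticMatrix(t, ndvi)
    % Row index: thickness range (t < 0.1 m, 0.1-0.2 m, >= 0.2 m).
    row = 1 + (t >= 0.1) + (t >= 0.2);
    % Column index: NDVI range (< 0.4, 0.4-0.6, >= 0.6).
    col = 1 + (ndvi >= 0.4) + (ndvi >= 0.6);

    % Class codes: 0 = no foliage (black), 1 = healthy (green),
    % 2 = early stress to be checked (yellow), 3 = very unhealthy (red).
    % The cell-by-cell assignment below is a guess; the actual pattern is
    % the one reported in Figure 3c.
    M = [0 2 1;   % t < 0.1 m
         3 2 1;   % 0.1 m <= t < 0.2 m
         3 2 1];  % t >= 0.2 m
    healthClass = M(sub2ind(size(M), row, col));
end

The same matrix could be reused unchanged for the slice-based maps of Section 6 by feeding it the per-slice average thickness and NDVI instead of the gridded values.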
Figure 3: Orchard row 3D representations: a) lateral-linear-stereoscopic LiDAR-based reconstruction; b) merging of LiDAR and OptRx data; c) early disease diagnostic algorithm matrix; d) diagnostic algorithm representation.

4. Experimental Tests
The ByeLab has been tested both in an outdoor controlled scenario and in an in-field scenario. The tests had the aim of checking and tuning: a) the hardware and the acquisition system; b) the implemented processing algorithms, with a focus on the vigour maps and on the early diagnostic algorithm; c) the volume evaluation, specifically the canopy thickness estimation of the algorithm; d) new methodologies to represent the volume and health status of an entire crop field.
An orchard row was simulated by aligning four plants and a wooden structure on the same line (Figure 4a). The ByeLab travels at a constant speed (0.5 m/s) on both sides of the row, recording the sensor data. A Faro Focus 3D X330 HDR Terrestrial Laser Scanner (TLS), with ±2 mm distance accuracy, has been used as a reference against which the canopy thickness data from the ByeLab are compared. Four scanning positions have been adopted to obtain the 3D reconstruction of the experimental layout with the TLS. The point clouds of the ByeLab and of the TLS are aligned on the same reference system, and the thickness of the row is then computed.

Figure 4: Experimental tests: a) plants and structure used; b) first layout; c) second layout.

5. Validation Results
The following figures show the canopy thickness evaluated by the TLS and the ByeLab for both experimental layouts. Figures 5a and 6a present the correlation between the TLS and the ByeLab canopy thickness measurements: R2 = 0.83 and R2 = 0.89 for the first and the second layout, respectively.

Figure 5: Canopy thickness evaluated by the TLS and the ByeLab for the first experimental layout: a) correlation between the measurements; b) trend along the orchard row.

Figure 6: Canopy thickness evaluated by the TLS and the ByeLab for the second experimental layout: a) correlation between the measurements; b) trend along the orchard row.

Figures 5b and 6b illustrate the trend of the canopy thickness along the simulated orchard row. The solid structure is correctly measured, but discrepancies occur in the plant measurements. The ByeLab underestimates the plant thickness, probably because of the windy conditions experienced during the tests. Furthermore, because of the separation between the plants, both the ByeLab and the TLS measure zero canopy thickness in the gaps. In some cases (4 points for the first layout and 3 points for the second layout), the ByeLab reports a null canopy thickness while the TLS returns a non-zero value, presumably because of an incorrect evaluation of the ByeLab speed during the travel.
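As a sketch of how this comparison can be quantified, the following Matlab fragment computes the coefficient of determination between two aligned thickness profiles; the vector names and the numerical values are purely illustrative and do not reproduce the measured data.

% Minimal sketch: R2 between aligned TLS and ByeLab canopy thickness
% profiles sampled at the same positions along the row.
% The values below are purely illustrative placeholders.
tTLS    = [0.00; 0.12; 0.25; 0.31; 0.18; 0.00];   % reference thickness [m]
tByeLab = [0.00; 0.10; 0.22; 0.27; 0.15; 0.00];   % mobile-lab thickness [m]

p     = polyfit(tTLS, tByeLab, 1);           % least-squares linear fit
tFit  = polyval(p, tTLS);                    % fitted ByeLab values
SSres = sum((tByeLab - tFit).^2);            % residual sum of squares
SStot = sum((tByeLab - mean(tByeLab)).^2);   % total sum of squares
R2    = 1 - SSres/SStot;                     % coefficient of determination

fprintf('R2 between TLS and ByeLab thickness: %.2f\n', R2);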
6. In-field survey representation
Several in-field campaigns took place during the last year and a huge amount of data has been recorded. In order to provide synthetic but also representative vigour maps of the scanned orchards, three types of maps are proposed here. The orchard row has been divided into slices of constant length dz = 0.5 m along the row. For each slice, the mean values of the half canopy thickness and of the NDVI at a height of 1.5 m ± 0.1 m (the height and the buffer are given as an example and could be changed) are then calculated. Figure 7a represents the trend of the canopy thickness along the orchard rows: the height of each rectangle is equal to dz, while its base is related to the average canopy thickness measured by the ByeLab.
The average canopy thickness (tavg) is divided into four classes (T):
• T = 0.0 m if tavg < tlow
• T = 0.5 m if tlow ≤ tavg < tmed
• T = 1.0 m if tmed ≤ tavg < thigh
• T = 1.5 m if tavg ≥ thigh
where tlow, tmed and thigh have been chosen as 0.1, 0.4 and 0.6 m, respectively. To distinguish the left and the right side of the orchard row, different tonalities of green have been used: dark green for the left side and light green for the right side. As a result, the map in Figure 7a is obtained. Furthermore, a second representation is proposed in Figure 7b, where each rectangle is coloured according to the mean NDVI value inside the slice. Finally, Figure 7c shows the early disease representation: for each slice, the early disease algorithm is applied, and the average half canopy thickness and NDVI are used to determine whether the vegetation of the slice is healthy (green), presents an early situation of stress (yellow) or is very unhealthy (red).

Figure 7: Orchard maps representation: a) half canopy thickness; b) NDVI and semi-canopy thickness; c) early disease representation.

7. Conclusions
A mobile laboratory (ByeLab) for orchard health status monitoring in precision farming has been presented. The hardware and software, as well as the algorithms developed to reconstruct the plant volume and health status, have been described. Experimental tests have been performed to validate the canopy thickness measured by the ByeLab, showing R2 = 0.83 and R2 = 0.89 for two different experimental layouts. Furthermore, new representation charts have been introduced to obtain simplified, synthetic but also representative maps. Further work will deal with in-field surveys to set up the early disease diagnostic algorithm. In particular, the ranges of the thickness and of the NDVI will be tuned to detect diseases inside an orchard field, comparing the information obtained from the ByeLab with visual observations of the orchard made during in-field surveys by agronomists or specialists.

References
Bietresato M., Carabin G., Vidoni R., Gasparetto A., Mazzetto F., 2016a, Evaluation of a lidar-based 3d-stereoscopic vision system for crop-monitoring applications, Computers and Electronics in Agriculture, vol. 124, pp. 1-13.
Bietresato M., Carabin G., D'Auria D., Gallo R., Ristorto G., Mazzetto F., Vidoni R., Gasparetto A., Scalera L., 2016b, A tracked mobile robotic lab for monitoring the plants volume and health, Proceedings of the 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA 2016), pp. 1-6.
Calcante A., Mazzetto F., 2014, Design, development and evaluation of a wireless system for the automatic identification of implements, Computers and Electronics in Agriculture, vol. 101, pp. 118-127.
Gavioli A., de Souza E., Bazzi C., Guedes L., Schenatto K., 2016, Optimization of management zone delineation by using spatial principal components, Computers and Electronics in Agriculture, vol. 127, pp. 302-310.
Kruize J., Wolfert J., Scholten H., Verdouw C., Kassahun A., Beulens A., 2016, A reference architecture for farm software ecosystems, Computers and Electronics in Agriculture, vol. 125, pp. 12-28.
Pahlmann I., Böttcher U., Kage H., 2016, Evaluation of small site-specific N fertilization trials using uniformly shaped response curves, European Journal of Agronomy, vol. 76, pp. 87-94.
Povh F., de Paula G., dos Anjos W., 2014, Optical sensors applied in agricultural crops, Optical Sensors - New Developments and Practical Applications.