Available online at http://ijcpe.uobaghdad.edu.iq and www.iasj.net

Iraqi Journal of Chemical and Petroleum Engineering Vol.21 No.3 (September 2020) 57 - 66
EISSN: 2618-0707, PISSN: 1997-4884

Corresponding Authors: Yahya Jirjees Tawfeeq, Email: yahyapetroleum@uokirkuk.edu.iq; Jalal A. Al-Sudani, Email: jalsud@uobaghdad.edu.iq
IJCPE is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Digital Rock Samples Porosity Analysis by OTSU Thresholding Technique Using MATLAB

Yahya J. Tawfeeq and Jalal A. Al-Sudani
Petroleum Engineering Department, College of Engineering, University of Baghdad, Baghdad, Iraq

Abstract

Porosity plays an essential role in petroleum engineering. It controls fluid storage in aquifers, while the connectivity of the pore structure controls fluid flow through reservoir formations. To quantify the relationships between porosity, storage, transport, and rock properties, however, the pore structure must be measured and quantitatively described. Porosity estimation of digital images using image processing is essential for reservoir rock analysis, since it provides a concise description of the sample's 2D porosity. The regular procedure relies on binarization, which uses a pixel-value threshold to convert color and grayscale images to binary images. The idea is to identify the blue regions occupied entirely by pores and transform them to white in the resulting binary image. This paper presents the possibilities of using image processing to determine the 2D porosity of digital rock samples from carbonate reservoir rocks. A MATLAB code was created which automatically segments the images and determines the digital rock porosity based on OTSU's thresholding algorithm. In this work, twenty-two 2D thin-section petrographic images of reservoir rocks from one Iraqi oil field are studied. The thin-section images are processed and digitized using MATLAB programming. The present study focuses on determining the micro- and macroporosity of the digital images. Some pore-void characteristics, such as area and perimeter, were also calculated. The digital 2D image analysis results are compared with laboratory core analysis results to determine the strengths and limitations of the digital image interpretation techniques. Thin-section image porosity determined using the OTSU technique showed a moderate match with core porosity.

Keywords: Digital rock physics, OTSU thresholding, Thin section image, Porosity, Macro pores, Micro pores

Received on 26/06/2020, Accepted on 22/08/2020, Published on 30/09/2020

https://doi.org/10.31699/IJCPE.2020.3.8

1- Introduction

Image analysis has been used for many years to extract relevant information from digital microscopic images. It includes all operations required to obtain quantified image information. The typical image analysis sequence involves image acquisition, processing, segmentation, measurement, data processing, and interpretation. Image segmentation is considered one of the essential techniques used to divide an image into its constituent parts in order to extract the relevant image information [1]. Briefly, segmentation transforms the representation of an image into a simplified form that can be examined more critically and naturally [2], [3].
Several practical applications of image segmentation are available, such as tracing tumors and other pathologies [4], [5], machine vision, object detection [6], face detection, medical imaging [7], [8], anatomical structure studies and diagnosis [9], fingerprint recognition, and video surveillance. Several image segmentation techniques, such as thresholding [10], edge-based segmentation [11], and compression-based methods [12], have been developed during recent decades. In image processing, many algorithms are used, such as Artificial Neural Networks [13], Convolutional Neural Networks [14], and K-Nearest Neighbors. Among all image segmentation methods, the simplest and most useful technique for dividing an image into a foreground class and a background class is thresholding [15]. The thresholding process converts a grayscale image into a binary image depending on the threshold value. The key step in the thresholding process is the selection of an optimal threshold value, particularly when multiple threshold levels are implemented. Several thresholding methods are currently employed, including the OTSU technique, clustering [16], and the maximum entropy technique [17]. The OTSU method is fast and easy to code compared with the other methods mentioned above: because the OTSU threshold operates on histograms (integer or float arrays of length 256), it is quite fast, and only about 90 lines of MATLAB code are needed. The OTSU technique is a histogram-based [18] approach for automatic thresholding of an image. OTSU's algorithm assumes that the image can be divided into two main categories: foreground and background. The algorithm is designed to find the threshold value that divides the histogram into two classes while maximizing the variance between the two classes. The extension of the basic OTSU threshold to multiple levels is referred to as the multi-OTSU threshold [19].

Porosity is the fraction of porous space that the rock matrix does not occupy [20]. A comprehensive study of the porosity distribution is essential for any reservoir evaluation project [21]. Porosity is a crucial rock property because it measures the potential storage volume of hydrocarbons. In carbonate reservoirs, porosity ranges from about 0.01 to 0.35 [22]. Porosity is defined as the volume fraction of void (non-rock) space divided by the total volume of the sample [23]. High porosity values indicate a high capacity of the reservoir rock to contain fluids, while low porosity values indicate the opposite [24]. Porosity is evaluated either through formation evaluation logs or through laboratory measurements on core samples. A general industry practice is to regard core measurements as ground truth. However, there can be uncertainties associated with core measurements, especially when the laboratory conditions under which they were made are ignored. Certain factors control the porosity of a formation, such as pore and grain size distribution, mineralogy, sorting, and diagenesis [25], [26].
Characterizing these controlling factors requires advanced logging and special core analysis (SCAL), yet some of these properties require alternative interpretation techniques. Digital image analysis of thin sections is presented here as such an alternative. Porosity, mineralogy, pore size distribution, and sorting can be analyzed through digital image analysis of thin sections. In this study, OTSU's thresholding is implemented for microscopic image segmentation. The microscopic image samples are processed using MATLAB programming. The microporosity and macroporosity of the digital images are determined, and some pore-void characteristics, such as area and perimeter, are calculated.

2- Material and Methods

In this work, twenty-two 2D thin-section petrographic images were used for analysis, taken from core plugs of the Buzurgan oil field. Each sample was impregnated with blue-dyed epoxy, thin-sectioned, and then stained for discrimination of carbonate minerals; the scanned images have a resolution of about 10 μm/pixel. The procedure of scanning and digitizing the images is called optical microscopy and has a lower resolution than digital images obtained from scanning electron microscopy. The advantage of the former is that it is a fast technique for obtaining digital images; the disadvantage is that pore sizes smaller than 10 μm cannot be quantitatively resolved with optical microscopy.

In image processing, OTSU's technique is used to perform automatic image thresholding; the method is named after the Japanese scientist Nobuyuki Otsu [10]. In its simplest form, the algorithm returns a single intensity threshold that separates the pixels into two groups or classes, foreground and background. The algorithm searches for the threshold that maximizes the between-class variance or, equivalently, minimizes the within-class variance. The fundamental concept is that suitable threshold classes must be separated in terms of their pixel intensity values and, conversely, that the optimum threshold is the one providing the best class separation in terms of intensity values [27]. OTSU's technique has the significant property, in relation to its optimality, that it is based entirely on computations performed on an image's histogram, an easily accessible 1-D array.

Let {0, 1, 2, ..., L-1} denote the distinct intensity levels in a digital image of size M×N pixels (M and N are the row and column dimensions of the image), and let n_i be the number of pixels with intensity i. The total number of image pixels is MN = n_0 + n_1 + n_2 + ... + n_(L-1). The components of the normalized histogram are [10]:

p_i = n_i / MN    (1)

Σ_{i=0}^{L-1} p_i = 1  and  p_i ≥ 0    (2)

Where:
MN = row and column dimensions of the image
n_i = number of pixels with intensity i
p_i = probability of intensity i
i = intensity level

Now assume that a threshold T is selected with 0 < T < L-1 and used to separate the input digital image into two classes, C1 and C2, where C1 contains all image pixels with intensity values in the range [0, T] and C2 contains all image pixels with intensity values in the range [T+1, L-1]. Using this threshold, the probability P_1(T) of class C1 (the background class) is given by the cumulative sum:

P_1(T) = Σ_{i=0}^{T} p_i = Σ_{i=0}^{T} n_i / MN    (3)

Where T = threshold.
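As an illustration of Eq. (1) through (3), the following minimal MATLAB sketch computes the normalized histogram of an 8-bit grayscale image and the cumulative background-class probability P_1(T). The image file name is a placeholder and is not part of the original study.

% Minimal sketch of Eq. (1)-(3): normalized histogram and class probability.
% 'sample07.png' is a placeholder file name, not from the original study.
I = imread('sample07.png');           % read a thin-section image
if size(I,3) == 3, I = rgb2gray(I); end   % work on grayscale intensities
L = 256;                              % number of intensity levels (8-bit image)
n = imhist(I, L);                     % n(i+1) = number of pixels with intensity i
p = n / numel(I);                     % Eq. (1): normalized histogram p_i
assert(abs(sum(p) - 1) < 1e-10);      % Eq. (2): probabilities sum to one
T  = 100;                             % an arbitrary example threshold
P1 = sum(p(1:T+1));                   % Eq. (3): cumulative probability of class C1
P2 = 1 - P1;                          % complementary class probability (Eq. (4) below)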
Similarly, the probability P_2(T) of the second class C2 (the foreground class) is given by:

P_2(T) = Σ_{i=T+1}^{L-1} p_i = 1 - P_1(T)    (4)

The mean intensity value of the pixels in class C1 is given by:

μ_1(T) = Σ_{i=0}^{T} i·P(i|C1) = Σ_{i=0}^{T} i·P(C1|i)·P(i)/P(C1) = (1/P_1(T)) Σ_{i=0}^{T} i·p_i    (5)

Where μ_1(T) is the mean intensity value of class C1 and P_1(T) is given by Eq. (3). The term P(i|C1) is the probability of value i given that it derives from class C1. The second form of the equation follows from Bayes' formula. The third form follows from the fact that P(C1|i) = 1, i.e., the probability of C1 given i equals 1, since only class C1 values are considered; P(i) is the probability of the i-th value, which is simply the i-th component of the histogram, p_i; and P(C1) is the probability of class C1, given by Eq. (3).

In the same way, the mean intensity value of the pixels in class C2 is given by:

μ_2(T) = Σ_{i=T+1}^{L-1} i·P(i|C2) = (1/P_2(T)) Σ_{i=T+1}^{L-1} i·p_i    (6)

Where μ_2(T) is the mean intensity value of class C2 and P(i|C2) is the probability of value i given that it derives from class C2.

The average intensity of the entire image (the global mean) is given by:

μ_G = Σ_{i=0}^{L-1} i·p_i    (7)

Where μ_G is the global mean, or average intensity of the entire image. By substituting the previous results, the validity of the following two equations can be confirmed:

P_1·μ_1 + P_2·μ_2 = μ_G    (8)

P_1 + P_2 = 1    (9)

Where:
P_1 = background class (class C1) probability
P_2 = foreground class (class C2) probability
μ_1 = mean intensity value of class C1
μ_2 = mean intensity value of class C2
μ_G = global mean (average intensity of the entire image)

The normalized, dimensionless metric below can be used to assess the "goodness" of the threshold at level T [28]:

η = σ_B² / σ_G²    (10)

Where:
σ_G² = global variance (the variance in intensity of all pixels in the image)
η = dimensionless separability metric
σ_B² = between-class variance

The global variance is given by [10]:

σ_G² = Σ_{i=0}^{L-1} (i - μ_G)²·p_i    (11)

And the between-class variance σ_B² is given by:

σ_B² = P_1·(μ_1 - μ_G)² + P_2·(μ_2 - μ_G)²    (12)

Equation (12) can also be written as:

σ_B²(T) = P_1·P_2·(μ_1 - μ_2)²    (13)

Eq. (13) shows that σ_B² becomes larger as the two means μ_1 and μ_2 move farther apart, demonstrating that the between-class variance is a measure of separability between the classes. Since σ_G² is constant for a given image, η is also a measure of separability, and maximizing this dimensionless metric is equivalent to maximizing σ_B². Note that Eq. (10) implicitly assumes σ_G² > 0. This variance can only be zero if all intensity levels in the image are the same, which implies that only one class of pixels exists; this, in turn, means that η = 0 for a constant image, since the separability of a single class from itself is zero.
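Continuing the sketch above (same histogram p, threshold T, and class probabilities P1 and P2), the class means, global statistics, and between-class variance of Eq. (5) through (13) can be evaluated as follows. This is an illustrative sketch, not the authors' original code, and it assumes both classes are non-empty (P1, P2 > 0).

% Continuation of the previous sketch: class means and between-class variance.
i   = (0:L-1)';                              % intensity levels 0 .. L-1
mu1 = sum(i(1:T+1)   .* p(1:T+1))   / P1;    % Eq. (5): mean of class C1
mu2 = sum(i(T+2:end) .* p(T+2:end)) / P2;    % Eq. (6): mean of class C2
muG = sum(i .* p);                           % Eq. (7): global mean
assert(abs(P1*mu1 + P2*mu2 - muG) < 1e-9);   % Eq. (8) consistency check
sigmaG2 = sum((i - muG).^2 .* p);            % Eq. (11): global variance
sigmaB2 = P1*(mu1 - muG)^2 + P2*(mu2 - muG)^2;   % Eq. (12): between-class variance
sigmaB2_alt = P1*P2*(mu1 - mu2)^2;               % Eq. (13): equivalent form
eta = sigmaB2 / sigmaG2;                     % Eq. (10): separability measure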
The final results are obtained when the threshold T is reintroduced:

η(T) = σ_B²(T) / σ_G²    (14)

σ_B²(T) = P_1(T)·(μ_1(T) - μ_G)² + P_2(T)·(μ_2(T) - μ_G)²    (15)

Then the best threshold value is the value T* that maximizes σ_B²(T):

σ_B²(T*) = max_{0≤T≤L-1} σ_B²(T)    (16)

OTSU's algorithm can also be formulated in terms of a weighted sum of the two class variances [10]:

σ_W²(T) = P_1(T)·σ_1²(T) + P_2(T)·σ_2²(T)    (17)

Where the weights P_1 and P_2 are the probabilities of the background (class C1) and foreground (class C2) classes, respectively, separated by the threshold T, as stated previously; σ_1²(T) is the variance of the pixels in the background (below the threshold) and σ_2²(T) is the variance of the pixels in the foreground (above the threshold), given by:

σ_1²(T) = (1/P_1(T)) Σ_{i=0}^{T} (i - μ_1(T))²·p_i    (18)

σ_2²(T) = (1/P_2(T)) Σ_{i=T+1}^{L-1} (i - μ_2(T))²·p_i    (19)

And σ_W²(T) is the within-class variance. In this formulation, the best threshold is the value T* that minimizes σ_W²(T):

σ_W²(T*) = min_{0≤T≤L-1} σ_W²(T)    (20)

For two classes, Otsu showed that minimizing the within-class variance is the same as maximizing the between-class variance; in other words, subtracting the within-class variance from the total variance gives the between-class variance σ_B²(T) [29]. The best threshold under the OTSU method is therefore the one that maximizes the overall between-class variance or, equivalently, minimizes the overall within-class variance. In Fig. 1-a there is a simple bimodal distribution with two homogeneous classes, where the threshold value T can easily be determined. If there is no valley, as shown in Fig. 1-b, the threshold value T is determined by minimizing the total variance within both classes or maximizing the overall variance between both classes. The best threshold maximizes the between-class variance or, conversely, minimizes the within-class variance [30], [31].

Fig. 1. Typical Image Histograms Showing (1-a) a Simple Bimodal Distribution and (1-b) a Non-Bimodal Distribution

In this study, the OTSU algorithm implementation starts by converting the color image to a grayscale image and plotting the normalized histogram of the input image. According to the threshold value, the histogram pixels are separated into two clusters or classes. The cumulative sums and the cumulative means for each class are calculated using Eq. (3) through (6). The total (global) intensity mean is calculated using Eq. (7). The between-class variance σ_B²(T) is calculated using Eq. (13); the farther apart the means, the larger σ_B²(T) will be. The threshold that maximizes the between-class variance σ_B²(T) is taken as the optimum OTSU threshold value T*. When the maximum is not unique, T* is obtained by averaging the values corresponding to the detected maxima. Finally, the optimum separability measure η* at T = T* is obtained using Eq. (14). The separability is a measure of how easily separable the classes are; it is 0 for a uniform distribution and 1 for a clearly bimodal one.
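The workflow just described (grayscale conversion, normalized histogram, exhaustive sweep of T, and selection of the maximizing threshold with its separability measure) can be sketched in a few lines of MATLAB. This is a vectorized illustration rather than the authors' code, and the cross-check against the built-in graythresh function (MATLAB's own Otsu implementation) is an added assumption for comparison only.

% Sketch of the full OTSU sweep described above (not the authors' original code).
I = imread('sample07.png');               % placeholder file name
if size(I,3) == 3, I = rgb2gray(I); end
L = 256;
p = imhist(I, L) / numel(I);              % normalized histogram, Eq. (1)
i = (0:L-1)';
muG     = sum(i .* p);                    % global mean, Eq. (7)
sigmaG2 = sum((i - muG).^2 .* p);         % global variance, Eq. (11)
P1 = cumsum(p);                           % P1(T) for every T, Eq. (3)
m1 = cumsum(i .* p);                      % running sum of i*p_i up to T
sigmaB2 = (muG*P1 - m1).^2 ./ (P1 .* (1 - P1));   % Eq. (13) evaluated for every T
sigmaB2(P1 <= 0 | P1 >= 1) = 0;           % ignore thresholds with an empty class
maxB  = max(sigmaB2);
idx   = find(sigmaB2 == maxB);            % all maximizing thresholds
Tstar = mean(idx) - 1;                    % average them if the maximum is not unique
etaStar = maxB / sigmaG2;                 % separability measure, Eq. (14)
levelBuiltIn = graythresh(I);             % built-in Otsu level in [0,1]; should be close to Tstar/(L-1)
BW = im2bw(I, Tstar/(L-1));               % binary image at the OTSU threshold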
3- Calculations and Analysis

3.1. OTSU Thresholding Implementation

The objective of this section is to introduce and apply the OTSU binary segmentation algorithm to the samples under study. A MATLAB code was created which automatically segments the images and determines the digital rock porosity based on OTSU's thresholding algorithm, using the main MATLAB function listed below.

As a first step, a pixel-value histogram was plotted for each digital image using the MATLAB Image Processing Toolbox, as shown in Fig. 2. Because the images are 8-bit, there are 256 possible pixel values. The histogram is calculated with 256 bins, where each bin's height equals the number of pixels with that pixel value, from 0 to 255. For each image, the probability of pixel value i is calculated using Eq. (1) by dividing the height of the bin by the total number of pixels in the histogram. Since the goal is to maximize the between-class variance, the class means, the global intensity mean, and the variances of both classes are calculated using Eq. (5), (7), and (15), respectively.

Fig. 2. Pixel-Value Image Histogram for Sample No. 7

function level = otsu(histogramCounts)
    total = sum(histogramCounts); % total number of pixels in the image
    %% OTSU automatic thresholding
    top = 256;
    sumB = 0;
    wB = 0;
    maximum = 0.0;
    sum1 = dot(0:top-1, histogramCounts);
    for ii = 1:top
        wF = total - wB;
        if wB > 0 && wF > 0
            mF = (sum1 - sumB) / wF;
            val = wB * wF * ((sumB / wB) - mF) * ((sumB / wB) - mF);
            if ( val >= maximum )
                level = ii;
                maximum = val;
            end
        end
        wB = wB + histogramCounts(ii);
        sumB = sumB + (ii-1) * histogramCounts(ii);
    end
end

Finally, the OTSU threshold value (TB) is obtained as the value of T for which σ_B²(T) is maximum. If the maximum is not unique, the OTSU threshold value (TB) is obtained by averaging the values corresponding to the various maxima detected. Additionally, the separability measure η* is calculated using Eq. (14) at T = TB. It is sufficient to maximize the between-class variance, Eq. (13), since this simultaneously minimizes the within-class variance; therefore, only the between-class variance is calculated for each threshold, and the threshold that maximizes this variance is picked. Depending on the quality of each microscopic image, OTSU's algorithm was run several times for better results. The MATLAB function im2bw is used to convert an intensity image to a binary image; the binary image level, a normalized intensity value that lies in the range (0, 1), is calculated from the optimal threshold value (TB). The resulting threshold, between-class variance, and separability criterion for the twenty-two samples are listed in Table 1. The results of OTSU thresholding are shown in Fig. 3 for some of the analyzed samples. It can be observed that the OTSU threshold divides each digital image into two levels: white (pores) against a black (matrix) background.
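For illustration, the function listed above can be combined with im2bw as described in this section. The following usage sketch assumes the function is saved as otsu.m, uses a placeholder image file name, and treats the returned bin index minus one as the approximate threshold on the 0-255 intensity scale; it is not part of the authors' code.

% Usage sketch for the otsu() function listed above (assumes it is saved as otsu.m).
I = imread('sample07.png');                 % placeholder thin-section image
if size(I,3) == 3, I = rgb2gray(I); end
histogramCounts = imhist(I, 256);           % 256-bin pixel-value histogram
level = otsu(histogramCounts);              % bin index (1..256) of the OTSU threshold
TB = level - 1;                             % approximate threshold on the 0..255 scale
BW = im2bw(I, TB / 255);                    % normalized level in (0, 1), as described in the text
imshowpair(I, BW, 'montage');               % display original and segmented binary image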
3.2. Porosity and Pore Space Characteristics Determination

A digital image comprises pixels, which are the building blocks of an image. The core sample images used in this study were cropped to 637 x 478 pixels; hence the total number of pixels in a sample is 304486. The core thin-section samples consist of empty pore space filled with the blue liquid epoxy and solid grains comprising different mineral colors, as shown in Fig. 3. The definition of porosity from image analysis can be written in pixel terms as [32], [33]:

Porosity (Image) = (Σ pixels in pore space) / (Total number of pixels)    (21)

Table 1. Results of OTSU Thresholding Method Parameters

Core ID   Threshold   Separability Criterion   Between-Class Variance   Level   Global Mean
7         116         0.738     2014.095   0.453   77.549
19        70          0.655     857.701    0.273   51.699
23        82          0.673     977.849    0.32    60.731
24        98          0.651     352.521    0.383   99.982
25        130         0.6607    1290.457   0.508   108.541
27        60          0.692     901.321    0.234   31.857
30        63          0.642     843.274    0.246   47.497
31        79          0.665     1249.349   0.309   51.538
35        94          0.797     1968.251   0.367   52.731
36        83          0.751     1673.708   0.324   55.031
37        108         0.794     2587.521   0.422   58.826
38        67          0.621     615.282    0.262   49.961
39        117         0.814     2494.852   0.457   60.336
45        65          0.614     628.847    0.254   47.842
46        79          0.639     652.216    0.309   48.881
47        82          0.640     1128.102   0.32    48.569
48        55          0.604     280.360    0.215   35.338
49        113         0.838     2917.877   0.441   56.905
50        109         0.473     321.355    0.426   108.324
51        71          0.617     564.410    0.277   52.768
52        67          0.620     554.612    0.262   42.295
53        64          0.5693    443.973    0.25    40.040

Porosity is also defined in terms of pore sizes, as micro- and macropores. The core thin-section samples used in the current study were scanned with optical microscopy at a pixel resolution of 10 μm, so substantial porosity may reside in pore sizes smaller than 10 μm, i.e. sub-resolution pores. Such sub-resolution pores were visually observable on the thin-section images but with a mixed response of clay-silt matrix and porosity. In the current study, sub-resolution pores are defined as micropores, and pore sizes larger than 10 μm are defined as macropores. A subjective adjustment factor was used to remove the matrix effect from the sub-resolution pores as [25]:

φ_image = A·φ_micro + φ_macro = φ_total    (22)

Where:
φ_image = porosity derived from image analysis
φ_micro = microporosity derived from image analysis
φ_macro = macroporosity derived from image analysis
φ_total = total porosity derived from image analysis
A = adjustment factor (between 0 and 1) used to remove the matrix effect from sub-resolution pores, since a pixel representing a pore smaller than 10 μm may contain both grain and pore.

The microporosity was adjusted by the factor A = 0.75 to exclude the matrix effect from the micropores, and the image porosity was calculated using Eq. (22). The image porosity results obtained by OTSU's thresholding technique are listed in Table 2, together with the corresponding core porosity. Fig. 4 shows a comparison of the porosity obtained by OTSU's thresholding technique against core porosity.
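To make Eq. (21) and (22) concrete, the short MATLAB sketch below computes image porosity from a segmented binary image and reproduces the Table 2 entry for sample 7 from its pixel counts. The variable names are illustrative, and the micro/macro pixel counts are taken directly from Table 2 rather than recomputed.

% Sketch of Eq. (21)-(22): porosity from pixel counts (illustrative variable names).
% Eq. (21) applied to a segmented binary image BW (1 = pore, 0 = matrix) would be:
%   phiImage = sum(BW(:)) / numel(BW);
% Worked check using the pixel counts reported for sample 7 in Table 2:
totalPixels = 304486;                 % 637 x 478 image
macroPixels = 41837;                  % resolved (macro) pore pixels
microPixels = 9137;                   % sub-resolution (micro) pore pixels
A = 0.75;                             % adjustment factor used in this study
phiMacro = macroPixels / totalPixels; % about 0.137
phiMicro = microPixels / totalPixels; % about 0.030
phiImage = A*phiMicro + phiMacro;     % Eq. (22): about 0.159, matching Table 2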
The task of digital rock analysis requires quantitative measurement of an area of interest, whether pores or grains, extracted from a digital rock image. The extracted objects are binary objects, presented with an object label map. Binary object pixels are conventionally assigned a value of 1, and the remaining pixels a value of 0. A binary object can be described by its size, shape, or distance from other objects. An object's size can be defined by its area and perimeter. The area is a suitable measure of total size, while the perimeter is mainly useful for discriminating between objects with simple shapes and those with complex shapes. Consider the indicator function I_n(i, j) defined for the n-th object of an M×N image:

I_n(i, j) = 1 if I(i, j) belongs to the n-th object, 0 otherwise    (23)

The area of the object in pixels is then given by:

A_n = Σ_{i=0}^{M-1} Σ_{j=0}^{N-1} I_n(i, j)    (24)

The simplest calculation of the perimeter counts the number of boundary pixels belonging to an object; this is achieved by counting the pixels with a value of 1 that have at least one adjacent pixel with a value of 0. Another issue in perimeter measurement is distinguishing an object's internal and external perimeter (for segmented pores). The exact vertex points of a boundary pixel are usually understood to be at the center of that pixel. Measuring along the boundary pixels of the object yields the internal perimeter, whereas measuring along the boundary pixels of the background around the object yields the external perimeter, as shown in Fig. 5.

Fig. 3. Original and Segmented Binary Rock Images Thresholded Using the OTSU Method (Samples 25, 35, 39, and 49, from top to bottom)

Table 2. Image Porosity Analysis Results Using the OTSU Thresholding Method

Core ID   Total pixels   Macropore pixels   Micropore pixels   Phi_Macro   Phi_Micro   PHI_Im   PHI_Core
7     304486   41837   9137    0.137   0.030   0.159   0.182
19    304486   29577   30561   0.097   0.100   0.172   0.193
23    304486   33739   8747    0.110   0.028   0.132   0.145
24    304486   22551   8474    0.074   0.027   0.094   0.095
25    304486   36559   9924    0.120   0.032   0.144   0.158
27    304486   14852   13283   0.048   0.043   0.081   0.085
30    304486   48714   14699   0.159   0.048   0.196   0.207
31    304486   40496   16749   0.132   0.055   0.174   0.199
35    304486   33036   6687    0.108   0.021   0.124   0.142
36    304486   35322   9609    0.116   0.031   0.139   0.144
37    304486   42083   3843    0.138   0.012   0.147   0.184
38    304486   47138   15888   0.154   0.052   0.193   0.212
39    304486   50218   12294   0.164   0.040   0.195   0.217
45    304486   48565   13666   0.159   0.044   0.193   0.213
46    304486   21743   13485   0.071   0.044   0.104   0.147
47    304486   17129   27177   0.056   0.089   0.123   0.147
48    304486   18437   15278   0.060   0.050   0.098   0.17
49    304486   38680   3195    0.127   0.010   0.134   0.167
50    304486   13801   9660    0.045   0.031   0.069   0.12
51    304486   25996   13299   0.085   0.043   0.118   0.128
52    304486   28605   12458   0.093   0.040   0.124   0.139
53    304486   21650   14724   0.071   0.048   0.107   0.156

Fig. 4. Comparison of Image-Based Total Porosity Using the OTSU Thresholding Method with Core Porosity

The circular equivalent diameter is defined as the diameter of a circle with the same area as the region. Thus, once the area of a pore is measured, the equivalent diameter (D_eq) is calculated as [26]:

D_eq = sqrt(4A / π)    (25)

The specific surface area, or surface-to-volume ratio, is approximated by the ratio of the pore perimeter to the pore area. The perimeter and area of each pore i are outputs of the binary image analysis, as discussed previously. The specific surface area of pore i in the digital binary rock sample image is written as [26]:

S_i = Pore Perimeter / Pore Area    (26)

The specific surface area of the analyzed sample is approximated as the average specific surface area of all pores:

S = (1/N) Σ_i S_i    (27)

where N is the number of pores.
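A compact way to obtain these per-pore descriptors from a segmented binary image is MATLAB's regionprops. The sketch below is an illustrative implementation of Eq. (24) through (27) under the stated 10 μm/pixel resolution; the file name and the use of graythresh/im2bw for segmentation are placeholders, not the authors' exact workflow.

% Sketch of Eq. (24)-(27) using regionprops on a segmented binary image.
I = imread('sample07.png');                       % placeholder thin-section image
if size(I,3) == 3, I = rgb2gray(I); end
BW = im2bw(I, graythresh(I));                     % placeholder segmentation (pores assumed = 1)
pixelSize = 10;                                   % micrometres per pixel (study resolution)
stats = regionprops(BW, 'Area', 'Perimeter', 'EquivDiameter');
areaPx   = [stats.Area]';                         % Eq. (24): pore areas in pixels
perimPx  = [stats.Perimeter]';                    % boundary-pixel perimeter estimate
area_um2 = areaPx  * pixelSize^2;                 % pore area in square micrometres
perim_um = perimPx * pixelSize;                   % pore perimeter in micrometres
Deq_um   = [stats.EquivDiameter]' * pixelSize;    % Eq. (25): equivalent diameter
Si = perim_um ./ area_um2;                        % Eq. (26): per-pore specific surface area
S  = mean(Si);                                    % Eq. (27): sample-average specific surface area
avgArea = mean(area_um2);                         % type of averages reported per sample in Table 3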
๐‘† = 1 ๐‘ โˆ‘ ๐‘†๐‘– (27) The results of image pore space characteristics obtained by OTSU's thresholding techniques are listed in the table (3). Fig. 5. Perimeter measurement by counting the number of object boundary pixels Table 3. Image pore space characteristics results using OTSU thresholding method Core ID Avg. Area (ยตm 2 ) Avg. Equiv. Diameter (ยตm) Avg. Perimeter (ยตm) Avg. Specific surface area (1/ ยตm) 7 49.45 3.774 18.355 1.088 19 10.911 2.659 10.927 1.2691 23 31.01 3.412 18.228 1.178 24 28.788 3.754 17.583 1.136 25 42.531 4.403 22.288 1.092 27 25.809 3.131 12.076 1.064 30 32.995 3.413 18.412 1.111 31 24.72 3.364 14.549 1.1028 35 51.983 3.489 19.649 1.201 36 44.861 3.172 15.724 1.185 37 92.241 4.973 26.83 1.077 38 29.816 3.443 17.315 1.140 39 102.234 5.658 27.138 1.009 45 35.332 3.436 18.804 1.169 46 21.252 3.390 13.455 1.137 47 7.144 2.165 6.754 1.249 48 15.916 3.061 11.361 1.138 49 116.055 5.648 26.785 1.004 50 23.206 3.619 14.817 1.101 51 18.550 3.139 13.178 1.181 52 26.980 3.474 15.193 1.134 53 19.365 3.190 12.951 1.1583 3.3. Results and Discussion In this study, three statistical parameters are considered for the analysis of image porosity resulted from a digital image. These statistical parameters utilized to assess the accuracy of porosity predicted from the digital rock analysis. Absolute average percent relative error (AARE) used to quantify the average value of the absolute relative deviation of measured porosity value from experimental core porosity data. The standard deviation of the estimated image porosity relative to the experimental values is essential to measure the accuracy of the correlation and used algorithm. The value of standard deviation is usually expressed in percent, and the small value indicates higher accuracy. The purpose of performing the correlation coefficient is to describe the strength of the association between two variables, namely experimental and calculated values. The correlation coefficient (R) expresses the presence or non-presence of a linear interrelationship between the two observed variables. If the linear interrelationship is positive, the correlation coefficient will be a positive number between 0 and 1.0. If, on the other hand, it is negative, the number will be between (0) and (1). The coefficient of determination (R 2 ) is the square of the coefficient of correlation (R) shows percentage variation in the y-axis that described by all x-axis variables collected. It is varied between (0) and (1) with higher values is better. The results of AARE, standard deviation, correlation coefficient, and coefficient of determination are 14.66, 0.029, 0.892 and 0.796, respectively. 4- Discussions and Conclusions Porosity from image analysis was compared against core porosity to validate the goodness of porosity prediction from image analysis. However, uncertainties associated with both measurements shall be considered as well. Porosity from image analysis is limited to pixels resolution of optical microscopy and represents a very small section of the rock sample. Core porosity is determined on the 1-inch cylindrical plug while the dimensions of thin section sample are only 35 ๐œ‡m thick with a diameter of 1-inch. The volume investigated is different. Studied scanned samples with optical microscopy had a pixel resolution of 10 ๐œ‡m. Pore sizes larger than (10 ๐œ‡m) (i.e. macropores) were correctly resolved, but there was a significant quantity of sub-resolution pores (micropores) with mixed response of pore and matrix. 
4- Discussion and Conclusions

Porosity from image analysis was compared against core porosity to validate the goodness of the porosity prediction from image analysis. However, the uncertainties associated with both measurements must also be considered. Porosity from image analysis is limited by the pixel resolution of optical microscopy and represents a very small section of the rock sample. Core porosity is determined on a 1-inch cylindrical plug, while the thin-section sample is only 35 μm thick with a diameter of 1 inch; the investigated volumes are therefore different. The studied samples were scanned with optical microscopy at a pixel resolution of 10 μm. Pore sizes larger than 10 μm (macropores) were correctly resolved, but there was a significant quantity of sub-resolution pores (micropores) with a mixed response of pore and matrix.

A subjective adjustment factor was used to remove this matrix effect from the micropores. This single adjustment factor was determined by comparing image porosity against core porosity for all samples and was selected as 0.75 in this study to correct the matrix effect during OTSU segmentation. The suggested value worked for the samples analyzed in the current research but may differ in other environments. Uncertainty analysis can also be carried out for porosity from thin-section image analysis.

As the error analysis shows, owing to an appropriate choice and capture of the blue color by the OTSU segmentation algorithm, the porosity obtained by image analysis is quite close to the core porosity. The error is about 14.66%, with a standard deviation of about 0.029 and a high correlation coefficient (R = 0.892). Furthermore, the binary images capture the pore distribution well without counting matrix material as pore space. Digital thin-section image analysis can therefore be considered an alternative technique for evaluating porosity and pore-space rock properties rather than experimental core analysis.

The established histogram thresholding method has an element of subjectivity, since the threshold on the pixel-intensity histogram had to be adjusted manually until the analyst was visually satisfied that the pore space was adequately captured. This visual analysis was challenging, as the optically scanned images used in the current study had a pixel resolution of 10 μm/pixel and there was a significant number of pores smaller than the pixel resolution. In another study, regression equations were used to achieve a good correlation of porosity between image analysis and routine core analysis data; such adjustments and regressions sacrifice the predictive power of image analysis. OTSU clustering was introduced here as an automatic technique to separate the intensity histogram into two parts and segment the pores from the matrix. Thin-section image porosity using the OTSU technique showed a good match with core porosity, with the additional benefit that the workflow is now automated. Moreover, the OTSU method can predict threshold values whenever image interpretation with a thresholding technique is desired.

In the current study, porosity is the main petrophysical property determined from thin-section images. For future work, permeability as a function of porosity and pore-space characteristics can be estimated. The predictive power of the OTSU method is encouraging, as it can be applied to widely available drill cuttings as a secondary source of porosity data. For wells where conventional core data are not available, porosity can be determined from thin-section images and integrated with well-log interpretation to reduce uncertainties.

Some limitations of thin-section image analysis were also observed. For optically scanned images, pores smaller than 10 μm had a mixed response of matrix and porosity, and a subjective but single adjustment factor was required to remove the matrix effect from such pores for all analyzed samples; this applies equally to automatic and manual thresholding techniques. Clustering derives porosity from the pore-filling blue epoxy, i.e., a blue cluster, and it was observed that clustering over-estimates porosity if blue is also present as a matrix color.
Such a situation is equally challenging for manual thresholding and can therefore be regarded as a general limitation of thin-section image analysis.

References

[1] R. C. Gonzalez and R. E. Woods, Digital Image Processing. New Jersey: Pearson, 2008.
[2] L. G. Shapiro and G. C. Stockman, Computer Vision, pp. 279-325. New Jersey: Prentice-Hall, ISBN 0-13-030796-3, 2001.
[3] L. Barghout and L. Lee, "Perceptual information processing system," U.S. Patent Application 10/618,543, filed March 25, 2004.
[4] P. Shanthakumar and P. Ganesh Kumar, "Computer aided brain tumor detection system using watershed segmentation techniques," International Journal of Imaging Systems and Technology, vol. 25, no. 4, pp. 297-301, 2015.
[5] E. B. George and M. Karnan, "MR brain image segmentation using bacteria foraging optimization algorithm," International Journal of Engineering and Technology (IJET), vol. 4, pp. 295-301, 2012.
[6] J. Delmerico, P. David and J. Corso, "Building facade detection, segmentation, and parameter estimation for mobile robot stereo vision," Image and Vision Computing, vol. 31, no. 11, pp. 1632-1639, 2013.
[7] D. L. Pham et al., "Current methods in medical image segmentation," Annual Review of Biomedical Engineering, vol. 2, pp. 315-337, 2000.
[8] M. Forouzanfar, N. Forghani and M. Teshnehlab, "Parameter optimization of improved fuzzy c-means clustering algorithm for brain MR image segmentation," Engineering Applications of Artificial Intelligence, vol. 23, no. 2, pp. 160-168, 2010.
[9] S. Kamalakannan, "Double-edge detection of radiographic lumbar vertebrae images using pressurized open DGVF snakes," IEEE Transactions on Biomedical Engineering, vol. 57, pp. 1325-1334, 2010.
[10] N. Otsu, "A threshold selection method from gray-level histograms," IEEE Transactions on Systems, Man, and Cybernetics, vol. 9, no. 1, pp. 62-66, 1979.
[11] O. Wirjadi, "Survey of 3D image segmentation methods," 2007.
[12] H. Mobahi, S. Rao, A. Yang, S. Sastry and Y. Ma, "Segmentation of natural images by texture and boundary compression," International Journal of Computer Vision, vol. 95, no. 1, pp. 86-98, 2011.
[13] Md. Abu Bakr Siddique, Mohammad Mahmudur Rahman Khan, Rezoana Bente Arif and Zahidun Ashrafi, "Study and observation of the variations of accuracies for handwritten digits recognition with various hidden layers and epochs using neural network algorithm," in 2018 4th International Conference on Electrical Engineering and Information & Communication Technology (iCEEiCT), pp. 118-123. IEEE, 2018.
[14] Rezoana Bente Arif, Md.
Abu Bakr Siddique, Mohammad Mahmudur Rahman Khan and Mahjabin Rahman Oishe, "Study and observation of the variations of accuracies for handwritten digits recognition with various hidden layers and epochs using convolutional neural network," in 2018 4th International Conference on Electrical Engineering and Information & Communication Technology (iCEEiCT), pp. 112-117. IEEE, 2018.
[15] Mohammad Mahmudur Rahman Khan, Rezoana Bente Arif, Md. Abu Bakr Siddique and Mahjabin Rahman Oishe, "Study and observation of the variation of accuracies of KNN, SVM, LMNN, ENN algorithms on eleven different datasets from UCI machine learning repository," in 2018 4th International Conference on Electrical Engineering and Information & Communication Technology (iCEEiCT), pp. 124-129. IEEE, 2018.
[16] M. Sezgin and B. Sankur, "Survey over image thresholding techniques and quantitative performance evaluation," Journal of Electronic Imaging, vol. 13, no. 1, pp. 146-166, 2004.
[17] Y. Zhang and L. Wu, "Optimal multi-level thresholding based on maximum Tsallis entropy via an artificial bee colony approach," Entropy, vol. 13, pp. 841-859, 2011.
[18] A. dos Anjos and H. Shahbazkia, "Bi-level image thresholding - a fast method," in BIOSIGNALS (2), pp. 70-76, 2008.
[19] P.-S. Liao, "A fast algorithm for multi-level thresholding," J. Inf. Sci. Eng., vol. 17, pp. 713-727, 2001.
[20] Ghassan H. Ali, Yahya J. Tawfeeq, and Mohammed Y. Najmuldeen, "Comparative estimation of water saturation in a carbonate reservoir: A case study of northern Iraq," Periodicals of Engineering and Natural Sciences, vol. 7, no. 4, pp. 1743-1754, 2019.
[21] Mohammed Y. Najmuldeen, Ali A. Fadhil, and Yahya J. Tawfeeq, "Petrophysical characterization of the Tertiary oil reservoir, northern Iraq," Periodicals of Engineering and Natural Sciences, ISSN 2303-4521, vol. 8, no. 2, 2020.
[22] Karrar Hayder Jassim and Jalal A. Al-Sudani, "Re-evaluation of petrophysical properties in Yammama Formation at Nasiriya field," Iraqi Journal of Chemical and Petroleum Engineering, vol. 20, no. 3, pp. 59-66, 2019.
[23] Yahya J. Tawfeeq, Mohammed Y. Najmuldeen and Ghassan H. Ali, "Optimal statistical method to predict subsurface formation permeability depending on open hole wireline logging data: A comparative study," Periodicals of Engineering and Natural Sciences, ISSN 2303-4521, vol. 8, no. 2, 2020.
[24] Sara S. Zughar, Ahmad A. Ramadhan and Ahmed K. Jaber, "Petrophysical properties of an Iraqi carbonate reservoir using well log evaluation," Iraqi Journal of Chemical and Petroleum Engineering, vol. 21, no. 1, pp. 53-59, 2020.
[25] T. W. Fens, "Petrophysical properties from small rock samples using image analysis techniques," Ph.D. thesis, Delft University of Technology, 2000.
[26] B. T. Zerabruk, A. Nermoen and P. H. Nadeau, "Digital image analysis for petrophysical characterization," M.Sc. thesis, University of Stavanger, 2017.
[27] M. Sonka, V. Hlavac and R. Boyle, Digital Image Processing and Computer Vision, India Edition. CENGAGE Learning, 2007.
[28] K. Fukunaga, Introduction to Statistical Pattern Recognition. Elsevier, 2013, pp. 260-267.
[29] J. Gong, L. Li and W. Chen, "Fast recursive algorithms for two-dimensional thresholding," Pattern Recognition, vol. 31, no. 3, pp. 295-300, 1998.
[30] Zhang, Jun and Hu, Jinglu, "Image segmentation based on 2D Otsu method with histogram analysis," in 2008 International Conference on Computer Science and Software Engineering, vol. 6, pp. 105-108.
[31] Zhu, Ningbo, Wang, Gang, Yang, Gaobo and Dai, Weiming, "A fast 2D Otsu thresholding algorithm based on improved histogram," in Chinese Conference on Pattern Recognition (CCPR 2009), pp. 1-5, 2009.
[32] Varfolomeev, I., Yakimchuk, I., Denisenko, A., Khasanov, I., Osinceva, N. and Rahmattulina, A., "Integrated study of thin sections: Optical petrography and electron microscopy," SPE 182071, 2016.
[33] Lawrence, M. and Jiang, Y., "Porosity, pore size distribution, micro-structure," in Bio-aggregates Based Building Materials, pp. 39-71. Springer, Dordrecht, 2017.