FACTA UNIVERSITATIS Series: Electronics and Energetics, Vol. 31, No. 2, June 2018, pp. 287-301
https://doi.org/10.2298/FUEE1802287B

VISION INSPECTION AND MONITORING OF WIND TURBINE FARMS IN EMERGING SMART GRIDS

Mahdi Bahaghighat, Seyed Ahmad Motamedi
Electrical Engineering Faculty, Amirkabir University of Technology, Tehran, Iran

Received June 10, 2017; received in revised form October 3, 2017
Corresponding author: Seyed Ahmad Motamedi, Electrical Engineering Faculty, Amirkabir University of Technology, Tehran, Iran (E-mail: motamedi@aut.ac.ir)

Abstract. Today, smart grids, the goal of the next-generation power grid, span wide and new aspects of power generation, from distributed and bulk power generators to end-user utilities. Developing these complex, multilayer systems of systems brings many advantages, such as increased agility, reliability, efficiency, privacy, and security for both the energy and ICT sections of the smart grid architecture. In emerging smart grids, the communication infrastructure plays a main role in grid development, and as a result multimedia applications become more practical for future power systems. In this work, we introduce our method for the monitoring and inspection of wind turbine (WT) farms in smart grids. In our proposed system, a thermal vision camera is embedded on a wireless sensor node for each WT to capture appropriate images and send video streams to the coordinator, which uses the video frames for machine Vision Inspection (VI) and monitoring purposes. In our constructed model, turbine blade velocity estimation is targeted by detecting two important landmarks in the image, named the hub and the blade. By tracking the blade in consecutive frames and using the proposed scoring function, we can estimate the velocity of the turbine blade. The obtained results clearly indicate that accurate extraction of the hub and blade positions leads to error-free estimation of the turbine blade velocity.

Key words: Vision Inspection, Thermal Vision, Gabor Wavelets, Template Matching, Wind Turbine, Smart Grids.

1. INTRODUCTION

Smart Grids (SGs), the future network evolved from legacy power grids, are sophisticated systems of systems that support bidirectional power and data flows. An SG benefits from renewable energies such as wind energy and should be eco-friendly [1]. The self-healing, two-way communication, decentralization, and predictive reliability of SGs make electricity network operation and maintenance more manageable and easier [2]. In Europe, a total of 211 SG projects in the R&D phase are worth approximately €820 million, and 248 projects under D&D have a total budget of around €2320 million [3]. These investments clearly indicate the importance of smart grids in the world's future. The European roadmap for smart grids is based on the SGAM model, which includes five interoperability layers across five domains and six zones [4]. Fig. 1 shows the SGAM architecture for smart grids. In addition to the European standards, there are strong models such as NIST Framework 1 and NIST Framework 2 [5] that are being developed by the related organizations in the USA. These comprehensive models cover all parts of SGs. In all of the mentioned SG models, telecommunication infrastructures play a major role in the development of next-generation power grid systems.
Fig. 1 SGAM architecture for Smart Grids [4]

These infrastructures can be deployed for multimedia applications [2], but a number of requirements must be addressed in order to have fully robust, reliable, and secure multimedia streaming in smart grid networks [2]. The most important requirements to be considered in the SG communication backbone are latency, frequency ranges, reliability, data rate, security, and throughput. For example, in [6] a total throughput in the range of 3–10 Mbps is estimated for SG communication systems in many applications, including multimedia communications. In addition, the authors suggest a frequency range under 2 GHz to obtain a low-cost solution that can overcome line-of-sight issues, e.g., foliage, rain fade, and penetration through walls [6]. In the SG architecture, several major network types are defined in the literature [6-8]: Home Area Network (HAN), Building Area Network (BAN), Neighborhood Area Network (NAN), Field Area Network (FAN), and Wide Area Network (WAN). This multi-tier network structure is illustrated in Fig. 2. Each network area has its own technical restrictions, and the areas have mutual impacts on one another. There are now increasing demands for energy monitoring through SG communication networks. Our work develops a dedicated WT energy monitoring system based on multimedia applications. We assume that WT farms are located in the NAN and/or FAN. In NAN/FAN applications, data rates usually vary between 100 Kbps and 10 Mbps [6], which is well suited to monitoring and video inspection purposes.

Fig. 2 Different network types in SGs

In addition to energy monitoring and its essential priority in the SG, wind turbines (WTs), as extremely high-cost devices, should receive advanced maintenance services [9]. Fig. 3 shows a comparison between traditional and modern maintenance approaches. For a WT whose components are made primarily of carbon fiber reinforced plastic (CFRP), it is essential to have a monitoring infrastructure and intelligent Vision Inspection (VI) during in-service operation [10]. In order to increase the lifetime of the WT farm and reduce maintenance costs, highly accurate predictions of system faults and failures are needed. In future grids, these predictions can be made available through advanced nondestructive testing (NDT) approaches such as intelligent VI. Blade monitoring and diagnostics based on intelligent vision inspection are a genuinely complex challenge, and until now there has been no work on it in the literature. Therefore, we conduct our research on this subject and introduce our architecture for monitoring wind turbine (WT) farms in smart grids. In our proposed system, a thermal vision camera [10] is embedded on a wireless sensor node for each WT to capture appropriate images and stream video to the coordinator, which uses the video streams to perform machine vision inspection and monitoring tasks.

Fig. 3 Comparison between traditional and intelligent maintenance methods [9]

Thermal cameras are passive sensors that capture the infrared radiation emitted by all objects with a temperature above absolute zero. Deploying this type of sensor in vision systems eliminates the illumination problems of normal greyscale and RGB cameras [11]. These cameras were originally developed as surveillance and night vision tools for military systems, but recently their prices have dropped significantly, which means that a broader field of applications can now use them.
In this work, turbine blade velocity estimation is targeted, and it can also be used to estimate the power generation of WT farms. To tackle this challenging issue, we define two landmarks in the received image that relate to the hub and the blade of a WT. Accurate and robust estimation of these two objects in consecutive frames leads to the turbine blade velocity, which is directly related to the power generated by a WT farm. Furthermore, our proposed structure, based on a thermal vision camera and a sensor node, is well suited to further vision inspection (VI) tasks such as detecting mechanical deformations, surface defects, and overheated components in rotor blades, nacelles, slip rings, yaw drives, bearings, gearboxes, generators, and transformers [9].

2. PROPOSED MODEL

In this section, we elaborate the details of our comprehensive model for real-time video streaming from a sensor node to the coordinator node. Our main motivation for constructing this architecture is that sensor nodes are usually low-end devices with limited hardware resources [12-14], so high-complexity VI algorithms cannot be expected to run with acceptable performance on a light sensor node. As a result, for practical purposes, we propose to run our structured model for turbine blade velocity estimation at the coordinator node, a medium- or high-end receiver node. In the following, we present our proposed blade velocity estimation algorithm.

2.1. Turbine blade velocity estimation system model

The coordinator, as the receiver node, receives the video output of the thermal vision camera. Within a predefined time window, it takes each frame and tries to find special objects in the current image as landmarks. As mentioned before, we define two landmarks in the image that relate to the hub and the blade of a WT (Fig. 4).

Fig. 4 Condition monitoring and diagnosis for a wind turbine

First, the fast and robust correlation-coefficient template matching approach is used for hub detection and localization in the received image, and then the mass center of the hub is calculated to set the coordinate system (Cx, Cy). Based on the estimated hub position, a bounding box with height BH and width BW is assigned to a point on the circumference of a circle with predefined radius ρ and angle θ in the obtained coordinate system. This primary phase makes the structure of our algorithm adaptive, so that it dynamically follows the variation of the hub position across a sequence of images. Then our Gabor wavelet filter banks, well tuned according to the blade parameters, are applied to the masked sub-image. Consequently, the presence or absence of a blade in this image is recognized by analysing the GW coefficients. Repeating this procedure over consecutive frames and applying a softening approach leads to an accurate estimation of the blade velocity. The proposed method for blade detection based on Gabor wavelets is described in Fig. 5, and an overall view of our model is presented in Fig. 6 to clarify the details of the algorithm.

Fig. 5 Proposed method for blade detection based on Gabor Wavelets
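To make the flow of Fig. 5 and Fig. 6 concrete, the sketch below outlines the per-frame loop in Python. It is only an illustrative skeleton under stated assumptions, not the Matlab implementation used in this work: detect_hub and blade_score are hypothetical callables standing in for the hub localization of Section 2.2 and the Gabor-based scoring of Section 2.3, while rho, theta, and s_thr correspond to ρ, θ, and S_thr.

import numpy as np

def blade_scores_over_window(frames, detect_hub, blade_score, s_thr, rho, theta):
    """Per-frame loop of Section 2.1: localize the hub, place the adaptive
    bounding box on a circle of radius rho at angle theta around it, score the
    masked sub-image with the Gabor bank, and keep only scores above S_thr."""
    scores = []
    for frame in frames:
        cx, cy = detect_hub(frame)                 # Section 2.2: template matching + moments
        bx = int(round(cx + rho * np.cos(theta)))  # bounding-box centre relative to the hub
        by = int(round(cy + rho * np.sin(theta)))
        s = blade_score(frame, bx, by)             # Section 2.3: Gabor filter-bank score S_BD
        scores.append(s if s >= s_thr else 0.0)    # Eq. (12): thresholded score for this frame
    return np.asarray(scores)                      # input to the velocity estimator of Section 2.4

The returned score sequence is exactly the series that Section 2.4 fits with the sum-of-sines model to recover the rotation frequency.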
2.2. Hub detection based on template matching

In the first step of our algorithm, it is necessary to localize the hub position as the first landmark object in the whole input image. This is the fundamental step, and it has a direct impact on the performance of the next steps. In this work, we use template matching (TM). TM plays an important role in many image processing applications. A TM approach seeks the point at which a sub-image, known as the template, best resembles its coincident region within a source image [15]. There are many methods for pattern and template matching [15-18], but for simplicity we use correlation-coefficient template matching [16] to find the hub in an input image. We therefore use Pearson's correlation coefficient [16]:

$$r = \frac{\sum_{x,y}(x-\bar{x})(y-\bar{y})}{\sqrt{\sum_{x,y}(x-\bar{x})^{2}\,\sum_{x,y}(y-\bar{y})^{2}}} \qquad (1)$$

The correlation coefficient can be interpreted as a correlation between a template image (with mean x̄) and an input image region (with mean ȳ) after both the template and the image have been z-normalized, i.e., rescaled so that the mean is zero and the standard deviation is one [12]. Illumination and contrast differences are thus eliminated before match quality is evaluated, which makes the correlation coefficient an ideal measure of match when robustness to variations in pattern brightness and contrast is required.

Fig. 6 Proposed algorithm for turbine blade velocity estimation

Our study shows that template matching based on the correlation coefficient can successfully identify potential target regions in the thermal camera video of a WT. After hub localization, the sub-image IH(x,y) is extracted and its mass center (Cx, Cy) is calculated by the following equations [19, 20]:

$$C_x = \frac{M_{10}}{M_{00}}, \qquad C_y = \frac{M_{01}}{M_{00}} \qquad (2)$$

$$M_{ij} = \sum_{x}\sum_{y} x^{i}\, y^{j}\, I_H(x,y) \qquad (3)$$

Now, based on the extracted reference point (Cx, Cy), the bounding box with height BH and width BW is assigned to the point on the circumference of a circle of radius ρ and angle θ. This primary phase makes the structure of our algorithm adaptive to variations of the hub position across a sequence of images.
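As a concrete illustration of this step, the following Python/OpenCV sketch performs the hub localization: cv2.matchTemplate with the TM_CCOEFF_NORMED method computes the normalized correlation coefficient of Eq. (1), and raw image moments give the mass center of Eqs. (2)-(3). The hub template is assumed to be a cropped grayscale reference image of the hub; this is an illustrative sketch, not the authors' Matlab code.

import cv2
import numpy as np

def locate_hub(frame_gray, hub_template):
    """Hub localization sketch (Section 2.2): correlation-coefficient template
    matching followed by the moment-based mass center of the matched sub-image."""
    # TM_CCOEFF_NORMED implements the normalized correlation coefficient of Eq. (1).
    response = cv2.matchTemplate(frame_gray, hub_template, cv2.TM_CCOEFF_NORMED)
    _, _, _, top_left = cv2.minMaxLoc(response)        # location of the best match
    h, w = hub_template.shape[:2]
    hub_roi = frame_gray[top_left[1]:top_left[1] + h,  # sub-image I_H(x, y)
                         top_left[0]:top_left[0] + w].astype(np.float32)
    m = cv2.moments(hub_roi)                           # raw moments M_ij, Eq. (3)
    cx = top_left[0] + m["m10"] / m["m00"]             # mass center (C_x, C_y), Eq. (2)
    cy = top_left[1] + m["m01"] / m["m00"]
    return cx, cy

The bounding box for the blade search is then centred at (cx + ρ cos θ, cy + ρ sin θ), as described above.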
2.3. Proposed Gabor wavelet feature extraction method for blade detection

From the morphological point of view, the turbine blade object is a directional pattern with a known inter-ridge spacing. There are many approaches in the literature for analysing directional patterns, for example the Radon transform [21], the Hough transform, and the Gabor Wavelet Transform (GWT) [22]. Among these methods, the GWT has special and unique properties. The important property of the GWT is that it minimizes the product of its standard deviations in the time and frequency domains; put another way, the uncertainty in the information carried by this wavelet is minimized. However, Gabor wavelets have the downside of being non-orthogonal, so an efficient decomposition onto the basis is difficult. Since their inception, various applications have appeared, from image processing to analyzing neurons in the human visual system [23, 24]. At this stage, we use our structured Gabor wavelet filter banks, which are well adapted to the turbine blade parameters. These tuned filter banks are applied to the masked sub-image IB(x,y). This image includes the neighbouring pixels around the central point of the bounding box located at (ρ,θ) in our polar coordinate system. In fact, our Gabor wavelet based feature extraction method is used to determine whether or not a blade is located in IB(x,y). Before elaborating the details of our blade detection algorithm, a brief view of the GWT should be presented. In [22], bidimensional Gabor wavelets, g_{w,θ}(x,y) = L(v)B(u), are used for directional pattern analysis, where B(u) is a band-pass filter centered on the frequency w, and L(v) is a Gaussian low-pass filter. A bidimensional Gabor wavelet is thus composed of a band-pass filter in the direction of the wave and a low-pass filter in the orthogonal direction:

$$L(v) = \frac{1}{\sqrt{2\pi}\,\sigma_v}\, e^{-\frac{v^{2}}{2\sigma_v^{2}}}, \qquad B(u) = \frac{1}{\sqrt{2\pi}\,\sigma_u}\, e^{-\frac{u^{2}}{2\sigma_u^{2}}}\, e^{jwu} \qquad (4)$$

with

$$u = x\cos(\theta) + y\sin(\theta) \qquad (5)$$

$$v = -x\sin(\theta) + y\cos(\theta) \qquad (6)$$

where σ²_v and σ²_u are the scale parameters in the direction of the wave and in its orthogonal direction, respectively [22, 25]. The real and imaginary parts of a 2D Gabor wavelet are depicted in Fig. 7.

Fig. 7 Real and imaginary parts of a 2D Gabor wavelet: (a) real part, (b) imaginary part

An example of Gabor filter bank feature extraction with 12 coefficients, in the case of 3 frequencies and 4 directions, is shown in Fig. 8. According to this figure, provided that the input image frequency and orientation are W* and θ* respectively, the Gabor wavelet coefficient for (W*, θ*) will be the maximum, and vice versa. In our work, in order to have the best adaptation to the blade direction and frequency features, we define (2Kθ+1) directions and (2KW+1) frequencies around (w0, θ0), so we have a Gabor bank including (2KW+1)(2Kθ+1) filters; we then compute the local projections of the normalized masked image, I'_B(x,y), onto the filter bank. I'_B(x,y) is the normalized version of IB(x,y), obtained from the equation below:

$$I'_B(x,y) = \frac{I_B(x,y) - m(x_0,y_0)}{v(x_0,y_0)} \qquad (7)$$

where m(x0,y0) and v(x0,y0) are the mean and standard deviation of IB(x,y), respectively, and the norm of I'_B satisfies equation (8):

$$\|I'_B\|^{2} = \frac{1}{N^{2}}\int_{-N/2}^{N/2}\int_{-N/2}^{N/2} I'^{\,2}_B(x,y)\,dx\,dy = 1 \qquad (8)$$

Now, we can calculate the bank of Gabor wavelet coefficients as follows:

$$a_{w,\theta} = \frac{\displaystyle\int_{-N/2}^{N/2}\int_{-N/2}^{N/2} I'_B(x,y)\, g_{w,\theta}(x,y)\,dx\,dy}{\|g_{w,\theta}\|} \qquad (9)$$

Fig. 8 An example of Gabor filter bank feature extraction in the case of 3 frequencies and 4 directions

Then the features are extracted by the proposed equation (10):

$$A_{w,\theta} = N_l^{2}\,\mathrm{Re}(a_{w,\theta}), \qquad (W^{*},\theta^{*}) = \operatorname*{Argmax}_{(w,\theta)}\,(A_{w,\theta}), \qquad
w = w_0 + k_w\,\Delta w,\ \ \theta = \theta_0 + k_\theta\,\Delta\theta,\ \ k_w \in \{0,\pm 1,\dots,\pm K_W\},\ \ k_\theta \in \{0,\pm 1,\dots,\pm K_\theta\} \qquad (10)$$

At the final step of our analysis, a decision should be made about the presence or absence of a blade in this frame. We define equation (11) for scoring the blade detection results:

$$S_{BD} = w_1\, A_{W^{*},\theta^{*}} + w_2 \sum_{(w,\theta)\in S} A_{w,\theta},\qquad S = \{(w,\theta)\,|\, w \neq W^{*},\ \theta \neq \theta^{*}\} \qquad (11)$$

where w1, w2 are weighting coefficients and SBD is the blade detection score. If a blade is present in IB, most of the A_{w,θ} coefficients will have a large value, near or greater than S_thr, while in the null case, where there is no blade in IB, the coefficients will have relatively low values. We propose the weighted scoring function of equation (11) to widen the score gap between the null and blade situations. Now, by comparing SBD to the predefined threshold S_thr, the final decision can be made: simply, if SBD ≥ S_thr, we have detected a blade in the masked image. We rewrite the blade detection score function for the current frame index k as:

$$S^{(k)}_{BD} = \begin{cases} S_{BD}, & S_{BD} \geq S_{thr}\\[2pt] 0, & \text{otherwise} \end{cases} \qquad (12)$$

So S^{(k)}_{BD} > 0 clearly means that a blade has been detected in frame k.
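The filter-bank projection and scoring of Eqs. (4)-(11) can be sketched in Python as follows. The kernels are built directly from Eq. (4) with NumPy; note, as assumptions of this illustration, that the feature combination uses the squared coefficient magnitude rather than N_l² Re(a_{w,θ}), and that the default σ_u, σ_v, w1, w2 values are placeholders, not the tuned parameters of this work.

import numpy as np

def gabor_kernel(shape, w, theta, sigma_u, sigma_v):
    """Complex 2D Gabor wavelet of Eq. (4): band-pass along the wave direction u,
    Gaussian low-pass along the orthogonal direction v (Eqs. 5-6)."""
    h, wid = shape
    y, x = np.mgrid[0:h, 0:wid].astype(np.float64)
    x -= (wid - 1) / 2.0
    y -= (h - 1) / 2.0
    u = x * np.cos(theta) + y * np.sin(theta)      # Eq. (5)
    v = -x * np.sin(theta) + y * np.cos(theta)     # Eq. (6)
    envelope = np.exp(-u**2 / (2 * sigma_u**2) - v**2 / (2 * sigma_v**2))
    return envelope * np.exp(1j * w * u)

def blade_detection_score(sub_img, w0, theta0, dw, dtheta, Kw, Kt,
                          sigma_u=4.0, sigma_v=8.0, w1=1.0, w2=0.25):
    """Blade score S_BD (Eqs. 7-11): normalize I_B, project it onto the
    (2Kw+1) x (2Kt+1) Gabor bank centred on (w0, theta0), and combine the
    strongest response with the remaining ones through the weights w1, w2."""
    img = (sub_img - sub_img.mean()) / (sub_img.std() + 1e-9)        # Eq. (7)
    feats = {}
    for kw in range(-Kw, Kw + 1):
        for kt in range(-Kt, Kt + 1):
            g = gabor_kernel(img.shape, w0 + kw * dw, theta0 + kt * dtheta,
                             sigma_u, sigma_v)
            a = abs(np.sum(img * np.conj(g))) / np.linalg.norm(g)    # Eq. (9)
            feats[(kw, kt)] = a ** 2                                 # feature for this (w, theta)
    best = max(feats, key=feats.get)                                 # (W*, theta*), Eq. (10)
    rest = sum(v for k, v in feats.items() if k != best)
    return w1 * feats[best] + w2 * rest                              # Eq. (11)

Comparing the returned score against S_thr then yields the per-frame decision of Eq. (12).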
This algorithm is repeated for all image frames in our predefined time window, TW.

2.4. Blade velocity estimation method

In the previous section, the blade detection method was presented. This procedure is called for every frame in the predefined time window TW. All the information needed for the velocity estimation is then available:

$$S_{BD} = \{S^{(i)}_{BD}\},\quad i = 1, 2, \dots, N_W,\qquad N_W = T_W \cdot f_s \qquad (13)$$

where fs is the video frame rate in frames per second (fps). In order to estimate the blade velocity, it is necessary to fit the gathered scores with an appropriate function, so we deploy a "Sum of Sine" approximation function f(x) including 3n parameters:

$$f(x) = \sum_{i=1}^{n} A_i \sin(W_i x + \varphi_i) \qquad (14)$$

Then we select the harmonic W_{i*} with the highest power A_{i*} as the frequency of blade rotation, where

$$i^{*} = \operatorname*{Argmax}_{i}\{A_i\},\quad i = 1, 2, 3, \dots, n \qquad (15)$$

Finally, the velocity of the blade can be calculated by our proposed equation:

$$V_B = \frac{1}{2\pi}\,(W_{i^{*}} f_s)\cdot 60 \quad (\mathrm{RPM}) \qquad (16)$$

3. SIMULATION AND RESULTS

In order to investigate the performance of our proposed system architecture, we use "Rotary Blade.avi", consisting of 440 image frames of size 512 by 512 at fs = 40 fps, in the Matlab environment. We receive the video stream at the coordinator and convert each buffered image to an 8-bit RGB image for further inspection. Fig. 9(a)-(d) clearly show that our algorithm accurately detects both the hub and the blade landmarks.

Fig. 9 Hub and blade detection: (a) original image, (b) hub detection by the proposed correlation-coefficient template matching, (c) adaptive bounding box at (ρ,θ), (d) proposed blade detection by the GWT

In Fig. 10(a)-(d), different blade orientations are presented. In all of these figures the hub is detected correctly, but in Fig. 10(a)-(c) the blade orientations do not match the GW filter bank, so for these frames SBD < S_thr, which means that no blade is detected. For Fig. 10(d), the orientation of the blade is well adapted to the GW filter bank and SBD ≥ S_thr is satisfied, so there is a blade in this sub-image.

Fig. 10 Hub and blade detection during blade rotation: (a)(b)(c) non-matched blade orientation, no blade is detected; (d) matched blade orientation, a blade is detected

We conducted our experiment on all 440 images. The obtained results indicate that the hub is detected correctly in 94.97% of the frames, and in approximately 50 frames (11.36%) a blade is recognized by our GWT feature extraction and scoring method. In Fig. 11, the gathered blade detector scores are presented for several sample frames of this video stream. For example, Fig. 11(a) shows that for frame indexes 264 to 268, in all 5 consecutive frames, a blade is detected with scores over 820. In fact, all five of these detections relate to the same unique blade and must be counted as one blade. This is due to the directional drifts we define in the proposed GW filter bank to increase the blade detection probability. In the same way, Fig. 11(b) presents two blade detections, for frame indexes 363 and 365, while Fig. 12 shows the results for the whole video stream.

Fig. 11 Blade detection scores for sample frames

As mentioned before, in order to estimate the blade velocity, it is necessary to fit the gathered scores with an appropriate function, so we deploy the "Sum of Sine" approximation function f(x) based on equation (14). In Fig. 12(b), the soft approximation of the blade score function is depicted.
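A compact sketch of this fitting step is given below. This work fits a multi-term "Sum of Sine" model in Matlab; for brevity the sketch fits a single sine term with SciPy, initializing the angular frequency from the FFT peak of the score sequence, and then applies Eq. (16). The function name and the single-term simplification are assumptions of this illustration.

import numpy as np
from scipy.optimize import curve_fit

def blade_rpm_from_scores(scores, fs):
    """Velocity estimation sketch (Section 2.4): fit the thresholded score
    sequence with one sine term of the sum-of-sines model (Eq. 14) and convert
    the fitted angular frequency (rad/frame) to RPM with Eq. (16)."""
    k = np.arange(len(scores), dtype=np.float64)          # frame index within T_W, Eq. (13)
    s = np.asarray(scores, dtype=np.float64) - np.mean(scores)
    spec = np.abs(np.fft.rfft(s))                         # coarse spectrum of the scores
    freqs = np.fft.rfftfreq(len(s), d=1.0)                # cycles per frame
    w_init = 2 * np.pi * freqs[np.argmax(spec[1:]) + 1]   # initial guess for W (skip DC bin)
    model = lambda x, A, W, phi: A * np.sin(W * x + phi)  # one term of Eq. (14)
    (A, W, phi), _ = curve_fit(model, k, s, p0=[np.max(np.abs(s)), w_init, 0.0])
    return abs(W) * fs * 60.0 / (2 * np.pi)               # Eq. (16): V_B in RPM

Combined with the per-frame loop sketched in Section 2.1, a call such as blade_rpm_from_scores(blade_scores_over_window(...), fs=40) would produce the kind of estimate reported below.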
Fig. 12 Blade detection scores vs. frame index: (a) without the softening function, (b) result of the "Sum of Sine" estimation

Now, we select the harmonic W_{i*} with the highest power (i* = Argmax_i{A_i}, i = 1, 2, 3, ..., n) as the frequency of blade rotation. Finally, the velocity of the blade is calculated by simply substituting W_{i*} = 0.1789 and fs = 40 into equation (16), which gives VB = 52.68 RPM for the simulated scenario. This value means that our model successfully estimates the blade velocity without any error.

In the final step of our analysis, we evaluate the robustness of our algorithm by adding zero-mean Gaussian noise to the video stream. Fig. 13 shows three noisy samples. Table 1 shows the effect of varying the noise variance σ²_n on the velocity estimation (the noise variance can vary between 0 and 1). The obtained results emphasize that our model is highly robust against noisy conditions.

Fig. 13 Adding zero-mean Gaussian noise to the input image: (a) σ²_n = 0.01, (b) σ²_n = 0.05, (c) σ²_n = 0.10

Table 1 Gaussian noise impact on velocity estimation accuracy

σ²_n        0       0.01    0.02    0.05    0.10
VB (RPM)    52.68   52.62   52.44   52.53   53.68

4. CONCLUSION

Many smart grid applications face harsh environmental conditions but have high reliability and maintenance requirements. In this work, an intelligent nondestructive test (NDT) based on vision inspection is modelled. In the proposed model, the velocity of the turbine blade is targeted as the main goal for energy monitoring, while the model is also well suited to further vision inspection (VI) procedures such as detecting mechanical deformations, surface defects, and overheated components in rotor blades, nacelles, slip rings, yaw drives, bearings, gearboxes, generators, and transformers. Our structured model is the first work in the related literature that fully concentrates on practical VI-based NDT maintenance for WTs in SGs, and the obtained results show the high accuracy and robustness of our algorithm. It is worth noting that our blade velocity estimation can also be used to estimate WT power generation [26].

REFERENCES

[1] M. E. El-Hawary, "The smart grid—state-of-the-art and future trends," Electric Power Components and Systems, vol. 42, pp. 239-250, 2014.
[2] V. C. Gungor, D. Sahin, T. Kocak, S. Ergut, C. Buccella, C. Cecati, et al., "A survey on smart grid potential applications and communication requirements," IEEE Transactions on Industrial Informatics, vol. 9, pp. 28-42, 2013.
[3] N. IqtiyaniIlham, M. Hasanuzzaman, and M. Hosenuzzaman, "European smart grid prospects, policies, and challenges," Renewable and Sustainable Energy Reviews, vol. 67, pp. 776-790, 2017.
[4] CEN-CENELEC-ETSI Smart Grid Coordination Group, Smart Grid Reference Architecture, 2012.
[5] NIST, "Framework and roadmap for smart grid interoperability standards," National Institute of Standards and Technology, 2010.
[6] M. Kuzlu, M. Pipattanasomporn, and S. Rahman, "Communication network requirements for major smart grid applications in HAN, NAN and WAN," Computer Networks, vol. 67, pp. 74-88, 2014.
[7] A. Usman and S. H. Shami, "Evolution of communication technologies for smart grid applications," Renewable and Sustainable Energy Reviews, vol. 19, pp. 191-199, 2013.
Shami, "Evolution of communication technologies for smart grid applications," Renewable and Sustainable Energy Reviews, vol. 19, pp. 191-199, 2013. [8] H. Li, L. Lai, and W. Zhang, "Communication requirement for reliable and secure state estimation and control in smart grid," IEEE Transactions on Smart Grid, vol. 2, pp. 476-486, 2011. [9] P. Tchakoua, R. Wamkeue, M. Ouhrouche, F. Slaoui-Hasnaoui, T. A. Tameghe, and G. Ekemb, "Wind turbine condition monitoring: State-of-the-art review, new trends, and future challenges," Energies, vol. 7, pp. 2595-2630, 2014. [10] C.-S. Tsai, C.-T. Hsieh, and S.-J. Huang, "Enhancement of damage-detection of wind turbine blades via CWT-based approaches," IEEE Transactions on energy conversion, vol. 21, pp. 776-781, 2006. [11] R. Gade and T. B. Moeslund, "Thermal cameras and applications: a survey," Machine vision and applications, vol. 25, pp. 245-262, 2014. [12] A. B. Nikolic, N. Neskovic, R. Antic, and A. Anastasijevic, "Industrial wireless sensor networks as a tool for remote on-line management of power transformers'heating and cooling process," Facta Universitatis, Series: Electronics and Energetics, vol. 30, pp. 107-119, 2016. [13] M. Milutinov, N. Đurić, N. Pekarić-Nađ, D. Mišković, and D. Knežević, "Multiband sensors for wireless electromagnetic field monitoring system-SEMONT," Facta Universitatis, Series: Electronics and Energetics, vol. 25, pp. 137-150, 2012. [14] M. R. Kosanović and M. K. Stojčev, "RPLL-Rendezvous protocol for long-living sensor node," Facta Universitatis, Series: Electronics and Energetics, vol. 28, pp. 85-102, 2015. [15] E. Cuevas, V. Osuna, and D. Oliva, "Template Matching," presented at the Evolutionary Computation Techniques: A Comparative Perspective, 2017. [16] G. Jasvilis, C. Weise, and B. Zenger-Landolt, "Finding complex patterns using template matching," 2016. [17] A. Mahmood, A. Mian, and R. Owens, "On Optimizing Auto-correlation for Fast Template Matching Through Transitive Elimination," arXiv preprint arXiv:1407.3535, 2014. [18] F. Zhong, S. He, and B. Li, "Blob analyzation-based template matching algorithm for LED chip localization," The International Journal of Advanced Manufacturing Technology, pp. 1-9, 2015. [19] Y. Zhang, S. Wang, P. Sun, and P. Phillips, "Pathological brain detection based on wavelet entropy and Hu moment invariants," Bio-medical materials and engineering, vol. 26, pp. S1283-S1290, 2015. [20] J. Flusser and T. Suk, "Rotation moment invariants for recognition of symmetric objects," IEEE Transactions on Image Processing, vol. 15, pp. 3784-3790, 2006. [21] M. K. Bahaghighat and J. Mohammadi, "Novel approach for baseline detection and Text line segmentation," International Journal of Computer Applications, vol. 51, 2012. [22] M. K. Bahaghighat and R. Akbari, "Fingerprint image enhancement using GWT and DMF," in Proce. of the 2nd International Conference on Signal Processing Systems (ICSPS), 2010, pp. V1-253-V1-257. [23] S. Tang, "Face recognition method based on gabor wavelet and memetic ecological algorithm," Biomedical Research, pp. 1-1, 2017. [24] T. S. Lee, "Image representation using 2D Gabor wavelets," IEEE Transactions on pattern analysis and machine intelligence, vol. 18, pp. 959-971, 1996. [25] N. Karimimehr and A. A. B. Shirazi, "Fingerprint image enhancement using gabor wavelet transform," in Proceedings of the 18th Iranian Conference on Electrical Engineering (ICEE), 2010, pp. 316-320. [26] S. Li, D. C. Wunsch, E. A. O'Hair, and M. G. 
Giesselmann, "Using neural networks to estimate wind turbine power generation," IEEE Transactions on energy conversion, vol. 16, pp. 276-282, 2001.