HUNGARIAN JOURNAL OF INDUSTRIAL CHEMISTRY VESZPRÉM Vol. 39(2) pp. 237-242 (2011)

PROTOTYPE OF A MULTI-ROBOT SYSTEM FOR AUTONOMOUS GAS MAPPING IN POLLUTED ENVIRONMENTS

D. LUPEA, I. SZŐKE, A. MAJDIK, GH. LAZEA
Technical University of Cluj-Napoca, Department of Automation, C. Daicoviciu 15, Cluj-Napoca, ROMANIA, E-mail: diana.lupea@aut.utcluj.ro

The need for environmental monitoring robots is driven by current trends related to sustainability, towards protecting the environment and human health. This research area has received considerable attention in recent years, and its main directions of study are reviewed below. They are varied, ranging from robots designed to monitor ammonia pollution near animal farms, to robots which search for unexploded ordnance or for sources of leaks from pipes and tanks, and even to a garbage-collecting and -segregating robot which also monitors pollution while performing its other tasks. This paper describes a novel system that aims to explore dangerous sites (old mines, chemical plants, etc.), where an autonomous ground robot and a micro unmanned aerial vehicle (UAV) are sent together to map the area in order to detect toxic gases. The new approach proposes a multi-robot architecture equipped with a multi-sensor gas detection measurement unit and a Simultaneous Localization and Mapping (SLAM) algorithm for 3D autonomous gas mapping. Current research is trying to shift the use of robots for gas detection from indoor environments towards the outdoors, and several research directions with applications based on ecological robots can be found.

Keywords: autonomous multi-robot system, mapping, navigation, polluted environments, multi-sensor.

Introduction

The interest in using chemical sensors in robotic systems began in the early nineties, focusing mainly on the development of two types of robots: the first is able to follow odour trails marked on the ground, as in the case of ants following a pheromone trail [1]; the second tracks odour trails formed in fluid media to find the source of a chemical spill [2, 3, 4]. Other robots for monitoring gas pollution in areas of interest have been proposed in the literature: robots designed to monitor ammonia pollution near animal farms [5], robots designed to go into old mines and detect dangers of gas intoxication, and so on [6]. In the case of underground search, finding unexploded ordnance, land mines, and sources of leaks from pipes and tanks are some examples. DustCart [7] was developed by numerous European companies and universities; the project aims to build a garbage-collecting and -segregating robot. In addition, it can act as a mobile pollution monitor by recording atmospheric pollution levels. Another proposal is the artificial cloud concept of professor Abdul Ghani from Qatar University. He proposed a system of robots mimicking clouds for the 2022 World Cup, in order to shade spectators from the scorching sun of the region. The robots are supposed to be made of lightweight carbon, mimic the shape of clouds, and contain gas (helium); they would be solar powered and remotely controllable in order to protect spectators and athletes. Another example is the effort of E.ON RUHRGAS AG to develop an infrared laser-based remote sensing system called CHARM® (CH4 Airborne Remote Monitoring) which, when installed on board helicopters used for routine aerial surveys (Fig. 1), can be utilized to check natural gas pipelines for tightness and to detect very small traces of methane [8].
Figure 1: System setup inside the CHARM helicopter

In the field of chemical sensor research there have been numerous scientific works dealing with the detection of chemically toxic gases. Real-time information on the composition of combustion gases is important for improving efficiency and reducing emissions. One approach to monitoring gas composition is to collect a gas sample, which can then be analyzed either by spectroscopic techniques or by room-temperature gas sensors. However, the gas composition could change during cooling to room temperature, so analysis of the gas at high temperature is preferred. Such analysis can be performed using optical techniques based on infrared radiation or laser spectroscopy. Placement of a sensor directly in the high-temperature gas would avoid such interferences, so sensors for in situ monitoring of combustion gas components, including oxygen (O2), hydrogen sulfide (H2S), sulfur oxides (SOx), methane (CH4) and hydrocarbons, at the high temperatures typical of combustion processes have been developed. Among the various approaches used, solid electrolyte based sensors are particularly well suited to high-temperature aggressive environments. Nowadays this research field, aimed at finding places where chemical sources are present in the air, on the ground surface or underground, has become widespread.

System description

Our system proposal aims to develop a multi-sensor gas detection measurement unit specially designed for mobile robotic platforms. The novel system aims to explore dangerous sites (old mines, chemical plants, etc.), where an autonomous ground robot and a micro unmanned aerial vehicle (UAV) are sent together to map the area in order to detect toxic gases. The novel multi-robot architecture will be equipped, in addition to the gas detection measurement unit, with sensors that allow a precise localization of the robots and a complex perception of the surroundings: a stereo camera system, a laser unit, a Kinect sensor, a 3D compass, and a GPS module (Fig. 2). The team creates a topological map and a gas distribution map of the environment, which is explored at regular intervals. If pollution is detected, a warning signal is sent to the server over the wireless network. The warning message contains the position of the gas spill and the quantity of pollutant. Using this information, a rescue team can be dispatched to avoid more tragic events, such as explosions which can harm the workers. In the following sections the components of the system and the methods used are described.

Figure 2: Plant scene with the proposed system

Theoretical background of the chemical sensors

There are two electrochemical types of gas detection sensor: (i) the amperometric cell, known as the Clark electrode, and (ii) the solid-state electrochemical cell based on metal oxides or semiconductor electrodes.

(i) For the first type, the galvanic detection principle is based on the diffusion of gas molecules through the membrane, following the partial pressure gradient between the water and the detector chamber, according to Henry's law. Hence, the concentration in the detector space is directly correlated with the concentration in the outside water. The correlation is expressed by the calibration formula [9].

(ii) The simplest and most direct use of a solid electrolyte in a chemical sensor is to measure the concentration of the ion that is mobile in the electrolyte. However, the number of available solid electrolytes is limited, so electrolytes that conduct the desired species are often not available. In such cases, an additional phase, referred to as an auxiliary electrode, can be used to provide an equilibrium reaction between the target species and the ion that is mobile in the solid electrolyte, so that the voltage generated by the electrochemical cell is related to the concentration of the target species. While increasing the number of components complicates the design, fabrication and operation of the sensor, such complications are often necessary to achieve satisfactory performance [10].
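The relation between cell voltage and target-gas concentration in such potentiometric cells is, in its textbook form, a Nernst-type equation; the expression below is given only as a generic illustration and is not the specific calibration formula of [9] or the particular cell chemistry reviewed in [10]:

$$E = E^{0} - \frac{RT}{nF}\,\ln a_{\mathrm{target}},$$

where $E$ is the measured cell voltage, $E^{0}$ the standard potential of the cell reaction, $R$ the gas constant, $T$ the absolute temperature, $n$ the number of electrons transferred, $F$ the Faraday constant, and $a_{\mathrm{target}}$ the activity (in practice, approximately the partial pressure or concentration) of the target species at the sensing electrode.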
To test the developed measurement unit, the gases proposed for study will be generated. Under the generated conditions, experiments and tests will be performed to determine the concentrations of the toxic gases. The results obtained with the new measurement unit will be analyzed in comparison with measurements obtained by commercial, standard sensors or by other types of determination. In the case of toxic gases the security and safety rules will be respected.

Sulfur oxides (SOx), produced mainly by the burning of substances containing sulfur, are one of the typical air pollutant gas species causing acid precipitation, and the suppression of SOx gas emissions into the atmosphere is an urgent issue. In order to reduce SOx gas emission, it is essential to develop SOx gas sensing tools which can detect the SOx gas concentration rapidly and accurately at every emitting site. For this purpose, compact SOx gas sensors based on semiconductors, liquid electrolytes and solid electrolytes with rapid and accurate sensing are required. The detection of SO2 by solid-state electrochemical techniques using solid electrolytes such as Li2SO4–Ag2SO4, β-Al2O3, ZrO2, NASICON (sodium super ionic conductor) and the Al3+ ion conducting (Al0.2Zr0.8)20/19Nb(PO4)3 has been reported [11, 12].

Methane (CH4), which is the main component of city gas, is highly explosive even at low concentrations. Therefore, methane detection has become an important issue with increasing urbanization. A CH4 sensor using a LaGaO3-based electrolyte was successfully fabricated [13].

Oxygen consumption is one of the most important indices of biological activity during cell culture and microbial development. In the past two decades, owing to the progress in semiconductor and micromachining techniques, various types of miniature Clark-type oxygen sensors have been proposed, as well as solid-state sensors based on materials such as BaFe0.8Ta0.2O3 [14].

Theoretical background of the vision and laser system

Proposed algorithm

In Fig. 3 the diagram of the proposed algorithm is shown. In the first step the algorithm needs a prior map of the environment. The offline, prior map is a 3D feature map containing all the 3D image features collected by the robot while it navigates throughout the environment. For the visual perception of the mobile robot we used the SURF image feature detector and descriptor. By detecting identical features in both images of a stereo pair, 3D features can be computed. Details about the SURF features and the stereo geometry of the visual system are given later on. At every position the online 3D feature point cloud is searched for in the saved map, based on the SURF descriptors. For the features found, a point cloud is selected from the offline map. To estimate the displacement between these two feature clouds a variant of the RANSAC algorithm is used. The features that are considered outliers are excluded from the online point cloud. The algorithm is repeated iteratively.

Figure 3: The flow diagram of the proposed algorithm
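As an illustration of the displacement-estimation step, the sketch below wraps a RANSAC loop around a least-squares rigid alignment (Kabsch/SVD) of two clouds of matched 3D features. It is a minimal sketch under our own assumptions (function names, thresholds and the NumPy implementation are ours), not the exact RANSAC variant used in the paper:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping P onto Q (Kabsch/SVD)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                            # proper rotation (det = +1)
    t = cQ - R @ cP
    return R, t

def ransac_displacement(online, offline, iters=200, inlier_thresh=0.05):
    """Estimate (R, t) between two Nx3 arrays of corresponding 3D features
    (matched beforehand by SURF descriptor distance)."""
    n = online.shape[0]
    best_inliers = np.zeros(n, dtype=bool)
    rng = np.random.default_rng(0)
    for _ in range(iters):
        idx = rng.choice(n, size=3, replace=False)    # minimal sample for a rigid transform
        R, t = rigid_transform(online[idx], offline[idx])
        err = np.linalg.norm((online @ R.T + t) - offline, axis=1)
        inliers = err < inlier_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refit on the best inlier set; the rejected features are the outliers
    R, t = rigid_transform(online[best_inliers], offline[best_inliers])
    return R, t, best_inliers
```

The inlier set returned by such a loop corresponds to the features kept in the online point cloud, while the rejected outliers are the ones excluded before the next iteration.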
SURF for interest point detection

To solve the localization problem, the recognition of landmarks in the environment is a key issue. Our algorithm uses the image feature detector and descriptor called SURF. SURF is an abbreviation of Speeded Up Robust Features; it is a method, well known in the field of computer vision, for detecting interest points in images. SURF has the advantage of computational speed compared to other image feature detectors such as the Scale-Invariant Feature Transform (SIFT) presented in [15], while maintaining approximately the same or even better performance regarding repeatability, distinctiveness and robustness. In [16] it is concluded that SURF is one of the most suitable descriptors for visual simultaneous localization and mapping applications. The algorithm and its performance are discussed at length in [17] and [18]; here only a short overview is presented.

The main reason why the algorithm decreases the computational time is that SURF uses integral images [19] as an intermediate representation; in this way the sum of intensities over any upright rectangular region is calculated with only four additions. The SURF algorithm is based on the Fast-Hessian detector, which computes the determinant of the Hessian matrix:

$$H(p,\sigma) = \begin{bmatrix} L_{xx}(p,\sigma) & L_{xy}(p,\sigma) \\ L_{xy}(p,\sigma) & L_{yy}(p,\sigma) \end{bmatrix}, \qquad (1)$$

where $L_{xx}(p,\sigma)$ denotes the convolution of the second-order Gaussian derivative $\partial^2 g(\sigma)/\partial x^2$ with the image at point $p(x,y)$ and scale $\sigma$ in the x direction; $L_{xy}(p,\sigma)$ and $L_{yy}(p,\sigma)$ are calculated similarly. These derivatives are known as the Laplacian of Gaussian. Another reason why the SURF algorithm performs fast is that it approximates the Laplacian of Gaussian with a box filter representation. The box filters allow a performance increase in computation time because they can be evaluated on integral images.
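The integral-image trick mentioned above can be sketched in a few lines of code; the helper names below are illustrative only and are not taken from the paper:

```python
import numpy as np

def integral_image(img):
    """Cumulative sum over rows and columns, padded with a zero row/column,
    so that ii[y, x] holds the sum of img[:y, :x]."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def box_sum(ii, y0, x0, y1, x1):
    """Sum of intensities in the upright rectangle [y0, y1) x [x0, x1),
    obtained from only four array look-ups regardless of rectangle size --
    the property exploited by the SURF box-filter approximation."""
    return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]

# usage sketch
img = np.random.rand(480, 640)
ii = integral_image(img)
assert np.isclose(box_sum(ii, 10, 20, 50, 90), img[10:50, 20:90].sum())
```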
Simultaneous localization and mapping

Simultaneous localization and mapping (SLAM) is a technique used by autonomous vehicles to build a map of an unknown environment (without a priori knowledge), or to update a map of a known environment (with a priori knowledge from a given map), while at the same time keeping track of the vehicle's current location. SLAM consists of multiple parts: landmark extraction, data association, state estimation, state update and landmark update, and there are many ways to solve each of them. The critical problems for large-scale implementations [20] are described next. First, data association, that is, finding correspondences between map landmarks and robot sensor measurements, becomes difficult in complex, cluttered environments, especially if the robot location is uncertain. Second, the information required to maintain a consistent map using traditional methods imposes a prohibitive computational burden as the map increases in size. Third, the mathematics of SLAM relies on assumptions of small errors and near-linearity, and these become invalid for larger maps. In outdoor environments, the problems of scale are exacerbated by complex structure and rugged terrain. This can impede the detection of stable discrete landmarks, and can degrade the utility of motion estimates derived from wheel-encoder odometry [21].

The main scope of a SLAM system is to obtain an estimate of the probability distribution, at time t, over all possible robot states x and over all maps of the environment Θ, by means of the previous sensor measurements z_t and navigation commands u_t:

$$p(x_t, \Theta \mid z_t, u_t). \qquad (2)$$

This distribution is called the SLAM posterior. Considering the general state-space model with hidden variables $h_t = \{h_0, h_1, \ldots, h_t\}$ and observed variables $o_t = \{o_0, o_1, \ldots, o_t\}$, the goal is to perform inference on the hidden variables. Given the known observations, Bayesian inference relies on the joint posterior distribution $p(h_t \mid o_t)$. If the hidden variables have an initial distribution $p_0(h_0)$ and a transition model $p(h_t \mid h_{t-1})$, and the observations are conditionally independent, there exists the following recursive expression for the posterior, which can be updated using only the observation and transition models:

$$p(h_t \mid o_t) = \frac{p(o_t \mid h_t)\, p(h_t \mid h_{t-1})\, p(h_{t-1} \mid o_{t-1})}{p(o_t \mid o_{t-1})}. \qquad (3)$$

Particle filters are mathematical models that represent the probability distribution as a set of discrete particles occupying the state space. In the update step a new particle distribution is generated, given the motion model and the controls applied. In our case the particle filter is a sequential Monte Carlo approximation of the recursive Bayesian filter. Particle filters provide a good implementation of Bayesian filtering for systems whose belief state, process noise, and sensor noise are modelled by non-parametric probability density functions [22]. Here, the particle filter keeps a discrete approximation of the SLAM posterior by means of a large set of particles over the state space of the algorithm, $s_t = \{x_t, \Theta\}$. We thus obtain a weighted approximation of the Bayes filter:

$$p(s_t \mid z_t, u_t) \approx \sum_{i=1}^{N} w_t^{(i)}\, \delta\!\left(s_t - s_t^{(i)}\right), \qquad (4)$$

where N is the number of samples. The particles are updated by taking samples

$$s_t^{(i)} \sim q\!\left(s_t \mid s_{t-1}^{(i)}\right), \qquad (5)$$

where the proposal q is a distribution from which samples can easily be drawn. The weight update is then

$$w_t^{(i)} \propto w_{t-1}^{(i)}\, \frac{p\!\left(z_t \mid s_t^{(i)}\right)\, p\!\left(s_t^{(i)} \mid s_{t-1}^{(i)}\right)}{q\!\left(s_t^{(i)} \mid s_{t-1}^{(i)}, z_t\right)}. \qquad (6)$$

Usually the motion model $p(x_t \mid x_{t-1})$ is used as the proposal, because it is a normal distribution and allows us to simplify the weight update equation to

$$w_t^{(i)} \propto w_{t-1}^{(i)}\, p\!\left(z_t \mid s_t^{(i)}\right). \qquad (7)$$

A problem with using $s_t$ to represent the state space is that particle filters sample over the entire state space of the robot. As long as this state space is low-dimensional, as in the case of estimating the position of a vehicle, a few hundred samples will adequately represent the true distribution. For SLAM, however, the state space consists of the robot position and the map, so it becomes very large: each particle is a sample from all possible maps and all possible positions within those maps. A solution to this problem is the Rao-Blackwellized particle filter [23], which keeps the map portion of the state in a computationally efficient closed form.
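To make equations (4)-(7) concrete, the following is a bare-bones sequential importance resampling step for the pose part of the state only (i.e. a plain particle filter, not the Rao-Blackwellized variant that also carries a map per particle). The model functions, noise levels and names are our own illustrative assumptions:

```python
import numpy as np

def particle_filter_step(particles, weights, u_t, z_t, motion_model, likelihood, rng):
    """One particle filter update that samples from the motion model,
    so the weight update reduces to eq. (7): w_t proportional to w_{t-1} * p(z_t | s_t)."""
    # propagate each particle through the stochastic motion model, eq. (5)
    particles = np.array([motion_model(p, u_t, rng) for p in particles])
    # re-weight by the measurement likelihood, eq. (7), and normalize
    weights = weights * np.array([likelihood(z_t, p) for p in particles])
    weights /= weights.sum()
    # resample when the effective sample size collapses
    n_eff = 1.0 / np.sum(weights ** 2)
    if n_eff < 0.5 * len(particles):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# toy models: 2D position with Gaussian motion noise and a range measurement to the origin
def motion_model(p, u, rng):
    return p + u + rng.normal(0.0, 0.1, size=2)

def likelihood(z, p):
    return np.exp(-0.5 * ((np.linalg.norm(p) - z) / 0.2) ** 2)

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, size=(500, 2))
weights = np.full(500, 1.0 / 500)
particles, weights = particle_filter_step(
    particles, weights, u_t=np.array([0.1, 0.0]), z_t=1.0,
    motion_model=motion_model, likelihood=likelihood, rng=rng)
```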
Ellipse Fitting

In this paper the quadratic curve is used in the following form:

$$ax^2 + 2bxy + cy^2 + 2dx + 2fy + g = 0. \qquad (8)$$

The coefficients a, b, c, d, f, g do not by themselves describe the shape of the ellipse; they are not a unique solution of this equation, and even if they are all multiplied by a number δ ≠ 0 the shape of the ellipse will not change. Despite this, once the coefficients of the algebraic equation have been calculated it is possible to extract the parameters of the ellipse. The five ellipse parameters are: the coordinates of its centre $x_0$ and $y_0$, the lengths of the semi-major axis $a'$ and of the semi-minor axis $b'$, and the angle θ between the x-axis and the major axis. Defining

$$\Delta = \begin{vmatrix} a & b & d \\ b & c & f \\ d & f & g \end{vmatrix}, \qquad (9)$$

$$J = \begin{vmatrix} a & b \\ b & c \end{vmatrix}, \qquad I = a + c, \qquad (10)$$

the curve is an ellipse if the following conditions hold:

$$\Delta \neq 0, \qquad J > 0, \qquad \Delta / I < 0. \qquad (11)$$

Assuming that the ellipse is non-degenerate, if it is not a circle then $a \neq c$, and we know that it is not a point, since

$$J = ac - b^2 \neq 0. \qquad (12)$$

After obtaining the coefficients of the algebraic equation, the parameters of the ellipse can be calculated:

- the centre of the ellipse:

$$x_0 = \frac{cd - bf}{b^2 - ac}, \qquad y_0 = \frac{af - bd}{b^2 - ac}; \qquad (13)$$

- the semi-axis lengths:

$$a' = \sqrt{\frac{2\left(af^2 + cd^2 + gb^2 - 2bdf - acg\right)}{(b^2 - ac)\left[\sqrt{(a - c)^2 + 4b^2} - (a + c)\right]}}, \qquad (14)$$

$$b' = \sqrt{\frac{2\left(af^2 + cd^2 + gb^2 - 2bdf - acg\right)}{(b^2 - ac)\left[-\sqrt{(a - c)^2 + 4b^2} - (a + c)\right]}}; \qquad (15)$$

- the counterclockwise angle of rotation:

$$\theta = \begin{cases} 0, & \text{for } b = 0 \text{ and } a < c, \\ \dfrac{\pi}{2}, & \text{for } b = 0 \text{ and } a > c, \\ \dfrac{1}{2}\cot^{-1}\!\left(\dfrac{a - c}{2b}\right), & \text{for } b \neq 0 \text{ and } a < c, \\ \dfrac{\pi}{2} + \dfrac{1}{2}\cot^{-1}\!\left(\dfrac{a - c}{2b}\right), & \text{for } b \neq 0 \text{ and } a > c. \end{cases} \qquad (16)$$

The ellipse's parametric equation is:

$$\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x_0 \\ y_0 \end{bmatrix} + \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} a'\cos t \\ b'\sin t \end{bmatrix}, \qquad 0 \le t \le 2\pi. \qquad (17)$$

The objective is to calculate the ellipse's parameters using the coefficients of the algebraic equation (8).
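A direct transcription of equations (13)-(17) into code is straightforward; the helpers below are our own illustrative sketch and assume the coefficients have already been checked against the ellipse conditions (11):

```python
import numpy as np

def acot(x):
    # inverse cotangent with range (0, pi), as used in eq. (16)
    return np.pi / 2.0 - np.arctan(x)

def ellipse_from_coefficients(a, b, c, d, f, g):
    """Return (x0, y0, a_semi, b_semi, theta) of the ellipse
    a*x^2 + 2*b*x*y + c*y^2 + 2*d*x + 2*f*y + g = 0, following eqs. (13)-(16)."""
    denom = b * b - a * c
    # centre, eq. (13)
    x0 = (c * d - b * f) / denom
    y0 = (a * f - b * d) / denom
    # semi-axis lengths, eqs. (14)-(15)
    num = 2.0 * (a * f * f + c * d * d + g * b * b - 2.0 * b * d * f - a * c * g)
    root = np.sqrt((a - c) ** 2 + 4.0 * b * b)
    a_semi = np.sqrt(num / (denom * (root - (a + c))))
    b_semi = np.sqrt(num / (denom * (-root - (a + c))))
    # counterclockwise rotation angle, eq. (16)
    if b == 0.0:
        theta = 0.0 if a < c else np.pi / 2.0
    elif a < c:
        theta = 0.5 * acot((a - c) / (2.0 * b))
    else:
        theta = np.pi / 2.0 + 0.5 * acot((a - c) / (2.0 * b))
    return x0, y0, a_semi, b_semi, theta

def ellipse_points(x0, y0, a_semi, b_semi, theta, n=100):
    """Sample the ellipse boundary with the parametric form of eq. (17)."""
    t = np.linspace(0.0, 2.0 * np.pi, n)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    pts = R @ np.vstack((a_semi * np.cos(t), b_semi * np.sin(t)))
    return pts[0] + x0, pts[1] + y0

# example: x^2/4 + y^2 = 1 written as 0.25*x^2 + y^2 - 1 = 0
print(ellipse_from_coefficients(0.25, 0.0, 1.0, 0.0, 0.0, -1.0))  # centre (0,0), axes 2 and 1
```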
Experimental Results

The tested algorithms are based on Gaussian Mixture Models, the Expectation-Maximization algorithm and the formulas presented in the Ellipse Fitting section. The acquired data set is clustered into groups which represent the same object. As can be seen in Fig. 4, the environment presents the office and the detected obstacles, the desks, shown in different colours. An ellipse is fitted to each obstacle, and a trajectory is built from the set of waypoints obtained by connecting the neighbouring ellipses. This path planning method is combined with the localization algorithm based on SURF. The objective is to obtain an efficient localization which allows the tracking of the calculated trajectory (Fig. 5).

Figure 4: The optimal trajectory obtained after applying the algorithm

Figure 5: Result of the proposed algorithm after four steps (top view)

Future Work

Our future work is aimed at obtaining a multi-robot system capable of performing active large-scale simultaneous localization and mapping. To accomplish this goal with efficient data management, bag-of-words representations will be used. The laser system will be combined with Markovian properties. For the active SLAM, a submapping technique which uses conditionally independent graphs will be used.

ACKNOWLEDGEMENT

This paper was supported by the project "Doctoral studies in engineering sciences for developing the knowledge based society-SIDOC", contract no. POSDRU/88/1.5/S/60078, a project co-funded by the European Social Fund through the Sectoral Operational Programme Human Resources Development 2007–2013.

REFERENCES

1. H. ISHIDA, T. MORIIZUMI: Machine olfaction for mobile robots, in: T. C. Pearce, S. S. Schiffman, H. T. Nagle, J. W. Gardner (eds.): Handbook of Machine Olfaction: Electronic Nose Technology, Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim, 2006
2. G. FERRI, E. CASELLI, V. MATTOLI, A. MONDINI, B. MAZZOLAI: SPIRAL: A novel biologically-inspired algorithm for gas/odor source localization in an indoor environment with no strong airflow, Robotics and Autonomous Systems, 57(4), (2009), 1413–1418
3. H. ISHIDA, T. NAKAMOTO, T. MORIIZUMI: Remote sensing of gas/odor source location and concentration distribution using mobile systems, Sensors and Actuators, 49(1-2), (1998), 52–57
4. C. LAUGIER, R. CHATILA: Autonomous Navigation in Dynamic Environments, Springer Tracts in Advanced Robotics, Vol. 35, (2007)
5. D. MARTINEZ, O. ROCHEL, E. HUGHES: A biomimetic robot for tracking specific odors in turbulent plumes, Autonomous Robots, 20(3), (2006), 185–195
6. I. SZOKE, A. MAJDIK, D. LUPEA, L. TAMAS, G. LAZEA: Autonomous mapping in polluted environments, Hungarian Journal of Industrial Chemistry, 38(2), (2010), 187–191
7. G. FERRI, A. MONDINI, A. MANZI, B. MAZZOLAI, C. LASCHI, V. MATTOLI, M. REGGENTE, T. STOYANOV, A. LILIENTHAL, P. DARIO: DustCart, a mobile robot for urban environments: Experiments of pollution monitoring and mapping during autonomous navigation in urban scenarios, ICRA Workshop, 2010
8. A. SCHERELLO: CHARM® CH4 airborne remote monitoring, International Gas Union Research Conference, Paris, 2008
9. C. C. WU, T. YASUKAWA, H. SHIKU, T. MATSUE: Fabrication of miniature Clark oxygen sensor integrated with microstructure, Sensors and Actuators, 110(2), (2005), 342–349
10. J. W. FERGUS: A review of electrolyte and electrode materials for high temperature electrochemical CO2 and SO2 gas sensors, Sensors and Actuators, 134(2), (2008), 1034–1041
11. X. LIANG, T. ZHONG, B. QUAN, B. WANG, H. GUAN: Solid-state potentiometric SO2 sensor combining NASICON with V2O5-doped TiO2 electrode, Sensors and Actuators B, 134(1), (2008), 25–30
12. T. ZHONG, B. QUAN, X. LIANG, F. LIU, B. WANG: SO2-sensing characteristics of NASICON sensors with ZnSnO3 sensing electrode, Materials Science and Engineering B, 151(2), (2008), 127–132
13. Z. BI, H. MATSUMOTO, T. ISHIHARA: Solid-state amperometric CH4 sensor using LaGaO3-based electrolyte, Solid State Ionics, 179(27-32), (2008), 1641–1644
14. R. MOOS, N. IZU, F. RETTIG, S. REIS, W. SHIN, I. MATSUBARA: Resistive oxygen gas sensors for harsh environments, Sensors, 11(4), (2011), 3439–3465
15. D. G. LOWE: Object recognition from local scale invariant features, International Conference on Computer Vision, (1999), 1150–1157
16. A. GIL, O. M. MOZOS, M. BALLESTA, O. REINOSO: A comparative evaluation of interest point detectors and local descriptors for visual SLAM, Machine Vision and Applications, 21(6), (2009), 905–920
17. C. EVANS: Notes on the OpenSURF library, University of Bristol, 2009, CSTR-09-00
18. H. BAY, A. ESS, T. TUYTELAARS, L. VAN GOOL: Speeded-up robust features (SURF), Computer Vision and Image Understanding, 110(3), (2008), 346–359
19. P. VIOLA, M. JONES: Rapid object detection using a boosted cascade of simple features, Computer Vision and Pattern Recognition, 1, (2001), I511–I518
20. J. AULINAS, J. SALVI, X. LLADO, Y. PETILLOT: Local map update for large scale SLAM, Electronics Letters, 46(8), (2010), 564–566
21. M. C. BOSSE, S. TELLER: ATLAS: A framework for large scale automated mapping and localization, Massachusetts Institute of Technology, 2004
22. S. ARULAMPALAM, S. MASKELL, N. GORDON, T. CLAPP: A tutorial on particle filters for on-line nonlinear/non-Gaussian Bayesian tracking, IEEE Transactions on Signal Processing, 50(2), (2002), 174–188
23. A. DOUCET, N. DE FREITAS, K. MURPHY, S. RUSSELL: Rao-Blackwellised particle filtering for dynamic Bayesian networks, Proc. of the Sixteenth Conf. on Uncertainty in AI, (2000), 176–183