Performance Testing of Download Services of COSMC

Jiří Horák1, Jan Růžička1, Jiří Ardielli2
1 Institute of Geoinformatics, Faculty of Mining and Geology, VSB – Technical University of Ostrava, Czech Republic
2 Tieto Czech s.r.o., Czech Republic
jiri.horak@vsb.cz, jan.ruzicka@vsb.cz, jiri.ardielli@tieto.com

Abstract

The paper presents the results of performance tests of the download services of the Czech Office for Surveying, Mapping and Cadastre, carried out according to INSPIRE requirements. The testing methodology is explained, including monitoring the performance of reference servers. 26 million random requests were generated for each monitored operation, layer and coordinate system. The temporal development of the performance indicators is analyzed and discussed. The results of the performance tests confirm compliance with the INSPIRE quality requirements for download services. All monitored services satisfy the requirements on latency, capacity and availability; the latency and availability requirements are fulfilled with an abundant reserve. No problems in the structure or content of the responses were detected.

Keywords: performance testing, spatial data, download service, INSPIRE

1. Introduction

The development of SDI is based on the integration of global, European, national and local spatial initiatives. Directive 2007/2/EC of the Council and the European Parliament establishes the legal framework for setting up and operating an Infrastructure for Spatial Information in Europe (INSPIRE). INSPIRE requires the establishment and operation of the following types of network services for spatial data sets and services [2]: discovery services, view services, download services, transformation services, and services allowing spatial data services to be invoked. Services provided within the frame of an SDI have to be tested to verify their usability and the satisfaction of end users. Usually the assessment of content and functional capabilities represents the primary (qualitative) part of a web service evaluation.
Nevertheless, the critical part may be the evaluation of service availability, performance and other aspects of quality. Basic indicators of quality of service (QoS) and their limit values are given by the regulations implementing INSPIRE [6, 7]. Full satisfaction of end users usually requires implementing higher standards (than those given by INSPIRE) and providing better performance [10]. Novel approaches emphasize the central role of users and the importance of elaborate testing of final user satisfaction [13]. The aim of our testing was to verify the fulfilment of the obligatory parameters required by the above mentioned regulations and to evaluate the capacity to meet higher requirements as well.

Geoinformatics FCE CTU 10, 2013

A quantitative evaluation of service quality should contain both server-side and client-side testing. Server-side analyses usually explore web server log files, including e.g. click-stream analysis [12]. One of the possible analytical objectives is to explore the dependency between the content of a rendered image and the time needed for its rendering [5]. Results of server-side tests can be used for optimization of the service based on several techniques (e.g. adjustment of service settings, geoweb caching, load balancing). Client-side testing enables a better simulation of user conditions; however, the results are influenced by the client and network status. The following types of client-side tests can be specified [3]: test precondition data, functional testing, stress test functionality, capacity testing and performance testing. Performance testing is the most well-known form of testing. Tests are based on software emulating common user behaviour or using some random pattern to access the server. Quality of service according to the INSPIRE directive covers three aspects: performance, capacity and availability of services [6].
The availability usually refers to the percentage of time when a user can access the service. INSPIRE-based QoS does not distinguish whether the returned results are correct or not; even an error message returned by the monitored service is considered evidence of the availability of the service [1]. The first testing of QoS for web map services (WMS) according to the INSPIRE directive (including COSMC services) was performed in the Czech Republic in 2008 and 2009 [4]. COSMC (Czech Office for Surveying, Mapping and Cadastre) has provided web services (WMS) according to INSPIRE since April 2008. The implementation has been significantly influenced by the establishment of the base registers, which are a part of Czech e-government. The content of these registers practically covers the requirements for the themes “Cadastral parcels”, “Buildings”, “Addresses” and “Administrative units”. Download services for the “Cadastral parcels” theme according to INSPIRE were launched in May 2012. Both a pre-defined data set and a direct access download service (WFS) are supported [11].

2. Methodology

The download services of COSMC were tested during two weeks in May and June 2012. The following services were tested:

1. http://services.cuzk.cz/gml/inspire/cp/epsg-5514/,
2. http://services.cuzk.cz/gml/inspire/cp/epsg-4258/,
3. http://services.cuzk.cz/wfs/inspire-cp-wfs.asp.

The first and second services provide pre-prepared files using GML version 3.2.1, while the third service is a WFS [11]. The following operations were tested separately:

• Get Download Service Metadata,
• Get Spatial Object,
• Get Spatial Data Set,
• Describe Spatial Data Set,
• Describe Spatial Object Type.
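For illustration, a direct-access (WFS) request such as those tested can be assembled from the standard WFS 2.0 key-value parameters. A minimal sketch follows; the feature type name and the bounding-box coordinates are illustrative assumptions, not values taken from the test suite:

```python
from urllib.parse import urlencode

# Endpoint of the tested direct-access download service (WFS)
WFS_ENDPOINT = "http://services.cuzk.cz/wfs/inspire-cp-wfs.asp"

def get_feature_url(type_name, bbox, srs="urn:ogc:def:crs:EPSG::4258"):
    """Build a WFS 2.0 GetFeature request restricted to a bounding box.
    type_name and bbox are illustrative assumptions."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "srsName": srs,
        # bounding box as minx,miny,maxx,maxy
        "bbox": ",".join(str(c) for c in bbox),
    }
    return WFS_ENDPOINT + "?" + urlencode(params)

url = get_feature_url("cp:CadastralParcel", (16.5, 49.1, 16.6, 49.2))
```

The same pattern with `request=GetCapabilities` or `request=DescribeFeatureType` covers the remaining operations.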
The time schedule for testing ranged from Monday morning to Friday evening to cover usual service conditions. The standard load was 10 virtual users. The testing was done on the client side, outside the intranet of the service; the client was not connected to the same router as the service [9]. The list of requirements for the operations was set according to an analysis of the regulations and recommendations focused on these services. JMeter (Apache JMeter 2.6 + JMeter Plugins 0.5.2, GNU/Linux, Java OpenJDK 1.6) was used for generating requests and logging the service responses.

Random requests were generated for each monitored operation, layer and coordinate system. The spatial extents of the downloaded data sets were generated according to the expected behaviour of users. The following requests were generated:

1. GetCapabilities,
2. GetData – approximately 12 thousand unique requests for each coordinate system,
3. GetFeature for the Parcel and Boundary themes – approximately 200 thousand unique requests,
4. GetFeature for the Zoning theme – approximately 50 thousand unique requests,
5. DescribeFeatureType and XSD for all themes.

All requests were placed into one joint JMeter queue so that the requests of all operations were distributed equally during the testing. The following parameters were monitored:

• date and time of the response arrival,
• time to the arrival of the first byte of the response (Time To First Byte, TTFB),
• latency as the time to the arrival of the last byte of the response (Time To Last Byte, TTLB),
• size of the response in bytes (SIZE),
• the response code of the server (to identify errors and their sources),
• identification of the group of tests (to recognize the originating one of the ten threads used for testing, where each thread represents one virtual user),
• identification of the request (type of operation, coordinate system, etc.),
• layer name,
• spatial extent.
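The joint queue described above can be sketched as follows. This is a minimal illustration, not the actual JMeter configuration; the function names and the fixed seed are our own:

```python
import random

def build_joint_queue(request_groups, seed=1):
    """Merge per-operation request lists into one shuffled queue so that
    requests of all operations are spread equally over the test run."""
    queue = [req for group in request_groups for req in group]
    random.Random(seed).shuffle(queue)  # fixed seed keeps runs repeatable
    return queue

def split_among_threads(queue, n_threads=10):
    """Assign the queued requests to virtual users (threads) round-robin."""
    return [queue[i::n_threads] for i in range(n_threads)]

# One list per operation type, merged and dealt out to ten virtual users
groups = [["GetCapabilities"] * 3, ["GetFeature"] * 4, ["DescribeFeatureType"] * 3]
threads = split_among_threads(build_joint_queue(groups))
```

Shuffling before the round-robin split is what keeps each virtual user's workload a representative mix of all operations.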
The response time (RT) was calculated as the difference between TTLB and TTFB. The download speed (data flow, DF) was obtained by dividing SIZE by RT.

In parallel with the testing of the evaluated download services, reference servers were tested to validate the results of our service availability testing. The reason was to eliminate results from time periods influenced by various problems in the network and, namely, at the client; reference servers are expected to be highly available under normal conditions. Requests to the reference servers were generated at one-second intervals per thread and included in the joint JMeter queue. The recorded results were analysed, and the recognized periods with RT above the limit (indicating an interruption or a remarkable slow-down of the network connection) were excluded from the processing of the results of the download service testing. The following reference servers were used:

1. http://www.google.cz/,
2. http://www.seznam.cz/,
3. http://maps.google.cz/,
4. http://www.mapy.cz/.

Performance testing was performed with a constant load of ten parallel virtual users.

3. Monitored qualitative parameters

According to Commission Regulation No 1088/2010 regarding download services and transformation services, the following quality of service criteria relating to performance, capacity and availability shall apply [7]:

3.1. Performance

The normal situation represents periods out of peak load. It is set at 90% of the time. For the Get Download Service Metadata operation, the response time for sending the initial response shall be a maximum of 10 seconds in the normal situation.
For the Get Spatial Data Set operation and for the Get Spatial Object operation, and for a query consisting exclusively of a bounding box, the response time for sending the initial response shall be a maximum of 30 seconds in the normal situation; then, and still in the normal situation, the download service shall maintain a sustained response greater than 0.5 megabytes per second or greater than 500 spatial objects per second.

For the Describe Spatial Data Set operation and for the Describe Spatial Object Type operation, the response time for sending the initial response shall be a maximum of 10 seconds in the normal situation; then, and still in the normal situation, the download service shall maintain a sustained response greater than 0.5 megabytes per second or greater than 500 descriptions of spatial objects per second.

3.2. Capacity

The minimum number of simultaneous requests to a download service to be served in accordance with the quality of service performance criteria shall be 10 requests per second. The number of requests processed in parallel may be limited to 50.

3.3. Availability

The probability of a network service being available shall be 99% of the time.

4. Results

4.1. Get Download Service Metadata – TTFB

Although the tests were undertaken on the client side, the results show that the criteria are fulfilled unambiguously and with abundant reserves. The quotient of GetCapabilities requests with a time to first byte (TTFB) above 10 s is 0.004%, far below the allowed limit of 10%. During the testing period we did not observe, in any day or any hour, results close to the required limit.
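The evaluation of the logged responses against these limits can be sketched as follows. This is a minimal sketch with record field names of our own choosing; it derives RT = TTLB − TTFB and DF = SIZE/RT and reports the share of compliant responses per indicator:

```python
def evaluate_qos(records, ttfb_limit=10.0, flow_limit=0.5):
    """Compute the three INSPIRE QoS indicators from logged responses.

    Each record is a dict with ttfb and ttlb in seconds, size in bytes
    and code as the HTTP status (field names are our own assumption).
    Returns the share of responses whose initial response met the latency
    limit, the share sustaining the required data flow (MB/s), and the
    availability (share of HTTP 200 responses).
    """
    n = len(records)
    latency_ok = sum(r["ttfb"] <= ttfb_limit for r in records) / n
    flow_ok = 0
    for r in records:
        rt = r["ttlb"] - r["ttfb"]                  # response time RT
        if rt <= 0:                                  # whole answer in first packet
            flow_ok += 1
        elif r["size"] / rt / 1e6 >= flow_limit:     # DF = SIZE / RT, in MB/s
            flow_ok += 1
    availability = sum(r["code"] == 200 for r in records) / n
    return {"latency_ok": latency_ok, "flow_ok": flow_ok / n,
            "availability": availability}
```

Responses delivered entirely in the first packet are counted as compliant, mirroring the caveat discussed later that the data flow of very small responses cannot be measured reliably.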
The monitoring by hour shows that there are two periods during the day with an increased average response time: between 9 am and 2 pm CEST (Central European Summer Time) and between 8 and 9 pm CEST (figure 1). The average response time is 38% higher in these peak periods than in the quietest period. The occurrence of high RT partly corresponds with the periods of the highest network traffic.

Figure 1: Minimal, average and maximal latency (TTFB) according to hours for Get Download Service Metadata.

4.2. Get Download Service Metadata – availability

The availability of the service has been evaluated from the occurrence of errors and their sources. All responses without the HyperText Transfer Protocol (HTTP) response code 200 (meaning OK) were classified as errors. The average error rate was 0.004%, which is much smaller than the required limit on service availability (at most 1% of the time with the service unavailable). The limit was not crossed or even approached on any of the monitored days.

4.3. Get Download Service Metadata – data structure compatibility

The tests did not show any error, so we can confirm that all metadata items conform to the declared structure.

4.4. Get Download Service Metadata – expert evaluation of the content

Two problems were detected in the content of the obtained metadata: the response did not contain information about the language used, and the listed feature types did not contain references to metadata. Similar errors were detected in the testing of view services by Kliment and Cibulka [8], and we can expect such errors when testing other services. Based on the test results and subsequent recommendations, the provider of the service improved the metadata.

4.5. Get Spatial Data Set – TTFB

The results show unambiguous fulfilment of the required TTFB limit (with an abundant reserve).
Responses with a longer latency (TTFB) occurred only in a few cases. This conclusion is valid for all monitored layers, days and hours.

4.6. Get Spatial Data Set – data flow of 0.5 MB/s

Based on the evaluation of the download speed after the first packet of data is received, we can confirm fulfilment of the requirement: 99.7% of the requests meet the required limit (figure 2). There are no results for any layer, day or hour that would imply a failure to meet the required criterion.

Figure 2: Occurrence of data flows below and above 0.5 MB/s in the monitored days for Get Spatial Data Set.

4.7. Get Spatial Data Set – availability

The results are fully satisfactory, and the values are usually 100 times better than the required limit. This conclusion is valid for all monitored days and hours. The average rate of errors or service unavailability is 0.004% for most operations. The highest value, 0.021%, was recorded on May 31st, which is still much smaller than the 1% limit.

4.8. Get Spatial Data Set – data structure compatibility

The tests of data structures were focused on the conformance of the obtained data with the referenced XML schemas. The tests did not show any error in any of the tested files.

4.9. Get Spatial Object – TTFB

The results show satisfactory fulfilment of the required TTFB limit. Responses with a longer latency occurred in a few isolated cases. This statement is true for all monitored layers, days and hours. The average TTFB ranges between 500 and 727 ms (figure 3). On several days we did not record any response with TTFB above the 30 s limit; on the other days such unsatisfactory responses occurred rarely (from 1 to 6 cases, with a maximum of 9 cases on the 31st of May).

Figure 3: Minimal, average and maximal latency (TTFB) according to days for Get Spatial Object.
Note: the values for Saturday and Sunday (2nd–3rd of June) are calculated from a small number of requests, because the monitoring was not performed for the whole day.

4.10. Get Spatial Object – data flow of 0.5 MB/s

Based on the evaluation of the download speed after the first packet of data is received, we can confirm satisfaction of the requirement with an abundant reserve: 99.7% of the requests meet the limit. The situation is similar for all layers – 99.6% for CadastralBoundary, 99.6% for CadastralParcel and 100% for CadastralZoning. The requirement was not violated or even approached on any of the monitored days or hours.

4.11. Get Spatial Object – availability

The results are fully satisfactory, and the values are usually 100 times better than the required limit. This conclusion is valid for all monitored days and hours. The average rate of errors or service unavailability is 0.004% for most operations. The highest value, 0.022%, was recorded on May 31st, which is still much smaller than the 1% limit.

4.12. Get Spatial Object – data structure compatibility

The tests of data structures were focused on the conformance of the obtained data with the referenced XML schemas. The tests did not show any error in any of the tested files.

4.13. Describe Spatial Data Set – TTFB

The results confirm fulfilment of the required TTFB limit, as 98.7% of requests satisfied it. The fulfilment of the data flow criterion (at least 90% of data flows above 0.5 MB/s) could theoretically be endangered in the case of two schemas (BasicFeature.xsd and UtilityAndGovernmentalServices.xsd). Most requests were answered within the first return packet, as the responses are small. Unfortunately, measuring the speed for such a small amount of data may not be reliable.
In the case of BasicFeature.xsd, about 89% of responses are transferred with the required speed; similarly, for UtilityAndGovernmentalServices.xsd the estimated share of responses with a satisfactory speed is about 82%. If we take into account the amount of data transferred within the first packet (the response time is measured between the delivery of the first and the last part of the response, which may differ from the time of delivery of the first byte), the criterion should probably be fulfilled.

4.14. Describe Spatial Data Set – availability

The results are fully satisfactory, and the values are far below the required limit. This statement is valid for all monitored days and hours. The average rate of errors or service unavailability is 0.003% for most operations. The highest value, 0.02%, was recorded on May 31st, which is still much smaller than the 1% limit.

4.15. Describe Spatial Data Set – data structure compatibility

All XML schemas are identical to the schemas available at the INSPIRE portal.

4.16. Describe Spatial Object Type – TTFB

The results show satisfactory fulfilment of the required TTFB limit. Responses with a longer response time occurred in a few isolated cases. This statement is true for all monitored layers, days and hours.

4.17. Describe Spatial Object Type – data flow of 0.5 MB/s

Based on the evaluation of the download speed after the first packet of data is received, we can confirm satisfaction of the requirement with an abundant reserve: 99.8% of the requests fulfil the limit. There are no results for any layer, day or hour that would imply a failure to meet the required criterion.

4.18. Describe Spatial Object Type – availability

The results are fully satisfactory, and the values are much lower than the required limit. This statement is valid for all monitored days and hours.
The average rate of errors or service unavailability is 0.004%, which is significantly below the 1% limit.

4.19. Describe Spatial Object Type – data structure compatibility

All responses conform to the XML schemas verified in the Describe Spatial Data Set operation.

5. Conclusion

The aim of the analysis was to evaluate the implementation of the requirements of the INSPIRE directive for the download services of COSMC. The operations Get Download Service Metadata, Get Spatial Object, Get Spatial Data Set, Describe Spatial Data Set and Describe Spatial Object Type were tested. The testing included two performance tests, each running for one week, performed from the 28th of May to the 10th of June 2012. The testing was done on the client side, outside the intranet of the service. JMeter software was used with a randomly generated set of requests for the monitored operations, layers and coordinate systems. The results were stored in logs that were used for the analysis. Altogether, about 26 million requests were generated for the load tests by the testing clients at VSB – Technical University of Ostrava. Responses logged during recognized problems in the clients' intranet (identified by the parallel availability testing of four reference servers) were excluded from the evaluation.

The results confirm the fulfilment of all criteria for download services required by the INSPIRE directive and the corresponding regulations. The latency and availability of the tested services are excellent, providing abundant reserves to satisfy requirements even higher than the INSPIRE limits.

Acknowledgment

We appreciate the employees of the Czech Office for Surveying, Mapping and Cadastre for their cooperation and support of this analysis.

References

[1] Ardielli, J., Horak, J. & Ruzicka, J. (2012): View service quality testing according to INSPIRE implementing rules. Elektronika ir Elektrotechnika, Issue 3, pp.
69-74.

[2] Directive (2007): Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community (INSPIRE), p. 14.

[3] Hicks, G., South, J. & Oshisanwo, A. O. (1997): Automated testing as an aid to systems integration. BT Technology Journal, No. 15, pp. 26–36.

[4] Horak, J., Ardielli, J. & Horakova, B. (2009): Testing of Web Map Services. International Journal of Spatial Data Infrastructures Research, pp. 1–19.

[5] Horak, J., Ruzicka, J., Novak, J., Ardielli, J. & Szturcova, D. (2012): Influence of the number and pattern of geometrical entities in the image upon PNG format image size. Lecture Notes in Computer Science, vol. 7197 LNAI, Issue 2, pp. 448–457.

[6] INSPIRE (2009): Commission Regulation (EC) No 976/2009 of 19 October 2009 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards the Network Services, p. 10.

[7] INSPIRE (2010): Commission Regulation (EU) No 1088/2010 of 23 November 2010 amending Regulation (EC) No 976/2009 as regards download services and transformation services, L323/1, p. 10.

[8] Kliment, T. & Cibulka, D. (2011): Testovanie vyhladavacich a zobrazovacich sluzieb podla INSPIRE poziadavek [Testing of discovery and view services according to INSPIRE requirements]. Proceedings of GIS Ostrava 2011, pp. 1–9.

[9] Kliment, T., Tuchyňa, M. & Kliment, M. (2012): Methodology for conformance testing of spatial data infrastructure components including an example of its implementation in Slovakia. Slovak Journal of Civil Engineering, vol. XX, No. 1, pp. 10–20.

[10] Mildorf, T. & Cada, V. (2012): Reference Data as a Basis for National Spatial Data Infrastructure. Geoinformatics FCE CTU 9, pp. 51–61.

[11] Polacek, J. & Soucek, P. (2012): Implementing INSPIRE for the Czech Cadastre of Real Estates. Geoinformatics FCE CTU 8, pp. 9–16.

[12] Sun, J. & Xie, H. (2012): Mining Sequential Patterns from Web Click-Streams Based on Position Linked List.
Proceedings of the Asian-Pacific Youth Conference on Communication Technology, pp. 466–470.

[13] Voldan, P. (2011): Developing web map application based on user centered design. Geoinformatics FCE CTU 7, pp. 131–141.