IMPROVING COMPLETENESS OF GEOMETRIC MODELS FROM TERRESTRIAL LASER SCANNING DATA

Clemens NOTHEGGER
Institute of Photogrammetry and Remote Sensing, Vienna University of Technology
Gußhausstraße 27-29, Vienna, Austria
cn@ipf.tuwien.ac.at

Keywords: laser scanning, modeling, cultural heritage, automation, documentation, triangulation

Abstract: The application of terrestrial laser scanning for the documentation of cultural heritage assets is becoming increasingly common. While the point cloud by itself is sufficient for satisfying many documentation needs, it is often desirable to use this data for applications other than documentation. For these purposes a triangulated model is usually required. The generation of topologically correct triangulated models from terrestrial laser scans, however, still requires much interactive editing. This is especially true when reconstructing models from medium range panoramic scanners and many scan positions. Because of residual errors in the instrument calibration and the limited spatial resolution due to the laser footprint, the point clouds from different scan positions never match perfectly. Under these circumstances many of the software packages commonly used for generating triangulated models produce models which have topological errors such as surface-intersecting triangles, holes, or triangles which violate the manifold property. We present an algorithm which significantly reduces the number of topological errors in the models from such data. The algorithm is a modification of the Poisson surface reconstruction algorithm. Poisson surfaces are resilient to noise in the data and the algorithm always produces a closed manifold surface. Our modified algorithm partitions the data into tiles and can thus be easily parallelized. Furthermore, it avoids introducing topological errors in occluded areas, albeit at the cost of producing models which are no longer guaranteed to be closed. The algorithm is applied to scan data of sculptures of the UNESCO World Heritage Site Schönbrunn Palace and to data of a petrified oyster reef in Stetten, Austria. The results of the method’s application are discussed and compared with those of alternative methods.

1. INTRODUCTION

There are many different instruments on the market for performing Terrestrial Laser Scanning (TLS). These instruments vary considerably with respect to their measuring principle, accuracy, speed, range and purpose. Therefore the strategy for processing the data also needs to be adapted to the advantages and disadvantages of the respective instrument. The choice of instrument is mainly driven by the scale of the objects to be scanned. Small objects measuring from a few centimeters up to a few meters can be scanned using close range scanners. These scanners have a high accuracy, but also a limited range (several meters) and a limited field of view. It is therefore necessary to build the final point cloud by merging scans from many different scan positions, which can become very expensive if it is not automated. Larger objects measuring from a few meters up to hundreds of meters can only be scanned efficiently using medium to long range scanners, which have a wide field of view, typically panoramic, and a theoretical maximum range of up to a hundred meters, sometimes even more.
The accuracy of these scanners, on the other hand, is lower compared to close range scanners, despite some improvements in recent years. The challenge when working with data from these instruments, compared to close range scanners, thus shifts from registering a multitude of individual scans to dealing with measurement errors, both random noise and systematic errors. In this paper we focus exclusively on this latter class of instruments.

While the point cloud by itself is sufficient for satisfying many documentation needs, it is often desirable to use this data for applications other than documentation. For these purposes a triangulated model is usually required. Surface reconstruction from point clouds is a well studied problem and a great number of different algorithms solving it have been developed. Surface reconstruction is also available in commercial software packages such as Rapidform, Geomagic Studio, or Polyworks. Most of them, however, work best when applied to data acquired using close range scanners, or when the data acquired using medium range scanners is of a relatively low resolution. When exploiting the full potential of medium range terrestrial laser scanners, i.e. using them at high resolutions, generating topologically correct triangulated models is still a challenging task, which typically involves much interactive editing. The main reason is that, because of residual errors in the instrument calibration and the limited spatial resolution due to the laser footprint, the point clouds from different scan positions never match perfectly. Under these circumstances many of the commonly used algorithms for generating triangulated models produce models which have topological errors such as surface-intersecting triangles, holes, or triangles which violate the manifold property [1].

In this paper we present an algorithm which significantly reduces the number of topological errors in the models from such data, when compared with the results of commercial software packages. The algorithm is a modification of the Poisson surface reconstruction algorithm. Poisson surfaces are resilient to noise in the data and the algorithm always produces a closed manifold surface. The surface will be closed even when the sampling of the surface is incomplete, e.g. because of occlusions. While this is a desirable property in many applications, for documentation purposes this arbitrary closing of surfaces can be problematic, especially since the closing might not even be topologically correct. Our modified algorithm restricts the surface to sampled areas, and these errors can thus be avoided, albeit at the cost of producing models which are no longer guaranteed to be closed. Our modified algorithm is tested on a dataset consisting of a number of small point clouds, along with the original Poisson surface algorithm and the algorithm available in the commercial software package Geomagic Studio 11. Using these datasets we demonstrate that our algorithm does indeed perform as designed, both in simple as well as in challenging situations. To demonstrate the applicability to large real world datasets it is applied to scan data of sculptures of the UNESCO World Heritage Site Schönbrunn Palace as well as data of a petrified oyster reef.
2. RELATED WORK

The data from terrestrial laser scanners consists of a set of point samples of all the surfaces within the line of sight of the scanner. Because of occlusions the data from many scan positions needs to be merged into one point cloud. The resulting point cloud is unstructured, i.e. it does not contain any information about the topologically correct relations between the points. Furthermore the point cloud usually contains both noise and systematic errors. Therefore the accurate and correct reconstruction of the originally sampled surface from TLS data is a challenging problem.

There are two basic approaches to surface reconstruction. One approach is to utilize geometric properties of the point cloud. Examples of this approach include algorithms such as alpha shapes [2], power crust [3], or VRIP [4]. Algorithms which try to construct highly generalized models by fitting geometric primitives also fall into this category. The common property is that an explicit description of the surface itself is constructed. The alternative is so-called implicit surfaces. Here the goal is to find a function defined on the entire volume around the object. The surface itself is then defined as an iso-contour of this scalar field. Examples of this approach include the implicit surface framework described in [5], and Poisson surfaces [6],[7], which are the basis of this work.

2.1 Poisson Surfaces

The idea behind Poisson surfaces is to look at an indicator function, which is zero outside the object and one inside. Since this function is not continuous, it is smoothed by convolving it with a Gaussian function, or an approximation thereof. It can then be shown that the true surface normal vectors are samples of the gradient field of the convolved indicator function. Since in reality the true surface normal vectors are unknown, the surface normals are estimated from the scanned point samples. The point samples contain measurement errors, thus the function that is sought is the one whose gradient field is closest to the gradient field estimated from the point cloud. The solution of this variational optimization problem is a Poisson equation, hence the name Poisson surfaces. The solution of the Poisson equation is the convolved indicator function, which is a scalar field defined in the entire volume enclosing the object. To compute the solution this volume is discretized into voxels. The final surface can be located with sub-voxel precision; nevertheless a fine discretization is needed for good results. Because of the enormous amount of memory that would otherwise be necessary, this needs to be done using an octree, a data structure which ensures that the full resolution is only used close to the surface. This does not impact the accuracy of the solution, since away from the surface the indicator function is constant. The achievable spatial resolution is determined by the maximal depth of the octree, and thus depends on the point density and the smoothing function used.

2.2 Surface Normal Vectors

To estimate the surface normal vectors it is necessary to estimate the tangential plane for each point of the surface. The most common and straightforward approach is to perform a least squares fitting of a plane to a compact neighborhood of sample points. For homogeneously sampled surfaces, as is typically the case with close range data, this works well. For panoramic TLS instruments, however, the point density is usually very inhomogeneous due to the wide range of measured distances. To get consistent results with respect to estimation error and smoothing properties it is therefore necessary to locally adapt the size of the neighborhood [8].
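As an illustration of this estimation step, the following Python sketch fits a plane to an adaptively sized neighborhood and takes its normal. It is a minimal sketch under our own assumptions, not the implementation used in this work: the parameters k_min, k_max and noise_sigma, the simple growth rule for the neighborhood, and the orientation towards a given scanner position are illustrative choices, and the robust estimators discussed below would replace the plain least squares fit in the presence of outliers.

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, scanner_pos, k_min=8, k_max=64, noise_sigma=0.001):
    """Estimate one unit normal per point by least squares plane fitting (PCA)
    on a neighborhood whose size is adapted to the local point density."""
    tree = cKDTree(points)
    normals = np.empty_like(points)
    for i, p in enumerate(points):
        k = k_min
        # grow the neighborhood until it is large compared to the noise level
        # (heuristic stand-in for the adaptive scheme of [8])
        while True:
            dist, idx = tree.query(p, k=k)
            if dist[-1] > 5.0 * noise_sigma or k >= k_max:
                break
            k = min(2 * k, k_max)
        centered = points[idx] - points[idx].mean(axis=0)
        # the right singular vector of the smallest singular value is the
        # normal of the least squares plane through the neighborhood
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        n = vt[-1]
        # orient towards the scanner, since Poisson reconstruction needs
        # consistently oriented normals
        if np.dot(n, scanner_pos - p) < 0.0:
            n = -n
        normals[i] = n
    return normals
```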
In practice the data from TLS contains a quite significant number of outliers. For instruments utilizing the phase-shift measurement principle the main source of outliers is the fact that the laser beam can simultaneously illuminate more than one surface. In this case the measured distance is an average of the distances to the illuminated surfaces and thus does not correspond to any existing surface. This always happens at the silhouette of objects. Other sources of outliers are glossy surfaces, which under certain angles can either saturate the detector due to their high reflectivity, or lead to multi-path effects by reflecting the laser beam like a mirror. Even in the absence of such effects, it is also not guaranteed that all the points within the neighborhood used for estimating the tangential plane indeed lie on the same surface. They might be beyond a sharp edge or on a parallel surface if the material is thin. These points can thus appear like outliers when they are used to estimate the tangential plane for a surface they do not belong to. To deal with these situations robust estimation techniques can be used. Robust estimators take much more time to compute and are also not optimal with respect to their statistical efficiency. However, they are unaffected by outliers and thus allow outliers to be identified more reliably [9], making them a worthwhile addition to the processing toolbox. Robust estimators that have successfully been used in this context include the fast minimum covariance determinant estimator [10],[11] and an estimator based on robustly adapted kernel density estimation [12].

3. METHOD AND IMPLEMENTATION

The algorithm we present in this paper uses the Poisson surface technique. However, we implemented it differently than described in [7]. The key difference between our implementation and the original one is that we do not solve the system globally, but rather split it into small cubes, solve the systems locally and independently, and in a second step correct the errors that occur along the boundaries of the cubes. The main advantage of our approach is that it is more easily parallelizable and can be implemented using data streaming. Thus it is more suitable for very large datasets and scales better when run on clusters of machines. We achieve this localization by restricting the domain Ω on which the indicator function χ is defined to areas close to the surface rather than the entire bounding box around the object. This is illustrated in Figure 1.

Figure 1: Illustration of the Poisson surface algorithm: a) sample points, surface normal vectors, domain Ω and domain boundary ∂Ω, b) smoothed indicator function χ and c) iso-contour for the original implementation; d) sample points, surface normal vectors, domain Ω and domain boundary ∂Ω, e) smoothed indicator function χ and f) iso-contour for the modified implementation.
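The following sketch, our own illustration rather than the code used for this work, shows one way such a restricted domain Ω can be built: the sample points are rasterized into a boolean voxel grid and only occupied voxels, plus a small dilated margin anticipating the boundary treatment described below, are kept. The voxel size and the number of dilation iterations are assumed parameters.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def restricted_domain(points, voxel_size=0.005, dilate_iters=2):
    """Boolean voxel grid that is True only in the vicinity of the samples;
    everything outside this restricted domain is excluded from the solve."""
    origin = points.min(axis=0)
    idx = np.floor((points - origin) / voxel_size).astype(int) + dilate_iters
    shape = idx.max(axis=0) + 1 + dilate_iters          # leave room for the margin
    occupied = np.zeros(shape, dtype=bool)
    occupied[tuple(idx.T)] = True                       # voxels containing samples
    # dilation smooths the domain boundary, moves it away from the surface
    # and closes small holes in the sampling
    return binary_dilation(occupied, iterations=dilate_iters), origin
```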
Restricting the domain in such a way has the disadvantage that the surface is no longer guaranteed to be closed, and it raises numerical problems. There are two main reasons. The first is that the boundary can become quite complicated; the second is that the Dirichlet boundary condition χ = 0 on ∂Ω is only true on the outside and is thus only applicable to the outer boundary. On the inner boundary the Neumann boundary condition grad(χ) ∙ n = 0, where n denotes the normal vector of the boundary, must be used. Unfortunately, a system of partial differential equations containing a Neumann boundary condition along a complicated boundary is numerically challenging to solve, at least when using finite differences. To mitigate these difficulties we used the approach described in [13] and additionally applied the dilate morphological operator to the voxel grid to smooth the boundary and move it away from the surface, where the errors resulting from the discretization of the problem domain matter less. The dilate operator also enables the algorithm to close small holes which may exist in the sampling.

The resulting system of partial differential equations is much too large to be solved directly. Therefore iterative approximation solvers must be used. In [7] the octree used for the organization of the data is used to construct a multigrid solver. This approach profits from the fact that the solution is constant almost everywhere except near the surface. In our approach, however, the solution is undefined almost everywhere except near the surface. Therefore the multigrid approach cannot be used. Instead we use a domain decomposition solver based on the additive Schwarz method [14]. In this method the entire domain is divided into overlapping subdomains, and the results from solving the problem on one subdomain are added to the other overlapping subdomains to compensate for the errors. This procedure is iterated until the errors are sufficiently small. Here the domain restriction is advantageous: since low-frequency solution components are not possible, the solution converges quickly even without using a coarse grid.
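To make the solver idea concrete, the following toy example applies a damped additive Schwarz iteration to a one-dimensional Poisson problem with two overlapping subdomains. It is only a sketch of the principle from [14] under our own simplifications (finite differences, dense matrices, a fixed damping factor), not the three-dimensional implementation used here; in practice the local solves run in parallel and the iteration is often combined with a Krylov solver.

```python
import numpy as np

def laplacian_1d(n, h):
    """Finite difference matrix for -u'' with zero Dirichlet boundary values."""
    return (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

def additive_schwarz(f, h, subdomains, sweeps=200, damping=0.5):
    """Damped additive Schwarz iteration over overlapping index ranges.
    Each local solve is independent of the others and could run in parallel."""
    A = laplacian_1d(f.size, h)
    u = np.zeros(f.size)
    for _ in range(sweeps):
        r = f - A @ u                                  # global residual
        du = np.zeros(f.size)
        for s in subdomains:                           # overlapping slices
            du[s] += np.linalg.solve(A[s, s], r[s])    # local Dirichlet solve
        u += damping * du                              # damping stabilizes the overlap
    return u

# toy usage: -u'' = 1 on the unit interval, two overlapping subdomains (assumed setup)
n = 100
h = 1.0 / (n + 1)
u = additive_schwarz(np.ones(n), h, [slice(0, 60), slice(40, 100)])
```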
4. RESULTS

We tested our implementation of the Poisson surface algorithm on three datasets. The first is synthetic and consists of sample points on a plane and on a sphere with various levels of noise. The advantage of the synthetic data is that the ground truth is known and the performance of the algorithm can be assessed accurately. The other two datasets are from actual scan campaigns.

4.1 Synthetic Datasets

This dataset consists of five small point clouds. They were constructed to evaluate the characteristics of the algorithm in situations which are commonly encountered in laser scanning data. The first point cloud consists of 3600 points sampled from a plane without any error. In reality data is never error free, but data from close range scanners and data from a single low-resolution panoramic scan is usually quite close to this ideal. The second and third point clouds contain the same points, but with an added Gaussian range error having a standard deviation of one and three millimeters, respectively. The spacing of the points is one millimeter in each direction. This might seem like an unrealistically large error, especially with respect to the point spacing. However, while this is true for a single scan, it is not unrealistic when multiple scans are combined. With modern panoramic scanners a point spacing of one to five millimeters and a noise level of 0.3 to 0.5 mm can be achieved even when keeping scan times down to a few minutes, assuming that a maximum scanning distance of 10 meters is not exceeded. The scan data from multiple scan positions will usually not match perfectly, since residual errors in the instrument calibration cannot be corrected by the rigid body transformation used in registering the scans. These are systematic errors which can be up to several millimeters in magnitude. When combining data from two or three scan positions a combined standard deviation of one millimeter and a point spacing of one millimeter is not uncommon. When using older instruments, e.g. when dealing with data acquired a few years ago, or when using current instruments under very unfavorable conditions, e.g. an unstable platform or very dark surfaces, the errors are even higher. The point cloud with three millimeter standard deviation is designed to mimic this situation. Finally, the fourth and fifth point clouds consist of 7200 points on a small sphere with a radius of 6 centimeters. Again, one of the point clouds is error free; the other has an added Gaussian error with a standard deviation of one millimeter in the direction of the surface normal. This dataset is designed to show the behavior of the algorithm on surfaces with a fairly high curvature.

Table 1 shows the results of comparing the vertices of the final surface with the known ground truth for the three surface generation methods. There is a small, but nonetheless significant, deviation from the expected mean distance for both Poisson surface implementations, i.e. the surface estimator is biased. This is the case even for the error-free dataset. The surface constructed with Geomagic Studio 11 does not show this bias. Since for Poisson surfaces the final surface is the iso-contour of a scalar field, it needs to be examined whether this bias can be reduced by calibrating the value of the iso-contour.

                        Poisson Surfaces        Localized Poisson Surfaces   Geomagic Studio 11
                        Mean Dist.  Std. Dev.   Mean Dist.   Std. Dev.       Mean Dist.  Std. Dev.
Perfect Plane           0.016       0.019       0.014        0.019           0.000       0.000
Noisy Plane (σ=1mm)     0.037       0.082       0.082        0.090           0.005       0.746
Noisy Plane (σ=3mm)     0.246       0.136       0.147        0.265           0.004       2.245
Perfect Sphere          0.014       0.053       0.018        0.097           0.000       0.000
Noisy Sphere (σ=1mm)    0.028       0.190       0.126        0.343           0.023       0.799

Table 1: Comparison of absolute mean and standard deviation of the distances between surface vertices and the known ground truth for the three tested algorithms.
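For reference, figures like those in Table 1 can be computed along the following lines: the distance of every reconstructed vertex to the known plane or sphere, summarized by the mean absolute distance and the standard deviation. This is our reading of the evaluation, sketched under the assumption that the ground truth is given analytically; it is not the authors' evaluation script.

```python
import numpy as np

def plane_errors(vertices, point_on_plane, unit_normal):
    """Signed distances of vertices to a plane; returns (mean abs, std)."""
    d = (vertices - point_on_plane) @ unit_normal
    return np.abs(d).mean(), d.std()

def sphere_errors(vertices, center, radius):
    """Signed distances of vertices to a sphere; returns (mean abs, std)."""
    d = np.linalg.norm(vertices - center, axis=1) - radius
    return np.abs(d).mean(), d.std()
```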
On the other hand, the noise reduction properties of the Poisson surfaces are excellent. The residual noise in the surface is reduced by more than one order of magnitude compared to the noise level of the original point cloud, and there are no topological errors. Our localized version of the Poisson surfaces does not perform quite as well as the original implementation in this respect. This effect can also be seen in Table 2, where images of the resulting surfaces are shown. Table 2 also shows that for the localized version the errors are spread evenly over the entire surface, whereas with the original implementation the errors are located mostly towards the edges of the surface, which bend away from the plane, while in the central part the errors are smaller. The reason is that for this point cloud the assumption that the points lie on a closed surface is not valid. It still needs to be examined why the localized version of the Poisson surfaces does not perform as well as the original implementation with respect to noise suppression. The most likely reason, however, is that we used a smaller, simpler, but also less accurate discretization of the problem space.

Table 2: Visualization of the estimated surfaces for the synthetic datasets (rows: Noisy Plane σ=1mm, Noisy Plane σ=3mm, Noisy Sphere; columns: Poisson Surface, Localized Poisson Surface, Geomagic Studio 11).

It can also be seen in Table 2 that when the noise level is relatively high, the surface reconstruction built into Geomagic Studio 11 completely fails to produce a useful surface, despite efforts on our part to utilize the built-in noise reduction facilities. We want to stress that this is not a problem of Geomagic Studio in particular; the results would not look much different with other commercial software packages. Dealing with this kind of data is very difficult and the only thing that helps is to reduce the point density, thus reducing the relative noise level. That, however, inevitably causes a loss of detail. This can also be seen with the oyster reef dataset.

4.2 Schönbrunn Attic Sculptures Dataset

The second dataset consists of 400 individual scans of 50 attic sculptures of the UNESCO World Heritage Site Schönbrunn Palace. The sculptures were removed for restoration and scanned after the restoration was complete. For each sculpture a total of 8 scans were acquired from two height levels. A Faro Photon 120 scanner was used at medium resolution (¼ scans), but from a distance of no more than 5 meters, resulting in an average point spacing of 2 mm per scan. Before the surface reconstruction the data was preprocessed and registered. The preprocessing stages consisted of outlier removal, random noise reduction, and surface normal estimation [1]. The outlier removal was performed primarily to eliminate the erroneous points that occur along the silhouettes of the arms and legs of the sculptures. After registration of the scans the surface reconstruction was performed using both the Poisson surface algorithm and Geomagic Studio 11. Figure 2 shows the result for one of the more elaborate sculptures. The model generated using the Poisson surface algorithm is clearly more visually appealing and would require barely any interactive editing if it were to be used for visualization purposes. In the surface reconstructed using Geomagic Studio 11 the problems in the data are still clearly visible. The roughness of the surface is the result of residual registration errors, which is especially pronounced in the face and on the outside of the arm, which are seen in multiple scans. Areas which are occluded in all but one scan, such as the part of the chest behind the arm, do not exhibit this roughness. The rims which are visible along the chest and neck are the result of outliers due to the silhouette effect, which were close enough to the surface points to pass the outlier filter. The Poisson surface algorithm is capable of concealing these deficiencies in the data.

Figure 2: Schönbrunn attic sculpture. a) entire sculpture, b) reconstructed surface using the Poisson surface algorithm, c) differences between the Poisson surface and the original points, d) reconstructed surface using Geomagic Studio 11 and e) differences between the Geomagic Studio surface and the original point cloud.

When looking at the accuracy of the models, however, the situation is quite different.
The difference model comparing the reconstructed surface with the points used to reconstruct it shows predominantly shades of blue for the Poisson surface. This means that there is a systematic bias, a trait of the algorithm that was also present in the synthetic dataset and discussed there. If the geometric model is to be used for documentation purposes other than visualization, e.g. monitoring or change detection, such a bias is not desirable. The surface reconstructed with Geomagic Studio, on the other hand, almost exclusively shows cyan and yellow colors. This means that the surface never deviates by more than 0.5 mm from the original points.

4.3 Petrified Oyster Reef Dataset

The third dataset we used for our experiments consists of 8 scans of the world's largest accessible petrified oyster reef, located in Stetten, Austria. The exposed area measures approximately 20 by 15 meters; the scan positions are located along the boundary of the area. The challenge with this dataset is that the scans were acquired using the low resolution setting of the instrument (Leica HDS 6000) and from a very shallow angle, despite the size of the area to be scanned. The result is that there are large areas with a point density significantly lower than one point per centimeter. Due to the very shallow angles there are numerous outliers caused by the silhouette effect, which are very hard to detect due to the low point density and which are an impediment to accurate registration. Furthermore, there is very little overlap between the individual scan positions due to their large distance and the shallow angles. The goal was to reconstruct a surface that accurately documents the shapes of the petrified structures. The results are shown in Figure 3.

Two models were constructed, one interactively by a skilled operator using Geomagic Studio 11 (3c), the other using the Poisson surface algorithm (3b). For reference a model was constructed using only a single scan (3a). Comparing the models it can be seen that the Poisson surface model preserves more detail than the interactively edited model. However, when compared to the model derived from the single scan, it can be seen that the sharpness of the edges of the original is completely lost in both models. Thus neither model fulfills the goal of documenting the petrified shapes. A high point density is a necessary condition if sharp edges are to be represented accurately.

Figure 3: The oyster reef data. a) triangulation of a single scan, b) Poisson surface reconstruction of all 8 scans and c) interactively created surface using Geomagic Studio 11.

5. CONCLUSIONS

Panoramic terrestrial laser scanners are much more cost efficient than close range scanners when scanning objects which are larger than a few meters. However, when deriving surface models from this data for visualization purposes, currently much manual effort is needed to clean the surface of artifacts. In this paper we showed that by using the Poisson surface algorithm it is indeed possible to significantly reduce this effort, provided that the point density of the combined point cloud is high enough. Low point density leads to strong smoothing over sharp edges. The benefit is that residual errors in the registration and a few outliers can be tolerated.
The main advantage of the presented methodology is that it produces a geometric model which can typically be used directly for visualization purposes without requiring further interactive editing. Manual closing of holes is only necessary if the scan data has large uncovered areas. The field of smoothed surface normal vectors which is produced as part of the algorithm is ideally suited for the derivation of a high-resolution normal map for textured rendering. This is especially beneficial for real-time rendering, since for this application the polygon count of the model needs to be dramatically reduced. The determined surface, however, always exhibited a systematic deviation from the mean position of the points. Future research is required to determine the cause of this behavior. As long as the reason is not known and this bias cannot be corrected, the use of this method for purposes other than visualization cannot be recommended. However, since for documentation purposes the point cloud itself is usually sufficient, this does not prevent the method from being used for cultural heritage documentation, as long as the original point cloud is kept for reference.

We used a modified implementation of the Poisson surface algorithm which solves many local systems instead of one large global system. The advantage is that the localized version scales very well using parallel computation, and data streaming can be used to keep the memory footprint low. Another characteristic of the modified implementation is that the resolution can be chosen consistently and independently of the point spacing. The errors in the solution for the indicator function are slightly larger for the modified version because of a less accurate discretization. This can be offset, however, by using a smaller discretization interval. The reduced memory footprint and improved parallelization of the modified implementation make it possible to use the method for very large datasets consisting of many billions of points.

6. ACKNOWLEDGEMENTS

This work is funded by the Österreichische Forschungsförderungsgesellschaft FFG under project number 2127585. The data of the attic sculptures was provided by Schloß Schönbrunn Kultur- und BetriebsgesmbH. The data of the oyster reef was acquired by Leica Geosystems Austria GmbH and provided by the Naturhistorisches Museum Wien.

7. REFERENCES

[1] P. Dorninger and C. Nothegger, “Automated Processing of Terrestrial Mid-Range Laser Scanner Data for Restoration Documentation at Millimeter Scale,” Proceedings of the 14th International Congress “Cultural Heritage and New Technologies,” Vienna, Austria, 2009.
[2] H. Edelsbrunner and E.P. Mücke, “Three-dimensional Alpha Shapes,” 1994.
[3] N. Amenta, S. Choi, and R.K. Kolluri, “The Power Crust,” Proceedings of the Sixth ACM Symposium on Solid Modeling and Applications, Ann Arbor, Michigan, United States, ACM, 2001, pp. 249-266.
[4] B. Curless and M. Levoy, “A Volumetric Method for Building Complex Models from Range Images,” Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques, ACM, 1996, pp. 303-312.
[5] H. Hoppe, “Surface Reconstruction from Unorganized Points,” PhD Thesis, University of Washington, 1994.
[6] M. Bolitho, M.M. Kazhdan, R.C. Burns, and H. Hoppe, “Parallel Poisson Surface Reconstruction,” ISVC (1), G. Bebis, R.D. Boyle, B. Parvin, D. Koracin, Y. Kuno, J. Wang, R. Pajarola, P. Lindstrom, A. Hinkenjann, M.L. Encarnação, C.T. Silva, and D.S. Coming, Eds., Springer, 2009, pp. 678-689.
[7] M. Kazhdan, M. Bolitho, and H. Hoppe, “Poisson Surface Reconstruction,” Proceedings of the Fourth Eurographics Symposium on Geometry Processing, Eurographics Association, Aire-la-Ville, Switzerland, 2006, pp. 61-70.
[8] C. Nothegger and P. Dorninger, “3D Filtering of High-Resolution Terrestrial Laser Scanner Point Clouds for Cultural Heritage Documentation,” Photogrammetrie, Fernerkundung, Geoinformation (PFG), vol. 1, 2009.
[9] D.C. Hoaglin, F. Mosteller, and J.W. Tukey, Understanding Robust and Exploratory Data Analysis, Wiley-Interscience, 2000.
[10] C. Nothegger and P. Dorninger, “Automated Modeling of Surface Detail from Point Clouds of Historical Objects,” 21st CIPA Symposium, Anticipating the Future of the Cultural Past, The International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, Athens, 2007, pp. 538-543.
[11] P.J. Rousseeuw and K. van Driessen, “A Fast Algorithm for the Minimum Covariance Determinant Estimator,” Technometrics, vol. 41, 1999, pp. 212-223.
[12] B. Li, R. Schnabel, R. Klein, Z. Cheng, G. Dang, and J. Shiyao, “Robust Normal Estimation for Point Clouds with Sharp Features,” Computers & Graphics, vol. 34, Apr. 2010, pp. 94-106.
[13] B.J. Noye and R.J. Arnold, “Accurate Finite Difference Approximations for the Neumann Condition on a Curved Boundary,” Applied Mathematical Modelling, vol. 14, Jan. 1990, pp. 2-13.
[14] A. Toselli and O. Widlund, Domain Decomposition Methods, Springer, 2004.