Acta Polytechnica CTU Proceedings 3:1–6, 2016, doi:10.14311/APP.2016.3.0001
© Czech Technical University in Prague, 2016, available online at http://ojs.cvut.cz/ojs/index.php/app

STATISTICAL EVALUATION OF FATIGUE DATA OF COMPONENTS

Chi Nghia Chung∗, Zoltan Major
Johannes Kepler University Linz, Institute of Polymer Product Engineering, Altenbergerstraße 69, 4040 Linz, Austria
∗ corresponding author: chi_nghia.chung@jku.at

Abstract. A variety of steels, cast iron grades and other metals have long been used for the production of machine components. In recent years, however, new materials such as sintered materials and plastics have become increasingly important. Because of the large number of different fibers, matrices, stacking sequences, processing conditions and processes, and the variety of resulting material configurations, it is not possible to rely on the proven fatigue models for conventional materials. Moreover, developing models that are valid for all composites is generally extremely difficult. In this work, a possible application of high-performance composites as materials for machine elements is investigated. The study attempts to predict the fatigue behavior, and consequently the durability, based on laboratory measurements. Using the statistics program JMP, the acquired data were subjected to a reliability analysis in order to ensure the plausibility, validity and accuracy of the measured values.

Keywords: fatigue models, statistical evaluation, reliability, techniques of parameter fitting, JMP.

1. Introduction

There has been tremendous progress in the field of plastics in the past few decades. Plastics have become an integral part of our daily life. Polymers are flexible materials which can cover a great range of applications, and they replace more and more metal, glass, wood and other materials. The development of novel artificial materials often opens the door to new technologies. The electrical and automotive industries are hard to imagine without plastics, and the use of polymer materials has revolutionized medicine.

Cyclically stressed components have a limited durability; it is therefore important to perform fatigue tests or simulations on critical components to predict their lifetime. Figure 1 shows representative loading patterns with amplitudes that remain constant over the entire loading history [1].

Figure 1. Constant amplitude loading patterns.

Fatigue tests are performed to study the relationship between the fatigue resistance of a material, component or structural element and cyclic loading [2]. Fatigue is a slowly progressing damage process; the stresses involved lie far below the static strength and the yield strength.

The objective of this work was to create high cycle fatigue curves from experimental datasets, using suitable material models, similar to the Wöhler curve. The curve should reflect the experimental values as closely as possible. Since specimens and test conditions are never 100 % identical, there is always scatter between two discrete measurements, which can span a decade in fiber reinforced polymers [1]. Therefore, statistical methods must be used to handle and correctly interpret experimental results.

2. Fatigue Models

In general, fatigue models are quantifications of physical material properties. They are independent of the shape of the component and are usually based on experimentally acquired data. The aim of all models is to predict how a component behaves under certain conditions.
The relationships are reproduced mathematically afterwards; they are therefore mathematical models. In this work, the focus is on presenting fatigue models [3].

In order to design a component correctly in terms of fatigue, a complete set of experimentally acquired data is usually required. Since this is not feasible for reasons of time and cost, engineers have to rely on predictive models. These models predict the durability N (in cycles) under a given cyclic loading.

2.1. The Basquin Model

The Basquin fatigue model is a linear regression model. On a logarithmic scale, the durability (log N) is plotted versus the stress amplitude (log Δσ).

\log N = A - B \cdot \log \Delta\sigma, \quad \Delta\sigma \geq \Delta\sigma_0, \; B \geq 0 \qquad (1)

A and B are material parameters that need to be approximated with appropriate fitting methods. In the model, Δσ is limited by Δσ0, the endurance limit; in this work, the endurance limit is set at 4·10⁶ cycles. Δσ0 itself has no direct influence on the model, since it does not appear in the formula.

2.2. The Strohmeyer Model

In contrast to the Basquin model, the models of Strohmeyer and Weibull are nonlinear. They are obtained by smoothing a piecewise linear function. Again, A and B are the material parameters.

\log N = A - B \cdot \log(\Delta\sigma - \Delta\sigma_0), \quad B \geq 0 \qquad (2)

2.3. The Weibull Model

The Weibull model is more complex, since there are more parameters to fit.

\log(N + D) = A - B \cdot \log\!\left(\frac{\Delta\sigma - \Delta\sigma_0}{\Delta\sigma_{st} - \Delta\sigma_0}\right), \quad B \geq 0 \qquad (3)

A, B and D describe the material parameters and Δσst denotes the ultimate strength.

2.4. The Bastenaire Model

\frac{\log N - B}{\Delta\sigma - \Delta\sigma_0} = A \cdot \exp[-C \cdot (\Delta\sigma - \Delta\sigma_0)] \qquad (4)

A, B and C are the material parameters. This model will henceforth be denoted Bastenaire1, because another Bastenaire model can be found in the literature, which is called Bastenaire2 here. It is quite similar to equation (4) and is also discussed for the purpose of comparison.

N = \frac{A \cdot \exp\!\left[-C \cdot \frac{\Delta\sigma - \Delta\sigma_0}{B}\right]}{\Delta\sigma - \Delta\sigma_0} \qquad (5)

3. Techniques of Parameter Fitting

In this section, two fitting methods are discussed in greater detail as examples [4–6]. Both are current and proven estimation methods in statistics.

3.1. The Method of Least Squares (L2-Norm)

The method of least squares is a standard mathematical procedure for regression analysis. A function is determined that fits a point cloud as closely as possible. A point cloud is a scatter plot, a graphical representation of statistical measurements in a coordinate system.

To illustrate the method, the Basquin model is used as an example. Consider a dependent variable y (in this case log N), which depends on one or several variables (in this case A, B and log Δσ). The relationship between the dependent variable and the arguments is described by a model function f [7]. The model function f can be linear, as in this case, but it can also have any other shape (parabolic, exponential, ...). The general form is:

y(x) = f(x; a_1, \ldots, a_m) \qquad (6)

In this case:

\log N = A - B \cdot \log \Delta\sigma, \quad \Delta\sigma \geq \Delta\sigma_0, \; B \geq 0 \qquad (7)

The parameters A and B should be adapted such that bad data points (outliers) have only a small effect on the fit. If no unique solution that perfectly fits the point cloud can be found, a compromise solution with the smallest overall deviation from the point cloud is the valid criterion. For this purpose, the sum of the squares of the differences between the model function f and the measured values yi is formed:

\sum_{i=1}^{n} \bigl(f(x_i; a_1, \ldots, a_m) - y_i\bigr)^2 \qquad (8)

The parameters A and B are adapted until this sum of squares becomes minimal. In this work, the fitting procedure was carried out with the help of a solver implemented in Excel; a short sketch of the same idea is given below.
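As an illustration only (the original fits were performed with an Excel solver; the data and starting values below are hypothetical, not the measured values of this study), the following Python sketch minimizes the objective of equation (8) for the Basquin model using scipy:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example data: stress amplitudes (MPa) and measured cycles to failure.
delta_sigma = np.array([180.0, 160.0, 150.0, 140.0, 130.0, 120.0])
cycles      = np.array([2.1e4, 8.5e4, 1.9e5, 5.2e5, 1.4e6, 3.8e6])

# Base-10 logarithms are used here; any base works if A and B are rescaled accordingly.
log_N      = np.log10(cycles)
log_dsigma = np.log10(delta_sigma)

def sse(params):
    """Sum of squared residuals of the Basquin model, Eq. (8)."""
    A, B = params
    predicted = A - B * log_dsigma        # Eq. (1): log N = A - B * log(delta sigma)
    return np.sum((predicted - log_N) ** 2)

# Minimize the L2 objective; the starting guess is arbitrary. For data where the life
# decreases with increasing amplitude, B comes out positive, as required by B >= 0.
result = minimize(sse, x0=[20.0, 5.0], method="Nelder-Mead")
A_fit, B_fit = result.x
print(f"A = {A_fit:.3f}, B = {B_fit:.3f}")
```

Replacing the squared residuals in sse with absolute residuals (np.sum(np.abs(...))) turns this into the L1 fit described in the next subsection.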
3.2. The Method of Least Absolute Deviations (L1-Norm)

The L1 norm is a more robust fitting method: outliers are not weighted as strongly. In principle, this method works similarly to the method of least squares explained above. Instead of the sum of the squares, the sum of the absolute differences between the model function f and the measured values yi is calculated.

\sum_{i=1}^{n} \bigl|\, f(x_i; a_1, \ldots, a_m) - y_i \,\bigr| \qquad (9)

Subsequently, the parameters A and B are again adjusted until this absolute sum is minimized.

4. Reliability

Reliability is a measure of the accuracy of a measurement as well as of the trustworthiness of the data. Measurement series with very high reliability are therefore almost free of random errors, which means that they can be repeated at any time under the same measurement conditions and then provide approximately the same results [8, 9]. High reliability is therefore obtained when controlled and standardized measurements are performed. Different techniques can be used to examine reliability [10]. Some known techniques are:

• Test-Retest Reliability
• Parallel-Forms Reliability
• Split-Half Reliability
• Internal-Consistency Reliability.

Reliability analysis can easily be carried out with various computer programs. Known programs are SPSS or, as used in this work, JMP.

5. Evaluation of Fatigue Data

Two different materials were tested. Material#1 is a glass fiber reinforced, semi-crystalline thermoplastic. Material#2 is a carbon fiber reinforced, semi-crystalline thermoplastic, where the bearing was simulated as a compliant bearing. Material#3 is the same material as Material#2, but it was simulated with a rigid bearing.

5.1. Fatigue Models

Figure 2 shows the fatigue models applied to Material#1. Since the ultimate strength and the load at the endurance limit could not be determined for Material#2, only the Basquin model could be applied (Figure 3). In the case of Material#3, the fits with the Strohmeyer and the Weibull model resulted in the same curve (Figure 4).

Figure 2. Applied fatigue models for Material#1.
Figure 3. Applied fatigue models for Material#2.
Figure 4. Applied fatigue models for Material#3.

The software minimized the deviation between the models and the measured values. By comparing the obtained model parameters, the optimal model can easily be determined, not only qualitatively but also quantitatively. Figures 5 and 6 show the optimal fatigue models for the materials. Since only the Basquin model was applied to Material#2, it is omitted below.

Figure 5. Optimal fatigue model for Material#1.
Figure 6. Optimal fatigue model for Material#3.

5.2. Reliability Analysis in JMP

In JMP, custom tables can be created, or files of different formats (Excel, SAS, text files, ...) can be processed. For a correct data input, the appropriate settings in the import wizard must be applied.

In order to determine the best distribution for the measured values, a life distribution analysis is performed [11]. The program returns a table with the appropriate distributions for the respective materials. JMP sorts them in descending order, with the best model at the top. The fitted distributions are assessed by three criteria: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the log-likelihood (maximum likelihood), which are standard criteria for model selection in statistics.
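The following sketch is an added illustration of this model-comparison step, not part of the original JMP workflow: it fits a few candidate life distributions to hypothetical cycles-to-failure data by maximum likelihood with scipy.stats and ranks them by AICc and BIC. The quantities JMP reports may be scaled differently, so only the ranking logic is mirrored here.

```python
import numpy as np
from scipy import stats

# Hypothetical cycles-to-failure data (not the measured values of this study).
cycles = np.array([2.3e4, 3.1e4, 4.4e4, 5.0e4, 7.2e4, 9.6e4, 1.3e5, 2.0e5])

# Candidate life distributions (a subset of those listed in Tables 1-3).
candidates = {
    "Lognormal": stats.lognorm,
    "Weibull":   stats.weibull_min,
    "Frechet":   stats.invweibull,   # the Frechet distribution = inverse Weibull
}

results = []
n = len(cycles)
for name, dist in candidates.items():
    params = dist.fit(cycles, floc=0)              # maximum-likelihood fit, location fixed at 0
    loglike = np.sum(dist.logpdf(cycles, *params))
    k = len(params) - 1                            # free parameters (location was fixed)
    aic = 2 * k - 2 * loglike
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)     # small-sample correction
    bic = k * np.log(n) - 2 * loglike
    results.append((aicc, bic, loglike, name))

# The smallest AICc indicates the preferred distribution, as in the JMP report.
for aicc, bic, loglike, name in sorted(results):
    print(f"{name:10s}  AICc = {aicc:8.2f}   BIC = {bic:8.2f}   loglike = {loglike:8.2f}")
```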
Tables 1, 2 and 3 show the "Model Comparisons" report for each material. According to the distribution analysis in JMP, the measured values from Material#1 follow a lognormal distribution, those from Material#2 a Frechet distribution and those from Material#3 a Weibull distribution.

Distribution   AICc        Loglike     BIC
Lognormal      503.65435   498.90435   504.79323
Weibull        503.66928   498.91928   504.80816
Frechet        505.47261   500.72261   506.61149
Loglogistic    505.74393   500.99393   506.88280
Exponential    518.31862   516.08332   519.02776
LEV            538.93104   534.18104   540.06992
Logistic       546.20378   541.45378   547.34266
Normal         550.30607   545.55607   551.44495
SEV            563.21318   558.46318   564.35205

Table 1. Comparison of the distributions of the measured data for Material#1.

Distribution   AICc        Loglike     BIC
Frechet        297.91571   292.41571   297.21150
Lognormal      299.16181   293.66181   298.45760
Loglogistic    300.06766   294.56766   299.36345
Weibull        301.43592   295.93592   300.73171
Exponential    305.28286   302.83841   305.23631
LEV            320.31613   314.81613   319.61192
Logistic       326.13477   320.63477   325.43056
Normal         329.93763   324.43763   329.23342
SEV            337.33938   331.83938   336.63517

Table 2. Comparison of the distributions of the measured data for Material#2.

Distribution   AICc        Loglike     BIC
Weibull        407.25467   402.16376   407.44188
Loglogistic    409.33460   404.24369   409.52181
Lognormal      409.74094   404.65003   409.92814
Frechet        415.33191   410.24100   415.51912
Exponential    416.03305   413.69971   416.33877
LEV            432.98688   427.89597   433.17409
Logistic       439.48273   434.39182   439.66994
Normal         439.98404   434.89313   440.17125
SEV            445.68650   440.59560   445.87371

Table 3. Comparison of the distributions of the measured data for Material#3.

Based on these findings, the durability evaluation was performed. Tables 4, 5 and 6 display the mean estimates for each material. σ is the standard deviation of the measured data; β0 and β1 denote the position and shape parameters. StdError stands for standard error and describes the standard deviation of the estimate (not of the data). In addition, the tables show the 95 % confidence interval for each estimate. µ is the estimated mean number of cycles, which depends on the loading. Strictly speaking, this designation should be log µ, because the relations are of a logarithmic nature.

Par.   Estimate    StdError   Lower 95 %   Upper 95 %
β0     79.25810    6.98804    64.83945     93.67675
β1     -14.17640   1.44835    -17.16484    -11.18796
σ      0.92287     0.14971    0.69226      1.31688

Table 4. Mean estimates for Material#1.

\mu = 79.2581 - 14.1764 \cdot \log(\text{loading}) \qquad (10)

Par.   Estimate    StdError   Lower 95 %   Upper 95 %
β0     126.9802    20.30212   82.06733     168.4553
β1     -22.7520    3.98476    -30.90587    -13.9531
σ      0.6862      0.15914    0.45366      1.1457

Table 5. Mean estimates for Material#2.

\mu = 126.9802 - 22.75204 \cdot \log(\text{loading}) \qquad (11)

Par.   Estimate    StdError   Lower 95 %   Upper 95 %
β0     86.41222    14.55656   53.28540     112.4174
β1     -14.89503   2.94585    -20.13681    -8.1682
σ      1.18943     0.25506    0.80843      1.8933

Table 6. Mean estimates for Material#3.

\mu = 86.41222 - 14.89503 \cdot \log(\text{loading}) \qquad (12)

Figures 7, 8 and 9 illustrate the quantile analysis. The quantile is a measure of location in statistics: it represents a threshold below which a certain proportion of the values lies, with the remaining proportion above it. Using the example of Material#1, the threshold is 21675.91 cycles. This means that, for a loading of 132.5, there is a 50 % probability of failure. The blue dotted lines represent the 95 % confidence interval.

Figure 7. Quantile analysis for Material#1.
Figure 8. Quantile analysis for Material#2.
Figure 9. Quantile analysis for Material#3.
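As an added cross-check (not part of the original analysis), the short calculation below reproduces this threshold from the coefficients of Table 4 and equation (10). It assumes that the logarithm in equation (10) is the natural logarithm, which is what recovers the published value; the exact quantile routine used by JMP is not reproduced here.

```python
import math
from statistics import NormalDist

# Coefficients for Material#1 from Table 4 (lognormal life regression).
beta0, beta1, sigma = 79.2581, -14.1764, 0.92287

loading = 132.5
mu = beta0 + beta1 * math.log(loading)   # Eq. (10), natural logarithm assumed

# For a lognormal life distribution, exp(mu) is the median (50 % quantile).
median_cycles = math.exp(mu)
print(f"median life at loading {loading}: {median_cycles:,.0f} cycles")
# close to the reported threshold of 21675.91 cycles (difference due to coefficient rounding)

# A general p-quantile follows as exp(mu + sigma * z_p), z_p being the standard normal quantile.
p = 0.05
quantile_5 = math.exp(mu + sigma * NormalDist().inv_cdf(p))
print(f"{p:.0%} failure quantile: {quantile_5:,.0f} cycles")
```

The small deviation from the reported value comes only from the rounding of the printed coefficients.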
JMP can also perform a custom estimate. For example, the failure probability at the endurance limit was calculated for each material; the results are given in Figures 10, 11 and 12. In the case of Material#3, the material can sustain 4.7·10⁶ cycles at a loading of 93 with a failure probability of 5 %. At a loading of 93 and 4·10⁶ cycles, the failure probability becomes 4.4 %.

Figure 10. Custom estimate for Material#1.
Figure 11. Custom estimate for Material#2.
Figure 12. Custom estimate for Material#3.

6. Conclusion

Due to the very large scatter of the measurement results, the determination of the most suitable model to fit the experimental data was a major challenge. In some cases, the graphical representations of the models are very unsatisfactory. By applying several fatigue models, at least one suitable model could be found for each material. Through the statistical analysis, even the large outliers could be included in the parameter estimation.

JMP is a powerful tool for statistical analysis; in particular, the analysis of the failure probability is a very important feature. Within the experimental data range, the estimate in JMP leads to a slightly different prediction than that of the fatigue models. Extrapolation with JMP outside this range is highly dependent on the quality of the measurement data and therefore does not always lead to plausible results.

List of symbols
A, B, C, D   material parameters
N   number of cycles
Δσ   loading amplitude
Δσ0   loading amplitude at the endurance limit
Δσst   loading amplitude at the ultimate strength
σ   standard deviation
β0   position parameter in JMP
β1   shape parameter in JMP
µ   estimate of the mean value

References
[1] A. P. Vassilopoulos, T. Keller. Fatigue of Fiber-reinforced Composites. Springer-Verlag London Limited, London, 2011.
[2] D. Radaj, M. Vormwald. Ermüdung - Grundlagen für Ingenieure. Springer-Verlag, Berlin, Heidelberg, 1995, 2003, 2007.
[3] E. Castillo, A. Fernández-Canteli. A Unified Statistical Methodology for Modeling Fatigue Damage. Springer Science + Business Media B.V., 2009.
[4] Universität Heidelberg. Die Methode der kleinsten Fehlerquadrate. http://www.physi.uni-heidelberg.de/Einrichtungen/AP/Elearning/index.php/animationen/37-anpassung-von-funktionen-an-messdaten/52-die-methode-der-kleinsten-fehlerquadrate, 2013.
[5] C. R. A. Schneider, S. J. Maddox. Best Practice Guide on Statistical Analysis of Fatigue Data. TWI, Cambridge, UK, 2003.
[6] American Society for Testing and Materials. Statistical Analysis of Fatigue Data. ASTM, 1981.
[7] S. J. Miller. The Method of Least Squares. Brown University, Providence, RI 02912, US.
[8] U. Kuckartz, S. Rädiker, T. Ebert, J. Schehl. Statistik - Eine verständliche Einführung. Springer Fachmedien Wiesbaden GmbH, 2010.
[9] H. Diefenbacher, A. Frank. Statistik - Eine verständliche Einführung. Ventus Publishing ApS, 2006.
[10] P. Schmolck. Methoden der Reliabilitätsschätzung. Universität der Bundeswehr München, 2007.
[11] SAS Institute Inc. JMP Version 11 Documentation. SAS Institute Inc., 2013.