Field Test Data Processing for the Implementation of Accelerated Rig Test of Sprayer Booms

Mykola Stashkiv
Ternopil Ivan Puluj National Technical University, 56 Ruska street, Ternopil 46001, Ukraine

Abstract
This paper describes a method for, and results of, digital processing of sprayer boom field test data. The purpose of the digital processing is to create equivalent damage test specifications for a rig test using specialized laboratory equipment (such as a multiaxial servohydraulic shaker) that is controlled in the time domain and needs time-history files to reproduce the load in the laboratory. Edited channels of field test data were obtained that retain all the damage while considerably shortening the length of the drive files. This gives significant benefits in accelerating the rig test of the sprayer booms. The obtained results were analyzed by comparing the power spectral densities of the measured and processed signals.

Keywords
Field test data, processing, software, glyph, data correction, acceleration, rig test.

1. Introduction
Durability assessment is an important phase in the development of a new machine or in the operation of an existing machine. It is possible to estimate the durability of the machine directly or to estimate the durability of its most critical assembly or element, whose durability determines the durability of the machine as a whole.
Durability (or fatigue life) can be assessed by theoretical and experimental methods.
Theoretical methods of fatigue life analysis do not require significant material costs, but they require a good knowledge of mathematical theory. In addition, the obtained results depend significantly on the applied calculation algorithms and the accepted assumptions [1]. It is also significant that the basis for a theoretical assessment of durability is the operational loads, whose nature and magnitude are most often obtained experimentally [2].
The main source of input data for theoretical fatigue life analysis is previous results of structural analysis or results of field, rig or lab tests. The quality of a fatigue life analysis is thus directly dependent on the quality of the results (stress or strain) obtained from a structural analysis or an experiment.
In particular, paper [3] proposes an analytical model for the numerical evaluation of the service life of the load-bearing frames of sections of boom field sprayers. The service life of boom elements is represented as the sum of the periods of initiation and subcritical growth of fatigue cracks, which are determined according to the Wöhler diagram and the kinetic diagram of fatigue crack growth. Using this model, the service life of a boom is computed for the maximum amplitudes of cyclic bending of its weakest elements.
To refine this model, paper [4] proposes a method for determining the residual service life of wide-coverage spraying booms of field sprinklers with regard for the
maneuvering mode of their loading and the action of corrosive media. This improved model made it possible to increase the accuracy of calculating the service life of field sprayer booms.
Field tests give the most accurate results, but they require significant material and organizational costs to prepare. In addition, a full-scale experiment requires considerable time for its preparation and implementation.
The financial and time commitment for field tests can be reduced substantially using accelerated tests that reproduce on the investigated structure, in a reduced time, the same damage produced on the machine during real operation. To effectively study and predict structural durability, it is necessary to understand the fundamentals of the theory and to know modern approaches to planning accelerated testing.
In particular, paper [5] emphasizes that researchers working on accelerated test (AT) programs need to be aware of the general principles of AT modeling and current best practices. The purpose of that review paper was to outline some of the basic ideas behind accelerated testing and, especially, to review currently used AT modeling practice and to describe the most commonly used AT models. Its concluding remarks make explicit suggestions about the potential contributions that scientists should be making to the development of AT models and methods. The use of the different models is illustrated with a series of examples from the literature and the authors' own experience.
In paper [6], principles of and approaches to creating accelerated test models and analyzing AT data are described. ATs can be characterized by the nature of the response variable in the test (i.e., what can be measured or observed relative to reliability):
• Accelerated Life Tests – ALT (the response in an ALT is related to the lifetime of the product);
• Accelerated Repeated Measures Degradation Tests – ARMDT (in an ARMDT, degradation is measured on a sample of units at different points in time);
• Accelerated Destructive Degradation Tests – ADDT (an ADDT is similar to an ARMDT, except that the measurements are destructive, so only one observation can be obtained per test unit).
These different kinds of ATs can be closely related because they can involve the same underlying failure mechanisms and acceleration models. They differ, however, in that different kinds of statistical models and analyses are applied because of the differences in the kind of response.
There are different methods of accelerating tests to induce product failures more quickly. These methods vary depending on the nature of the product or material being tested:
• Accelerate the Product Use Rate;
• Accelerate Product Aging;
• Accelerate by Increasing Product Stress.
In all types of acceleration, care should be taken to make sure that the underlying mechanisms and the resulting failure modes in an AT are the same as those that will affect the product in actual use.
Accelerated tests can be carried out both for the machine as a whole and separately for a specific assembly or element. For example, paper [7] describes the process of speeding up the degradation of a rolling element bearing by overloading it to uncover faults in a short amount of time. By running the bearing with a heavy load at a high speed, the lifetime is accelerated so that faults are uncovered relatively fast.
The purpose of paper [8] is to design accelerated life testing for a tractor front axle, which involves determining the normal test time, the acceleration factor, the accelerated test time and the test setup on the basis of ALT.
Paper [9] shows the development of a methodology for performing an accelerated structural test on a medium-power tractor using a four-post test rig. In particular, several proving ground testing conditions were run to measure the loads on the tractor. The measured loads were then edited to remove the non-damaging portions of the signals, and finally the edited loads were reproduced on the four-post test rig. The proposed methodology could be a valid alternative to the use of a proving ground for reproducing accelerated structural tests on tractors.
In the development and validation of new products, the use of structural analysis software and virtual test rigs applying random loading at different frequencies is becoming more relevant. Paper [10] develops a virtual test rig for the validation of vehicle suspended components and the definition of experimental test rigs. The study was based on a standard component, using Siemens LMS Virtual.Lab software for the dynamic and durability analysis. From the virtual modeling and the experimental data, the appropriate hydraulic actuator signals were defined to characterize the component behavior according to the field application.
In paper [11], an accelerated life test based on the spectrum of fatigue loads was used both in numerical simulation and in bench tests of a push rod. The symmetrical cycling of the push rod was represented as a positive pulsating load spectrum corresponding to the stress spectrum of the critical node in the finite element model. Bench test results show that the fatigue life of the push rod is in good agreement with the FEM data.
In paper [12], full-scale physical and virtual tests of a car body are carried out. A data processing method of small deletion and an inverse problem load acquisition method are proposed. Taking the obtained loads as the input to the physical and virtual benches, a new fatigue test method simulating the running attitude of the car body is completed. The inverse-problem analysis results of the virtual and physical tests are basically consistent, and this method provides a basis for improving the fatigue reliability of freight car bodies.
With the development of control techniques and of data acquisition and processing methods, electro-hydraulic servo test rigs have become widely used in accelerated tests to optimize critical performance attributes such as durability, as well as NVH (noise, vibration and harshness) and comfort, in different industries. An important task in the preparation of such accelerated tests is the creation of a drive file that ensures the tests are executed under the specified conditions and in the shortest possible time.
In paper [13], an offline iterative learning control (ILC) loop is built to reduce the error between the acceleration signals measured on the specimen and the target signals. In order to eliminate the cumulative error of time-domain integration of acceleration, the displacements – the "drives" – to be applied to the test rig are obtained by integrating the corrected acceleration in the frequency domain. Experiments show that the acceleration spectrum can be reproduced appropriately after several learning iterations. A minimal numerical sketch of such frequency-domain integration is given below.
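The frequency-domain integration used in [13] to turn corrected accelerations into displacement drives can be illustrated numerically. The code below is not taken from [13]; it is a minimal Python sketch, assuming a uniformly sampled acceleration record, in which each spectral component is divided by (j2πf)^2 and content below a chosen cut-off frequency is zeroed to suppress drift. The function name and the synthetic signal are illustrative only.

import numpy as np

def acceleration_to_displacement(acc, fs, f_min=0.5):
    """Double-integrate an acceleration record in the frequency domain.

    acc   : 1-D array of acceleration samples, uniformly sampled
    fs    : sampling frequency, Hz
    f_min : components below this frequency are zeroed to suppress drift
    """
    n = len(acc)
    spectrum = np.fft.rfft(acc)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)

    # Division by (j*2*pi*f)^2 integrates twice; f = 0 is skipped to avoid /0.
    disp_spectrum = np.zeros_like(spectrum)
    nonzero = freqs >= f_min
    disp_spectrum[nonzero] = spectrum[nonzero] / (1j * 2.0 * np.pi * freqs[nonzero]) ** 2

    return np.fft.irfft(disp_spectrum, n)

# Usage sketch with a synthetic 10 Hz acceleration signal:
fs = 1000.0
t = np.arange(0.0, 5.0, 1.0 / fs)
acc = 2.0 * np.sin(2.0 * np.pi * 10.0 * t)
disp = acceleration_to_displacement(acc, fs)

In practice the cut-off frequency and any tapering of the record would be chosen to match the working bandwidth of the rig actuators.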
The most effective way to create drive files to control a test rig is a combined method, which allows a data set for an accelerated rig test to be created with specialized software on the basis of experimental data from a field test.
Among the available software for processing experimental data, the nCode software by HBM Prenscia [14, 15] is the most appropriate. It is designed for working with large amounts of test data, for signal processing and for performing various studies. nCode GlyphWorks has a convenient object-oriented graphical interface and is optimized for complex work with large volumes of multi-channel data. Its functionality includes a wide range of tools for time-domain, frequency-domain and statistical analysis of signals. Moreover, tools are available for assessing both the service life and the fatigue durability of products and for synchronized playback of GPS signals, video and other data obtained during the tests; a convenient mechanism for automated report creation has also been implemented. nCode GlyphWorks has a modular structure that makes it possible to create the required feature set either from its large embedded libraries or by means of the Python language. Working templates in nCode GlyphWorks can be prepared in advance and stored in a specified library to provide reliability and high calculation speed [16].

2. Field Test Data Editing for Accelerated Rig Tests
In this paper, field test data processing is implemented using nCode GlyphWorks tools for the implementation of an accelerated rig test of the sprayer booms. The goal of this processing is to use the measured strain gauge test data to prepare drive files for a test rig.
The field test data were derived from four channels of a universal measuring system, each of which represents a uniaxial strain gauge placed at a potentially critical location on the test object. These test data were obtained and processed using the methods, means and software presented in [16-18]. The task is to edit down these channels to retain all the damage but shorten the length of the files as much as possible. This gives significant benefits in accelerating the rig test.
In [18], the fatigue lives of four critical areas of the sprayer boom were calculated on the basis of the measured strain gauge data. Another use of fatigue analysis is to create equivalent damage test specifications for a rig test. Fatigue life editing to create equivalent damage test specifications can be realized by splitting the signals into temporal windows, in each of which the pseudo-damage (PD) is calculated according to the equation [9]:

PD = \sum_i n_i S_i^4 ,   (1)

where S_i is the load amplitude derived from the rainflow matrix and n_i is the number of cycles counted in a generic time history.
Editing the channels down in this way, so that all the damage is retained while the file length is shortened as much as possible, can have significant benefits in accelerating simulations, for example when preparing drive files for a test rig. This editing of fatigue analysis results into equivalent damage test specifications for a rig test can be implemented in nCode GlyphWorks, a data processing system that contains a comprehensive set of standard and specialized tools for performing durability analysis and other tasks such as digital signal processing. A minimal numerical sketch of the window-based pseudo-damage calculation of equation (1) is given below.
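To make equation (1) concrete, the following sketch computes the pseudo-damage of each fixed-length window of a single strain channel. This is not the GlyphWorks implementation; it is a minimal Python illustration that assumes a uniformly sampled signal and uses the open-source rainflow package for cycle counting. The exponent 4 follows equation (1) and the 0.25 s window follows the value selected later in the text; the function name and the synthetic signal are illustrative only.

import numpy as np
import rainflow  # open-source rainflow counting package (pip install rainflow)

def window_pseudo_damage(signal, fs, window_s=0.25, exponent=4.0):
    """Pseudo-damage PD = sum_i n_i * S_i^exponent for each adjacent window."""
    samples_per_window = int(round(window_s * fs))
    n_windows = len(signal) // samples_per_window
    damage = np.zeros(n_windows)
    for w in range(n_windows):
        chunk = signal[w * samples_per_window:(w + 1) * samples_per_window]
        # count_cycles returns (range, count) pairs; amplitude is half the range
        for cycle_range, count in rainflow.count_cycles(chunk):
            amplitude = cycle_range / 2.0
            damage[w] += count * amplitude ** exponent
    return damage

# Usage sketch with a synthetic strain-like signal:
fs = 500.0
t = np.arange(0.0, 10.0, 1.0 / fs)
strain = 100.0 * np.sin(2.0 * np.pi * 3.0 * t) + 20.0 * np.random.randn(len(t))
pd = window_pseudo_damage(strain, fs)
print("windows:", len(pd), "total pseudo-damage:", pd.sum())

The per-window damage values obtained in this way are the quantities that the subsequent assessment and slice-selection steps operate on.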
2.1. Project development and setting
Using the specialized nCode GlyphWorks software tools, a digital test data processing procedure for creating equivalent damage test specifications for a lab test has been implemented.
GlyphWorks is a multi-channel, multi-file, multi-format environment for processing large amounts of data. GlyphWorks provides a graphical, process-oriented environment that contains leading analysis capabilities for the study of various processes. GlyphWorks represents data analysis processes graphically and allows graphical representations of interactive data analysis processes to be dragged and dropped, so that sophisticated working projects can be created and saved for later re-use [16].
The basic analysis building blocks used in GlyphWorks are termed glyphs. Glyphs are connected by pipes, which carry the data that passes between glyphs and attach at the glyphs' pads (different types of I/O pads are marked with different colors). In essence, a glyph is a calculation module (template) with specified algorithms for performing certain functions and with configurable property parameters. A set of glyphs with functional connections constitutes the detailed design of the study [16].
In GlyphWorks, a process is defined as a combination of glyphs that define a data flow. A process typically starts with an input glyph to define the data to be processed. Additional glyphs define subsequent steps in the process for calculation, display, or writing output [16].
The editing of fatigue analysis results into equivalent damage test specifications and the creation of the drive file for a test rig are implemented according to the developed detailed design (Fig. 1), which contains the following glyphs: 1 – Time Series Input glyph, 2 – Strain Life glyph, 3, 5, 9, 13 – XY Display glyph, 4, 6 – Damage Editing glyph, 7 – Graphical Editor glyph, 8 – Data Value Display glyph, 10 – Time Series Output glyph, 11, 12 – Frequency Spectrum glyph. The functional purpose of these glyphs, the structural relations between them and their parameter settings are described below.
Figure 1: Damage editing process (completed)
The time series data output file obtained in [16] was loaded into the TSInput1 glyph (Fig. 1, glyph 1). The time series graphs are displayed in the central window of the glyph. The Time Series Input glyph (TSInput1) is connected to the StrainLife1 glyph (Fig. 1, glyph 2) from the Fatigue palette. Before running the process, the properties of the strain life analysis need to be set; in this case, this glyph is configured as in [18]. The StrainLife1 glyph output is connected to the XYDisplay1 glyph input (Fig. 1, glyph 3). The XY Display glyph is used to show (Fig. 2) the output data from the Strain Life glyph (only the output for channel 1 is shown here).
Figure 2: Plotting strain gauge data and damage in the time domain for channel 1

2.2. Fatigue Editing Assessment
The fatigue editing assessment uses the Damage Editing glyph from the Fatigue palette, together with the Data Values Display glyph and the XY Display glyph from the Display palette. The DamageEditing1 glyph (Fig. 1, glyph 4) is connected to the time series (blue) output of the StrainLife1 glyph. The XYDisplay2 glyph (Fig. 1, glyph 5) is connected to the multicolumn (brown) output of the DamageEditing1 glyph. The DataValuesDisplay1 glyph (Fig. 1, glyph 8) is connected to the multicolumn (brown) output of the DamageEditing1 glyph.
The next step in the fatigue editing process is to determine appropriate parameters for use in the Damage Editing glyph.
For this, the following properties of the DamageEditing1 glyph are set: Mode: Assessment; WindowMinimum: 0.125; WindowMaximum: 1; WindowSteps: 7; PercentageDamageMinimum: 75; PercentageDamageMaximum: 100; PercentageDamageSteps: 5. The DamageEditing1 glyph properties are shown in Figure 3.
Figure 3: Damage editing assessment properties
The assessment mode of the Damage Editing glyph makes it possible to assess the effect of two parameters:
a. Window Length – the "time slice" of data that is considered in adjacent non-overlapping windows. With a window length of 1 second, for example, the total damage within each one-second window is calculated for each channel. In assessment mode, a range of values is defined;
b. Percentage Damage Retained – for a given window length, there is a relationship between the amount of damage retained and the time retained. In assessment mode, a range of values is defined for the damage retained.
Varying these two parameters produces a family of curves of Percentage Damage Retained against Percentage Time Retained for each Window Length (Fig. 4).
Figure 4: Percentage Damage Retained against Percentage Time Retained
The resulting curves can be used to select an appropriate trade-off between the required damage retained and the amount of time that will be saved. Having reviewed the results shown in Figure 4 and Figure 5, in this case a 0.25 second window length is used and 100% of the damage is retained, which achieves an approximately 80% reduction in time.
Figure 5: Damage assessment results from Data Value Display glyph

2.3. Damage Editing
To edit the time histories with these parameters according to equation (1), the process was extended so that the new DamageEditing2 glyph (Fig. 1, glyph 6) creates a Feature List of sections to be deleted. This Feature List is then used by the GraphicalEditor1 glyph (Fig. 1, glyph 7) to perform the required edits, thus shortening the time histories.
The Graphical Editor glyph can be added to the workspace from the BasicDSP palette. The output pad of TSInput1 is connected to the time series input pad (blue) of the GraphicalEditor1 glyph. The multicolumn Feature List output (brown/orange) of the DamageEditing2 glyph was connected to the multicolumn input pad of the GraphicalEditor1 glyph. The XYDisplay3 glyph was connected to the time series output pad (blue) of the GraphicalEditor1 glyph.
The following properties were set for the DamageEditing2 glyph: Mode: SliceSelection; KeepFirstAndLast: True; WindowSetting: ByTime; WindowLength: 0.25; DamageSetting: Percent; PercentDamageRetained: 100. The DamageEditing2 glyph properties are shown in Figure 6. The following properties are set for the GraphicalEditor1 glyph: EditMethod: Delete; JoinType: HalfSine; JoinTime: 0.05.
Figure 6: Damage editing properties for the DamageEditing2 glyph
After the process is run, the "SliceSelection" mode of the DamageEditing2 glyph identifies which sections can be deleted and passes this information to the GraphicalEditor1 glyph in the form of a Feature List (a multicolumn table of start and end times). The GraphicalEditor1 glyph shows (Fig. 7) that the sections to be deleted (marked sections) are generally of smaller amplitude than the retained sections.
Figure 7: The marked sections that will be deleted
The method used in this study ensures that if there is significant damage on any channel, then that time slice will be retained; a simplified sketch of this multi-channel slice selection is given below.
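The slice selection step can be illustrated with a short sketch that is not the GlyphWorks algorithm but a simplified Python analogue. It assumes that the per-window pseudo-damage of each channel has already been computed (for example with the window_pseudo_damage sketch given earlier); a window is kept if it is needed to reach the target damage percentage on any channel, and the kept windows are then merged into start/end time slices. All names and the random test data are illustrative only.

import numpy as np

def select_slices(damage_per_channel, window_s=0.25, percent_retained=100.0):
    """Return (start, end) times of windows to keep.

    damage_per_channel : 2-D array, shape (n_channels, n_windows),
                         pseudo-damage of each adjacent window per channel
    """
    n_channels, n_windows = damage_per_channel.shape
    keep = np.zeros(n_windows, dtype=bool)

    for ch in range(n_channels):
        damage = damage_per_channel[ch]
        target = damage.sum() * percent_retained / 100.0
        # Keep the most damaging windows of this channel until the target is met.
        order = np.argsort(damage)[::-1]
        cumulative = np.cumsum(damage[order])
        n_keep = int(np.searchsorted(cumulative, target) + 1)
        keep[order[:min(n_keep, n_windows)]] = True  # union over all channels

    # Merge adjacent retained windows into (start_time, end_time) slices.
    slices, start = [], None
    for w in range(n_windows):
        if keep[w] and start is None:
            start = w * window_s
        if not keep[w] and start is not None:
            slices.append((start, w * window_s))
            start = None
    if start is not None:
        slices.append((start, n_windows * window_s))
    return slices

# Usage sketch with random per-window damage for 4 channels:
rng = np.random.default_rng(0)
damage = rng.random((4, 40)) ** 8  # a few windows dominate the damage
print(select_slices(damage, percent_retained=95.0))

Taking the union of the retained windows over all channels mirrors the behavior described above: a time slice survives the edit if it carries significant damage on any one of the measured channels.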
A 0.05 second half-sine join function between the retained sections is used to avoid sudden discontinuities in the resulting time histories.
If the original strain gauge channels from TSInput1 are connected to the second blue input pad of the XY Display glyph and the flow is run, the XY Display glyph shows both the edited and the original strain channels in the time domain (Fig. 8).
Figure 8: Comparing edited and original in the time domain
Figure 8 clearly shows the effect of removing the non-damaging sections in the time domain. The maximum and minimum values are the same and the cyclic content is the same, but the time duration has been drastically reduced: the edited time histories are about 85 seconds long, compared with the original 440 seconds. This will shorten the rig test time while retaining the important fatigue inputs.
The edited time histories are written to disk using the TSOutput1 glyph (Fig. 1, glyph 10). This Time Series Output glyph is connected to the time series output pad (blue) of the GraphicalEditor1 glyph.

2.4. Results Assessment
To assess the obtained results, it is advisable to compare them in the frequency domain. The frequency domain is another way of representing a time series. In a frequency domain representation, it is possible to see behaviors that would be impossible to identify in the time domain.
Time series may be represented in the frequency domain in many different formats. Among these, the Power Spectral Density (PSD) format is the most popular. The PSD is useful for measuring the frequency content of signals and is therefore widely used for analyzing vibrating components [19]. The transformation between the time and frequency domains is accomplished using the Fast Fourier Transform (FFT). The FFT gives the amplitude and phase of the signal at different frequencies. The power spectrum shows power as the mean squared amplitude at each frequency line, but includes no phase information [20].
The GlyphWorks Frequency Spectrum glyph can easily calculate the PSD from time-domain data. This glyph performs frequency spectrum analysis (auto-spectral) on time series data using the FFT algorithm. Two Frequency Spectrum glyphs were added to the workspace from the BasicDSP palette (glyphs 11 and 12 in Figure 1) to perform the transformation between the time and frequency domains. The FrequencySpectrum1 glyph was connected to the TSInput1 glyph, and the FrequencySpectrum2 glyph was connected to the time series output pad (blue) of the GraphicalEditor1 glyph. The input pads (red) of the XYDisplay4 glyph (Fig. 1, glyph 13) were connected to the output pads of the FrequencySpectrum1 and FrequencySpectrum2 glyphs to show the power spectral densities of the signals from these glyphs (Fig. 9).
Figure 9: Power spectral density comparison
The power spectral densities of the measured and the edited signals compared in Figure 9 correlate well. The two trends are very similar, which confirms that the editing does not introduce any anomalous peaks. An equivalent check can also be performed outside GlyphWorks, as sketched below.
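For readers without access to GlyphWorks, a comparable PSD check can be sketched with standard tools. The snippet below is an illustrative Python example, not the Frequency Spectrum glyph itself; it assumes the original and edited channels are available as NumPy arrays with the same sampling rate and uses Welch's method from SciPy. The function name and the synthetic data are assumptions for illustration.

import numpy as np
from scipy.signal import welch
import matplotlib.pyplot as plt

def compare_psd(original, edited, fs, nperseg=2048):
    """Plot the PSDs of the original and edited channels on the same axes."""
    f_o, p_o = welch(original, fs=fs, nperseg=nperseg)
    f_e, p_e = welch(edited, fs=fs, nperseg=min(nperseg, len(edited)))
    plt.semilogy(f_o, p_o, label="original")
    plt.semilogy(f_e, p_e, label="edited")
    plt.xlabel("Frequency, Hz")
    plt.ylabel("PSD")
    plt.legend()
    plt.show()

# Usage sketch with synthetic data standing in for a strain channel:
fs = 500.0
t = np.arange(0.0, 440.0, 1.0 / fs)
original = np.sin(2.0 * np.pi * 3.0 * t) + 0.3 * np.random.randn(len(t))
edited = original[: int(85 * fs)]  # placeholder for the damage-edited channel
compare_psd(original, edited, fs)

As in Figure 9, close agreement between the two curves indicates that the editing has preserved the frequency content of the measured signal without introducing anomalous peaks.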
3. Conclusion
The field tests of the sprayer booms can be shortened using accelerated tests that reproduce on the structural part of this machine, in a reduced time, the same damage produced on the sprayer booms during real operation. The presented procedure of digital processing of the field test data with nCode GlyphWorks tools made it possible to obtain a set of edited time histories that is ready for use in driving the test rig.
Reducing the test duration by more than five times (about 85 seconds versus the original 440 seconds) at the same structural damage level will have a huge impact on the time required to run the durability test in the lab. Furthermore, unlike the field tests, the test on the lab rig can be performed without interruptions caused by unfavorable weather conditions or the driver's need for rest.
Transformation of the obtained results between the time and frequency domains using the Fast Fourier Transform showed that the power spectral densities of the measured and the edited signals are sufficiently similar and correlate well.

4. Acknowledgements
I am very grateful to the company HBM Prenscia and the nCode team for the possibility to use their software and for the information support. My special thanks to Lukasz Pieniak, Account Manager at Prenscia.

5. References
[1] M. M. Pedersen, Introduction to Metal Fatigue, Technical report ME-TR-11, Department of Engineering, Aarhus University, Denmark, 2018, 91 p.
[2] X. Ma, F. Yang, J. Li, Y. Xue and Z. Guan, Fatigue life assessment method of in-service mechanical structure, Advances in Mechanical Engineering, 2021, Vol. 13(2), 1-9.
[3] T. I. Rybak, A. V. Babii, I. M. Bortnyk, G. B. Tsion and S. I. Konovalenko, Evaluation of the Service Life of the Frames of Sections of Boom Field Sprayers, Materials Science, 2019, 55, 374-380. doi:10.1007/s11003-019-00312-0.
[4] O. E. Andreikiv, A. V. Babii and I. Ya. Dolinska, Influence of the Working Media and Maneuvering Loading Mode on the Service Life of Spraying Booms of Field Sprinklers, Materials Science, Vol. 56, December 2020, 166-173. doi:10.1007/s11003-020-00411-3.
[5] L. A. Escobar and W. Q. Meeker, A Review of Accelerated Test Models, Statistical Science, 2006, Vol. 21, No. 4, 552-577. doi:10.1214/088342306000000321.
[6] F. Pascual, W. Meeker and L. Escobar, Accelerated Life Test Models and Data Analysis, Springer Handbook of Engineering Statistics, 2006, 397-426. doi:10.1007/978-1-84628-288-1_22.
[7] A. Klausen, R. W. Folger, K. G. Robbersmyr and H. R. Karimi, Accelerated Bearing Life-time Test Rig Development for Low Speed Data Acquisition, Modeling, Identification and Control, Vol. 38, No. 3, 2017, 143-156. doi:10.4173/mic.2017.3.4.
[8] A. Ismail, W. Jung and Q. Liu, Accelerated Life Test Design for Tractor Powertrain Front Axle, MATEC Web of Conferences 74, 00020 (2016), ICMER 2015. doi:10.1051/matecconf/20167400020.
[9] M. Mattetti, G. Molari, A. Vertua and A. Guarnieri, Tractor accelerated test on test rig, Journal of Agricultural Engineering, 2013, Vol. XLIV(s2): e76, 381-383. doi:10.4081/jae.2013.s2.e76.
[10] J. N. de Araújo, L. Hoss, A. Viecelli and M. Molon, Virtual Test Rig Development for Accelerated Durability Analysis, SAE Technical Paper Series, 2016. doi:10.4271/2016-36-0061.
[11] L. Xu and G. Z. Dai, Accelerated Life Test and FEM Simulation-Based Fatigue Analysis of an Aluminum Alloy Push Rod, Strength of Materials, 2019. doi:10.1007/s11223-019-00047-y.
[12] S. Zhao, X. Li, D. Wang and W. Li, Key Technologies of Physical and Virtual Test Rig for Railway Freight Car Body, Materials, 2022, 15, 5439. doi:10.3390/ma15155439.
[13] L. Ma and H. Zhou, Study acceleration spectrum replication on an electrohydraulic servo test rig with displacement control, Procedia Engineering, 2011, 16, 204-210. doi:10.1016/j.proeng.2011.08.1073.
[14] hbkworld.com, nCode (Signal Processing and Durability Analysis). URL: https://www.hbkworld.com/en/products/software/analysis-simulation/durability.
[15] Altair.de, nCode DesignLife by HBM Prenscia, 2022. URL: https://www.altair.de/ncode-designlife.
[16] M. Stashkiv, O. Matsiuk, nCode GlyphWorks software use for test data processing, The 1st International Workshop on Information Technologies: Theoretical and Applied Problems 2021 (ITTAP 2021), Vol. 3039, 192-205.
[17] R. Hevko, M. Stashkiv, O. Lyashuk, Y. Vovk, V. Oleksyuk, O. Tson and I. Bortnyk, Investigation of internal efforts in the components of the crop sprayer boom section, Journal of Achievements in Materials and Manufacturing Engineering, Volume 105, Issue 1 (2021), 33-41. doi:10.5604/01.3001.0014.8743.
[18] M. Stashkiv, I. Lytvynenko and V. Stashkiv, Test Data Processing Use for Structural Fatigue Life Assessment, The 2nd International Workshop on Information Technologies: Theoretical and Applied Problems 2022 (ITTAP 2022), Vol. 3309, 241-258.
[19] S. J. Orfanidis, Introduction to Signal Processing, Rutgers University, 2010, 795 p.
[20] I. F. Apolinário and P. S. R. Diniz, Introduction to Signal Processing Theory, Academic Press Library in Signal Processing, Vol. 1: Signal Processing Theory and Machine Learning, Academic Press, Chennai, 2014, 3-28. doi:10.1016/B978-0-12-396502-8.00001-2.
[21] M. R. Petryk, A. Khimich, M. M. Petryk and J. Fraissard, Experimental and computer simulation studies of dehydration on microporous adsorbent of natural gas used as motor fuel, Fuel, 2019, 239, 1324-1330. doi:10.1016/j.fuel.2018.10.134.
[22] S. Lupenko, I. Lytvynenko, A. Sverstiuk, A. Horkunenko and B. Shelestovskyi, Software for statistical processing and modeling of a set of synchronously registered cardio signals of different physical nature, CEUR Workshop Proceedings, 2021, 2864, 194-205.