Health & Safety

An Update on Instrumentation for Vibration Measurement - and Its Calibration

Author: John Shelton on behalf of Svantek

Much has been written about the accuracy of sound level meters and calibrators, along with procedures for calibration and measurement. However, it is fair to say that more and more practitioners are involved in the measurement of vibration, whether it’s for health & safety or ground vibration for nuisance and building damage. There seems to be a stark contrast between standards for sound and vibration, as well as a lack of understanding of procedures for vibration measurement.

For example, BS EN IEC 61672-1:2013 nails down the performance criteria for sound level meters, with parts 2 and 3 covering pattern evaluation and periodic laboratory verification respectively. Similarly, sound level calibrators are defined in BS EN IEC 60942:2018, and filter characteristics in BS EN IEC 61260-1:2014.
Likewise, the procedural standards for noise measurement, such as BS 4142:2014, are quite prescriptive about how measurements should be made and which calibration procedures should be followed, e.g. 'thou shalt calibrate your sound level meter at the beginning and end of each series of measurements' and 'thou shalt send your sound level meter for laboratory calibration/verification every two years', and so on.
So, what are the equivalent procedures for vibration measurement? After all, one could argue that a sound level meter is simply a fancy voltmeter with a microphone attached, and a vibration meter is really no different, except that it will have a vibration transducer plugged in. Previously, a sound level meter could be converted into a vibration meter by plugging in an 'integrator' and accelerometer and wielding a cunning protractor to convert decibels into ms⁻²!
Of course, there are many standards covering vibration measurement instruments and transducers too, but perhaps we should familiarise ourselves with the most relevant and look at how we can improve the practice of measuring vibration to an acceptable uncertainty.
Let’s start by looking at the applications which interest us most:-
1) Human vibration
        a. Health & Safety
            i. Whole body vibration
            ii. Hand-arm vibration
        b. Nuisance & disturbance
2) Building Vibration
        a. Building Damage
        b. Noise re-radiation
        c. Sensitive equipment

Human Vibration
The Health & Safety aspect of human vibration is well defined by the Physical Agents (Vibration) Directive 2002/44/EC, which is incorporated into UK law as the Control of Vibration at Work Regulations 2005. The performance of suitable vibration meters and the procedures for their use are covered by various standards, such as BS EN ISO 8041:2017, BS EN ISO 2631:1997, BS EN ISO 5349:2001, and so on. Whilst there is still some controversy about the mounting of transducers, which will in future be covered by standards development for personal vibration exposure meters (PVEM), measurement for health & safety is, in general, a mature art.
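As a minimal sketch of the exposure arithmetic behind those Regulations: the daily exposure A(8) scales the measured vibration magnitude by the square root of the exposure time over an eight-hour reference period. The example below assumes a hand-arm vibration total value already measured in accordance with ISO 5349; the figures are illustrative only.

```python
import math

def daily_exposure_a8(a_hv_ms2: float, exposure_hours: float) -> float:
    """A(8) = a_hv * sqrt(T / T0), with T0 = 8 h (hand-arm vibration)."""
    T0_HOURS = 8.0
    return a_hv_ms2 * math.sqrt(exposure_hours / T0_HOURS)

# Illustrative figures: 4 m/s^2 vibration total value, 2 h trigger time
a8 = daily_exposure_a8(4.0, 2.0)
print(f"A(8) = {a8:.2f} m/s^2")  # 2.00 m/s^2 -- below the 2.5 m/s^2
                                 # exposure action value for hand-arm vibration
```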
For the nuisance and disturbance aspect, once again ISO 8041 comes to the fore, again using weighted acceleration measurements but, unusually in BS 6472-1:2008, using a root mean quad (RMQ) detector. A vibration dose value (VDV) is used for the final assessment.
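Since the RMQ detector is the unusual part, a short sketch may help: the VDV is the fourth root of the time integral of the fourth power of the weighted acceleration, which is why it carries the odd-looking units of ms⁻¹·⁷⁵. The sketch below assumes the frequency weighting (e.g. Wb) has already been applied to the samples; the signal is synthetic.

```python
import numpy as np

def vibration_dose_value(a_w: np.ndarray, fs_hz: float) -> float:
    """VDV = (integral of a_w(t)^4 dt)^(1/4), units m/s^1.75."""
    dt = 1.0 / fs_hz
    return float((np.sum(a_w**4) * dt) ** 0.25)

# Synthetic 1 s record of weighted acceleration (already Wb-weighted, m/s^2)
fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
a_w = 0.1 * np.sin(2.0 * np.pi * 10.0 * t)
print(f"VDV = {vibration_dose_value(a_w, fs):.4f} m/s^1.75")
```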

Building Vibration
The target in this case is the building itself, where through the mists of time, peak particle velocity (PPV) measurements have been used as a damage criterion in e.g. BS 7385-2:1993, BS ISO 4866:2010, BS 5228-2:2009, etc.
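PPV itself is disarmingly simple: the maximum absolute value of the particle velocity time history, assessed per axis. A minimal sketch with illustrative numbers:

```python
import numpy as np

def peak_particle_velocity(v_mms: np.ndarray) -> float:
    """PPV: the maximum absolute particle velocity in the record (mm/s)."""
    return float(np.max(np.abs(v_mms)))

# Illustrative vertical-axis geophone samples in mm/s
v_vertical = np.array([0.4, -1.8, 2.6, -0.9, 0.2])
print(f"PPV = {peak_particle_velocity(v_vertical):.1f} mm/s")  # 2.6 mm/s
```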
In addition, 1/3 octave RMS velocity spectra can be used for noise prediction, as well as comparison to rating curves for sensitive environments, e.g. VC, NIST, etc.
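As a hedged illustration of that rating-curve comparison, the sketch below rates a measured 1/3 octave RMS velocity spectrum against the commonly quoted flat VC limits (constant RMS velocity across the 8-80Hz bands); both the limit values and the spectrum are illustrative, so check the governing specification before relying on them.

```python
# Commonly quoted VC limits in micrometres per second, RMS (illustrative)
VC_LIMITS_UMS = {"VC-A": 50.0, "VC-B": 25.0, "VC-C": 12.5,
                 "VC-D": 6.25, "VC-E": 3.125}

def rate_spectrum(band_levels_ums: dict) -> str:
    """Return the most stringent VC curve the whole spectrum satisfies."""
    worst = max(band_levels_ums.values())
    passed = [name for name, limit in VC_LIMITS_UMS.items() if worst <= limit]
    return passed[-1] if passed else "fails VC-A"

# Centre frequency (Hz) -> measured 1/3 octave RMS velocity (um/s)
spectrum = {8.0: 4.0, 16.0: 5.5, 31.5: 3.0, 63.0: 2.0}
print(rate_spectrum(spectrum))  # VC-D (worst band 5.5 um/s <= 6.25 um/s)
```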
Both applications share much in common, with the use of a vibration transducer connected to a suitably calibrated instrument, but often using two different types of transducer: an accelerometer or a velocity transducer (geophone). This has been covered in more detail in previous Instrumentation Corner articles, July/August 2012 being an example.

BS EN ISO 8041-1:2017
Despite the many and varied standards applying to vibration measurements, the nearest we currently have to the sound level meter standard is BS EN ISO 8041-1:2017 which has recently been heavily revised.
Although all the existing standards listed above will refer to the previous version (2005, now withdrawn), all professionals should look to the current version for advice, and future instrumentation will be developed accordingly.
The current version of the standard introduced three sections (13-15) which bring it more into line with BS EN IEC 61672, principally one-off validation of instruments, periodic verification of instruments, and field calibration, the latter requiring the use of a field calibration device suggested in Annex A.
Section 13 concerns only the instrument developers/manufacturers, but the end-user needs to be aware of the remaining sections. Although this standard does not specifically apply to velocity-based instrumentation, it would be sensible to follow similar procedures if possible.

Vibration Calibration
As with sound, calibration of a vibration meter at its simplest is applying a known vibration level to the system and comparing the measured result to that expected. The known vibration level will be checked using another measurement chain, which at its highest level would be a laser interferometer, or more commonly, a reference accelerometer which has a sensitivity traceable to the higher standard.
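In code, that comparison reduces to a one-liner; the sketch below expresses the deviation of the indicated level from the applied reference level in decibels, with illustrative figures (the acceptance tolerance would come from the relevant standard and class of instrument).

```python
import math

def calibration_deviation_db(indicated: float, reference: float) -> float:
    """Deviation of the indicated level from the applied reference
    level, in dB: 20 * log10(indicated / reference)."""
    return 20.0 * math.log10(indicated / reference)

# Illustrative: calibrator applies 10.00 m/s^2 RMS, meter indicates 10.12
dev = calibration_deviation_db(10.12, 10.00)
print(f"Deviation = {dev:+.2f} dB")  # +0.10 dB
```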
The concepts and procedures are described in the BS ISO 16063 series of standards, which currently numbers 45 parts, each one describing the different accuracies and methods for a given type of transducer. Most of these will only concern your chosen calibration laboratory.
So, to calibrate our measuring system, all we have to do is mount the transducer on to a vibrating surface and take the measurement. The vibrating table will typically be a moving coil vibration exciter (shaker), which can generate enough acceleration at the desired frequency, with minimum distortion, low cross-axis vibration and adequate linearity.
Being moving coil, like a loudspeaker, they will have a non-flat frequency response, and the amount of force they can deliver will depend on the product of the flux density of the magnet, the number of turns on the coil and the current supplied. Why are we worried about force? Newton's second law tells us that the force will depend on the product of the mass of the transducer and the acceleration, so in simple terms, the larger/heavier the transducer, the larger the force required to achieve the same acceleration.
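A quick back-of-envelope check makes the point. The shaker must accelerate its own moving element as well as the payload, so the total moving mass sets the force demand; the moving-element mass below is an assumed figure for illustration.

```python
def required_force_n(transducer_mass_g: float, moving_element_g: float,
                     target_accel_ms2: float) -> float:
    """F = m * a, where m is the total moving mass (payload + armature)."""
    total_mass_kg = (transducer_mass_g + moving_element_g) / 1000.0
    return total_mass_kg * target_accel_ms2

# Illustrative: 300 g geophone on an exciter with a 50 g moving element,
# driven to 10 m/s^2
print(f"F = {required_force_n(300.0, 50.0, 10.0):.1f} N")  # 3.5 N
```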
As well as producing a known acceleration, we have to choose a reference frequency, as we do with sound level meter calibration (1kHz). This will depend on application, so in the standard, the reference frequencies (preferred values) are 79.58Hz for hand-arm vibration and 15.915Hz for whole-body vibration – the rather odd-looking frequencies coming from 500rads⁻¹ and 100rads⁻¹ angular frequency, chosen because it's easy to calculate velocity and displacement from acceleration. 10ms⁻² acceleration at 1000rads⁻¹ is 10mms⁻¹ velocity, for example.
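That arithmetic is easy to verify: for a sinusoid, velocity is a/ω and displacement a/ω². A short sketch reproducing the 10ms⁻² at 1000rads⁻¹ example:

```python
def sinusoid_v_and_d(accel_ms2: float, omega_rads: float):
    """For a sinusoid: velocity = a / omega, displacement = a / omega^2."""
    return accel_ms2 / omega_rads, accel_ms2 / omega_rads**2

v, d = sinusoid_v_and_d(10.0, 1000.0)    # 10 m/s^2 at 1000 rad/s
print(f"v = {v * 1e3:.1f} mm/s")         # 10.0 mm/s
print(f"d = {d * 1e6:.1f} micrometres")  # 10.0 um
```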
The higher frequency makes sense, as it is in the pass-band of the hand-arm weighting curve Wh, and the lower frequency lies within the pass-band of the whole-body family of weightings, such as Wd and Wb. Therefore, we can use such signals to validate the correct function of the weighting networks too.
Generating sufficient force to accelerate the mass of a small general purpose accelerometer at 500 or even 1000rads⁻¹ to, say, 10ms⁻² doesn't really take a large shaker, which is why there is already a range of suitable hand-held vibration calibrators on the market, with mass limits between 70 and 300 grams, some with selectable frequency. Most operate at 159.2Hz (1000rads⁻¹), which is permissible although not preferred by the standard.
However, with larger (and therefore heavier) high-sensitivity transducers (both accelerometers and geophones) used for ground vibration work, to paraphrase Roy Scheider in Jaws, "you're gonna need a bigger shaker", which rather precludes the 'hand-held' moniker.
Such devices are coming onto the market which fulfil the 'portable' description, allowing field calibration at frequencies down to 15.915Hz. BS ISO 16063-44:2018 covers the requirements for such devices – stability, level/frequency accuracy, distortion, cross-axis vibration, etc. – and as this standard was released only last year, it is not explicitly mentioned in BS EN ISO 8041.
For moving coil geophones, there are two other considerations. The first is that geophones come in two types – vertical and horizontal – and therefore have to be calibrated in their axis of operation. Clearly, vertical geophones can use a standard calibrator as above, but using such a device horizontally may introduce unacceptable distortion and uncertainty. The only solution to this is to use a (rather expensive) horizontal slip table, more likely to be found in a primary laboratory.
Secondly, many geophone-based systems have their transducers built-in and not easily accessible in the field, meaning the whole instrument must be included in the mass – so unless the transducer can be removed, not a job for any field calibrator.

Vibration Calibration for the end-user
Two types of calibration concern the end-user – periodic laboratory calibration, and field verification/calibration.
Section 14 of BS EN ISO 8041-1:2017 covers the periodic verification of your vibration meter and sets out the tests required to be performed by the laboratory. In practice, this will often mean that the instrument is tested electrically for, e.g., linearity, weighting networks, detectors, etc., and it makes sense to specify only those tests which are relevant to the type of work required. For example, the standard specifies a long list of weighting networks, whereas most practitioners will only ever use Wh, Wd and Wb – so why test the rest at increased cost? Similarly, the transducer will be tested individually at a range of frequencies according to the relevant parts of BS ISO 16063, and only brought together with the instrument when calibrated using a Part 44 calibrator.
This section suggests only 'regularly' regarding the periodicity of calibration, but I would suggest good practice dictates alignment with the approach taken with sound level meters, i.e. every two years. If damage to the transducer is suspected, it can be re-calibrated sooner.
Section 15 covers in-situ checks. As part of the normative rubric of the standard, the use of a field calibrator is specified, at the preferred frequency, according to the documentation of the measuring instrument (Section 10 and Annex G). I darkly expect that most manufacturers’ documentation will not currently make reference to the use of such a field calibrator, as required, but one would hope this will be corrected in due course.
A specification for a field calibrator is given in Annex A, which it is to be expected will be updated to reflect the newer BS ISO 16063-44:2018.
These in-situ tests also include mechanical inspection of cables and connections, a common cause of poor measurement data, so a sensible approach.
Is in-situ calibration necessary?
One apparent loophole appears as a note in Section 15.3 of the standard, as follows:-
If, according to gained experience, it can be assumed that the sensitivity of transducer and instrument do not alter, a quantitative determination of the overall sensitivity of the vibration meter can be omitted. In this case, however, a mechanical overall tapping test is mandatory to demonstrate that the signal path is uninterrupted.
In other words, if you are sure your measurement system including transducer is stable and working correctly, based on your experience, then you don’t need to check it. Simply tap it a few times.
This seems at odds with the sound level meter instrument standards and noise measurement procedural standards, which specify regular field checks ad infinitum, and raises the question of why we should take a different approach to vibration measurement. Perhaps it comes from the perceived fragility and instability of measurement microphones, or the apparently robust nature of accelerometers and geophones. However, even with the use of incredibly stable MEMS microphones for Class 1 noise measurements, it's likely that calibration will still be required.
Vibration transducers often appear less fragile, but high-sensitivity accelerometers are easily damaged when dropped on concrete floors, resulting in a cracked crystal which will ruin the frequency response, even though the transducer will still yield a similar sensitivity. Similarly, conditioning electronics can develop faults and instabilities, including the active damping used with geophones.

Conclusion
No doubt, further discussion will ensue on best practice when it comes to the use and calibration of vibration measurement instrumentation, but it is to be hoped that the standards and procedures for sound and vibration instrumentation will slowly evolve to become more homogeneous. Inconsistency in calibration procedures is just one example.
This process is already under way in Technical Committee TC29 of the IEC with regard to harmonising standards for measuring microphones, sound level meters and calibrators. Would it be too much to ask to see the bigger sound & vibration picture?
