Sensors are most commonly used to make quantifiable measurements, as opposed to qualitative detection or presence sensing. It follows that the requirements of the measurement determine the selection and application of the sensor. How, then, can we quantify the requirements of the measurement? First, we must consider what it is we want to measure. Sensors are available to measure almost anything you can think of, and many things you would never think of (but someone has!). Pressure, temperature, and flow are probably the most common measurements, as they are involved in monitoring and controlling many industrial processes and material transfers. A brief tour of a Sensors Expo exhibition or a quick look at the internet will yield hundreds, if not thousands, of quantities, characteristics, or phenomena that can be measured with sensors.
Second, we must consider the environment of the sensor. Environmental effects are perhaps the biggest contributor to measurement errors in most measurement systems.
Sensors, and indeed whole measurement systems, respond to their total environment, not just to the measurand. In extreme cases, the response to the combination of environments may be greater than the response to the desired measurand. One of the sensor designer's greatest challenges is to minimize the response to the environment and maximize the response to the desired measurand. Assessing the environment and estimating its effect on the measurement system is an extremely important part of the selection and application process.
The environment includes not only such parameters as temperature, pressure, and vibration, but also the mounting or attachment of the sensor, electromagnetic and electrostatic effects, and the rates of change of the various environments. For example, a sensor may be little affected by extreme temperatures, yet produce huge errors in a rapidly changing temperature ("thermal transient sensitivity").
Third, we must consider the requirements for accuracy (uncertainty) of the measurement. Often we would like to achieve the lowest possible uncertainty, but that may not be economically feasible, or even necessary. How will the information derived from the measurement be used? Will it really make a difference, in the long run, whether the uncertainty is 1% or 1½%? Will highly accurate sensor data be obscured by inaccuracies in the signal conditioning or recording processes? On the other hand, many modern data acquisition systems are capable of much greater accuracy than the sensors making the measurement. A user must not be misled into thinking that high resolution in a data acquisition system will produce high-accuracy data from a low-accuracy sensor.
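The resolution-versus-accuracy point can be made concrete with a quick back-of-the-envelope comparison. The numbers below (10 V full scale, a 1%-of-full-scale sensor, a 16-bit ADC) are illustrative assumptions, not drawn from any real datasheet:

```python
# Sketch: high DAQ resolution does not rescue a low-accuracy sensor.
# All figures are illustrative assumptions, not from any real datasheet.

FULL_SCALE = 10.0          # volts, assumed sensor output range
SENSOR_ERROR_PCT = 1.0     # assumed sensor accuracy: 1% of full scale
ADC_BITS = 16              # assumed DAQ resolution

adc_step = FULL_SCALE / (2 ** ADC_BITS)              # smallest voltage step the ADC resolves
sensor_error = FULL_SCALE * SENSOR_ERROR_PCT / 100   # error the sensor itself contributes

print(f"ADC step size: {adc_step * 1000:.3f} mV")    # ~0.153 mV
print(f"Sensor error:  {sensor_error * 1000:.1f} mV")  # 100.0 mV
print(f"Ratio:         {sensor_error / adc_step:.0f}x")
```

Here the sensor's own error is roughly 650 times larger than the ADC step size, so the extra resolution buys nothing: the measurement uncertainty is set by the sensor.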
Last, but not least, the user must assure that the whole system is calibrated and traceable to a national standards organization (such as the National Institute of Standards and Technology [NIST] in the United States). Without documented traceability, the uncertainty of any measurement is unknown. Either each part of the measurement system must be calibrated and an overall uncertainty calculated, or the total system must be calibrated as it will be used ("system calibration" or "end-to-end calibration").
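One common way to calculate the overall uncertainty from individually calibrated parts is to combine the independent component uncertainties by root-sum-square (RSS). The component values below are illustrative assumptions, expressed in percent of reading:

```python
import math

# Sketch: combining independent component uncertainties by root-sum-square (RSS),
# one common way to compute an overall system uncertainty from part-by-part
# calibrations. Component values are illustrative assumptions (% of reading).

components = {
    "sensor":              1.00,
    "signal conditioning": 0.25,
    "DAQ / recording":     0.05,
}

overall = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"Overall uncertainty: {overall:.2f}%")   # ~1.03%
```

Note how the RSS result (about 1.03%) is dominated by the largest term: halving the signal-conditioning uncertainty would barely change the total, while improving the sensor would. This is why the uncertainty budget, not any single component's specification, should drive selection.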
Since most sensors don’t have any adjustment capability for conventional "calibration", a characterization or evaluation of sensor parameters is most often required. For the lowest uncertainty in the measurement, the characterization should be done with mounting and environment as similar as possible to the actual measurement conditions.
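Since the correction cannot be dialed into the sensor itself, a characterization typically ends up as a software correction derived from comparison against a trusted reference. A minimal sketch, using a hypothetical set of (reference value, raw reading) pairs and an ordinary least-squares straight-line fit:

```python
# Sketch: characterizing a non-adjustable sensor by fitting a gain/offset
# correction against a trusted reference, then applying it in software.
# The data pairs are illustrative assumptions, not real calibration data.

# (reference value, raw sensor reading) from a hypothetical comparison run
pairs = [(0.0, 0.12), (25.0, 24.8), (50.0, 49.3), (75.0, 74.1), (100.0, 98.7)]

n = len(pairs)
sx = sum(raw for _, raw in pairs)            # sum of raw readings
sy = sum(ref for ref, _ in pairs)            # sum of reference values
sxx = sum(raw * raw for _, raw in pairs)
sxy = sum(raw * ref for ref, raw in pairs)

# least-squares straight line: corrected = gain * raw + offset
gain = (n * sxy - sx * sy) / (n * sxx - sx * sx)
offset = (sy - gain * sx) / n

def correct(raw):
    """Apply the fitted characterization to a raw sensor reading."""
    return gain * raw + offset

print(f"gain = {gain:.4f}, offset = {offset:.3f}")
```

A straight-line (gain/offset) model is only a sketch; real characterizations may need higher-order terms, and, as the text stresses, the comparison data should be taken with mounting and environment as close as possible to the actual measurement conditions.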
While this guide concentrates on sensor technology, a properly selected, calibrated, and applied sensor is necessary but not sufficient to assure accurate measurements.
The sensor must be carefully matched with, and integrated into, the total measurement system and its environment.
Updated: Wednesday, September 11, 2019 17:28 PST