Diagnostic and monitoring systems have been available for many years; however, they have been hampered by several fundamental limitations. Diagnostic systems have generally been proprietary and application specific, offering little benefit outside a narrowly defined piece of equipment or process. They have also been too expensive to include as part of the overall process automation and monitoring system. Traditional diagnostic systems often require specialised hardware for monitoring a specific function, and most monitoring systems lack real data from the process system or the devices they are designed to monitor.
Finally, monitoring systems have very limited connectivity to operations, engineering, or business functions within the plant. User interfaces, databases, annunciation, and other functions are frequently proprietary and not interoperable. Overcoming these shortcomings requires a new approach to diagnostics and monitoring, and a new architecture capable of handling the volume and diversity of information needed to detect problems, isolate or localise the problem to a specific source, and determine the root cause so that the problem can be effectively addressed.
A field-based architecture combining advanced diagnostics and control in the field effectively addresses many of these limitations, and provides an unprecedented opportunity to improve financial performance.
Overview of field-based architecture
A field-based architecture uses the power of field intelligence to improve plant performance. Over the past years, the combination of increased computing power in field devices, higher-performance device sensors, and a fully capable, standard communications protocol, Foundation Fieldbus, has enabled field devices to deliver a step increase in functionality and value. These factors may change the fundamental definition of a process automation architecture, and of process control itself.
Field devices today can perform functions including, but not limited to, closed loop basic and advanced regulatory control and discrete control. They can also perform statistical process monitoring, and can detect and calculate both actual process variability and the theoretical minimum variability. Improved sensors can detect process conditions over a broader frequency range and with greater accuracy and repeatability than in the past. This information reveals fundamental process signatures such as drift, bias, noise, stuck, and spike. These signatures, combined with process control information, can then be used to detect a wide variety of equipment and process conditions, ranging from remaining sensor life or plugged impulse lines, to control abnormalities with flow, temperature, and level loops, to operational or performance problems with process units. This value can be extracted only if the automation architecture can access and use the information in a coordinated way.
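The actual-versus-minimum-variability comparison described above can be sketched with standard SPC estimators. The moving-range method, thresholds, and names below are illustrative assumptions, not the computation used inside any particular device:

```python
import statistics

def variability_indices(samples):
    """Estimate actual and (approximate) minimum process variability.

    Actual variability: overall standard deviation of the samples.
    Theoretical minimum: short-term variation estimated from the average
    moving range of successive samples (MR-bar / d2, with d2 = 1.128 for
    subgroups of 2) -- a common SPC approximation; the exact method used
    in a given field device is vendor-specific.
    """
    actual = statistics.pstdev(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    minimum = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return actual, minimum

readings = [50.1, 50.3, 49.9, 50.2, 51.0, 52.1, 53.0, 53.8]
actual, minimum = variability_indices(readings)
# a ratio well above 1 suggests variability beyond short-term noise,
# i.e. an opportunity for better control or a developing problem
print(actual / minimum)
```

A device reporting both indices lets the system judge how much of the observed variation is reducible, which is exactly the coordinated use of information the text describes.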
Traditional automation architectures are not capable of accessing, using, or delivering this functionality or value to the user. These architectures are designed to access a single value, or a value and basic status, from a field device, and deliver that information essentially exclusively to the process control system for closed loop control, or to the operator for viewing and manipulation. Nor are they designed to deliver control information such as mode, alarms and alerts, or setpoint changes back to field devices for use or analysis.
The field-based architecture is fundamentally different in its ability to access, disseminate, and use this information. First, data analysis is partially or completely done within the devices themselves. This reduces the communications bandwidth requirements by orders of magnitude. Additionally, information available at the raw sensor, but not to the system, can be used to perform analyses that are not possible in any other way. Finally, by combining this information with process control actions such as mode, setpoint, and load changes, fundamental insights into the health and performance of the equipment and the process become available for the first time.
Signal validation techniques rely on accurate process information. Conventional centralised approaches can lack actual process information and instead use mathematical models to infer process conditions. If the mathematical models do not reflect actual process conditions, erroneous results are unavoidable. In addition, time delays in the delivery of information can mask actual process conditions. Smart field devices have the advantage of providing accurately time-stamped information directly to the control system as anomalies develop. Using pattern recognition and statistical analysis methods, field devices can now detect process anomalies such as drift, bias, noise, spike, and stuck behaviour. This enables operators to take corrective steps faster, avoiding conditions that could cause a process upset or shutdown.
Increasing economic efficiency is the most important task of the engineering groups and managers of industrial plants. Plant engineers traditionally approach efficiency by implementing optimum process control systems. However, an increase in economic efficiency cannot be achieved simply by providing better control schemes to an operating plant. Improving plant availability and increasing efficiency come from early detection of anomalies and from condition-based, real-time maintenance. The inefficiency of applying validation schemes at a high level becomes clear when we consider the signal linearisation, damping, and communication delays that mask the true readings of the sensors: even if the readings appear accurate by the time they reach these high-level sensor validation systems, the critical information hidden in the raw data has been lost.
Both these approaches depend on recognition of the system condition and therefore directly depend on the quality of the observed signals. However, most of the previous work on signal validation concentrates on cases where there is either hardware or analytical redundancy. For instance, if there are three or more sensors measuring the same process variable, one can implement a consistency-checking or majority-voting algorithm for signal validation and anomaly detection.
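A minimal sketch of such a majority-voting check for triple-redundant sensors follows; the function name and tolerance value are illustrative assumptions, not part of any standard:

```python
def vote(readings, tolerance):
    """Majority-voting validation for redundant sensors.

    Readings within `tolerance` of the median form the agreeing
    majority; their mean is returned as the validated value, and the
    indices of disagreeing channels are flagged for maintenance.
    """
    median = sorted(readings)[len(readings) // 2]
    good = [r for r in readings if abs(r - median) <= tolerance]
    bad = [i for i, r in enumerate(readings) if abs(r - median) > tolerance]
    return sum(good) / len(good), bad

# third transmitter disagrees by ~13 units and is voted out
value, failed = vote([101.2, 100.9, 87.4], tolerance=2.0)
print(value, failed)
```

The scheme works only when redundant hardware exists, which is exactly the limitation the field-based approach is meant to relax.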
In addition, model-based techniques are heavily used when different types of measurements are available for the same process or the same system. Most of these techniques are based on modeling the normal behaviour of a sensor with an autoregressive time-series model and then monitoring its behaviour. These signal validation techniques have been integrated with high-level control and data acquisition systems to provide assistance to plant operators.
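The autoregressive approach can be sketched as follows: fit a simple AR(1) model on data recorded during normal behaviour, then flag live samples whose one-step prediction error exceeds a residual threshold. The model order, threshold, and data values here are illustrative assumptions:

```python
def fit_ar1(series):
    """Least-squares fit of x[t] = a * x[t-1] + c on normal-behaviour data."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    var = sum((xi - mx) ** 2 for xi in x)
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / var
    return a, my - a * mx

def residuals(series, a, c):
    """One-step-ahead prediction errors under the fitted model."""
    return [yt - (a * xt + c) for xt, yt in zip(series[:-1], series[1:])]

# fit on data recorded while the sensor behaved normally
normal = [100.0, 100.2, 99.9, 100.1, 100.0, 99.8, 100.1, 100.2, 99.9, 100.0]
a, c = fit_ar1(normal)
sigma = (sum(r * r for r in residuals(normal, a, c)) / (len(normal) - 1)) ** 0.5
# monitor live data: residuals beyond 4 sigma are flagged as anomalous
live = [100.0, 100.1, 99.9, 104.5, 100.0]
alarms = [t + 1 for t, r in enumerate(residuals(live, a, c)) if abs(r) > 4 * sigma]
print(alarms)  # the spike at sample index 3 (and the recovery after it) raise alarms
```

The same residual logic is what the text describes being pushed down from the host system into the transmitter itself.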
Technique for assembling information
Advancements in communication and software technologies are now enabling the integration of islands of automation into an asset management system that gives the operator a plant-wide view.
The advent of the DCS and smart instruments in the 70s and 80s provided the foundation for the automated process control that is pervasive in today's industry. Process control has been credited with reducing the operating cost of process industry plants by 5-9%.
The advances in communications and software have now given birth to a new automation architecture, most often referred to as the field-based architecture. This architecture is founded on the ability to network field devices, i.e. sensors, valves, smart motors, etc., with control and asset management systems to provide an integrated information system that can be used for control and automated maintenance. Since field devices are closest to the process and provide measurements on it, performing device- or process-related anomaly detection could be another task for intelligent field devices.
Technique for disseminating information
Traditionally, fault detection has been part of the control system, i.e. implemented as signal validation modules. Field devices could not handle the tasks that fault detection methodologies require due to the limited firmware capability of older technologies. However, today's smart transmitters, using advanced silicon technology, are capable of providing greater information richness and data analysis.
Process anomalies exhibit five statistical signatures common to all measurement and process types: pressure, temperature, flow, level, and others. Using pattern recognition and statistical analysis methods, field devices can now detect the drift, bias, noise, spike, and stuck behaviours of each process, where:
· Drift: sensor/process output changes gradually
· Bias: sensor/process output shows a level change
· Noise: dynamic variation in the sensor/process output is increased
· Spike: sensor/process output is momentarily very high or low
· Stuck: dynamic variation in the sensor/process output is decreased
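The five signatures above can be distinguished with simple window statistics against a known-good baseline. The sketch below is a toy classifier under assumed thresholds (slope_limit, spike_k, bias_k); real transmitters use vendor-specific tests and tuning:

```python
import statistics

def classify(baseline_mean, baseline_std, window,
             slope_limit=0.05, spike_k=6.0, bias_k=3.0):
    """Classify a window of samples into one of the five signatures.

    Thresholds are illustrative tuning parameters, not values from any
    particular device. Checks are ordered so that the most specific
    signature (spike) wins before broader ones (drift, bias).
    """
    n = len(window)
    w_mean = statistics.mean(window)
    w_std = statistics.pstdev(window)
    median = statistics.median(window)
    spikes = [x for x in window if abs(x - median) > spike_k * baseline_std]
    # least-squares slope of the window against sample index
    t_mean = (n - 1) / 2
    slope = (sum((t - t_mean) * (x - w_mean) for t, x in enumerate(window))
             / sum((t - t_mean) ** 2 for t in range(n)))
    if len(spikes) == 1:
        return "spike"   # one sample momentarily very high or low
    if w_std < 0.2 * baseline_std:
        return "stuck"   # dynamic variation has collapsed
    if w_std > 3.0 * baseline_std:
        return "noise"   # dynamic variation has grown
    if abs(slope) > slope_limit:
        return "drift"   # output changes gradually
    if abs(w_mean - baseline_mean) > bias_k * baseline_std:
        return "bias"    # output shows a level change
    return "normal"

print(classify(100.0, 0.1, [100.0, 100.1, 99.9, 102.5, 100.0, 100.1, 99.9, 100.0]))
```

Because every test uses only the device's own samples plus stored baseline statistics, it needs no process model, no sensor model, and no extra hardware, matching the feature list below.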
The key features of the developed local fault detection technology that make it applicable to a broad range of industrial processes are:
· No additional hardware in the detector system is assumed
· No mathematical model of the process is necessary
· No mathematical model of the sensor is required
Advanced diagnostics approaches address fault detection, fault isolation, and root cause analysis.