The PC industry drives many advances in data acquisition applications, but data acquisition devices are also improving independently. Customers are demanding designs that decrease settling times, improve calibration accuracy, and provide more I/O per device, while enabling simultaneous operations without data loss. This article takes a closer look at the raised bar for data acquisition systems and the trends driving technology innovations.
Engineers buying data acquisition systems now have higher expectations than ever before, and the criteria for choosing vendors and products to meet measurement and control needs have undergone significant changes. These changes can be broadly grouped into a few categories that are part of industry-wide trends. Companies must therefore build products that consistently meet or exceed customers' expectations, which in most cases means an architectural redesign. While companies have steadily improved their products, the need of the hour is a giant leap in performance and capabilities while driving down the price.
It seems like a tall order, but if these trends are anything to go by, delivering all of these benefits is almost mandatory.
Trend 1: Demand for more I/O at a lower price
This trend has emerged from a combination of factors. Prominent among these is the realisation that there is a significant saving in buying a single product with a greater channel count rather than multiple low-channel-count products. Another contributing factor, though to a lesser extent, is the convenience of implementing as much of the application as possible on a single device.
As a result, manufacturers of data acquisition (DAQ) products have for the past few years progressively built products with higher channel counts. Earlier multifunction DAQ products came standard with 4, 8, 16, 32 or 64 analogue input channels, at most 2 analogue output channels, and up to 32 digital lines. Recognising these trends, National Instruments' new M-series multifunction DAQ devices come with 16, 32 or even 80 analogue input channels, up to 4 analogue output channels, and up to 48 digital lines.
In addition to the increased channel counts, these devices lower the cost per channel of data acquisition to below $15. This was possible primarily because of a redesign from the ground up, and it sets a benchmark that other manufacturers will need to match to satisfy the demand for more channels at lower prices.
Trend 2: Increased data throughput
As the number of channels and functions on a device increases, so does the potential for bottlenecks or errors in data handling and transfer. Many plug-in DAQ devices are limited by the rate at which they transfer data to PC memory. Modern DAQ devices include onboard direct memory access (DMA) channels, which send data at high speed directly from the device to PC memory without processor involvement. DAQ devices that execute more operations than there are available DMA channels must fall back on interrupt request lines (IRQs). As data transfer rates increase and more operations run simultaneously, these IRQs begin to monopolise PC processor time, causing system slowdowns and buffer-overflow errors. The new M-series DAQ devices provide six DMA channels to execute six operations simultaneously, mitigating the risk of losing data or encountering data-transfer errors.
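The consequence of a host that cannot keep up with acquisition can be illustrated with a toy model. This is a deliberate simplification, not any vendor's actual buffering scheme: a device fills a ring buffer at a fixed sample rate while the host drains it, and if the sustained drain rate (as with IRQ-driven transfers competing for CPU time) falls below the acquisition rate, the buffer eventually overflows and samples are lost.

```python
# Toy model of a DAQ ring buffer (illustrative only): the device adds
# samples at acq_rate while the host removes them at drain_rate.
def simulate_buffer(acq_rate, drain_rate, buffer_size, duration_s):
    """Return True if the buffer overflows within duration_s seconds."""
    level = 0.0
    dt = 0.001  # 1 ms simulation step
    for _ in range(int(duration_s / dt)):
        level += (acq_rate - drain_rate) * dt
        level = max(level, 0.0)       # buffer cannot go below empty
        if level > buffer_size:
            return True               # overflow: data lost
    return False

# A transfer path that keeps up with acquisition never overflows...
print(simulate_buffer(1_000_000, 1_200_000, 4096, 1.0))  # → False
# ...while one that drains too slowly overflows within milliseconds.
print(simulate_buffer(1_000_000, 800_000, 4096, 1.0))    # → True
```

The point of DMA is to keep the drain rate high and independent of processor load, so the second case never arises.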
Trend 3: Enhanced calibration
Electronic components, such as analogue-to-digital converters (ADCs), are characterised by non-linearities and drift due to time and temperature. To compensate and ensure accurate measurements, devices must be calibrated.
Legacy devices use an onboard 5 V precision voltage reference to perform a two-point self-calibration at 5 V and 0 V (ground), generating a straight-line approximation of the ADC linearity. When subsequently measuring voltage levels away from the 0 V and 5 V calibration points, this linear approximation may produce significant inaccuracies.
Next-generation devices, such as the National Instruments M-series, can self-calibrate across all of their input ranges. The result is an accurate polynomial correction curve rather than a linear approximation, a methodology that improves measurement accuracy by as much as 500 per cent.
Trend 4: Reduced settling time
Have you ever stared at a television for too long and then glanced at an adjacent wall, only to see a ghost image of the TV? This happens because your retinas have not yet settled and cannot return an accurate image of the wall to your brain.
DAQ devices have a similar problem. When the ADC multiplexes between two channels with large voltage differences and the signals are digitised too quickly, the measurement is inaccurate. With the exception of simultaneous-sampling plug-in DAQ devices, which have a dedicated ADC for each channel, an ADC must wait for the input to settle before it can take the next measurement accurately. Design advances providing faster settling times increase DAQ scan rates while maintaining accuracy.
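The cost of settling can be estimated with a simple first-order model. This is a back-of-the-envelope sketch, not any vendor's amplifier design, and the 100 ns time constant is an assumed figure: after the multiplexer switches channels, the input settles exponentially, so reaching half an LSB of an N-bit converter requires waiting roughly t = τ·ln(2^(N+1)).

```python
import math

def settling_time(tau_us, bits):
    """Time (µs) for a first-order system to settle within 1/2 LSB
    of an N-bit converter after a full-scale step."""
    return tau_us * math.log(2 ** (bits + 1))

# Hypothetical front end with a 100 ns time constant, 16-bit accuracy:
t = settling_time(0.1, 16)
print(f"settling time ≈ {t:.2f} µs per channel")
print(f"max accurate scan rate ≈ {1e6 / t:,.0f} channels/s")
```

The formula shows why each extra bit of accuracy costs a fixed increment of settling time, and why shaving the amplifier's time constant translates directly into higher multiplexed scan rates.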
National Instruments realised that most ADCs used on DAQ devices were not customised for instrumentation. The solution was to create a custom programmable-gain instrumentation amplifier (NI-PGIA 2) optimised for fast settling, which makes it possible to achieve full accuracy even at high sampling rates.
Trend 5: More than just a driver
Most device drivers successfully expose hardware functionality, but not all device drivers are created equal. According to a survey performed by National Instruments, setup and software development typically account for more than 50 per cent of engineers' total system costs.
Measurement services, such as data logging programmes and code generation tools, can reduce these development costs. Additionally, every driver should include software that makes it easy to control, manage, test, and calibrate devices.
Trend 6: Adoption of new bus standards
In the early 1990s, Intel's PCI computer bus earned wide industry acceptance and quickly became the standard. Its maximum transfer rate of 132 Mbytes/s promised far higher throughput than traditional stand-alone instruments could achieve, and DAQ devices began capitalising on this benefit, making the case for PC-based instrumentation even more attractive. By riding the wave of rapid PC advancements, PC-based instrumentation continues to benefit from them.
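That peak figure sets a simple bandwidth budget. The arithmetic below is illustrative: it takes the 132 Mbytes/s peak of 32-bit/33 MHz PCI and 16-bit samples, and ignores real-world overheads such as bus arbitration and shared devices, which reduce sustainable rates.

```python
# Rough PCI bandwidth budget (peak figures, overheads ignored).
BUS_BYTES_PER_S = 132_000_000   # 32-bit / 33 MHz PCI peak
SAMPLE_BYTES = 2                # one 16-bit sample

aggregate = BUS_BYTES_PER_S // SAMPLE_BYTES
print(f"peak aggregate: {aggregate:,} samples/s")   # 66,000,000

# How that aggregate rate divides across multiplexed channels:
for channels in (1, 16, 80):
    print(f"{channels:>3} channels → {aggregate // channels:,} S/s each")
```

Even divided across 80 channels, the bus comfortably exceeds the per-channel rates of most multiplexed DAQ devices, which is why the ADC and amplifier, not the bus, are usually the limit on PCI.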
USB, PXI, PCI Express and Ethernet are examples. USB is excellent for portable applications and easy device setup, and the market is flooded with USB DAQ products. The list continues with PXI (PCI eXtensions for Instrumentation) for advanced timing, triggering and synchronisation, and Ethernet for networking distributed devices. Newer standards, including PCI Express, are showing promise.
The M-series DAQ devices pave the way for the next generation of DAQ by incorporating all of these improvements and leveraging these industry trends. Additionally, timesaving software bundled with drivers and online resources will continue to increase productivity. New generations of plug-in DAQ devices, coupled with desktop PC advances, promise to improve the speed of discovery for engineers, scientists and technicians worldwide.
Posted: 8/29/2005