Prior chapters have covered the mechanics of acquiring data from various sources and translating this data into bits and bytes. This chapter will focus on tools available to translate data into worthwhile information. These tools are primarily programming languages and off-the-shelf software packages running under Microsoft Windows and Macintosh operating systems on personal computers.
The steps required to select a data acquisition software program are:
1) determine the features and performance necessary to accomplish the task at hand;
2) determine the available resources of time, money and expertise; and
3) select the tools to perform the task.
Selection of tools requires investigation of available software programs and languages, and of operating systems under which these programs and languages will run.
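The three selection steps above can be sketched as a simple screening exercise: list the candidate packages, filter out any that lack a required feature or exceed the budget, and choose among the survivors. The sketch below is purely illustrative; the package names, feature sets, and prices are hypothetical, not recommendations.

```python
# Illustrative sketch of the three-step selection process.
# All package names, features, and prices are hypothetical.

candidates = {
    "PackageA": {"features": {"trending", "alarming", "logging"}, "price": 1800},
    "PackageB": {"features": {"trending", "logging"}, "price": 950},
    "PackageC": {"features": {"trending", "alarming", "logging", "scripting"}, "price": 2500},
}

def screen(candidates, required_features, budget):
    """Return packages that cover every required feature within budget,
    cheapest first (steps 1 and 2 filter; step 3 picks from survivors)."""
    return sorted(
        (name for name, info in candidates.items()
         if required_features <= info["features"] and info["price"] <= budget),
        key=lambda name: candidates[name]["price"],
    )

qualifying = screen(candidates, {"trending", "logging"}, budget=2000)
```

In practice the screening criteria are rarely this crisp, but writing them down, even informally, forces the feature and budget questions to be answered before any tool is purchased.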
Various versions of Microsoft Windows have come to dominate the operating system market for PC-based data acquisition systems. Many data acquisition software packages are designed to run on more than one operating system, but the Windows operating system is almost always one of the available choices.
At one time the technical performance of Windows was significantly inferior to other more "real-time" systems. More recent versions of Windows, however, have largely ameliorated the real-time problem, either through the operating system itself or through the use of various operating system extensions. These extensions improve the real-time performance of Windows and allow for deterministic scan times.
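To make the notion of a deterministic scan time concrete, the sketch below shows a fixed-interval scan loop that schedules each scan against an absolute deadline rather than a simple relative sleep, so timing error does not accumulate from scan to scan. This is a simplified illustration: on a general-purpose operating system the residual jitter still depends on the scheduler, which is precisely the gap the real-time extensions address. The `read_inputs` callable is a hypothetical stand-in for a hardware driver call.

```python
import time

def scan_loop(read_inputs, period_s, num_scans):
    """Sample read_inputs() at a fixed period, scheduling each scan
    from an absolute deadline so timing error does not accumulate."""
    samples = []
    next_deadline = time.monotonic()
    for _ in range(num_scans):
        samples.append(read_inputs())
        next_deadline += period_s  # absolute deadline, not a relative sleep
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)  # OS scheduling latency limits how exact this is
    return samples

# Hypothetical input reader standing in for a driver call.
readings = scan_loop(read_inputs=lambda: 0.0, period_s=0.01, num_scans=5)
```

The absolute-deadline technique keeps the average scan rate correct even when individual scans are delayed, but only a real-time scheduler can bound the worst-case delay of any single scan.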
As in the office, the OS/2 and Macintosh platforms seem destined to recede in market share, investment, and new software applications. The relative technical merits of PC operating systems were an interesting area for discussion several years ago, but these considerations have receded into the background as much of industry has standardized on the Windows platform.
One big difference remains between office and industrial operating systems. There remains a strong market for low-end, embedded applications in the industrial arena. There is no analogy for this in office computer systems. Original equipment manufacturers (OEMs) with product runs in the tens of thousands need a low cost solution that addresses the task at hand with no wasted overhead. This market continues to be addressed by DOS/386 machines, often in compact form factors such as PC-104 or STD. OEMs with embedded hardware/software solutions often continue to use just enough horsepower to solve the task at hand.
Off-the-shelf software packages provide the data acquisition systems developer with a range of graphical user interface and signal processing tools.
Development Considerations
Once the operating system is selected, the next decision to be made is whether to write custom software or to buy an off-the-shelf package. Writing software is much like cooking dinner, and buying an off-the-shelf package is like eating at a restaurant. If you are good in the kitchen, it may be better to cook. The dinner may be less expensive, although a bit more work, and customization will be easier.
The risks of cooking dinner (writing software) are the very real possibilities of a bad cook (programmer), or bad ingredients (wrong programming platform), or both. Furthermore, it is very difficult to determine if the dinner is bad until the cooking is finished. Only the most experienced master chefs (programmers) can determine the probability of a bad meal (program) early on in the task. After the cooking (programming) is done, it is too late and too expensive to start over. Even the best chef may not be able to rescue a poorly prepared dinner.
The restaurant (off-the-shelf) is a more dependable and initially more expensive approach. Much depends on the reputation and reliability of the restaurant (software vendor). The best restaurants consistently deliver high quality meals, and stand behind their offerings if things go wrong.
Custom software requires the involvement of at least two people at all stages of software development. The application expert must understand the problem and transfer that knowledge to the programmer. The programmer must then write the software and review the completed program with the application expert to close the loop. Training an application expert to be a programmer is rarely feasible because the learning curve is steep and is measured in months or even years.
Off-the-shelf software usually can be configured by an application expert. Training is required, but typically is measured in weeks. This greatly reduces development time because communication problems and delays between the experts and the programmers are eliminated. Expenses also are lower because fewer people are required to complete the project. Figure 6-1 shows some of the relative merits of writing code versus the off-the-shelf approach.
The majority of applications are well suited to off-the-shelf programs. The chief drawback to these programs has been the high cost of software licenses. A full-featured data acquisition program used to cost in the neighborhood of $2,000 (USD), but prices today are closer to $1,000 and continue to fall. These low prices mean that custom software's price advantage is only apparent in rare situations where the software cost is high relative to the total system cost and where product runs are in the thousands.
Off-the-shelf programs score high in customer acceptance. Clients are much more apt to purchase a data acquisition system with name-brand software. In fact, to most clients the software is the system because software is the front end seen day in and day out.
The drawback (from a systems integrator's point of view) to customer acceptance of off-the-shelf software is the necessity of complying with client demands for their specific brands of software. This can make development and support costs prohibitively high, especially if market share is split among many software vendors.
Off-the-shelf programs are at their best in the related areas of support cost, modification cost, and risk of obsolescence. Assuming the right software vendor is chosen, a user can expect regular upgrades. These upgrades should accomplish the following while protecting the investments made in existing application software:
Figure 6-1: Cost of Writing vs. Buying Software

Attribute                 | Write It | Buy It
--------------------------|----------|---------------------------------
Initial cost              | Low      | High
Cost of subsequent copies | None     | Relatively high (but falling)
Customization             | Infinite | Restricted (but improving)
Cost to modify            | High     | Low if desired changes are part of upgrades; very high or impossible otherwise
Cost to support           | High     | Low if upgrades address bugs in a timely manner
Customer acceptance       | Low      | High
Risk of unusable program  | High     | Low if right vendor is selected
Risk of obsolescence      | High     | Low if right vendor is selected
Speed                     | High     | Low
Use of system resources   | Low      | High
Fix bugs: All software has bugs. Good vendors listen to their clients to find these problems and promptly issue bug fixes, or at least workarounds.
Incorporate new features: Each new revision should add important product features.
Provide customer support: Upgrades for most data acquisition software can now be purchased as part of an annual support agreement. These agreements generally require an annual investment of 10-20% of the initial cost of the software. This investment yields regular upgrades and comprehensive customer support including phone hotlines, Internet, e-mail, and newsletters.
Maintain backward compatibility.
One area where off-the-shelf software has traditionally been weak is flexibility for customization. This is changing rapidly as most packages now offer developer toolkits for creation of custom features. These toolkits allow a user to program special functions in a popular programming language such as C++. These functions can then be seamlessly integrated into the application program. Special software functions must be developed to comply with toolkit specifications. This reduces the cost of support because the toolkit specifications standardize software customization.
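The toolkit idea can be illustrated in miniature: the host package publishes a fixed calling contract, and custom functions that conform to it can be registered with and invoked by the host. The `register_function` and `apply_function` names below are hypothetical; real developer toolkits define their own interfaces, typically in C++ as noted above.

```python
# Miniature sketch of a developer-toolkit interface. The hypothetical
# contract: a custom function takes a list of raw samples and returns
# a processed list of the same length.

registry = {}

def register_function(name, func):
    """Register a user-written processing function under the toolkit contract."""
    registry[name] = func

def apply_function(name, samples):
    """Host-side call: run a registered function and enforce the contract."""
    processed = registry[name](samples)
    if len(processed) != len(samples):
        raise ValueError("custom function violated the toolkit contract")
    return processed

# A user-written custom feature: convert 12-bit raw counts to volts.
register_function("counts_to_volts", lambda samples: [s * 10.0 / 4095 for s in samples])

volts = apply_function("counts_to_volts", [0, 2048, 4095])
```

Because every custom function must satisfy the same published contract, the vendor can support customized installations without inspecting each user's code, which is the standardization benefit described above.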
When is the write-your-own approach effective? An OEM with a planned product run of thousands of units would probably choose to write a data acquisition program, especially if the cost of software made up a large percentage of total product cost. The high cost of subsequent copies of off-the-shelf software would tend to outweigh all other factors. Another situation necessitating home-grown software would be an unusual application not addressed by commercially available programs.
Many of these types of applications involve high-speed data acquisition. In these applications, system resources must be concentrated on speed and not on the overhead required to support off-the-shelf programs. Another compelling reason to write your own software is to minimize the use of system resources such as processing power and memory. Many OEMs find that custom-written programs can perform quite adequately on DOS/386 machines with minimal memory. This can significantly reduce the total cost of the data acquisition system.
On the negative side, obsolescence is a major problem. This is because home-grown software must incorporate the latest advances in the field in order to remain competitive. Large software firms with thousands of customers can afford to write such features into the next release of their programs, but firms not in the software business tend to put their development dollars into their primary products instead of ancillary software programs.
Another problem with custom software is the high cost of modification and customer support. Custom programs usually are authored by a small group of programmers, or even by just one person. A firm can become highly dependent on a very small group of employees, and if these employees are no longer available, support can be extremely expensive and difficult. Even when the original author or development team is available, prices may rise dramatically as they come to recognize their negotiating leverage.
Component software claims to address many of the problems associated with support and modification of custom programs. The next section will address the problems and promise of this new methodology.