Copyright © Romilly Bowden 1997, 1998.
Now only of historical interest! - November 2004
It's now quite widely understood that many computer systems and software packages will be upset by the arrival of the 21st century in just over a year. The problem arises from the common habit of using only two digits to represent the year, assuming "19" for the hundreds digits. Obviously, this assumption will be wrong from 1 January 2000; indeed it may well be wrong before that, for applications which look ahead into the future. The two-digit representation may be used internally to store dates in memory or disk files, or it may simply arise in the format used to display and enter dates. In either case, a human user might well be able to resolve the ambiguity, but a computer program probably won't, at least not without some new rules to guide it.
There are two straightforward ways to fix these problems. You can re-write affected parts of a program to use 4-digit years, or you can add a new rule to interpret two-digit years using a "pivot year" (for example: any number xx above 50 is "19xx", while 50 or lower is "20xx"). The second method only postpones the problem, but that may be acceptable, depending on the expected life of the product. Using a variable pivot year, depending on the actual date, may also help.
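The pivot-year rule can be sketched in a few lines of C. Note that the function name and the pivot value of 50 are just the example values from the paragraph above, not part of any standard:

```c
#include <assert.h>

/* Expand a two-digit year using the fixed pivot rule described above:
   any value above the pivot is taken as 19xx, the pivot or lower as
   20xx. (Illustrative only; the pivot of 50 is the example from the
   text.) */
int expand_year(int yy, int pivot)
{
    return (yy > pivot) ? 1900 + yy : 2000 + yy;
}
```

With a pivot of 50, 97 expands to 1997 and 03 to 2003. A "variable pivot year" simply means computing the pivot from today's date instead of hard-coding it.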
Of course, the hard part is to find and understand the operation of the program code, especially in older or poorly-documented software.
For a fuller discussion of the millennium problem (otherwise known as "Year 2000" or "Y2k") look at the DTI information page at http://web.archive.org/web/20000511090455/http://www.bsi.org.uk/disc/year2000.html (WayBack Machine Archive), or the "Year 2000 Information Center" website at http://web.archive.org/web/19991113040046/http://www.year2000.com/ (WayBack Machine Archive).
On the whole, industrial process measurement and control systems are less likely to be seriously affected than business systems, which may depend heavily on the use of dates. However, we cannot afford to ignore the problem.
So, what about the HART protocol? Is HART itself "millennium-proof"? The short answer is YES, HART IS OK. But read on ...
HART does define a format for dates, which is used in Universal HART commands #13 and #18. Every HART device must store a date, in order to implement these commands. The protocol specification doesn't say what this date should be used for, only that it must exist. Typical uses are to record the date of the last instrument calibration, or the date of the next one; this is entirely at the whim of the host application programmer.
The HART date format consists of a set of three 8-bit numbers (that is, numbers which could be anything from 0 up to 255) representing the day of the month, the month of the year, and the year minus 1900.
Happily, because the year byte represents "year minus 1900", this scheme will work unambiguously until 31 December 2155. Dates represented in this way are not affected by the "millennium problem". (Personally, I'm not too worried about what happens in 2156. Maybe HART will not still be in use by then. At least, I don't expect to be around to worry about it!)
However, just because HART is OK as a communication protocol doesn't necessarily mean that products using HART are OK. It is true that most of today's field instruments have no concept of date and time, so will not be affected. But a few instruments, and a good proportion of host application software, could be affected - particularly those which include datalogging or configuration management functions.
Instruments may include dates stored in proprietary formats. Use of the HART date format is not mandatory for internal functions or other HART commands. As a pure "data carrier", the HART protocol can convey any data in device-specific commands. And a field device may be adversely affected if its embedded software uses 2-digit date representations, either in internal calculations or in local display or data entry for an operator.
In practice, however, it is much more likely that host computer applications will have problems, depending on how they interpret and display dates, and how they accept operator input. Use of only two digits in any of these areas could lead to ambiguous or wrong interpretation (for instance, telling you that an instrument doesn't need calibrating for another 99 years!).
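The "99 years" example can be made concrete: a host that subtracts raw two-digit years to work out a calibration interval goes wrong at the rollover, while four-digit arithmetic does not. Both function names here are invented for illustration:

```c
#include <assert.h>

/* Years until the next calibration, computed naively from two-digit
   years: correct until 1999, then badly wrong (or, with an unsigned
   wrap-around, "not due for another 99 years"). */
int naive_years_to_cal(int this_yy, int due_yy)
{
    return due_yy - this_yy;
}

/* The same calculation with four-digit years is always right. */
int years_to_cal(int this_year, int due_year)
{
    return due_year - this_year;
}
```

For a device due for calibration in 2000 and checked in 1999, the naive version computes 00 - 99 = -99, while the four-digit version gives 1.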
As a device designer, you should:
If your existing devices or applications do not meet these criteria, make sure your customers understand the implications.
As a user, you should:
* All but the most recent PCs may have built-in millennium problems:
Application software may make use of any of these functions.
Your software application supplier cannot be expected to overcome these basic PC hardware and firmware problems. Unless they also supplied the PC (and possibly even then!), that is likely to be your responsibility. Your PC supplier may be able to help you. If you want to know more, visit http://www.rightime.com for details, and for test programs.
The Rosemount HART Communicators do not contain a real-time clock, and do not use the HART date themselves. But they can be used to enter or display dates from a field device, and in the past, both used a 2-digit year field.
The 268 does not correctly deal with dates beyond 31 December 1999, and will not be upgradable to overcome this.
The 275 can be upgraded with a new revision of its operating software, allowing 4-digit date display and entry, for complete Year 2000 compliance. To find out whether your 275 needs this upgrade, go to http://web.archive.org/web/20001101154742/http://www.rosemount.com/y2k/upgrade275.html (WayBack Machine Archive).
If you don't care about date display and entry, you should be able to continue using your old communicators indefinitely.