In the early days of computer software development, programmers used a two-digit representation for the year in a date to save memory (e.g., 61 instead of 1961). This will create, and in some cases is already creating, serious problems, as affected systems interpret the year 2000 as 1900. In this paper, we clarify the problem and survey existing methods for finding, fixing, and testing the accuracy of possible solutions. We evaluate these techniques and some existing software products that address the problem, and we give a real-life example of how the problem was solved at an institution of higher learning.
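The failure mode described above, and one common remediation technique known as "windowing," can be sketched in a few lines. This is an illustrative sketch, not code from the paper; the function names and the pivot value of 50 are assumptions chosen for the example:

```python
def age_two_digit(birth_yy, current_yy):
    """Naive age computation on two-digit years: breaks across 2000."""
    return current_yy - birth_yy

def expand_year(yy, pivot=50):
    """Windowing fix: map a two-digit year into a 100-year window.
    Years below the (illustrative) pivot are read as 20xx, others as 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

# Someone born in 1961, observed in the year 2000 (stored as 00):
naive = age_two_digit(61, 0)               # 0 - 61 = -61, clearly wrong
fixed = expand_year(0) - expand_year(61)   # 2000 - 1961 = 39
```

Windowing avoids expanding stored dates to four digits, but it only defers the problem: any year outside the chosen 100-year window is again misinterpreted.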
Number of pages: 13
Journal: Kuwait Journal of Science and Engineering
Publication status: Published - Dec 1 1999
- Code understanding
- Year 2000