Talk:2697: Y2K and 2038
Y2K issues solved back in 1996. Even wrote a letter to the Board of Trustees. 2038 problems are not my concern. Retired 9/30/2022. 172.70.110.236
- Many of the people who helped solve the Y2K problem were pulled out of retirement. Lots of the issues were in old COBOL software, and there weren't enough active programmers who were competent in COBOL. So keep your resume ready. Barmar (talk) 20:07, 11 November 2022 (UTC)
This is so weird; I just finished a research assignment on the Y2038 problem. 172.71.166.223 18:27, 11 November 2022 (UTC)
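For anyone new to the topic, here is a minimal sketch of where the 2038 limit comes from; the wrap-around to 1901 assumes a signed 32-bit time_t:

    from datetime import datetime, timedelta, timezone

    EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

    # A signed 32-bit time_t counts seconds since the Unix epoch,
    # so it tops out at 2**31 - 1 seconds.
    max_time_t = 2**31 - 1
    print(EPOCH + timedelta(seconds=max_time_t))  # 2038-01-19 03:14:07+00:00

    # One second later the value wraps to -2**31, which lands in 1901.
    print(EPOCH + timedelta(seconds=-2**31))      # 1901-12-13 20:45:52+00:00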
Somewhere there is an essay about the unexpected synergy between the Y2K bug and the burgeoning open source movement, which may or may not be useful for the explanation. 172.70.214.243 20:18, 11 November 2022 (UTC)
- https://www.livehistoryindia.com/story/eras/india-software-revolution-rooted-in-y2k is a fascinating essay too. 172.70.214.151 21:03, 11 November 2022 (UTC)
- I wouldn't be surprised if there's such an essay, but I suspect it's more of a coincidence. The late '90s was also when the Internet was really taking off, and that may be more of a contributor. Barmar (talk) 23:04, 11 November 2022 (UTC)
- All involved what epidemiologists call coordinated or mutually reinforcing causes, IMHO. 172.71.158.231 01:41, 12 November 2022 (UTC)
Speaking of which, what comes after Generation Z? Generation AA? ZA? Z.1? Help! 172.70.214.243 07:24, 12 November 2022 (UTC)
- Generation Alpha 172.69.34.53 07:27, 12 November 2022 (UTC)
- Zuckerberg's Army. --Lupo (talk) 15:18, 12 November 2022 (UTC)
I've been unable to confirm this so I'm moving it here: A major problem had struck IBM mainframes on and after August 16, 1972 (9999 days before January 1, 2000) that caused magnetic tapes that were supposed to be marked "keep forever" to instead be marked "may be recycled now."[actual citation needed] 172.71.158.231 07:37, 12 November 2022 (UTC)
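The date arithmetic at least checks out, which says nothing about the tape story itself:

    from datetime import date, timedelta

    # 9999 days before 2000-01-01 is indeed August 16, 1972.
    print(date(2000, 1, 1) - timedelta(days=9999))  # 1972-08-16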
Does the arrow move over time? ... should it? (I think so!) It could be done server-side, and only regulars would see that it changes over time. Then... perhaps we could see different versions of the strip cached on the Internet. --172.71.166.158 08:30, 12 November 2022 (UTC)
- It isn't, of course, but if it were a .GIF with ultra-long frame-replacement cycles then only those who kept the image active would see the arrow move in real time. (It would reset to the current "now" upon each (re)loading, so it would have an even more exclusive audience, aside from those who cheat with image(-layer) editing. ;) ) 172.70.162.57 13:32, 12 November 2022 (UTC)
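For what it's worth, a sketch of the server-side idea, assuming the arrow simply tracks the fraction of the way from Y2K to the 2038 rollover (both the endpoints and the linear scale are assumptions):

    from datetime import datetime, timezone

    Y2K = datetime(2000, 1, 1, tzinfo=timezone.utc)
    Y2038 = datetime(2038, 1, 19, 3, 14, 7, tzinfo=timezone.utc)

    def arrow_fraction(now: datetime) -> float:
        """How far along the Y2K-to-2038 timeline the 'you are here' arrow sits."""
        return (now - Y2K) / (Y2038 - Y2K)

    # Around the comic's publication date the arrow would sit roughly 60% along:
    print(round(arrow_fraction(datetime(2022, 11, 11, tzinfo=timezone.utc)), 3))  # 0.601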
Should we mention that it is that specific year in a specific calendar? As far as I know there was also fear of a similar bug in Japan recently. However, Wikipedia seems not to be up to date about it. --Lupo (talk) 15:18, 12 November 2022 (UTC)
Does anyone know of an actual program or OS that stored the year as two characters instead of a single byte? I have (and had back then) serious doubts that any problems existed. Even the reported government computers had people born prior to 1900 entered, so they already had to have better precision than "just tack on 1900." Even using a single signed byte would still have been good for another 5 years from now. SDSpivey (talk) 17:22, 12 November 2022 (UTC)
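For concreteness, a sketch of the two encodings being compared; the helper functions are invented for illustration, not taken from any particular system:

    def year_from_two_chars(yy: str) -> int:
        """Two ASCII digits with an implicit 19xx century -- the classic Y2K trap."""
        return 1900 + int(yy)  # "99" -> 1999, but "00" -> 1900, not 2000

    def year_from_offset_byte(b: int) -> int:
        """A single signed byte holding (year - 1900), good for 1772..2027."""
        assert -128 <= b <= 127
        return 1900 + b

    print(year_from_two_chars("00"))   # 1900 -- a 100-year error from 1999 onward
    print(year_from_offset_byte(127))  # 2027 -- this scheme runs out later, as noted above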
- In my experience (I lived and worked through the Y2K preparations) it wasn't so much "an actual program", or necessarily a fundamental limitation of an entire OS (though the roots of the problem effectively date back to key decisions surrounding the development of the IBM System/360 in the 1960s), but a matter of how data was held in a human-readable but space-saving format. Someone in the '70s (or even up into the '90s) may have decided their system could store a date as the six characters representing DDMMYY (or any of the other orderings), secure in the knowledge that the century digits were superfluous; keeping them would perhaps have pushed the footprint of a standard record over some handy packable length for the system, say 128 bytes. Which was a lot in those days. (A sketch of such a record layout follows at the end of this comment.)
- (If the year value had been recorded in 16-bit binary, or even 2×7-bit or doubled 6-bit fields, it could have been just as good for the computer, but oh, the fuss to convert to and from a human-oriented perspective. And it worked neatly enough, right?)
- And a useful implementation might be used, in some form or other, for a long time... Sometimes the storage system is upgraded (kilobytes? ha, we have megabytes of space now!) and the software to handle it might be ported and even rewritten, but at each stage the newer data has to match the old program, and the new program has to read and write the current data, however kludged it actually is. And it works, at least under the care of those who dabble in the dark arts of its operation. And not many others are bothered, or even have any idea of what lies beneath the surface.
- Until somebody starts to audit the issue and asks everyone to poke around and check things... Then things get sorted in-situ, or a much-needed (YMMV!) change of process is swapped in, in the place of old and (possibly) incorrect hacks. 172.69.79.133 20:00, 12 November 2022 (UTC)
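As flagged above, a sketch of the kind of fixed-width record being described; the 32-byte layout, field names, and sizes are all invented for illustration:

    import struct

    # A fixed-width record with the date held as six human-readable characters,
    # DDMMYY, century implied. The layout (20-byte name + 6-byte date + padding)
    # is made up for this example.
    RECORD = struct.Struct("20s6s6s")  # 32 bytes in total

    def pack_record(name: str, day: int, month: int, year: int) -> bytes:
        yy = year % 100  # the space-saving step: the century digits are dropped
        return RECORD.pack(name.encode().ljust(20),
                           f"{day:02d}{month:02d}{yy:02d}".encode(),
                           b" " * 6)

    def read_year(record: bytes) -> int:
        _, ddmmyy, _ = RECORD.unpack(record)
        return 1900 + int(ddmmyy[4:6])  # the implicit century, baked into every reader

    rec = pack_record("INVOICE-0042", 1, 1, 2000)
    print(read_year(rec))  # 1900 -- a record written in 2000 reads back as 1900

Widening the date field to eight characters makes every record two bytes longer, which is exactly the new-data-versus-old-program mismatch described above.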