
Scoffers of the 2012 phenomenon take note

page: 5

posted on Jul, 30 2012 @ 09:59 AM
reply to post by r2d246
reply to post by infiniteobserver
reply to post by litterbaux

The Y2K bug... What was it? Was it the end of the world, or a proposed end to mankind? No, but it was perceived as such. All that the Y2K bug encompassed was a programming error. Before computers were mainstream devices, before every house had at least one computer (or computer-like device, e.g. a smartphone or tablet), and before Microsoft held the market share, computers were expensive, large, and slow. The amount of memory allotted to a running program really mattered.

As I have explained many, many times before, at the level of programming we are discussing, saving memory was the ultimate aim, because every byte saved directly affected overall system performance. (Programmers still try to save memory, but not to the extent they did back then, which is a loss: that relaxation leads to bloated code and inefficient methods. But those are not the aim of this discussion.)

So what was this programming error? Quite simply, it was an error caused by the choice of storage size for a variable: instead of storing the full four-digit year in an int16 or uint16, a programmer could save space by storing only the two digits needed for the year in an int8 or uint8. This saves memory and storage because an int8 or uint8 uses only 8 bits (1 byte) to store the year, whereas an int16 or uint16 uses 2 bytes.

Why does this matter? Well, to you and me, differentiating between 76 and 1976 simply falls into the realm of contextual representation. The context comes from knowing which century is being discussed, or assuming the year belongs to the current one. For example, if I were to say, "in 76 the Cubs will win the pennant," you would take me to mean 2076 rather than 1976 (by my usage of *will*, the future tense). A computer, however, has no way to know whether I mean 1976, 2076, 2176, or any other number ending in 76. By the same token, in the year 2000, would the computer treat the two-digit year as meaning 1900, 2000, 500, or 100? That was the question that caused the Y2K bug, or glitch. The glitch was very evident in software that was never updated (for example, JavaScript famously represented the year 2000 as 19100).

So what happened? Many nights were spent by many programmers (myself included) rewriting and fixing code to ensure that nothing happened: testing, rewriting, and more testing, up to and beyond Y2K. This is why I got a real chuckle when Y2K-itis ran rampant in advertising. For example: "You can buy this couch, which is Y2K compatible, for only 699.99." It was a storm of this and that, most of it not even affected by the Y2K glitch.

So, in conclusion, Y2K was a computer programming bug/glitch that was perceived as the end of the world; it was never proposed as the end of the world. Would it have made life more difficult? Potentially, yes. But thanks to the efforts of millions of individuals working countless long nights and days, it did not.

12/21/2012, on the other hand, is only a perceived danger in some charlatans' minds.

-saige-


