The technical issue seems to lie with the way that Microsoft was naming updates for its malware-scanning engine, putting the year, month, and day (220101) at the front of another four-digit number (0001)... The problem appears to be that the field this number was stored in had a limit of 31 bits.
This is what happens when a "tech reporter" doesn't understand tech.
yyyymmddnnnn, the human-readable format described in the article, needs 38 bits to store as an integer. That means either the format described by the reporter is wrong, or the number of bits reported is wrong. To fit into 31 bits it has to be two orders of magnitude smaller: yyyymmddnn.
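For anyone who wants to check the arithmetic, here's a quick sanity check in C, assuming "31 bits" means the value range of a signed 32-bit integer and using the article's example values (the layouts are my reading of the article, not Microsoft's actual code):

/* Quick check of the numbers, assuming the field is a signed 32-bit
 * integer, i.e. 31 value bits. */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int64_t yymmddnnnn   = 2201010001LL;   /* the article's example: 220101 + 0001      */
    int64_t yyyymmddnn   = 2022010100LL;   /* same date, two fewer trailing digits      */
    int64_t yyyymmddnnnn = 202201010001LL; /* four-digit year + four-digit counter      */

    printf("INT32_MAX    = %ld\n", (long)INT32_MAX);                             /* 2147483647 */
    printf("yymmddnnnn   fits: %s\n", yymmddnnnn   <= INT32_MAX ? "yes" : "no"); /* no  */
    printf("yyyymmddnn   fits: %s\n", yyyymmddnn   <= INT32_MAX ? "yes" : "no"); /* yes */
    printf("yyyymmddnnnn fits: %s\n", yyyymmddnnnn <= INT32_MAX ? "yes" : "no"); /* no  */
    return 0;
}

The two-digit-year number the article actually quotes (2201010001) blows past INT32_MAX too, which is presumably what broke on January 1st.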
This bug is even more egregious than the article suggests. The format of their update naming scheme is yyyymmddnn. In other words, there are 100 possible names for any given date. 31 bits gives us 2,147,483,648 possible update names. That's enough room for about 58,800 years' worth of update names. The real bug isn't that the field was limited to 31 bits, but that they stored the date in human-readable format, which is about the most inefficient way possible to use those 31 bits. The consequences were easily foreseeable by any moderately competent programmer.
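A minimal sketch of what a sane use of those bits could look like, purely hypothetical and nothing to do with Microsoft's actual scheme: days since some epoch in the high bits, a per-day build counter in the low bits.

/* Hypothetical packing: days since an epoch in the high bits, per-day
 * build counter in the low bits. Stays within 31 bits as long as the day
 * count fits in 21 bits (~5,700 years of headroom). */
#include <stdint.h>

#define SEQ_BITS 10u   /* up to 1,024 builds per day */

static uint32_t pack_version(uint32_t days_since_epoch, uint32_t seq)
{
    return (days_since_epoch << SEQ_BITS) | (seq & ((1u << SEQ_BITS) - 1u));
}

static void unpack_version(uint32_t v, uint32_t *days_since_epoch, uint32_t *seq)
{
    *days_since_epoch = v >> SEQ_BITS;
    *seq              = v & ((1u << SEQ_BITS) - 1u);
}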
The "senior programmer" above me pulled this kind of bullshit on me when I was new. He ain't ignorant, knows better, he just didn't seem to give a shit. Probably just didn't care because everything he gets to work on is blazing fast ARM SOC-type shit, whereas 90% percent of the work I got at the time was on 8-bitters generally running at somewhere around 16 MHz or worse.
So when I had to interface with his supercomputer shit using the protocol and driver he came up with, I got to crunch on decoding ASCII and then grind through floats, things that are not exactly synonymous with "8-bit microcontroller" and "fast". He gave me some shit code to use, built on slow C std library functions, and expected me to make my side respond within 62 ms. Asshole questioned the efficiency of my protocol handler and peripheral driver code on my side, instead of considering that maybe it was his stupid protocol and driver.
Had to bend over backwards for a month completely rewriting the protocol driver he gave me... NOT my app or hardware code. Couldn't change what was already in the stupid existing protocol I had to dance around, but at least they let my version of the driver replace his when they saw how damned fast it was, and in totally portable C. Blew that spec time completely out of the water. It could now also handle binary data, which let us expand the protocol in a way that was desperately needed. Yeah, I was utilizing the micro's hardware behind the scenes to its fullest, but one should be doing that anyway at all times.
Without changing the app/hardware code on my side: response time with his protocol driver = ~90 ms, with my driver = ~20 ms. No exaggerations, no embellishments.
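To give a feel for the kind of difference being described, here's a minimal sketch of one hypothetical message field, not the actual protocol or driver: parsing an ASCII float with the C std library versus reading a pre-scaled integer straight off the wire.

/* Hypothetical single message field, for illustration only. */
#include <stdint.h>
#include <stdlib.h>

/* ASCII route: the value arrives as text like "12.34" and gets parsed with
 * the std library, dragging float support into an 8-bit target. */
static float decode_ascii_field(const char *text)
{
    return (float)strtod(text, NULL);
}

/* Binary route: the same value arrives as a little-endian int16, already
 * scaled by 100 on the sender, so decoding is two loads and a shift. */
static int16_t decode_binary_field(const uint8_t *bytes)
{
    return (int16_t)((uint16_t)bytes[0] | ((uint16_t)bytes[1] << 8));
}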
I guess what I'm getting at here is that many coders just don't give a flying fuck that their shit rolls downhill, because it's easy and convenient for them to do so. And I wanted to take the opportunity to insert some bragging.
Most programmers don't give two shits. I used to work at a company that made enterprise employee time management hardware and software, e.g. time clocks and payroll software. One feature of their system was that you could have authenticated users clock in and out from their desktop. The hardware and desktop software recorded the in/out times as a SQL command sent to a central server, but the username and password were stored in plaintext on each client machine. Worse still, no details about the time entry were stored in the database: no originating IP address, no client MAC, no nothing. It literally just wrote the date and time directly to a field.
When I reported what a tragically horrible bug this was, nobody was interested. They said, "nobody will go to the trouble." So guess what I did? I made a little Perl script that would punch me in and out at work even if I wasn't there. They had no way to detect what was happening. To avoid it being too obvious I just set it to enter a random time that was plus or minus 7 minutes from my start and end time. It worked beautifully. I never manually clocked in again as long as I worked there.
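For anyone wondering why it was undetectable, the write boiled down to something like this (table and column names are my guesses, not the vendor's actual schema): the client supplies its own timestamp over credentials stored in plaintext on that machine, and the one column it writes is the entire audit trail.

/* Rough sketch of the design described above; names are guesses, not the
 * vendor's schema. The client builds the SQL itself and supplies its own
 * timestamp; the server records nothing about where the row came from. */
#include <stdio.h>

static void punch_clock(const char *employee_id, const char *client_timestamp)
{
    char sql[160];
    snprintf(sql, sizeof sql,
             "INSERT INTO punches (employee_id, punch_time) VALUES ('%s', '%s');",
             employee_id, client_timestamp);
    /* ...sent to the central server using the plaintext credentials stored
     * on this client; no source IP, no client ID, no server-side clock. */
}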
That's cool. I'd probably forget to turn it off on a sick day and get caught.