That reminded me of something. When I was young, if I remember correctly, Windows 95 (if not 98) had this weird behavior where, when installing programs, wiggling the mouse cursor made the progress bar move faster. What caused this? I googled for it but couldn't find anything about it.
I had no idea this was a thing, and the explanation for it… makes sense, strangely enough.
My memory is vague, but nevertheless I distinctly remember Windows 9x using user interaction to adjust the priority of 16-bit software. The 16-bit virtualization in Windows didn't know when DOS applications needed CPU time. DOS apps frequently poll for events under the assumption that they're running continuously in real time, so the Windows devs implemented heuristics that could work with 16-bit polling code most of the time. After user input events, the 16-bit code would be given CPU time to process those events, but then it would revert to spurts of intermittent CPU. This allowed most DOS applications to respond to input quickly while keeping them from monopolizing the CPU all the time.
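To make the heuristic concrete, here's a toy simulation (in Python, obviously not actual Windows code) of the boost-then-throttle behavior described above. All the names and numbers (`BOOST_SLICES`, `IDLE_PERIOD`) are illustrative assumptions, not values from the real scheduler:

```python
BOOST_SLICES = 8  # full slices granted right after an input event (assumed value)
IDLE_PERIOD = 4   # when idle, the task runs only 1 slice in every 4 (assumed value)

def slice_granted(ticks_since_input, tick):
    """Decide whether the 16-bit task runs during this scheduler tick."""
    if ticks_since_input < BOOST_SLICES:
        return True                  # boosted: run every tick after input
    return tick % IDLE_PERIOD == 0   # throttled: intermittent spurts of CPU

def simulate(input_ticks, total_ticks):
    """Count how many ticks the task actually runs, given input events."""
    runs = 0
    last_input = float('-inf')
    for tick in range(total_ticks):
        if tick in input_ticks:
            last_input = tick
        if slice_granted(tick - last_input, tick):
            runs += 1
    return runs

# Wiggling the mouse (frequent input events) keeps the task boosted:
busy = simulate(input_ticks=set(range(0, 100, 5)), total_ticks=100)  # → 100 slices
idle = simulate(input_ticks=set(), total_ticks=100)                  # → 25 slices
```

With input every 5 ticks the task never leaves the boost window, so it runs on all 100 ticks; with no input it only gets every fourth tick. That's the "shake the mouse and the installer speeds up" effect in miniature.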
More specifically, when using Impulse Tracker (a DOS-based mod music tracker) under Windows, the interface would refresh as the music played in real time as long as you held down a key (it could even be a Shift key). But if you didn't interact with it, the DOS screen would stall and lag under Windows. Windows may have had a setting to boost the CPU for these apps. Interestingly, the audio was not affected, since the DOS player used interrupts, which don't require polling and worked fine under 16-bit virtualization.
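A rough analogy (in Python threads, not actual DOS interrupts) for why the interrupt-driven audio kept playing while the polled UI stalled: the timer-driven callback below keeps firing on its own schedule, no matter how starved the "UI" loop is. Everything here is an illustrative sketch, not how the real player worked:

```python
import threading
import time

audio_ticks = []
stop = threading.Event()

def audio_interrupt():
    # Stands in for the hardware timer interrupt feeding the sound driver:
    # it keeps firing regardless of what the foreground code is doing.
    while not stop.is_set():
        audio_ticks.append(1)
        time.sleep(0.005)

threading.Thread(target=audio_interrupt, daemon=True).start()

# The "UI": a polled loop that gets starved of CPU (modeled here as a
# long sleep), just like the DOS screen without user input.
ui_updates = 0
time.sleep(0.1)   # UI gets no chance to poll for a while...
ui_updates += 1   # ...then finally polls once.

stop.set()
```

After the run, `audio_ticks` has accumulated many entries while the UI managed a single update, which mirrors the stalled Impulse Tracker screen over uninterrupted audio.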
The interesting thing here is that every technical answer I'm reading about this is just an educated guess about timers, events, queues, and backwards compatibility. Yet every single user I know who is dealing with a slow or hanging system immediately and vigorously shakes the mouse, hoping that will bring back the blood flow of the system.
I swear this still works sometimes in OS X.
I can vouch for the event-loop hypothesis. In the late '90s I worked at a software firm that had top-tier MSDN access. One day I asked our R&D department manager (hi Nate!) why moving the mouse sped up one of our published programs, thinking I might have found a bug in the UI. His response was exactly what the article describes.
And then, in a "ha ha I'm better than you" display, he showed me how the behavior was mitigated on his Windows NT Workstation desktop. So I asked him why Microsoft couldn't adjust the Windows 9x event loop manager to behave the same way. His response, in so many words, was, "They will, when they make the NT kernel the standard on their desktop OSes."
I kind of liked these mid-80s-to-late-90s OS quirks. It made you feel closer to the machine.
kurkosdr,
I concur. IMHO there was never a better time to get into computing to learn bare metal programming. These days it's much harder since there's no DOS equivalent and there are more restrictions and barriers to entry. Technically one might still access bare metal by writing bootloader code today, but that's a much larger gap than writing simple .com files, which were perfect for learning the basics. A beginner might try their hand at microcontrollers, which haven't changed much since the 90s at least. But I personally didn't learn on microcontrollers; that only came after I learned on a real PC. On mobile devices, where you have to flash your bare metal code onto a chip that's soldered into the mainboard with inadequate standards, there's a new risk of bricking hardware that just wasn't a problem back in the 90s.
So I consider myself fortunate for having been around when computers were less restrictive and more conducive to learning the bare metal fundamentals as a young programmer without formal training.