Some people forget that the late 90s were a crazy time for game development. Games were suddenly getting exponentially more complex before the hardware had evolved to keep up. That’s why things like the fast inverse square root and Mario’s parallel universes exist (Super Mario 64 stores Mario’s position as floats for smooth movement, but its collision code truncates them to 16-bit integers for speed). Nowadays it’s hard to imagine computers failing to understand floating point math, but until that era many PCs didn’t have an FPU to begin with, much less one fast enough to provide enough inverse square roots for 3D lighting before the heat death of the universe. “Bad” programming and mind-boggling decisions (from a modern perspective, anyway) aren’t just hacks. They made games what they are.
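(For the curious, here’s a toy version of the parallel-universes quirk - not actual N64 code, just the shape of the bug; the int-to-short conversion is technically implementation-defined in C, but it wraps on basically every real machine:)

```c
#include <stdio.h>

/* Toy sketch of the SM64 "parallel universes" quirk: position lives in
 * floats, but collision truncates it to a 16-bit short, which wraps
 * every 65536 units -- so a float position one "universe" over lands
 * back inside the original map as far as collision is concerned. */
int main(void)
{
    float mario_x = 65536.0f + 100.0f;       /* far outside the map */
    short collision_x = (short)(int)mario_x; /* what collision "sees": wraps to 100 */
    printf("float x: %.1f, collision x: %d\n", mario_x, collision_x);
    return 0;
}
```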
I’m quite fond of fast inverse square root as it’s simultaneously both an excellent argument for C and an excellent argument against C
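For anyone who hasn’t seen it, this is the version that shipped in Quake III Arena, lightly tidied (the original did the bit reinterpretation with a pointer cast, which is undefined behavior in standard C; memcpy does the same thing legally - which is sort of the for-and-against argument in one function):

```c
#include <stdint.h>
#include <string.h>

float Q_rsqrt(float number)
{
    float x2 = number * 0.5f;
    float y  = number;
    uint32_t i;

    memcpy(&i, &y, sizeof i);    /* reinterpret the float's bits as an integer */
    i = 0x5f3759df - (i >> 1);   /* the magic-constant initial guess */
    memcpy(&y, &i, sizeof y);    /* back to float */

    y = y * (1.5f - x2 * y * y); /* one step of Newton's method */
    return y;                    /* ~1/sqrt(number), good to well under 1% */
}
```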
When I got to college in the early 2000s I was all excited to take CS courses because I (like, probably, a lot of the people the intro level was deliberately there to weed out) was like
Programming: The Discipline That Makes Games Like Final Fantasy 6!
except the class was all
Computer Science: A Quest For The Most Elegant Recursive Math!
in fairness the introductory programming course did introduce me to the fact that that was kinda true. I came to realize that, from a programming standpoint, the interesting part of FF6 was
* producing a smooth composite image out of multiple animated layers and sprites, which can change position independently of each other due to scripting or player input (something like the toy compositor after this list)
and
* compressing the thing to fit into cartridge memory
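(Purely to illustrate that first bullet - this is nothing like the SNES PPU, which does its compositing per scanline in dedicated hardware, and Layer, TRANSPARENT, and the dimensions are all made up for the sketch:)

```c
#include <stdint.h>

#define SCREEN_W    256
#define SCREEN_H    224
#define NUM_LAYERS  4
#define TRANSPARENT 0   /* palette index 0 reads as "see-through" */

/* hypothetical layer: a pixel buffer plus its own scroll offset,
 * assumed non-negative here to keep the modulo math simple */
typedef struct {
    uint8_t pixels[SCREEN_H][SCREEN_W];
    int scroll_x, scroll_y;
} Layer;

/* naive back-to-front composite: each layer paints over the ones
 * beneath it wherever its pixel isn't the transparent color, so
 * scrolling one layer never disturbs the others */
void composite(uint8_t out[SCREEN_H][SCREEN_W], const Layer layers[NUM_LAYERS])
{
    for (int l = 0; l < NUM_LAYERS; l++)
        for (int y = 0; y < SCREEN_H; y++)
            for (int x = 0; x < SCREEN_W; x++) {
                int sx = (x + layers[l].scroll_x) % SCREEN_W;
                int sy = (y + layers[l].scroll_y) % SCREEN_H;
                uint8_t p = layers[l].pixels[sy][sx];
                if (l == 0 || p != TRANSPARENT)
                    out[y][x] = p;
            }
}
```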
And that’s not really something that much interested me. The people who followed that trail all the way through ended up writing absurdly complex algorithms they couldn’t fully grasp the function of but, going by the requirements, assumed must be for NSA phone tapping; though I hear if you got far enough you could sell out and write absurdly complex algorithms you couldn’t fully grasp the function of but that were used for investment bank derivatives trading.
In retrospect those are two hilariously 2005-ass things right there, I wonder what it’s like now.
But then the hardware did evolve, and you had big-league PC gaming chasing video cards down a rabbit hole, but I think what’s equally interesting is how today is the reverse of OP’s setting - just an overabundance of processing power. And how, ironically, that’s allowed the return of the small-team/lone-genius production that typified ‘90s gaming but has fallen by the wayside in today’s huge multi-studio AAA titles.
Like, Minecraft and Dwarf Fortress: two of the most famous self-published games, and two of the only ones to make names - Notch and Toady - on the level of the Carmack/Romero/Newell/Molyneux/Wright/Garriott age. They have absurdly behind-the-curve graphics but are still pretty taxing, because they run on naive, un- or weakly-abstracted engines that work by checking and calculating an absurd number of variables for the contents of every cubic foot of the world on every tick, while ticking often enough to maintain some playability.
And I mean, despite that they are quite playable. The underlying CPU architecture improved to the point that a lone, self-taught Toady can spend his time expanding the flavor and scope of the engine - making the calculations more numerous and byzantine - while the hardware keeps pace. (The biggest performance optimization I can recall is the ability to neuter cats.)
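To make the “every cubic foot, every tick” point concrete, here’s a toy loop in that spirit - pure invention on my part (Dwarf Fortress is closed source), with made-up Cell fields and world dimensions, just to show where the cost comes from:

```c
#include <stddef.h>

/* Toy "naive voxel engine" tick: visit every cell in the world, every
 * tick. Purely illustrative -- the point is the loop shape, not the
 * rules inside it. */

#define WORLD_X 200
#define WORLD_Y 200
#define WORLD_Z 100

typedef struct {
    short temperature; /* DF famously simulates temperature per tile */
    short fluid_level;
    short material;
} Cell;

static Cell world[WORLD_X][WORLD_Y][WORLD_Z];

void tick(void)
{
    /* 200*200*100 = 4 million cells touched per tick; at even 10 ticks
     * per second that's 40 million cell updates a second before anything
     * interesting (pathfinding, creatures, cats) happens at all */
    for (size_t x = 0; x < WORLD_X; x++)
        for (size_t y = 0; y < WORLD_Y; y++)
            for (size_t z = 0; z < WORLD_Z; z++) {
                Cell *c = &world[x][y][z];
                /* stand-ins for the real per-cell rules */
                if (c->fluid_level > 0 && z > 0)
                    c->fluid_level--;  /* pretend gravity */
                if (c->temperature > 0)
                    c->temperature--;  /* pretend cooling */
            }
}
```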