Actually, what just occurred to me is a more mundane but proper definition for this. A game's quality settings can be turned down to help with performance, usually via extra graphics options (fog, blur, AA, meshes & textures, lighting, shadows, etc.). Games that were meant for big gaming rigs but could still be played on budget PCs would be perfect for this, since you'd play on low settings. THEN, over the years, when you got a better PC, you'd crank the graphics quality back up. The game literally got better because the technology got better (and cheaper), not because the game was changed or patched.
One example I think of is Diablo 2, which looked fairly good as I recall, but I almost ALWAYS played it using DirectDraw and 256 colors. However, I've also seen it run in 24-bit color with the 3D perspective and extra options turned on. Along the same lines, games from the old DOS days that supported multiple graphics card modes would also qualify: back when CGA was common (Lemmings, Commander Keen, etc.), people later got VGA, and the jump to 16 or 256 colors was a huge leap forward versus the 4 colors they were getting before.
Now, putting all that aside, patches that make a game actually work and fun to play are probably all well and good, but it's hard to say whether that makes a game like Daikatana worth playing regardless. I remember the intro, and it was so horrible it looked like a really badly done PS1 game.