Back in the days of MS-DOS, if you tried to run a game on a computer that's significantly faster than what the game was designed for, it would run really fast. (For example, try running Ultima 2 (without the fan mod that's out there) at a high cycle count under DOSBox; if the game doesn't crash with a division by zero error when you load it, it will likely run way too fast.) However, that doesn't seem to happen with modern games; if you take a modern game with low system requirements (like a typical indie game) and run it on a fast machine that far exceeds the minimum requirements, it will still run at a reasonable speed.

My question is, why doesn't the modern game run too fast?

Furthermore, if writing a game, what would be the best way (on a modern computer) to control the speed of the game so that it doesn't run too fast? How is this done in modern games?
I'm guessing because they don't tie framerate to game speed anymore? Well except for crappy Japanese devs
In the old days, people kept time by counting the number of clock cycles. Measure how long it takes to go through one step of a counting loop and divide the clock speed of the computer by that number.
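In C, the old approach looked roughly like this (a sketch of the idea, not any particular game's code; the volatile sink is only there to keep the compiler from deleting the loops):

#include <stdio.h>
#include <time.h>

/* DOS-era timing: calibrate how many empty-loop iterations fit in one
 * millisecond, then "wait" by iteration count. Breaks on any faster CPU. */
static volatile unsigned long sink;

static unsigned long calibrate_loops_per_ms(void)
{
    const unsigned long iters = 10000000UL;
    clock_t start = clock();
    for (unsigned long i = 0; i < iters; i++)
        sink = i;
    double ms = 1000.0 * (clock() - start) / CLOCKS_PER_SEC;
    return (unsigned long)(iters / ms);
}

static void delay_ms(unsigned long ms, unsigned long loops_per_ms)
{
    for (unsigned long i = 0; i < ms * loops_per_ms; i++)
        sink = i;
}

int main(void)
{
    unsigned long lpm = calibrate_loops_per_ms();
    printf("~%lu loops per ms\n", lpm);
    delay_ms(100, lpm); /* correct only on the machine it was calibrated on */
    return 0;
}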

That worked because the computer you wrote the software for had a limited range of processors (often just one), rather like consoles.

In the consumer market, the popularisation of binary software that had to survive CPU upgrades and run across a much wider range of processing capabilities put an end to cycle counting.

Modern systems have timers that measure in microseconds. When the timer elapses, the code executes and then idles until the next period. Counting cycles to keep time is strongly discouraged as being grossly energy-inefficient and wildly inaccurate: it will do nasty things to battery life on notebooks, and clock frequency boost modes will mess with your results.
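In code, that pattern looks something like this (a minimal POSIX sketch; update_game is a placeholder for the real work):

#include <time.h>

#define TICK_NS 16666667L /* ~60 updates per second */

void update_game(void); /* placeholder */

void run(void)
{
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);
    for (;;) {
        update_game();
        /* schedule the next tick on an absolute timeline so small
         * sleep inaccuracies don't accumulate into drift */
        next.tv_nsec += TICK_NS;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec += 1;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
}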

We haven't completely left that era. There are some games on GOG that suffer from this sort of problem.


Prince of Persia: Warrior Within.
A console port that assumes a fixed framerate. If you exceed 30 FPS, the voiced dialogue falls behind the subtitles/lip sync.

Risen
Another example: with a high enough frame rate, you can glitch the game by trying to climb up onto a ledge. V-Sync works, but it needs to be forced by the driver; the game can't enable it itself.
First search result seems to answer it reasonably well.
dtgreene: Back in the days of MS-DOS, if you tried to run a game on a computer that's significantly faster than what the game was designed for, it would run really fast. (For example, try running Ultima 2 (without the fan mod that's out there) at a high cycle count under DOSBox; if the game doesn't crash with a division by zero error when you load it, it will likely run way too fast.) However, that doesn't seem to happen with modern games; if you take a modern game with low system requirements (like a typical indie game) and run it on a fast machine that far exceeds the minimum requirements, it will still run at a reasonable speed.

My question is, why doesn't the modern game run too fast?

Furthermore, if writing a game, what would be the best way (on a modern computer) to control the speed of the game so that it doesn't run too fast? How is this done in modern games?
Depending on your definition, yes, some "modern" games may run too fast if the framerate isn't limited. A common symptom is wonky physics and motion when the time delta between frames becomes so small that significant digits end up being eaten by rounding errors.

The mechanisms that prevent a game from running too fast are generally 1) slow code, and 2) vsync or sleep to limit the framerate.
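Option 2 in its crudest sleep-based form, sketched with SDL (render_frame is a placeholder; SDL_Delay's millisecond granularity is exactly the resolution problem that comes up later in this thread):

#include <SDL.h>

#define TARGET_FRAME_MS 16 /* ~60 fps */

void render_frame(void); /* placeholder */

void frame_loop(void)
{
    for (;;) {
        Uint32 start = SDL_GetTicks();
        render_frame();
        Uint32 elapsed = SDL_GetTicks() - start;
        if (elapsed < TARGET_FRAME_MS)
            SDL_Delay(TARGET_FRAME_MS - elapsed); /* idle out the rest of the frame */
    }
}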

Problems caused by too fast a framerate can be prevented by decoupling physics rate from framerate, but this means additional steps need to be taken to interpolate between (or extrapolate from) physics frames. Otherwise there'd be no point drawing the visuals at anything more than the physics rate since nothing would move between these frames.
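The usual shape of that decoupling looks like this (a sketch; step_physics and render are placeholders, and render's argument is the interpolation factor between the two most recent physics states):

#include <SDL.h>

#define PHYSICS_DT 0.01 /* physics always steps at a fixed 100 Hz */

void step_physics(double dt); /* placeholders for the real game */
void render(double alpha);    /* alpha in [0,1) blends previous/current state */

void game_loop(void)
{
    double accumulator = 0.0;
    Uint64 prev = SDL_GetPerformanceCounter();
    double freq = (double)SDL_GetPerformanceFrequency();

    for (;;) {
        Uint64 now = SDL_GetPerformanceCounter();
        accumulator += (now - prev) / freq; /* wall-clock seconds since last frame */
        prev = now;

        /* run as many fixed physics steps as the elapsed time calls for */
        while (accumulator >= PHYSICS_DT) {
            step_physics(PHYSICS_DT);
            accumulator -= PHYSICS_DT;
        }
        /* draw between physics states so motion stays smooth at any fps */
        render(accumulator / PHYSICS_DT);
    }
}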

Another way to alleviate the problem is to simply use higher precision for the computations. If you have enough bits and the calculations are arranged with numerical stability in mind, it's quite feasible to make a game where precision won't become a problem for a very long time.
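For a feel of what the extra bits buy you, compare accumulating a tiny per-frame delta into a float versus a double (illustrative numbers):

#include <stdio.h>

int main(void)
{
    float  tf = 0.0f;
    double td = 0.0;
    /* 100 million deltas of 0.1 ms each; the exact total is 10000 seconds */
    for (long i = 0; i < 100000000L; i++) {
        tf += 0.0001f;
        td += 0.0001;
    }
    /* with IEEE-754 floats the sum typically gets stuck around 2048,
     * because the increment falls below half a ULP; the double is fine */
    printf("float:  %f\n", tf);
    printf("double: %f\n", td);
    return 0;
}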

EDIT: And yeah, as some people pointed out, games use a proper clock instead of a cycle counter (or a fixed step per frame) to keep time.
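The proper-clock version, in its simplest variable-delta form (SDL's timing calls are real; the position/speed bits are placeholders):

#include <SDL.h>

/* move things by elapsed wall-clock time, not by frame count */
void update_position(double *x, double speed_px_per_s)
{
    static Uint64 prev;
    if (prev == 0)
        prev = SDL_GetPerformanceCounter();
    Uint64 now = SDL_GetPerformanceCounter();
    double dt = (double)(now - prev) / SDL_GetPerformanceFrequency();
    prev = now;
    *x += speed_px_per_s * dt; /* same speed at 30 fps or at 300 fps */
}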
Post edited September 16, 2019 by clarry
GOOD QUESTION!

I'll put my best men on it!
As stated above, it simply depends on whether game logic, frame rate and/or physics are decoupled or not. It's also not exclusively an old games problem. You can run Doom on a 2 GHz processor and it will run fine because the game has a 35 fps lock and game logic "speed" is capped as well.
Most Japanese developers still use the old ways of tying everything together, because that's the way that makes sense on consoles. Almost every fighting game is locked at 60 fps, and most will go into slow motion if the frame rate drops below that, because frame skip is not implemented. There are also many cases of games going into "turbo mode" on 120 and 144 Hz monitors.
Paradoks: Most Japanese developers still use the old ways of tying everything together, because that's the way that makes sense on consoles.
Why does it make sense on consoles? I never understood that. It doesn't make sense to me, at least not on any modern game on a modern console that is going to do lots of multitasking and dynamic resource allocation & loading & streaming etc.

It kinda would make sense on ancient consoles without operating system; game running directly on the hardware with no other tasks to interfere.
Post edited September 16, 2019 by clarry
Because sometimes people learn from their mistakes?
Paradoks: Most Japanese developers still use the old ways of tying everything together, because that's the way that makes sense on consoles.
clarry: Why does it make sense on consoles?
Mostly because of expected fixed frame rate, but I'll admit that is changing with consoles offering more options today (with quality or performance settings) and the problem of actually keeping that targeted frame rate.
But you can see this is still done this way today; in the Resident Evil 2 remake, for example, the knife deals more damage the higher the frame rate is. The animation rate of objects in the distance is also lowered by design (fighting games do this too, like the recent Mortal Kombat 11, which is really annoying because it only hurts PC players).
clarry: The mechanism that prevent a game from running too fast are generally 1) slow code 2) vsync or sleep to limit framerate.
How does v-sync work from the game programmer's perspective, anyway?
Paradoks: Most Japanese developers still use the old ways of tying everything together, because that's the way that makes sense on consoles.
clarry: Why does it make sense on consoles? I never understood that. It doesn't make sense to me, at least not on any modern game on a modern console that is going to do lots of multitasking and dynamic resource allocation & loading & streaming etc.

It kinda would make sense on ancient consoles without operating system; game running directly on the hardware with no other tasks to interfere.
Fun fact: I have found out that Super Mario Brothers (on the NES) actually uses the vblank interrupt to do its timing; when booted, the game does some initialization, sets up the interrupt service routine (ISR), and then gets into an infinite loop; all of the game logic is inside the ISR.

I have been wondering whether the analogous method would work for a modern game. Would this work? (It seems that many libraries, including SDL, have a way to get a callback to run at regular intervals, for example.)
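(For concreteness, the structure I mean would look something like this if sketched with a POSIX interval timer on a modern OS; purely illustrative, since a signal handler is not a safe place for real game logic:)

#include <signal.h>
#include <sys/time.h>
#include <unistd.h>

/* all "game logic" lives in the handler, like SMB's vblank ISR */
static void tick(int sig)
{
    (void)sig;
    /* update_game(); -- placeholder; real handlers must be async-signal-safe */
}

int main(void)
{
    struct sigaction sa = {0};
    sa.sa_handler = tick;
    sigaction(SIGALRM, &sa, NULL);

    struct itimerval tv = {
        .it_interval = { .tv_sec = 0, .tv_usec = 16667 }, /* ~60 Hz */
        .it_value    = { .tv_sec = 0, .tv_usec = 16667 },
    };
    setitimer(ITIMER_REAL, &tv, NULL);

    for (;;)
        pause(); /* idle "main loop", just like the NES infinite loop */
}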
Post edited September 16, 2019 by dtgreene
clarry: The mechanism that prevent a game from running too fast are generally 1) slow code 2) vsync or sleep to limit framerate.
dtgreene: How does v-sync work from the game programmer's perspective, anyway?
The call that presents the framebuffer on the screen blocks until the GPU has passed it on.
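With SDL and OpenGL, for example, it looks something like this (a sketch; window/context setup and error handling omitted):

#include <SDL.h>

void present_loop(SDL_Window *window)
{
    /* ask the driver to sync buffer swaps to the display refresh */
    SDL_GL_SetSwapInterval(1);

    for (;;) {
        /* ... render the frame with OpenGL calls ... */

        /* this call blocks until the swap happens, so the loop
         * naturally runs at (a divisor of) the refresh rate */
        SDL_GL_SwapWindow(window);
    }
}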

dtgreene: I have been wondering whether the analogous method would work for a modern game. Would this work? (It seems that many libraries, including SDL, have a way to get a callback to run at regular intervals, for example.)
If you wanted to use the vblank interrupt for timing, you'd rely on vsync, which gets rid of tearing for free. Software callback timers are generally not as accurate as vsync; there's only so much sleep resolution a general-purpose non-realtime OS can offer. The other problem SDL's callback timers introduce is threads. If you just want to avoid burning through cycles (without relying on vsync), you'd generally use some form of sleep (such as SDL_Delay), but again the resolution may not be great.
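For reference, SDL's callback timer looks like this; the callback runs on a separate thread, which is the thread problem I mean, so the usual trick is to push an event back to the main loop rather than touch game state directly:

#include <SDL.h>

/* runs on SDL's timer thread, NOT the main thread */
static Uint32 tick(Uint32 interval, void *param)
{
    (void)param;
    SDL_Event ev = { .type = SDL_USEREVENT };
    SDL_PushEvent(&ev); /* hand the tick over to the main loop */
    return interval;    /* reschedule at the same interval */
}

int main(int argc, char **argv)
{
    (void)argc; (void)argv;
    SDL_Init(SDL_INIT_TIMER | SDL_INIT_EVENTS);
    SDL_TimerID id = SDL_AddTimer(16, tick, NULL); /* ~60 Hz, ms resolution only */

    SDL_Event ev;
    while (SDL_WaitEvent(&ev)) {
        if (ev.type == SDL_USEREVENT)
            ; /* update_game(); -- placeholder */
    }

    SDL_RemoveTimer(id);
    SDL_Quit();
    return 0;
}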

It's a bit of a dilemma: I hate the input delay introduced by vsync, and timers may not be accurate enough for careful framerate control. You could burn CPU in a busy loop instead, but that can cause other issues with the scheduler when other tasks need to run and no cores are available.

Then there's triple buffering, which can reduce the latency issue, but you're still burning CPU/GPU rendering frames that aren't needed.
Post edited September 16, 2019 by clarry
dtgreene: Back in the days of MS-DOS, if you tried to run a game on a computer that's significantly faster than what the game was designed for, it would run really fast. (For example, try running Ultima 2 (without the fan mod that's out there) at a high cycle count under DOSBox; if the game doesn't crash with a division by zero error when you load it, it will likely run way too fast.) However, that doesn't seem to happen with modern games; if you take a modern game with low system requirements (like a typical indie game) and run it on a fast machine that far exceeds the minimum requirements, it will still run at a reasonable speed.

My question is, why doesn't the modern game run too fast?

Furthermore, if writing a game, what would be the best way (on a modern computer) to control the speed of the game so that it doesn't run too fast? How is this done in modern games?
Because game speed is tied to factors other than CPU cycles, since there's a much wider variety of CPUs with different clock rates out there than there was back in the day.
dtgreene: My question is, why doesn't the modern game run too fast?

Furthermore, if writing a game, what would be the best way (on a modern computer) to control the speed of the game so that it doesn't run too fast? How is this done in modern games?
There you go: https://en.wikipedia.org/wiki/Delta_timing. This is your answer to the question of how developers, both new and old, manage to run their games at a stable frame rate.

Delta-time is of course a more modern term, but the same principles are applied in older video game code that fixes the frame rate without using v-sync. The oldest example I know of is Frontier, but I think there are much older games that include a form of delta timing so the game doesn't run too fast.
Post edited September 16, 2019 by Dray2k
Dray2k: There you go: https://en.wikipedia.org/wiki/Delta_timing. This is your answer to the question of how developers, both new and old, manage to run their games at a stable frame rate.

Delta-time is of course a more modern term, but the same principles are applied in older video game code that fixes the frame rate without using v-sync. The oldest example I know of is Frontier, but I think there are much older games that include a form of delta timing so the game doesn't run too fast.
I don't know if you read the Wikipedia article you linked, but what it describes is not a way to stabilize the framerate; it's a way to make things move around at a (relatively) constant rate in the face of an unstable framerate.

And this mechanism is precisely what leads to rounding errors when the framerate is too high (i.e. the delta becomes very small and significant digits vanish from the calculations).
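You can see those digits vanish directly: at a large enough coordinate, a tiny per-frame step rounds away completely in 32-bit floats:

#include <stdio.h>

int main(void)
{
    float pos   = 100000.0f; /* a world coordinate far from the origin */
    float speed = 1.0f;      /* units per second */
    float dt    = 0.0001f;   /* one frame at 10000 fps */

    float moved = pos + speed * dt;
    /* the ULP of 100000.0f is ~0.0078, so the 0.0001 step is lost entirely */
    printf("%s\n", moved == pos ? "object never moves" : "object moves");
    return 0;
}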