avatar
clarry: dtgreene is a technical person and so am I. The devil is in the details, and I believe we both find it interesting to discuss these details. I'm sorry that you don't.
avatar
ColJohnMatrix: lol Excuse me, what? I already did and added actual code that provides a working example. Your response stepped beyond that into the realm of criticism, obvious condescension, and strawman, and frankly I've seen nothing from you giving you the credentials to do so. You're some unbelievable piece of work...

But, by all means, show your code sample that's a vast improvement... I'm waiting.

avatar
clarry: And yeah, I know plenty of game developers ship subpar code. The results speak for themselves. Of course, it helps that the mass market seems to be oblivious to stutter, latency, and other performance issues that can be traced to poor design. And people who complain are shut up by telling them to buy a better computer. "Sorry if that doesn't work out for you."
avatar
ColJohnMatrix: You're conflating multiple issues, somehow attributing them to this approach (et al), and then by proxy deflecting to me for demonstrating... What you seem to fail to grasp is these things aren't the issues that you think they are. Not unless you're working on a much larger game design or something for modern consoles where it becomes critical. A vast amount of games use the same technique listed above with minor tweaks, generally adding delta physics, and the things you seem to think are issues just aren't. In short, your thoughts on this are vain nonsense that sounds great on paper but doesn't hold up to obdurate practical reality. You think you're "being technical", but you're just being pathetically pedantic without cause.
#include <iostream>
using namespace std;

int main()
{
    cout << " Calm Down! " << endl;
    return 0;
}
Don't have time to read the whole thread so maybe this was mentioned, but as an owner of a 144hz monitor I can definitely tell you some games mess up a lot above 60fps. Fallout 3, 4 and New Vegas for example, their physics go all nutty above 60fps.
avatar
clarry: dtgreene is a technical person and so am I. The devil is in the details, and I believe we both find it interesting to discuss these details. I'm sorry that you don't.
avatar
ColJohnMatrix: lol Excuse me, what? I already did and added actual code that provides a working example. Your response stepped beyond that into the realm of criticism, obvious condescension, and strawman, and frankly I've seen nothing from you giving you the credentials to do so. You're some unbelievable piece of work...

But, by all means, show your code sample that's a vast improvement... I'm waiting.
I don't understand why you take it so personally. I'm just pointing out technical deficiencies in the approach you present, nothing more. The purpose is education (the very point of this thread), not condescension. There's no need to get upset.

I'm not interested in boasting about credentials, but I do software for a living, anything from kernel & drivers to userspace. Dealing with these sorts of details is something I do and have done for over two decades. And yes, I've fixed games that run badly because of bad design. And yes, I've debugged issues that come down to interaction with the scheduler or poor timing mechanisms. And yes, I've written game loops.

Of course you wouldn't take my word for it, so instead of trying to convince you, I'll just point out that I'm not the only professional developer who has come to the conclusion that it's a good idea to yield the CPU: https://github.com/TTimo/doom3.gpl/blob/master/neo/framework/Session.cpp#L2588
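
For anyone curious what "yield the CPU" looks like in practice, here's a minimal sketch of a frame-capped loop that sleeps away the unused part of each frame instead of spinning. The ~60 fps budget and the commented-out update()/render() calls are just placeholders for illustration; this is not the Doom 3 code itself:

#include <chrono>
#include <thread>

int main()
{
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(16667); // ~60 fps target, picked for illustration

    for (int frame = 0; frame < 600; ++frame)
    {
        const auto frame_start = clock::now();

        // update(); render();   // the game's real work would go here

        const auto elapsed = clock::now() - frame_start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed); // give the CPU back to the scheduler
    }
}

The point is simply that the leftover time in each frame is handed back to the OS instead of being burned in a busy loop.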

Sadly, Doom 3 has other sources of jank (that can be worked around)...

You're conflating multiple issues, somehow attributing them to this approach (et al), and then by proxy deflecting to me for demonstrating... What you seem to fail to grasp is these things aren't the issues that you think they are. Not unless you're working on a much larger game design or something for modern consoles where it becomes critical.
It's more like you live in the same fantasy world where (unfortunately) many hobbyist game developers live: they don't understand operating systems much at all and just see an idealized view of the abstractions the OS presents from the application's perspective. In such a world, it's fine to pretend the scheduler does not exist and won't pre-empt your process that never yields. It's fine as long as you don't mind the performance implications, or don't care that it'll needlessly stutter on some systems, drain the battery, and run hot.

Size of game doesn't matter, in fact simple 2D games that *should* run perfectly but somehow have jank *and* make the CPU fan on a low-end laptop scream for mercy are what really made me want to dive into this shit. Big, complex 3D games often have so many other sources of jank going on that it doesn't matter much (and again they can just blame peoples' hardware whenever there's some issue, without even trying to understand it). Of course, at some point big and complex 3D games age and become old and simple 3D games and players realize the code could be better: https://github.com/ioquake/ioq3/commit/e5dbce839a478038297a2d7e13fad234b7ac1286

A vast amount of games use the same technique listed above with minor tweaks, generally adding delta physics, and the things you seem to think are issues just aren't. In short, your thoughts on this are vain nonsense that sounds great on paper but doesn't hold up to obdurate practical reality. You think you're "being technical", but you're just being pathetically pedantic without cause.
This "pedantry" does not come from paper or ideals, it comes from experience with real software (including games) and OS kernels.

I'm becoming convinced you guys have Aspergers.
Assburgers is a curious way to spell experience. But I like it.
Post edited September 19, 2019 by clarry
avatar
StingingVelvet: Don't have time to read the whole thread so maybe this was mentioned, but as an owner of a 144hz monitor I can definitely tell you some games mess up a lot above 60fps. Fallout 3, 4 and New Vegas for example, their physics go all nutty above 60fps.
That brings up an interesting question: Is there any way to emulate a 144Hz monitor for testing purposes? (It doesn't actually need to be able to refresh the display at that rate; it just needs to appear that way to the OS and game, so that the game thinks it's running that fast and experiences those issues.)

Also, do other Bethesda games (Morrowind, Oblivion, and Skyrim) have this issue? (Of course, I should point out that these games have physics glitches even at more typical 60Hz refresh rates.)

In any case, your comment does bring up an interesting point; it is a good idea to get a diverse selection of hardware to test games on, as sometimes bugs can show up on some hardware that doesn't on others.
avatar
dtgreene: By the way, could you explain how I would choose between select(2) and poll(2)?
Boils down to a matter of preference.

select used to be more portable (Windows...) but I don't think it matters anymore.

I don't like dealing with select's bitmasks, and it has the other issue of having a fixed fd set size. Doesn't matter for simple applications that deal with a handful of FDs, but I find poll more convenient. If you have lots of fds (hundreds, thousands), you want something more performant anyway, like kqueue.
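
For reference, a bare-bones poll(2) sketch might look something like this (POSIX only, watching stdin with a 5 second timeout, minimal error handling; the select(2) version would juggle fd_set, FD_ZERO/FD_SET and a struct timeval instead):

#include <poll.h>
#include <unistd.h>
#include <cstdio>

int main()
{
    struct pollfd fds[1];
    fds[0].fd = STDIN_FILENO;   // watch standard input for readability
    fds[0].events = POLLIN;

    int n = poll(fds, 1, 5000); // block for at most 5000 ms
    if (n < 0)
        perror("poll");
    else if (n == 0)
        std::puts("timed out, nothing to read");
    else if (fds[0].revents & POLLIN)
        std::puts("stdin is readable");
    return 0;
}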
avatar
dtgreene: In any case, your comment does bring up an interesting point; it is a good idea to get a diverse selection of hardware to test games on, as sometimes bugs can show up on some hardware that doesn't on others.
To be fair, you can make a 144Hz monitor run at 60Hz (or just limit to 60fps using something like RivaTuner), so the hardware itself isn't the issue, just the game.
What sucks most is games that offer vsync or nothing...

F.E.A.R. is one of these games and it's a shame, I like shooters and it kinda seems like something I'd enjoy.

It just doesn't run well. Vsync makes for too much latency, and without vsync I get massive tearing & stutter unless I allow it to run at a very high fps (over twice the refresh rate). Unfortunately it can't maintain a constant very high framerate and it appears that mouse sensitivity is subject to framerate fluctuations so aiming becomes a pain. Sigh. Probably rounding errors?

Only problem is I can't seem to find anyone else on planet earth who can replicate my issue. I must be very special to be tormented with this :(

Maybe one day I'll hack in a delay after vsync to reduce the latency. Or get a 144Hz freesync monitor :P
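
In case anyone wants to experiment with that, the "delay after vsync" idea would look roughly like this. This is only a sketch under assumptions: a 60 Hz display, a guessed per-frame work estimate, and hypothetical swap_buffers()/poll_input()/update()/render() calls left as comments:

#include <chrono>
#include <thread>

int main()
{
    using namespace std::chrono;
    const auto refresh_period = microseconds(16667); // assume a 60 Hz display
    const auto work_estimate  = microseconds(4000);  // guess at simulate + render time

    for (int frame = 0; frame < 600; ++frame)
    {
        // swap_buffers();                   // returns right after vsync in a real game

        // Sleep through most of the refresh interval, leaving just enough room
        // for this frame's work, so input is sampled as late as possible.
        std::this_thread::sleep_for(refresh_period - work_estimate);

        // poll_input(); update(); render(); // done late, close to the next vsync
    }
}

The trade-off is that if the work estimate is too optimistic, you miss the vsync and get a stutter, so a real implementation would have to measure and adapt.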

Inspired by that recent thread about monitors, I took a peek at what's available but nothing really stands out. I'd like higher dpi + size + refresh rate than my ~10 yo monitor can do but turns out that's a tough combo.
Post edited September 19, 2019 by clarry
avatar
StingingVelvet: *snip*
avatar
dtgreene: *snap*
avatar
StingingVelvet: *snup*
I can provide the answer to this.

The TES game engine is an engine called "Gamebryo", which they reworked and relabeled as the "Creation Engine", quite akin to how the GoldSrc engine was built from the Quake engine as a baseline.

So when they made the Creation Engine they used the proprietary Havok engine to handle all physics within the game. What Bethesda has done is take the Havok engine and use it as a separate API without changing too much of it, since you have to harmonize the separate APIs with the core engine. So once you use a simple .ini tweak for any TES/FO game so it runs at more than 60 FPS, the results are hilarious, because the separate APIs aren't really programmed to "understand" (for lack of a better term) how to synchronize themselves whenever a higher fps is achieved. However, they can do so at lower fps, because that was already programmed into the base of the Havok engine.

Bethesda basically took the Havok engine and made no changes to implement a dynamic way for the engine to know what FPS the game is running at and adapt. This was probably because they never thought people would want to make their games run faster than 60 FPS, simply because they thought of console compatibility first.

There is an easy fix for all Bethesda Gamebryo/Creation Engine games (including NV). Go to your skyrim.ini and type in fMaxTime=x.xxxx (x = value) under the [HAVOK] category (if it doesn't exist, make it). Depending on the max fps you can achieve, you take the value 0.0166 (which is the Havok engine's baseline for its 60 frames per second, i.e. 16.6 ms x 60 = around 1 second), divide it by the max fps you want for a bug-free experience, and multiply the result by the old 60 FPS baseline. So if you want to have 90 FPS in your game without the bug, you simply type in fMaxTime=0.0111 (which is 0.0166/90*60, with 60 being the prior baseline) below the [HAVOK] category and then it works. This fix should work with all Bethesda game .inis :)!
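
If it helps, the arithmetic works out to roughly 1 / target fps. Here's a throwaway snippet (nothing engine-specific, the fps values are just examples) that prints the fMaxTime for a few common targets:

#include <cstdio>

int main()
{
    const double baseline = 0.0166;  // Havok's 60 fps step, per the post above

    for (int target_fps : {60, 90, 120, 144})
    {
        const double fMaxTime = baseline * 60.0 / target_fps;  // roughly 1 / target_fps
        std::printf("target %3d fps -> fMaxTime=%.4f\n", target_fps, fMaxTime);
    }
    return 0;
}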

The downside to the tweak is that the Havok engine now makes more "physics synchronization calls" per second, which will logically cost some performance due to more consistency checks. If your fps goes above the threshold again, things will also break again, so make sure you calculate fMaxTime= for a framerate exactly at or slightly above the max fps that you can actually get.

Skyrim, a game I've been playing recently, has a fix that makes these Havok calls dynamic, so you don't have to calculate anything and the game works it out for you. However, a similar fix should be available for all the other games on the engine as long as they offer a script-extending API.

EDIT: Quoted the wrong people by accident, fixed. Also changed a few things so readers at least have a better time reading through it.
Post edited September 19, 2019 by Dray2k
avatar
dtgreene: By the way, could you explain how I would choose between select(2) and poll(2)?
avatar
clarry: Boils down to a matter of preference.

select used to be more portable (Windows...) but I don't think it matters anymore.

I don't like dealing with select's bitmasks, and it has the other issue of having a fixed fd set size. Doesn't matter for simple applications that deal with a handful of FDs, but I find poll more convenient. If you have lots of fds (hundreds, thousands), you want something more performant anyway, like kqueue.
Incidentally, in Python (using the standard library), select() looks like it's a lot easier to use; all you need to do is pass it some lists of file descriptors and an optional timeout, whereas with poll() you get an object with which you then need to individually register the file descriptors you want to watch. (With that said, on GNU/Linux at least, select() is unsuitable for thousands of file descriptors, as it can only handle file descriptors up to 1023.)

Also, from what I can tell, kqueue is a BSD thing and therefore is only suitable if you are using such a platform; this isn't an issue if you can choose the OS (and are in a field that BSD is well suited to, like running servers), but is not suitable for a commercially released game.
avatar
dtgreene: Also, from what I can tell, kqueue is a BSD thing and therefore is only suitable if you are using such a platform; this isn't an issue if you can choose the OS (and are in a field that BSD is well suited to, like running servers), but is not suitable for a commercially released game.
Yep, there are alternatives on other systems, e.g. epoll. And then there are libraries like libevent or libev that wrap a bunch of these alternatives. I tend to prefer to use the thing I want directly though.
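
Purely for illustration, a Linux-only epoll sketch of the same "watch stdin with a timeout" example might look like this (register interest once, then wait; again with minimal error handling):

#include <sys/epoll.h>
#include <unistd.h>
#include <cstdio>

int main()
{
    int ep = epoll_create1(0);
    if (ep < 0) { perror("epoll_create1"); return 1; }

    struct epoll_event ev{};
    ev.events = EPOLLIN;
    ev.data.fd = STDIN_FILENO;       // register interest in stdin once...
    if (epoll_ctl(ep, EPOLL_CTL_ADD, STDIN_FILENO, &ev) < 0) { perror("epoll_ctl"); return 1; }

    struct epoll_event ready[8];
    int n = epoll_wait(ep, ready, 8, 5000);  // ...then wait, up to 5 seconds
    if (n < 0) perror("epoll_wait");
    else if (n == 0) std::puts("timed out");
    else std::puts("stdin is readable");

    close(ep);
    return 0;
}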

In any case, games shouldn't need to be watching thousands of fds anyway.