Adaptive VSync
Adaptive VSync
This recent setting in the nvidia drivers is super useful. It caps the frame rate at the refresh frequency, but doesn't start halving if you drop under the refresh frequency. It reduces GPU power usage, eliminates tearing, and doesn't cause a drop to 30 fps if you go a smidge under 60. Works well with spring :D
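To make the "halving" arithmetic concrete, here is a tiny back-of-the-envelope sketch (my own illustration, not driver or engine code) of the effective frame rate on a 60 Hz display:

```python
import math

# Rough model of frame presentation: `render_ms` is how long the GPU takes
# per frame, `refresh_hz` is the display refresh rate.
def effective_fps(render_ms, refresh_hz=60, adaptive=False):
    interval_ms = 1000.0 / refresh_hz
    if adaptive and render_ms > interval_ms:
        # Adaptive VSync: once you miss a refresh interval the driver tears
        # instead of waiting, so fps follows the raw render time.
        return 1000.0 / render_ms
    # Standard VSync: every frame is held until the next refresh boundary,
    # so presentation snaps to 60, 30, 20, 15, ... fps.
    intervals = max(1, math.ceil(render_ms / interval_ms))
    return 1000.0 / (intervals * interval_ms)

# Rendering "a smidge under 60" (18 ms/frame, ~55 fps raw) on a 60 Hz panel:
print(round(effective_fps(18), 1))                 # standard vsync: 30.0
print(round(effective_fps(18, adaptive=True), 1))  # adaptive: 55.6
```

So a machine rendering at 55 fps raw gets presented at 30 fps under standard vsync, but keeps ~55 fps (with occasional tearing) under adaptive vsync.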
Re: Adaptive VSync
It will be used in 89.0 with spring's own vsync cmd.
PS: I changed the vsync mechanism used, so it won't be hard-locking anymore. Before, the main thread was locked while waiting for vsync, so that time was lost. In 89.0 it will use an async system: locking will only happen when the OGL buffer is full. The spring thread can then compute other stuff (e.g. simframes, the next drawframe, ...) while the driver waits for vsync.
This + adaptive vsync should stabilize the FPS and make vsync much more usable in spring.
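The gain from the async swap can be sketched roughly like this (my own simplification with made-up timings, not the actual 89.0 source): with a blocking swap the idle slice of each display interval is lost, while a swap that only blocks on a full command buffer lets the main thread spend that slice on simframes:

```python
# Rough model: how many sim steps fit into one display interval when a frame
# takes `render_ms` to draw and one sim step takes `sim_step_ms`.
def sim_steps_per_interval(render_ms, sim_step_ms, refresh_hz=60, blocking=True):
    interval_ms = 1000.0 / refresh_hz
    idle_ms = max(0.0, interval_ms - render_ms)
    if blocking:
        # Pre-89.0 behaviour: the main thread sits inside the swap call
        # until vsync fires, so the idle time is simply lost.
        return 0
    # 89.0 behaviour: the swap returns immediately (it only blocks once the
    # GL command buffer is full), so the idle time can run simframes.
    return int(idle_ms // sim_step_ms)

# 10 ms to draw, 2 ms per sim step, 60 Hz display:
print(sim_steps_per_interval(10, 2, blocking=True))   # 0 sim steps fit
print(sim_steps_per_interval(10, 2, blocking=False))  # 3 sim steps fit
```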
Re: Adaptive VSync
Beherith wrote:This recent setting in the nvidia drivers is super useful. It caps the frame rate at the refresh frequency, but doesn't start halving if you drop under the refresh frequency. It reduces GPU power usage, eliminates tearing, and doesn't cause a drop to 30 fps if you go a smidge under 60. Works well with spring :D
In RTS - works sometimes
In FPS - nope
In plane/car simulators - hell no
A drop in power usage definitely causes a drop in frames AND adds stability issues. Try that with PALIT cards, for example. We changed a 3-way SLI pack after 1 week of BSOD kung-fu-ing ....
And btw, why are you so happy when this is possible with more than just nvidia cards?
Re: Adaptive VSync
Because I am the happy owner of a new nvidia card? This was not possible with my old card. Or is that too much fanboyism? Sorry, didn't mean to offend.
Since power and frame rate are connected, it's logical that if you drop the power budget you will get fewer max frames. In this case, I'm dropping frames that don't even get fully pushed to my monitor. I have not experienced stability issues, but I don't own a Palit card or use SLI. Spring runs smooth as butter.
What do you mean it doesn't work in FPS or sims? Isn't that just dependent on the game's engine and not on the genre?
Re: Adaptive VSync
Beherith wrote:Thanks jK!
+1
- SwiftSpear
- Classic Community Lead
- Posts: 7287
- Joined: 12 Aug 2005, 09:29
Re: Adaptive VSync
Beherith wrote:This recent setting in the nvidia drivers is super useful. It caps the frame rate at the refresh frequency, but doesn't start halving if you drop under the refresh frequency. It reduces GPU power usage, eliminates tearing, and doesn't cause a drop to 30 fps if you go a smidge under 60. Works well with spring :D
That solution actually makes a lot of sense. It was SO frustrating to have your frame rate plummet to 30 just because your machine, during a high-action moment, can only handle 55 FPS. A bit of screen tearing at that point is far preferable to "OMG I CAN'T CLICK THINGS I WANT ANY MORE". And it's not like any of those frames above 60 are actually helping you if that's what your display rate is.
For that reason, disabling VSync was always the first thing I'd do in any game. If games start using this more sensible option, though, I'll move away from that.
Re: Adaptive VSync
Beherith wrote:Because I am the happy owner of a new nvidia card? This was not possible with my old card. Or is that too much fanboyism? Sorry, didn't mean to offend.
Adaptive VSync can be implemented in games, or with a third-party tool.
Just for the record: for my hobby (which I'd subjectively call a 2nd job) I use GTX 690 SLI, and I also put mainly nvidias into my friends' computers.
Beherith wrote:Since power and frame rate are connected, it's logical that if you drop the power budget you will get fewer max frames. In this case, I'm dropping frames that don't even get fully pushed to my monitor. I have not experienced stability issues, but I don't own a Palit card or use SLI. Spring runs smooth as butter.
My opinion is based at least on my own and others' user experience.
Palit was just one manufacturer example; what do you say about the same problems appearing in Gigabyte, Asus, Evga, Sapphire and XFX products, no matter what the GPU is?
Anyway, Spring continues to have the problem with popping ground on some water maps. Imagine a block 1/4 the size of the map whose borders flash in contrast for a fraction of a second at random intervals ...
Beherith wrote:What do you mean it doesn't work in FPS or sims? Isn't that just dependent on the game's engine and not on the genre?
It's good to play FPS/sims with VSync=off to avoid synchronization problems between graphics and network. Good examples are the HL1 and HL2 engines, the famous Quake 3 difference in jump height at different frame rates, Crysis needing VSync=off because gun recoil gets too random at sharp fps drops, and some arcade sims like NFS and Dirt with camera effects that distract you even more at frame drops ...