@yosh Uh, uhm.
-
@aeva wait I'm not sure what this means 😅
Does it still cause input lag and stuff? Is it basically just vsync but with tearing but then without tearing cause I cap it externally?
@hazelnot so, games commonly follow a cadence more or less along these lines:
1. read input events from controller
2. update game state
3. render a new frame image
4. present the new frame to the display

steps 1 and 2 happen on the CPU. step 3 starts with a bunch of up-front work on the CPU to schedule work on the GPU, and step 4 is usually initiated on the CPU before 3 finishes on the GPU, and then some complicated stuff happens
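those four steps, as a minimal Python sketch (every function body here is a hypothetical placeholder standing in for real engine work, not an actual implementation):

```python
def read_input():
    # hypothetical placeholder: poll controller/keyboard events
    return {"buttons": []}

def update(state, events, dt):
    # hypothetical placeholder: advance the simulation by dt seconds
    state["t"] += dt
    return state

def render(state):
    # hypothetical placeholder: record and submit GPU work for this frame
    return f"frame at t={state['t']:.3f}"

def present(frame):
    # hypothetical placeholder: hand the finished frame to the display;
    # with vsync on, this call may block until a vblank
    return frame

def run(frames, dt=1 / 60):
    state = {"t": 0.0}
    shown = []
    for _ in range(frames):
        events = read_input()                 # step 1 (CPU)
        state = update(state, events, dt)     # step 2 (CPU)
        frame = render(state)                 # step 3 (CPU schedules, GPU executes)
        shown.append(present(frame))          # step 4 (may block on vblank)
    return shown
```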
-
@hazelnot these steps can be staggered in exciting ways, but let's ignore that. as far as the computer is concerned, the distance in time between the start of step 1 and the end of step 4 for a given frame is your latency. (your monitor will also add latency, but that's out of the game's control)
-
@aeva ...oh, so I'm basically not doing anything and I have to choose between cooking my PC and having bad aim? 💀
-
@hazelnot modern computers still use vertical blank interrupts to indicate when it is safe to start scanning out a new frame. these come at a fixed cadence, e.g. 60 hz, aka one every 16.6 milliseconds, and the game's cadence has to match that for everything to look and feel ok when stuff is in motion
-
@hazelnot when vsync is on and a frame is presented early, the GPU driver blocks the thread that requested the present so that it doesn't run faster than the GPU (this is an oversimplification). when the rendering work finishes late on the GPU, the present stalls until the next vblank before flipping the back buffer. so if you're targeting 60 fps on a 60 hz screen, the normal cadence is 16.6 ms, and a hitch adds 16.6 × N ms, where N is how many intervals late you are, rounded up
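that hitch arithmetic can be checked with a few lines of Python (the only assumption is a fixed 60 hz vblank interval):

```python
import math

def present_vblank_ms(finish_ms, interval_ms=1000 / 60):
    # with vsync on, a frame is shown at the first vblank
    # at or after the moment its rendering finished
    return math.ceil(finish_ms / interval_ms) * interval_ms

interval = 1000 / 60
assert present_vblank_ms(10.0) == interval      # on time: shown at the next vblank
assert present_vblank_ms(20.0) == 2 * interval  # one interval late: +16.6 ms hitch
assert present_vblank_ms(40.0) == 3 * interval  # two intervals late: +33.3 ms hitch
```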
-
@aeva wow that's wild, I thought vblank stuff ended when CRTs stopped being the mainstream display technology cause LCDs afaik display the whole picture at once(?)
-
@hazelnot since there's some amount of staggering that happens naturally between the CPU and GPU, the end-to-end latency is usually going to be something in the ballpark of the frame interval × 2. the CPU submits work within its 16.6 ms, the GPU completes its work within its own 16.6 ms, and the two will drift within some overlap, for a normal total of up to 33.3 milliseconds of latency when everything is on time
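a toy Python model of that pipelining, under the simplifying assumption that each stage occupies a whole number of vblank intervals:

```python
import math

def end_to_end_latency_ms(cpu_ms, gpu_ms, interval_ms=1000 / 60):
    # input is sampled when the CPU starts its interval; the GPU renders in
    # the following interval(s); the frame scans out at the vblank after the
    # GPU finishes. each stage is rounded up to whole intervals.
    cpu = math.ceil(cpu_ms / interval_ms)
    gpu = math.ceil(gpu_ms / interval_ms)
    return (cpu + gpu) * interval_ms

# both stages on time: ~33.3 ms end to end, i.e. 2x the frame interval
assert end_to_end_latency_ms(10, 12) == 2 * (1000 / 60)
```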
-
@hazelnot now, if you're not using vsync, what happens? the game still needs to throttle the CPU periodically so the rendering work doesn't get backed up, but generally your options are 1) ignore vsync entirely, and if you get a tear, you tear, or 2) wait for the vblank when you're on time, but if you're late, just tear anyway. option 1 improves latency whether the frame is early or late, but looks terrible. option 2 only improves latency when the frame is late
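the three policies side by side, as a toy Python model (finish times in milliseconds, a 60 hz screen assumed):

```python
import math

INTERVAL_MS = 1000 / 60

def vsync_present(finish_ms):
    # always wait for the next vblank
    return math.ceil(finish_ms / INTERVAL_MS) * INTERVAL_MS

def always_tear(finish_ms):
    # option 1: swap immediately, tearing if scanout is mid-frame
    return finish_ms

def tear_when_late(finish_ms):
    # option 2: wait for the vblank when on time, tear only when late
    return finish_ms if finish_ms > INTERVAL_MS else INTERVAL_MS

# an early frame: option 1 shows it sooner, option 2 waits like vsync does
assert always_tear(10.0) == 10.0
assert tear_when_late(10.0) == INTERVAL_MS
# a late frame: both tearing options beat vsync's stall to the next vblank
assert always_tear(20.0) == 20.0
assert tear_when_late(20.0) == 20.0
assert vsync_present(20.0) == 2 * INTERVAL_MS
```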
-
@hazelnot I have it on good authority that the HDMI protocol involves an emulated virtual CRT, and audio and stuff all gets transmitted in the hblank. DisplayPort doesn't do that, but vsync is still a thing, so I assume the concept of a vblank still exists, mostly in the form of when it's safe to transmit a new image and/or the monitor signaling the speed at which it can receive or replace frames
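a concrete example of that virtual-CRT cadence: the standard CEA-861 timing for 1080p60 transmits a 2200x1125 "frame" for a 1920x1080 picture, with the difference being blanking. the arithmetic checks out in Python:

```python
# standard 1080p60 timing (CEA-861): the transmitted frame is larger than
# the visible picture because it includes horizontal and vertical blanking
active = 1920 * 1080
total = 2200 * 1125
pixel_clock_hz = total * 60
assert pixel_clock_hz == 148_500_000   # the familiar 148.5 MHz pixel clock
blanking = total - active              # "dead time" pixel slots per frame
assert blanking == 401_400
```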
-
@hazelnot keep in mind that 4k images are huge and don't transmit instantaneously, so there's still a cadence even if it's not a (real or virtual) electron beam
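a back-of-the-envelope estimate in Python of how much raw pixel data 4k at 60 hz is, assuming 8-bit RGB and ignoring blanking, audio, and line-code overhead:

```python
# raw pixel data rate for 3840x2160 at 60 Hz, 24 bits per pixel
pixels = 3840 * 2160
bits_per_second = pixels * 60 * 24
assert round(bits_per_second / 1e9, 1) == 11.9   # ~12 Gbit/s of raw pixels
```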
-
@aeva I've never had a 4K anything tbh 😅
The biggest resolution I think I'll ever need is 2560x1600
-
@aeva god that's so cursed, another reason to dislike HDMI
-
@hazelnot so, consider: whichever cable you are using to connect your GPU to your screen most likely does not have 2560x1600=4096000 wires in it
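a quick Python estimate of what that cable actually carries instead: the pixels are serialized over a handful of high-speed lanes (a DisplayPort main link has up to 4), with blanking and line-code overhead ignored here:

```python
pixels = 2560 * 1600
assert pixels == 4_096_000               # the wire count from the joke
raw_bits_per_second = pixels * 60 * 24   # 60 Hz refresh, 24-bit color
lanes = 4                                # DisplayPort main link: up to 4 lanes
per_lane = raw_bits_per_second / lanes
assert round(raw_bits_per_second / 1e9, 1) == 5.9   # ~5.9 Gbit/s total
assert round(per_lane / 1e9, 2) == 1.47             # ~1.5 Gbit/s per lane
```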
-
@aeva nonsense, I connect the monitor to my PC via one of those undersea transcontinental fiber conduits :P
-
@hazelnot yes, however strictly speaking you also have the option of "turn the graphical settings down", and if your computer is a desktop PC then you theoretically also have the option of "simply improve your computer's cooling and/or hardware specs by some means"
-
@aeva if I turn the settings down it just generates even more frames instead, and improving cooling is expensive cause I'd need a new case and/or liquid cooling (or a GPU without a goddamn blower fan which is even more expensive) 😔
-
@hazelnot turn the settings down and leave vsync on, I mean. that will let the latency between the CPU and the GPU drop, provided the CPU work finishes fast enough.
-
@hazelnot I'm not convinced that frame tearing allows you to experience the game faster than your monitor's refresh rate, it just lets you randomly experience multiple views of the game from different points in time at your monitor's refresh rate. I haven't seen any research on this, so this is speculation on my part.
-
@hazelnot I can't find it because all of the search terms I can think of are overloaded with common products, but I saw an article about a display that was an array of fiber optic wires, one per pixel. it was an art installation
-
@hazelnot it's not an entirely unreasonable protocol, but it's less versatile than DisplayPort, which is packet based