@yosh Uh, uhm.
-
@msfjarvis @yosh https://web.archive.org/web/20210425095523/https://www.cs.mun.ca/~wlodek/ is the archived website of our professor from back then.
At a quick glance, this paper seems to discuss most of it: https://scispace.com/pdf/modeling-and-performance-analysis-of-priority-queuing-3g6mmrdnxh.pdf
Like, this seriously informed my intuition for network applications, which was my career before Rust :).
-
@skade @msfjarvis @yosh as it happens dynamic resolution scaling and similar systems in video game rendering also work like that. 100% GPU load causes everything to go to hell (frame drops and worse) so we try to find setpoints that get us within some threshold below that. I don't remember what the magic % ended up being for the games I've worked on, but we try to run as close to the wire as we can and still have a stable cadence.
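The setpoint idea above can be sketched as a tiny feedback controller. All the numbers here (the 90% setpoint, the gain, the scale clamp) are made-up illustration values, not from any shipped game:

```python
# toy feedback controller for dynamic resolution scaling: nudge the
# render scale so measured GPU frame time stays below a setpoint that
# sits under the full vblank budget, instead of pinning the GPU at 100%.

BUDGET_MS = 1000 / 60            # full frame budget at 60 Hz (~16.6 ms)
SETPOINT_MS = 0.9 * BUDGET_MS    # aim comfortably below 100% load

def next_scale(scale, gpu_ms, gain=0.05):
    """return an updated resolution scale, clamped to [0.5, 1.0]."""
    error = (SETPOINT_MS - gpu_ms) / SETPOINT_MS  # positive = headroom
    return min(1.0, max(0.5, scale + gain * error))

# overloaded GPU -> scale drops; lots of headroom -> scale creeps back up
assert next_scale(1.0, gpu_ms=20.0) < 1.0
assert next_scale(0.7, gpu_ms=8.0) > 0.7
```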
-
@aeva @skade @msfjarvis @yosh meanwhile gamers: "ackhshually running the GPU at 100% all the time is good cause it means you're getting all your money's worth of performance" like those boomers going on about how they paid for the whole screen and stretched a 4:3 picture to 16:9 back when wide-screen TVs were becoming mainstream but a big chunk of media was still not
-
@hazelnot @skade @msfjarvis @yosh well, you can do that, you just have to turn off vsync, and it's not an entirely unreasonable thing to do since it can greatly improve input latency at the cost of tearing
-
@aeva @skade @msfjarvis @yosh lol fair, however, my GPU running at 100% heats up to over 110°C which ok it is rated for but it doesn't feel safe especially on a 6 year old card that could just die at any time anyway
-
@hazelnot @skade @msfjarvis @yosh yeah i don't recommend it personally
-
@aeva actually I'm wondering, does disabling vsync but capping the framerate do anything? Cause that's what I've been doing for shooters to try to improve input lag without cooking my PC, but I have no idea if it helps or if the input lag is caused by the capped framerate itself
-
@hazelnot all it does is trade hitching for tearing when too many frames are late
-
@aeva wait I'm not sure what this means 😅
Does it still cause input lag and stuff? Is it basically just vsync but with tearing but then without tearing cause I cap it externally?
-
@hazelnot so, games commonly follow a cadence more or less along these lines:
1. read input events from the controller
2. update game state
3. render a new frame image
4. present the new frame to the display

steps 1 and 2 happen on the CPU. step 3 starts with a bunch of up-front work on the CPU to schedule work on the GPU, and step 4 is usually initiated on the CPU before step 3 finishes on the GPU, and then some complicated stuff happens
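A minimal runnable sketch of that four-step cadence; the class and method names are illustrative, not a real engine API:

```python
# toy model of the per-frame cadence: steps 1 and 2 are CPU work,
# step 3 stands in for the GPU render, step 4 presents the result.

class ToyGame:
    def __init__(self):
        self.state = 0

    def poll_input(self):             # 1. read input events
        return ["button_press"]

    def update(self, events):         # 2. update game state
        self.state += len(events)

    def render(self):                 # 3. render a new frame image
        return f"frame<{self.state}>"

def run_frame(game, display):
    game.update(game.poll_input())
    display.append(game.render())     # 4. present the new frame

display = []
game = ToyGame()
for _ in range(3):
    run_frame(game, display)
# display now holds ["frame<1>", "frame<2>", "frame<3>"]
```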
-
@hazelnot these steps can be staggered in exciting ways, but let's ignore that. as far as the computer is concerned, the distance in time between frame step 1 and the end of step 4 is your latency. (your monitor will also add latency, but that's out of the game's control)
-
@hazelnot modern computers still use vertical blank interrupts to indicate when it is safe to start scanning out a new frame. these come at a cadence like 60 hz, aka 16.6 milliseconds, and the game's cadence has to match that for everything to look and feel ok when stuff is in motion
-
@hazelnot when vsync is on, and a frame is presented early, the gpu driver blocks the thread that requested the present so that it doesn't run faster than the GPU (this is an oversimplification). when all of the rendering work is finished on the GPU and it is late, the present stalls until the next vblank before flipping the back buffer. so if you're targeting 60 fps on a 60 hz screen, the normal cadence is 16.6 ms, and a hitch adds 16.6 * N ms, where N is how many intervals late you are, rounded up
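That rule can be written down directly, using 60 Hz numbers and assuming the driver snaps every present to the next vblank boundary:

```python
import math

VBLANK_MS = 1000 / 60  # ~16.6 ms at 60 Hz

def present_time(finish_ms):
    """vblank at which a frame finishing at finish_ms can scan out."""
    return math.ceil(finish_ms / VBLANK_MS) * VBLANK_MS

# on time: shown at the first vblank
assert present_time(10.0) == VBLANK_MS
# a fraction of a ms over budget still costs a whole extra interval: the hitch
assert present_time(17.0) == 2 * VBLANK_MS
```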
-
@aeva wow that's wild, I thought vblank stuff ended when CRTs stopped being the mainstream display technology cause LCDs afaik display the whole picture at once(?)
-
@hazelnot since there's some amount of staggering that happens naturally between the CPU and GPU, the end-to-end latency is usually going to be something in the ballpark of the frame interval * 2. so the CPU submits work within its 16.6 ms, the gpu completes its work within its own 16.6, and the two will drift within some overlap, for a normal total of up to 33.3 milliseconds of latency when everything is on time
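Back-of-envelope for that "frame interval * 2" figure, simplifying the overlap so the CPU and GPU each take one full staggered interval:

```python
FRAME_MS = 1000 / 60  # 60 Hz interval

def on_time_latency_ms(frame):
    input_read_ms = frame * FRAME_MS       # start of the CPU interval
    scanout_ms = (frame + 2) * FRAME_MS    # end of the following GPU interval
    return scanout_ms - input_read_ms

# every on-time frame tops out at two intervals, ~33.3 ms
assert abs(on_time_latency_ms(0) - 2 * FRAME_MS) < 1e-9
assert abs(on_time_latency_ms(5) - 2 * FRAME_MS) < 1e-9
```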
-
@hazelnot now if you're not using vsync what happens? the game still needs to throttle the CPU periodically so the rendering work doesn't get backed up, but generally your options are 1) to ignore vsync entirely and if you get a tear you tear, and 2) wait for the vblanks when you're on time but if you're late just tear anyway. option 1 improves latency when it's early and when it's late, but looks terrible. option 2 only improves latency when the frame is late
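Those two vsync-off policies can be sketched like this (times in ms; the function names are mine, not standard terms):

```python
VBLANK_MS = 1000 / 60

def ignore_vsync(finish_ms):
    """option 1: flip the instant rendering finishes; tear freely."""
    return finish_ms

def tear_only_when_late(finish_ms, target_vblank_ms):
    """option 2: wait for the vblank when on time, tear when late."""
    if finish_ms <= target_vblank_ms:
        return target_vblank_ms   # on time: no tear, same as vsync
    return finish_ms              # late: show now instead of stalling a full interval

# on time: option 2 behaves exactly like vsync
assert tear_only_when_late(10.0, VBLANK_MS) == VBLANK_MS
# late: the frame shows at 18 ms instead of stalling to the next vblank at ~33.3 ms
assert tear_only_when_late(18.0, VBLANK_MS) == 18.0
assert ignore_vsync(10.0) == 10.0
```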
-
@hazelnot I have it on good authority that the HDMI protocol involves an emulated virtual CRT, and audio and stuff all gets transmitted in the hblank. Displayport doesn't do that, but vsync is still a thing, so I assume the concept of a vblank still exists, mostly in the form of when it's safe to transmit a new image and/or the monitor signaling the speed at which it can receive or replace frames