

Nice! A big improvement indeed.
I wish you had shown them with similar sharpness settings though. The FSR 3 image is very oversharpened, while the FSR 4 one has the opposite problem, so you can’t really compare any details.


Yeah I just wanted to illustrate that with some numbers :)
It’s a bit counter-intuitive that frame generation is worse the lower your base frame rate is. And Nvidia in particular has no interest in making it clear that this tool is only really good for making a well-running game run even better, and is not going to give your 5070 “4090 performance” in any meaningful way.


I was trying to explain why the game loop would be held back by the rendering speed, even though they run on different hardware.
If you are bottlenecked by the GPU, that means the game loop spends some of its time waiting for the GPU. If you then turn on frame generation, you devote part of the GPU to doing that, which makes regular rendering slower, making the game loop spend even more time waiting. This will increase input latency.
Frame generation also needs to delay output of any real frame while it creates and inserts a generated frame. This will add some output latency as well.
In the opposite scenario, where you are bottlenecked by the CPU, enabling frame generation should in theory not impact the game loop at all. In that case it’s the GPU that’s waiting for the CPU, and it can use some of those extra resources it has to do frame generation with no impact on input latency.
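To put a rough number on that output delay, here’s a toy sketch (my own assumptions: 2x interpolation, 30 real fps, and output frames spaced evenly, not any vendor’s actual pipeline):

```python
# Toy model: the generated frame sits *between* two real frames, so each real
# frame has to be held back by roughly half a real frame interval so the
# in-between frame can be shown first.

real_fps = 30               # assumed real rendering rate with framegen enabled
frame_ms = 1000 / real_fps  # time between finished real frames

# Without frame generation: a real frame is shown as soon as it is finished.
without_fg = [n * frame_ms for n in range(4)]

# With 2x frame generation: output is evenly spaced at frame_ms / 2, and the
# slot right after a real frame finishes goes to the generated frame.
with_fg = [n * frame_ms + frame_ms / 2 for n in range(4)]

for n, (a, b) in enumerate(zip(without_fg, with_fg)):
    print(f"real frame {n}: shown at {a:5.1f} ms without FG, {b:5.1f} ms with FG")

print(f"extra output latency: ~{frame_ms / 2:.1f} ms")
```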


Most games aren’t bottlenecked by your CPU at all. The CPU spends a lot of its time waiting for the GPU to be done drawing you a picture.
“Why isn’t the game doing other stuff meanwhile?” you might ask. Part of the answer is surely “Why do stuff faster than the player can see?”. Another part is likely the need to synchronize the simulation and the rendering so it doesn’t show you some half-finished state. And a third part might be that it would be very confusing for the player to decouple the game state from what they see on screen: you see yourself aiming at the monster, but actually it moved in between frames, so your shot will miss even if the crosshair is dead on.
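As a rough illustration of that coupling, here’s a toy lockstep loop (all function names are hypothetical, not from any real engine), where the simulation can’t advance past a frame until the GPU has presented it:

```python
import time

def read_input():
    return {}                 # placeholder for polling the player's input

def simulate(state, inputs):
    return state              # placeholder for advancing the game state

def render_and_present(state):
    time.sleep(0.016)         # stand-in for waiting on the GPU / vsync

state = {}
for _ in range(3):
    inputs = read_input()
    state = simulate(state, inputs)   # what you see is exactly this state...
    render_and_present(state)         # ...because the loop blocks here until
                                      # the frame has actually been shown
```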


Framegen is worse the lower your base frame rate is.
The penalty to the speed at which the game runs is much more significant. If you normally run at 40 fps and framegen gives you 60 (30 real), then you have introduced 8 ms of latency just from that, while the same 25% performance cost going from 180 fps to 270 (135 real) adds just 2 ms.
The lower your real frame rate is, the harder it is to interpolate between frames, because the changes between frames are much larger, so it will look worse. Also, the lower your frame rate, the longer any mishaps will remain on screen, making them more apparent.
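Spelling out that arithmetic: the added input latency is just the difference between the real frame times before and after enabling framegen.

```python
def added_latency_ms(native_fps, real_fps_with_fg):
    # difference in real frame time once framegen eats part of the GPU
    return 1000 / real_fps_with_fg - 1000 / native_fps

# 40 fps native -> 60 fps shown with 2x framegen, 30 fps real
print(f"{added_latency_ms(40, 30):.1f} ms")    # ~8.3 ms

# 180 fps native -> 270 fps shown, 135 fps real
print(f"{added_latency_ms(180, 135):.1f} ms")  # ~1.9 ms
```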


📎 Looks like you’re trying to hold B! Would you like help with that?


If you want newer stuff the non-stable branches of Debian are perfectly usable.
Testing (the upcoming release) should be your first stop. But even Unstable works just as well as most other distros. There might be the occasional issue, but anything serious is generally fixed quickly.
Debian stable is intended for use cases where an update must never change anything that could cause any problem. For the average desktop it’s perfectly fine to have things change or to be mildly inconvenienced every now and then.


4K is an outrageously high resolution.
If I were conspiratorial, I would say that 4K was normalized as the next step above 1440p in order to create demand for many generations of new graphics cards, because it was introduced long before there was hardware able to use it without serious compromises. (I don’t actually think it’s a conspiracy though.)
For comparison, 1440p has 78% more pixels than 1080p. That’s quite a jump in pixel density and required performance.
4K has 125% more pixels than 1440p (300% more than 1080p). The step up is massive, and the additional performance required is as well.
There is a resolution that we are missing in between them, though. 3200x1800 is the natural next step above 1440p*. At 56% more pixels it would be a nice improvement, without an outrageous jump in performance. But for some reason it doesn’t exist outside of a few laptops.
*All these resolutions are multiples of 640x360. 720p is 2x, 1080p is 3x, 1440p is 4x, and 4K is 6x. 1800p is the missing 5x.
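If you want to check the math, the pixel counts work out like this:

```python
# Every one of these resolutions is an integer multiple of 640x360,
# and the pixel-count jumps get steep at the top.
base_w, base_h = 640, 360
steps = {"720p": 2, "1080p": 3, "1440p": 4, "1800p": 5, "4K": 6}

pixels = {name: (base_w * k) * (base_h * k) for name, k in steps.items()}

print(f"1440p vs 1080p: +{pixels['1440p'] / pixels['1080p'] - 1:.0%}")  # +78%
print(f"1800p vs 1440p: +{pixels['1800p'] / pixels['1440p'] - 1:.0%}")  # +56%
print(f"4K vs 1440p:    +{pixels['4K'] / pixels['1440p'] - 1:.0%}")     # +125%
print(f"4K vs 1080p:    +{pixels['4K'] / pixels['1080p'] - 1:.0%}")     # +300%
```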


Definitely, but it’s impossible to do for everyone using an adapter.


Overcurrent protection on each pin should definitely be mandated by the standard.
But it’s important to keep in mind that Nvidia has 90% market share and can do whatever they want. If PCI standardized something Nvidia didn’t agree with, then there simply would not be any implementations of the standard, and Nvidia cards would use a non-standard connector. It’s that simple.


It was Nvidia that designed the original connector and forced it upon the world. PCI has been trying to make it less bad, but it was standardized after it had already been created, not the other way around.


Yeah, I realize now that I’d have to do that anyway, as PTM pads are not available in those thicknesses (if the material even works well at those thicknesses).
What I should do is get some thermal putty to replace the pads, so I don’t have to bother with getting and cutting the right size of pads.
I also found a PTM pad on the market from Cooler Master, called Cryonamics. But it seems like a very new product; I can find no one even so much as mentioning it online. It’s half the price of the Thermal Grizzly so I’m tempted to try it.


Honestly, it must have been a manufacturing error. Which is no excuse, QC should have caught it.
You’d think that high prices would mean the ability to have higher quality manufacturing without affecting the margin much. But I think much of that money is going to TSMC, Nvidia and AMD, with third-party manufacturers getting squeezed as well. But idk.


Really interested in trying PTM on my graphics card, but it’s still too expensive. You need several sheets to cool all the components and Thermal Grizzly is the only brand I can get a hold of.
It’s cool (hehe) that it’s even available at regular computer retailers though.


I have no idea if this is true, but it certainly fits the very strange vibe of the game.
It’s like how I would imagine the most violent cops see the world.
All people are awful. Every criminal is a heavily armed, highly trained, fearless lunatic, who does not care if they live or die, as long as there’s a tiny chance they can hurt more people. Civilians are uncooperative, ungrateful, and suicidal.
Every deployment, no matter how routine, will likely lead you into an ambush by dudes with assault rifles.
Avoiding bloodshed is almost impossible and even trying is likely to get you killed.


The game has some of the strangest bugs.
The last time I tried to play, I had no UI at all, but only in multiplayer. It worked fine in single player, but if I joined or created a multiplayer game, the whole HUD was just gone and nothing could make it appear.
That’s really the only time I’ve tried to play it since 1.0, and I’m not going to blame them for bugs in early access. But losing the mission after the last civilian (Daniella Voll) managed to trap two officers in a bugged closet and slap us to death was as infuriating as it was hilarious.


They have been circling the drain for a long time now. With this cancelation I think they’re done. They can’t live off Payday 2 forever.