Desktop res 1080p = 99% GPU util. Desktop res 4K = 50-60% GPU util
Started by moccor
moccor
Member
17 posts, 8 threads, Joined: Jun 2018

11-12-2019, 07:30 PM
#1
So I am not sure why this is happening; I have no problems at all with 4K gaming performance in general. I noticed this problem when rendering RPCS3 @ 4K on my 1080p monitor: after moving to my 4K TV, I saw much lower RPCS3 FPS in-game. So I set my TV to 1080p while keeping RPCS3 at 4K rendering resolution and got all the FPS back. Basically, at a 1080p desktop resolution my GPU is fully utilized by RPCS3, but at a 4K desktop resolution RPCS3 only uses about 50% of my RTX 2080. I don't experience this in any non-RPCS3 games, just to rule that out.
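In case anyone wants to compare numbers, here's a rough Python sketch for logging average GPU utilization while switching desktop resolution. It assumes nvidia-smi is on the PATH, the sample count and interval are arbitrary placeholders, and it just reads the first GPU that nvidia-smi lists (so the index may need adjusting to hit the AGA 2080):

Code:
# Poll nvidia-smi once per second and print the average GPU utilization,
# so runs with the desktop at 1080p vs 4K can be compared.
import subprocess
import time

SAMPLES = 60       # roughly one minute of readings (placeholder)
INTERVAL_S = 1.0   # seconds between polls (placeholder)

readings = []
for _ in range(SAMPLES):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    # nvidia-smi prints one line per GPU; take the first one here.
    readings.append(int(out.splitlines()[0].strip()))
    time.sleep(INTERVAL_S)

print(f"average GPU utilization: {sum(readings) / len(readings):.1f}%")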

Specs - 
Alienware m15 w/8750H + 2070 Max-Q
16GB RAM
Alienware AGA + 2080

Edit: It seems to be caused by ReShade. And since ReShade + Vulkan support is still in its early stages, I wouldn't be surprised if it was entirely due to ReShade. But at least now, if anyone else has this problem, they'll know why. I just wonder if there is some information I can share with the ReShade or RPCS3 devs to potentially fix this?

Edit 2: Something else weird I noticed, and this time it is 100% without ReShade: RPCS3 performs much worse when using a custom resolution vs. a native resolution. Why could that be? If I run it @ native 1080p it performs at X fps. If I run it @ native 4K it also performs at X fps. But if I create a custom resolution of 3840x2160 for my 1080p monitor, it performs at Y fps, roughly 25% less. Reverting from the custom resolution to native 1080p, you can immediately see the FPS increase.
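For reference, the rough arithmetic behind that estimate (the FPS values below are placeholder examples, not real measurements):

Code:
# Placeholder numbers only, to show how the ~25% figure is estimated.
native_fps = 60.0   # FPS at a native desktop resolution (hypothetical)
custom_fps = 45.0   # FPS with the custom 3840x2160 resolution (hypothetical)

drop_pct = (native_fps - custom_fps) / native_fps * 100
print(f"FPS drop with the custom resolution: {drop_pct:.0f}%")  # -> 25%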
This post was last modified: 11-12-2019, 11:20 PM by moccor.

