Hey all, I know that switchable graphics is a thing in laptops, where there is usually a single port. But how would you go about it on a desktop? Do you plug your monitor into the onboard HDMI or into the dGPU port? There are other issues associated with doing it, of course, but I thought it might save on power and noise if I used the iGPU as much as possible.
Only have an Nvidia GPU at the moment, but hoping to get an AMD 9070 at some point.
I’m not sure if you’ll really save yourself much electricity with what you’re planning, if any.
GPUs shouldn’t really be pulling much power when they aren’t being used, but as another poster mentioned, you can test this with a Kill A Watt or other similar tool. If you care about electricity usage and don’t have one, it’s a great lil thing to have around.
You can test by doing the following:
-Run the PC at idle without the GPU inserted into the mobo, then test again with the GPU plugged in.
If you want to check whether switching back and forth between output ports (iGPU vs. card) saves anything, you can try swapping your monitor cable while the PC is running, but there’s a good chance that the GPU will stay active even without a display plugged in. You can test this by:
-Boot the PC with the GPU inserted and the monitor plugged into the GPU, then swap the cable to the iGPU and see if that makes a difference. I highly doubt it will be noticeable, but if it seems worth it to you, I think the easiest way to switch is a KVM switch or another device made to share a single display between different devices.
But at that point you’re saving pennies at best, and it’ll all go towards your new KVM switch unless you want to be plugging and unplugging a cable all the time… And I’m no expert in how various components draw power in a PC when they’re not being used (probably firmware/OS dependent), but I still think your GPU will draw power even when it’s not actually connected to a display.
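If you want a software-side sanity check alongside the wall meter, something like this should work (assuming an Nvidia card with the proprietary driver, so nvidia-smi is available; the reading is the driver’s own estimate, not a wall measurement):

```python
# Quick sanity check of idle GPU power draw via nvidia-smi.
# Assumes an Nvidia card with the proprietary driver installed;
# readings are the driver's own estimate, not a wall measurement.
import subprocess
import time

def gpu_power_draw_watts() -> float:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return float(out.splitlines()[0])  # first GPU only

# Sample for 30 seconds while the machine sits idle.
samples = []
for _ in range(30):
    samples.append(gpu_power_draw_watts())
    time.sleep(1)

print(f"idle draw: min {min(samples):.1f} W, "
      f"max {max(samples):.1f} W, "
      f"avg {sum(samples) / len(samples):.1f} W")
```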
Just keep your stuff plugged into your GPU, my guy. If you want to drop power use and noise, tweak the power/fan curves, underclock it, make sure you’ve got good airflow, etc.
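If you go the power-limit route, here’s a rough sketch of doing it from a script (assumes an Nvidia card and root access; the supported range varies by model, and you can do the same thing with a single nvidia-smi -pl command):

```python
# Rough sketch: cap the GPU board power with nvidia-smi.
# Assumes an Nvidia card and root privileges; supported limits
# vary per model, so query the allowed range first.
import subprocess

def query(field: str) -> str:
    return subprocess.run(
        ["nvidia-smi", f"--query-gpu={field}",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

min_w = float(query("power.min_limit"))
max_w = float(query("power.max_limit"))
print(f"allowed power limit range: {min_w:.0f}-{max_w:.0f} W")

# Pick something conservative, e.g. 75% of the maximum.
target = max(min_w, 0.75 * max_w)
subprocess.run(["nvidia-smi", "-pl", f"{target:.0f}"], check=True)
```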
Thanks, I don’t think I’ll bother with the power side of it, but I’d still be interested to know.
Unfortunately, I don’t think this would work.
The answer to where you should plug in is directly into your GPU. Routing output from your discrete GPU through your iGPU causes throughput issues, because every frame has to be copied back across the PCIe bus. Even in simple games at low resolutions, where bandwidth wouldn’t be a problem, you’d still be introducing extra input lag. That’s why connecting your display to your motherboard is usually considered a rookie mistake.
But obviously, if you’re outputting through your discrete GPU, that silicon still has to stay powered to drive the display even while the iGPU does the rendering, which I believe would erase any potential power savings.
I think the better solution, if you really want to maximize power savings, would be to use a conservative power setting on your main GPU and do things like limiting your framerate or selecting lower resolutions to reduce power draw in applications where you don’t need the extra grunt. Modern GPUs should be pretty good at minimizing idle power draw.
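If you want to verify that last part, you can check the card’s reported performance state at idle (assuming an Nvidia card here; P0 is full tilt, P8 and below are low-power idle states):

```python
# Check whether the GPU actually drops into a low-power state at idle.
# Assumes an Nvidia card; P0 is full performance, P8 and below are
# idle/low-power states.
import subprocess

pstate = subprocess.run(
    ["nvidia-smi", "--query-gpu=pstate", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(f"current performance state: {pstate}")  # e.g. "P8" when idle
```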
Get a wall plug power meter and check for yourself. It’s a cheap investment and always fun to be able to check the power draw of different appliances once you’re done testing your computer. :)
In my experience there’s little power saving in your scenario. The idle power consumption for a GPU tends to be pretty low these days. The only major issue is running multiple monitors with different vertical timings: that will lock the VRAM frequency to maximum and use a lot more power.
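You can watch it happen if you’re on an Nvidia card (a quick sketch; the exact idle clock varies by model). Run it with one monitor connected, then again with both:

```python
# Watch the VRAM clock to see if mismatched monitors keep it pinned.
# Assumes an Nvidia card; on a properly idling GPU the memory clock
# should drop to a few hundred MHz, not sit at its maximum.
import subprocess
import time

for _ in range(10):
    mem_clock = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.mem",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(f"VRAM clock: {mem_clock} MHz")
    time.sleep(2)
```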
Oh, I gotta check this. I’ve always been running a slower screen alongside the main 144Hz one, but this is the first time I’ve heard about it affecting power draw this way.
Yea, that’s what I kinda figured, to be honest. Probably the only scenario where it might be useful would be doing passthrough to a VM or something.
It’s best with dual AMD GPUs, and maybe Intel. YMMV once you throw Nvidia into the mix.
Monitor needs to be plugged into the GPU to utilize it.
In what setup? Using switchable graphics?
With your desktop
I’m pretty confused what you’re getting at to be honest.
If the monitor is plugged into the motherboard, it will exclusively use the integrated GPU. If it is plugged into the video card, it will exclusively use the discrete GPU. A standard desktop motherboard lacks the extra wiring to allow routing discrete graphics through the onboard video output or vice versa.
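If you’re on Linux with X11, you can see this for yourself (assuming xrandr is installed): each GPU shows up as its own provider with its own physical outputs.

```python
# List which GPUs X11 sees as display providers.
# Assumes Linux with X11 and xrandr installed; on a desktop each
# GPU typically only drives its own physical outputs.
import subprocess

out = subprocess.run(
    ["xrandr", "--listproviders"],
    capture_output=True, text=True, check=True,
).stdout
print(out)
```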
OK, got it. Thanks
The question is in regards to a desktop, not a laptop. :)
Yet PRIME also works on desktop, and Bumblebee does in some scenarios as well.
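For example, here’s a minimal sketch of PRIME render offload on a desktop (assuming Linux with PRIME set up; glxinfo from mesa-utils is only used here to verify which GPU actually rendered):

```python
# Launch a program with PRIME render offload: the display stays on
# the iGPU while rendering happens on the discrete GPU. Assumes
# Linux with PRIME set up and glxinfo (mesa-utils) installed.
import os
import subprocess

env = os.environ.copy()
# Nvidia proprietary driver (render offload):
env["__NV_PRIME_RENDER_OFFLOAD"] = "1"
env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"
# (On Mesa/AMD you'd set DRI_PRIME=1 instead.)

out = subprocess.run(
    ["glxinfo", "-B"], env=env,
    capture_output=True, text=True, check=True,
).stdout
# The "OpenGL renderer string" line shows which GPU did the rendering.
for line in out.splitlines():
    if "renderer" in line.lower():
        print(line.strip())
```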
Not sure if that’s a good idea. Switchable GPUs suck bad enough on laptops, where that’s a standard use case. On a desktop, where it’s more exotic, I’d be very surprised if all of that worked without issues.
I’ve generally had pretty good success with laptops, but they were always Intel + Nvidia with only one HDMI port available.
I’ve had some pretty bad issues with Intel + Nvidia laptops.
In general, the less mainstream your configuration gets, the more complicated and error-prone it becomes.