Message boards : GPUs : Multiple AMD GPUs = crash?
Joined: 17 Nov 16, Posts: 893
Ian has lots of experience with those USB-based risers. He has always said you have to use quality shielded USB cables for anything to work, or the cards fall off the bus.
Joined: 24 Dec 19, Posts: 229
Not always, but they do have a high failure rate at PCIe 3.0 speeds; sometimes they work, though. I had maybe 2 work OK out of over 10 cables. Peter has old boards that aren't even capable of PCIe 3.0, and I think some of the slots are PCIe 1.0 if I recall correctly. But it's probably a motherboard limitation: it can't handle multiple GPUs.

I don't have any experience running multiple AMD GPUs for BOINC, but a few years ago I had a machine mining with 8x RX 570s. That system used a Pentium dual-core CPU on a Z270 motherboard running Windows 10. There's nothing special about an AMD system vs. an Nvidia system; you just need the right platform to begin with.

I think if you want to seriously run a multi-GPU setup you need to invest in newer hardware: Z170 or newer, something with lots of PCIe slots. Intel motherboards seem to play nicer with this kind of thing than AMD motherboards.
Joined: 8 Nov 19, Posts: 718
I've run multiple RTX 2080 Tis on PCIe 3.0 with cheap x16 risers for many years, and for the most part they run without problems. When I do run them, I make sure I get the 'double ribbon' x16 risers, the ones where two ribbons run in parallel with one another. They're good enough for up to 20 cm and have never really failed me. I've had 1 or 2 bad ones out of a good 10: one because I had bent the cable and the solder joints came off, and one with contact pins worn from swapping GPUs too often (over 20 swaps) that only ran at x4 or x2 speeds, probably bad contact points. Possibly one more with errors that could have had a variety of non-PCIe causes, but I threw it away anyway and installed a new one.

Once you get past 20 cm (~8 in), you need shielded risers, the more expensive black ones. They have much better cable mounting and can withstand much more torquing and flexing. For mounting a GPU once and never touching it, the regular grey ones (with parallel ribbons) are good enough.

If I were you, I'd try installing the GPUs on the motherboard without risers, just to see whether the motherboard accepts them that way (to rule out any bad risers). You might also need to go into the BIOS to check that the right speeds (PCIe 2.0 or 3.0) are selected, and whether you need to enable a second GPU slot. On some boards you first need to set the "Above 4G Decoding" option if your GPU has more than 4 GB of VRAM. If your BIOS supports it, you can see whether the GPUs are recognized, which slots they populate, the slot width (x4/x8/x16), and the PCIe speed they run at (PCIe 2.0/3.0/...).

If you can't even boot into the BIOS with 2 GPUs, try swapping one of the GPUs for a spare older one you may have lying around. A GT 710 is a rather poor GPU, but a GT 730 (with DDR3) is a decent GPU for regular day-to-day activities that only costs about $25-50 on the second-hand market.
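If the box runs Linux, you can also cross-check the negotiated link speed and width from the OS instead of the BIOS, via sysfs. A minimal sketch; the `pcie_gen` mapping only covers PCIe 1.0-4.0, and the exact sysfs string varies a little by kernel version:

```python
import glob
import os

def pcie_gen(speed_str):
    """Map a sysfs current_link_speed string to a PCIe generation label."""
    gens = {"2.5 GT/s": "1.0", "5.0 GT/s": "2.0",
            "8.0 GT/s": "3.0", "16.0 GT/s": "4.0"}
    for rate, gen in gens.items():
        if speed_str.startswith(rate):
            return gen
    return "unknown"

def list_links():
    """Print negotiated link speed/width for every PCI device that reports one."""
    for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
        try:
            with open(os.path.join(dev, "current_link_speed")) as f:
                speed = f.read().strip()
            with open(os.path.join(dev, "current_link_width")) as f:
                width = f.read().strip()
        except OSError:
            continue  # device exposes no PCIe link attributes
        print(f"{os.path.basename(dev)}: PCIe {pcie_gen(speed)} x{width}")

if __name__ == "__main__":
    list_links()
```

If a card that should be x16 shows up as x4 or x2 here, that points at a bad riser or worn slot contacts rather than a driver problem.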
Joined: 8 Nov 19, Posts: 718
Seems like your motherboard supports multiple GPUs, so that's out of the equation. You could try installing a USB riser if you can, or replace the ribbon riser with a powered ribbon riser. Chances are the motherboard is either not capable of powering 2 GPUs through the PCIe slots, or only barely; using a powered ribbon or USB riser might alleviate the situation. Other causes might be a driver issue, bad hardware (a broken GPU), bad overclocking settings, ...

The "Above 4G Decoding" BIOS option is only needed if you have GPUs with more than 4 GB of VRAM. For 4 GB and below (even if you run 10 of them), you don't need to enable it.

Older motherboard BIOSes don't show much info on PCIe devices; the blue-background AMI BIOS kind doesn't, for example. Though sometimes they will show which PCIe slot (00:02.0, 00:04.0, whatever) is populated...
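On Linux you can separate a driver issue from a hardware one even when the BIOS shows nothing: check whether a kernel driver actually bound to each PCI address. A small sketch using the standard sysfs layout (which addresses show up depends entirely on your system):

```python
import glob
import os

def bound_drivers():
    """Return {pci_address: driver_name} for PCI devices with a bound driver."""
    out = {}
    for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
        drv_link = os.path.join(dev, "driver")
        if os.path.islink(drv_link):
            # The symlink target's basename is the driver (e.g. amdgpu, nouveau).
            out[os.path.basename(dev)] = os.path.basename(os.readlink(drv_link))
    return out

if __name__ == "__main__":
    for addr, drv in bound_drivers().items():
        print(addr, "->", drv)
```

A GPU that appears in the list with no driver bound (or doesn't appear at all) suggests a slot/riser/BIOS problem; one that is bound but still crashes points back at the driver.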
Joined: 25 May 09, Posts: 1302
I do have one GPU that's annoying me more and more often; I'm currently swapping things around to see if I can isolate the problem. I don't overclock.

Memory addressing of GPUs by the motherboard is not a simple "x GB on the GPU = x GB on the motherboard" mapping. There is a whole pile of other things that get in the way, including, but not limited to: how the PCIe bus is mapped onto physical memory at both ends of the link, driver mapping, data compression, whatever else is going on, GPU hardware and BIOS, etc. The whole subject is a real pain and is well outside the scope of BOINC to even consider managing. Indeed, as you suggest, it is an issue that resides somewhere in the infernal triangle of BIOSes, operating system, and GPU drivers :-(

There is no simple solution. First make sure all the components work well on their own. For GPUs that means only one plugged in directly: check each one in turn in a given slot without an extender in place; then check each one on its own on several extenders (this should weed out a few of the problematic extenders); then try each GPU in turn singly on a splitter, and so on. As you rightly imply, it takes a lot of time to go through all the combinations. Eventually you may end up with a working system with multiple GPUs connected to the motherboard via splitters and extenders. Don't forget to have an adequate supply of coffee to hand....
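The elimination procedure above can be written out as a test matrix so no combination gets skipped. A sketch with placeholder part names (substitute your own GPUs, slots, and extenders):

```python
from itertools import product

# Hypothetical inventory; replace with your actual parts.
gpus = ["gpu-A", "gpu-B"]
slots = ["slot1_x16", "slot2_x8"]
extenders = [None, "riser-1", "riser-2"]  # None = plugged in directly

# Phase 1: each GPU alone, directly in each slot (no extender).
phase1 = [(g, s, None) for g, s in product(gpus, slots)]

# Phase 2: each GPU alone on each extender, in one known-good slot.
phase2 = [(g, slots[0], e) for g, e in product(gpus, extenders) if e]

test_plan = phase1 + phase2
for step, (gpu, slot, ext) in enumerate(test_plan, 1):
    link = ext if ext else "direct"
    print(f"test {step}: {gpu} in {slot} via {link}")
```

Even with this small inventory the plan is 8 boots; working through it in order at least guarantees each part gets tested in isolation before you start combining them.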
Copyright © 2024 University of California.
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License,
Version 1.2 or any later version published by the Free Software Foundation.