Why will my second graphics card not display anything when plugged into a monitor?
I have a two-graphics-card setup. Both cards show up in Device Manager, but any monitor plugged into the second card is not detected. I have narrowed the problem down to the card itself. Normally I would assume that means the card went bad, but it appears in Device Manager just fine, with drivers fully updated. What does this mean? Why won't it drive any monitors plugged into it?
- Anonymous, 10 months ago
When two video cards are linked together, the second card is often used only for computation, and its rendered frames are sent out through the first card's display outputs. In effect, the second card acts as a slave to the first. AMD calls this technology CrossFire; Nvidia calls it SLI. If your cards are running in such a linked mode, that would explain why monitors on the second card stay dark even though the card itself is healthy and visible in Device Manager.