
2nd Video Card not Recognized after reset



#1
VidenTheColdOne
  • Member
  • 53 posts
UPDATE 12/20/08: I called EVGA, and they told me to put the faster card in the bottom slot and the slower one in the first slot. I did so (which was a pain). It booted up and recognized the card. Then I reset, and BOOM, the card is gone AGAIN. Could someone PLEASE help me?!



I just got a second EVGA 7950 GT to run in SLI. I installed it, and it ran fine. I set up SLI with no problems. Later I decided I wanted to watch a movie, so I connected my TV to my video card using the S-Video and RCA cables. I reset my computer, and it stopped recognizing the second card. It doesn't show up in the hardware list, and I can't get to the SLI options in the NVIDIA control panel. I updated the drivers and everything, but it won't recognize it. I tried just taking the cable out and restarting, but that doesn't seem to help. I have to completely take out the card, put it back in, and then reboot. Only then will the computer recognize it again. I made sure the card was seated correctly and the power was hooked up securely. I can't understand why it does this. Any ideas? If you need any more information, just ask. My system specs are below. DXDiag is included.

UPDATE: I put the card back in, turned my computer on, and it showed up again. I had to reset again, and when I did, it disappeared again. So I don't think the S-Video has anything to do with it.
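In case it helps whoever replies, a quick way I can double-check whether Windows itself still enumerates both cards (rather than the NVIDIA control panel just hiding the second one) would be something like the throwaway script below. It assumes Python is installed and uses the WMIC tool that ships with XP; it isn't anything from EVGA or NVIDIA, just an illustration.

# Throwaway check: ask Windows (via WMIC) which video controllers it
# currently enumerates, so I can tell whether the second 7950 GT is gone
# from the OS or only hidden in the NVIDIA control panel.
import subprocess

out = subprocess.check_output(
    ["wmic", "path", "win32_videocontroller", "get", "Name,PNPDeviceID,Status"]
)
print(out.decode(errors="replace"))

Two 7950 GT entries in the output would mean Windows still sees both cards; only one entry would mean the OS itself has dropped the second card.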


GPU - 1x EVGA 7950 GT KO 512MB, 1x EVGA 7950 GT 512MB (I was told the factory overclock on the one card doesn't matter for SLI, only the chipset and the memory)
CPU - Intel Pentium D 3.4GHz dual core
RAM - 3GB DDR2
PSU - 650W Vortec
MOBO - ECS NFORCE 570 SLIT-A V5.1
HDD - 2x SATA Maxtor 250GB, 1x SATA Samsung 500GB
OS - Windows XP SP3


Please help!

Attached Files


Edited by VidenTheColdOne, 20 December 2008 - 07:49 AM.



#2
VidenTheColdOne
  • Topic Starter
  • Member
  • 53 posts
For the love of god, could somebody please help me? I need suggestions. Could someone TRY to walk me through this? I can't try anything if nobody will respond. It's been over a week and I have gotten no reply. PLEASE HELP!!!

#3
dji
  • Member
  • 27 posts
You didn't provide enough information, so:
Which video output of the graphics card are you using for the monitor, RGB (VGA) or DVI?
Install the newest NVIDIA drivers for your card.

1. Test the system with just one card in the primary slot and without the S-Video cable.

If that works,

repeat the test, but with only the other card.

If that works:

2. Put both cards in without connecting the SLI bridge.

- In the NVIDIA driver settings, select the option OPTIMIZED FOR MULTI DISPLAYS.
- Connect your RCA cable to the graphics card that is in the primary PCI-E slot.
- In the viewing mode, do not use CLONE until you know exactly what you are doing.
Over an RCA (or S-Video) cable you can reach at best a resolution of 1024x768 on that output (supposing your TV supports it).
For a higher resolution you must use the DVI, RGB, or HDMI outputs. If your TV supports HDMI but your graphics card has no such output, you can buy a DVI-to-HDMI adapter and reach Full HD (1920x1080) on the TV, if your TV supports it.
So for now, set the viewing mode to Multiple Displays, meaning each output will have its own resolution.
- For the primary display select your monitor, and for the secondary select the TV.
- Set the TV resolution to 800x600 (that is just for now; you can later increase it up to 1024x768 if your TV supports it).
- Identify your displays in the driver settings (you should see the large numbers 1 and 2 on both the monitor and the TV).
If your TV is not recognized, turn on the FORCE TV DETECTION option and manually set the parameters (PAL or NTSC).
Restart the computer if asked to.
- If your TV is still not recognized, try connecting the RCA cable to the second card and repeat the process until you get the numbers 1 and 2 over both screens, the monitor and the TV.
- Now set the TV resolution to the one you want and check whether the TV supports it (if you lose the picture on the TV, you have exceeded the maximum resolution your TV supports).


Try several restarts with and without the TV turned on.

If everything is fine, then shut down the computer, connect the SLI bridge,
and try it again. (A rough script to list which display outputs Windows currently sees is sketched below.)
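Just as a rough illustration, not an NVIDIA tool (it assumes you have Python installed; the structure and constant names come from the Win32 API): the sketch below asks Windows, via EnumDisplayDevices, which display outputs it currently sees and whether each one is attached to the desktop. That can help confirm whether the TV output was detected at all.

# Sketch only: enumerate display devices through the Win32
# EnumDisplayDevices API and report whether each one is
# currently attached to the desktop.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001

dev = DISPLAY_DEVICE()
dev.cb = ctypes.sizeof(dev)
i = 0
while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    print("%s | %s | %s" % (dev.DeviceName, dev.DeviceString,
                            "attached to desktop" if attached else "not attached"))
    i += 1

Run it once with the TV cable connected and once without, and compare which outputs show up.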

#4
VidenTheColdOne
  • Topic Starter
  • Member
  • 53 posts
The video card has a DVI connection, but my monitor is VGA, so I have to use an adapter. It seems to have nothing to do with the S-Video. I unplugged the S-Video completely and tried to get it working. I restarted, and both cards were showing and in SLI. It will work fine, but as soon as I reset, the second card disappears and cannot be found again until I take it out and put it back in. I called EVGA, and they told me to put the slower card on top and the overclocked card on the bottom. I did this, and the same thing happened. All my drivers are updated as of yesterday. Right now, I couldn't care less about getting my TV to work with my computer. I just want to be able to run the two cards in SLI.

#5
dji
  • Member
  • 27 posts
After the second card "disappears", what exactly do you see in the Device Manager tree under Display Adapters? Only one card, or both?
I don't know how familiar you are with SLI, but the point of SLI is to let the system present multiple GPUs as one powerful single GPU. If you need to use each GPU separately, you must disconnect the SLI bridge and select the option OPTIMIZED FOR MULTI GPUs in the NVIDIA driver settings.

#6
VidenTheColdOne
  • Topic Starter
  • Member
  • 53 posts
Is that what is supposed to happen when you turn on SLI? It just shows as one card in the Display Adapters section. Also, in the NVIDIA control panel, the SLI options disappear.

Edited by VidenTheColdOne, 24 December 2008 - 07:51 AM.






