
SLi



#1
Seven!

    Member

  • Member
  • 161 posts
What is the advantage of an SLI system over a single-GPU system? If I don't do extremely hardcore gaming, should I even bother? Should I get an SLI-compatible motherboard right now, as well as a video card that supports it, so it's easy to upgrade in the future? Thanks.

#2
jrm20

    System building expert

  • Retired Staff
  • 2,394 posts
You only really notice it in games at high resolutions, where it won't bog down as much as it would with a single video card.

#3
warriorscot

    Member 5k

  • Retired Staff
  • 8,889 posts
SLI's primary use is to allow high settings in games and other 3D apps on very high resolution monitors (above 1600x1200), not really as an upgrade tool. With a second card you do notice a difference at the lower, more normal resolutions, but it's nothing like you would expect; the cards don't really start to work properly until you get to the high resolutions.

#4
p-zero

    Member

  • Member
  • 276 posts
To me it was worth every penny. Another advantage of running SLI is that you can buy two "mediocre" cards rather than one awesome one, for less money. For example, a nice 7800 GTX is roughly $400, whereas I got two 6800s for $99 each after rebate. And I run everything at max settings with NO slowdown. In the more graphically intensive games, my fps drops to roughly 80, whereas with a single better GPU it would drop below that. And if you do go the SLI route, make sure the board you get sees the "full" 32x (16x per card) rather than only 16x (8x per card). There are only a couple of true 32x SLI boards out there; most are only 16x.
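
As a rough sketch of the price-per-frame math (the prices and the 80 fps figure are the ones from this thread; the single-GTX fps is just an assumed number for comparison):

    # Rough price-per-frame comparison; prices and fps are the examples
    # quoted in this thread, not benchmarks (the GTX fps is an assumption).
    def price_per_fps(total_price, avg_fps):
        return total_price / avg_fps

    sli_6800s = price_per_fps(2 * 99, 80)   # two 6800s in SLI, ~80 fps quoted
    single_gtx = price_per_fps(400, 70)     # single 7800 GTX, assumed ~70 fps
    print(f"SLI 6800s: ${sli_6800s:.2f} per fps")
    print(f"7800 GTX:  ${single_gtx:.2f} per fps")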
-Pete.

#5
warriorscot

    Member 5k

  • Retired Staff
  • 8,889 posts
Performance doesn't always work out that way; you can sometimes get a cheap deal on cards, but you shouldn't count on it. And remember, it's two cards to replace on an upgrade. Lots of people pass cards down from a new PC to an older one, or spend a little to upgrade an old mobo to take the new card, and the SLI mobos are expensive, so you've got to balance up all the practicalities. You also need a more powerful PSU, as two 6800s, for example, use much more power than a single GTX, and SLI is unsuitable in a media center PC that you need to keep quiet. Even with two specialist coolers, a pair of them can be quite loud.

#6
p-zero

    Member

  • Member
  • 276 posts
Very true, Scot. The SLI motherboards are definitely a lot more expensive. You can get a good motherboard for about $100, whereas SLI mobos start at about $130 (depending on where you look), and the one that I got costs about $230. As for the GPUs, I believe nVidia is trying to phase out the 6 series; I've noticed the price on them has dropped drastically. [bleep], 3 months ago the 6800s I bought were $200, now they're $129. And granted, if you want to use your PC for media recording, then yeah, it'll be a little bit loud. But for gaming, it can't be beat. I run HL2/Counter-Strike: Source at well over the 120 fps mark, even hitting over 200 fps on some maps. Even on the "laggy" servers I only drop to 80-90. But in the end it all comes down to how much you can really spend on a system or upgrade.
-Pete

#7
warriorscot

    Member 5k

  • Retired Staff
  • 8,889 posts
Yeah, but since your eye can't detect past the 25 fps mark, and a stable average above 35 means you don't notice anything at all, you have a lot of excess fps that are utterly useless; it's just a number to brag about. I get a good 60 fps on CSS usually and it never goes below 45 fps, and I've had my system for over 6 months now and only paid 180 for an X800 back then. Really, unless you have a 20+ inch monitor displaying over 1900-by-whatever, it's not very cost effective. A lot of the new cards are moving away from chasing peak fps toward more stable fps across the range; they hold a steady fps over a large span, so you can see them go from what would be considered low to the same fps at higher resolutions. And many game developers are putting fps caps on their games now. The ones most people know about are Doom and Quake, both locked at 60 fps no matter what you do, and the new Quake-based ET will have one, and so, probably, will UT2007.
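
For reference, those fps figures map onto frame times like this; a minimal sketch, using only the numbers quoted in this thread:

    # Frame time in milliseconds for each fps figure mentioned above.
    for fps in (25, 35, 45, 60, 120, 200):
        print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")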

Edited by warriorscot, 21 March 2006 - 01:16 PM.


#8
Seven!

Seven!

    Member

  • Topic Starter
  • Member
  • 161 posts
I didn't intend to spend more than 400 bucks on my upgrade (motherboard, processor, video card), and I'm at 470 including shipping, so I'll go with a single video card.

Whoo, thanks =]

#9
p-zero

p-zero

    Member

  • Member
  • 276 posts
I beg to differ, Scot, about the 25 fps mark, because usually when you hit 25 fps it's not a steady 25. But you definitely can't tell a difference at 40 fps. I mainly got it so I can run FEAR and COD2 at max video settings with no slowdown, without spending 400 on a card.
And wouldn't dpi determine what a vid card can display, or rather whether you can tell the difference between vid cards? I remember when I went from a regular CRT to my Dell flat-screen, which is also a CRT but with way more dpi; Morrowind never looked so good. AND, I'd be willing to say it has better, faster resolution than my bro's widescreen Mac display. But it also takes up WAY more room than a flat panel. There are no trails, either, though.
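
That unsteadiness shows up as soon as you look at frame times instead of averages; a quick sketch with invented frame times, just to illustrate the point:

    # Two runs with the same average fps but different worst-case frames;
    # the frame times (in ms) are made up for illustration.
    steady = [40.0] * 10                  # perfectly even pacing
    spiky = [25.0] * 8 + [100.0, 100.0]   # same total time, two long stalls

    for name, frames in (("steady", steady), ("spiky", spiky)):
        avg_fps = 1000 * len(frames) / sum(frames)
        worst_fps = 1000 / max(frames)
        print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst_fps:.0f} fps")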
-Pete.

#10
warriorscot

warriorscot

    Member 5k

  • Retired Staff
  • 8,889 posts
Lol, DPI is for measuring printers; monitors are measured by their resolution, for example 1280x1024, which is 1280 "dots" or pixels by 1024 pixels, so on a 19" you would have somewhere around 7,500-8,300 pixels per square inch, depending on whether you count the full diagonal or just the viewable area. It depends on your averages; I usually have a solid fps that stays within 2-4% of the given figure. That's a trademark of ATI cards: they generally produce lower but more stable fps, so I can get 25 and have it perfectly stable and playable. But you obviously want a margin, because some parts of any game take more to render than others, and certain settings like HDR and volumetric lighting effects make it less stable, but that's just what happens.
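
The pixel-density math is easy to check; a minimal sketch, assuming a 1280x1024 display measured either at the full 19" diagonal or at the ~18" viewable area of a typical 19" CRT:

    import math

    # Pixel density from resolution and diagonal size.
    def pixels_per_inch(width_px, height_px, diagonal_in):
        diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
        return diagonal_px / diagonal_in

    for diag in (19.0, 18.0):   # full diagonal vs. ~18" viewable
        ppi = pixels_per_inch(1280, 1024, diag)
        print(f'{diag}": {ppi:.1f} ppi, {ppi ** 2:,.0f} pixels per square inch')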

So at high resolutions SLI is what you use; a single card really reaches its optimum limit at 1600x1200, and SLI and Crossfire are for resolutions beyond that.

Not surprised a Mac monitor gets ghosting (that's what those trails are called); it's common in monitors with a high response time, though newer sub-8ms monitors don't get it. Resolution-wise, CRTs usually come out ahead in a price comparison (though a large CRT is sometimes just as expensive as an LCD), and high-resolution LCDs generally cost more than the standard 1280x1024 monitors.

#11
p-zero

p-zero

    Member

  • Member
  • 276 posts
Scot, I realize that dpi is used for printers. When I bought this monitor, that's how Dell advertised it; they later changed it to pixels per inch. This particular monitor has more pixels per inch than other CRT monitors, hence the name UltraScan. I might be a little off on this, but I think it's pretty close: normal CRT monitors have/had something like 18 ppi, and this one has 29, I believe. Not positive on the numbers, but that was my main reason for getting it: you can actually tell the difference between graphics cards.

For example, on the old system with the 6200 OC, the frame rate was good, but the smoke, water, and various textures didn't look as "clean," so to speak, as they do with my 6800s; even when I ran only one 6800 I could immediately tell the difference. But then again, I pay REALLY close attention to details. My best friend designs game maps, does 3-D art, etc., and asks me to test them out and tell him what I think. And with this monitor you can tell the difference in how things are shaded, rendered, textured, etc., so I really appreciate a GOOD-looking game that has lots going on, with lots of hi-res textures; they just look so dang sweet. Just a clarification.
-Pete.

#12
Hammm

Hammm

    Member

  • Member
  • 203 posts
I think SLI is pretty nice for an upgrade because you'll have two options instead of one. But what, other than the ASUS A8N32, is 32x?

#13
p-zero

p-zero

    Member

  • Member
  • 276 posts
I think Asus is the only manufacturer right now with true 32x SLI. They have several variations of the A8N: one of them is for the Pentium processor and another is set up for ATI. I think the one for the Pentium D is the P5N32, and the ATI Crossfire board is the A8R32-MVP.
-Pete.

#14
warriorscot

warriorscot

    Member 5k

  • Retired Staff
  • 8,889 posts
Abit makes them as well (Gigabyte might too, I think, but I can't be sure without reading the spec sheet on their site), and I would probably go with them over Asus if I were going for an AMD system. There are a few more of the 32x Crossfire boards, though.

Our discussion seems to have drifted into one about monitors. A gfx card is only as good as the monitor it outputs to; for most people that's a 1280x1024 native LCD, or sometimes, if they still use a CRT, 1600x1200, the borderline SLI res. SLI was primarily meant for the gaming nuts who have the big LCDs with 1920+x1400+ resolutions, and of course to sell double the number of graphics cards.

SLI isn't a good upgrade option. There are a few issues with later upgrades, including compatibility, which is better than it was at the start but can still crop up, especially after a long gap between purchases, and you get left behind with the technology by adding a second card in SLI: the performance increase isn't anywhere close to double, it's nearer 60% for most cards, while new generations of cards are often double the performance of the last. Graphics cards also have a price floor they won't drop below for a long time, due to their high production cost, and it's difficult to recoup those costs, as graphics cards don't have any sort of subscriptions or subsidies like other electronics such as phones and MP3 players do (some phones you can buy for 200 cost nearer 1000 to produce).
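
To put rough numbers on that scaling point, here's a minimal sketch assuming the ~60% SLI gain and ~2x generational gain quoted above (the 50 fps baseline is an arbitrary example):

    # Compare upgrade paths using the scaling figures quoted above;
    # base_fps is an arbitrary example, not a measurement.
    base_fps = 50.0                # single current-gen card
    sli_fps = base_fps * 1.6       # add a second identical card (~60% gain)
    next_gen = base_fps * 2.0      # replace with a next-gen card (~2x claim)
    print(f"single card:   {base_fps:.0f} fps")
    print(f"SLI pair:      {sli_fps:.0f} fps")
    print(f"next-gen card: {next_gen:.0f} fps")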

Well, I'm rambling again.

#15
p-zero

p-zero

    Member

  • Member
  • 276 posts
Nvidia's 6 and 7 series are both SLI compatible, and I would assume the new DX10 cards will be too. The only real issue with SLI is with the cards that have a dual-GPU design, as SLI mode will only utilize one GPU on each card; when running only one card there are no issues (at least that I have read about). Another thing with the dual-GPU cards is space: with two of them, each covers one of the PCI slots, so you lose one. In the case of my board, if I used those I'd be left with only one truly usable slot.
-Pete.





