
Video card to run Oblivion when it comes out...



#91
Casheti
  • Banned
  • 169 posts
But your max overclock isn't always your fastest, is it? I've also heard of people's graphics cards getting fried, exploding, etc. If I push it to the point where it crashes, isn't that dangerous for my beloved (but blatantly rubbish) FX 5200? I have it at 250/400 at the moment because I used the "detect optimal settings" button, though I don't know if that makes a difference. I also bought 1 GB of RAM yesterday, but it doesn't seem to help one tiny bit, so I figured overclocking must be the answer. Oh, and by the way, mine is PCI. Just thought I'd let you know, in case that's an important overclocking factor.

Edited by Casheti, 12 April 2006 - 01:33 AM.


#92
warriorscot
  • Retired Staff
  • 8,889 posts
Well, the fact that it's PCI means any other upgrades will probably be pointless. In a game, the PCI slot is so slow that even a slow card like a 5200 will be bottlenecked, limiting the performance of the whole system, and because games rely almost entirely on the graphics card for their performance, you're pretty much stuck.
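For reference, the PCI-vs-AGP gap warriorscot describes can be quantified with peak theoretical bus rates (width in bytes x clock x transfers per cycle). A quick sketch, using the nominal 33/66 MHz clocks (the real clocks are 33.33/66.67 MHz, which is where the usual 133 and 2133 MB/s figures come from):

```python
# Peak theoretical bus bandwidth in MB/s: width_bytes * clock_hz * transfers_per_cycle / 1e6
buses = {
    "PCI (32-bit, 33 MHz)": 4 * 33e6 * 1 / 1e6,     # ~133 MB/s, shared by every device on the bus
    "AGP 4x (32-bit, 66 MHz)": 4 * 66e6 * 4 / 1e6,  # ~1066 MB/s, dedicated to the graphics card
    "AGP 8x (32-bit, 66 MHz)": 4 * 66e6 * 8 / 1e6,  # ~2133 MB/s
}
for name, mbps in buses.items():
    print(f"{name}: {mbps:.0f} MB/s")
```

So a PCI graphics card has roughly a sixteenth of AGP 8x's link bandwidth, and shares even that with other PCI devices, which is why the slot itself becomes the bottleneck regardless of the card.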

#93
BlackPandemic
  • Member
  • 678 posts
Just throwin' this out there:

I haven't been on the forums for a while because, yes, I got Oblivion maybe a week or two ago. My new computer (described in my signature) should arrive today, but I've had to play it on a 1.4 GHz Intel P4 with 512 MB of RAM and a RADEON 9600 SE. I'm hooked on the game even though it runs at 14 FPS.

So if you're eager to see what the gameplay is like, buy it now, because your computer will probably run it. If you want it for the looks, wait a while; I'm forced to run everything on low, so it doesn't look special.

Just letting you know that it is playable under the required specs (barely).

-BlackPandemic

P.S. - Anyone have a computer parts review site (for things like sound cards, video cards, processors, etc.) that isn't focused on how well they overclock? I was on NewEgg and every review for the X800 GTO talked about overclocking! Help!

#94
JourneyMan
  • Member
  • 86 posts
Actually, I think this game is far more CPU-dependent than you guys realize. Remember when HL2 came out? Same thing, except instead of just a super-advanced physics system that the CPU alone has to process, Oblivion has an advanced physics system + world physics + advanced AI that keeps working outside of sight range. That last bit is a biggie, and it really separates this game not only from other RPGs but from just about any other first-person game out there. Gameplay not being restricted to the current cell is a major breakthrough, and it puts a lot more stress on the CPU to manage the AI in those situations.

OCing my CPU from 2.2 GHz does far more for frame rates than OCing my vid card from 450/425.

Right now I'm running at max settings with 8x anisotropic filtering and 4x anti-aliasing with smooth, smooth frame rates.

Edited by JourneyMan, 12 April 2006 - 08:13 AM.


#95
warriorscot
  • Retired Staff
  • 8,889 posts
See, I find that hard to believe, because just yesterday I read three reviews that used Oblivion to rate the top cards, and no card could manage that at 1600x1200; only the X1900XTX and 7900GTX could do it smoothly at 1280, so I really don't see how an X800 Pro could manage that with 4x anti-aliasing. I have an X800XL, which is a faster card, and I can't have AA on more than 2x, even with a CPU overclock and a tweaked .ini file.

#96
Casheti
  • Banned
  • 169 posts
Is this quote from another forum actually true? PLZ HELP!!!

"I'm not sure if most people consider this fact, but you can
get higher memory bandwidth if you synchronize the core
and memory clocks according to their capabilities.
For example: I own a GeForce 2 MX 200 with an FX on the way.
This GeForce 2's GPU is a 256-bit architecture, which means
it can process 32 bytes of information per clock cycle. The GeForce 2's
memory bus is 64-bit SDR, which means it can only send up to 8 bytes
of information to the GPU at a time to process.

The default clock speeds of the card are 200 MHz GPU and 200 MHz memory.
If you look at this carefully, you can see that power is being wasted and the GPU is unnecessarily warm because of unused clock cycles. In this situation, you can shave off 75 percent of the GPU speed and still get the same frame rate, because (200 MHz, 64-bit) is the equivalent of (50 MHz, 256-bit). Without a heatsink you can actually double the card's bandwidth from 1.6 gigabytes/sec to 3.2 gigabytes/sec.

When the card's GPU clock is set at 100 MHz, it has the maximum potential to process 3.2 gigabytes/sec with the memory set at 400 MHz. I tried it. It really works.

400 MHz 64-bit SDR memory = 3.2 gigabytes/sec. 64 bits = 8 bytes. 8 bytes * 400,000,000 Hz = 3,200,000,000 bytes/sec, which equals 3.2 gigabytes/sec.

50 MHz 256-bit GPU = 3.2 gigabytes/sec. 256 bits = 8*32 bytes. 8 bytes * 50,000,000 Hz = 3,200,000,000, which equals 3.2 gigabytes/sec.

After all, it's the bandwidth we want, not the heat.
My guideline for the 5200 OC is the following, if you wish to have even more memory bandwidth than you do already:
The GeForce 5200 has 64-bit DDR SDRAM, which is effectively 128-bit, and a 256-bit GPU. Since you have a good heatsink, this will work even better: start at the default clock speed. Downclock your GPU to exactly one half of the memory speed so that you don't have excess heat and unused clock cycles. For every 5 MHz you overclock your GPU, overclock your memory by 10 MHz (real memory speed, not the effective clock speed). Some programs show memory speed as the effective clock; in that case, 10 MHz real clock equals 20 MHz effective. Using this method, I was able to more than double the effective bandwidth of my GeForce 4000 from 3.2 gigabytes/sec to 7.2 gigabytes/sec. Yowch!
Mr. noob, with your effective memory clock set to 530 MHz, I have calculated that you have your core clock set too high. 265 MHz core is all you need to get the same effective bandwidth that you currently have, which is only 8.480 gigabytes/sec. Decrease your core clock to this number. From then on, for every 5 MHz increment, increase your effective clock by 20 MHz. The decreased heat on your board should allow you to do this effectively. 20 MHz of effective memory clock = an increase of 320 megabytes/sec, for your reference! Good luck, chump."


Well?
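The bandwidth arithmetic in that quote can be sanity-checked with a few lines of Python. The formula is the standard one (bus width in bytes x clock rate x transfers per cycle); the card configurations are the ones the quote uses:

```python
def bandwidth_gbps(bus_bits, clock_mhz, transfers_per_cycle=1):
    """Peak theoretical bandwidth in GB/s: (bus width in bytes) * clock * transfers per cycle."""
    return (bus_bits / 8) * clock_mhz * 1e6 * transfers_per_cycle / 1e9

print(bandwidth_gbps(64, 400))   # 64-bit SDR @ 400 MHz -> 3.2 GB/s, as the quote says
print(bandwidth_gbps(256, 50))   # 256-bit @ 50 MHz -> 1.6 GB/s, not the 3.2 claimed later in the quote
print(bandwidth_gbps(64, 400, transfers_per_cycle=2))  # 64-bit DDR @ 400 MHz real clock -> 6.4 GB/s
```

By this arithmetic, a 256-bit path at 50 MHz moves 1.6 GB/s, matching the quote's own (200 MHz, 64-bit) equivalence but contradicting its later "50 MHz 256-bit = 3.2 GB/s" line. More fundamentally, downclocking the GPU core never increases memory bandwidth; the memory clock and bus width alone set it, so the core clock only changes how fast the GPU can consume that bandwidth.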

#97
Casheti
  • Banned
  • 169 posts
Also, how do I overclock my Pentium 4 2.8 GHz processor? PLZ HELP!!!

#98
dllp117
  • Member
  • 137 posts
Hey, I'm wondering the same thing... I know that my system will run Oblivion, but I don't know how well. Here are the specs:
AMD Athlon 2700+ 2.1 GHz processor
1279 MB of RAM
Radeon 9550 graphics card
I know it's a modest system, but I don't have the biggest budget... :whistling:
Does anyone know at what level of quality this will run Oblivion? I'm going to get it no matter what, but I want to be prepared for how badly it will run!!!

#99
warriorscot
  • Retired Staff
  • 8,889 posts
Low settings; medium with some tweaking of the .ini file.

#100
JourneyMan
  • Member
  • 86 posts
Well, I have tweaked the [bleep] out of my machine and fine-tuned my vid card's RAM.

Your GFX is not really as good as mine. XLs are bottom end, Pros are just next to top end, plus mine is a VIVO. Those use the exact same board and chip as the X800 XT PEs, minus 4 pipes.

I'm doing it at 1280; my monitor won't support a higher resolution.

Check that, just looked: I'm running at 6x anti-aliasing from within the game, and in Catalyst Control Center I have both anti-aliasing and anisotropic filtering set to "application". Who knows what the game set the anisotropic filtering to...

Sight distances are at medium, until I get my 2 GB of low-latency RAM in. Shadow levels are set a little low, though the quality is set to high. All filtering and quality levels are set to highest.

Grass will give lag. I drop a good amount of FPS while outside in daylight. Oblivion levels are smooth, though.

Clocks for the CPU and GFX card are in my sig. Upping the FSB from 210 to 225 really did a lot for frame rate, which is why I think this game is so CPU-dependent. The higher OC + disabling Cool'n'Quiet and tuning my RAM a bit let me go from 2x to 6x anti-aliasing with minor, minor outdoor lag. I've been in Oblivion for a while, though, and that's super smooth.
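The frame-rate gain from raising the FSB follows from the core clock being the FSB (reference) clock times the CPU's multiplier. A minimal sketch; the 11x multiplier is an assumption, not stated in the thread (a stock 2.2 GHz Athlon 64 runs 11 x 200 MHz):

```python
def core_ghz(fsb_mhz, multiplier=11):
    """Core clock in GHz = FSB reference clock (MHz) * multiplier.
    The 11x multiplier is an assumed example value."""
    return fsb_mhz * multiplier / 1000

for fsb in (200, 210, 225):
    print(f"FSB {fsb} MHz -> core {core_ghz(fsb):.3f} GHz")
```

Under that assumption, 210 -> 225 MHz is roughly a 7% core-clock bump; raising the FSB also typically raises memory and HyperTransport speeds in step, which may account for part of the smoothness gain.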

Edited by JourneyMan, 12 April 2006 - 05:03 PM.


#101
warriorscot
  • Retired Staff
  • 8,889 posts
The X800 Pro is the one just below the X800XL; the Pro came out before the XL and was in the original lineup of X800 cards. Unless you have flashed the BIOS to X800XT, it's still a Pro, and it still has lower performance than the X800XL, which performs very close to the X800XT, limited only by its lower core clock. With a decent overclock the XLs get to 450 to 480, which in benchmarks goes well beyond the Pros and closer to the XTs.

Still, I'm looking at performance tables, and not even the X850s are managing that on medium settings with that level of AA. They are fine without it, but the AA is crippling; in almost every game the X800 Pro, X800XL, and X800XT at high settings at 1280 are all within 5 FPS of each other. The difference between them is tiny.

Edited by warriorscot, 12 April 2006 - 05:26 PM.


#102
JourneyMan
  • Member
  • 86 posts
Well, I guess it's the OC then.

I wish I could take a video of gameplay to prove it, but I don't have any program loaded that could, and I imagine any such program would eat up CPU and drop FPS.

I've done considerable work on the shell of this OS to reduce CPU and memory load to an absolute minimum, a bit beyond your standard menu selections and registry tweaks. I suppose that leaves a lot of room for the CPU to handle my games, which would further support my theory that this game is highly CPU-dependent.

I build kick-butt computers :whistling:

#103
BlackPandemic
  • Member
  • 678 posts
To all of you asking about CPU speed: Oblivion (unlike its predecessors) is highly VIDEO CARD dependent, not CPU dependent, so if you want to overclock something, make it your video card.

My new computer (which came this morning, mentioned in the sig) can run it on high settings at 1024 resolution with almost no problems, though it did take some tweaking.

So yes, a good CPU helps, but it is not the most important piece. :whistling:

#104
dllp117
  • Member
  • 137 posts
YAY! :whistling: :blink: :help: I just found out I may be getting a Radeon X800 graphics card! Then I can play Oblivion with no fears! (Sorry, I just had to say that.)

#105
JourneyMan
  • Member
  • 86 posts
My vid card is OC'd by 22% and has a custom BIOS. In 3DMark05 it tests out a bit better than stock X850 XTs. Not by much, though.

I don't know; I'm just going off how much the gameplay smoothed out when I OC'd my CPU. I could be wrong, but it certainly worked for me.





