Edited by Casheti, 12 April 2006 - 01:33 AM.
Video card to run Oblivion when it comes out...
Posted 12 April 2006 - 01:31 AM
Posted 12 April 2006 - 05:43 AM
Posted 12 April 2006 - 07:49 AM
I haven't been on the forums for a while because, yes, I got Oblivion maybe a week or two ago. My new computer (described in my signature) should come today, but I've had to play it on a 1.4GHz Intel P4 with 512MB of RAM and a Radeon 9600 SE. I'm hooked on the game even though it runs at 14 FPS.
So if you're eager to see what the gameplay is like, buy it now, because your computer will probably run it. If you want it for looks, wait a while; I'm forced to run everything on low, so it doesn't look special.
Just letting you know that it is playable below the required specs (barely).
P.S. - Anyone have a computer parts review site (for things like sound cards, video cards, processors, etc.) that isn't focused on how well they overclock? I was on NewEgg and every review for the X800 GTO talked about overclocking! Help!
Posted 12 April 2006 - 08:07 AM
OCing my CPU from 2.2GHz does far more for frame rates than OCing my video card from 450/425.
Right now I'm running at max settings with 8x anisotropic filtering and 4x anti-aliasing, with smooth, smooth frame rates.
Edited by JourneyMan, 12 April 2006 - 08:13 AM.
Posted 12 April 2006 - 08:30 AM
Posted 12 April 2006 - 10:24 AM
"I'm not sure if most people consider this fact, but you can get higher memory bandwidth if you synchronize the core and memory clocks according to their capabilities.
For example: I own a GeForce2 MX 200 with an FX on the way. This GeForce2's GPU is a 256-bit architecture, which means it can process 32 bytes of information per clock cycle. The GeForce2's memory bus is 64-bit SDR, which means it can only send up to 8 bytes of information to the GPU at a time to process.
The default clock speeds of the card are 200MHz GPU and 200MHz memory. If you look at this carefully, you can see that power is being wasted and the GPU is unnecessarily warm because of unused clock cycles. In this situation, you can shave off 75 percent of the GPU speed and still get the same frame rate, because (200MHz, 64-bit) is the equivalent of (50MHz, 256-bit). Without a heatsink you can actually double the card's bandwidth from 1.6 gigabytes/sec to 3.2 gigabytes/sec. When the card's GPU clock is set at 100MHz, it has the maximum potential to process 3.2 gigabytes/sec with the memory set at 400MHz. I tried it. It really works.
400MHz 64-bit SDR memory = 3.2 gigabytes/sec. 64 bits = 8 bytes. 8 bytes * 400,000,000Hz = 3,200,000,000 bytes/sec, which equals 3.2 gigabytes/sec.
100MHz 256-bit GPU = 3.2 gigabytes/sec. 256 bits = 32 bytes. 32 bytes * 100,000,000Hz = 3,200,000,000 bytes/sec, which equals 3.2 gigabytes/sec.
After all, it's the bandwidth we want, not the heat.
My guideline for the 5200 OC is the following, if you wish to possibly have even more memory bandwidth than you already do:
The GeForce 5200 has 64-bit DDR SDRAM, which is effectively 128-bit, and a 256-bit GPU. Since you have a good heatsink, this will work even better: start at the default clock speed. Downclock your GPU to exactly one half of the memory speed so that you don't have excess heat and unused clock cycles. For every 5MHz you overclock your GPU, overclock your memory by 10MHz (real memory speed, not the effective clock speed). Some programs show memory speed as the effective clock; in that case 10MHz real clock equals 20MHz effective. Using this method, I was able to more than double the effective bandwidth of my GeForce 4000 from 3.2 gigabytes/sec to 7.2 gigabytes/sec. Yowch!
Mr. n oob, with your effective memory clock set to 530MHz, I have calculated that your core clock is set too high. 265MHz core is all you need to get the same effective bandwidth that you currently have, which is only 8.480 gigabytes/sec. Decrease your core clock to this number. From then on, for every 5MHz core increment, increase your effective memory clock by 20MHz. The decreased heat on your board should allow you to do this effectively. A 20MHz increase in effective memory clock = an increase of 320 megabytes/sec, for your reference! Good luck, chump."
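For anyone who wants to sanity-check the arithmetic in the quote above, here's a quick Python sketch. The function name and structure are my own, not from the post; it just implements "bandwidth = bytes per cycle x clock", with a doubling factor for DDR's effective clock:

```python
def bandwidth_gb_per_sec(bus_width_bits, clock_mhz, pumps=1):
    """Peak memory bandwidth in GB/s (decimal gigabytes).

    pumps is 1 for SDR and 2 for DDR, where the 'effective'
    clock is twice the real clock.
    """
    bytes_per_cycle = bus_width_bits / 8
    return bytes_per_cycle * clock_mhz * 1e6 * pumps / 1e9

# GeForce2 MX 200: 64-bit SDR at the stock 200MHz
print(bandwidth_gb_per_sec(64, 200))            # 1.6 GB/s
# Same 64-bit SDR memory overclocked to 400MHz
print(bandwidth_gb_per_sec(64, 400))            # 3.2 GB/s
# The card addressed at the end: 128-bit DDR at 265MHz real (530MHz effective)
print(bandwidth_gb_per_sec(128, 265, pumps=2))  # 8.48 GB/s
```

The numbers do line up with the post's figures (1.6, 3.2, and 8.48 GB/s), though note this only checks the bandwidth math, not the claim that matching the core clock to it improves anything.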
Posted 12 April 2006 - 10:28 AM
Posted 12 April 2006 - 03:57 PM
AMD Athlon 2700+ (2.1GHz) processor
1279MB of RAM
Radeon 9550 graphics card
I know that it's a modest system, but I don't have the biggest budget...
Does anyone know at what level of quality this will run Oblivion? I'm going to get it no matter what, but I want to be prepared for how badly it will run!
Posted 12 April 2006 - 05:02 PM
Your GFX is not really as good as mine. XLs are bottom end, Pros are just next to top end, plus mine is a VIVO. Those use the exact same board and chip as the X800 XT PEs, minus 4 pipes.
I'm doing it at 1280; my monitor won't support a higher resolution.
Check that, I just looked: I'm running at 6x anti-aliasing from within the game, and in Catalyst Control Center I have both anti-aliasing and anisotropic filtering set to application controlled. Who knows what the game set the aniso to...
Sight distances are at medium until I get my 2GB of low-latency RAM in. Shadow levels are set a little low, though the quality is set to high. All filtering and quality levels are set to highest.
Grass will give lag. I drop a good amount of FPS while outside in daylight. Oblivion levels are smooth though.
Clocks for the CPU and GFX card are in my sig. Upping the FSB from 210 to 225 really did a lot for frame rate, which is why I think this game is so CPU dependent. The higher OC, plus disabling Cool'n'Quiet and tuning my RAM a bit, let me go from 2x to 6x anti-aliasing with only very minor outdoor lag. I've been in Oblivion for a while though, and that's super smooth.
Edited by JourneyMan, 12 April 2006 - 05:03 PM.
Posted 12 April 2006 - 05:24 PM
Still, I'm looking at performance tables, and not even the X850s are managing that on medium settings with that level of AA. They are fine without it, but the AA is crippling. In almost every game, the X800 Pro, X800 XL, and X800 XT at high settings at 1280 are all within 5 FPS; the difference between them is tiny.
Edited by warriorscot, 12 April 2006 - 05:26 PM.
Posted 12 April 2006 - 06:02 PM
Wish I could take a video of gameplay to prove it, but I don't have any programs loaded that could, and I imagine any such program would eat up CPU and drop FPS.
I've done considerable work on the shell of this OS to reduce CPU and memory load to an absolute minimum, a bit beyond your standard menu selections and registry tweaks. I suppose that leaves a lot of room for the CPU to handle my games, which would further support my theory that this game is highly CPU dependent.
I build kick butt computers
Posted 12 April 2006 - 07:29 PM
My new computer (which came this morning, mentioned in the sig) can run it on high settings at 1024 resolution with almost no problems, though I did some tweaking.
So yes, a good CPU helps, but is not the most important piece.
Posted 12 April 2006 - 08:39 PM
Posted 13 April 2006 - 06:52 AM
I don't know, I'm just going off how much the gameplay smoothed out when I OC'd my CPU. I could be wrong, but it certainly worked for me.