Is there a difference?
Posted 23 June 2005 - 08:14 PM
Posted 23 June 2005 - 10:21 PM
They are not separate outputs or inputs on the video card or monitor.
Any modern monitor will be able to support multiple resolutions, so you shouldn't have any problems unless you are trying to use a pretty old monitor. To find out, go to Display Settings, turn the resolution up to what you want, and hit OK. If the screen comes back, you're good. If it goes blank, Windows will revert to the old resolution after 15 seconds, unless you tell it to keep the new settings.
hope that helps!
Posted 27 June 2005 - 10:19 AM
Before VGA, you had RGB. That was quickly replaced as programmers favored the GUI (Graphical User Interface). The RGB port had 9 pins (similar to a serial port), while VGA has 15 pins (3 rows of five).
Standard VGA (Video Graphics Array) is 640 by 480. This means that the video card is outputting this resolution. Most modern monitors are capable of much more.
This is where Super VGA comes in. The monitor itself has an aperture grille (or shadow mask) behind the front screen. This grid has many holes that the electron guns in the back of the monitor fire their beams through to light up the red, green, and blue phosphor dots. If all three guns are firing at exactly the same amplitude, you get white on the screen.
The screen refresh rate is important in the VGA/Super VGA debate. The refresh rate refers to the number of times per second that the screen updates its information. The video signal starts at the top left side of the monitor and sweeps across the top of the screen, then there is a small blanking pulse (no data emitted) before it moves to the next line.
For a resolution of 640 by 480, the electron guns hit all 640 assigned pixels (small tri-colored dots on the front of the screen) on the first line, then continue down to line 480 before starting back at the top. They do this 60 times in one second.
Each one of the pixels must have three color values: one for the strength of the red, one for the strength of the green, and one for the strength of the blue.
It stands to reason that the higher the resolution of your card/screen setting, the more memory on your card you are going to need to store the information to display.
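As a rough back-of-the-envelope sketch (Python here, just for illustration; the function name is mine, and real cards round up to the nearest memory size they ship with):

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Memory needed to hold one full screen of pixel data."""
    return width * height * bits_per_pixel // 8

# Standard VGA at 256 colors (8 bits per pixel):
print(framebuffer_bytes(640, 480, 8))      # 307200 bytes, about 300 KB

# Super VGA 1024x768 at 32-bit color:
print(framebuffer_bytes(1024, 768, 32))    # 3145728 bytes, 3 MB
```

So just going from standard VGA with 256 colors to 1024 by 768 in 32-bit color multiplies the memory needed by more than ten.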
If you increase your refresh rate, it does all of that more times per second (fewer would cause a visible flicker).
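The same back-of-the-envelope math shows how refresh rate multiplies the work (again a Python sketch with a name I made up; it ignores the blanking intervals, so real pixel clocks run somewhat higher):

```python
def pixels_per_second(width, height, refresh_hz):
    """Visible pixels the guns must redraw each second."""
    return width * height * refresh_hz

print(pixels_per_second(640, 480, 60))   # 18432000 pixels per second
print(pixels_per_second(640, 480, 85))   # 26112000 at a flicker-free 85 Hz
```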
When you go up from the standard 640 by 480, you have numerous options: 800 by 600, 1024 by 768, and many more. These resolutions place more demand on your video system, and the combination of a higher resolution with the number of colors selected (256-color or 16/32-bit) places more demand on your video card's processing ability.
Most of today's onboard video systems and add-on video cards are in a constant race to keep up with the requirements of the software programmers are writing. A good example is DOOM 3. Even the newest card in the latest system is not powerful enough to handle this game with all of the game's features turned on.
Well, I seem to have run on a bit; I hope this added to the previous response and answered your question.
Posted 27 June 2005 - 07:04 PM
we consider that a good thing around here!