Questions about 1080i resolution...



#1 UV_Power (Member, 391 posts)
Hey everyone,

I am not sure if this is the right G2G forum to be posting a question like this, but it seemed like the most appropriate to me.

Now then, I use a 32" HDTV as my computer monitor. The native resolution is 1366x768. I have an HDCP enabled video card (eVGA GTS250) and it is connected to the TV with an HDMI cable. The TV is capable of displaying a 1080i resolution. When I go to my display settings, the video card is automatically set to 1360x768. But, the max setting is 1920x1080.

When I select 1920x1080 in the display settings, the resolution will change (and the TV recognizes that it is now in a 1080i format), but the picture looks so much worse. Very fuzzy and also the edges of the desktop run past the edges of the TV so some of the desktop icons are cut-off.

I understand that 1080i is just two alternating fields of 540 lines each, drawn at intervals the human eye can't detect, but why is the picture quality so poor and cut off at the edges? Shouldn't the higher resolution look nicer, even if the TV's native resolution is lower?
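(As an aside, the field arithmetic I'm describing can be sketched like this — a rough illustration only; the function name and the 60 Hz field rate are just assumptions for the example:)

```python
# Rough sketch of interlaced-scan arithmetic: 1080i sends two alternating
# fields of 540 lines each, and two fields together make one full frame.

def interlaced_stats(total_lines=1080, field_rate_hz=60):
    """Return (lines per field, full frames per second) for an interlaced mode."""
    lines_per_field = total_lines // 2           # odd lines in one field, even in the other
    full_frames_per_second = field_rate_hz // 2  # two fields = one complete frame
    return lines_per_field, full_frames_per_second

print(interlaced_stats())         # 1080i @ 60Hz -> (540, 30)
print(interlaced_stats(720, 60))  # a hypothetical 720i mode -> (360, 30)
```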

Also, on a questionably related topic, are there any HDTVs that are 720p but not 1080i? I've been told that when you see "720p" written on the box, it's implied that the TV is also capable of running 1080i, which I don't think is true. Is it?



#2 phillipcorcoran (Member 1K, 1,293 posts)
For the sharpest & clearest picture, LCD screens have to be fed with a video output resolution that matches the screen's quoted "native" resolution.
Set it higher or lower and picture quality will suffer very noticeably.
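(A rough sketch of why, using the numbers from this thread — not an exact model of any TV's scaler: any ratio other than exactly 1.0 forces the panel to interpolate source pixels across physical pixels, which blurs the image.)

```python
# Sketch: scaling ratios when a 1366x768 native panel is fed different inputs.
# A ratio of exactly 1.0 means 1:1 pixel mapping (sharp); anything else
# means each output pixel blends neighboring source pixels (fuzzy).

def scale_ratio(src_px, native_px):
    return native_px / src_px

# 1920x1080 input on a 1366x768 panel:
print(round(scale_ratio(1920, 1366), 4))  # ~0.7115 horizontally -> interpolated
print(round(scale_ratio(1080, 768), 4))   # ~0.7111 vertically   -> interpolated

# 1366x768 input on the same panel:
print(scale_ratio(1366, 1366))            # 1.0 -> pixel-for-pixel, stays sharp
```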

#3 devper94 (Member, 817 posts)
Try the "automatic adjustment" function. Most modern TV/LCD screens have that option.
Since your TV appears to support 1080i, it should work fine.

#4 UV_Power (Topic Starter, 391 posts)

Set it higher or lower and picture quality will suffer very noticeably.

Hmm... This is pretty confusing. :D I know in the past this was only true if you set the resolution lower; the picture usually looked a lot better when setting the resolution higher. So, does this mean any 1080i HDTV will only have a clear picture at 1360x768 when hooked up to a computer? I figured if it was 1080i then it was capable of higher resolutions.

When I set my video card to output 1920x1080, a window from the TV's menu says "1920x1080i @ 60Hz". Maybe it just thinks it is in 1080i? How come when you hook up other devices (like Blu-ray players, Xbox 360, PS3, etc.) to a 1080i HDTV, they can show an interlaced 1920x1080 resolution? What's the difference between the video output of those devices and the video output of an HDCP-capable video card in a PC?



Try the "automatic adjustment" function.

That option is grayed out (unavailable). The user manual says it only works with VGA input, not DVI/HDMI.

Edited by UV_Power, 05 February 2011 - 12:33 PM.


#5 devper94 (Member, 817 posts)
There is no difference between the output of those devices. Is it really 1080i, or is it 1080p? I suggest you check again.

#6 UV_Power (Topic Starter, 391 posts)
I double checked. This is an older TV (purchased back in 2007). It's 1080i.

#7 devper94 (Member, 817 posts)
No, I mean the output from your graphics card. Is it 1080i or 1080p?
Have you checked the graphics card settings?

#8 UV_Power (Topic Starter, 391 posts)
I have yet to find an HDCP capable video card that can only output 1080i and not 1080p.

Taken from Nvidia's website:
"Maximum Digital Resolution: 2560x1600
Maximum VGA Resolution: 2048x1536
HDCP: Yes"

We're getting a bit sidetracked here. I just wanted to know why my HDCP-enabled video card (which can obviously output 1920x1080), hooked up to a TV that is capable of running 1080i, displays a fuzzy picture that runs past the edges of the screen when I set my display resolution to 1920x1080. You say the display will only look good running at the native resolution (1360x768). Alright, so does this mean the picture will always be "fuzzy" when running Blu-ray movies or HD video games at 1920x1080 on anything other than a 1080p HDTV/monitor?

Based on these articles, I didn't think that is how it worked:

Wikipedia article on 1080i:
"The term 1080i assumes a widescreen aspect ratio of 16:9, implying a frame size of 1920×1080 pixels."

CNET article comparing 1080i to 1080p:
"1080i ... actually boasts an identical 1920 x 1080 resolution, but conveys the images in an interlaced format."

I mean, I understand that if I watched a Blu-ray movie in 1080i, the picture would look a bit fuzzier than in 1080p because moving objects are displayed interlaced (540 lines at a time). But still images, such as a desktop background, should look good, right?

Also, I just updated my video driver to the latest 266.58 (released about two weeks ago). No change.

#9 UV_Power (Topic Starter, 391 posts)
OK. I just plugged in a 1080p LCD TV and the same thing happens: very fuzzy picture, and the edges of the screen are cut off. This TV's native resolution is 1920x1080 (unlike my old 1080i TV, which was 1366x768).

Any ideas?

#10 devper94 (Member, 817 posts)
I can only think of this being a configuration problem. Try tweaking the video card configuration.

#11 UV_Power (Topic Starter, 391 posts)
Ahhh... FINALLY! Success. It was an overscan issue. The default aspect ratio of this TV (an LG 32LD450, btw) is 16:9. The aspect ratio options in the TV's video settings are 16:9, 4:3, Set by Program, Zoom, Cinema Zoom, and Just Scan. I tried them all, but it was "Just Scan" that did the trick. Apparently it does "1:1 pixel matching", according to LG's website. With "Just Scan" enabled, my picture looks sharp and my whole desktop fits on the screen.
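(For anyone else hitting this, a rough sketch of what the TV was doing before "Just Scan" — the 5% figure is an assumption, a typical overscan amount rather than LG's documented value:)

```python
# Sketch of overscan: the TV zooms the incoming frame slightly, so the
# outer edges (and any desktop icons there) fall off the physical screen.

def visible_pixels(width, height, overscan_pct=5.0):
    """Pixels that survive when overscan_pct is cropped from each axis."""
    crop_w = int(width * overscan_pct / 100)
    crop_h = int(height * overscan_pct / 100)
    return width - crop_w, height - crop_h

print(visible_pixels(1920, 1080))       # with ~5% overscan: (1824, 1026)
print(visible_pixels(1920, 1080, 0.0))  # "Just Scan" / 1:1 mapping: (1920, 1080)
```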

Try the "automatic adjustment" function. Most modern TV/LCD screens have that option.

You were right, devper. I guess my last TV did not have this option (at least not one that pertains to scanning). Thanks for the help.





