View Full Version : DVI->HDMI quality issues


de><ta
04-27-09, 01:23 PM
At present I am using a 1080p HDTV as a computer monitor, and I recently tried using a DVI to HDMI adapter to connect to it from my PC. It looked like crap: blurry and pixelated as hell. :thumbdwn:

After a lot of tweaking it looks decent, but the text still does not look nearly as good as the VGA output.

Am I doing something wrong? Or is there a way I can get a sharp, clear image onto my HDTV using HDMI?

Revs
04-27-09, 01:27 PM
Similar setup here (DVI PC - HDMI TV). I have problems if I set up the screen in Windows' own display settings, but not if I use the NVidia CP.

Hope this helps :).

EDIT: Where on the cable is your adapter: on the PC end or the TV end?

de><ta
04-27-09, 01:45 PM
Not sure if I have the Nvidia Control Panel installed; I will download it and try it out tonight.

I have the adapter on the PC end from DVI to HDMI. At first I thought it was a problem with my monitor, since it works great with a VGA input.

Thanks anyway, will let you know how it goes. :)

Revs
04-27-09, 02:13 PM
Not sure if I have the Nvidia Control Panel installed; I will download it and try it out tonight.

I have the adapter on the PC end from DVI to HDMI. At first I thought it was a problem with my monitor, since it works great with a VGA input.

Thanks anyway, will let you know how it goes. :)

No need to install anything; just right-click on the desktop and click 'NVIDIA Control Panel' ;). Also, it sounds like you've got it wired the same as me.
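
If you want to sanity-check what modes and refresh rates Windows is actually offering the TV (separate from what the CP shows), a little Python sketch like this should list them. Untested here, but it's just the standard Win32 EnumDisplaySettingsW call via ctypes, with the DEVMODEW struct trimmed down to the display fields:

    import ctypes
    from ctypes import wintypes

    # Enough of the DEVMODEW layout to read display modes; the
    # printer-only fields are kept so the byte offsets stay correct.
    class DEVMODEW(ctypes.Structure):
        _fields_ = [
            ("dmDeviceName", wintypes.WCHAR * 32),
            ("dmSpecVersion", wintypes.WORD),
            ("dmDriverVersion", wintypes.WORD),
            ("dmSize", wintypes.WORD),
            ("dmDriverExtra", wintypes.WORD),
            ("dmFields", wintypes.DWORD),
            ("dmOrientation", ctypes.c_short),
            ("dmPaperSize", ctypes.c_short),
            ("dmPaperLength", ctypes.c_short),
            ("dmPaperWidth", ctypes.c_short),
            ("dmScale", ctypes.c_short),
            ("dmCopies", ctypes.c_short),
            ("dmDefaultSource", ctypes.c_short),
            ("dmPrintQuality", ctypes.c_short),
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", wintypes.WCHAR * 32),
            ("dmLogPixels", wintypes.WORD),
            ("dmBitsPerPel", wintypes.DWORD),
            ("dmPelsWidth", wintypes.DWORD),
            ("dmPelsHeight", wintypes.DWORD),
            ("dmDisplayFlags", wintypes.DWORD),
            ("dmDisplayFrequency", wintypes.DWORD),
        ]

    user32 = ctypes.windll.user32
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)

    # Mode index -1 (ENUM_CURRENT_SETTINGS) returns the mode in use now.
    if user32.EnumDisplaySettingsW(None, -1, ctypes.byref(dm)):
        print(f"current: {dm.dmPelsWidth}x{dm.dmPelsHeight} "
              f"@ {dm.dmDisplayFrequency} Hz")

    # Indices 0, 1, 2, ... walk every mode the driver reports.
    i = 0
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz")
        i += 1

If 1920x1080 @ 60 Hz isn't even in that list for the HDMI input, the TV's EDID is the problem, not the card.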

ASUSEN7900GTX
04-27-09, 02:47 PM
I know that if I use any resolution other than 1360x768 my screen looks ****, but at the native res it's as good as a PC monitor would be. Weird that 1080p TVs do that.

nekrosoft13
04-27-09, 02:57 PM
On my TV I have to set the "sharpness" to the right setting, otherwise text is not readable.

zarlaan
04-28-09, 07:47 PM
Double check your TV manual for the inputs. Some TVs require that you connect a PC to a certain input.

For example, I have the Samsung LN450A LCD. I'm running a DVI-HDMI cable, but I have to use HDMI input 2, as it's designed for computer input. Using this input also enables me to run 1920x1080 resolution even though the TV is only 720p.

hokeyplyr48
04-29-09, 09:28 AM
Using this input also enables me to run 1920x1080 resolution even though the TV is only 720p.

uhhhh using hdmi 2 makes more pixels???

ViN86
04-29-09, 10:14 AM
Double check your TV manual for the inputs. Some TVs require that you connect a PC to a certain input.

For example, I have the Samsung LN450A LCD. I'm running a DVI-HDMI cable, but I have to use HDMI input 2, as it's designed for computer input. Using this input also enables me to run 1920x1080 resolution even though the TV is only 720p.

Sure it's at that resolution, but it's probably at 30Hz instead of 60Hz. This is also known as 1080i.

uhhhh using hdmi 2 makes more pixels???

No, the signal is not 1080p, it's 1080i. The TV takes 1920x1080 at 30 frames per second and splits each frame into two 540-line fields, which bumps the rate up to 60Hz. To the eye the result looks like full 1920x1080, but it's actually two 540-line images shown in sequence. That's how interlacing works.
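
If it helps to picture it, here's the idea in a few lines of Python with numpy. This is just an illustration of the field splitting, not anything a TV literally runs:

    import numpy as np

    # One full 1080-line frame...
    frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

    # ...sent as two 540-line fields: odd lines, then even lines.
    field_a = frame[0::2]   # lines 0, 2, 4, ... -> 540 lines
    field_b = frame[1::2]   # lines 1, 3, 5, ... -> 540 lines

    # The TV shows one field every 1/60 s; weaving the two fields
    # back together recovers the frame (the simplest deinterlacer).
    rebuilt = np.empty_like(frame)
    rebuilt[0::2] = field_a
    rebuilt[1::2] = field_b
    assert (rebuilt == frame).all()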

Revs
04-29-09, 12:11 PM
Don't think that's what he meant, ViN :). It sounds like using this socket will allow a higher-than-native resolution to be used, but the TV will down-scale it to the screen's res.

In the shops they seem to refer to this as HD-ready, as opposed to True HD. At least they do in the UK.

de><ta
04-29-09, 12:17 PM
Update: the screen looks good when watching movies and playing games. But reading text is a big PITA, since the white background looks off-white (more yellowish).

Revs
04-29-09, 12:32 PM
Update: the screen looks good when watching movies and playing games. But reading text is a big PITA, since the white background looks off-white (more yellowish).

Did you have a go at setting it up in the NVidia CP? Also, are you sure your res is set to the native res of the screen?

de><ta
04-29-09, 01:53 PM
Did you have a go at setting it up in the NVidia CP? Also, are you sure your res is set to the native res of the screen?

Yeah, I played around with the NV CP and made sure my resolution and frequency were both correct.

I am guessing the off-white color has something to do with the way the actual HDTV processes the information. Going to mess around with that tonight.

Rakeesh
04-29-09, 02:55 PM
This is going to depend on your TV. Not all TVs were made with PCs in mind, and thus won't necessarily allow 1:1 pixel mapping. Worse, some force overscanning, which means you won't see the outer edge of your screen. I'd look at your TV manual, or maybe do a Google search, to find out if it allows 1:1 pixel mapping.
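
To put rough numbers on the overscan part (the 5% figure is just a typical example, not your set's actual value):

    # Typical ~5% overscan on a 1920x1080 panel: the TV zooms the image
    # so roughly 2.5% on each edge falls off the visible screen.
    w, h, overscan = 1920, 1080, 0.05
    visible = (round(w * (1 - overscan)), round(h * (1 - overscan)))
    print(f"you only see about {visible[0]}x{visible[1]} of {w}x{h}")
    # -> about 1824x1026, and everything gets rescaled, which blurs text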

ViN86
04-29-09, 03:30 PM
Don't think that's what he meant, ViN :). It sounds like using this socket will allow a higher-than-native resolution to be used, but the TV will down-scale it to the screen's res.

In the shops they seem to refer to this as HD-ready, as opposed to True HD. At least they do in the UK.

Why not? If the refresh rate is at 60Hz then it is scaling it down, but many times if you choose the higher resolution it will automatically drop to 30Hz.

Revs
04-29-09, 06:33 PM
Why not? If the refresh rate is at 60Hz then it is scaling it down, but many times if you choose the higher resolution it will automatically drop to 30Hz.

Ah, I kinda see what you're saying. I'm not too clued up on how that side of it works, TBH.

Revs
04-29-09, 06:35 PM
Yeah, I played around with the NV CP and made sure my resolution and frequency were both correct.

I am guessing the off-white color has something to do with the way the actual HDTV processes the information. Going to mess around with that tonight.

Yeah, didn't think of that. Your colour temp is probably set to 'warm' in the TV's settings.
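
Roughly speaking, a 'warm' preset pulls the blue (and a bit of the green) down relative to red. The gains in this little Python toy are invented, just to show why white ends up yellowish:

    # Toy illustration only; real TVs do this in their own processing,
    # and these channel gains are made up for the example.
    def warm_white(rgb, gains=(1.00, 0.96, 0.86)):
        return tuple(min(255, round(c * g)) for c, g in zip(rgb, gains))

    print(warm_white((255, 255, 255)))  # -> (255, 245, 219), a yellowish white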

zarlaan
04-30-09, 05:56 AM
Why not? If the refresh rate is at 60Hz then it is scaling it down, but many times if you choose the higher resolution it will automatically drop to 30Hz.


I'm not sure how it works, and I've been trying to find more information about it. I just know that when I go into the control panel and select 1920x1080, the resolution on the TV obviously changes to it and the refresh rate is set at 59Hz. In games there is a noticeable difference when going from 1280x720 to 1920x1080 as well. Even just going into the control panel and setting the HD signal format to 1080p works.

Text becomes pretty damn tiny and hard to read at this resolution, sadly. Otherwise the Samsung LN37A450 is a damn good TV. I can only imagine the newer models are better.

ViN86
04-30-09, 02:18 PM
I'm not sure how it works, and I've been trying to find more information about it. I just know that when I go into the control panel and select 1920x1080, the resolution on the TV obviously changes to it and the refresh rate is set at 59Hz. In games there is a noticeable difference when going from 1280x720 to 1920x1080 as well. Even just going into the control panel and setting the HD signal format to 1080p works.

Text becomes pretty damn tiny and hard to read at this resolution, sadly. Otherwise the Samsung LN37A450 is a damn good TV. I can only imagine the newer models are better.

Your TV is probably scaling it down.

I doubt they sold you a 1080p TV labeled as a 720p, lol.

zarlaan
04-30-09, 04:21 PM
Your TV is probably scaling it down.

I doubt they sold you a 1080p TV labeled as a 720p, lol.


I know it's not a 1080p TV. What kept throwing me off, and what my point was, is that the Nvidia control panel states the signal as 1080p even though what the TV is actually displaying is 1080i. The TV is tricking the graphics card into sending out the signal, then converting it without the source knowing.

ViN86
04-30-09, 06:45 PM
I know it's not a 1080p TV. What kept throwing me off, and what my point was, is that the Nvidia control panel states the signal as 1080p even though what the TV is actually displaying is 1080i. The TV is tricking the graphics card into sending out the signal, then converting it without the source knowing.

Those TVs are tricky like that, heh.

ry94080
05-18-09, 02:39 PM
Hello all,

I've successfully hooked up my PC (DVI) to my Sony TV (HDMI) and I'm pretty happy with it.

I've got a 256MB NVIDIA GeForce 8600 GTS graphics card.

While watching movies on my TV, when there is a lot of motion there is a tiny bit of cracking in the picture.

Any ideas what the cause of this is and how to fix it?

Thanks

Toss3
05-18-09, 03:18 PM
Hello all,

I've successfully hooked up my PC (DVI) to my Sony TV (HDMI) and I'm pretty happy with it.

I've got a 256MB NVIDIA GeForce 8600 GTS graphics card.

While watching movies on my TV, when there is a lot of motion there is a tiny bit of cracking in the picture.

Any ideas what the cause of this is and how to fix it?

Thanks
What kind of "cracking"? I'd guess you've got sharpening turned on on your TV, so just turn off any settings meant to "improve" the image quality from the TV's menu.
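
The reason sharpening wrecks fine detail is the overshoot and undershoot it adds around hard edges, like the strokes of text. A toy 1-D unsharp mask in Python shows it; the numbers are made up:

    import numpy as np

    # A step edge like the side of a letter: dark pixels, then bright.
    signal = np.array([50, 50, 50, 50, 50, 200, 200, 200, 200, 200], float)

    # Unsharp mask: add back the difference between signal and a blur.
    blurred = np.convolve(signal, np.ones(3) / 3, mode="same")
    sharpened = signal + 1.0 * (signal - blurred)

    print(sharpened[1:-1])  # interior values; the ends are padding artifacts
    # -> [50. 50. 50. 0. 250. 200. 200. 200.]
    # The 0 and 250 are the under/overshoot halos you see around text.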

ry94080
05-18-09, 03:20 PM
It's almost like a lag: when you see a person running fast on the TV screen, you see a very slight lag in his legs. It's like it's trying to catch up?

zarlaan
05-22-09, 12:15 PM
That's a common problem with LCD TVs. Each pixel has to turn on and then off, so when a fast-moving object goes across the screen and a pixel tries to turn off, light is still being emitted, causing the ghosting or lag in the picture. Plasma TVs do not suffer from this.

Most good name-brand LCD TVs today have response times of 6ms or less, so it shouldn't be an issue with most current models, but you still have to double-check the specs to make sure. You could try a different HDMI cable, but it's more than likely the TV itself.
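
Quick back-of-the-envelope on why even a 'fast' panel can still smear a bit:

    # At 60 Hz each frame is on screen about 16.7 ms.
    frame_ms = 1000 / 60
    response_ms = 6  # a typical quoted grey-to-grey figure
    print(f"pixel spends ~{response_ms / frame_ms:.0%} of each frame mid-transition")
    # -> ~36%, so fast motion can still ghost a little even on a "6ms" panel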