: All About HDTVs as a PC or Laptop Monitor
2007-01-16, 09:11 AM
I have been trying to configure a Toshiba CRT HDTV for use with my HTPC for almost 2 days with no luck. Before I begin, my basic specs are:
Mobo: DFI AM75-TC
RAM: 580 MB of mixed SDRAM, running at 66 MHz (I know it sucks)
PROC: Athlon XP 2000+
VID: MSI FX5200 128meg
Sound: SB Audigy 2
TV Tuner: Hauppauge 150 sd tv card, connected via s-vid to a starchoice
505, and via coax to a DVD/VCR combo
OS: Win-XP Pro sp2
Screen: Toshiba 30HF86C Connected via DVI-HDMI
Now the problems:
1. I cannot find much info on this TV. If anyone knows what the native resolution is, or how to find it, please let me know.
2. If I have the computer connected via DVI to the TV and via VGA to a monitor, the NVIDIA control panel gives me lots of options to control the HD signal and manage the overscan. If I disconnect the monitor, though, all these options disappear. Anyone know why this would happen? I still have overscan problems even when no monitor is connected.
3. After hours of tweaking (with the monitor connected so I have access to all the controls), I managed to get the TV to display 720p. The problem is that the fonts, icons, and interface in Windows are blurry. Movies I play are fine, but I can barely make out the options in the movie player's menu. Does anyone know why it would be blurry? Any ideas on how to fix the problem?
If I left anything out that you might need to know, just ask. And I thank you for your responses in advance :)
2007-01-16, 01:32 PM
With regards to laptops requiring you to switch off the laptop monitor to watch video, it largely depends on whether the laptop has one or two video cards built in. Some laptops use one video card for both displays and require that both displays use the same resolution and refresh rate. This can obviously be problematic, so most newer laptops have two video cards. Windows can be configured to display the same thing on both video cards, but it has to write the updates to each card separately. When playing video this obviously requires a lot of overhead, and most laptops are not up to the task. Turning off one of the video cards cuts the amount of work that needs to be done in half and makes the job easier.
2007-01-16, 01:37 PM
@morikaweb - I have my HTPC hooked up to a 50" Toshiba CRT RPTV. My TV is 5 years old, only seems to support 1080i, and only has component inputs. In normal Windows mode it does look awful - I also get flickering due to the interlaced (i) rather than progressive (p) signal.
But I think the issue is that you need a 10-foot UI rather than a 2-foot UI. You need to install media software - like XP MCE, Beyond TV, Sage, Meedio, or something else. And/or you need to change to very large fonts and adjust the DPI settings as well.
One other thing: I had problems configuring my TV while my PC was hooked up to both a monitor and the TV. The advice I was given, which worked, was to uninstall the NVIDIA drivers, unplug all monitors other than the TV, reboot, and reinstall the drivers.
2007-01-18, 06:54 AM
I see you answered most of your questions yourself. HDMI-DVI should work for your TV; look to the NVIDIA drivers as the cause of the flicker.
Recheck your connections before giving up hope. Are they tight?
Keep in mind that your TV will display 480i, 480p, and 1080i only. 720p is not available on CRT direct-view TVs.
2007-01-18, 12:19 PM
So basically, my only usable resolution is 720x480 in progressive, I guess. Too bad that overscan is a big issue at that resolution, and for some reason NVIDIA is not giving me the option to fix it with underscan like it does in HD. Maybe PowerStrip? But I'm sure that would only be a custom-resolution fix.
Would getting a standalone upconverter help anything? For example, having a 720p signal converted to 1080i without the flickering, or just getting a 1080i signal that works. What I don't understand is why my set-top box at 1080i doesn't create flickering, and neither does my Xbox 360, but the computer does. Is it the video card? Or Windows? I do understand the concept of 30 Hz vs. 60 Hz and interlaced vs. progressive, but then how come the other components work fine in interlaced?
2007-01-18, 01:26 PM
I have a laptop I want to hook up to my Sharp LC42D62U. The lappie only has a VGA output on it. How do I do this? What are my cable options for making it happen?
2007-01-18, 06:49 PM
If your Sharp LCD HDTV does not have a VGA input, then what you can do is buy a VGA-to-RGB-component breakout cable and connect your laptop to your HDTV that way. That is the least expensive solution to your problem.
If your Sharp LCD HDTV does have a VGA input, then you can connect your laptop to your HDTV as you would any PC monitor.
2007-01-23, 08:09 PM
1.) I have a Samsung LN-S3292D. It has 2 HDMI inputs, but unfortunately it only takes a D-sub and PC audio cable for connections coming from a PC. From what I gather this is VGA and not capable of high resolution. I was hoping to watch some HD content without buying an HD-DVD player. Am I hooped, or are there any converter/adapter type things I can purchase that simulate an HDMI or DVI connection?
2.) The native resolution is 1365x768. So, the best way to connect PC and TV is to buy a video card that supports 1365x768 and set the PC to that?
3.) Sorta unrelated: since the native resolution is 1365x768, should I be running games etc. in 720p or 1080i? I've tried the two and really haven't noticed that much of a difference; from what I gather they are both doing some type of scaling?
Uh, one other thing: the manual says the HDMI/DVI jacks do not support PC connection, but I've heard they just say this so people don't call them and bother them about issues. Has anyone actually tried the HDMI-DVI-from-PC thing into the HDMI slot?
What's the worst that can happen?
2007-01-24, 04:57 AM
unfortunately it only takes D-sub
That is not unfortunate. It is probably the best way to connect the TV to a PC since it is supported. A DVI to HDMI cable should also work but it is not certain since it is not supported.
Choose a resolution as close to the TV native resolution as possible. 1280x768 might be a good starting point. You may need to underscan slightly to get the full PC screen on the display (i.e. 1280x720). Some newer video card drivers support HDTVs and HD resolutions directly. If finer control is required, try a utility called PowerStrip.
I would not use an interlaced resolution on an HDTV. Anything significantly different from the native 768p resolution will likely look poor.
2007-01-24, 11:54 AM
1. I have my Samsung DLP connected via VGA and the quality is pretty good. Granted, it's not digital so there may be a bit of loss there but it works well. Plus, when using VGA the tv allows you to use all sorts of fine-tuning to tweak the layout of the picture. I can't get that to work with the 720p or 1080i resolutions, so there's a bonus there.
2. Yes, try to get the native resolution working. As mentioned by another poster, try updating the drivers for your video card directly from the manufacturer's website. Most support TV resolutions now.
3. What type of games? Is this PC or an XBox/PlayStation type game? Assuming it's PC, try to use the closest to the native resolution that you can.
2007-01-24, 03:07 PM
Sorry, I was talking about Xbox 360 games, shoulda clarified.
I'm going to get a vga cable now actually because I've heard nothing but good things. I don't think it's worth risking screwing the tv up with hdmi because it says it's not supported.
But just to confirm: if I play an HD-format DivX TV show or movie from my laptop/PC over VGA, it won't be in true HD on the TV because it's not digital, right? I've heard VGA is basically the same as component, and when I run things in 720p over component, isn't that HD?
2007-01-24, 04:02 PM
It will be in HD, it's more a question of the quality of the signal and where the digital to analog conversion happens. In the case of component and VGA cables, the signal is converted to analog in the PC and then sent analog high definition to your TV. Theoretically, there could be some analog signal degradation between the computer and the television but realistically not much. We've been using VGA for much higher resolutions than this for years on our computer monitors.
The DVI/HDMI cable carries digital information to the television and then converts it to analog (or in some cases just draws it directly to the screen pixel for pixel in pure digital). In this case the chance of analog degradation is next to zero.
So DVI/HDMI is better, but you're still getting the same high-definition picture sent to the TV.
On to the Xbox. I personally recommend you pick up a VGA cable for the Xbox 360 if you have the spots available on the TV. Once you hook it up with VGA, suddenly the resolution of 1360x768 becomes available and you can run your games in the native resolution. Another benefit of the VGA cable is DVD upscaling. For purely political reasons the 360 outputs DVD movies at 480p, but when you use the VGA cable it unlocks the upscaling capability of the 360 and converts it all the way up to 1360x768. In my testing, it looks much better this way.
Hope this helps!
2007-01-24, 05:52 PM
Thanks very much, will definitely look into that advice.
Oh, one last question, I swear.
The actual input on my TV is called PC, not VGA. Should be OK, right? It's basically the same slot as VGA and the Xbox VGA cable.
2007-01-24, 06:35 PM
A PC input is a VGA input. Samsung was just trying to dumb things down for their users. Unfortunately not using standard terminology can cause more confusion and not less.
As for using HDMI, the problem is the digital rights management (HDCP copy protection). Most video cards of that era don't output the handshake information the TV needs to verify that the content isn't an illegal copy, so it probably won't work. It won't cause any damage, though, if you want to give it a try.
2007-01-24, 07:44 PM
Oh really? That's good news. If I damaged this TV I would freak. I love it. Maybe in the future, though, I'll try to output to HDMI. My vid card kinda sucks as of now, no DVI outputs even. I'll keep it in mind though.
2007-01-30, 03:06 AM
I was able to get a 1920 x 1080 pixel display from my PC connected to a Sharp Aquos LC-45GD5U over a DVI connector. This is a 45" flat-panel LCD TV with a native resolution of 1920 x 1080.
The TV has a DVI-I connector, which is digital DVI and analog VGA in one. The DVI connector is configured via an option in the TV's menu for "Digital PC" (uses the DVI pins) or "Analog PC" (uses the RGB pins, usually over a $10 VGA to DVI-I adapter).
Configuring the DVI input as a PC would not allow me to go past 1280 x 1024. The TV's user manual warns about this and my experience confirmed it. Obviously this is nowhere near the TV's native resolution or aspect ratio.
But when I set the TV to "Digital AV" mode, it allowed me the full 1920 x 1080 pixel resolution at a refresh rate of 30 Hz, once I set this on the PC using the Display control panel. There was no overscan or underscan, just a pixel-for-pixel representation of the Windows desktop, which is exactly what I wanted.
For some reason, configuring the TV's DVI input as "Digital PC" mode did not work out. It had to be set to "Digital AV" mode.
This requires a digital DVI output from the PC's video card. It would not work with an analog VGA output. (It also would not work with an analog VGA output converted to an analog signal on a DVI-I connector.)
I wanted this for a slide show of JPEG images at the highest resolution I could display, and it worked great, very crisp and clean images.
Hopefully this experience will help others.
2007-01-31, 12:42 PM
Very interesting... that "digital PC" mode on the TV sounds like something to look for when purchasing a new TV these days.
I'm curious - with the TV requiring you to run the video card at a refresh rate of 30 Hz, was there any picture flicker or strain on your eyes? That's a pretty low refresh rate for a PC video card to be outputting.
2007-01-31, 01:13 PM
Has anyone been using a 32" or 37" LCD TV primarily as a computer monitor? The new 1080 ones would have enough resolution for me to work with, but I'm concerned about the refresh rate (30, 60, 120 Hz?) and what is sufficient for decent viewing.
Am I the only one who thinks it would be pretty cool to be working on a 37" monitor while watching a TV show with PIP?
2007-01-31, 03:06 PM
At the distances you usually view a computer monitor from, 1080 lines probably wouldn't be good enough for a 32 or 37" monitor.
As for the slow refresh rate, LCDs don't flicker like CRTs do so that shouldn't be a problem, but it might make fast action games look a tiny bit jerky (I haven't tried it myself to confirm this).
2007-01-31, 04:32 PM
What I'm trying to figure out with the 32" or 37" set is how far back to place it in order to lose the 'screen door' vibe. Yes, my current monitor sits on my desk, but I'd be putting the 32"/37" on the wall behind the desk, so I'd be adding another 2 ft to the viewing distance.
I just did a little math with the help of my good friend Pythagoras, and found that the 37" set (assuming a 4 ft viewing distance) is pretty much the same visual size as my existing 21" (2 ft) setup. The dot pitch might be different, but I don't think I'll notice the difference at 4 ft, especially with a 1080 native resolution.
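That back-of-the-envelope comparison can be sketched in a few lines of Python. Note the 21" monitor's specs are assumed here (a 4:3 panel running 1600x1200, which the poster didn't state); swap in your own numbers:

```python
import math

def angular_width_deg(diag_in, aspect_w, aspect_h, dist_in):
    """Horizontal viewing angle of a screen, in degrees."""
    width = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
    return math.degrees(2 * math.atan(width / (2 * dist_in)))

def ppi(diag_in, px_w, px_h):
    """Pixel density (pixels per inch) along the diagonal."""
    return math.hypot(px_w, px_h) / diag_in

# 37" 16:9 TV viewed from 4 ft vs. an assumed 21" 4:3 monitor from 2 ft
tv_angle  = angular_width_deg(37, 16, 9, 48)   # ~37 degrees
mon_angle = angular_width_deg(21, 4, 3, 24)    # ~39 degrees

# Dot pitch: 1080p on the 37" vs. an assumed 1600x1200 on the 21"
tv_ppi  = ppi(37, 1920, 1080)    # ~60 ppi
mon_ppi = ppi(21, 1600, 1200)    # ~95 ppi
```

The two viewing angles come out within a couple of degrees of each other, which backs up the Pythagoras math. And although the 37" panel's dot pitch is much coarser, at double the viewing distance each of its pixels actually subtends a slightly smaller angle than the 21" monitor's do up close, so the screen-door effect shouldn't be any worse.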
It's just curious to see Apple/Dell/Samsung selling 30" 'monitors' for $1500+, while you can get a Viewsonic/Westinghouse 37" LCD TV for less than $1K. What's the real difference?