1080p TV Dilemma

New Member
✔️ HL Verified
💻 Oldtimer
Joined
Apr 4, 2005
Messages
1,714
Best answers
0
Location
Santa Cruz Mountains, California
I have the following problem:
I want to get either a 40+" plasma or LCD TV that comes 1080p. I also need it to be able to connect to the computer so I can watch all my TV shows and movies in High-def quality. I would also like to be able to play games on my computer but that really isn't the main concern.

What do you think would be best for my situation, an LCD or a plasma TV?
 
New Member
Joined
Oct 22, 2002
Messages
676
Best answers
0
My solution: save yourself the cash and get a nice 24''-30'' widescreen monitor that can display better than 1080p.

I mean, all 1080p is is the 1920x1080 resolution. Think your computer can handle that rez all the time, whether it be for viewing movies/games?

STOP!

Enough of this nonsense. It’s time to set the record straight, to clear the air about what 1080p is and isn’t.

First off, there is no 1080p HDTV transmission format. There is a 1080p/24 production format in wide use for prime time TV shows and some feature films. But these programs must be converted to 1080i/30 (that’s interlaced, not progressive scan) before airing on any terrestrial, satellite, or cable TV network.
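
For the curious, the standard way that 24-frame film content gets fitted into 60 interlaced fields per second is 3:2 pulldown: each film frame is alternately held for three fields, then two. A minimal sketch in Python (the helper name `three_two_pulldown` is just illustrative):

```python
# 3:2 pulldown: 24 film frames/s become 60 video fields/s (1080i/30)
# by repeating frames for 3 fields, then 2, alternately.

def three_two_pulldown(frames):
    """Expand a list of film frames into interlaced fields via 3:2 pulldown."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

one_second = three_two_pulldown(list(range(24)))
print(len(one_second))  # 60 fields -- exactly one second of 1080i/30
```

Twenty-four frames in, sixty fields out, which is how 1080p/24 material airs as 1080i/30 without dropping anything.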

What’s that, you say? That 1080p/24 content could be broadcast as a digital signal? That’s true, except that none of the consumer HDTV sets out there would support the non-standard horizontal scan rate required. And you sure wouldn’t want to watch 24Hz video for any length of time; the flicker would drive you crazy after a few seconds.

No, you’d need to have your TV refresh images at either a 2x (48Hz) or 3x (72Hz) frame rate, neither of which is supported by most HDTVs. If the HDTV has a computer (PC) input, that might work. But if you are receiving the signals off-air or using a DVI HDCP or HDMI connection, you’ll be outta luck.

What about live HDTV? That is captured, edited, and broadcast as 1080i/30. No exceptions. At present, there are no off-the-shelf broadcast cameras that can handle 1080p/60, a true progressive format with fast picture refresh rates. It’s just too much digital data to handle and requires way too much bandwidth or severe MPEG compression. (Consider that uncompressed 1920x1080i requires about 1.3 gigabits per second to move around. 1080p/60 would double that data rate.)
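
That back-of-the-envelope figure is easy to check. Assuming 4:2:2 chroma subsampling at 10 bits per sample (20 bits per pixel, typical of studio HD interfaces), the arithmetic looks like this:

```python
# Uncompressed video bit rate = width x height x frames/s x bits per pixel.
# Assumes 4:2:2 sampling at 10 bits per sample (20 bits/pixel), as used
# on studio HD links; consumer formats may differ.

def bitrate_gbps(width, height, frames_per_sec, bits_per_pixel=20):
    """Uncompressed video data rate in gigabits per second."""
    return width * height * frames_per_sec * bits_per_pixel / 1e9

print(bitrate_gbps(1920, 1080, 30))  # 1080i/30 -> ~1.24 Gbit/s
print(bitrate_gbps(1920, 1080, 60))  # 1080p/60 -> ~2.49 Gbit/s, double the data
```

Same pixel count per frame, twice the frame rate, so exactly twice the data, which is the doubling the article points out.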

How about Blu-ray and HD-DVD? If either format is used to store and play back live HD content, it will have to be 1920x1080i (interlaced again) to be compatible with the bulk of consumer TVs. And any progressive-scan content will also have to be interlaced for viewing on the majority of HDTV sets.

Here’s why. To cut manufacturing costs, most HDTV sets run their horizontal scan at a constant 33.8 kHz, which is what’s needed for 1080i (or 540p). 1080p scans pictures twice as fast, at 67.6 kHz. But most of today’s HDTVs don’t even support external 720p signal sources, which require a higher 44.9 kHz scan rate.
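
Those scan-rate figures fall out of a simple product: total scan lines per frame (active lines plus vertical blanking: 1125 for the 1080-line rasters, 750 for 720p, per the SMPTE standards) times the frame rate. A quick sketch; the small differences against the quoted figures come down to rounding and the exact 59.94 vs 60 Hz rates:

```python
# Horizontal scan rate = total scan lines per frame (active + blanking)
# x frame rate. Line totals per the SMPTE raster standards:
# 1125 total lines for the 1080-line formats, 750 for the 720-line format.

def h_scan_khz(total_lines, frame_rate_hz):
    """Horizontal scan frequency in kHz."""
    return total_lines * frame_rate_hz / 1000.0

print(h_scan_khz(1125, 30))  # 1080i/30 -> 33.75 kHz
print(h_scan_khz(1125, 60))  # 1080p/60 -> 67.5 kHz, twice as fast
print(h_scan_khz(750, 60))   # 720p/60  -> 45.0 kHz
```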

In the consumer TV business today, it’s all about cutting prices and moving as many sets as possible through big distribution channels. So, I ask you: Why would HDTV manufacturers want to add to the price of their sets by supporting 1080p/60, a format that no HDTV network uses?

Here’s something else to think about. The leading manufacturer of LCD TVs does not support the playback of 1080p content on its own 1920x1080 products, whether the signal is in the YPbPr component or RGB format. Only the industrial monitor version of this same LCD HDTV can accept a 1920x1080p RGB signal.

also, read this
 
New Member
✔️ HL Verified
💻 Oldtimer
Joined
Apr 4, 2005
Messages
1,714
Best answers
0
Location
Santa Cruz Mountains, California
Well, I'm getting a directx 10 system and money is no problem for me. As for the size, 40" is the smallest size I will go with, just because of where the couch and chairs are positioned.

Also, all my TV shows and movies on my computer are HD.

Edit: This is also to replace my current HDTV which is a 37" Viore LCD monitor.
 
New Member
✔️ HL Verified
💻 Oldtimer
Joined
Nov 14, 2003
Messages
3,974
Best answers
0
Well, did seeing my TV-PC setup influence this any?


haha, if you're going for 1080p, it won't matter if you choose plasma or lcd. plasma had an advantage previously with contrast ranges, but in 1080p tv's they are all pretty much the same. LCD is harder to screenburn too (although i wouldn't advise leaving your pc on a static image all the time with an LCD either, as it can still pick up image retention).

watching HD movies through tv, it'll be an interlaced image, however through bluray or hddvd players it'll be full 1080p. and you won't see a huge difference between an image that's scaled down to 1080i from p anyway.

i'd go for one of the sony bravia screens, they use samsung panels at the moment, but the bravia chipsets have by far the best refresh and colour rates.
 
New Member
Joined
Nov 24, 2001
Messages
692
Best answers
0
What’s that, you say? That 1080p/24 content could be broadcast as a digital signal? That’s true, except that none of the consumer HDTV sets out there would support the non-standard horizontal scan rate required. And you sure wouldn’t want to watch 24Hz video for any length of time; the flicker would drive you crazy after a few seconds.
LCD screens don't flicker... He's confusing the refresh rate of CRTs with fps. We Europeans watch our DVDs at 24fps (http://en.wikipedia.org/wiki/DVD-Video).
 
New Member
✔️ HL Verified
💻 Oldtimer
Joined
Nov 14, 2003
Messages
3,974
Best answers
0
agreed, i hadn't noticed that part of his statement before...

FPS and refresh rate are entirely different. in fact i believe just about all movies are shot, and displayed, at 24fps. and the more frames exposed per second, the slower the motion will appear when played back at normal speed (example: large portions of "300" were shot at 150fps for slow-motion).

almost all LCD tv's will display an image at 60hz (refresh rate).
 