Canadian TV, Computing and Home Theatre Forums


Registered · 2,056 Posts · Discussion Starter #1
How important is the p in 1080p?

A couple of years ago I hooked up some friends with a decent home theater setup: a 720p panel on the wall, 5.1 surround sound, and a cable box and Blu-ray player fed through a receiver into the panel.

Now they've gone and got a 1080p display, so I'm over there doing the changeover.

Everything's fine, except the Blu-ray player wants to limit itself to 1080i; I'm fairly sure the limiting factor here is the receiver. Nobody wants to change the receiver: it sounds great, fits the location, and best of all the whole family knows how to use it.

As I said, there's no desire to change the receiver. I thought of separating the video and audio so the HDMI would no longer need to pass through the receiver, but that has undesirable cabling costs and it's more of a workaround than a solution.

Now, I know 1080i sends the picture as alternating half-height fields and the panel's circuitry handles de-interlacing them back into a progressive output. But what real-world drawback will I be subjecting them to if I say just use it as is? Less smooth motion, maybe? Somewhat lower quality on 24p source material?
 

Registered · 5,463 Posts
For film source, i.e. 24p material, a 1080i interface, which sends 60 FIELDS per second, actually has to repeat some of the original fields/frames in a 3:2 cadence, so none of the original content is lost. It's actually sending MORE information.
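
If it helps to picture the 3:2 cadence, here's a rough Python sketch (the frame labels and field order are just illustrative, not what any particular player does) showing how 24 film frames per second get spread across 60 fields per second with nothing discarded:

Code:

# Rough 3:2 pulldown sketch: 24 film frames/sec mapped onto 60 fields/sec.
# Every other frame contributes 3 fields instead of 2 (3+2 repeating),
# so 24 frames -> 12*3 + 12*2 = 60 fields and no original content is dropped.
def pulldown_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2              # frame A gets 3 fields, B gets 2, ...
        for _ in range(repeats):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

# Four film frames (1/6 s of 24p) become ten fields (1/6 s of 60i):
print(pulldown_32(["A", "B", "C", "D"]))
# [('A', 'top'), ('A', 'bottom'), ('A', 'top'), ('B', 'bottom'), ('B', 'top'),
#  ('C', 'bottom'), ('C', 'top'), ('C', 'bottom'), ('D', 'top'), ('D', 'bottom')]

Every original frame shows up in at least two fields, which is why nothing is lost; the repeated fields are also where the 3:2 judder comes from.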

Some people are sensitive to the 3:2 judder, especially on panning shots. Personally, I don't notice it because it's no worse than the 24 fps judder that exists in film.

Assuming the display doesn't botch the de-interlacing, which was a problem for some earlier displays, they will be just fine.
 

Registered · 2,056 Posts · Discussion Starter #3
Hmm... I'm with you there.

It does run counter to intuition, though; you'd think 1080p has to be better, even if the difference is slight...
 

Registered · 3,368 Posts
Quote:
Some people are sensitive to the 3:2 judder, especially on panning shots. Personally, I don't notice it because it's no worse than the 24 fps judder that exists in film.
Over 10 years ago, early in the 3D-card wars, 3dfx and NVIDIA were battling it out with the Voodoo 2 and TNT chips. The Voodoo 2 supported higher frame rates, but the TNT supported 32-bit color. Each company created a competing demo: NVIDIA's compared 16-bit and 32-bit color to show the banding you get with 16-bit color, while 3dfx's (or was it 3Dfx at the time?) was a 30/60 demo showing a split screen of a bouncing ball at 30 and 60 fps.
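
To show what that 16-bit banding actually is, here's a small illustrative Python sketch of RGB565 quantization (not anything from either demo, just the arithmetic behind the effect):

Code:

# Illustrative only: squeezing a smooth 8-bit-per-channel gradient into 16-bit RGB565.
# Red and blue keep 5 bits, green keeps 6, so neighbouring shades collapse to the
# same value -- on screen that shows up as visible bands instead of a smooth ramp.
def to_rgb565(r, g, b):
    r5, g6, b5 = r >> 3, g >> 2, b >> 3               # truncate to 5/6/5 bits
    # expand back to 8 bits per channel (standard bit replication) for comparison
    return (r5 << 3 | r5 >> 2, g6 << 2 | g6 >> 4, b5 << 3 | b5 >> 2)

ramp = [to_rgb565(v, v, v) for v in range(256)]       # a smooth 256-step grey ramp
print(len(set(ramp)))                                  # 64 -- the ramp collapses to 64 steps

A 256-step gradient ends up with only 64 distinct shades, which is the kind of banding the NVIDIA demo was pointing at.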

They both served their goal of showing that their respective features mattered (32-bit color and 60 fps gaming), but I remember very well learning at the time about the different frame rates of various video sources. I remember being amazed that although people implicitly think of movies as "high quality", when you go to the theater you're only being shown 24 fps, which is worse than the 30 fps that 3dfx was criticizing in its 30/60 demo. After that, whenever I watched a movie and saw the camera pan, it was painful! It's interesting how I didn't seem to notice the flaws until I knew about them technically.

Of course, what you want is the highest resolution you can get, non-interlaced video, and the highest frame-rate available.

So, in an ideal world, if you had a display capable of 1080p @ 240Hz, you'd want your source material to be the same. I personally prefer 720p over 1080i because deinterlacing is complex if you want to do it as artifact-free as possible. It would be much better to never introduce the artifacts in the first place, and since almost all modern displays are natively progressive, interlacing just seems like a silly anachronism. It's video compression from the 1930s/1940s.
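
To illustrate why artifact-free deinterlacing is hard, here's a toy Python sketch (made-up one-character "scanlines", not any real TV's algorithm) of the two simplest approaches and where each one falls down:

Code:

# Toy deinterlacing sketch. Each field holds half the scanlines, sampled 1/60 s apart.
def weave(top_field, bottom_field):
    # Interleave the two fields into one frame: full detail on static images,
    # but anything that moved between fields gets "comb" / feathering artifacts.
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.extend([t, b])
    return frame

def bob(field):
    # Use one field and fill in the missing lines (here by simply repeating them):
    # no combing, but vertical resolution is halved and fine detail can flicker.
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

top    = ["..X..", "..X.."]   # object at column 2 when the top field was sampled
bottom = ["...X.", "...X."]   # object at column 3 one field-time later
print(weave(top, bottom))     # ['..X..', '...X.', '..X..', '...X.']  <- combing on motion
print(bob(top))               # ['..X..', '..X..', '..X..', '..X..']  <- clean but soft

Real deinterlacers try to detect motion per pixel and blend between these two strategies, which is roughly where the complexity (and the remaining artifacts) comes from.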

Quote:
Assuming the display doesn't botch the de-interlacing, which was a problem for some earlier displays
Of course, this depends on the definition of botch. If you define it as "visible de-interlacing artifacts persist after the video has been de-interlaced" then nobody has figured out how _not_ to botch it.
 