ATSC Card: a question on video quality - Canadian TV, Computing and Home Theatre Forums
post #1 of 19 (permalink) Old 2011-08-01, 04:26 PM Thread Starter
Rookie
 
Join Date: Mar 2011
Posts: 19
ATSC Card: a question on video quality

After I bought myself an ATSC tuner card to put in my PC, I have been pleasantly surprised at how crisp and clear the images from local broadcasts are. Gone are all the typical analog interference and artifacts.

However, as I look closer at those video images I notice something that I never saw with the analog version. Don't get me wrong - the images are really clear and sharp, but if an object moves quickly against the background I see some sort of narrow jagged edges all around it, reminding me of interlacing effects. I've wondered about this for some time and have experimented with different software to see if it improves, but nothing seems to change. If the subjects in the picture move slowly I don't see it; it only starts to appear when an object moves faster. I wonder if this has anything to do with the compression algorithm, or maybe something to do with my hardware here.

Could someone shed some light on this?
vientito is offline  
post #2 of 19 (permalink) Old 2011-08-01, 04:45 PM
Moderator
 
Join Date: Jul 2003
Location: Calgary
Posts: 3,728
It's called macroblocking and is endemic to all digital TV broadcasts. Generally less so with OTA than on digital cable/satellite.
From 57's Home Theatre FAQ:
Quote:
macroblocking - The "squares" that you sometimes see on digital programming (HD or SD) when there is too much compression of the signal, a lot of movement on screen, or scene changes. (Created by MPEG limitations (compression) in HD and compression on SD digital.) (Sometimes described as a "game of Tetris" on your TV, when you're not playing the game...) These do not go away with increased signal strength. See the sample image of macroblocking attached at the bottom of this post. It's normal whenever there is a lot of change/movement on screen - a new scene, bright lights, nature scenes, confetti, etc. This is but one type (the most common) of compression artifact.
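Macroblocking comes from the way MPEG-2 codes each frame in 8x8-pixel DCT blocks and quantizes the coefficients independently per block; when bits run short the quantizer step gets coarse and neighbouring blocks stop matching at their edges. As a rough illustration (my own pure-Python sketch, not from the FAQ), here is an 8x8 DCT round-trip showing that reconstruction error grows as the quantizer step grows:

```python
import math

N = 8  # MPEG-2 transform block size

def dct2(block):
    # 2-D type-II DCT of an 8x8 block, orthonormal scaling
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            cu = math.sqrt(1.0 / N) if u == 0 else math.sqrt(2.0 / N)
            cv = math.sqrt(1.0 / N) if v == 0 else math.sqrt(2.0 / N)
            out[u][v] = cu * cv * s
    return out

def idct2(coeffs):
    # Inverse of dct2 above
    out = [[0.0] * N for _ in range(N)]
    for x in range(N):
        for y in range(N):
            s = 0.0
            for u in range(N):
                for v in range(N):
                    cu = math.sqrt(1.0 / N) if u == 0 else math.sqrt(2.0 / N)
                    cv = math.sqrt(1.0 / N) if v == 0 else math.sqrt(2.0 / N)
                    s += (cu * cv * coeffs[u][v]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[x][y] = s
    return out

def roundtrip_error(block, qstep):
    # Quantize every coefficient with step qstep, reconstruct,
    # and report the worst per-pixel error
    c = dct2(block)
    q = [[round(c[u][v] / qstep) * qstep for v in range(N)] for u in range(N)]
    r = idct2(q)
    return max(abs(r[x][y] - block[x][y]) for x in range(N) for y in range(N))

# A smooth diagonal gradient block: fine quantization reproduces it
# almost exactly, coarse quantization visibly distorts it
grad = [[8 * x + y for y in range(N)] for x in range(N)]
fine = roundtrip_error(grad, 2)
coarse = roundtrip_error(grad, 64)
```

When every block in a busy scene is quantized this coarsely, the per-block errors no longer line up and the block grid itself becomes visible - the "Tetris" effect described above.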
downbeat is offline  
post #3 of 19 (permalink) Old 2011-08-01, 06:02 PM
Veteran
 
Join Date: Dec 2004
Posts: 2,119
ATSC uses the now-old MPEG-2 standard for video compression... If they could switch to MPEG-4 with the same amount of bandwidth (19 Mbps) it would look wonderful.

Now that's a good question... what if they did use MPEG-4? Can it be done? All the receivers would have to be capable of decoding MPEG-4, which I doubt many can at this time.
HDTV101 is offline  
post #4 of 19 (permalink) Old 2011-08-01, 06:48 PM
Veteran
 
Join Date: Jan 2009
Location: 127.0.0.1
Posts: 3,074
Of course they could use MPEG4, but most ATSC tuners don't support it. That would be just as disruptive as the NTSC --> ATSC conversion.
audacity is online now  
post #5 of 19 (permalink) Old 2011-08-02, 10:46 AM
Veteran
 
Join Date: Mar 2002
Location: Scarboro
Posts: 6,315
There is no way you could switch ATSC to MPEG-4 at this time.

I wouldn't expect a move to MPEG-4 until we go to HDTV v2. Look how long the SD-to-HD conversion has taken: Canada will not complete the transition for another 29 days. I have had an HDTV and access to an HD cable signal for about 9.5 years, and HD was around for several years before I bought mine.

I am not even aware of discussions on what HDTV v2 will look like - it will likely take years to decide on a standard and then years to start to implement.

And to get back to the original question - it could be macroblocking, or it could be the way that your software player does the deinterlacing. Try playing the file in several different players like VLC, Media Player Classic, Windows Media Player, DivX Player, etc. In many of those players you can adjust the deinterlacing - in my experience VLC has deinterlacing turned off by default. Turning it on can make a big difference.
Wayne is offline  
post #6 of 19 (permalink) Old 2011-08-02, 11:59 AM
 
Join Date: Jan 2007
Location: New Westminster
Posts: 1,251
A good step would be to try a few video decoders - if you have a list to try, give it a go, as some do better than others. It may also be the encoding the station or show used. Are you looking at SD or HD when you see this? You are not giving much info on this.
danbcman is offline  
post #7 of 19 (permalink) Old 2011-08-02, 12:12 PM
 
Join Date: Nov 2005
Posts: 820
It sounds like a de-interlacing problem. We might be able to help if you can tell us which OS, video card/drivers and playback software you are using, as well as any codec packs you might have installed.

Check the settings for your video card. ATI/Nvidia can do a lot of image post-processing. Most importantly, de-interlacing, up-conversion from 1080i to 1080p and various edge enhancements. Next, make sure your TV software player is using hardware acceleration. I prefer a clean system and would avoid codec packs unless you really need the additional functionality.

Samsung LN40A550, Asus A8N SLI, Athlon 64 X2 3800, XFX HD One 5450, 2 Hauppauge HVR-2250, LG BH10LS30 Blu-ray, Windows 7
Michael DeAbreu is offline  
post #8 of 19 (permalink) Old 2011-08-02, 12:28 PM
Veteran
 
Join Date: Feb 2009
Location: The Dandelion City
Posts: 7,131
I agree with Michael. 1080i will have interlacing artifacts when viewed in enough detail. The other explanations are also possibilities, but they don't sound like it in this case. Video post-processing (de-interlacing to 1080p) can reduce their visibility. Blending is one de-interlace algorithm that works adequately without a lot of overhead, and it can sometimes be done in the software player.
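The blend algorithm mentioned above is essentially just averaging vertically adjacent scan lines, which smooths out the "comb" offset that a moving edge leaves between the two fields. A toy sketch (my own illustration, with made-up sample rows, not anyone's actual player code):

```python
# Blend deinterlacing sketch: average each scan line with the one below it,
# so the offset between the two interleaved fields of a moving edge fades.

def blend_deinterlace(frame):
    """frame: list of rows of pixel values (even rows = one field,
    odd rows = the other). Returns the blended frame."""
    h = len(frame)
    out = []
    for y in range(h):
        below = frame[min(y + 1, h - 1)]  # clamp at the bottom edge
        out.append([(a + b) / 2 for a, b in zip(frame[y], below)])
    return out

def comb(frame):
    # Worst-case difference between vertically adjacent pixels:
    # a rough measure of how visible the combing is
    return max(abs(a - b)
               for r1, r2 in zip(frame, frame[1:])
               for a, b in zip(r1, r2))

# Synthetic combing: a vertical edge shifted 2 pixels between fields
top = [0, 0, 0, 0, 255, 255, 255, 255]      # even rows (older field)
bot = [0, 0, 255, 255, 255, 255, 255, 255]  # odd rows (newer field)
combed = [top if y % 2 == 0 else bot for y in range(6)]
smooth = blend_deinterlace(combed)
```

The trade-off is vertical softness: blending halves the comb amplitude but also blurs stationary detail, which is why motion-adaptive filters like yadif exist.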

At 20 I had a good mind. At 40 I had money. At 60 I've lost my mind and my money. Oh, to be 20 again. --Scary
ScaryBob is offline  
post #9 of 19 (permalink) Old 2011-08-02, 03:46 PM Thread Starter
Rookie
 
Join Date: Mar 2011
Posts: 19
Thanks a whole lot for your help. I turned on the de-interlacing option in VLC. Bingo - that takes care of the issue right away!

Again thanks very much
vientito is offline  
post #10 of 19 (permalink) Old 2011-08-02, 04:09 PM Thread Starter
Rookie
 
Join Date: Mar 2011
Posts: 19
Just curious to know - in terms of ATSC broadcasting, do local channels only carry the interlaced format rather than progressive? I am wondering about this since I am not sure if it has anything to do with the process (i.e. hardware and software) once the digital signal is received. Apparently it is displayed here in interlaced format, hence the need for a de-interlacing filter.
vientito is offline  
post #11 of 19 (permalink) Old 2011-08-02, 04:27 PM
Veteran
 
Join Date: Apr 2007
Location: Whitby
Posts: 2,815
The general use in N.A. today is 1080i and 720p.
Though ATSC allows progressive or interlaced for all the formats, those just tend to be the ones broadcasters choose.

Note that 1080i60 = 1080p30 in raw pixel rate, and it would use the same bandwidth (perhaps less, with the way compression algorithms work), so it's just a matter of choice on the broadcaster's part.
I think there's a topic on that somewhere.
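The 1080i60 = 1080p30 equivalence is easy to sanity-check with arithmetic (a quick sketch of my own, not from the post):

```python
# 1080i60 sends sixty 1920x540 fields per second; 1080p30 sends thirty
# full 1920x1080 frames per second. The raw pixel throughput is identical.
width, height = 1920, 1080
interlaced = width * (height // 2) * 60   # pixels/s as fields
progressive = width * height * 30         # pixels/s as frames
assert interlaced == progressive          # 62,208,000 pixels/s either way
```

Where they differ is in motion sampling (60 temporal samples per second vs 30) and in how well the encoder and deinterlacer handle the field structure - which is the trade-off broadcasters are actually choosing between.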

And one day we'll probably switch to MPEG4.
It will be nothing like NTSC -> ATSC, though.

Now that we're on ATSC, they can easily migrate to any other format, maintaining legacy support during the transition.
E.g. a channel now has a 13 Mbps MPEG2 stream and isn't using the remaining 6.4 Mbps.
That 6.4 Mbps could be used to duplicate the main stream in MPEG4, most likely at the same or better quality than the MPEG2 stream.
This can be maintained for years until they decide it's safe enough to drop the MPEG2 stream and expand the MPEG4 bandwidth or add a new subchannel, etc.
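The simulcast arithmetic above can be sketched out like this (my own back-of-envelope; the 2x H.264-vs-MPEG-2 efficiency figure is a commonly cited rule of thumb, not a number from the thread):

```python
# An ATSC channel carries about 19.39 Mbps of payload. With a 13 Mbps
# MPEG-2 main stream, the remainder could host an MPEG-4/H.264 duplicate;
# at roughly 2x the compression efficiency of MPEG-2, ~6.4 Mbps of H.264
# buys about as much picture as ~12.8 Mbps of MPEG-2.
atsc_payload = 19.39                           # Mbps, ATSC channel payload
mpeg2_main = 13.0                              # Mbps, legacy stream (from the post)
leftover = atsc_payload - mpeg2_main           # ~6.39 Mbps free
h264_efficiency = 2.0                          # assumed gain vs MPEG-2 (rule of thumb)
mpeg2_equivalent = leftover * h264_efficiency  # ~12.78 Mbps "worth" of MPEG-2
```

So under that assumption the leftover capacity really could carry a duplicate of comparable quality, which is what makes the gradual-migration idea plausible.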

It just means that old tuners would only be able to tune the MPEG2 piece, while newer tuners could read either stream.
Whereas now, a single channel can only hold NTSC or ATSC.
An MPEG4 transition is a piece of cake in comparison.
recneps77 is offline  
post #12 of 19 (permalink) Old 2011-08-02, 05:15 PM
Veteran
 
Join Date: Jan 2009
Location: 127.0.0.1
Posts: 3,074
recneps77,

Quote:
It just means that old tuners would only be able to tune the MPEG2 piece, while newer tuners could read either stream.
You argue that because the MPEG2 stream would stick around while people transition to the MPEG4 stream, that transition would be easier than NTSC --> ATSC. I find that argument pretty weak. In your transition model, when the MPEG2 signals get pulled, people with ATSC+MPEG2 tuners would be just as screwed as people who still only have NTSC tuners will be next month.

From a practical standpoint new TVs have supported both NTSC and ATSC for many years now. Both NTSC and ATSC signals were (and are) being broadcast during a transition period, and the transition is easy. Except for people who don't have new TVs.

In both cases the solution is to get a new TV (or new tuning hardware), and that in practice is just as disruptive to the consumer. And let's be honest: 99% of the economic cost of transitioning formats/protocols is the public replacing their tuners/TVs. Sure, going from MPEG2 to MPEG4 means less new hardware and less paperwork for the broadcaster, but from a total economic cost perspective that's a rounding error compared to the money required to move consumers to new TVs.
audacity is online now  
post #13 of 19 (permalink) Old 2011-08-02, 07:51 PM
Veteran
 
Join Date: Apr 2007
Location: Whitby
Posts: 2,815
It's night and day, though.

Right now it's only analog, or only digital.
Come September, there's no choice. You must be on digital (rural exceptions aside).

Once we're digital, they can maintain legacy MPEG2 for 20 years if they so wish. Can't do that now with NTSC, at least not without using double the spectrum.
It's not a required flash cut, by any means.

I'm not suggesting it would happen overnight, or even in 5 years. But the fact remains that switch would be relatively painless compared to the one to come.
Especially considering that people will come out from under their rocks and learn about ATSC once their analog sets go dark.
Those same people would pay attention the next time they say "transition in 5 years".
Considering how the cost of technology is dropping, by the time they switch to MPEG4, I doubt anyone would notice any significant cost.
Especially since most people these days have a display capable of 1080p decoding/display - that costs more than the actual ATSC chipset.

And let's be realistic here: What's the last electronic device you've bought that has lasted even 10 years?
I think consumers are going to need to replace or repair their current-gen ATSC equipment long before we switch permanently to the next generation.
recneps77 is offline  
post #14 of 19 (permalink) Old 2011-08-03, 12:42 AM
Veteran
 
Join Date: Mar 2002
Location: Scarboro
Posts: 6,315
I doubt that we go to MPEG-4 until we have HDTV v2, and hopefully that is something like 4K resolution - forget about 1080p! YouTube has had 4K videos for over a year now.

And while it isn't quite 10 years old I have a 9.4 year old Rear Projection HDTV - a Toshiba 50H81 - I doubt it will die before its 10th birthday next March. And I still have a couple of 26" CRT TVs around the house that still work fine. And then there is my Yamaha stereo receiver from the mid 1990s...
Wayne is offline  
post #15 of 19 (permalink) Old 2011-08-03, 01:47 AM
Veteran
 
Join Date: Jan 2009
Location: 127.0.0.1
Posts: 3,074
recneps77,

Quote:
Its night and day, though.
And that is where you are wrong.

From the consumer perspective it's basically the same thing: equipment replacement. It doesn't matter if we're going analog to digital, digital to analog, or to some totally unheard-of protocol. It means we need to change our tuner gear, whether that means a new TV or some other tuning device.

Quote:
Once we're digital, they can maintain legacy MPEG2 for 20 years if they so wish.
Just like "they" could (if "they" wanted) maintain analog NTSC signals after this DTV conversion; there are no laws of physics that say all OTA signals must be converted after a given date. There is really no difference here from a consumer transition perspective. An ATSC+MPEG4 signal would seem just as alien to an ATSC+MPEG2 TV as an ATSC+MPEG2 signal seems to an NTSC TV.

The only way you would be right is if existing ATSC+MPEG2 gear could start decoding MPEG4/H.264 via a software upgrade or something, but as far as I know the current HDTVs cannot be upgraded.

Personally, I don't think MPEG4 OTA TV broadcasting will ever happen. Why? Because it would be at least a decade out before anyone wants to stomach another conversion like the one we're going through now, and by that time streaming video will have taken over, so nobody will care about OTA TV.
audacity is online now  