: ATSC Card: a question on video quality

2011-08-01, 04:26 PM
After I bought an ATSC tuner card for my PC, I have been pleasantly surprised at how crisp and clear the images from local broadcasts are. Gone are all the typical analog interference and artifacts.

However, looking more closely at the video, I notice something I never saw with analog. Don't get me wrong - the images are really clear and sharp, but when an object moves quickly against the background I see narrow jagged edges all around it, reminiscent of interlacing artifacts. I have wondered about this for some time and have experimented with different software to see if it improves, but nothing seems to change. If the subjects in the picture move slowly I don't see it; it only appears when an object moves quickly. I wonder whether this has something to do with the compression algorithm, or perhaps with my hardware.

Could someone shed some light on this?

2011-08-01, 04:45 PM
It's called macroblocking and is endemic to all digital TV broadcasts. Generally less so with OTA than on digital cable/satellite.
From 57's Home Theatre FAQ (http://www.digitalhome.ca/forum/showthread.php?t=17715):
macroblocking - The "squares" that you sometimes see on digital programming (HD or SD) when there is too much compression of the signal, or a lot of movement on screen or scene changes. (Created by MPEG limitations (compression) in HD and compression on SD digital.) (Sometimes described as a "game of Tetris" on your TV, when you're not playing the game...) These do not go away with increased signal strength. See sample image of macroblocking as attachment at the bottom of this post. It's normal whenever there is a lot of change/movement on screen - new scene - bright lights, nature scenes, confetti, etc. This is but one type (the most common) of compression artifact.

2011-08-01, 06:02 PM
ATSC uses the now-old MPEG-2 standard for video compression... If they could switch to MPEG-4 with the same amount of bandwidth (19 Mbps), it would look wonderful. :p

Now that's a good question... what if they did use MPEG-4? Can it be done? All the receivers would have to be capable of decoding MPEG-4, which I doubt many can at this time.

2011-08-01, 06:48 PM
Of course they could use MPEG4, but most ATSC tuners don't support it. That would be just as disruptive as the NTSC --> ATSC conversion.

2011-08-02, 10:46 AM
There is no way you could switch ATSC to MPEG-4 at this time.

I wouldn't expect a move to MPEG-4 until we go to HDTV v2. Look how long the SD-HD conversion has taken as Canada will not complete the transition for another 29 days. I have had an HDTV and access to an HD cable signal for about 9.5 years and HD was around for several years before I bought mine.

I am not even aware of discussions on what HDTV v2 will look like - it will likely take years to decide on a standard and then years to start to implement.

And to get back to the original question - it could be macroblocking, or it could be the way your software player does the deinterlacing. Try playing the file in several different players, like VLC, Media Player Classic, Windows Media Player, DivX Player, etc. In many of those players you can adjust the deinterlacing - in my experience VLC has deinterlacing turned off by default. Turning it on can make a big difference.
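If you script playback on an HTPC, the same toggle is available from VLC's command line. A minimal sketch of building such a command from Python (the `--deinterlace` and `--deinterlace-mode` switches are standard VLC options; `recording.ts` is just a placeholder file name):

```python
# Force deinterlacing on with VLC's cheap "blend" mode.
# --deinterlace takes -1 (auto), 0 (off), or 1 (on);
# recording.ts stands in for whatever file you recorded.
cmd = ["vlc", "--deinterlace=1", "--deinterlace-mode=blend", "recording.ts"]
print(" ".join(cmd))
# To actually launch the player: subprocess.run(cmd)
```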

2011-08-02, 11:59 AM
A good step would be to try a few video decoders, since some do better than others - if you have a list to try, give it a go. It may also be the encoding the station or show used. Are you seeing this on SD or HD? You aren't giving much info on that.

Michael DeAbreu
2011-08-02, 12:12 PM
It sounds like a de-interlacing problem. We might be able to help if you can tell us which OS, video card/drivers and playback software you are using. As well as any codec packs you might have installed.

Check the settings for your video card. ATI/Nvidia can do a lot of image post-processing. Most importantly, de-interlacing, up-conversion from 1080i to 1080p and various edge enhancements. Next, make sure your TV software player is using hardware acceleration. I prefer a clean system and would avoid codec packs unless you really need the additional functionality.

2011-08-02, 12:28 PM
I agree with Michael. 1080i will show interlacing artifacts when viewed in enough detail. The other explanations are also possibilities, but they don't sound like the cause in this case. Video post-processing (de-interlacing 1080i to 1080p) can reduce their visibility. Blending is one de-interlace algorithm that works adequately without a lot of overhead, and it can sometimes be done in the software player.
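To illustrate what blending does, here is a toy sketch of blend de-interlacing, assuming a frame stored as rows of pixel intensities (the function name and data layout are invented for the example; real players work on decoded video surfaces, not Python lists):

```python
def blend_deinterlace(frame):
    """Average each scan line with the next one to soften combing.

    frame: list of rows, each row a list of pixel intensities.
    Returns a new frame of the same size; the last row is blended
    with itself, i.e. kept as-is.
    """
    h = len(frame)
    out = []
    for y in range(h):
        nxt = frame[min(y + 1, h - 1)]
        out.append([(a + b) // 2 for a, b in zip(frame[y], nxt)])
    return out

# A tiny 4x2 "interlaced" frame: even lines from one field (0),
# odd lines from the other (100) -- worst-case combing on motion.
frame = [[0, 0], [100, 100], [0, 0], [100, 100]]
print(blend_deinterlace(frame))
```

The jagged 0/100 comb pattern averages out to a smooth 50, which is exactly the trade blending makes: combing disappears at the cost of some motion blur.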

2011-08-02, 03:46 PM
Thanks a whole lot for your help. I turned on the de-interlacing option in VLC. Bingo - that took care of the issue right away!

Again thanks very much

2011-08-02, 04:09 PM
Just curious - in terms of ATSC broadcasting, do local channels only carry interlaced formats rather than progressive ones? I am wondering because I am not sure whether the process (i.e. hardware and software) after the digital signal is received plays any role. Apparently it is displayed here in interlaced format, hence the need for the de-interlacing filter.

2011-08-02, 04:27 PM
The general use in N.A. today is 1080i and 720p.
Though ATSC allows both progressive and interlaced variants of its formats, those just tend to be the ones broadcasters choose.

Note that 1080i60 carries the same pixel rate as 1080p30, and it would use the same bandwidth (perhaps less, given the way compression algorithms work), so it's just a matter of choice on the broadcaster's part.
I think there's a topic on that somewhere.
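The pixel-rate equivalence is easy to check on paper (a back-of-envelope sketch; each 1080i field carries 540 of the 1080 lines):

```python
# 1080i60 delivers 60 fields/s, each field carrying half the lines;
# 1080p30 delivers 30 full frames/s. The raw pixel rates are identical.
interlaced = 1920 * (1080 // 2) * 60   # pixels per second for 1080i60
progressive = 1920 * 1080 * 30         # pixels per second for 1080p30
print(interlaced, progressive, interlaced == progressive)
```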

And one day we'll probably switch to MPEG4.
It will be nothing like NTSC -> ATSC, though.

Now that we're on ATSC, they can easily migrate to any other format, maintaining legacy support during the transition.
E.g. a channel now has a 13 Mbps MPEG2 stream and isn't using the remaining 6.4 Mbps.
That 6.4 Mbps could be used to duplicate the main stream in MPEG4, most likely at the same or better quality than the MPEG2 stream.
This can be maintained for years until they decide it's safe enough to drop the MPEG2 stream and expand the MPEG4 bandwidth or add a new subchannel, etc.
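As a rough sanity check on those numbers (the 19.4 Mbps payload and the 13/6.4 split come from the post above; they are illustrative, not measured):

```python
# Back-of-envelope check of the simulcast idea.
channel = 19.4              # approximate usable ATSC payload, Mbps
mpeg2 = 13.0                # the channel's assumed current MPEG2 bitrate
leftover = channel - mpeg2  # bandwidth free for an MPEG4 duplicate
ratio = mpeg2 / leftover    # efficiency gain MPEG4 would need to match quality
print(f"{leftover:.1f} Mbps free; MPEG4 must be ~{ratio:.2f}x as efficient as MPEG2")
```

So with these figures the simulcast works if MPEG4 is roughly twice as efficient as MPEG2, which is in line with the commonly quoted 2:1 figure for H.264.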

It just means that old tuners would only be able to tune the MPEG2 piece, while newer tuners could read either stream.
Whereas now, a single channel can only hold NTSC or ATSC.
An MPEG4 transition is a piece of cake in comparison.

2011-08-02, 05:15 PM

It just means that old tuners would only be able to tune the MPEG2 piece, while newer tuners could read either stream.

You argue that because the MPEG2 stream would stick around while people transition to the MPEG4 stream, that transition would be easier than NTSC --> ATSC. I find that argument pretty weak. In your transition model, when the MPEG2 signals get pulled, people with ATSC+MPEG2 tuners would be just as screwed as people who still only have NTSC tuners will be next month.

From a practical standpoint new TVs have supported both NTSC and ATSC for many years now. Both NTSC and ATSC signals were (and are) being broadcast during a transition period, and the transition is easy. Except for people who don't have new TVs.

In both cases the solution is to get a new TV (or new tuning hardware), and in practice that is just as disruptive to the consumer. And let's be honest: 99% of the economic cost of transitioning formats/protocols is the public replacing their tuners/TVs. Sure, going from MPEG2 to MPEG4 means less new hardware and less paperwork for the broadcaster, but from a total economic cost perspective that's a rounding error compared to the money required to move consumers to new TVs.

2011-08-02, 07:51 PM
It's night and day, though.

Right now it's only analog, or only digital.
Come September, there's no choice. You must be on digital (rural exceptions aside).

Once we're digital, they can maintain legacy MPEG2 for 20 years if they so wish. Can't do that now with NTSC, at least not without using double the spectrum.
It's not a required flash cut, by any means.

I'm not suggesting it would happen overnight, or even in 5 years. But the fact remains that switch would be relatively painless, compared to the one to come.
Especially considering the people will come out from under their rocks and learn about ATSC once their analogs go dark.
Those same people would pay attention the next time they say 'transition in 5 years' ;)
Considering how the cost of technology is dropping, by the time they switch to MPEG4, I doubt anyone would notice any significant cost.
Especially since most people these days have a display capable of 1080p decoding/display. That costs more than the actual ATSC chipset.

And let's be realistic here: What's the last electronic device you've bought that has lasted even 10 years?
I think consumers will need to replace or repair their current-gen ATSC equipment long before we switch to the next gen permanently.

2011-08-03, 12:42 AM
I doubt that we go to MPEG-4 until we have HDTV v2 and hopefully that is something like 4K resolution - forget about 1080p! Youtube has had 4K videos (http://youtube-global.blogspot.com/2010/07/whats-bigger-than-1080p-4k-video-comes.html) for over a year now.

And while it isn't quite 10 years old I have a 9.4 year old Rear Projection HDTV - a Toshiba 50H81 - I doubt it will die before its 10th birthday next March. And I still have a couple of 26" CRT TVs around the house that still work fine. And then there is my Yamaha stereo receiver from the mid 1990s...

2011-08-03, 01:47 AM

It's night and day, though.

And that is where you are wrong.

From the consumer perspective it's basically the same thing: equipment replacement. It doesn't matter if we're going analog to digital, digital to analog, or to some totally unheard-of protocol. It means we need to change our tuner gear, whether that means a new TV or some other tuning device.

Once we're digital, they can maintain legacy MPEG2 for 20 years if they so wish.

Just like "they" could (if "they" wanted) maintain analog NTSC signals after this DTV conversion; there are no laws of physics that say all OTA signals must be converted after a given date. There is really no difference here from a consumer-transition perspective. An ATSC+MPEG4 signal would seem just as alien to an ATSC+MPEG2 TV as an ATSC+MPEG2 signal seems to an NTSC TV.

The only way you would be right is if existing ATSC+MPEG2 gear could start decoding MPEG4/H.264 via a software upgrade or something, but as far as I know the current HDTVs cannot be upgraded.

Personally, I don't think MPEG4 OTA TV broadcasting will ever happen. Why? Because it would be at least a decade out before anyone wants to stomach another conversion like the one we're going through now, and by that time streaming video will have taken over, so nobody will care about OTA TV.

2011-08-03, 09:18 AM
..., and by that time streaming video will have taken over, so nobody will care about OTA TV.

Don't say that in the OTA forums here!

Michael DeAbreu
2011-08-03, 04:05 PM
How can the broadcasters maintain a legacy MPEG-2 signal without either doubling their broadcast spectrum or severely degrading the bitrates to fit both within the same frequency allotment?

Beyond compression efficiencies, switching to MPEG-4 doesn't appear to bring anything new or spectacular to the consumer. It is certainly finding a place in wired/wireless Internet and in HTML 5. The next broadcasting standard could be 4K with the pending re-allotment of frequencies from broadcast to Internet.

2011-08-03, 07:17 PM
Just like "they" could (if "they" wanted) maintain analog NTSC signals after this DTV conversion; there are no laws of physics that say all OTA signals must be converted after a given date.

No, they can't. Because their licenses expire on the 31st of August.
If they want to continue broadcasting, it's in ATSC.

It's not a law of physics, it's a law of Canada.

Wayne, I meant items bought in the last 10 years! ;) (ok, the TV is just under the wire, but hardly 21st century tech)

I doubt a CRT will ever die in less than 10 years, unless someone takes a bat to it..

Michael, 4K would be awesome. But considering how long it took to go from 480i to 1080i (despite it being commercially available and affordable long before now), I think that might be a while :p

And if you look at the bitrate used by many channels, they're not at their max of 19.4 Mbps.
Assuming MPEG4 is roughly twice as good at compression as MPEG2 (or better):
A station only using 13 Mbps of MPEG2 has enough bandwidth left to duplicate its entire channel at comparable quality in MPEG4.
And as time went on during this theoretical transition, they could slowly allocate more bandwidth to the '4' side (and thus less to '2'), notify the '2' side that the channel is available with superior quality via a new tuner, announce that MPEG2 broadcasts will cease by such and such a date, etc.
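That gradual reallocation can be sketched in a few lines. All the figures here (the 13/6.4 Mbps split and a 2 Mbps-per-year pace) are illustrative assumptions, not any real broadcaster's plan:

```python
# Shift 2 Mbps per year from the legacy MPEG2 stream to the MPEG4
# stream until MPEG2 is gone, printing the allocation each year.
mpeg2, mpeg4 = 13.0, 6.4   # assumed starting split of the channel, Mbps
year = 0
while mpeg2 > 0:
    year += 1
    shift = min(2.0, mpeg2)
    mpeg2 -= shift
    mpeg4 += shift
    print(f"year {year}: MPEG2 {mpeg2:4.1f} Mbps, MPEG4 {mpeg4:4.1f} Mbps")
```

At this pace the legacy stream winds down over seven years, with the MPEG4 side ending up with the full channel payload.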

2011-08-03, 08:35 PM
Sounds like a playback problem.

These are typical problems that HTPC users have to deal with. It's a pastime of constant tweaks and fixes, finding the best player for a certain format, etc.

The good part is almost any video unit can do MPEG2 just fine as long as the decoder/player is good.

I only use Windows Media Center and/or Windows Media Player for ATSC playback. For one, I record in DVR-MS on my main server running Vista, and even .wtv on the kitchen PC running Windows 7, so it's not like I have many other options.