Canadian TV, Computing and Home Theatre Forums

Why is OTA DTV audio easily broken with weak signal?

I use OTA HD, and sometimes I watch a weak channel. When the picture is even slightly garbled, the audio is intermittent. Why?

Audio is 448 Kbps and video is 19200 Kbps, so audio is only about 2.3% of the total data for HD. Why does even a slight garble in the video result in bad audio? I understand if bad reception causes me to lose, say, 10-20% of the data, but what's the likelihood that the 2.3% of it that is audio gets hit ALL THE TIME?
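To put rough numbers on it (a quick Python sketch of my own; the 1% loss rate is just an assumed figure for illustration, and real reception errors tend to arrive in bursts rather than hitting packets at random):

```python
# Back-of-the-envelope sketch using the bit rates quoted above.
TS_PACKET_BITS = 188 * 8        # MPEG-2 transport packets are 188 bytes
TOTAL_RATE     = 19_200_000     # ~19.2 Mbps total stream
AUDIO_RATE     = 448_000        # 448 Kbps AC-3 audio

total_pkts_per_sec = TOTAL_RATE / TS_PACKET_BITS   # ~12,800 packets/s
audio_pkts_per_sec = AUDIO_RATE / TS_PACKET_BITS   # ~300 packets/s

loss_rate = 0.01                # assume 1% of packets lost, purely illustrative
audio_hits_per_sec = audio_pkts_per_sec * loss_rate

print(f"audio share of packets: {audio_pkts_per_sec / total_pkts_per_sec:.1%}")
print(f"audio packets hit per second at {loss_rate:.0%} loss: {audio_hits_per_sec:.1f}")
# Each hit can corrupt a whole 32 ms AC-3 frame, and decoders typically mute
# rather than try to conceal the error, so a few hits per second sounds like
# constant dropouts even though audio is only ~2.3% of the data.
```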

With analog OTA, the picture can be ridiculously fuzzy and the audio still stays crystal clear.

I'm using an LG 50PK550, with the antenna plugged directly into the TV.
Besides the FEC data, the RDC diagnostic page on my cable STB lists the number of retransmissions. Just saying.
The cable digital video system has no retransmission, as it was designed to operate over one-way cable networks feeding tens of thousands of customers. Broadcast video is like UDP packets on the internet. The retransmission data you are seeing relates to the reverse channel, which is used for reporting PPV purchases, requests for channels in switched digital video, and any interactive TV services.
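To make the "no return path" point concrete, here is a minimal sketch of my own (not anything from the cable or ATSC specs): a two-way link can simply re-request lost packets, while a one-way broadcast can only recover whatever its forward error correction happens to cover.

```python
import random

random.seed(1)
LOSS = 0.05                          # assumed 5% packet loss, for illustration
packets = list(range(1000))

# Two-way link (internet-style): a lost packet is simply re-requested.
retransmissions = 0
for p in packets:
    while random.random() < LOSS:    # keep resending until it gets through
        retransmissions += 1
print(f"two-way: all {len(packets)} delivered after {retransmissions} retransmissions")

# One-way broadcast: no return path, so a lost packet is gone unless the
# forward error correction sent ahead of time can rebuild it.
FEC_REPAIR_RATE = 0.6                # assume FEC repairs 60% of losses (illustrative)
lost = [p for p in packets if random.random() < LOSS]
unrepaired = [p for p in lost if random.random() > FEC_REPAIR_RATE]
print(f"one-way: {len(lost)} lost, {len(unrepaired)} never recovered")
```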

Cable, satellite and OTA digital transmission are based on technologies over 15 years old. In computer terms, processing power was back in the 386/486 era and memory was hugely expensive compared to current prices. The system designers had to balance cost against performance, so the level of FEC and other error-correction techniques was constrained by cost.

With respect to the original question about audio failure in OTA, the problem has attracted the interest of the ATSC. People are used to the audio surviving in NTSC even when the video is seriously impaired, but with the ATSC design they both fail at the same time. I believe some effort was put into increasing the error protection for the audio stream PIDs, but I don't recall whether it was ever turned into a standard and implemented.

The original design could have included additional error correction for the audio, but the case of viewers listening to the audio without a watchable picture was not considered. An alternative design for digital TV could have provided different levels of picture quality with different amounts of error correction: a basic lower-resolution picture would get the most error correction to provide a very wide coverage area, the next resolution level would get less, and full high definition would get the least. The quality of picture received would then depend on signal strength and quality. This more complex solution would have cost more and reduced the data capacity available for a full high-definition image, but it would have provided a more robust signal for fringe-area viewers or people using indoor antennas.
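As a toy illustration of that layered idea (my own sketch with made-up code rates, not taken from any actual standard), each layer can be modelled as a block code that fails once the channel error rate exceeds what it can correct; the heavily protected base layer keeps decoding long after the full-HD layer has fallen apart:

```python
from math import comb

def block_failure_prob(n, t, p):
    """Probability a block of n symbols has more than t errors and
    therefore cannot be corrected (simple binomial model)."""
    ok = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t + 1))
    return 1 - ok

# Hypothetical layers: (name, block size n, correctable errors t).
# These numbers are illustrative only, not from any broadcast standard.
layers = [("basic SD layer", 200, 40),
          ("middle layer",   200, 20),
          ("full HD layer",  200, 8)]

for p in (0.01, 0.05, 0.10, 0.15):
    report = ", ".join(f"{name}: {block_failure_prob(n, t, p):.1%} of blocks lost"
                       for name, n, t in layers)
    print(f"symbol error rate {p:.0%} -> {report}")
```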