
OTA ERP Levels (Analogue vs. DTV) - why so different?

22402 Views 96 Replies 20 Participants Last post by  pr92
Effective radiated power, Canada vs. U.S.

Why is it that U.S. metropolitan areas have DTV stations on UHF that use their allocations at their maximum parameters, often pushing the ERP to 1,000 kW, whereas no station in Canada is expected to even come close to that?
Is it possible all those megawatt DTV stations in the U.S. are throwing out more power than is necessary for good coverage? Or is it that Canadian broadcasters are purposefully being timid in their OTA DTV plans, waiting to see how well/poorly their service is received before ramping up their ERPs further?
Industry Canada's post-transition allocations for full-power UHF TV stations are commonly set at 1,000 kW (300 metres HAAT), and its website states the maximum ERP technically allowed is 2,000 kW (200 metres HAAT).
1 - 20 of 97 Posts
No U.S. digital TV station can go beyond 1,000 kW ERP.
@thenewdc: Right. I was talking about Canada when referring to the 2 MW limit.
In recent correspondence with a local Calgary broadcast engineer, I was told that, given the physical situation in Canada with EHAATs and coverage requirements, I would not find any IOT-based DTV applications; IOTs are only used for requirements of 20+ kW at the transmitter output. Using a 13 dB antenna gain and neglecting losses, that would correspond to around 400 kW ERP.

In the U.S., IOTs are frequently used, especially in the larger population centres. What I'm not sure of is whether US coverage is required over a larger area than in Canada for similar noise boundary values.

I also see that the DTV UHF services in the UK use a maximum of 200 kW ERP (20% of the previous analogue ERP from the same towers)!
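The 20 kW / 13 dB / "around 400 kW" arithmetic in the first paragraph is just a dB-to-linear conversion. A quick illustrative sketch (the function name and the optional line-loss parameter are my own, not anything from the post):

```python
def erp_kw(tx_power_kw: float, antenna_gain_db: float, line_loss_db: float = 0.0) -> float:
    """ERP = transmitter output power scaled by net antenna gain (dB -> linear)."""
    net_gain_db = antenna_gain_db - line_loss_db
    return tx_power_kw * 10 ** (net_gain_db / 10)

# 20 kW at the transmitter output into a 13 dB antenna, neglecting losses:
print(f"{erp_kw(20, 13):.0f} kW ERP")  # 399 kW ERP, i.e. "around 400 kW"
```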
Heh. Many people would argue that even at 1 MW, there's not enough power. Where I'm from in central West Virginia, the rolling hills absolutely kill UHF signals.

You do know that the combination of ERP and HAAT determine range, and that a lower HAAT requires a greater ERP to maintain the same range? In the US, if you choose to install your antenna below the reference height, you normally cannot exceed the ERP limit to achieve equivalent range. The exception is VHF DTV, and that is by special permission.

Oh, to address your question directly, the goal was to replicate the Grade B NTSC service area with an equivalent reliability. Without double-checking, I don't know if the DTV equivalent is the noise-limited value or something a little higher.

BTW, TV ranges in the Northeast are more constrained than in other regions.
Oh yes - ERP and EHAAT determine the range, but the big question for us in Canada is why Canadian ERPs seem to be generally lower than US ones?
Canadian mast and tower heights/EHAATs do not seem to be generally greater!
So why are the US ERPs so much higher? In our city the highest UHF DTV ERP that I've heard of is 220 kW, and that's not unusual here! As I said in my earlier post, Canada has few or no IOT-based broadcast transmitters AFAIK, yet the DTV services are replacing ERPs of 100 kW low band, 325 kW high band, and whatever power of UHF analogue service existed?

Maybe Americans can get away with an indoor antenna more often than we can at home, or maybe we are not trying to duplicate the old analogue services exactly??
An IOT is a high-power vacuum-tube amplifier used in the final stages of higher-powered UHF DTV and some NTSC transmitters. It is the modern replacement for the old klystron; being more energy efficient, it is now much preferred over the klystron.

It is very linear and consists of many tuned cavities, but it has the disadvantage of requiring cooling (usually a distilled water/anti-freeze mix in a closed loop with fin-fan forced-air coolers mounted on the roof of the transmitter building) plus a beefy 10-20 kV high-voltage supply. It also needs more drive power than the old klystron due to its inherently lower gain, and typically uses several intermediate power amplifiers (IPAs), usually of the solid-state variety and connected in parallel, to feed the first tuned input cavity. The power output is usually 20 kW or greater. IOTs can also be run in parallel for higher-powered applications.
CBC and Radio-Canada at Mount Royal will be on 30 kW MSDC IOTs, channels 19 and 21.
Woww! - Really? - OK - at last some proper power! So what ERP will this correspond to with Mont Royal's aerial (antenna) gain? 600 kW ERP?
What brand of transmitter will be used? Larcan?

Sorry TVlurker - I had omitted to say that IOT stands for Inductive Output Tube, and MSDC IOT for Multi-Stage Depressed Collector Inductive Output Tube

The idea is that the MSDC IOT becomes more energy efficient by recovering beam energy across several collector stages, each with a smaller voltage drop, reducing the overall power consumption. This does, however, introduce additional cooling concerns.
Thanks for the explanations.
How does dissipating more waste heat make it more energy efficient? Or is it just that the heat is generated over a much smaller area, and therefore needs local cooling?

With respect to the ERP issue, you can see that many allotments in the post-transition plan are in the 450-850 kW range. Good news to hear that CBC/SRC Montreal will be using theirs.

So what's the typical gain for a new omnidirectional antenna -- 13, 15 dB? So a 30 kW IOT on a short-ish tower like the Mount Royal candelabra would give an ERP in the 600-1000 kW range? I think the 19 and 21 allotments in Montreal are for 1 MW and 845 kW at 300 m EHAAT.
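Working that question backwards: the net antenna gain needed to reach the Montreal allotments from a 30 kW transmitter falls right in the 13-15 dB range mentioned above. A small sketch (function name is mine, for illustration only):

```python
import math

def gain_needed_db(target_erp_kw: float, tx_power_kw: float) -> float:
    """Net antenna gain (dB, after losses) needed to reach a target ERP."""
    return 10 * math.log10(target_erp_kw / tx_power_kw)

# Montreal allotments (1 MW on ch 19, 845 kW on ch 21) from a 30 kW IOT:
print(f"{gain_needed_db(1000, 30):.1f} dB")  # 15.2 dB
print(f"{gain_needed_db(845, 30):.1f} dB")   # 14.5 dB
```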
About as general and non-specific as I can be (please don't flame me, transmitter techs!): an electron beam runs continuously in a klystron tube, but after it is blasted into the tube at one end, the electron beam has to be caught, so the collector does that job at the other end. An IOT does that quite simply and effectively, but at the cost of a great deal of energy that is usually dispersed as heat. An MSDC is a special collector that not only catches the electron beam as usual but then acts as an "energy reclaimer", allowing recovery of much more of the electron beam's residual energy than with a simple IOT. An analogy would be that an IOT is just a big baseball catcher's mitt, while an MSDC has the catcher's mitt located at the end of several hallways that cause the baseball to lose speed as it ricochets off each wall and bounces a few times. With electrons, the heat involved in an MSDC is huge, so that end of the tube needs great cooling capacity, but the payoff is a higher rate of recaptured energy that can thus be reinvested in the entire process.

I would leave it to L'inquisiteur, Billsmith, and other transmitter techs to go into more detail if needed.
i should go to bed earlier, i can't take all this learning all at once.
in addition to learning what an epiphany is,

looking at Stampeder's station status, seems i may be in the same boat with CFTO/CTV as most canadians from Toronto/Hamilton are with WNGS.
CFTO's ERP post transition is gonna be like 15 dB down from current ERP,
on the same channel, at the same height. didn't realize that till now.
be interesting if it works here.

10 * log (10800 / 325000) = -14.8 dB
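That one-liner above is the standard power-ratio-in-decibels calculation; as a quick check in Python (function name is mine, purely illustrative):

```python
import math

def power_change_db(new_erp_kw: float, old_erp_kw: float) -> float:
    """Power ratio expressed in decibels; negative means a reduction."""
    return 10 * math.log10(new_erp_kw / old_erp_kw)

# CFTO: 10.8 kW post-transition DTV vs 325 kW analogue peak
print(f"{power_change_db(10.8, 325):.1f} dB")  # -14.8 dB
```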
CFTO's ERP post transition is gonna be like 15 dB down from current ERP,
on the same channel, at the same height.
To be clear, they will be on the same channel and height as their analog broadcast. Compared to their transitional DTV broadcast, they will be on VHF instead of UHF, have better elevation and only slightly less power.
...looking at Stampeder's station status, seems i may be in the same boat with CFTO/CTV as most canadians from Toronto/Hamilton are with WNGS.
CFTO's ERP post transition is gonna be like 15 dB down from current ERP,
on the same channel, at the same height. didn't realize that till now.
be interesting if it works here.

10 * log (10800 / 325000) = -14.8 dB
The comparison is not valid because the NTSC and ATSC signals are completely different. In NTSC, the maximum ERP corresponds to the sync tips (negative modulation). A 100 kW transmitter is only at full power during the sync tips, while the maximum video is about 66 kW when the station is sending black. At maximum white, ERP will be around 5 kW. The sound is transmitted by a separate FM transmitter which operates with a constant power output. Energy across the TV channel is concentrated around the visual carrier, colour sub-carrier and the aural carrier.

The ATSC signal spreads energy across the whole 6 MHz band regardless of the picture content. The signal level changes with each data symbol to a handful of specific power levels but when observed on a spectrum analyser the signal has almost vertical sides and a flat top.

In NTSC, every change of signal level corresponds to a change in picture brightness. Any electrical noise in the signal appears as a random change in brightness which we call 'snow'. In ATSC, signal noise causes data errors but because of the channel coding and error coding, the errors can be corrected or masked until the signal drops below a level called the threshold where the number of data errors exceeds the ability of the coding schemes to make corrections. The ability of the digital system to withstand data errors is called coding gain.

The ATSC system also has an equalizer built into the tuner which reduces the effects of multipath transmission (ghosts) resulting in a cleaner signal being fed to the digital detector. An NTSC ghost cancelling reference signal was developed and tested but never became a feature in analog sets.

In summary, ATSC signals can be transmitted at much lower numerical ERP levels and still provide the same coverage area as an NTSC station.
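To put the NTSC power levels quoted above on the dB scale used elsewhere in this thread, here is a small sketch using the poster's 100 / 66 / 5 kW figures (the helper name is mine):

```python
import math

def rel_db(p_kw: float, peak_kw: float = 100.0) -> float:
    """Power level relative to the sync-tip peak, in dB."""
    return 10 * math.log10(p_kw / peak_kw)

print(f"black (max video): {rel_db(66):.1f} dB")  # -1.8 dB below sync-tip peak
print(f"max white:         {rel_db(5):.1f} dB")   # -13.0 dB below sync-tip peak
```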
That is probably the most succinct and readable explanation of why ATSC DTV signals require much lower effective radiated power levels than the old NTSC analogue signals did to cover the same area. :)
Thanks GeorgeMx for the excellent explanation. This also explains why so many people over-estimate the power savings from converting to DTV. The NTSC ERP is the peak power (reached only on the sync tips), whereas the ATSC ERP is the statistical average power.
Analog ERP allocations and measurements are always PEAK power.
Digital ERP allocations and measurements are always AVERAGE power.
Digital PEAK power is about 7+ dB HIGHER than its AVERAGE power.

Due to modern error correction techniques, Digital reportedly also requires
about 4+ dB LOWER power levels to provide equivalent "quality".

Hence, many Digital average ERP power allocations are 12 dB LOWER than Analog
Peak ERP power allocation....BUT ONLY IF they are on the SAME frequency.
Since propagation loss (in dB) varies as 20*log(frequency), a lot more power is
needed on higher channel numbers....
Hence, many Digital average ERP power allocations are 12 dB LOWER than Analog Peak ERP power allocation....BUT ONLY IF they are on the SAME frequency.
Isn't this only true for UHF? I gather that post transition in the US they discovered that VHF-HI requires about 10 dB more power than anticipated and VHF-LO requires 15 to 20 dB more power than anticipated.
As a TV design engineer in my earlier life, I will add my input to this discussion.

Analogue TV is not that robust. In order to receive a perfect picture, any interfering signal has to be -53 dB or lower relative to the wanted signal so as not to be visible. Put another way, the wanted signal has to be ~400 times stronger than any interference. This includes tuner or amplifier noise (snow) and co-channel interference from a distant TV transmitter. In the design lab, we used to assume a noise-free picture on UHF required 750 µV of signal at the TV antenna input. So, analogue transmitters have to be powerful to provide a good field strength in the service area.

Analogue TV transmitters often have a positive or negative frequency offset from the actual channel frequency. This is to minimise interference effects from another TV transmitter on the same channel, often for adverse propagation conditions when signals travel further than intended. The frequency offset is related to the horizontal frequency divided by a whole number up to 12, but more often in the order of 6 or 8. This value is added to or subtracted from the actual channel frequency. Then if there is co-channel interference from another TV transmitter, it appears as thin light and dark horizontal bars. If the co-channel interference increases, the intensity of the bars gets worse and can sometimes be sufficient to ruin reception. BTW, there is nothing worse than watching a zero beat between two transmitters. This appears as very broad light and dark diagonal bars - maybe only 2 to 4 over the screen - which rotate and increase or reduce in numbers as the frequency difference varies by a few Hz.

As mentioned in the first paragraph, 750 µV is needed on a UHF channel on analogue TV for a noise-free picture (53 dB S/N ratio). By contrast, the FCC has determined that the drop-dead S/N ratio on DTV is 16 dB on UHF. This equates to about 7.5 µV at 600 MHz, a factor of 100 (a 40 dB voltage ratio) between ATV and DTV. As we know, with DTV there is a cliff effect where the signal is either received well or not at all, so there needs to be a margin to allow for fades etc. Even if 75 µV is assumed as a good strength for DTV, this is still 20 dB, or 10 times lower as a voltage ratio. Therefore, the transmitter power on DTV can theoretically be reduced to 1/10 or less of that of the analogue transmitter to achieve the same coverage on the same channel.
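One detail worth making explicit: voltage ratios use 20*log10, not the 10*log10 used for power ratios earlier in the thread, because power goes as the square of voltage. A minimal check of the figures above (function name is mine):

```python
import math

def voltage_ratio_db(v1_uv: float, v2_uv: float) -> float:
    """Voltage ratio in dB: 20*log10, since power scales as voltage squared."""
    return 20 * math.log10(v1_uv / v2_uv)

print(voltage_ratio_db(750, 7.5))  # 40.0 dB: analogue "noise-free" vs DTV threshold
print(voltage_ratio_db(750, 75))   # 20.0 dB even with a 10x margin for DTV fades
```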

Many of the US DTV transmitters are putting out higher power than is really necessary for service-area coverage. Because less signal is required at a DTV receiver than for ATV, those of us favourably placed can pick up US DTV stations from 100 miles or more and find them perfectly watchable. (The picture, that is, not the programme.) ;)