I'm glad the discussion turned to baluns. At 88-108 MHz, where I do most of my playing around, I've always regarded these things as pesky, unavoidable components, with a nagging loss that frequently is too small to worry about. Sometimes I've bitten the bullet and taken the trouble to create an antenna design that is inherently 75-ohm rather than 300-ohm. Afterward I always wonder whether I've wasted my time, since at FM frequencies the loss is just 0.75 dB or so.
I believe the UHF-TV balun loss numbers at AVS Forum are more realistic than the KYES numbers. The larger losses are troubling.
I test baluns by placing two in series so I can use 75-ohm unbalanced test equipment. I have 50-ohm test gear, but I use resonant 50:75-ohm networks to match everything. I'll have to get some wideband matching networks to test baluns at UHF-TV frequencies. I have a homebrew, wideband, minimum-loss 50:75-ohm resistive pad, but the other day I tested it at UHF and I wasn't impressed with its transparency. I made its resistor lead lengths very short, but evidently not short enough. Chip resistors should work though.
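For anyone who wants to build the minimum-loss resistive pad mentioned above, the standard design is a series resistor on the high-impedance side and a shunt resistor across the low side. Here's a small sketch of the arithmetic; the formulas are the textbook ones for a two-resistor minimum-loss pad, and the function name is my own:

```python
import math

def min_loss_pad(z_hi, z_lo):
    """Minimum-loss resistive pad matching z_hi down to z_lo (z_hi > z_lo).
    Returns (series R on the high side, shunt R across the low side, loss in dB)."""
    r_series = math.sqrt(z_hi * (z_hi - z_lo))
    r_shunt = z_lo * math.sqrt(z_hi / (z_hi - z_lo))
    ratio = z_hi / z_lo
    loss_db = 20 * math.log10(math.sqrt(ratio) + math.sqrt(ratio - 1))
    return r_series, r_shunt, loss_db

rs, rp, loss = min_loss_pad(75.0, 50.0)
print(f"series {rs:.1f} ohm, shunt {rp:.1f} ohm, loss {loss:.2f} dB")
# series 43.3 ohm, shunt 86.6 ohm, loss 5.72 dB
```

That 5.72 dB is the price of a perfect wideband match in both directions, which is why the resistor lead lengths matter so much: any stray reactance spoils the one thing the pad is supposed to guarantee.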
What I mainly wanted to mention is that I've noticed that the input impedance and transmission loss depend on exactly how I couple two test baluns. Sometimes I have just touched the spade lugs together. This works if you can get a reliable connection. But the impedance characteristics depend on the spacing between the spade lugs, as you might expect. The impedance of a transmission line is a function of the conductor spacing and the dielectric constant of the material in between. If the twin-lead that exits the balun is really 300 ohms, then when the dielectric is split and carved out and the resulting dielectric becomes mostly air, the impedance is no longer 300 ohms. Essentially you have inserted a short section of transmission line of higher impedance. This can degrade a good match. You really see the effect with antennas that have several inches between their 300-ohm feed terminals. Although I haven't measured it, the impedance at the 75-ohm end of a balun is not likely to be the same whether you use the full balun lead length and gradually fan out the conductors to the terminal spacing, or shorten the leads and feed with wires at right angles to the intact twin-lead portion. The latter case mimics the usual feedpoint computer model, which is a straight wire between feed terminals. I think the dimensions involved are large enough to make quite a difference at UHF-TV frequencies. This issue deserves its own thread.
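To put rough numbers on the fanned-out-leads effect, the parallel-wire line formula Z0 = (120/sqrt(eps_eff))·acosh(D/d) shows how fast the impedance climbs with spacing. The wire size, spacing, and effective permittivity below are illustrative guesses, not measurements of any particular twin-lead:

```python
import math

def twin_line_z0(spacing_mm, wire_dia_mm, eps_eff=1.0):
    """Characteristic impedance of a parallel-wire line:
    Z0 = (120 / sqrt(eps_eff)) * acosh(D / d),
    D = center-to-center spacing, d = wire diameter."""
    return 120.0 / math.sqrt(eps_eff) * math.acosh(spacing_mm / wire_dia_mm)

# Illustrative numbers: 0.64 mm (#22) conductors, 7.5 mm spacing.
# An assumed effective permittivity of 1.6 for the polyethylene web
# puts the intact line near 300 ohms.
print(f"intact twin-lead: {twin_line_z0(7.5, 0.64, eps_eff=1.6):.0f} ohm")
# Same conductors stripped bare (air dielectric) and fanned out to 25 mm:
print(f"fanned-out leads: {twin_line_z0(25.0, 0.64):.0f} ohm")
```

Even an inch or two of 500-ohm line is a small fraction of a wavelength at FM, but at UHF-TV it is an appreciable transformer section, which is consistent with the effect showing up more strongly there.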
The other thing I've done is to couple two baluns with a screw-terminal mount, the kind of thing that vacuum-tube FM tuners used for their 300-ohm antenna input. I made a 300-ohm load for testing baluns using one of these terminals. I soldered a resistor very close to 300 ohms across the terminals using extremely short leads. It works fine at 88-108 MHz, but when I tried it a few days ago at UHF-TV frequencies I was surprised how much reactance it had, way too much to be useful as a 300-ohm load.
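The reactance of that 300-ohm load at UHF is consistent with plain lead inductance. Using the common rule of thumb of roughly 1 nH per millimetre of straight lead (an approximation, not a measured value for this load), even "extremely short" leads stop being short at UHF:

```python
import math

# Rough rule of thumb (assumed here): a straight component lead runs
# about 1 nH per millimetre.
NH_PER_MM = 1.0

def lead_reactance(total_lead_mm, freq_hz):
    """Series inductive reactance of the resistor leads, X = 2*pi*f*L."""
    inductance_h = total_lead_mm * NH_PER_MM * 1e-9
    return 2 * math.pi * freq_hz * inductance_h

for f in (100e6, 600e6):
    x = lead_reactance(10.0, f)  # 10 mm of total lead, a hypothetical figure
    print(f"{f/1e6:.0f} MHz: {x:.1f} ohm series reactance")
# 100 MHz: 6.3 ohm series reactance
# 600 MHz: 37.7 ohm series reactance
```

A few ohms against 300 is invisible at FM; tens of ohms at UHF is exactly the kind of reactance that makes the load useless there, and it's why chip resistors, with essentially no leads, should behave much better.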
Anyway, balun loss is important because it is unrecoverable: it comes before any low-noise amplification. If it gets to be several dB you're going to notice its effect on weak signals. But I haven't found balun loss simple to measure.
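The "unrecoverable" point can be made precise with the Friis cascade formula: a matched passive loss at room temperature has a noise figure equal to its loss, so balun loss adds almost dB-for-dB to the system noise figure no matter how good the preamp is. The LNA and receiver numbers below are hypothetical, just to show the scaling:

```python
import math

def db_to_lin(db):
    return 10 ** (db / 10)

def system_nf(balun_loss_db, lna_nf_db, lna_gain_db, rx_nf_db):
    """Friis cascade noise figure for: passive balun -> LNA -> receiver.
    A matched room-temperature loss has NF equal to its loss and gain 1/loss."""
    f1 = db_to_lin(balun_loss_db)   # balun: noise factor = loss, gain = 1/loss
    f2 = db_to_lin(lna_nf_db)
    g2 = db_to_lin(lna_gain_db)
    f3 = db_to_lin(rx_nf_db)
    f = f1 + f1 * (f2 - 1) + f1 * (f3 - 1) / g2
    return 10 * math.log10(f)

# Hypothetical chain: 1 dB LNA noise figure, 20 dB LNA gain, 7 dB receiver NF.
for loss in (0.75, 3.0):
    print(f"balun loss {loss} dB -> system NF {system_nf(loss, 1.0, 20.0, 7.0):.2f} dB")
```

With these assumed numbers, going from a 0.75 dB balun to a 3 dB one degrades the system noise figure by about the same 2.25 dB, which is why the larger UHF loss figures are worth worrying about.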