Analog vs Digital is not SD vs HD
You need to be clear that there are two different topics you're interested in, and they're not equivalent.
Analog will generally look worse than digital. That's not a law of physics, it's just a general rule of thumb. With analog, you never quite know whether the colour or the sound is right, because there is no way to check unless you have a reference copy of the original signal. With digital, it is generally possible to detect any changes to the signal, simply because a digital signal requires a computer of sorts to process it anyway, and that processing can include "checksum" type data with very little extra overhead (compared to what it would take to achieve something similar for an analog signal).
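If you're computer-minded, here's a tiny sketch of that checksum idea in Python. Everything here (the payload, the function name) is made up for illustration; real broadcast systems use their own error-detection schemes, but the principle is the same:

```python
import zlib

# A toy digital "signal": some payload bytes plus an appended CRC-32 checksum.
payload = b"frame 0001: pixel data ..."
checksum = zlib.crc32(payload)

# The receiver recomputes the checksum and compares; any bit flip is detected.
def verify(data: bytes, expected_crc: int) -> bool:
    return zlib.crc32(data) == expected_crc

print(verify(payload, checksum))           # intact signal -> True
corrupted = b"frame 0001: pixel dAta ..."  # one flipped character
print(verify(corrupted, checksum))         # change detected -> False
```

That's all a checksum is: a few extra bytes that let the receiver prove the signal arrived unchanged, something you simply can't do with a bare analog waveform.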
Additionally, most digital satellite systems send redundant copies of the data for error correction, because there is room to do so. (A digital signal can be compressed, with or without signal quality degradation, which frees up room for extra redundant data, used for forward error correction.)
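Here's a toy sketch of forward error correction, again in Python: a simple 3x repetition code. Real systems use far stronger codes, but the principle is the same: redundant data lets the receiver repair errors without asking for a resend.

```python
# Toy forward error correction: a 3x repetition code. Each bit is sent three
# times, and the receiver takes a majority vote over each group of three.

def encode(bits):
    return [b for b in bits for _ in range(3)]  # send every bit 3 times

def decode(coded):
    # Majority vote over each group of 3 received bits.
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

message = [1, 0, 1, 1, 0]
sent = encode(message)
sent[4] ^= 1          # the channel flips one bit in transit
print(decode(sent))   # majority vote repairs it -> [1, 0, 1, 1, 0]
```

The cost is obvious (three times the data for the same message), which is exactly why compression matters: squeezing the picture makes room for this kind of redundancy.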
So the first thing you need to know is that you generally WANT a digital signal (let's not discuss the value of LP records as compared to compact discs, shall we! ;-)
What you don't want is a lossy digital signal... or one that has been processed more than once (where "processed" means analog to digital, then digital back to analog). If a signal is compressed one way by one technology, then reconstituted back to analog and re-compressed in a different way by a different technology (in full truth, even by the same technology again is almost as bad), it will generally look like "mush". Way worse than a poor quality analog signal. A full explanation of why this happens would be lengthy, but suffice to say that it's a bad, bad thing, and it happens more often than it should... and here's why:
The originator of a signal is some content provider; let's pick on the theoretical "cucumber cooking channel" (CCC). CCC hires actors and producers and all sorts of technicians, and makes some choices. (I'm gonna make up brands so as to not pick on any real company's equipment.) They decide they are going to buy Mexxa HD video cameras, and Turrox video post-processing and storage systems (basically digital HD arrays that work like VCRs did for analog signals). The Mexxa camera could output a pure HD signal at 1920 by 1080 pixels, 30 frames per second, which is a data rate of about 10 GB (gigabytes, not megabytes) a minute. (Check the math: 1920 times 1080 is about 2M pixels per frame, 2M pixels times 24 bits (3 bytes) per pixel is about 6 MB per frame, 30 frames per second is about 180 MB per second, and 180 MB times 60 seconds is roughly 11 GB; call it 10.)
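Here's that back-of-envelope arithmetic as a small Python script, so you can check it yourself (it assumes 24-bit colour at 30 frames per second, uncompressed, as above):

```python
# Back-of-envelope data rate for uncompressed 1080p video (24-bit colour, 30 fps).
width, height = 1920, 1080
bytes_per_pixel = 24 // 8   # 24-bit colour = 3 bytes per pixel
fps = 30

bytes_per_frame = width * height * bytes_per_pixel   # ~6.2 MB per frame
bytes_per_second = bytes_per_frame * fps             # ~187 MB per second
gb_per_minute = bytes_per_second * 60 / 1e9          # ~11 GB per minute

print(f"{gb_per_minute:.1f} GB per minute")
```

It works out to a touch over 11 GB per minute, close enough to the "about 10 GB" figure above for napkin purposes.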
Hopefully you have enough computer background to realize the HUGE amount of data that would be for a one hour show. There is no way that CCC can store and broadcast pure HD signals. (Yes, eventually this will hopefully be possible, but with today's limits and installed infrastructure, it would be unreasonable.) So therefore, that Mexxa camera comes with an option to compress the signal (in the form of an on- or off-board signal-compressing computer, usually implementing a standardized signal compression technology that is often referred to as a codec, which is short for compressor-decompressor).
Let's say that the Mexxa camera compresses using codec A, which is designed for converting real-time video to compressed real-time video, so it must be very, very fast. It then sends the compressed signal into the Turrox video processing suite, where it is decompressed using codec A, eventually chopped and edited into a one-hour TV show, and then compressed by codec B and stored. Now codec A, in an effort to be fast, is pretty crappy: it needs to work in real time, so it works by throwing away data and filling in what was thrown away with repeats of nearby pixels. However, in "blind taste tests" (as in the viewers were blind to the products/brands involved, but obviously able to see a TV screen :-) the viewers couldn't tell the difference between an uncompressed signal and codec A.
Codec B is better, because it's not real time. It has a super-computer and can take its sweet time to process the data over and over to get the compression just perfect. Codec B works by throwing away data about certain very similar colours in a localized area. Lucky for codec B, in a similar "blind taste test", viewers couldn't tell the difference between codec B and raw HD.
However, when viewers were asked to compare the raw HD to a signal that had been passed through codec A *and* codec B, they 100 percent preferred the raw HD signal, because they saw "weird glitchiness" in the doubly compressed signal.
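To see why two different lossy codecs compound, here's a deliberately crude Python toy: each "codec" just snaps pixel values to its own grid. The grids don't line up, so the second pass adds error on top of the first. (The values and step sizes are invented for illustration; real codecs are far more sophisticated, but the compounding effect is real.)

```python
# Toy generation loss: two lossy "codecs" that each quantize pixel values
# to a different grid. Either one alone loses a little; chained, they lose more.

def codec(values, step):
    # Lossy round-trip: snap each value to the nearest multiple of `step`.
    return [round(v / step) * step for v in values]

original = [17, 93, 140, 201, 250]          # made-up pixel brightness values
via_a = codec(original, 8)                  # codec A alone
via_ab = codec(codec(original, 8), 10)      # codec A, then codec B

def max_error(recon):
    return max(abs(a, ) if False else abs(a - b) for a, b in zip(original, recon))

print(max_error(via_a))    # worst-case error after one codec
print(max_error(via_ab))   # worse after two different codecs
```

Wait, scratch that stray expression; here it is cleanly: `max_error` should just be `max(abs(a - b) for a, b in zip(original, recon))`. Run it and the double-compressed version always comes out with a larger worst-case error than either codec alone, which is the "mush" you'd see on screen.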
And guess what: CCC hasn't even broadcast the signal yet. There is now the possibility that their broadcasting agent/company will compress the signal with codec C (or even A or B again, but let's keep count). Then it will be received (in advance of actual distribution to your home) by some local company, such as Rogers, Bell, Cogeco, Shaw or Videotron, and that company will store it in their equipment, now using codec D.
And now it's time for you to watch the grand super spectacular first-weekend showing of the CCC premiere event... and your cable TV company will broadcast it to your digital box, but unfortunately for you... they choose not to offer CCC to Canadians in HD, so they are going to convert the HD signal to an SD signal using, you guessed it, codec E. Then their digital broadcast equipment will compress the SD signal using codec F.
By the time the digital SD signal shows up on your poor screen, you're not gonna like the results. It's going to look pretty bad, and that's before there is any signal loss on the way to your house. If signal loss occurs, then codec G will jump in to try to fix the signal as best it can.
This has been a gross oversimplification of a very complex scenario, but it shows the extent of the problem and explains why a signal can be digital, but still look like crap.
Hopefully, the content manufacturing and distribution companies care enough about quality that they minimize the codec manipulation of the signal, so that you get the best quality possible within the realm of reasonable cost. However, if you want a better quality signal, you'd better buy your content on disc, where you get a higher data rate and there is less chance of redundant manipulation. When mastering a Blu-ray disc, there are probably only one or two codecs involved, because multiple broadcast storage and delivery systems aren't involved....
Now, the other part of the issue is HD versus SD. It's possible to have analog SD, but I don't think anyone ever seriously considered doing analog HD, simply because it would waste WAY too much broadcast bandwidth. Going back to numbers: an uncompressed digital SD channel is roughly 720 by 480, and has less colour depth too (probably 16 bit rather than 24 bit), so it's about 1 GB per minute, compared to HD's 10 GB per minute... roughly a factor of 10. Analog SD channels, in most markets, have room for about 30 channels, give or take, so that same bandwidth would carry about 3 channels of HD. (Wowie... and you thought nothing was on when there were 30 channels.)
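The SD-versus-HD arithmetic, as a quick Python check (same assumptions as before: 30 frames per second, uncompressed):

```python
# Compare uncompressed data rates: SD (720x480, 16-bit) vs HD (1920x1080, 24-bit).
def gb_per_minute(width, height, bits_per_pixel, fps=30):
    bytes_per_frame = width * height * bits_per_pixel // 8
    return bytes_per_frame * fps * 60 / 1e9

sd = gb_per_minute(720, 480, 16)     # ~1.2 GB/min
hd = gb_per_minute(1920, 1080, 24)   # ~11.2 GB/min

print(f"SD: {sd:.1f} GB/min, HD: {hd:.1f} GB/min, ratio: {hd/sd:.0f}x")
```

It comes out to about a 9x ratio, close enough to the rough factor of 10 above.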
So in summary:
People confuse digital and HD because nobody ever seriously considered offering HD in analog, but the proper terms of your decision are:
analog vs digital
SD vs ED vs HD
If you want the best quality current technology can provide, you want a properly managed digital signal in HD resolution.
In Canada, by 2011, the first choice is gone... you won't be able to get an analog signal any more. It's already hard to get one today. But you will still be able to get a digital box that will "down convert" to analog outputs, in case you have an old TV you want to use that only supports analog inputs. (BTW, all inputs to your TV are analog, except HDMI and DVI. That means that if you use coax or RCA plugs or cables, you're feeding an analog signal from the box to the TV. In practice, that means: if the connector screws on (coax) or has a colour code on its connector (RCA-type connections), it's analog. If it has multiple pins per connection, it's probably digital.)
I know this was a super long post; I hope it's clear enough and useful enough to make your TV viewing better. BTW, if you want to see one of the best HD signals I've ever seen, go rent or buy a copy of the BBC's Blue Planet on Blu-ray disc and watch it on a TV that can do 1080p over an HDMI cable. If it doesn't blow your eyes outta their sockets, then your expectations are too high! ;-)