True hi-def (Blu-ray, for example) is 1920x1080 at 30 progressive frames per second. (US broadcast hi-def is less than that: either 720 lines progressive or 1080 lines interlaced, not full 1080p.)

If shot using a professional codec, such as Sony's HDCAM codec or the Panasonic P2 DVCPro HD codec, the file sizes are enormous: DVCPro HD, for instance, streams at 100 Mbps and takes up something like 1 GB per minute of hi-def video. So nobody is actually streaming true hi-def. What is probably reasonably true hi-def is 1920x1080 video at 30 progressive fps, encoded at more than 5 Mbps.
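
If you want to sanity-check those numbers, it's just bits-to-bytes arithmetic. Here's a rough Python sketch (assuming decimal gigabytes, i.e. 1 GB = 1,000,000,000 bytes, and ignoring audio and container overhead):

# Back-of-the-envelope: how many gigabytes per minute a stream eats.
def gb_per_minute(mbps):
    bytes_per_sec = mbps * 1_000_000 / 8   # megabits/sec -> bytes/sec
    return bytes_per_sec * 60 / 1_000_000_000

print(gb_per_minute(100))  # ~0.75 GB/min for a 100 Mbps pro HD codec
print(gb_per_minute(5))    # ~0.04 GB/min for a 5 Mbps "true hi-def" web stream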

By comparison, standard-def DV video is 720x480 at 30 fps, and you can fit about 5 minutes of video in 1 gig of space; the stream rate is about 25 Mbps. You can make a fine-looking standard-def stream at about 1.5 or 2 Mbps, which is easily small enough to deliver over broadband internet.
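
Same arithmetic, worked the other way around (minutes per gigabyte), again a rough sketch using just the 25 Mbps video rate quoted above; the full DV stream with audio is a bit higher, so real capacity is slightly less:

# Minutes of DV video that fit in 1 GB at the stated 25 Mbps stream rate.
def minutes_per_gb(mbps, gb=1):
    bytes_per_min = mbps * 1_000_000 / 8 * 60
    return gb * 1_000_000_000 / bytes_per_min

print(minutes_per_gb(25))  # ~5.3 minutes per gigabyte, i.e. "about 5 minutes per gig"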

HDV, the consumer version of HD, is also limited to 25 Mbps because that's the fastest the MiniDV-style tape transport can write, so it's very heavily compressed and a lot of detail is lost in the compression. It's better than standard def, but it ain't true HD.
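
To put "very heavily compressed" in perspective, here's a rough sketch comparing raw 1080-line picture data against HDV's 25 Mbps pipe. It assumes 8-bit 4:2:0 chroma sampling (12 bits per pixel) and uses 1440x1080, which is what HDV 1080i actually records; the exact ratio varies with those assumptions:

# Approximate compression HDV needs to squeeze HD onto a 25 Mbps tape stream.
def uncompressed_mbps(width, height, fps, bits_per_pixel=12):
    # 12 bits/pixel corresponds to 8-bit 4:2:0 sampling
    return width * height * fps * bits_per_pixel / 1_000_000

raw = uncompressed_mbps(1440, 1080, 30)
print(round(raw))       # ~560 Mbps of raw picture data
print(round(raw / 25))  # roughly 22:1 compression to fit the 25 Mbps tape limit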

Sorry the above is so technical, but there isn't any simple way to describe hi-def when there are so many different codecs, bitrates, frame sizes, and frame rates in play.

Let me know if that still isn't clear.