"High definition" typically means (in the TV space) 1920x1080 or 1280x720. Plenty of other services and storage devices other than Blu-ray support those resolutions.four said:Blu-ray is not good high definition. Blu-ray is high definition.
four said:
> 2-pass encoding? Never heard they know what that means, doubt it very much.

Netflix content is pre-encoded at each supported codec and bitrate; practically all streaming video services work this way. If you think Netflix uses real-time encoding, well, you're nuts. Seriously, do you not comprehend how expensive that would be in server CPU time? Even suggesting real-time encoding tells me how much you know about the topic. Sure, Netflix can switch bitrates mid-stream, but each bitrate is encoded separately, ahead of time.
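The "pre-encoded at each bitrate" model above can be sketched as a client picking from a fixed ladder of separately encoded renditions. This is a minimal illustration, not Netflix's actual logic; the ladder values and function name are hypothetical:

```python
# Hypothetical sketch of adaptive-bitrate selection from a pre-encoded ladder.
# Every rendition was encoded ahead of time; the client only picks one.
RENDITIONS_KBPS = [235, 560, 1050, 2350, 5800]  # example ladder, not a real one

def pick_rendition(available_kbps, ladder=RENDITIONS_KBPS):
    """Return the highest pre-encoded bitrate that fits the measured bandwidth."""
    fitting = [r for r in ladder if r <= available_kbps]
    # If even the lowest rendition exceeds the bandwidth, fall back to it anyway.
    return max(fitting) if fitting else min(ladder)

print(pick_rendition(3000))  # -> 2350: highest rendition under 3000 kbps
print(pick_rendition(100))   # -> 235: below the ladder, lowest rendition
```

Switching bitrates is then just requesting segments from a different pre-encoded stream; no encoding happens at request time.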
IIRC, they use real-time encoders, adjusting to the bandwidth available.
Just like they did here with a test H.265 stream
four said:
> Encoding H.264 is very cheap.

Nope, encoding video is expensive. Even a 4-core x86 server couldn't support more than, what, 8 users if it were doing real-time encoding of HD video streams. At a maximum of 5 Mbps per stream, it would only be able to send about 40 Mbps of stream data, tops.
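The back-of-the-envelope numbers in the post above check out: 8 concurrent real-time encodes at the stated 5 Mbps cap give the quoted ~40 Mbps ceiling.

```python
# Arithmetic from the post: a 4-core server doing real-time HD encoding.
streams = 8           # concurrent real-time encodes the post estimates
mbps_per_stream = 5   # stated maximum bitrate per stream
total_mbps = streams * mbps_per_stream
print(total_mbps)     # 40 Mbps, matching the "tops" figure in the post
```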
How can you believe anybody who does not even know that...
four said:
> I had enough of your rhetoric for today...

Well, at least you've demonstrated that you know as much about video encoding as you do about TRIM.
ExDilbert said:
> Encoding and decoding H.264 is typically done with dedicated hardware, not a general purpose CPU

That used to be true. The industry started using dedicated H.264 hardware because general-purpose CPUs, combined with the software encoders available at the time, weren't fast enough for real-time encoding. Since then, the software has improved and general-purpose hardware has gotten faster.