On 10 bit h264

A while ago (a year or two), x264 added support for encoding 10 bit h264. Everyone* was talking about it like it was the second coming of Christ. “We can finally leave behind this 8 bit bullshit!”** Initially, I was all “Yaaaay! 10 bit! Yaaaay!” Various encoders announced they would start using it as soon as CCCP added support. I joined in.

But then something funny happened: my enthusiasm died down, and I was left with some facts–the pros and cons.
Pros:
– same quality for slightly fewer bits
– less chance for the encoder (x264) to introduce banding

Cons:
– slower to encode
– slower to decode

Then I realised that I never actually saw x264 introduce banding (maybe it’s a problem at lower bitrates than I use?). Additionally, the reduction in size is insignificant. On the other hand, encoding time matters to me, and the slowdown is far from insignificant.
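
For the curious, the banding claim is mostly arithmetic. Here is a toy Python/numpy sketch (nothing to do with x264 internals; the ramp values are made up) that just counts how many distinct code values a slow, dark gradient keeps at each bit depth:

```python
# Toy sketch: how many distinct code values a slow, dark gradient
# survives quantization with at 8 vs 10 bits. Pure numpy, not x264.
import numpy as np

# A luma ramp covering 5% of full range across a 1920-pixel row --
# the kind of dark-sky gradient that bands easily.
ramp = np.linspace(0.10, 0.15, 1920)

for bits in (8, 10):
    levels = (1 << bits) - 1
    steps = len(np.unique(np.round(ramp * levels)))
    print(f"{bits}-bit: {steps} distinct steps across the ramp")
# 8-bit: ~13 wide, visible bands; 10-bit: ~52, each a quarter as wide.
```

Four times the code values means each band is a quarter as wide, which is the entire story behind the second pro above.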

The conclusion: I have no reason to use 10 bit h264.

Flames welcome.

~~~~~~~~~~~~~~~~~
* By “everyone” I mean mostly encoders. People on the receiving end were less enthusiastic about it.
** It may not have been said by anyone in those words precisely.

~ by dubhater on May 11, 2013.

13 Responses to “On 10 bit h264”

  1. Well… “If it even is the same quality for fewer bits, Christopher Robin,” said Eeyore.

    For cel anime stuff, from my testing I could never reliably tell that it really improves quality. It’s quite possible that there are encodes in which it actually wastes bits.

    I don’t bother to test it further and just *hope* it helps some. I guess most people are the same and don’t actually dare to test it. /Well, maybe except in those cases where it prevents banding; there the difference might be tangible, but I never dealt with such a case, so dunno./

  2. 1. The reduction in banding is only something you can really take advantage of after high-bitdepth filtering (see the sketch after point 4 below). Re-encoding a source with 10-bit x264 will not magically de-band your video. It’s probably true that at very low bitrates, 10-bit has an advantage here as well. I’ve never tested that either.

    2. I don’t see it as 10-20% bitrate savings, but as encodes that are that much more transparent at realistic sizes. There really is a noticeable improvement, especially in preserving gradients with fewer bits.

    3. What you didn’t mention, probably because you encode a lot of noisy DVDs, is that 10-bit x264 eliminates many of the blocking issues with its 8-bit counterpart. This is due both to the aforementioned compression advantages with regards to gradients and the overall greater accuracy of 10-bit encoding.

    4. With regards to encoding speed: unless you encode a lot of video (and I mean a lot) without ANY filtering at all, the speed difference shouldn’t matter much to you. If you want speed improvements, pray for VapourSynth to be usable sooner. As for decoding, who cares? Anyone still using a Pentium shouldn’t be complaining about not being able to play high-quality encodes in 2013. I think people should encode for themselves first. Surely you can play your own encodes, right?
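
    To illustrate point 1, here’s a toy numpy sketch of the filter-then-encode hand-off (the blur is only a stand-in for a real high-bitdepth filter chain, and all the numbers are made up):

```python
# Filter in float precision, then compare how much of the result
# survives rounding to 8 vs 10 bits before the encoder sees it.
import numpy as np

rng = np.random.default_rng(0)
row = np.clip(np.linspace(0.2, 0.3, 1280) + rng.normal(0, 0.004, 1280), 0, 1)

kernel = np.ones(9) / 9                      # stand-in "debanding" smooth
filtered = np.convolve(row, kernel, mode="same")

for bits in (8, 10):
    levels = (1 << bits) - 1
    rounded = np.round(filtered * levels) / levels
    err = np.abs(rounded - filtered).max()
    print(f"{bits}-bit hand-off: max rounding error {err:.6f}")
# The 10-bit hand-off keeps ~4x more of the filter's precision, so the
# smoothed gradient isn't immediately re-quantized back into bands.
```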

  3. With a Pentium (let’s assume one with MMX, else this is utterly bollocks) you wouldn’t even be watching XVID at 480p. You need to look at realistic scenarios if you want to discuss this seriously 😀

    • The point of my “scenario” is that it really shouldn’t matter to encoders who can play their encodes and who can’t. While it’s an exaggeration, you have to go back many years, or to a netbook, to find a CPU that can’t play most 10-bit material. I’ve found that the majority of people who complain about 10-bit playback are actually having trouble with extravagant softsubs on out-of-date codec installations.

      • Laptops that aren’t so old can’t do 10-bit 1080p. Just a few years ago, Core 2-type duals were all too common, with clocks around 2 GHz or even less.

      • The laptop I used six years ago had a Core 2 Duo that ran at 2.4 GHz; it plays most 10-bit content just fine. Sure, it can’t handle really high-bitrate 1080p content, but the screen is 1366×768, so you’d be silly to even try. I think your idea of “just a few years” is pretty off the mark in terms of computer hardware development, though.

  4. No flames here. I agree with you almost 100%. From an average leecher’s point of view, 10-bit is just elitist bragging material. Not that 10-bit is bad or good; it’s just not the “end all, be all”. A lot of people swear by 10-bit just because that’s what they were told to do and they don’t want to look retarded. Instead of using their own eyes and judgement, they just regurgitate what they’ve been told. I have personally never seen the great advantages, from a viewer’s point of view, that were evangelized in the beginning.

    • At least not enough to give up hardware compatibility on HD content.

      • Hardware compatibility (devices, etc.) and fansubs stopped being a thing when fansubbers stopped hardcoding subtitles. If you’re talking about hardware acceleration, see the above comment. I’m sorry if your CPU really can’t play any 10-bit content, but it’s 2013; formats change, and hardware has to keep up eventually. There are always re-encoding groups / streaming sites if you want to trade quality for compatibility.

  5. Ever heard of HTPCs or media boxes? There are plenty of those out there which rely on hardware acceleration, not to mention that the majority of PCs out there aren’t capable of playing Full HD 10-bit. Most people have different financial priorities to deal with than buying a top-of-the-line PC every 3-5 years. IF, and I mean IF, most of the fansubbing groups (not all, of course, because some already do the following) really cared about the fans instead of their over-bloated e-peni, they would provide both 8-bit and 10-bit.
    As I stated, I don’t think 10-bit is good or bad per se; it’s just not the miracle it was touted to be.
    As for my PCs’ ability to play Full HD 10-bit, one of them actually can, but that’s beside the point. I’m just a single individual, not the silent majority.

    • A friend of mine just built an HTPC for ~$400. I use my four-year-old laptop. The argument that you need a “top-of-the-line machine” to run fansubs is ludicrous, and it’s usually made by people with 7+-year-old Athlons who’ve been lucky enough to get by this far by relying on hardware decoding.

      As an encoder, I take offence at the notion that fansubbers “don’t care about their fans” when they don’t cater to out-of-date machines. We do this for FREE, so that you can enjoy anime, for FREE.

      Lucky for you, HorribleSubs’ rips should run on just about anything, and they look better than 90% of fansubbers’ encodes. That’s what I watch, anyway.

  6. I can remember when I encoded Evangelion 1.11 and it got a ton of broken gradients because of rounding errors. I haven’t tried to encode that one again, but it’s exactly the kind of problem I think would benefit immensely from the increased precision.

    Right around the time 10-bit was introduced, I encoded Blue Literature and had a lot of trouble getting the very blue scenes in Episode 4(?) not to band up. I tried static/dynamic noise, zones with b=5, and whatnot; nothing really worked or looked good. When I tried 10-bit encoding, it actually helped a great deal. But yeah, overall the beneficial cases are fringe matter.

    I guess what I’m trying to say is that 10-bit has its benefits even in the visual department (try to encode and compare a REALLY [think submarine lighting] red, green, or blue scene with smooth gradients), but it’s not like anyone can force you to use it. Just be ready to take the blame when someone spots the blocky midnight-sky scene.
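
    To put rough numbers on the single-colour case (a toy numpy sketch assuming BT.709 luma weights, not an actual encode comparison):

```python
# A pure-blue ramp only moves luma by its BT.709 weight (0.0722),
# so very few luma codes exist for a whole screen-wide gradient.
import numpy as np

blue = np.linspace(0.0, 1.0, 1920)   # full-swing pure-blue ramp
luma = 0.0722 * blue                 # BT.709: Y = 0.2126 R + 0.7152 G + 0.0722 B

for bits in (8, 10):
    levels = (1 << bits) - 1
    codes = len(np.unique(np.round(luma * levels)))
    print(f"{bits}-bit luma: {codes} codes for the entire ramp")
# ~19 codes at 8 bits vs ~75 at 10; chroma carries the rest, but luma
# is where the banding shows.
```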

  7. Bottom line, for me at least: it would have been fine if the majority of groups had been upfront in saying it makes things easier for them, instead of trying to make it sound like they were doing the fans such a huge favor. Again, some (not all) encoders/groups kept touting great savings in space and a huge leap in quality, which I have not seen. What I have noticed is that not as many filters are needed (for the most part) to achieve reasonable quality. I would surmise that it is much easier for encoders of all calibers to achieve adequate results without having to be an AviSynth maestro.

    It just tends to irk me that hardware compatibility (in most instances) had to be sacrificed so that otherwise average encoders could feel like such leet encoders (à la e-penis).

    As far as your average HTPC being able to handle Full HD, that’s bollocks. Most HTPCs are meant to be very low-heat, power-efficient, and space-saving, with an almost nonexistent acoustic profile. Low heat, for the most part, means fanless heatsinks, which means CPUs with a very, very low thermal footprint. Of course, having low temperatures also means having very low power consumption.

    Now, there are some people out there who don’t mind a lot of noise and a power-hungry, high-heat HTPC, but that kind of defeats a lot of the pros of having an HTPC.
