Originally Posted by artem
Where did they find the 10bit source material?
You don't need a 10bit source for this to be useful. The output will be 8bit anyway; it's about what the encoder does internally.
The plan is to later also add a dithering filter into the encoding chain automatically. Dithering improves image quality, but at 8bit precision it comes with a significant increase in required bitrate; with 10bit internal precision you get the better picture without needing the higher bitrate.
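Not the actual filter in question, just a minimal sketch of the idea: when you reduce 10bit values to 8bit, plain rounding collapses several 10bit levels into one 8bit value (banding), while adding a little noise before quantizing spreads them out. The function name and sample values here are purely illustrative.

```python
import random

def quantize_10_to_8(samples_10bit, dither=True, seed=0):
    """Illustrative reduction of 10-bit samples (0..1023) to 8-bit (0..255).

    Without dither, values are simply rounded, which creates visible
    banding in smooth gradients. With dither, small random noise is
    added before rounding so the average level is preserved.
    """
    rng = random.Random(seed)
    out = []
    for s in samples_10bit:
        x = s / 4.0  # 10-bit -> 8-bit scale
        if dither:
            x += rng.uniform(-0.5, 0.5)  # simple rectangular dither
        out.append(min(255, max(0, round(x))))
    return out

# A smooth 10-bit ramp: rounding maps runs of four 10-bit levels onto
# the same 8-bit value; dithering breaks those runs up.
ramp = list(range(512, 524))
print(quantize_10_to_8(ramp, dither=False))
print(quantize_10_to_8(ramp, dither=True))
```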
@johnc: Do report back, but somehow I have a feeling it won't work and will require new hardware. Which is a shame in a way, but anime fansubs aren't HD anyway, are they? So any remotely modern processor shouldn't have a problem decoding them. And it seems they're doing this for smaller filesizes: instead of increasing quality, they're going for the same quality at a lower bitrate.