>(Original Blu-rays are 8-bit only and most people don't even have 10-bit displays, so why fret? It's not like there is any degradation in quality)
Stopped reading there
Wait, people only have 8-bit displays and sources are in 8-bit? Oh my god! I can't believe no one realized this before! We've been using 10-bit for no reason all these years!
Reason number 1 is the dumbest thing possible. You encode in 10-bit for better quality REGARDLESS of what depth your monitor is. Come on, it's 2019, this is common knowledge.
10-bit vs 8-bit is bullshit unless you're using a CRT display (me, I'm boss) or OLED.
No LCD panel displays 10-bit color, even if it claims something along those lines.
Just because you're literally blind and can't tell the difference in how well 10-bit preserves gradients in anime compared to 8-bit doesn't mean the rest of us can't.
I do like having at least a text file in the folder that says [GroupName] Anime Title - E.no. (Quality) and lists the CRC per file in it; that is something nice to have. But with the advent of set-top boxes and phones, long file names don't work too well sometimes.
@nImEHuntetD I don't know if you have checked the subs or not, but they are monstrous and aligned very high; the outline and shadow were also very heavy https://imgur.com/a/RI65Zv6. Anyway, I edited them. Here, if someone wants my edited subs: https://files.catbox.moe/xhilyp.rar
Thanks for the encode
Don't know why everyone is so rude; it's not like you guys pay the uploader anything... yes, the quality might suffer, but it's free stuff. You can tell uploaders nicely, or make your own video.
Btw @nImEHuntetD, do you have any idea which RAWs work well with your subs? Just in case my display won't show 8-bit video properly.
I see you're using an old, out-of-date FLAC version. May wanna update that to [FLAC 1.3.2](https://www.videohelp.com/software/FLAC-Encoder).
The 10-bit encoder gives you a wider range of values per channel to work with than the 8-bit encoder does.
8-bit depth = 256 levels per RGB/YUV channel, as opposed to
10-bit depth = 1024 levels per channel.
A certain scene may not have enough info, so the color has to stretch to approximate, which is more likely to introduce banding, whether the lack of detail comes from the 8-bit source, a filter, or the encoder settings the user chose. 10-bit helps this out and gives a smoother color gradient (hopefully), since the extra information means it doesn't have to approximate as much; unless the banding is really bad even then and you have to use a deband filter, or the user used low encoder settings. More bit depth in video is better, but it costs more processing power for PCs/devices, and then there are compatibility issues.
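Since numbers help here, a minimal sketch (my own toy example, nothing from this release) of what 256 vs. 1024 levels per channel means for a subtle dark gradient:

```python
import numpy as np

# Quantize the same subtle dark ramp at 8-bit and 10-bit depth and count
# how many distinct steps survive; fewer steps = coarser, more visible bands.
width = 1920
gradient = np.linspace(0.10, 0.20, width)  # narrow dark-grey ramp on a 0.0-1.0 scale

q8 = np.round(gradient * 255) / 255     # 8-bit: 256 levels per channel
q10 = np.round(gradient * 1023) / 1023  # 10-bit: 1024 levels per channel

print("distinct 8-bit steps: ", len(np.unique(q8)))   # ~26 coarse bands
print("distinct 10-bit steps:", len(np.unique(q10)))  # ~104 much finer steps
```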
In regards to the monitor: you're talking about how many colors the monitor can display, i.e. being able to see the billion-color gradient differences, not about the final result of encoding an 8-bit source with a 10-bit encoder and seeing the differences there; there are many images and threads about this online you can check out. I can see why you'd reason from the BDMV source being 8-bit, but that's where you're not entirely correct: the issue is the benefit of the 10-bit encoder over the 8-bit one, the extra precision it gets out of the source to decrease banding and give a smoother gradient, even on an 8-bit monitor.
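To make that concrete, here's a rough toy illustration (hypothetical filter and values, not the uploader's actual pipeline): a deband-style smoothing pass creates in-between values, and a 10-bit intermediate keeps them where an 8-bit one snaps right back to the original bands, even though the source itself is 8-bit:

```python
import numpy as np

# Start from a banded 8-bit source: ten flat steps of a dark ramp.
banded = np.repeat(np.arange(40, 50), 64).astype(np.float64) / 255

# Smooth it, the way a deband filter would, producing in-between values.
kernel = np.ones(33) / 33
smoothed = np.convolve(banded, kernel, mode="valid")

# Store the filtered result at 8-bit vs 10-bit precision.
stored8 = np.round(smoothed * 255) / 255     # rounds back to roughly the same 10 bands
stored10 = np.round(smoothed * 1023) / 1023  # keeps the finer intermediate steps

print("levels after 8-bit store: ", len(np.unique(stored8)))
print("levels after 10-bit store:", len(np.unique(stored10)))
```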
@Jabo I hope you're messing around, just in case. Watch this https://www.youtube.com/watch?v=1La4QzGeaaQ in 480p, then at a higher res, and then say that again. Otherwise, when was the last time you saw your eye doctor?
@SomaHeir "Raws are from JPBD, and base sub is from Amazon which are translation checked by a friend of mine and all the styling, typesetting and timing is done by me."
Wow, so much salt for just an "8-bit encode rather than 10-bit". People need to calm down, realize this stuff is free, and then get a life. If you're so upset about it, just use some other raw or do your own encode/release.
Anyways, thanks for all the hard work; styled subs with OP/ED translations are always welcome. I haven't watched it with these subs yet, but Amazon translations suck ass, so hopefully your translation check has fixed that!
nImEHuntetD well, as noZA said, 10-bit takes extra computing power, so not every device can play it. As for me, I go for 8-bit videos because I can't play 10-bit videos on my TV unless it's x265. I'm taking that as your reason for doing 8-bit.
Some guys just want everything to be perfect, and I am not one of them. If you keep trying to find errors in a video, when will you enjoy the series? Encoding scene by scene takes hours and hours. Those who want quality can wait for Beatrice raws' bloated version or do it themselves.
I am watching your encode on my TV and it looks good. So, everyone, this is a good release.
@nImEHuntetD Are your releases targeted at TV-watching compatibility? Is that why you chose that format? If so, I get it. Have you tried 8-bit HEVC? Can you watch that on a TV? That might be a better format to try out, especially since I also see you've got 45 threads, which is more than enough for that newer encoder; it may take a little longer but may look better too, so long as you don't use the "umh" setting, which is probably still broken in HEVC.
Also, if you chose 10-bit you could also bump it up to the 4:4:4 profile to stay closer to the source's color (do a comparison between 4:2:0 and 4:4:4), but of course that too takes more processing power. I dunno if there are compatibility issues with that chroma subsampling on devices.
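For what it's worth, a quick back-of-the-envelope sketch (illustrative only) of why 4:4:4 costs more: 4:2:0 keeps one chroma sample per 2x2 block, so a frame carries half the samples of 4:4:4:

```python
import numpy as np

# Per-frame sample counts for a 1080p frame under each chroma layout.
h, w = 1080, 1920
luma = np.zeros((h, w), dtype=np.uint16)    # Y plane, always full resolution

cb_444 = np.zeros((h, w), dtype=np.uint16)  # 4:4:4: full-res Cb (and Cr) planes
# 4:2:0: average each 2x2 chroma block down to a single sample.
cb_420 = cb_444.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

print("4:4:4 samples/frame:", luma.size + 2 * cb_444.size)  # 3.0x the luma plane
print("4:2:0 samples/frame:", luma.size + 2 * cb_420.size)  # 1.5x the luma plane
```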
@gsk_ Can you watch HEVC on that? Namely in 8-bit? What about 10-bit HEVC? I'm just curious, as I've always wondered if getting a big-ass TV for my PC would be better than a monitor for watching stuff and light gaming. I also don't expect people to encode scene by scene, since it does take a long time; but you don't have to encode the whole thing that way, only the scene that has the problem. Even so-called good encoders' work isn't perfect either, and I don't expect it to be in a lossy format.
The issue is the uploader and his/her ignorant statement, not understanding the benefits of 10-bit. There more than likely wouldn't have been so many comments if not for that.
@noZA_ **Yes, HEVC works. Every codec works, except for 10-bit codecs (for the TV). :P** and **I do know all that, but the thing is, I did make a different encode in 10-bit and I did not find any difference. Albeit the monitor I used was 8-bit. But, still. :P Hence, I went with 8-bit, since it was also more compatible with _most/all_ devices.**
@noZA No, not all HEVC is supported. It depends on the brand of TV you use. You have to check your manufacturer's site to see which one is supported, or to be more specific, what Main profile/level the TV can support. But 95% of today's TVs support HEVC/x265 10-bit; x264 10-bit, though, is not supported. This is assuming you run it on the H/W decoder; if your TV has a powerful processor, you may watch x264 10-bit as well using a S/W decoder.
For TV vs. monitor, obviously a TV is good for videos. For gaming, I haven't tried it myself with 60 fps games; I doubt my TV can handle it.
Ahh, thanks for the upload. The subs were kinda bold and wrongly positioned, but I'm happy they were 8-bit, else my 10-year-old phone might have cussed the shit out of me.