I currently have Plex and Radarr. I only ever got 1080p content because that's all my TV could do. I'm finally getting a nice new TV and would like to get a few 4K HDR movies to watch. Should I be looking for Bluray-2160, Remux, or disc, and what will Plex play?
All you need to know can be found on the excellent https://trash-guides.info/
As a general rule of thumb, stay away from 4K HDR movies that are below 15 GB. Files that small have a low average bitrate and will look almost the same as 1080p content. If the file is below 15 GB, don't touch it.
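A quick sanity check on that rule of thumb: file size and runtime together determine the average bitrate. A rough calculation (the 15 GB / 2-hour figures below are just example numbers, not a standard):

```python
def avg_bitrate_mbps(size_gb: float, runtime_min: float) -> float:
    """Average bitrate in Mbit/s for a file of size_gb gigabytes
    played over runtime_min minutes (using 1 GB = 8000 Mbit)."""
    return size_gb * 8000 / (runtime_min * 60)

# A 15 GB file with a 2-hour runtime averages roughly 16.7 Mbit/s,
# which is on the low side for 2160p HDR material.
print(round(avg_bitrate_mbps(15, 120), 1))
```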
Have you seen the PSA x265 4K Versions? I find them relatively good. But what is your opinion?
Bluray2160 and use mediainfo to make sure the codecs being used are AVC and AAC in an MP4 container.
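For anyone who wants to script that mediainfo check instead of reading the GUI output: a minimal sketch, assuming the `mediainfo` CLI is installed and supports `--Output=JSON` (the track layout assumed below matches that JSON output; adjust if your version differs):

```python
import json
import subprocess

def parse_formats(mediainfo_json: str) -> dict:
    """Map each track type (General/Video/Audio) to its Format string."""
    data = json.loads(mediainfo_json)
    return {t["@type"]: t["Format"]
            for t in data["media"]["track"] if "Format" in t}

def stream_formats(path: str) -> dict:
    """Run the mediainfo CLI on a file and return its container/codecs."""
    out = subprocess.run(["mediainfo", "--Output=JSON", path],
                         capture_output=True, text=True, check=True).stdout
    return parse_formats(out)
```

For example, `stream_formats("movie.mkv")` (hypothetical file) would tell you at a glance whether you're looking at HEVC in Matroska or AVC in MP4.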
If they have a 4K HDR capable TV, it can play back HEVC. Does anyone even make x264 2160p releases? And the container doesn't matter either; Plex will remux it for you if necessary.
Would it make sense to set up a second Radarr instance just for 4K stuff? Or can I have Radarr download 4K to a specific folder?
It depends on what you want. If you plan on keeping two different libraries in Plex I would also keep two Radarr instances, but if you are going for a combined library you could stick to one and just use a different quality profile for the movies you want to be in 4K.
Just the normal 2160p is probably what you want; remuxes get very large for marginal benefit. You might have trouble if you're using the built-in smart TV app for Plex, but the Plex app on Chromecast will keep up fine.
As far as I remember, Remux is the only way to get true 10-bit HDR…
When compressed, HDR is either removed or no longer really works - at least that was the consensus a few years ago.
There are quite a lot of H.265 HDR rips available now, particularly for newer series released on Netflix etc. They definitely support full 10-bit HDR and look good to my eyes.
Just for your information, your router/network must be robust enough. I was using the router provided by my ISP, and the devices streaming from my NAS kept being dropped when streaming 4K (~15 GB files). I struggled because I thought it was a problem with the devices, but it was in fact the shitty router. Once I upgraded it, I had no problems anymore.
Is streaming based on internet speed? Because I live in a 3rd-world country and only have 50 Mbps.
If it's inside your own network, it depends on your local network speeds. Most routers have gigabit Ethernet ports; WiFi depends on the signal quality.
If outside your network, you’ll be capped by your upload speed
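To put numbers on that upload cap: a quick sketch of how many remote 4K streams a given uplink can carry (the 20% headroom figure is just an assumed safety margin for protocol overhead and other traffic, not a Plex rule):

```python
def max_remote_streams(upload_mbps: float, stream_mbps: float,
                       headroom: float = 0.8) -> int:
    """How many simultaneous remote streams of a given bitrate fit
    within the server's upload speed, keeping some headroom free."""
    return int(upload_mbps * headroom // stream_mbps)

# A 50 Mbit/s uplink with ~17 Mbit/s 4K streams handles about 2 at once.
print(max_remote_streams(50, 17))
```

So the 50 Mbps mentioned above is fine for local playback and even a couple of remote 4K streams, if that 50 Mbps is the upload side.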
I would like to emphasize that it's not only the speed of the local network connection. The router's processing power also matters, as too much load will put a lot of strain on its CPU.
That's interesting, I'd never thought of that before. Is there some metric to measure this by? Like, do manufacturers report what CPU (chipset?) their router has? I haven't seen anything like that on listings for consumer-grade products, at least.