Just a general question. I’ve noticed that in some small builds I use, Skeleton and Fentastic, the picture quality can be really bad. My TV works well, and on regular broadcasts the picture looks pretty good. My eyesight isn’t the best, so I can’t see much difference between 4K and 1080p. Mostly, scenes can be rather dark, and no change in TV settings improves the picture. I use both on Kodi and was wondering if there were some settings I didn’t notice. Thanks
The darkness is usually down to HDR streams. First, your TV needs to be HDR compatible. Second, when watching HDR content your settings should be at maximum brightness and contrast.
I’ll play with that and mess with settings on my TV. That’s the problem I’m sure. Thanks
Took me a week of playing around with video settings and doing a ton of learning to get HDR and HDR10+ giving me the best picture possible on my Samsung QN65S95C using my Shield Pro. Now it’s totally amazing, although DV apparently is still better.
Yeah, for a while I didn’t really know why the new trend in TV buying was to have the brightest TV possible. Makes sense for OLED because they weren’t bright to begin with. But if the overall light output makes for better HDR and DV viewing then that makes sense. Not sure if that’s how it really works… But it sounds rational lol
I am a tad confused about the max brightness and contrast for HDR comment, as that hasn’t been my experience. But I have a QD-OLED, so there may be a difference because of the Quantum Dots. This is my very first high-end TV, so I have no other experience to compare it to. I really would like to see a side-by-side demo of a Dolby Vision capable TV. My system was expensive, but damn, I am so impressed with the picture quality from all this up-to-date tech that we’ve begun to rewatch some older movies we originally watched on a $200 Samsung. It’s like watching them for the first time. I’m also sure the picture quality has a lot to do with the quality of the stream from the Shield Pro as well.
I misspoke. Most TVs have an HDR brightness mode that can be adjusted. For regular LCD TVs you’d likely want to set this mode to its brightest. You may or may not still have access to the actual brightness slider, which should be adjusted manually to your liking. They are two different settings.
OLED TVs can handle much darker scenes without a problem and don’t have to be as bright because they have more contrast by default.
Wow. Excellent, thx @Jayhawks659, you are our very own TV guru. I think when I looked, the Quantum Dots were the big difference, and mine is a QD-OLED. So it’s supposed to have even better brightness and contrast. I’m pleased with it, to say the least. Not so much the price, though.
I posted this in the lounge but saw your convo here and thought maybe you could add your thoughts.
Will using the AI feature on a Shield enhance a 1080P to simulate a 4K?
That’s exactly what it’s supposed to do if you have it hooked up to a 4K TV. If you had two of the same TVs side by side, one with a true 4K source and the other with 1080p upscaled using AI, then you might be able to tell the difference. But most of the time, if you watch upscaled 1080p content on a 4K TV with the Shield, you’d have no idea it wasn’t native 4K.
If you’re asking if AI upscaling will improve the picture of a 1080p source on a 1080p TV, I don’t think it would do anything.
The quantum dots help improve the color. The OLED technology itself is primarily responsible for the better contrast, because each individual pixel can turn on or off. With LCD TVs there are backlit zones of pixels that can be dimmed, but they will never match OLED no matter how many zones there are or how small they get. This is what leads to “black crush”, which is the problem Remy is having. The only technology that might surpass OLED in the next 5-10 years is micro LED, where each individual LED can turn on and off just like OLED does. The advantage over OLED is that micro LED is much brighter and inorganic, meaning a much longer lifespan.
Nah, I did a 4K link on Syncler+ and was experiencing some buffering. So, I decided to drop down a notch to a 1080p link and used the AI-Enhanced Detail High. The buffering stopped but, imho, I did not see any noticeable difference in picture quality. However, my eyesight really isn’t all that good, anyways. lol
Samsung QLED
2019 Shield Pro
Spectrum IP - 500Mbps
Ethernet connection
I stay away from 4K links, even in Stremio. 4K can be worse than a finicky ex-wife.
I look for 4K HDR or 4K HDR10+. The quality is outstanding. True home theater from my Shield on my Samsung QN65S95C
Enable just the 4K resolutions in the Kodi whitelist for the Shield, and make sure “Adjust display refresh rate” is set to “On start/stop” in the player settings.
I don’t use Kodi.
Well, let’s add that to the ongoing debate on why people would use Kodi instead of standalone apps. All these little nuances can be adjusted with Kodi.
Does anyone have good luck finding HDR10+ sources on Syncler, Stremio, Kodi, etc.? I was exploring it a bit and, since I have Samsung TVs, wanted to see if they could handle HDR10+ (basically Samsung’s version of Dolby Vision). The most the source will say is something like HDR in the title or description, but I’m not sure how to tell whether it is HDR10 or HDR10+.
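Since the title is often all the scrapers show you, one crude approach is to parse the release name for the usual tags uploaders use. This is just a sketch: the function name and tag list are my own, and a title label is only a hint, since the stream’s actual metadata can differ (a tool like MediaInfo on the file itself is the real test).

```python
import re

def detect_hdr(title: str) -> str:
    """Guess the HDR format from a release title's naming tags.

    Only reads the label in the name -- treat the result as a hint,
    not proof of what the stream actually contains.
    """
    # Normalize scene-style separators so word boundaries work.
    t = title.upper().replace(".", " ").replace("-", " ")
    if re.search(r"\b(DV|DOVI|DOLBY ?VISION)\b", t):
        return "Dolby Vision"
    if "HDR10+" in t or re.search(r"\bHDR10 ?PLUS\b", t):
        return "HDR10+"
    if re.search(r"\bHDR(10)?\b", t):
        return "HDR10"
    return "SDR/unknown"

# Example titles in typical release-naming style:
print(detect_hdr("Movie.2023.2160p.WEB-DL.HDR10Plus.DDP5.1"))  # HDR10+
print(detect_hdr("Movie.2023.2160p.UHD.BluRay.DV.HEVC"))       # Dolby Vision
print(detect_hdr("Movie.2023.1080p.BluRay.x264"))              # SDR/unknown
```

Dolby Vision is checked first because DV releases often also carry an HDR10 fallback tag in the name.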
I have a state-of-the-art Samsung home theater setup and look for 4K HDR and HDR10+. I set up my Torrentio and my RD to focus on scraping for those and eliminate anything under 1080p. There are more and more showing up as uploaders add them and realize the demand is there. My Shield Pro can handle them easily, but I wonder if lower-end devices with just 2GB of RAM can handle the decoding. Sorry, I forgot to mention this is in my Stremio app.
I think you are right Miki. I likely just need to focus on the settings and set up Syncler, Stremio/Torrentio and Kodi to focus more on finding these sources. I have Shield Pros on my 2 nicest Samsung TVs, so I would love to see the picture using HDR10+.