I connected my PC with my new Samsung 58" TV (VA panel). In Nvidia control panel I have options:
Full RGB
Limited RGB
Ycbcr 4:4:4
Ycbcr 4:2:2
Ycbcr 4:2:0
Which one of these is better? Also, what bpc should I select (I have the option of 8 and 10)?
Is it best to set the TV to 4K 30Hz or 24Hz? Aren't movies filmed at 24fps?
On the Samsung TV I have an option called UHD Color — should I leave that on all the time? There's also an option called Game Mode. Should I activate that only when playing games?
Ok, this is a VERY complicated question and will depend on the specific display you're using (which model Samsung) as well as what content you're displaying. When asking about 30Hz vs 24Hz, it depends on whether your display supports something called 3:2 pulldown. This is a process whereby the source or the display effectively converts a 24fps image (which is what film stock is shot at) to a 30fps image (29.97fps if you want to nitpick). Further confusing things is the fact that TV content filmed to NTSC spec is at 30fps, while PAL content is at 25fps IIRC. High-end Blu-ray players like those from OPPO Digital have special video processors designed to address these issues, with 3:2 pulldown being the big one.
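The 3:2 cadence is easy to see in miniature. Here's a toy Python sketch (the function name is mine, purely for illustration — real players do this in hardware) that spreads four 24fps film frames over ten interlaced 60i fields, which is how 24fps content ends up at the ~30fps (29.97) rate mentioned above:

```python
# Illustrative 3:2 pulldown: alternate 3 fields, then 2 fields, per film frame.
def three_two_pulldown(film_frames):
    """Expand film frames into interlaced fields using the 3-2 cadence:
    frame A gets 3 fields, frame B gets 2, frame C gets 3, and so on."""
    fields = []
    for i, frame in enumerate(film_frames):
        copies = 3 if i % 2 == 0 else 2
        fields.extend([frame] * copies)
    return fields

fields = three_two_pulldown(["A", "B", "C", "D"])
print(fields)           # 4 film frames become 10 fields = 5 video frames
print(len(fields) / 2)  # 5.0

# NTSC actually runs at 30000/1001 fps, hence the 29.97 figure:
print(round(30000 / 1001, 2))  # 29.97
```

The uneven 3-then-2 repetition is also why 24fps film can look subtly juddery on a 30Hz output, and why a display that accepts a native 24Hz signal avoids the problem entirely.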
I generally advise against forcing your video card to any specific framerate as that can prevent whatever app you're using to display the video content from controlling the output.
@djbrad is right about letting your program control that.
This is avoiding the whole topic of things like up-converting, line doubling, etc.
As for colorspace, that again depends on what your display is capable of accepting. The numbers in the YCbCr options refer to chroma subsampling: 4:4:4 keeps full color resolution, while 4:2:2 and 4:2:0 discard chroma detail to save bandwidth. Since video content is mastered in YCbCr, passing it through in that colorspace doesn't necessarily give a superior picture, but it allows the video card to more accurately display colors as the director intended. For desktop use and text, Full RGB (or YCbCr 4:4:4) is generally the sharper choice.
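To see what the 4:4:4 / 4:2:2 / 4:2:0 labels from the Nvidia panel actually trade away, here's a rough back-of-envelope calculation (assuming a 3840x2160 frame at 8 bpc; the function name is just for illustration) comparing per-frame data:

```python
# Relative per-frame data for the chroma-subsampling modes.
# 4:4:4 keeps full chroma; 4:2:2 halves horizontal chroma resolution;
# 4:2:0 halves chroma resolution both horizontally and vertically.
W, H, BPC = 3840, 2160, 8

def frame_bits(chroma_h_div, chroma_v_div):
    luma = W * H * BPC
    chroma = 2 * (W // chroma_h_div) * (H // chroma_v_div) * BPC  # Cb + Cr planes
    return luma + chroma

for name, (h, v) in {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}.items():
    print(name, frame_bits(h, v) / frame_bits(1, 1))
# 4:4:4 -> 1.0, 4:2:2 -> ~0.667, 4:2:0 -> 0.5
```

That halving of signal data is why 4:2:0 is what lets some HDMI links carry 4K at higher refresh rates, and also why fine colored text looks fuzzy in the subsampled modes.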
Can you give us the model number of your display?
As a side note, a lot of that finesse in the color handling is really going to be lost if your TV hasn't been calibrated. This involves a technician coming out and using a colorimeter to adjust the inputs on the TV with a known-good source like a pattern generator, then going into the service menu to adjust the colors to reference. Most displays are extremely inaccurate color-wise because manufacturers err on the side of big, bold colors and over-saturated brightness and contrast to make their displays appear "better" in the store. A good test is to hold your hand up to the display and compare your flesh tone to that on the display. Typically you'll find the display is either far too red (warm) or far too blue (cold). TV displays unfortunately are rarely calibrated for a correct color gamut, although whether that's really an issue is up to the individual.
For the average person it's overkill, and it also depends on the display technology. If it's plasma, which is still considered second only to OLED for color reproduction, then getting it calibrated is worth it. If it's LCD, it really depends on whether the TV uses full-array (grid) LED backlighting or edge lighting, and what techniques the display uses to control white/grey/black.
If you want to go down the rabbit hole check out AVS Forum.
For reference, I'm using a THX-Certified 65" Panasonic 1080p plasma display in my home theater. It has a 600Hz sub-field drive. Motion on it is still the smoothest of any display I've seen outside of some very high-end OLED displays. That's a byproduct of how plasmas work, though, and while my display is excellent, I'm considering replacing it just because of how much heat the thing generates.