Forums

Topic: 4k TV Recommendations

Posts 261 to 280 of 305

JohnnyShoulder

@Th3solution Good luck finding a good quality TV at that size over in the States, they are few and far between I believe. Most of the better TVs are 50" and up. My new TV is 49", which is perfect for my space in the lounge. It is only available in Europe however.
https://www.whathifi.com/reviews/sony-kd-49xg9005

Yeah, I've never liked the build quality on LG TVs and have had a couple fail on me, and so has my dad. I think we are cursed with LG. When I got my last TV, which was a Sony, I noticed the upgrade in quality all round immediately. They are probably a lot better these days, but I'm not risking them any more. I have a couple of mates that swear by them.

I would personally avoid OLED screens as the risk of burn-in is there. Some experts say you need to watch the same thing for thousands of hours, but there have been reports of it happening after only a couple of weeks. LG were taken to court recently as they refused to recognise it as an issue, a stance they had to change.

[Edited by JohnnyShoulder]

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

Th3solution

@JohnnyShoulder Yeah, it’s pretty frustrating how large these TVs have become. If you have a small room like mine, even 55 inches is too big for the viewing distance.

“We cannot solve our problems with the same thinking we used when we created them.”

JohnnyShoulder

@Th3solution You may wanna read this thread from page 12 onwards, as Bam gave me a lot of good advice and you can see me going from defo getting a Samsung to only a maybe lol. There was also a video, which I will dig up later, that helped me decide between OLED and LCD screens.

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

Th3solution

@JohnnyShoulder I actually did read back over that, and although a lot of the techno-talk goes over my head, I found it useful. I also have followed this thread off and on over the last year or two and Bam sure knows his stuff. I’ve come close to upgrading my set several times over the last few years and I always chicken out because I’m scared to waste money on a set that is either going to be a lemon or will be obsolete with evolving technology. But at the present pricing, it’s hardly a risk if I don’t need a huge set, and I think the new 8K compatibility is completely unnecessary for me, even if PS5 can put out that type of resolution.

I would be interested in that video, although I’ve pretty much ruled out OLED due to my smaller size preference and the image retention risk that was mentioned.

“We cannot solve our problems with the same thinking we used when we created them.”

BAMozzy

On the subject of Dolby Vision, it's not important. It's literally an extra layer on top of HDR10, so you won't ever get HDR content that is ONLY available if you have a Dolby Vision TV. If you don't have DV, the content will play in HDR10 without the Dynamic Metadata.

Dynamic Metadata is useful if you have a TV that has a 'limited' HDR capability. It 'optimises' the tone mapping on a scene-by-scene basis rather than having a single tone mapping algorithm applied across the whole content. For example, if you have a 400nit TV and try to watch content that can be as bright as 1000nits, Static Metadata (HDR10) would apply a single algorithm. The 1000nit bright parts would be at the TV's maximum 400nits, 0-100 nits may be displayed at 1:1, and then 100-1000nits would be scaled to fit. A scene that only hits 400nits would therefore look 'darker', maybe only 150 nits, because the curve has to account for a scene that hits 1000nits. It does mean that the content is 'constant' and closer to the intention - that 400nit scene wasn't meant to be as bright as the 1000nit scene.

Dynamic Metadata though would look 'identical' in the 1000nit scene, because it would still need to tone map 1000nits down to the 400nit capability of the TV, but the 400nit scene would 'fit' within the TV's capability so wouldn't be tone mapped down. That 400nit scene would be as bright as the 1000nit scene, as bright as a 600nit or 800nit scene, because the tone mapping would change. That's not the overall 'brightness' of the scene but the 'brightest' part of the scene. Depending on where the Dynamic Metadata is, whether it's at the start of each scene or even on a frame-by-frame basis, you could have parts of the scene darkening or brightening because the tone mapping isn't constant.
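
If seeing the numbers helps, here's a rough sketch in Python of that difference - the 100nit knee and the straight-line scaling are made up for illustration, not any real TV's actual tone mapping curve:

```python
# Rough sketch of static vs dynamic tone mapping on a 400nit TV.
# The knee point and linear compression are illustrative only.

KNEE = 100        # below this, show 1:1 (made-up figure)
TV_PEAK = 400     # example TV's peak brightness in nits

def tone_map(pixel_nits, curve_peak):
    """Compress [KNEE, curve_peak] into [KNEE, TV_PEAK]; pass through below the knee."""
    if pixel_nits <= KNEE or curve_peak <= TV_PEAK:
        return min(pixel_nits, TV_PEAK)
    scale = (TV_PEAK - KNEE) / (curve_peak - KNEE)
    return KNEE + (pixel_nits - KNEE) * scale

content_peak = 1000                      # static metadata: one peak for the whole film
for scene_peak in (400, 600, 1000):      # dynamic metadata: a peak per scene
    static = tone_map(scene_peak, content_peak)   # brightest pixel of the scene, fixed curve
    dynamic = tone_map(scene_peak, scene_peak)    # same pixel, per-scene curve
    print(f"{scene_peak}nit scene -> brightest pixel at {static:.0f}nits (static) vs {dynamic:.0f}nits (dynamic)")
```

The 1000nit scene comes out the same either way; it's the dimmer scenes that get lifted back up by the per-scene curve.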

If you have a TV that can reach 1500nits though, regardless of whether you have Dynamic or Static Metadata, the content would be displayed the same. Static Metadata would say that the brightest the content gets is 1000nits, so no tone mapping is needed - it can all be displayed as it was mastered. Dynamic Metadata would do the same, whether the next scene only hits 400, 600, 800 or 1000 nits. It would still know that no tone mapping down is needed, so it displays the content as it was mastered too.

The one advantage of DV that could make a difference is the 12bit colour. However, most TVs are 10bit only and the cheaper HDR TVs still use 8bit+FRC (Frame Rate Control). This is sometimes called dithering: the TV still only has 8bit colours but will flick between two different colours very fast to create the illusion of the 10bit colour grade. It's like having a piece of card with one side black and the other side white - spin that fast enough and you see a grey colour. Unfortunately, some TV manufacturers will still say their TV is 10bit because it can receive a 10bit colour signal and use the FRC to create the impression of that colour, so it's not always clear if you are getting a true 10bit panel or an 8bit+FRC panel. As no 12bit panels are currently available (at least not as of the 2019 TVs - I've not checked all the 2020 TVs), it's more a future thing. HDR10+ is still just 10bit, so if we ever do get TVs that can handle the full HDR range (up to 10,000nits), HDR10+ is pointless, as is Dynamic Metadata generally. DV at least would offer a 12bit upgrade over HDR10.
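
To picture what FRC is doing, here's a toy example (my own illustration, not any panel's real logic) of faking an in-between 10bit level by alternating two 8bit levels over a few frames:

```python
# Toy illustration of 8bit+FRC (temporal dithering).
# A 10-bit level that falls between two 8-bit levels is approximated by
# alternating between those two levels frame to frame so the average matches.

def frc_frames(level_10bit, n_frames=4):
    """Return n_frames of 8-bit output whose average approximates the 10-bit level."""
    exact = level_10bit / 4                 # 10-bit (0-1023) mapped onto the 8-bit (0-255) scale
    low, frac = int(exact), exact - int(exact)
    return [low + (1 if (i / n_frames) < frac else 0) for i in range(n_frames)]

print(frc_frames(514))   # 514/4 = 128.5 -> [129, 129, 128, 128], averaging 128.5
```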

What is most important is having a TV that can display both HDR10 and HLG. HLG is the method that will be used by broadcasters because it doesn't rely on adding in Metadata. Metadata is added in at the post-production stage by the colourist, so if you want to broadcast 'live' HDR, you cannot do that with HDR10 (or DV) because you'd need to add the Metadata first - which is why HLG was developed.

Dolby Vision vs HDR10+ isn't like the battle between VHS and Betamax, because NO exclusive content will be available for either. It's not like you need DV to watch HDR content on Netflix, for example. If you don't have DV, you can't access that extra layer of 'information' - the Dynamic Metadata - so you just get HDR10. You are not missing out on HDR. It's the same with DV Blu-ray discs - if you don't have a DV Blu-ray player AND a DV TV, the disc will still play and still give you HDR. You need BOTH to get DV, but you don't need a DV-enabled TV/Blu-ray player to get HDR.

I would recommend buying the biggest TV you can. The bigger you go, the more noticeable the benefits of 4k are. Anyone who says you need a 100" TV to see any benefit at 7' away is talking total BS. Games have a lot of fine lines so they have perhaps the most apparent benefits, but even with TV/film, the amount of detail in the mid range is noticeable. Things like cats' whiskers, pinstripes, grass/leaves etc, which 'disappear' as they move away from the camera and become blurred in HD, are still noticeable on a 4k screen. You have 4x as many pixels to keep that detail - keep small writing sharp, for example - even if it's only in the mid range. In the first film I watched, I saw that a denim jacket wasn't a light blue but actually made of dark blue and white threads, and I could actually make out what the text on 'pin badges' said. The lawn outside of a house had 'texture', not a green blur. Of course we know it's a lawn in HD, but in 4k it looked more like a real lawn because of the texture rather than just a blurry patch of green.

The bigger the TV, the more you will see. You may still not be able to make out the tiny text in the background because it's too small, but in HD it may not be readable at all, regardless of TV size. Something that is only 2-3 pixels high in HD is 4-6 pixels high in 4k, with four times as many pixels overall, remember. If you take a piece of paper with squares on it and try to write a letter B in a 3x2 block and then try to do the same in a 6x4 block, you will see how much more defined and easier it is to get the shape. That's where 4k is working for you, and as objects get smaller (further away from the camera), the more detail and definition is retained with higher resolutions - and it's noticeable. Snakes keep their scales further away, sand looks more grainy, cats keep their whiskers etc, etc...
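
As a quick back-of-envelope check on those numbers (assuming standard 1920x1080 and 3840x2160 frames):

```python
# Simple arithmetic: an object covering the same fraction of the frame gets
# twice the pixels in each direction at 4K, four times the total.

hd, uhd = (1920, 1080), (3840, 2160)
object_height_fraction = 3 / 1080        # something 3 pixels tall in the HD frame

for name, (w, h) in (("HD", hd), ("4K", uhd)):
    px_high = object_height_fraction * h
    print(f"{name}: {px_high:.0f} px high, {w * h / 1e6:.1f} megapixels in the frame")
```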

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

BAMozzy

JohnnyShoulder wrote:

I would personally avoid OLED screens as the risk of burn-in is there. Some experts say you need to watch the same thing for thousands of hours, but there have been reports of it happening after only a couple of weeks. LG were taken to court recently as they refused to recognise it as an issue, a stance they had to change.

The strength of OLEDs is that they are self-emissive displays, with each pixel made of 'sub-pixels' of each colour. However, each sub-pixel also wears with use - like a lightbulb does. The more a sub-pixel is used, the more wear occurs, and the brighter it's on, the faster it wears too. A red logo, like CNN's for example, means that 'just' the red sub-pixel is on, and as the logo is bright (meaning that the red has to be on bright too), it's causing those red sub-pixels to wear faster than the red sub-pixels do elsewhere. An equally bright white though would require the Red, Green and Blue sub-pixels to be on (all 3 combine to give white), and as all 3 are on, they don't need to be on as brightly because they combine - 3 lightbulbs are brighter than 1, so you can dim the 3 down to get the same brightness.

That's why repeated viewing of CNN will cause the red sub-pixels to wear out faster than the rest of the screen. As they wear, they lose brightness and will eventually fade enough to be noticeably darker - this is often referred to as 'burn-in', but really it's more 'burn-out', or uneven wear. HDR also accelerates the wear because the sub-pixels have to be even brighter. Again, it's like a lightbulb that has, say, 5000hrs of life: the more it's on, the more of that life you use up, but if you have a dimmer switch, you can get more than 5000hrs of use by dimming it down. In general viewing, you don't have a single colour (R, G or B) in a fixed position for long - if at all - and most colours you see are also using the other sub-pixels. You also get times when the sub-pixels are off, or at the very least on very dimly, due to shadows, night scenes etc.

It's the difference in 'wear' that is causing problems with OLEDs. Certain elements - like a logo, a box, a ticker-tape latest-news strip etc - that are high wear are causing damage, more so if you turn your brightness up and they are using 1 or at most 2 sub-pixels (Yellow is made of Red and Green, and Magenta is Red and Blue, for example). It's not about leaving these on screen for hours until they 'burn in', it's about the accumulation of hours, so they wear significantly more than the rest. If you watch 2hrs of Good Morning Britain every morning, that won't burn in or cause image retention straight away, but it will cause the area of the logo to wear more than the rest - that's 10hrs a 5-day week, 520hrs a year, so it will become noticeable sooner than for someone who maybe watches the same show for up to 30mins a day at the same brightness.

Point is, OLED sub-pixels have a life span in hours (even if it's in terms of thousands of hours) and also fade slowly as they are used, so the more they are on and the brighter they need to be, the greater the amount and rate of wear. General SDR use is relatively low-wear content because the sub-pixels are not always on, and certainly not on as brightly as they could be. A single bright-colour logo (or static health bar) is high wear for that area and will burn through that life span faster, causing it to fade more than the rest of the screen until it becomes noticeable, and because it's often a sharply defined area (as a logo/box etc is), it gets called 'burn-in'. However, it can happen too with watching the news, as the person reading and their clothing may have more red than the background, for example. Because they are not perfectly stationary, the wear is much less defined, but the same principle has caused that area to fade more.

RTINGS' real-life OLED test shows that a mix of general content (NBC has a mix of news, TV, sport and films) at 200nits started to display uneven wear after 9000hrs, which if you watch ~4hrs a day (~1500hrs a year) would be around 6yrs of use, but if you watch 10hrs a day, that would be around 2.5yrs after buying. The CNN news test shows how brightness affects the rate of wear, as they tested at 200nits and at max brightness. It also shows that single colours (Red) wear more than white (all 3 colours) in the same 'static' area, as the white parts of the logo haven't worn anywhere near as much, but even at 200nits the CNN logo caused wear after several months at 20hrs per day because the effect is cumulative. Whether the wear would be noticeable in general content at that point, I doubt it, but the colour accuracy would be affected.
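
If you want to play with the accumulation arithmetic yourself, it's just hours per day against a rough wear threshold - the 9000hr figure below is taken from that RTINGS mixed-content test, so treat it as a ballpark, not a spec:

```python
# Rough years-to-visible-wear estimate from daily viewing hours.
# The threshold is the ~9000hr figure from the RTINGS mixed-content test cited above.

HOURS_TO_VISIBLE_WEAR = 9000

for hours_per_day in (2, 4, 10):
    hours_per_year = hours_per_day * 365
    years = HOURS_TO_VISIBLE_WEAR / hours_per_year
    print(f"{hours_per_day}h/day -> {hours_per_year} hrs/yr -> ~{years:.1f} years to visible wear")
```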

OLEDs are not suitable for ALL - not if you want a TV to last more than 5yrs, for example - as it depends very much on the content you display, the number of hours you use your TV for and the brightness of the content. You have the best PQ at the start, but it will be affected by these factors. Even if they have increased the life span of the sub-pixels with newer models, you can still wear out some sub-pixels faster than others. If an OLED with general SDR content starts to show uneven wear after say 10,000hrs, that will occur much sooner if you watch 5000hrs a year compared to someone who watches 2000hrs a year - the first would get about 2yrs compared to 5yrs for the other, assuming they watch the same content at the same brightness. PQ is just one factor to consider when buying and not necessarily the only factor IF you want a TV that lasts more than 5yrs, for example. An OLED may be the best for PQ, but that doesn't make it the right choice for everyone - especially if uneven wear/burn-in is not covered by the warranty. This is why I won't recommend ANY TV specifically, because it may not be the 'best' for them, the content they watch and their personal preferences/situation - I will try and explain what they should consider when choosing their TV, though, and maybe how to minimise the risks too...

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

JohnnyShoulder

@Th3solution Here is the vid I was on about earlier. IIRC it explains it all without the technical mumbo jumbo that some get bogged down with, and where it does use it, it explains it quite well.

[Edited by JohnnyShoulder]

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

Th3solution

@JohnnyShoulder @BAMozzy Thanks a bunch for all the information! I learned a lot. Interestingly, delving more into the particulars of OLED has made me concerned about my OG Vita. Especially since they don’t make them anymore, there is going to be a limit to how long the screen keeps its image quality, if I haven’t already hit that mark in my hundreds of hours of usage. I haven’t noticed burn-in (or, as Bam says, more appropriately “burn-out”) per se, but I haven’t really critiqued my screen closely. I really do need to pick up one of the LCD models to have for future gaming on my Vita in the years to come.

Anyway, as it relates to getting a new TV, I feel a little more comfortable now with looking at the Samsung products again since the Dolby Vision HDR issue isn’t quite as deal-breaking as I thought. What’s the deal with Samsung’s newer models being labeled as “Quantum HDR 16X” ... and does it mean much or does a TV just need to be HDR10+?

“We cannot solve our problems with the same thinking we used when we created them.”

BAMozzy

Th3solution wrote:

What’s the deal with Samsung’s newer models being labeled as “Quantum HDR 16X” ... and does it mean much or does a TV just need to be HDR10+?

That's just marketing BS. It's Samsung's own way of measuring the HDR performance of their own TVs and has NOTHING to do with what HDR format(s) it will handle. At best, you can use it to compare with other Samsung TVs, but that's all. A few years ago, they were using terms like HDR1000, HDR1500 and HDR2000, which literally referred to the maximum peak brightness their TVs 'could' hit - although HDR1000 meant anything up to 1000nits, HDR1500 meant 1000-1500nits etc, so a TV that measured 1050nits peak brightness would be HDR1500. This is just the updated version of Samsung's marketing BS.

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

JohnnyShoulder

@Th3solution You will learn that most of the fancy stuff that manufacturers spout is BS. You just gotta learn to see through it. Some of the processing that they add can have a negative effect when playing games. Luckily you can turn 'em off in most cases, or at least lower them so they have less of an influence.

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

Th3solution

@BAMozzy @JohnnyShoulder Speaking of... how important are “Game Mode” options on a TV? In my limited understanding, I figure it has to do with refresh rate? It’s supposed to reduce the lag, correct? Does it usually affect motion processing or other types of visual output? I think almost all TVs have the option for a “Game Mode” input, and I’ve used it for my games, but I’m not sure how much of a difference it makes and whether I’m sacrificing some visual fidelity for lag improvements I don’t really need, since I don’t play online competitive MP.

“We cannot solve our problems with the same thinking we used when we created them.”

Ryall

@Th3solution You probably don’t have to worry about your Vita, for the same reason you don’t have to worry about your mobile phone. The people who tend to have problems are those who either use their TV as wallpaper or leave the game paused with the TV still on whilst they do something else, neither of which is likely to be a problem for a handheld, as you would put it into sleep mode.

[Edited by Ryall]

Ryall

JohnnyShoulder

@Th3solution I've turned Game Mode off on my TV; I didn't like the results and it kinda messed with HDR a bit. It was great with my previous TV however, but that was only a 1080p model. So I think it depends on how each set implements it and what processing they use.

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

BAMozzy

@Th3solution Game Mode essentially turns off some of the image processing your TV does, to reduce the lag. If it's applying image processing, that takes time and adds to the delay between receiving the frame and displaying it - how much depends on what processing you apply. Motion settings, for example, can be quite 'expensive' in terms of lag.

What you also have to remember though is that games are built with their own image processing applied. Colours are chosen and are more precise for that reason. They can make a red be a 'pure' red - no green or blue - whereas that same red, filmed with a camera, will not be pure.

Some people prefer their games to look more 'filmic' than they were made and prefer the look of using a movie mode, for example, but Game Mode is more like using a PC monitor. Both can be calibrated so you are getting accurate colours, but a PC monitor doesn't do image processing. It's the most 'raw', for want of a better word, to the source.

In terms of lag, the best TVs are less than 1 frame behind at 60fps - less than 16.6ms - but in movie mode, that can be ~80ms, nearly 5 frames behind. Add in motion settings and that can be ~120ms. At 30fps, you are less than half a frame behind in game mode, over 2 frames behind in a movie mode, or nearly 4 frames behind if you also add motion. By behind, I mean the gap between where the game is actually running on your console and the point where you see that frame and can react to it. When you first see an enemy on screen, that enemy is really several frames ahead. If they are moving left to right, for example, you would need to shoot where they are, not where they appear to be - lead the enemy - which is a problem if they suddenly turn back and move right to left, because you wouldn't see them do that for several frames.

It's not 'too' bad in some games - even racing games, for example - because you will learn to adjust. If you brake and steer where you think you should based on what you see, you run wide and miss the corner, so you learn to brake and steer a bit earlier so the car gets round the corner. If you then switch to game mode, you end up braking too early, wasting speed and cutting the corner until you adjust. It's the same with platforming. If you are used to game mode and switch to movie mode, you end up jumping too late, maybe falling off the ledge, until you learn to press jump earlier - so you see the character take another step and then jump at the right spot instead of jumping almost instantaneously.

It doesn't matter whether you play CoD (MP or not), some 30fps story-based action game or whatever. There is a delay between where the game is at and what you are seeing. It adds to the way a game 'feels' to play, because you press a button and the game reacts BUT you then have to wait to see that action on screen. MP is a big one because if an enemy walks around the corner, they have several frames to see and start shooting you before you see them come round that corner, and it's 'not' a fair fight. They can get a bullet into you before you even see them, but even in a campaign the game will not feel as responsive, because you have to wait for the TV to display the frame where you pressed the button and you are 'reacting' to things that happened frames before anyway.

There is input lag from the controller anyway, so that also adds to the overall feel. If the lag is 80ms from the controller and you have 80ms lag from the TV, that's 160ms between pressing the button and seeing that action on screen. If that was in MP against an opponent with 15ms TV lag, theirs is only 95ms, but they also had the advantage of seeing you 3-4 frames before you see them, so they can 'react' sooner. Like I said, if you are playing a racing game or platformer, for example, you adjust. If you end up falling off the ledge or not making the corner, you learn to press the button when the car/character looks a bit further back so you get round the corner or make that jump.
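
If it helps, the frames-behind arithmetic is trivial to work out yourself - the lag figures below are the example ones from this post, not measurements of any specific TV:

```python
# Frames-behind arithmetic for different display lags, at 60fps and 30fps.
# Lag values are the illustrative figures used in the post above.

def frames_behind(lag_ms, fps):
    return lag_ms / (1000 / fps)

controller_lag = 80   # ms, example figure
for mode, tv_lag in (("game mode", 15), ("movie mode", 80), ("movie + motion", 120)):
    for fps in (60, 30):
        total = controller_lag + tv_lag
        print(f"{mode}, {fps}fps: TV alone {frames_behind(tv_lag, fps):.1f} frames behind, "
              f"controller + TV = {total}ms ({frames_behind(total, fps):.1f} frames)")
```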

At the end of the day though, the choice is yours, and with next gen coming, game mode will be more important if you want the benefits of VRR (Variable Refresh Rate) to eliminate screen tear/judder. Every time a new frame is ready, it will be displayed, so 'any' frame rate (usually ~44fps and better) is possible. Games are 60fps or 30fps because of TV refresh rates; it's holding onto a frame too long because the next wasn't ready (judder), or the screen refreshing with part of an old and part of a new frame (screen tear), that causes these issues. With VRR, your TV can refresh at the same rate as the game is running to give a much smoother and cleaner image. Only a few TVs offer VRR right now, but that will increase when the 2020 TVs arrive.
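
Here's a toy way of picturing judder versus VRR - the frame times are invented and this isn't how any display actually schedules frames, it just shows frames snapping to the next 16.7ms boundary versus being shown as soon as they're ready:

```python
# Toy comparison: fixed 60Hz refresh vs VRR.
# At fixed 60Hz, a frame that misses the 16.7ms window waits for the next refresh
# (judder); with VRR the panel refreshes whenever the frame is ready.

REFRESH_MS = 1000 / 60
frame_times_ms = [16, 22, 18, 25, 16]     # hypothetical render times for 5 frames

fixed = [((t // REFRESH_MS) + 1) * REFRESH_MS for t in frame_times_ms]
print("fixed 60Hz shows each frame after:", [f"{t:.1f}ms" for t in fixed])
print("VRR shows each frame after:       ", [f"{t:.1f}ms" for t in frame_times_ms])
```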

I use Game Mode myself and certainly don't find the image to be bad - maybe because I have used 'game mode' ever since I bought my first flat-panel HD TV and was blown away by the upgrade to HD and the size increase from a 32" CRT TV to a 46" HD TV. I am used to it and find the movie mode to look softer, like a filter is applied to it. Like I said though, it's the choice of the individual and up to them whether they want their games to look that way or want their games to feel a bit more responsive...

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

Ridwaano

@Th3solution Did you get a TV? Hopefully one with HDMI 2.1, since you'd get all the future-proof features like ALLM and VRR - it's all about gaming on it, after all.

Ridwaano

Th3solution

@Ridwaano Well, yes and no. I did finally decide on a purchase and I upgraded my TV, but I didn’t have the money for HDMI 2.1. I decided to go entry level / cheaper right now, and then in a couple of years I can get something with HDMI 2.1. I just don’t have the funds to be future-proof right now.
What I got was a Samsung QLED 6 series, for a very reasonable price (got it on sale to boot). It does actually have VRR capabilities - one of the few non-HDMI 2.1 TVs that does. That was a major deciding factor for me, because of the superior aspects for gaming. Of course, what I mainly gave up by going the Samsung route was Dolby Vision HDR, and at the end of the day the extra set of gamer-friendly options like VRR and FreeSync, as well as the slightly more affordable pricing, was what tipped the scale.
I’m very happy with the TV so far. It’s an enormous upgrade for me and the 4K with the QLED colors are beautiful.
The complaints about lack of HDR brightness and less “punch” from HDR are well founded though. Again, for me it’s a huge upgrade over my old TV anyway, which had no HDR. But I have had to crank up the backlight settings, and I typically use Dynamic picture mode to really get the supersaturated, vibrant colors that I was hoping for, and it usually delivers. I have found that I need to toggle between picture modes depending on what content I am watching. For example, gaming is great in Dynamic mode, as is Netflix, but with Amazon Prime or watching a Blu-ray (and I just have a normal Blu-ray player, can’t afford a UHD player yet, and I’ll probably just wait for PS5 for that), Movie mode is actually better and I can see dark scenes much better. There is a definite difference between watching something in HDR10+ on Amazon Prime versus watching Netflix in just basic HDR, since the TV doesn’t do Dolby Vision. Like I say, for me it’s great for the price, and I came pretty close to getting the Sony equivalent, but price, lack of VRR, Sony’s slightly more muted and “natural” colors without QLED, and a few quality-of-life things like the lack of native Apple TV integration pushed me to the Samsung.
If I had the money, I’d love to have gone with the LG C9 or Samsung Q9 series, but that will have to wait a couple of years. As a gaming TV this one is fabulous. As a general use TV it’s really good, but sometimes hit or miss depending on the content.

[Edited by Th3solution]

“We cannot solve our problems with the same thinking we used when we created them.”

JohnnyShoulder

@Th3solution Glad you are enjoying your new TV. I don't get people who still say there is not a massive difference between standard def and UHD. Have you played many games in HDR, and what are your thoughts if you have? I know you have a peasant PS4 like me, so it will be interesting to read what you think.

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

Th3solution

@JohnnyShoulder I actually have not. My gaming time has been limited and so I’ll have to let you know as I mess around with it.

“We cannot solve our problems with the same thinking we used when we created them.”

Elodin

I need a new TV, 55" range, $500 price range. I want as little motion blur as possible, and also the ability to turn off the "soap opera" effect for movies, which is probably possible on most TVs. I'm leaning towards this:

TCL - 55" Class - LED - 6 Series - 2160p - Smart - 4K UHD TV with HDR - Roku TV
Model: 55R625, SKU: 6367716

Anyone have a TCL model or recommend another TV?

Elodin

This topic has been archived, no further posts can be added.