So I think that I'll go for an MU7000 or 8000, but I'll wait in case I find a Xmas deal or something.
One more question came up. Do I need to change my HDMI cable? I've been using the one that came with my PS4 Slim. I read here that I need a "premium" cable. Is the one I already have "premium" enough...?
@KappaBeta The MU series are 'decent' 4k TVs but not the 'greatest' with HDR. The best MU (the model number depends on where you live - MU9000 in the US, MU8000 in the EU) only hits around 600nits and doesn't have the widest colour gamut either, leading to mediocre colour volume. In fairness though, they are 'decent' TVs for the money and great for gaming with low input lag. At this price, you won't get a UHD Premium TV unless you find a 2016 Samsung KS model on sale. Even so, their HDR is better than SDR, so it's still an upgrade.
As far as HDMI cables are concerned, there are (currently - until HDMI 2.1 and its 48Gbps 'Ultra High Speed' cables are released) only two cable categories: Standard (Category 1) and High Speed (Category 2). 'Premium High Speed' is just a certification for Category 2 cables that have been tested at the full 18Gbps.
Standard cables are only certified for lower bandwidths - enough for 1080i/720p, and in practice most will handle 1080p too.
High Speed cables are certified for 10.2Gbps, which covers 4k at up to 30fps (most Blurays are 24fps so that can be adequate). A Premium High Speed certified cable guarantees the full ~18Gbps needed for 4k/60 and HDR - although plenty of ordinary High Speed cables will carry that over short lengths.
Cables are like 'pipes' and the wider the pipe (bandwidth), the more data can flow through per second. Obviously, a 4k HDR image needs more data than a 1080p image, and you need to send more images per second if the frame rate is higher too - 30 per second vs 60 per second, for example.
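A rough back-of-the-envelope sketch of why this matters (this ignores blanking intervals and HDMI's encoding overhead, so the real link requirement is somewhat higher than these raw figures):

```python
# Rough uncompressed data-rate estimate for a video signal.
# Real HDMI links add blanking and encoding overhead on top,
# so treat these as lower bounds.

def data_rate_gbps(width, height, fps, bits_per_channel=8):
    bits_per_frame = width * height * bits_per_channel * 3  # R, G and B
    return bits_per_frame * fps / 1e9

print(f"1080p/60, 8-bit SDR: {data_rate_gbps(1920, 1080, 60):5.1f} Gbps")
print(f"4k/30, 8-bit SDR:    {data_rate_gbps(3840, 2160, 30):5.1f} Gbps")
print(f"4k/60, 10-bit HDR:   {data_rate_gbps(3840, 2160, 60, 10):5.1f} Gbps")
```

The 4k/60 HDR figure (~14.9Gbps raw) is already pushing past what a 10.2Gbps cable can carry once overhead is added, which is why that's the spec to check for.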
All cables are backwards compatible, so you can use a High Speed cable in any device that uses HDMI - you just won't necessarily max out its bandwidth. Some retailers/manufacturers will use 'High Speed' or 'Premium' to sell a cable because it sounds better than 'Standard'. Some may say '4k capable' but are Standard cables that only manage 4k at low frame rates, if at all. Some may also say 'HDMI 2.0', but a Standard cable will still work with an HDMI 2.0 device - unless that device is pushing out 4k at 30fps+. What you need to look for is 'High Speed' (the official name), but also check the bandwidth (if given, it should be around 18Gbps) or whether it's stated as handling 4k/60 and HDR. You don't need to spend a lot - Amazon do a 'basic' High Speed HDMI cable for a few quid (the price depends on length, but they're all under £10) that does exactly the same job (no loss of image or sound quality) as a £100 one from some retailers (like Currys) - there really is no need to spend a lot of money at all! Whether you need to buy one for 1080p HDR, I don't know. It can't hurt to buy a High Speed cable as they're not expensive...
A pessimist is just an optimist with experience!
Why can't life be like gaming? Why can't I restart from an earlier checkpoint??
Feel free to add me but please send a message so I know where you know me from...
@BAMozzy Am I right in thinking an HDMI cable with insufficient bandwidth won't "throttle" a signal, it'll just not display any picture at all? Therefore it should be pretty clear if your cable isn't fit for purpose?
@BAMozzy Yeah, I know they're not the best for HDR, but that's how far my budget gets me (I figured out the confusing naming scheme from reading a lot of reviews; I'm in the EU, BTW). Regardless, it'll be a big upgrade from my previous set. After reading your response I actually checked my cable and saw that "high speed" is written on it, so I guess it'll do the job.
@kyleforrester87 That's what I read too. It'll either work or not.
So I ended up buying the MU7000. At first I didn't configure the HDR settings correctly and I was really frustrated because the image looked awful. My mistake was that I didn't adjust the settings while the console was actually sending an HDR signal. After I figured that out and it was all set, I was really impressed by the image. I tried GT Sport, where HDR really shone, and I also played the three quests of the MHW demo.
However, although GT Sport looks gorgeous in-game, there seems to be an issue (?) with some of the videos it renders in the main menu. I'm talking about the videos where cars move through still pictures. There's something like a "static" effect in these videos, and it becomes really strong in low-light imagery. Any ideas?
Also, I have an off-topic question. I've been getting really paranoid because of the YouTube phone-to-TV pairing. I'm visiting my parents for Christmas and my TV still shows up in the list of devices my phone can connect to. Can my phone turn my TV on from here if I cast a video by mistake? If so, the TV will remain on for a few days, since I was dumb enough to disable the auto power-off...
@Flaming_Kaiser They have - the WE663, for example, is a 1080p HDR TV. It's very difficult though to find any details on their HDR performance - whether HDR is just 'supported', so you get a 'better than SDR' image, or actually delivered to a specific standard. It could, for example, not have the widest colour gamut, use an 8bit (with dithering) panel and only hit 300-350nits. If that's the case, then it's downgrading HDR to its limited capability - a long way off bringing you the content as it was intended to be viewed. It could, however, offer close to the UHD Premium standards - losing out only on resolution, and if you only intend to use HDR on a PS4, that doesn't matter.
I still think it makes little sense to invest in a 1080p TV unless that's all the budget you can stretch to. It makes more sense to me to look at 4k, as that is definitely where the future lies. I can't see Sony making a 1080p PS5, and if you only have a 1080p TV, you will lose out on the resolution increase. You could lose out on 4k HDR Bluray players, 4k streaming content etc too. It's a bit like buying a CRT SD TV when gaming/TV etc was moving to HD.
1080p HDR games can still look fantastic on a 4k HDR screen. I played a few games on my XB1s on a 4k HDR screen and they looked stunning. Granted, they weren't as sharp, but they could still compete and often looked better than some 1440p SDR games. A standard PS4 should still look great on a 4k HDR screen - I have had a 4k screen for 3-4yrs now and a 4k HDR screen for 18 months, since months before HDR gaming came along. The point is, I have spent 3-4yrs gaming with 720-1080p games on a 4k screen and have also seen what 1080p HDR games look like on a 4k HDR screen. I still think it makes more sense to buy 4k and be more future proof.
Each to their own, of course - it's their money after all - but if they do upgrade their console in the next few years, will they be happy with super-sampled images? If not, that would mean yet another TV purchase. It may well be better to buy a 4k HDR screen then, if prices drop. A 4k HDR screen may well be within budget and deliver a similar HDR performance (not the 'best', but still comparable to the 1080p TV) for a few £ more - especially in the January sales.
Man I find HDR is such a minefield at the moment! I really hope they have one standardised format by the time the next consoles are here as that is when I will most likely upgrade my TV.
At least we know who to ask for advice, @BAMozzy really knows his stuff.
Life is more fun when you help people succeed, instead of wishing them to fail.
Better to remain silent and be thought a fool than to speak and remove all doubt.
@JohnnyShoulder It's not a minefield really. HDR10 is the minimum standard and the format all pre-recorded HDR content should come in. HLG is just HDR for broadcasters, because it doesn't rely on metadata. HDR10+ and Dolby Vision are basically enhanced versions delivered on top of HDR10. If you don't have a DV/HDR10+ TV or Bluray player, you will get HDR10. HDR10+ and DV are designed to help weaker-spec TVs more than higher-spec ones, but as no TV yet offers the mastering-level spec, they're still beneficial. Dynamic metadata generally helps the darker scenes more, and bright scenes can look the same. There is no 'standard' for how TV manufacturers should tone map, but there are standards for how content is mastered.
HDR is basically all about having a much larger colour volume. Colour is made up of Red, Green and Blue, but each colour value also has luminance (brightness). The colour range (gamut) is bigger and the luminance is much higher, so you have a much bigger colour volume. It's like a Toblerone, with the triangular base representing the colour gamut and the length representing how bright it can be. A bigger triangle and a longer bar means a lot more volume - and that's basically what HDR is. HDR content has a minimum 'volume' to which it is mastered. The better the TV, the greater its volume and the less tone mapping is needed. Tone mapping is like super-sampling - taking a bigger image/volume and fitting it into a 'smaller' size/volume. UHD Premium means a display has a minimum standard of colour volume.
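To make the Toblerone analogy concrete, here's a toy calculation treating 'colour volume' as gamut triangle area (in CIE xy) multiplied by peak brightness. Real colour-volume measurements use perceptual colour spaces, so the exact ratio is only illustrative - the primaries and nit levels below are just typical SDR vs HDR figures:

```python
# Toy 'colour volume': gamut triangle area (CIE xy, shoelace
# formula) times peak luminance in nits. Illustrative only -
# real colour-volume metrics use perceptual colour spaces.

def triangle_area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

REC_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # SDR gamut
DCI_P3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # common HDR target

sdr_volume = triangle_area(REC_709) * 100   # SDR is mastered around 100 nits
hdr_volume = triangle_area(DCI_P3) * 1000   # HDR is mastered to 1000+ nits

print(f"The HDR 'Toblerone' is roughly {hdr_volume / sdr_volume:.0f}x bigger")
```

Notice the wider triangle alone only adds around a third more area - most of the extra volume comes from the longer (brighter) bar.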
Dynamic metadata helps optimise the tone mapping on a scene-by-scene basis, whereas static metadata is fixed throughout, which can result in less optimal dark scenes - though they can still look good anyway.
@BAMozzy thanks for clearing that up, that makes a bit more sense now.
I wish some of the websites I visited were that clear. Most of them just list all the different HDR formats and say which one is the best.
I still think there is too much jargon used. One site said to make sure the TV has a certain 4k label; I went on to Amazon and none of the TVs I looked at had it! It doesn't help that most manufacturers use their own terminology: Full 4K, True 4K, Live HDR.
Or maybe I'm just being a bit thick lol.
@JohnnyShoulder Most websites are comparing the different formats and treating them as 'separate' things. HDR10 and HLG are the two most important though, and any HDR TV you buy ought to support both. HDR10+ and Dolby Vision are basically enhanced options on top of HDR10 - both similar. If you don't have DV or HDR10+ support, ANY content you watch will default to HDR10.
It's a bit like having a stereo where some CDs have 'Dolby stereo' and others have 'enhanced stereo' - unless your stereo supports 'Dolby' or 'enhanced' stereo, you still get 'stereo' audio. Websites may say that Dolby is better than enhanced, which is better than standard 'stereo', but all 3 are still 'stereo' and the differences are not 'always' audible in every moment of a song. Dolby may sound much the same overall, but the quiet moments seem more impactful (like enhanced) and it also has a slightly 'cleaner' top end - however, all 3 are greatly superior to Mono.
Dolby Vision is mastered to a higher standard in general, but a lot of HDR10 is also mastered to that standard too. The only real difference between DV and HDR10+ is the colour bit depth - 12bit vs 10bit - which 'could' give a slight advantage in some gradients, but no TV yet has a 12bit panel. Of course there are differences for manufacturers and professional colourists, as Dolby is completely closed and requires a fee to be paid to Dolby, whereas HDR10+ is open and royalty-free, meaning many more people could add it to their content or devices. Samsung sells a large share of all TVs each year, so HDR10+ is likely to end up in more homes. LG accounts for a smaller share, and only its OLEDs have DV, so in a year Samsung could sell more HDR10+ TVs than DV TVs sold in 2-3yrs.
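To put the 10bit vs 12bit point in numbers, here's a quick sketch of how many gradation steps each bit depth gives (more steps per channel means smoother gradients and less visible banding):

```python
# Shades per channel and total colour combinations at each bit
# depth - the jump from 8bit to 10bit is what tames banding in
# HDR gradients; 12bit refines it further.
for bits in (8, 10, 12):
    shades = 2 ** bits
    total = shades ** 3  # R x G x B combinations
    print(f"{bits:2d}bit: {shades:5d} shades per channel, ~{total:.1e} colours")
```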
At the end of the day though, as a consumer, if your TV doesn't have DV or HDR10+ support, you will still get HDR10. You need both the source/device AND the display to support either. You can't watch DV on an HDR10+ TV and vice versa, BUT it will default down to HDR10, so you will never miss out on HDR content.
As I said, HLG is different from these, as it's designed for broadcasters to deliver HDR without metadata - meaning they can show live events in HDR, since no post-processing is needed to add metadata in.
What you need to look out for when buying TVs is the 'UHD Premium' logo. This logo indicates a display that has been independently verified to deliver a 'minimum' standard for HDR content. The problem, though, comes from the fact that Sony, for example, won't send their TVs to be independently verified - maybe because only the XE9305, A1 and ZD9 would pass, whereas the XE9005 and below wouldn't. Only the LG OLEDs passed, as did the Samsung KS and Q series. Panasonic only had the DX902b that would pass until they too offered OLEDs. As a result, the logo isn't on many TVs - fewer than it could be. Manufacturers don't want their budget or lower-spec TVs to go unsold, and 'HDR' is a buzzword, so they will use 'HDR' to sell TVs. It's a bit like selling 1080-1440p monitors as '4k' because they will accept and super-sample 4k content down to that resolution, BUT you are not getting a '4k' quality image.
Any TV that accepts an HDR source (doesn't say 'this cannot be displayed' or give a black screen) and can downgrade it to the limitations of the TV is still called 'HDR'. That includes TVs without the wider colour gamut or decent brightness. Even SDR TVs can go brighter - how many people watch TV with the backlight on max? The only way to guarantee that you are getting a display that can meet the minimum standard is to look for the UHD Premium logo. Currys display it - you can also go to the UHD Premium website and check which models have been verified.
If you are looking at Sony TVs though, the best bet is to do your research, find out what the specs are and whether the TV meets or exceeds them. Generally, if it's under £1k, chances are it's not meeting those specs. Prices could drop in time, but right now you would need to spend over £1k. TVs without UHD Premium could be little more than an SDR TV that throws away all the extra colour information and compresses the contrast range down to the limits of the panel, which can make some HDR content, especially dark scenes, look a lot worse than SDR. It's not that HDR is 'bad', but that the display is incapable of showing it properly.
@BAMozzy Nice one for the Dolby Vision to Dolby stereo comparison - as an audio geek, that suddenly made the HDR stuff make loads more sense.
From what I'd read about 12 months ago when looking into HDR, a lot of explanations seemed to make Dolby Vision and HDR10 look like two separate formats almost fighting a modern Betamax vs VHS war, and for that reason I steered well clear.
Nice to know that's not the case and that Dolby Vision is just Dolby doing a sort of enhancement job with HDR, like they did with stereo sound.
I know I'm probably still missing the more technical nuances but thanks for clearing the basics up in a nice, simple to understand way.
@DistortedAndroid No probs. I hope that makes it a lot easier for when you do decide to jump in.
HDR10+ and Dolby Vision are, to a degree, in competition, but unlike VHS vs Betamax or HD-DVD vs Bluray, if you buy a Dolby Vision or HDR10+ Bluray disc, you will still get at least HDR10 even if your 4k Bluray player and TV don't support these enhanced layers. If you owned a VHS or Bluray player, you couldn't play Betamax or HD-DVDs, but regardless of whether you own HDR10+ or DV kit, you will still be able to play that disc and still get HDR10. You will never need DV or HDR10+ to get content in HDR, but not all content will be available in DV or HDR10+ - hence it's not as important, because you will never miss out on HDR content if you have an HDR10-only TV.
It's not always necessary to understand the 'nuances' and differences, as long as you understand the basics. HDR10+ and DV are generally better suited to TVs that have a more limited colour volume, as tone mapping becomes more important. It's how they get, say, 10 litres to fit into a 5-litre jug (as this is 'volume') while keeping the integrity. The bigger the TV's colour volume, the less tone mapping is required. UHD Premium is basically giving consumers a minimum-sized 'jug' to fit HDR's colour volume into - anything less means that some of the 'nuances', impact and/or colour may well be lost.
HDR10+ and DV will try to 'optimise' the picture on a scene-by-scene basis, which generally helps dark scenes. Dark scenes have a 'lower' colour volume because they lack the 'bright' colours and may not need such a wide colour range either, so they can be mapped much closer to 1:1 within your TV's colour volume. For example, with static metadata that knows the brightest scenes top out at 2000nits, the whole movie is scaled down so that your 700nit display shows 2000nits at its maximum ability - a 1000nit scene could well be hitting only 450nits on your TV because it's scaled down, and a scene that's around 500nits, well within the TV's capability, is now only 250nits. Dynamic metadata could make that 500nit scene hit 500nits and map 1:1, the 1000nit scene would now be mapped down to 700nits, and the 2000nit scene would map down exactly as it did with static metadata. As you can see, it better optimises the TV: that 500nit scene now maps 1:1 instead of looking much dimmer just because a few scenes hit 2000nits and the same tone-mapping curve has to be kept throughout. Of course, if your TV can exceed 2000nits, it will map the entire film 1:1 regardless of whether it has dynamic metadata or not, so dynamic metadata becomes less important. If you buy an 'under-performing' TV, the benefits of HDR10+/DV are more noticeable - that's the general essence of it.
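Here's a little sketch of that arithmetic (a deliberately simplified cartoon: real tone curves are non-linear and roll off highlights gently, which is why the figures above aren't exactly linear either):

```python
# Cartoon of static vs dynamic tone mapping. Linear scaling keeps
# the arithmetic obvious; real tone curves are non-linear.

DISPLAY_PEAK = 700    # nits the TV can actually hit
MOVIE_PEAK = 2000     # brightest scene anywhere in the film

for scene_peak in (500, 1000, 2000):
    # Static metadata: one scale factor for the whole film,
    # derived from the film's overall peak.
    static = scene_peak * DISPLAY_PEAK / MOVIE_PEAK
    # Dynamic metadata: re-map per scene - scenes within the
    # TV's range show 1:1, brighter ones compress to fit.
    dynamic = min(scene_peak, DISPLAY_PEAK)
    print(f"{scene_peak:4d}nit scene -> static: {static:4.0f}nits, "
          f"dynamic: {dynamic:4.0f}nits")
```

With a display that could exceed 2000nits, both columns would read 1:1 - which is exactly the point about dynamic metadata mattering less the better the TV.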
Colour volume is the most important aspect of HDR. I know a big emphasis is put on the increased 'brightness', but that's only part of the volume, with the wider colour gamut being the other part. It helps with 'dark' scenes too - enabling more detail in shadows etc as well as the more specular highlights and more vibrant colours. UHD Premium accreditation is a guarantee that you have a minimum standard of colour volume - a minimum contrast (difference between the darkest and brightest) with a minimum acceptable black level, as well as a minimum peak brightness and a minimum colour range. Any TV that falls short on any aspect (no decent black level, not high enough contrast, not wide enough colour range, or some combination) will not get accreditation. It benefits SDR content too, as it means your TV has to deliver a much higher quality of 'black' - not an issue with OLEDs, but some LCDs can be more 'grey' than black.

It is meant for us consumers - to have a defined standard of what HDR should mean and to know we are buying that standard (or better), rather than buying an SDR TV that claims it's HDR because it can play the content without delivering the colour volume, the wide colour gamut or the contrast ratio. I don't know anyone who would be happy to buy a 1200/1440p TV marketed as 4k because it can play 4k content and super-sample it down to the screen resolution - yet this is the principle we are seeing with HDR TVs at the moment. It's not 'cheap' to build TVs that can deliver the minimum colour volume, yet manufacturers know a lot of people can't or don't want to spend over £1k, and 'HDR' is the buzzword. Saying your TV is 'HDR compatible' isn't going to sell if a rival with the same specs is claiming theirs is an 'HDR TV'.