Been watching some impressions on YouTube and I wouldn't be surprised if 2077 turns out to be a PS5 game. The amount of stuff you can do in that game sounds very impressive. The fact you can apparently do stuff like climb buildings makes me want it in third person even more.
Everyone seems to be saying the same thing - that this is the most impressive game they have seen in years. I think first person can work, and I think it's particularly good for the gun-play. It seems the augmentations will take on a whole new dimension too - seeing these happen from a first-person view, like having your 'eye' ripped out and replaced by a new augmented one, for example. Driving can be third person, which is great too. It's unusual to have an RPG in first person, so it will be interesting and different - not necessarily bad - and you get to see everything without your character blocking something.
As for whether it's 'this' gen or next, it's possible. The fact that it's 'buildings' all around means you don't need such high polygon counts - a lot of them are just boxes with texture fills - and the dense city potentially limits the draw distance too. That means they can keep characters looking great with higher polygon counts even further away. Games like The Witcher 3 have a lot of vegetation, which is more complex than a building and requires a lot more CPU resources too.

The trailer ran at 30fps and wasn't the 'best' anti-aliased quality, which would be more indicative of this generation - albeit on 'X' hardware (or at least a PC with X-type specs, or an X dev kit, which has a few more GPU cores and a LOT more RAM). Point is, you would expect at least 60fps from a next-gen Zen CPU, which I bet would be running at over 3GHz - probably around 3.5GHz.

People focus so much on the GPU, but that is 'generally' being told what to draw by the CPU, and the more powerful it is, the quicker it can draw and process the image - do all the lighting, the shadows, the reflections etc. A standard PS4 can render 4K images (and could output them too), but because it's not that powerful (comparatively), it would take much longer than the 33ms needed to render 30 frames per second. The CPU is the brain, though, that tells the GPU what to render, and it's usually the part that also calculates where everything is in the world, the AI, the physics etc.
A pessimist is just an optimist with experience!
Why can't life be like gaming? Why can't I restart from an earlier checkpoint??
Feel free to add me but please send a message so I know where you know me from...
I wonder why they don't show the gameplay demo. Does it make a difference? Because all these previews are only making me more curious about the gameplay.
@BAMozzy Sorry, but I don't understand most of this tech stuff, but that video you've linked was one of the videos I've watched. The other one was Digital Foundry's "Tech Analysis" video.
@BAMozzy You can forget about a 3.5GHz Zen processor, as it would require a high power draw and would produce high temperatures, so it's out of the question in the console business.
@WanderingBullet To try and simplify: the GPU is mostly for the 'visuals' of a game, and the CPU is the brains that tells the GPU what to draw. Both of these have to be able to do all their calculations and rendering in either 33ms (for 30fps) or 16.6ms (for 60fps). If either takes too long to do its work, you drop frames. You can save time in the 'render' by making the image smaller (dropping the resolution) in order to hit 30 or 60fps.
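To put rough numbers on that frame budget (a toy sketch - the 30fps/60fps targets come from the post above, everything else is just for illustration):

```python
# Toy illustration: the time the CPU and GPU share to produce one frame.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to simulate and render a single frame."""
    return 1000.0 / fps

print(frame_budget_ms(30))  # ~33.3ms per frame at 30fps
print(frame_budget_ms(60))  # ~16.7ms per frame at 60fps
```

If the combined CPU and GPU work takes longer than that budget, the frame arrives late and you drop frames - which is why lowering the resolution (less GPU work per frame) is a common way to claw the budget back.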
The different style of Cyberpunk - the fact that buildings are quite flat and require far fewer polygons than, say, a tree or bush - means they are not so 'complex' to draw and calculate, even in 3D. This saves a lot of render time and therefore the game can hit a 'higher' resolution without impacting the frame rate. The DF analysis also shows that some of the post-processing pipeline isn't particularly high level, which saves a bit more time on the render - another reason why Cyberpunk could be a higher resolution and a stable 30fps on hardware like the X.
It doesn't mean it looks any less impressive, but from a 'technical' point of view, the polygon count could be significantly lower because of the different environments, which means the GPU and CPU don't have to calculate and render so many different points. It's easier to track four points and draw a square than it is to track all the points of 20 triangles and draw those to make a more complex 3D object. In other words, it's easier to draw a building than a tree - especially in 3D. A flat road is a lot less taxing than a dirt track with ruts from carts, plus all the grass, shrubs etc.
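The square-vs-20-triangles comparison can be put in numbers (the 20-triangle figure is from the post; the no-shared-vertices assumption is a worst case I've added for illustration):

```python
# A flat square is just 4 points to track; a 20-triangle tree mesh,
# assuming no shared vertices (worst case), needs 20 * 3 = 60.
square_vertices = 4
tree_triangles = 20
tree_vertices = tree_triangles * 3

print(tree_vertices)                    # 60 points per tree
print(tree_vertices / square_vertices)  # 15.0x more points to track
```

Real meshes share vertices between triangles, so the gap is smaller in practice, but the direction of the comparison holds.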
Because the environment is simpler from a purely technical perspective, CD Projekt can increase the polygon count elsewhere - like in the number of people and the LODs (levels of detail) at greater distances. Objects that are further away are often simplified - meaning the polygon count is lower, because the object is too 'small' on screen for the extra detail to make much difference - to save both object calculation and render time. It's easier to track and calculate 200 points than 2,000, and there's less to draw too.
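That distance-based simplification can be sketched as a tiny LOD picker (the distance thresholds and the middle polygon count are invented; only the 200-vs-2,000 contrast comes from the post):

```python
# Toy LOD table: (maximum distance in metres, polygon budget).
# Thresholds are hypothetical - real engines tune these per asset.
LODS = [(20.0, 2000), (60.0, 500), (float("inf"), 200)]

def polygon_budget(distance_m: float) -> int:
    """Pick how detailed a mesh to draw based on camera distance."""
    for max_dist, polys in LODS:
        if distance_m <= max_dist:
            return polys

print(polygon_budget(10.0))   # 2000 polys up close
print(polygon_budget(100.0))  # 200 polys far away
```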
What this means is that it's much more likely to hit 4K/30 on an X than something like The Witcher 3 - and therefore much more likely to be running on current hardware. In theory, it should scale down well too, because the base hardware is only expected to render at 1080p - 25% of the pixel count of 4K. Being in first person can save both render time and CPU usage as well, since you don't have such a complex player character to track and draw.
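That '25% of the image size' figure is straightforward pixel arithmetic:

```python
# 1080p is exactly a quarter of 4K UHD's pixel count.
uhd_pixels = 3840 * 2160  # 4K UHD: 8,294,400 pixels
fhd_pixels = 1920 * 1080  # 1080p:  2,073,600 pixels

print(fhd_pixels / uhd_pixels)  # 0.25
```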
@NecuVise Maybe - maybe not. A Ryzen 1700 up-clocked to 3.7GHz draws up to 65W, yet a PS4 during gameplay draws ~140W - more than double. AMD's Zen is far less power hungry than Intel's i7, for example. The 'heat' issue can be resolved by a decent cooling system, and the X's 'vapour chamber' solution is incredibly effective. 3-3.5GHz could well be within both the power draw and temperature range for a console. An 1800X starts at 3.6GHz, so a 'downclocked' custom version wouldn't have the power draw or heat issue either...
The PS4 Slim obviously dropped the power draw by moving to 16nm, so if they do go 'smaller' - say 7nm - then the power draw will also be lower, as will the heat generated. The OG PS4 is much higher than the OG XB1 on power consumption, but still not as high as the OG PS3. The Xbox One X can hit up to 190W, but it averages around 175-180W during intensive 4K HDR gameplay in Gears 4 - lower in Forza 7 (160-170W) and in Rise of the Tomb Raider's Geothermal Valley (around 155-160W). That's with a much less efficient CPU too, and it shows the kind of power draw consoles could target. A Zen (at up to 65W) could well be within range - let alone after dropping it a bit by downclocking to reduce both power and heat, plus improvements in heat dissipation methods, better efficiency, smaller die size etc...
Edit: Just remembered that the Xbox 360 had a 3-core CPU running at 3.2GHz - that's double the speed of a PS4's CPU, albeit with a lot fewer cores. Point is, though, a 3-3.5GHz CPU wouldn't be unique for a console...
Hands down, the best conference goes to Devolver Digital. Now that is how a conference should be done. Can't see Square Enix doing the same next year though.
Life is more fun when you help people succeed, instead of wishing them to fail.
Better to remain silent and be thought a fool than to speak and remove all doubt.
Topic: E3 2018