Here I am sitting with my new $499 console. It does 4K and it does some ray tracing. For years, I have been wanting to play Cyberpunk on my PC. But why should I be racing to buy a single $700-$800 PC part just to enjoy Cyberpunk 2077 with slightly better graphics on PC next month?
Forget the 3080 ($700) or 3090 ($1,500), TBH; those carry premium pricing. The 3070 is probably the way to go for PC gamers - which seems to be exactly why Nvidia priced it at $500, the same as a new console, hoping people choosing between PC gaming and a PS5 would just go with the 3070. I'd guess some might even do both (get a PS5 and buy an RTX 3070), too. [shrug]
I think the main draw of a PS5 for most people is the exclusives that won't hit PC or other platforms (or at least not anytime soon) - Spider-Man, The Last of Us series, Uncharted series, God of War series, and anything else exclusive from Sony you can think of.
Yeah, and that's even though Sony's been dabbling on PC lately - i.e. Horizon: Zero Dawn Complete, plus Quantic Dream's Detroit: Become Human, Beyond: Two Souls, and Heavy Rain. It's probably only once almost everybody's already played the game on PlayStation that Sony feels it can make some more money from PC players willing to double-dip. Even when Sony does port something, I'd think we won't see it for six months to a few years, depending on what they feel like doing - probably timed for when they feel PS5 sales of that game are drying up.
Everything else beyond the Sony exclusives - better AMD hardware, SSD support with quick load times, 4K support, etc. - makes $500 look like a really solid deal for those who want the exclusives, want to just pick up and play, get back-compat support for a lot of PS4 titles, and don't want to deal with PC-platform complexities (drivers, hardware pricing, upgrading/updating, juggling lots of different services, etc.).
For me, games like Cyberpunk and even Borderlands 3 - and other games with shooter elements like that - are ones I'd prefer to play with keyboard and mouse. I really hope more consoles support that, TBH, since they're basically mini-PCs these days anyway (especially in their architecture).
But hearing that performance of that on PS5 is 4K60 - that's awesome. I do wonder whether, if the PS5 ever added 1440p support, it would hit 90fps or 120fps, which would be awesome.
I've been super impressed with the performance of Borderlands 3 at 4K60 with HDR on the PS5. It's amazing how well it runs, and the graphics look fantastic. It only takes about 10 seconds to load into any new map.
You see now why Nvidia was so eager to get ahead of everyone back in September. They completely blew it. I've gotten tired of trying to give them money since then. Unfortunately, I don't think my GTX 1080 is going to cut it for CP2077 until I get it replaced.
I dunno, did Nvidia really blow it, though? Sure, AMD's competitive as heck again - but the 3070 through 3090 still seem to be sought after, and they were smart to jump when they did. They got everybody (especially 900-series owners and mid-1000-series owners) hyped for a huge increase in performance, and did it real early, putting a lot of their 2000-series problems (especially around ray tracing) to shame this go-around - i.e. RTX and DLSS both seem much better this time.
It's not like AMD has a DLSS-style tech on the market yet (to improve performance with upscaling) - but we can bet they will, sometime soon.
Plus, as we all know, Nvidia will probably refresh the 3000 series next year with KOs, Supers, etc. - which will likely drop prices on the original 3000 cards. And when the 4000 series comes, that'll probably drop 3000-series prices too.
Also, I do wonder how that GTX 1080 will handle Cyberpunk. I'd guess with ray tracing off, CP 2077 on PC would land somewhere between the Recommended spec (1080p at High) and the High spec (1440p at Ultra). The 1080 is no slouch - it was a top card in its day - but we're now two Nvidia generations on, with the 3000 cards out. I'd guess it could still be solid performance-wise, but that'll be with the RTX stuff off or on very low; the GTX 1080 wasn't built for it.
Personally, I dunno if upgrading from a GTX 1080 to an RTX 3070 (or better) is really needed yet. It depends on your resolution, whether you want high framerates, etc. For comparison, a console gets you roughly 1080p60 or 1440p60 at least... or 4K30.
I dunno, but do we really think we'll see 4K60 with ray tracing in open-world, NPC-heavy games like CP 2077? Miles Morales is hitting 4K30. It's why I wish Sony had 1440p support on the PS5 - it'd be better than 1080p but almost as good as 4K; just that happy medium/go-between resolution for performance and eye candy, IMHO.
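To put rough numbers on that "happy medium" idea, here's a quick back-of-the-envelope pixel-count comparison (just arithmetic, nothing PS5-specific - actual framerates depend on far more than pixel count):

```python
# Pixel counts for the three common gaming resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 1440p pushes ~78% more pixels than 1080p...
print(pixels["1440p"] / pixels["1080p"])  # ≈ 1.78
# ...but less than half the pixels of 4K, which is why it
# tends to be the performance sweet spot between the two.
print(pixels["1440p"] / pixels["4K"])     # ≈ 0.44
```

So a GPU that can only manage 30fps at 4K has a plausible shot at 60fps or better at 1440p, since it's rendering under half as many pixels per frame.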
Looks like most stuff on the GTX 1080 right now runs at 1440p60 at Ultra, or 4K30 or better. You could probably wait another generation or so for a 4000-series card, or for some significant price cuts on the 3000s (or the AMD equivalents) much later.
Just saw this cool video on GTX 1080 v. RTX 3070:
https://www.youtube.com/watch?v=qrbDkwdlO4A
Steve (of Gamers Nexus) didn't even seem to think a 1080 Ti to an RTX 3070 is a great jump either (see his Conclusion section):
https://www.youtube.com/watch?v=NbZDERlshbQ