Steam+ Deals Mega Thread (All PC Gaming Deals)

Neuro5i5

This thread will attempt to provide a place to discuss past/present/future PC gaming deals. While the focus is mainly on Steam games, any standout sales from other stores may also be presented. I will not be updating every Daily/Weekly/etc. sale. The tools to help individuals become smarter shoppers are provided below.

See this POST for links to store sale pages, threads of interest and other tools to help you become a more informed PC game shopper.
 
The VRAM issue did make me notice one thing: Nvidia makes the 1050 in 4GB, but the 1060 in 3GB and 6GB. Is there any reason they didn't go with the traditional 4 and 8?
That they could "get away with it" at the price point they want... (aka their new tech is such an improvement in efficiency that they don't need to brute-force it with more RAM. I'm pretty sure the 1080 is way more powerful than the Titan X but consumes much less power.)

 
I can attest to the generosity of CAGs as a lot of my current build is made up of parts donated from CAGs who felt pity for me fakeybroing games and then running them at 800x600. That and a lot of refurbished Hitachis. Sure, some may ask why not put the money you waste on fakeybroing towards your PC instead, but that would be an asshole thing to think and whoever thinks that is just killing the spirit of giving and generosity and deserves nothing.

Unfortunately, I've never returned the gratitude with a Play of the Game or even average gameplay. Usually the bros are yelling at me like "WTH are you doing Fox?" "I haven't seen Fox all game" and "I spawned on Fox and he's in the middle of nowhere doing nothing again." I'm sure some of them are even regretting their generosity now and wishing they'd helped out a better gamer. Well, it's too late for that, you guys are stuck with me.
The only currency Fox pays in is Mom insults. It's his personal gold standard.

 
I don't know about him, but that's what I'm saying. I've only played DA:O and ME1, and whenever plot was happening I couldn't wait for the game to move back to gameplay, and whenever gameplay was happening I couldn't wait for the game to move back to plot. It was a weird succession of different kinds of boredom, and I couldn't stand the large majority of the characters.

Both games had good party banter, though.
DA:O I thought had a boring, by-the-numbers plot, but I thought ME1 was pretty well done.

 
DA:O I thought had a boring, by-the-numbers plot, but I thought ME1 was pretty well done.
I actually liked the plots of both... a lot. I'm not sure which I liked better, truth be told, but I think I'm inclined to lean towards DA:O, as I feel it accomplished in one game what ME was trying to do in three... and failed. (It had decisions that I felt really mattered in the end.)

Gameplay-wise... I honestly cannot judge DAO fairly, as I played on console and it kind of hamstrung combat, but I would say Mass Effect was better... Granted, it took some leveling/better equipment to get there.

 
I looked up the video where it shows the head on the right in motion and it's even more disturbing. I'm getting chills even thinking about it.
Good morning everyone.

Fox, don't click...

[GIF]

 
I agree with him.

Also, there's another little concern. While a Fury is a much better GPU than a 480 or a 1060, it also has higher power consumption than either, so it wouldn't be a good idea to run it with a cheap off-brand PSU.
Heat, power, noise are all concerns for the R9 Fury, but it's still the best performer in that price range and all you'll need for 1080p. You're going to run into a handful of games that will utilize more than 4GB of VRAM at 1080p, but you're not going to run into any that require it. You'll have to lower settings a little, but not much, in those games, and you'll still get better performance in most games with the R9 Fury than with an 8GB 480.

 
DA:O I thought had a boring, by-the-numbers plot, but I thought ME1 was pretty well done.
I had high expectations going into DA:O, based on BG:II. In games like these, the setting and world are very important to me. DA:O seemed very derivative. The tactical combat was good, though. I enjoyed The Witcher 1, for all its jankiness, more than DA:O just because its world was more interesting.

 
Is there any reason they didn't go with the traditional 4 and 8?
As I recall, the 1060 3GB was a response move to AMD's cards, which explains why the 3GB version is crippled all around, versus the 480 cards, which are identical to one another in GPU and differ only in memory.
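
For reference, the spec sheets bear that out - a quick sketch (core counts are from memory, so double-check them; and core counts only compare within a vendor's own family, not NVidia vs. AMD):

# Published core counts, from memory -- verify before relying on them.
specs = {
    "GTX 1060 6GB": {"cores": 1280, "vram_gb": 6},
    "GTX 1060 3GB": {"cores": 1152, "vram_gb": 3},  # ~10% fewer CUDA cores, not just less VRAM
    "RX 480 8GB":   {"cores": 2304, "vram_gb": 8},
    "RX 480 4GB":   {"cores": 2304, "vram_gb": 4},  # same GPU; memory is the only real cut
}
for name, s in specs.items():
    print(f"{name}: {s['cores']} cores, {s['vram_gb']}GB VRAM")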

DA:O I thought had a boring, by-the-numbers plot, but I thought ME1 was pretty well done.
DA:O had a stock plot but the use of that plot was well done in my opinion. Enough so that I've played through it four times now despite no longer being shocked that an ancient evil is once again threatening feudal pseudo-Europe.

Heat, power, noise are all concerns for the R9 Fury, but it's still the best performer in that price range and all you'll need for 1080p.
1080p is still the most popular resolution, with 44% usage according to Steam. Second place is 1366x768 with 24%. Resolutions higher than 1080p make up only 4% of Steam players [edit: These are single-screen stats only and don't count multi-screen users]. Anyone who is currently content with 1080p and not planning on upgrading any time soon will probably be safe for years to come -- even after 4K becomes the norm, the 1080p holdouts will have several years of legacy support.

In fact, if you ever want to feel good about your computer, go read the Steam hardware survey. Most people are running potatoes -- the most common VRAM is 1GB with 2GB in second place, and the most common CPU is between 2.3-2.69 GHz. At least 4 cores finally became the norm over 2 cores; by a margin of, uh, 1%.
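
If you want to see why those resolution tiers matter so much, the raw pixel counts tell most of the story; GPU load scales very roughly with pixels pushed per frame. A little illustrative arithmetic:

# Illustrative pixel-count math for the common resolution tiers.
resolutions = {
    "1366x768": (1366, 768),
    "1080p":    (1920, 1080),
    "1440p":    (2560, 1440),
    "4K":       (3840, 2160),
}
base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} megapixels, {px / base:.2f}x the pixels of 1080p")
# 4K is literally 4x the pixels of 1080p, which is why the 1080p holdouts
# will coast for years while 4K chews through even high-end cards.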

 
DA:O I thought had a boring, by-the-numbers plot, but I thought ME1 was pretty well done.
Eh, I think most BioWare games follow the typical fantasy formula of "Bad guy is coming to destroy/take over the land/world; you're the Chosen One; Chosen One rounds up Chosen allies from all kinds of different races + cultures to save the world from Bad Guy!"

For me, it's usually the character interactions, character development, dialogue, and superb writing that make BioWare games great. You can go through most of BioWare's games - BG1; BG2; DAO; DAI; DA2; SW:KOTOR; the Mass Effect series; Jade Empire; NWN Platinum - and most of the party members had great characters, back-stories, and fantastic banter. Not many companies are on BioWare and Obsidian's level with this stuff, normally.

I think DA2 had the most interesting + different structure for a BioWare RPG, w/ its 3 Acts each being their own 10-20 hour campaign + the very personal nature of the story, spread over the course of numerous years w/ Hawke and his family. Problem is: that game's design + format was WAY too experimental for BioWare, and it should've spent WAY more time in development. There were way too many corners cut in DA2 - recycled areas; too many enemies jumping from the rooftops in the City of Kirkwall; dumbed-down strategic combat (especially compared to DAO); a lack of outfitting + gearing up party members' equipment; and other issues that plagued DA2 for many players.

If there's one game I'd love to see completely overhauled, redone, remastered, re-released, revamped, and actually properly finished - DA2 would be the one. It should've been one of BioWare's best, but it ended up being one of their worst. And when one of their worst is, I still think, decent even despite its litany of issues, that's saying a lot about the usual very high quality of their other games.

 
A big concern if you ask me... even if you're only gaming at 1080p. We're just seeing too many cards with 6+ GB for me to think a card with 4GB is going to be future-proof enough to warrant 250 bucks.
If I was aiming for a new card - yeah, I'd be looking for 6+ GB of VRAM. Plus, the 1060 6GB looks to be capable at 1440p. I seem to do fine w/ most games on my 3.5GB GTX 970 w/ at least Medium settings here at 1440p. In many cases, I do a mix of Medium + High settings w/ performance above 30FPS. I'm usually happy anywhere in the 40FPS-60FPS ballpark, as long as it remains stable (i.e. the framerate stays at a certain spot most of the time, doesn't take too many big hits, and barely moves up + down when losing or gaining a few frames).

Except probably Homefront: The Revolution. That game has tons of flickering + shadow issues for me when not run at my monitor's native resolution (i.e. 4K is my monitor's native res) with any sort of uncapped framerate. At 4K, sure it runs - but it runs like crap, albeit with no flickering shadow + texture issues. But I will not tolerate sub-25 FPS, and I've seen anywhere from as low as 10FPS to as high as 25FPS. No thanks; I'll just have to live w/ 1440p at 30FPS and a forced V-Sync On (via NVidia Inspector), just to get rid of the crummy texture flickering + shadow issues.
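
(Back on the "stable" point - if anyone wants to put numbers on it, here's a rough sketch of how I'd summarize a frame-time log. The filename is made up; capture tools like PresentMon or FRAPS can export something close to this, one frame time in milliseconds per line, with minor parsing differences:)

# Sketch: summarize a frame-time log (one frame time in ms per line).
# "frametimes_ms.txt" is a hypothetical file; adapt to your capture tool's output.
import statistics

with open("frametimes_ms.txt") as f:
    frame_ms = [float(line) for line in f if line.strip()]

avg_fps = 1000 / statistics.mean(frame_ms)
p99 = sorted(frame_ms)[int(len(frame_ms) * 0.99)]  # 99th-percentile frame time
print(f"average FPS: {avg_fps:.1f}")
print(f"1% low FPS:  {1000 / p99:.1f}")  # a big gap from the average = stutter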

 
If I was aiming for a new card - yeah, I'd be looking for 6+ GB of VRAM. Plus, the 1060 6GB looks to be capable at 1440p. I seem to do fine w/ most games on my 3.5GB GTX 970 w/ at least Medium settings here at 1440p. In many cases, I do a mix of Medium + High settings w/ performance above 30FPS. I'm usually happy anywhere in the 40FPS-60FPS ballpark, as long as it remains stable (i.e. the framerate stays at a certain spot most of the time, doesn't take too many big hits, and barely moves up + down when losing or gaining a few frames).

Except probably Homefront: The Revolution. That game has tons of flickering + shadow issues for me when not run at my monitor's native resolution (i.e. 4K is my monitor's native res) with any sort of uncapped framerate. At 4K, sure it runs - but it runs like crap, albeit with no flickering shadow + texture issues. But I will not tolerate sub-25 FPS, and I've seen anywhere from as low as 10FPS to as high as 25FPS. No thanks; I'll just have to live w/ 1440p at 30FPS and a forced V-Sync On (via NVidia Inspector), just to get rid of the crummy texture flickering + shadow issues.
I'm far from an expert in these things, but I've spent most of the weekend reading up on them, and all the nerds tend to agree that going for the 1060 6GB just because it has more VRAMZ is short-sighted and not based on actual performance. The R9 Fury has a different type of memory (HBM, I think) that makes it a lot faster than even the 6GB 1060. Here's a video (which just lists FPS in different games) where the R9 Fury beats out the 1060 in almost every test, even at 4K and even with "only" 4GB of VRAMZ:

https://www.youtube.com/watch?v=qePzD5gXYio

 
In fact, if you ever want to feel good about your computer, go read the Steam hardware survey. Most people are running potatoes -- the most common VRAM is 1GB with 2GB in second place, and the most common CPU is between 2.3-2.69 GHz. At least 4 cores finally became the norm over 2 cores; by a margin of, uh, 1%.
For my gaming laptop = 4K 15.6'' laptop screen; also tied to 1080p 24'' monitor (when not going mobile); runs most games @ 900p-1080p; 4GB 960M; i7 4720HQ Haswell; 16 GB RAM; W10 64-bit.

For my gaming rig = 4K monitor; often using 1440p w/ AA+AF cranked up; i7 950 Bloomfield; 3.5 GB 970; 16 GB RAM; W7 64-bit.

Man, does that Steam hardware survey make me feel good; damn good...

[GIF]


 
I'm far from an expert in these things, but I've spent most of the weekend reading up on them, and all the nerds tend to agree that going for the 1060 6GB just because it has more VRAMZ is short-sighted and not based on actual performance. The R9 Fury has a different type of memory (HBM, I think) that makes it a lot faster than even the 6GB 1060. Here's a video (which just lists FPS in different games) where the R9 Fury beats out the 1060 in almost every test, even at 4K and even with "only" 4GB of VRAMZ:

[R9 Fury v. 1060 6GB @ 1080p + 2160p/4K]

https://www.youtube.com/watch?v=qePzD5gXYio
Since we're at it - these 1440p numbers are also interesting on 1060 vs. R9 Fury, as well:

https://www.youtube.com/watch?annotation_id=annotation_715761919&feature=iv&src_vid=qePzD5gXYio&v=KvpPY4wuSbI

Chances are, though - I currently have very little intention of changing video cards again so soon (unless, of course, I get an insane deal) and/or ever swinging in AMD's direction, so....

[shrug GIF]


More VRAM will help w/ actually running the higher settings + graphics features that newer and higher-end games have - i.e. some games have settings that can go over 4GB, such as Doom, whose Nightmare graphics settings require a 6GB card.

I guess it depends on the user and where they're spending their money - budget; enthusiast; or above.

Do you go NVidia? Or AMD?

What's your budget for a card?

Do you want a card that can handle 1080p? 1440p? 4K?

Do you want more VRAM? Or better performance w/ less VRAM? Or pony up way more $$ for both?

 
Since we're at it - these 1440p numbers are also interesting on 1060 vs. R9 Fury, as well:

https://www.youtube.com/watch?annotation_id=annotation_715761919&feature=iv&src_vid=qePzD5gXYio&v=KvpPY4wuSbI

Chances are, though - I currently have very little intention of changing video cards again so soon (unless, of course, I get an insane deal) and/or ever swinging in AMD's direction, so....

[shrug GIF]


More VRAM will help w/ actually running the higher settings + graphics features that newer and higher-end games have - i.e. some games have settings that can go over 4GB, such as Doom, whose Nightmare graphics settings require a 6GB card.

I guess it depends on the user and where they're spending their money - budget; enthusiast; or above.

Do you go NVidia? Or AMD?

Do you want a card that can handle 1080p? 1440p? 4K?

Do you want more VRAM? Or better performance? Or pony up way more $$ for both?
I'm not in the market for a card either; I was just posting the deal and clarifying the misinformation a lot of people are posting, yourself included. The R9 Fury outperforms the 6GB 1060 even at 1440p in a lot of games. At the end of the day, people are going to buy what they want, but any future-proofing is just speculation. Doom is like the one anomaly of a game that is well optimized and could use more than 6GB of VRAM. People can base purchases on those anomalies, but I wouldn't recommend it. You even posted that you would go for at least 6GB, but again, that's misinformed, because we have a 4GB card that outperforms both a 6GB 1060 and an 8GB 480 in most games at most settings.

These tests from the video I posted (and even the one you posted) show that, in the case of the 1060, more VRAM doesn't help with actually running higher settings and graphics features compared to the R9 Fury (as you claim with no proof). Why? Because VRAMZ doesn't tell the full story. In this case you have a card with less VRAMZ, but faster VRAMZ and better performance. Plus, we've all been through stages of GFX cards where companies just stuff every iteration of a card chock full of as much VRAMZ as they can, just because, and not because it actually makes a difference in real-world gaming performance.
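
And if you want the "faster VRAMZ" part in actual numbers: peak memory bandwidth is roughly bus width times per-pin data rate. Back-of-the-envelope, using the published specs as I remember them (double-check before repeating them anywhere serious):

# Peak memory bandwidth: bus_bits / 8 * gbps_per_pin = GB/s.
# Specs from memory -- verify against official spec sheets.
cards = {
    "R9 Fury (4GB HBM)":    {"bus_bits": 4096, "gbps_per_pin": 1.0},
    "GTX 1060 (6GB GDDR5)": {"bus_bits": 192,  "gbps_per_pin": 8.0},
}
for name, c in cards.items():
    print(f"{name}: ~{c['bus_bits'] / 8 * c['gbps_per_pin']:.0f} GB/s")
# ~512 GB/s vs ~192 GB/s: less VRAM, but much faster VRAM.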

 
DA:O had a stock plot but the use of that plot was well done in my opinion. Enough so that I've played through it four times now despite no longer being shocked that an ancient evil is once again threatening feudal pseudo-Europe.
Oh, I don't disagree at all. I probably didn't articulate what I meant that well in my original post - just comparatively speaking. Although BioWare did a good job w/ the presentation of the plot and the character interactions in DA:O, ME1's plot definitely hooked me better.

If there's one game I'd love to see completely overhauled, redone, remastered, re-released, revamped, and actually properly finished - DA2 would be the one. It should've been one of BioWare's best, but it ended up being one of their worst. And when one of their worst is, I still think, decent even despite its litany of issues, that's saying a lot about the usual very high quality of their other games.
I agree, I think DA2 gets slammed a bit too much. Not one of their better games, but I enjoyed it.

 
Most people are running potatoes
SeasonalCAG 24hrs of Work On Sunday/Hardware Blues:
That quote makes me miss TheHangout. Wags and Trav know what I'm talking about.

So I've been off Steam for about a week or so because I have been trying to troubleshoot some BSODs. First my PC would just shut down with a different BSOD every time. Some pointed to drivers, others to memory. Tested the memory and it came back good; even bought new memory to test, and uninstalled any new drivers and Windows updates. Still BSOD. Tried every sane and crazy trick I could find, even did a clean install several times. Then my computer would not get past the BIOS. So I started to check connections, reseated the memory, CPU and GPU, and ended up taking my whole computer apart.

Then, just by luck, I looked at my 24-pin connector and noticed there was a pin missing (didn't know this was normal at the time), so I looked up my PSU and found out my custom cables had the missing pin in the wrong place.

So after more research, this could fry my CPU, mobo, PSU and more.

So now I have bought a 7700k, Asus Z270 Tuf Mark 1 and Seasonic Prime Titanium 750w. Wish I would have known this was going to happen before buying the 1080 Ti that is sitting in its box with no home - I would have waited on that. So instead of waiting for RMAs that may or may not happen, rather than just throwing out the bathwater I threw out the baby with it. If I can get RMAs for the CPU, mobo and PSU then I can sell them or make a backup rig in case something like this happens again.

On phone, so my bad, can't spoiler, but...
TL;DR: PC fucked up, didn't fix it, just bought new shit instead.

[attachment=25961:IMG_20170316_180852.jpg]
 
SeasonalCAG 24hrs of Work On Sunday/Hardware Blues:
That quote makes me miss TheHangout. Wags and Trav know what I'm talking about.

So I've been off Steam for about a week or so because I have been trying to troubleshoot some BSODs. First my PC would just shut down with a different BSOD every time. Some pointed to drivers, others to memory. Tested the memory and it came back good; even bought new memory to test, and uninstalled any new drivers and Windows updates. Still BSOD. Tried every sane and crazy trick I could find, even did a clean install several times. Then my computer would not get past the BIOS. So I started to check connections, reseated the memory, CPU and GPU, and ended up taking my whole computer apart.

Then, just by luck, I looked at my 24-pin connector and noticed there was a pin missing (didn't know this was normal at the time), so I looked up my PSU and found out my custom cables had the missing pin in the wrong place.

So after more research, this could fry my CPU, mobo, PSU and more.

So now I have bought a 7700k, Asus Z270 Tuf Mark 1 and Seasonic Prime Titanium 750w. Wish I would have known this was going to happen before buying the 1080 Ti that is sitting in its box with no home - I would have waited on that. So instead of waiting for RMAs that may or may not happen, rather than just throwing out the bathwater I threw out the baby with it. If I can get RMAs for the CPU, mobo and PSU then I can sell them or make a backup rig in case something like this happens again.

On phone, so my bad, can't spoiler, but...
TL;DR: PC fucked up, didn't fix it, just bought new shit instead.

[attachment: IMG_20170316_180852.jpg]
Man, that sucks, especially since I know how much you love tweaking your rig. When did you install those custom cables? Where did you get them?

 
I agree, I think DA2 gets slammed a bit too much. Not one of their better games, but I enjoyed it.
Yeah, I think in this particular game, BioWare's characters - Isabela, Varric, Fenris, Aveline, and especially Merrill - were really what kept me going. I thought they were excellent, TBH.

Plus, the interesting DA2 3-Act structure felt like 3 mini-RPG campaigns and stories in one RPG. For me, that was a refreshing set-up, even though it didn't always execute perfectly on all cylinders. I still think Act 2 was the best Act, TBH. While Act 3 set up DAI, Act 2 felt the most complete + was the most interesting of the acts; that was the act with the Qunari in Kirkwall.

I think, following DAO, it was always going to be hard to top that one - especially since it was a near-perfect old-school-style CRPG w/ somewhat modern graphics. It really helped that DAO was in the making for so damn long, too (around 7 years, IIRC) - unlike the half-cooked DA2. BioWare finally ditching D+D, not dealing w/ WoTC + Hasbro (so they could do whatever content they felt like), and doing their own M-rated dark fantasy was just refreshing, IMHO.

Its issues and problems are probably exactly why I'd love to see DA2 get re-released w/ all its DLC. Given the repetitive areas + nature of the game, I'm sure the Legacy DLC and the Assassin DLC could freshen some things up a bit. Would've been nice to have seen that Exalted March expansion come to fruition - but they scrapped it and we got the excellent DAI instead.

 
It's a shame too, because a lot of the ME DLC is really, really good. Leviathan is pretty much essential, lore-wise. I've wanted to replay the series on PC, but not w/out the DLC... if they released the trilogy remastered with all the DLC, I'd be hard-pressed not to pay the full $60 for it.
[funny animal pic]


 
I'm not in the market for a card either; I was just posting the deal and clarifying the misinformation a lot of people are posting, yourself included. The R9 Fury outperforms the 6GB 1060 even at 1440p in a lot of games. At the end of the day, people are going to buy what they want, but any future-proofing is just speculation. Doom is like the one anomaly of a game that is well optimized and could use more than 6GB of VRAM. People can base purchases on those anomalies, but I wouldn't recommend it. You even posted that you would go for at least 6GB, but again, that's misinformed, because we have a 4GB card that outperforms both a 6GB 1060 and an 8GB 480 in most games at most settings.

These tests from the video I posted (and even the one you posted) show that, in the case of the 1060, more VRAM doesn't help with actually running higher settings and graphics features compared to the R9 Fury (as you claim with no proof). Why? Because VRAMZ doesn't tell the full story. In this case you have a card with less VRAMZ, but faster VRAMZ and better performance. Plus, we've all been through stages of GFX cards where companies just stuff every iteration of a card chock full of as much VRAMZ as they can, just because, and not because it actually makes a difference in real-world gaming performance.
Dishonored 2 can eat up around 7GB of VRAM, too - again, another "anomaly." That means one would need either a GTX 1070, a 1080/1080 Ti, or a higher-end 8GB AMD card just to even run it at some of its higher settings.

Of course, this is speculating - but I think w/ the 4K push, we're going to see more of these games eating up more VRAM, especially at higher settings. 1440p is fine w/ my 3.5GB 970 at Medium to High, but never Ultra on newer games. 4K can vary; it depends per game. And usually, 4K's a solid "No" for me... unless it's MK XL or WWE 2K16, which don't have huge arenas or a ton of things going on. Those 2 games are fine for me at 4K.

And yeah, it doesn't surprise me to see more cards w/ more VRAM offered, even within the same series of card. We saw it w/ the 960, where there were 2GB and 4GB flavors, and in most cases they perform around the same. We've seen it w/ the GTX 1060, w/ the 3GB + 6GB versions. Both times, w/ the 960 and the 1060, for the extra $50 I think you're better off w/ the double-GB version, just to have a VRAM buffer so you can toss up higher settings + future-proof it, even if it's just a little bit.

Yes, we know the 4GB R9 Fury defeats the GTX 1060 6GB in terms of performance. It's no real surprise, given the Fury's such a high-end product. It's older, runs way hotter, and has less VRAM.

For some, though, they might not go for that card b/c it's AMD. I'm an NVidia guy and normally support them - even despite having quite a few issues w/ their drivers of late - so don't be surprised if you hear me picking NVidia over AMD, not picking AMD, or not even mentioning AMD in most instances.

 
WOW.

GamersGate actually region-locked the game for me. The base game and all the DLC? Just fine. But not this one. Looks like it was just too cheap for them to sell to me.
Damn TrumpCat and his Walls!

First physical Walls, now digital. Ugh.

He must've struck some kind of deal w/ Ashot P.

#MakeGamersGateGreatAgain #BreakTheDigitalWallDown #CAGsWantToBuyGamesAnywhere

#NoRegionLocking

 
Man, that sucks, especially since I know how much you love tweaking your rig. When did you install those custom cables? Where did you get them?
SeasonalCAG Cable Madness:

I think it was last summer or later, so it could have been slow-cooking my components. They were the Bitfenix cables, and it's funny because I usually do an insane amount of research when upgrading, but I didn't because I figured it's just colored cables. They were the cheapest at the time, beating EVGA, CableMod and others on cost. Also, it could be something else I'm missing, but after trying everything else, most signs pointed to replacing/RMAing either the CPU, mobo or PSU, and aintnobodygottimefothat.gif!
 
Dishonored 2 can eat up around 7GB of VRAM, too - again, another "anomaly." That means one would need either a GTX 1070, a 1080/1080 Ti, or a higher-end 8GB AMD card just to even run it at some of its higher settings.
I would hope that, when buying a $200 GPU, you're accepting that you may not be able to play at the highest settings and may even have to settle for "medium" in some instances.

Fortunately, in this day and age, "Medium" still looks pretty good.

 
LEGO Marvel and LEGO Marvel Avengers are both far and away better than the early LEGO games like Batman. But as much as I love them, I have to space them out a bit since they're all so similar. Don't be discouraged if their LEGO enthusiasm wanes after they finish the first game -- they'll be back.
Yep. Lego burnout has a shorter duration than Disgaea or Civ burnout.

 
I would hope that, when buying a $200 GPU, you're accepting that you may not be able to play at the highest settings and may even have to settle for "medium" in some instances.

Fortunately, in this day and age, "Medium" still looks pretty good.
Of course, that's what I'd expect with any card: not to max games out. I don't normally max them out here anyway. I'm not running games here at 4K on Ultra, you know?

I normally aim for 1440p on my desktop PC @ Medium or above with 30FPS+. Preferably 60FPS, if possible.

On my gaming laptop PC - 900p-1080p @ Medium or better with 30FPS+.

I usually go from there and tweak things to get more performance or graphical quality, if need be. Depends. Sometimes I'll mix Medium and High settings, or High and Ultra settings - whatever I need to make me happy w/ a decent mix of nice graphics + nice performance.

Often, I turn off V Sync to get more frames + performance, as long as nothing important causes the game to have issues.

Any issues with shadow flickering + texture flickering (Homefront: The Revolution), graphical tearing, hit detection going out the window, physics going out the window (Fallout 3 + NV), and other things can cause me to force V-Sync On or force on NVidia Fast Sync, if necessary.

 
I would hope that, when buying a $200 GPU, you're accepting that you may not be able to play at the highest settings and may even have to settle for "medium" in some instances.

Fortunately, in this day and age, "Medium" still looks pretty good.
Not to mention, the only assertion I've made is that the R9 Fury will have you covered at 1080p for the foreseeable future, not 1440p, not 4K, not 5 years from now when we'll be on the 2080 with 64GB of VRAMZ.

 
Of course, that's what I'd expect with any card: not to max games out. I don't normally max them out here anyway. I'm not running games here at 4K on Ultra, you know?
Which is why it's not terribly important, when buying a budget-priced card, if a particular game "can" use 7GB or 8GB or 37GB -- the question is, is the game playable with 4GB? If the answer is yes, then you're set. You're always going to have poorly optimized or memory-hungry games that can put away all the GBs you throw at them.

 
Not to mention, the only assertion I've made is that the R9 Fury will have you covered at 1080p for the foreseeable future, not 1440p, not 4K, not 5 years from now when we'll be on the 2080 with 64GB of VRAMZ.
I dunno, but the numbers for 1440p w/ Ultra in many games from that vid I posted (from the same guys that did your vid w/ the 1080p and 4K benchies) look very interesting on the R9 Fury. I'm not an AMD guy, but those numbers are interesting.

With that card, someone probably could drop down to Medium or High and/or turn down/turn off a few other unnecessary things to get 60FPS in many games, I'd guess.

Would have to see more benchmarks for 1440p at Medium or High for that R9 Fury. Don't need a new card yet, but I'm certainly curious.

EDIT:

Which is why it's not terribly important, when buying a budget-priced card, if a particular game "can" use 7GB or 8GB or 37GB -- the question is, is the game playable with 4GB? If the answer is yes, then you're set. You're always going to have poorly optimized or memory-hungry games that can put away all the GBs you throw at them.
You'd have to turn some settings down or off, if you are going to surpass that VRAM usage buffer. It's the sacrifice you make.

I wish more games would show expected VRAM usage in the in-game graphics options menus, like Batman AK does.
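
Until then, on NVidia cards at least, you can watch actual VRAM usage yourself while a game runs. A minimal sketch, assuming the standard nvidia-smi tool that ships with the drivers (AMD folks would need a different utility); Ctrl+C to stop:

# Poll VRAM usage every couple of seconds via nvidia-smi.
import subprocess, time

while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(result.stdout.strip())  # e.g. "3012 MiB, 4096 MiB"
    time.sleep(2)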

 
I dunno, but the numbers for 1440p w/ Ultra in many games from that vid I posted (from the same guys that did your vid w/ the 1080p and 4K benchies) look very interesting on the R9 Fury. I'm not an AMD guy, but those numbers are interesting.

With that card, someone probably could drop down to Medium or High and/or turn down/turn off a few other unnecessary things to get 60FPS in many games, I'd guess.

Would have to see more benchmarks for 1440p at Medium or High for that R9 Fury. Don't need a new card yet, but I'm certainly curious.

EDIT:

You'd have to turn some settings down or off, if you are going to surpass that VRAM usage buffer. It's the sacrifice you make.

I wish more games would show expected VRAM usage in the in-game graphics options menus, like Batman AK does.
Sadly, that's like the only thing Batman AK did right - telling you how much VRAMZ it would gobble up.

By the time you need a new card, the R9 Fury won't even be a factor. You have a 4K monitor, so you'll want something that will easily handle that. I would wait for another similar situation though: new-gen cards pumping out so you can get a high-end card from the past gen for cheaper. And a lot of people say the R9 Fury runs pretty quiet and not super hot (comparatively speaking), although power consumption will always be higher on AMD cards. If my card went down right now and I wanted to get a replacement at the same level, I'd probably go with the 1050 Ti. I don't think you can beat that for $100. But once you get into the $200 range, things get a lot more complicated. If the new 580s drive down prices on the 480s, you might be able to snag an 8GB 480 for $150, which I think would justify the savings over the R9 Fury.

 
SeasonalCAG Hardware Wisdom:

I'm going to preface this by saying it's not directed towards anyone in particular; I'm just bored at work and wanted to say something about the VRAMZ and resolution stuff. There are enough people on the internet that talk shit, so I'm just an asshole IRL, not on the web.

Most people will run resolutions their card is not made for. Can you have a 4K monitor with a 1060? Sure, if you want more screen real estate for surfing or content, but for gaming it's not ideal. People will do it in the console world too: buy a 4K TV for gaming when most games will not push 30fps on it. I am still rocking a 1080p monitor for my gaming, and had a 1080p TV for the consoles when I had them. People probably think it is crazy to run a 1080p 144Hz monitor with a 980 Ti. The reason I do this is because I want to run my games at ultra settings and get 144fps or more for my FPS-bro-loving games. I have a 1440p 75Hz monitor that I use as a second monitor for surfing, mixing music, or when I just want more real estate. For shits and giggles I have run some games on it, but I had to bump down graphics and got a lot of screen tearing, because there is no way I'm using V-Sync. Think I'm just rambling, so basically what I'm trying to say is: just be more realistic with your monitor choice and dump as much as you can on a good GPU.

All I do is work, spend time with the fam or in the mancave with my PC. Don't have expensive hobbies or like to socialize so most my extra money goes to my PC and games.
 
+1

I still rock 1080p on a 1080, which people still proclaim is crazy. But I do it for the same reason as you: so I can have everything set on ultra and still hit 100-200fps+. For shooter/MP games it helps so much with the feel of the game; the whole thing just feels more fluid and fast-paced when you can hit that. Plus there's the added benefit of knowing the card will last me at least until the Volta chips come out in a few years, and probably even past the Volta Ti (hopefully to second-gen Volta), which would be a worthy upgrade.

By that time, games will probably start pushing the 1080 down to the realm of 60-80fps even at 1080p. Maybe by then I'll be ready for 1440p/4K monitors as well to make it worth the while; no real point in buying a 4K monitor to barely be able to drive it these days, may as well wait for better/cheaper 4K monitors to come out when you can really push them.

I've never understood the "get XXX card for 1080p, it's all you'll ever need" thing. Games will continually push and add more and more stuff; hell, just look at the 970. When it came out it was kickin' ass, gettin' ~80-120fps on the current games. Now, with more recent games, it can barely hit 60 at higher settings, and I would expect that trend to continue.
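
On the fluidity point, the arithmetic is simple: the time budget per frame is 1000 ms divided by the framerate, which is why high refresh feels so different:

# Frame-time budget at various framerates: 1000 ms / FPS.
for fps in (30, 60, 144, 200):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")
# 30 FPS = a new frame every 33.3 ms; 144 FPS = every 6.9 ms,
# so motion and input response update nearly 5x as often.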

 
+1

I still rock 1080p on a 1080, which people still proclaim is crazy. But I do it for the same reason as you: so I can have everything set on ultra and still hit 100-200fps+. For shooter/MP games it helps so much with the feel of the game; the whole thing just feels more fluid and fast-paced when you can hit that. Plus there's the added benefit of knowing the card will last me at least until the Volta chips come out in a few years, and probably even past the Volta Ti, which would be a worthy upgrade. By that time, games will probably start pushing the 1080 down to the realm of 60-80fps even at 1080p.

I've never understood the "get XXX card for 1080p, it's all you'll ever need" thing. Games will continually push and add more and more stuff; hell, just look at the 970. When it came out it was kickin' ass, gettin' ~80-120fps on the current games. Now, with more recent games, it can barely hit 60 at higher settings, and I would expect that trend to continue.
I guess my questions would be: when did you buy it, how much did you pay for it, and how many years do you expect it to last you?

I haven't seen anyone here say "get XXX card for 1080p, it's all you'll ever need." The comments have been that if you game at 1080p, certain cards should serve you fine for the foreseeable future (which in computer-speak is probably 2-3 years tops, maybe 3-4). I guess it should be clarified that the assumption is an average user looking for a moderately priced card and happy with the performance you can get in that price range.

Someone wanting 100-200 fps on ultra isn't going to be looking at anything in the $200 price range. I'm not criticizing anyone who wants to drop $600-700 on a video card, but those aren't your average gamers. Average gamers just want to be able to play games at acceptable levels and have their hardware last as long as possible. I would assume most average gamers would be happy getting console level graphics at 1080p.

At the end of the day, I think most of us end up spending about the same on graphics cards, averaged out over the years, just in different ways. You'll have the guy dropping $700 on a high-end card and holding on to it for years, while another guy might buy the latest $100 card every year or two, and another guy will buy a $200 card every 2-3 years.
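
To put rough numbers on that - prices and lifespans below are made up, but typical - the cost per year lands in the same ballpark either way:

# Illustrative cost-per-year for the three buying patterns (numbers are made up).
patterns = {
    "$700 high-end card kept ~6 years": (700, 6),
    "$100 card every year":             (100, 1),
    "$200 card every 2 years":          (200, 2),
}
for name, (price, years) in patterns.items():
    print(f"{name}: ~${price / years:.0f}/year")
# ~$117 vs ~$100 vs ~$100 per year.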

 
I absolutely love that on a 28" 4K monitor - so much space on screen, in browsers, on the desktop, whatever. And I am happy to get away from 1080p on my main PC.

I think my 4K Samsung was a sweet deal at around $250 on sale, which came to around $220 after Amazon Bing GCs. Moved from a 1080p 24" up to a 28" 4K, too. I was planning to move to 1440p once I got the GTX 970, and 1440p monitors were often around that price anyway, or a bit more (esp. IPS or better).

With games, I love going 1440p and having more on-screen. I just crank up AA + AF and I'm good to go; that kills any sort of blur from the 4K-to-1440p downgrade.

 
At the end of the day, I think most of us end up spending about the same on graphics cards, averaged out over the years, just in different ways. You'll have the guy dropping $700 on a high-end card and holding on to it for years, while another guy might buy the latest $100 card every year or two, and another guy will buy a $200 card every 2-3 years.
Yeah, I'd agree. I had something typed up to that effect but figured I was long-winded enough anyway :)

I bought mine after the partner cards came out last summer, and I think it was like $600. I expect it to last me until probably 2019/20-ish, since I think Volta is slated for a 2018 release (a few Volta benchmarks have started to show up randomly). So that probably puts a Ti variant in 2019 and a refresh in 2020. So 4, maybe 5 years' worth of gaming.

So in the end, sure, it does cost more to have a higher-end card; it just depends on what kind of quality/fps/res you want to run your games at. I just see it as: may as well drop it up front, so at least there is some point at which you can play games in their full glory with a slow drop-off, instead of upgrading every year or three to still be at middle/upper-middle graphics settings. Of course, if you're one to stretch hardware further, it will go further if you really want it to and you can hold out longer, but at least at some point in that GPU's timeline you are able to completely max out everything for 1-3 years.

 
Don't forget that we're on the curve and technology is improving exponentially. Buying a high-end card that'll be outperformed by low-end cards in several years is a luxury expense, so I think buying mid- or low-end cards is just fine and will cost less in the long run. Optimization teams do a decent job of making games function without needing 4K, 3D, and body-pillow-attachment support on an Nvidia >9000 GTXGP Titan Colossus.

 
Don't forget that we're on the curve and technology is improving exponentially. Buying a high-end card that'll be outperformed by low-end cards in several years is a luxury expense, so I think buying mid- or low-end cards is just fine and will cost less in the long run. Optimization teams do a decent job of making games function without needing 4K, 3D, and body-pillow-attachment support on an Nvidia >9000 GTXGP Titan Colossus.
Which probably uses less energy as well. At least that's what I tell myself: my budget graphics card is good for the environment.

 
I've always preferred buying something in the second tier of top-end performance, which was always in the $325-350 range, to last me a few years. The best cards used to cost $500 and the second-best cards were just over $300. It used to be the perfect price range to target for price vs. performance. But it seems like that pricing category has all but been eliminated now. Now that category is $400+ and there are multiple cards to choose from.

Things have gotten confusing. There is always something new around the corner coming out soon. It seems like it's never a great time to buy a card. The older high-end models also don't come down in price fast enough.

 
At the end of the day, I think most of us end up spending about the same on graphics cards, averaged out over the years, just in different ways. You'll have the guy dropping $700 on a high-end card and holding on to it for years, while another guy might buy the latest $100 card every year or two, and another guy will buy a $200 card every 2-3 years.
Or you can just play grandma games from YannyC's collection and you will be set for the next 20 years, graphically speaking.

 
It was only a year ago that we were all a-quiver at the thought of getting $550 980 performance for $250 or less with the 480/1060 cards. We must have grown jaded :oldman:
It's also a brand thing. If I posted a $235 4GB 980, I doubt we'd see as many words of warning, and we might even see a few "That's a great deal, that card was $500+ a year ago."
 