I have a feeling this debate should be split off into a separate thread, but since it's here, there are a couple of points I want to chime in on.
[quote name='MSI Magus']If you are a PC gamer and you do not know how to fix a PC(which is still most of the worlds population)you would be looking at $50-$200 to take it in to a PC expert and have them determine the problem, then another $100-$400 for parts and labor. Generally if something on your PC breaks and you can not fix it yourself your looking at $150-$300 easy.[/QUOTE]
This comment (while I understand it's meant to make a point in favor of consoles) depends heavily on your location and your willingness to look up a computer technician on Google, in the phone book, or in the want ads. The prices quoted look like big-box store rates rather than what a smaller or one-man shop charges. If you can find a small or one-man operation and you're a residential customer, many of them will waive the diagnostic fee if you follow their repair recommendations (obviously the diagnostic fee still applies if you take the machine to another company for the actual repair work). Quite often the technician is also an avid gamer, and some friendly chatter during the repair may net you a discounted rate and/or some extra tweaking to make your machine perform better for gaming.
For reference, I run a one-man IT contracting company specializing primarily in small to medium medical offices. I also make house calls for residential customers, and I love to discuss gaming with them. I usually waive my diagnostic fee for residential customers if I handle the repair, unless I have to travel more than 100 miles one way to your location. I also tend to sell parts to gaming customers at cost (even showing them where I order the parts, if my source deals with retail customers) and rely on the installation fee for my income.
[quote name='lordopus99']To summerize, most consoles last 7+ years. In order to keep up with the new big titles on PC, you have to upgrade $200+ every 2 years. You are essentially replacing your system every 2 years vs consoles at every 7 years. To each their own...[/QUOTE]
This is a statement that I see made in almost every debate of this sort (well, usually they say the console lasts 5+ years and computers 1 year). While the numbers are true, I usually consider the context to be wrong for this kind of discussion.
Here's the issue. Consoles are static devices with relatively few changes to the hardware (if any, until recently). While they're usually great quality when they first come out, 2 years down the road they look dated compared to new computers or newer consoles (I'm talking major releases, not tweaks like the new slim versions). The reason is that the hardware is manufactured by a single company, and any one company (no matter how big) has only finite resources to devote to R&D, whether that's straight parts development or researching better parts to incorporate. Computers, on the other hand, have thousands of different companies focusing on hundreds of different aspects of the hardware inside, each constantly putting out a new latest-and-greatest product. As long as you're willing to spend the money, your computer will generally make your brand new console seem dated within a month.
Which brings us to the money issue again. As before, I think most people take this point out of context. Working on the business side of IT, I see a lot of newer/younger companies spending thousands of dollars (sometimes millions) to constantly update their equipment to the latest specs. It's such a common occurrence that a lot of the latest software I see in the medical field has hardware specifications that make me pause and double-check whether I'm looking at a video game's system requirements instead of business software (there's a medical records company out there whose 2010 release lists a minimum requirement of 4GB of RAM, pretty much forcing a 64-bit OS, and a quad-core processor clocked at 3GHz just for the desktop client). However, I also deal with older companies, businesses that have been around for 20-30 years, and see that they're still happily working on servers and computers built in the '80s. Sure, sourcing parts for those machines nowadays may cost as much as a new computer, but the savings they've racked up by avoiding the hardware rat race make those costs look like chump change. (There's also a point to be made about a computer's longevity being tied to proper care and maintenance, but I don't want to go into that here except to ask: when was the last time you did a really thorough dusting inside your computer?)
The same holds true for gaming computers. Yes, I'm a technician and know more about tweaking hardware and settings than the average user, but a lot of what I'm sharing has more to do with how you view your computer and your gaming experience. You don't have to chase the recommended requirements on each new game. The recommended requirements almost always assume recent hardware advances, and if you treat them as the bar, you'll find yourself constantly buying new hardware. The minimum requirements, on the other hand, are (typically) all you actually need to play the game, and they're significantly less demanding. Upgrade only when there's a specific need you really want to fill; don't upgrade just to satisfy a developer's wish list.
How does this apply to the console vs. computer argument? Well, it's true that consoles rarely have an upgrade cost to consider and their repair costs are low, but understand that to achieve this, developers are forced to conform to the existing hardware. A couple of years after a major console hits the market, it's the equivalent of running every game at minimum spec on a computer. That's why you start seeing frame rate issues get worse after 3-4 years of a console being on the market, or this newer issue of consoles rendering at sub-HD resolutions. Yes, a good developer works around the limitation (such as dropping to sub-HD), but in the console vs. computer debate you have to acknowledge that consoles simply have no upgrade option, whereas computers do.
Given this viewpoint, you'll find that a computer's shelf life for gaming is significantly longer than people claim. Yes, you won't get the greatest quality out of your games, but the quality is comparable to the compromises already being forced on console developers and players. My last gaming machine before my current build lasted 6 years with only a $40 video card upgrade in the middle (keep in mind, I'm a tech and I modded and tweaked the living daylights out of that machine). My current gaming computer is now 3 years old, with the only upgrade being a $120 video card this year (well, that and new hard drives, but those are for pack-ratting Taiwan drama videos for my mother, not for my gaming). Even that video card upgrade only came about because the card in my HTPC died; I migrated my gaming card to the HTPC and spent a little extra on a new card for my gaming computer.
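To put some rough numbers behind that, here's a quick back-of-the-envelope sketch. The $800 PC build and $300 console purchase prices are purely my assumptions for illustration; the only figures taken from this thread are the quoted "$200 every 2 years" and my own $40 upgrade over 6 years.

[code]
# Rough amortized hardware cost per year (illustrative assumptions only).
# Assumed purchase prices: $800 gaming PC build, $300 console at launch.

def cost_per_year(base_price, upgrades, years):
    """Initial purchase plus all upgrades, spread over the machine's lifespan."""
    return (base_price + sum(upgrades)) / years

# The quoted model: a $200 upgrade every 2 years across a 7-year console generation.
quoted_pc = cost_per_year(800, [200, 200, 200], 7)

# My previous build: 6 years of service with a single $40 video card upgrade.
my_old_pc = cost_per_year(800, [40], 6)

# Console: assumed $300 purchase, no upgrades, 7-year lifespan (games extra in every case).
console = cost_per_year(300, [], 7)

print(f"Quoted PC model: ${quoted_pc:.0f}/year")   # ~$200/year
print(f"My old build:    ${my_old_pc:.0f}/year")   # ~$140/year
print(f"Console:         ${console:.0f}/year")     # ~$43/year
[/code]

Even with my numbers the console still comes out ahead on raw hardware cost per year, and I'm not arguing otherwise; my point is just that the gap is a lot smaller than the "replace your rig every 2 years" framing makes it sound, and it shrinks further the longer you resist chasing recommended specs.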
As for the consoles... well. I happen to be fortunate enough to own all of the latest-gen models (I gave away my older ones once I made sure my PS3 had hardware PS2 compatibility). I love the games I play on my consoles as much as I love the games on my computers, but in the long run, I doubt I'd bother with a console at all if it weren't for the exclusive titles.