Z0mB13 has raised some interesting points... but I feel that the main points raised don't quite justify expecting games to deliver more "optimization" or "utility" from high-end GPUs. First up, as some members have already pointed out, games and the hardware that runs them enjoy a symbiotic but still separate place in the technology industry. Indeed, I'd argue that games constitute a product segment within the entertainment industry rather than the technology industry per se. While hardware manufacturers do occasionally develop first-party applications and games - e.g. Microsoft, Sony etc. - these are still relatively rare arrangements, and with very good reason: most of them came about through acquisitions rather than those companies setting out from the very beginning to develop both the hardware and the software that runs on it.
In simple business terms, the issue here is really all about risk. While it might seem like a wonderful idea, I would argue that it carries significant risk and can leave a company in an extremely disadvantageous position unless it has very deep pockets or is a cornerstone of its industry. To consider the positions of AMD and nVidia and why their top-end GPUs seem so pointless at times, we have to understand where the majority of their business really comes from and what their product and support strategy really entails.
It's no surprise that PCs represent a much smaller portion of the hardcore gaming market than consoles and, in recent times, mobile devices do. But in order to properly critique this state of affairs - in which high-end GPUs seem like white elephants - we have to understand how most developers strategize their projects. The larger game developers and publishers understood early on that, with the rising complexity of games and the subsequent demand for more content, the cost of developing games has increased exponentially in recent years. I too have been a gamer for a long time, but simply reminiscing about the "good ol' days" and how current titles lack the "innovation" and "sense of novelty" that made those older "standout" titles so fondly remembered means ignoring the changing landscape of the industry and how development cycles and processes shape the end product and how it will be received.
For sure, some elements of a game's development - such as innovative gameplay - may require ingenuity more than time, but other elements such as the modelling, game engine, textures, sound and video are tremendous timesinks. In fact, with the increased capability of new hardware, developers spend ever larger portions of their budgets (time and money) on the latter aspects rather than on creating the "innovative" gameplay that you might desire most of all. With ever more complex game engines come ever more complex requirements, development cycles and validation. It is this potent mix of complex featuresets and assets that has made games, by and large, the expensive developments that we know today. This then brings us to the central point that most hardware aficionados are at pains to rant about - the expensive hardware often does not seem to warrant the expense because the games are not taking advantage of it. To this rant, I would like to propose two ways of looking at the issue.
1) The combination of factors - time, cost and return - basically means that console development is the only justifiable focus for developing the so-called "serious" or hardcore games. In programming and development terms, you always work towards optimizing for the lowest common denominator rather than trying to work the other way round - after all, in terms of audience reception, it doesn't really make sense for marketing to promote a game as a PC title that will be ported to consoles later when consoles represent a much larger share of the pie and are much easier to develop for anyway. A PC-centric development strategy might please the gaming-PC-owning minority, but that in and of itself will not pay the bills half as well as a top-selling console title in most cases. Beyond this, a console-centric development strategy is generally a much better way of seeing a title hit the shelves (and thus see the bills and risks of development covered) much sooner. So, I feel the point here is that one of the most obvious answers as to why hi-end GPUs may not see their full potential realized all the time boils down to how the bulk of games are developed and why it makes sense that they are developed this way.
It's a simple matter of economics...
... and economic sense dictates that you play to the most likely source of a positive return on capital.
But you could look at the other sides of things...
2) The hardware and the games that the hardware runs are not one and the same, nor should they necessarily be. For a GPU developer to develop games ignores two fundamental tenets of business: 1) always play to your strengths, and 2) avoid placing yourself in a situation where the risks of participation, or of creating barriers to entry, far outweigh the potential benefits.
On the first point, developing entertainment software is not a core function or strength of either AMD or nVidia. Going down that path would imperil them in a multitude of ways, starting with the financial risks involved - as consumers, it is easy to underestimate the manpower, time and money it takes to make a modern-day blockbuster. While devs may not always release exact figures, it's safe to say that making something on the scale of, say, Battlefield 3 is a huge undertaking, and all those programmers, modellers, artists and marketing people need to be paid. That kind of money generally isn't going to be recouped from the hi-end-GPU-toting niche PC crowd...
And getting the hardware manufacturers to make their own games might even be akin to economic and political suicide - because somewhere along the line, a conflict of interest is going to arise. Given that they are doing just fine developing hardware that other companies (software developers) build upon, why would they benefit from creating a situation where their product portfolio threatens to become a closed box? As it is, it makes much more sense to provide the platform and the hardware to run it on rather than try to make games themselves.
You could argue that innovative games don't have to be blockbusters, but then we reach the greatest contradiction and also, in my opinion, the answer to it all... the hardware itself is not the problem.
What Z0mB13 seems to be suggesting sounds more like a desire for more technology demos rather than games per se, doesn't it? After all, if what you really want is to see the hardware maxed out, then a tech demo is really where it's at.
Preview...
Truthfully, if you buy any hi-end GPU, you aren't really looking for bang-for-buck, right? Realistically, anyone who is has missed the point - there are much better value propositions out there - and trying to pin the responsibility of purchase justification on software developers is a little unfair, isn't it? Shifting the buck to the hardware developers is also stretching it a little. At the end of it all, if AMD or nVidia tells you that with this GPU you will have the "XXXXXXX most awesome gaming experience ever", it's time to recognize the marketing department doing its work rather than take it as the gospel truth about hardware. You buy a top-end GPU because you would like to experience what the cutting edge of technology can deliver, but that doesn't mean anyone is obliged to deliver the other component of that experience. I wouldn't go so far as to call it "willy waving" universally, because some of the most extreme purchasers of such hardware that I know don't even publicize the fact that they have bought it. However, what I can at least opine is that when purchasing a piece of top-end kit, it's important to appreciate the manufacturer's perspective, and I would reckon that to them, the top-end adopters are really the first ones who help fund the R&D of tomorrow's mainstream PC, console and mobile parts, and who are willing to pay the premium that such a preview entails.
As one from the 8086 days myself, I find that when our generation of gamers/hardware enthusiasts rants, we sometimes get mixed up between our gamer and hardware enthusiast sides. I hope that this post goes a little way towards calming things down.
-neOnEPIC