AMD 7990 to launch in Q1 2012

There is still hope in the indie devs.
 
At least EA gave us the proper version of FIFA this year, lol... but yeah, getting a load of console ports really does take the mick, and there's no point in having a £700 video card when the games are very similar on a £150 console. I don't think you can blame AMD for that though... they're just being awesome.
 
AMD should invest some money back, though. Like, maybe even start their own software department to code up games?

If Microsoft can do it with the Xbox, then why the heck can't Nvidia and AMD do it? Like, actually create exclusive games that will only run on one or the other card?

Because from where I'm sitting there is absolutely no point in buying any of their new cards for single-screen gaming. Put simply, you can get a card now for £250 that can handle Battlefield 3 at ultra settings with 4x FSAA and never dip below 35 FPS as a minimum.

And given that BF3 is the greediest game around, you'd hardly need the extra power for anything else.

Even the new Dirt game (which I do like the look of, even though they're just ripping off Destruction Derby from 1997) is running on EGO 2. So we'll hardly need anything new for that either.

Other than that? Obsidian are making a South Park game, which at best will use the new Skyrim engine, or in the worst case the Fallout 3 one (and it really only needs the latter for some cartoon graphics).

No Crysis 3 (not that I would care, C2 was linear and the suit was watered down) and not much of anything really.

It's hardly making me want to loosen my bank account to the tune of five hundred notes.
 
I don't know if it was said already... maybe even by me (long weekend), but maybe with the 7870 being on the verge of maxing out PCIe 2.0 bandwidth, this is where 3.0 will start to show some difference?
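For rough context, a ballpark calculation (assuming x16 slots, the usual case for these cards): PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, so a x16 link carries about 16 × 5 × 8/10 = 64 Gbit/s ≈ 8 GB/s each way. PCIe 3.0 moves to 8 GT/s with 128b/130b encoding, roughly 16 × 8 × 128/130 ≈ 126 Gbit/s ≈ 15.75 GB/s, so close to double. Whether a single 7870-class card actually saturates the 2.0 figure in games is the open question.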
 
Have to totally agree with you there. And I had a laugh about NFSU 2. It really is far better than any of the new ones. Maybe not visually, but as a game as such, yes. And what difference is there in the new ones? More shadows and details that just pointlessly eat your hardware and don't make any difference. All the BFs and CODs are simply the same stuff, to the point where I just want to vomit, I'm so sick of people abandoning the old versions and buying and playing the new ones just because they're new. People are such milk cows nowadays...
 
It was the content, Nex. A good 25+ hours of gameplay and an enormous city.

Snoop Dogg, Brooke Burke. What do we get now? Some fat knob in a racing suit who waffled on so much in Shift 2 that I stopped playing it.

They're just not putting in the effort any more man, nor the money. Yet they still want £50 a pop.
 
After reading this thread my heart sank a little... I built a gaming rig for BF3 and went with AMD and CrossFireX, and although it works like a dream in BF3, what I've read here is that modern games will NOT make use of it. Okay, so BF1943 WILL, and this is typically a BF machine...

But did I actually waste the best part of £600 on two 6970s back in November?

I've been an AMD and ATi fanboy since back when it was all about Slot As and the transition to Socket As and K7s... I tried one system with Intel and Nvidia and didn't like it, but I find it hard to believe that the guys with multi-GPU systems have been shafted by the manufacturers and game developers, who had no intention of rewarding end users for buying into this level of technology.
 
Well done, Zombie!

No Christmas release? "My days", blimey, that is bad news.

I agree. Games today are mostly cons; they're like small refreshes of the last title, though yeah, they spend loads of time making them. Same for PC components. It's got to be the right time and place for many to justify upgrading. So it's good to have the choice, as long as it isn't raved about as a luxury to own all of it!

I've loads of FPS games and I'm glad I have BULLETSTORM, BF3 and the CRYSIS games, all by EA.

I loved BULLFROG and WESTWOOD; dead now, they are. The old Syndicate Wars (I still run that on Win7 32-bit, along with Bubble Bobble) is what EA has ripped off with this next Syndicate game (I think). It looks, erm, like what it shouldn't. It should be in whatever format the old games were, with tactical nukes on hand.


The next Metal Gear? Hmm, "lost the plot, sir!"

THQ has Warhammer Retribution. That's my fuel for delight; quite a unique classic PC game, that! And third-person games like Lost Planet 2 online are fun. I've just installed Red Faction: Armageddon, hope it's similarly fun.

A GPU upgrade for my rig would benefit the race against global warming, maybe.
 
You guys... you made me create an account on this forum just to be able to reply to this topic.

Anyway, I'm with you Z0mB13 on every point you mentioned throughout the topic. I purchased a Radeon 6990 on day one of release, and boy oh boy, you would not believe how many problems I ran into during the year. The biggest disappointment was AMD's drivers not supporting stereoscopic 3D on CrossFireX or multi-GPU setups, while they clearly advertised on the box that it is stereoscopic 3D ready! But only via AMD's HD3D, which supported what, one game during 2011? Deus Ex?

And they just fixed it a few weeks ago in their latest drivers? Really? After one year? Or is it just because they knew they were about to lose every customer who wanted to try stereo 3D to Nvidia on the next generation of cards?

About PC gaming... it is surprising that after all the money Activision made on Modern Warfare 2, TWO years later they come out with a game on exactly the same engine, but with worse textures? I read somewhere that you would actually need better hardware to run Modern Warfare 2 than 3. Are they so greedy that they don't want to spare a couple of million dollars, out of the billions they made, on creating a new engine for this decade? Or are they scared to come up with something different from Modern Warfare 2, afraid that people would not like it and it therefore wouldn't sell?

You guys remember the old days of PC GAMING? Free updates? Free content? Free maps? Mods? Innovation? $30 a game?

Look at PC gaming now... Modern Warfare 3? Worse looking than its predecessor? A port? A two-year-old engine based on a 13-year-old Quake 3 engine? Raising the bar for retail PC game prices? Since when do we pay $60 for a PC game? Since Call of Duty! And Robert Bowling shouting at us, telling us that he will support dedicated servers? And mouse control? NO SHIT, REALLY?

I enjoyed Trine 2 and Jamestown this year more than any other game... my only remaining hope is in indie developers. But I will always respect Valve, Blizzard and Hi-Rez Studios for dedicating themselves to PC gaming (Hi-Rez Studios for reviving a PC gaming legend: Tribes).

Zombie, if you have not tried Tribes yet, I highly recommend this game. True, it will not require a Radeon 7990, but it is very enjoyable to play. Add me on Steam and I'll gladly give you one of my extra beta keys.
 
There is still hope in the indie devs.

This is where the only real hope is for me.

The likes of EA started gobbling up companies to publish for, inserting timelines - fixed timelines and release dates. We used to get patches that were proper patches, for improvement's sake; now we get more "fix" patches.

It's not an argument I like to get into really, cos I'm biased. Imo, learning about computers and programming them in college, to then go on to write things, is a bad scenario to be in. Personally I wouldn't let you near a compiler if you hadn't been programming off your own back beforehand, and I'd want examples. It'll only get worse. The same goes for people in computer support, for me.

I can remember the first video I saw of BF3 online before the game came out. Yes major graphics, yes major themes with the rocket being fired at a building - but hold on, look to the right in the foreground. There's a flower pot floating a few inches off the wall.

That's how picky I am. I'll pick at anything that shows shoddiness.
 
The good thing about indie devs is they are usually doing it for the love of gaming. I think the big companies used to care, back when their customer base was small enough that its opinions mattered, but consoles have been the death of PC gaming, simply because each console generation impedes game development for seven years or whatever at a time. When the new consoles launch we will see correspondingly better PC games; the issue will come in, say, another seven years, when the console graphics are awful and we are in the same situation as now.

However, unlike most people, admittedly, I don't think consoles are going to last very long.

The reason I think this is that, in looking for new ways to make their respective consoles more appealing and to get more sales, Microsoft and Sony will have to bring something new.

And the idea I've had has already started to come to fruition: I saw an article just this morning on a Samsung TV which has the capability to change its processor and hardware.

What I reckon will happen is that either Sony, Microsoft or Nintendo will start offering consoles with this capability, and every two years or so people can upgrade their graphics card for, say, £80 or something. People on older hardware will have to play newer games on lower settings. Now they won't be too happy with that, and some will upgrade, some might stop gaming, some will get PCs. In the PC industry the cost of a budget gaming PC has been going down for some time.

After a while, I reckon, consoles with upgradeability are going to be seen by a lot of people as just PCs with a different OS, at which point it's the social aspect (e.g. Xbox Live and PSN) that keeps people gaming together on consoles rather than PCs. And then after that, I reckon, the consoles will just die a slow death.

Also, if Valve or an equivalent company release a gaming OS for PCs at an affordable price, that will speed up the death of consoles, as it will hopefully use PC hardware better. I wouldn't mind seeing the death of DirectX either, replaced with something more efficient at using hardware. I remember reading this article on bit-tech. Although then the hardware requirements, when games are designed for it, might change.

I seem to recall things called killer apps. Or even exclusives. There no longer are any. The last true exclusive PC owners got was Crysis. Ever since then, all of the games released on PC have been full of reminders of just what they are: console code botched onto a PC so they can claim another few million in sales. That's all we are - sloppy seconds.

There are still some RTS games, like the Total War series; just not everyone likes them, I suppose, plus they are pretty buggy games.
 
As a dev I can tell you this...

Today people worldwide buy PCs with the intention of them being 'pure gaming rigs'... it was not like that in the late 90s/early 00s. The N64, PS1 and Xbox were the main gaming contenders in those days, and those were the days when the BIG game dev houses first created their game engines (DX9-based and GL2-based).

Even though games were widely played on the PC, they were not in the same calibre as on the consoles.

Here is the reason:

Unless your parents were very well off and had a PC for themselves and ONE for you, the chances that the targeted demographic played games on the PC for hours on end after school were very slim.

Those games were commonly played on a console. It was only the RTS-type and MMO-type 'sim' games that really held the PC market.

Making games is a business, not a pleasure, and as there are fewer gaming rigs than consoles, games will be made for the biggest market.

YOU must not be blinkered by the fact that most of the members of this forum have gaming rigs and so may agree with you. The fact still remains that 'we' are very few in terms of the big bad world of gamers.
 
I think the majority of people just want to switch a console on and do some gaming.

I agree that on this forum we are biased, as we all love tech, do research and enjoy tech for tech's sake. Most people want a plug-and-play experience - they don't want to know how it works or why it works. They are not purists; they just want to play some games, and for them a console will always be the easiest option.

They can buy a console knowing that it will play all the latest games. Going out and spending £300 on a GPU every time they want to play the latest blockbuster (console port)? That's just crazy...
 
Z0mB13 has raised some interesting points... but I feel that the main points raised don't exactly provide a good enough justification for games to see more "optimization" or "utility" from high-end GPUs. First up, as some members have already pointed out, games and the hardware that runs them enjoy a symbiotic but still separate place in the technology industry. Indeed, I'd argue that games constitute a product segment within the entertainment industry rather than the technology industry per se. While we do see hardware manufacturers occasionally develop first-party applications and games - e.g. Microsoft, Sony etc. - these are still relatively rare arrangements, and with very good reason: most of them form through acquisitions rather than these parties setting out from the very beginning to develop both the hardware and the software that will run on it.

In simple business terms, the issue here is really all about risk. While it might seem like a wonderful idea, I would argue that it entails a significant amount of risk and also has the potential to create an extremely disadvantageous situation for companies that attempt this unless they have extremely deep pockets or are cornerstone companies within their industry. To consider the positions of AMD and nVidia and why their top-end GPUs seem to be so pointless at times, we have to understand where the majority of their business is really coming from and what their product and support strategy really entails.

It's no surprise that PCs represent a much smaller portion of the hardcore gaming market than consoles and, in recent times, mobile devices. But in order to critique this state of affairs properly - one in which high-end GPUs can seem like white elephants - we have to understand how most developers strategize their developments. The larger game developers and publishers understood early on that, with the rising complexity of games and the subsequent demand for more content, the costs of developing games have increased exponentially in recent years. I too have been a gamer for a long time, but simply reminiscing about the "good ol' days" and how current titles lack the "innovation" and "sense of novelty" that made those older "standout" titles so fondly remembered would mean ignoring the changing landscape of the industry and how development cycles and processes relate to the end product and how it will be received.

For sure, some elements of a game's development - such as innovative gameplay - may require ingenuity more than time, but other elements such as the modelling, game engine, textures, sound and video are tremendous time sinks. In fact, with the increased capability of new hardware, developers spend ever larger shares of their budgets (time and money) on the latter aspects rather than on creating the "innovative" gameplay that you might desire most of all. With ever more complex game engines come ever more complex requirements, development cycles and validation. It is this potent mix of complex feature sets and assets that has made games, by and large, the expensive developments we know today. This brings us to the central point that most hardware aficionados are at pains to rant about: the expensive hardware often does not seem to warrant the expense because the games are not taking advantage of it. To this rant, I would like to propose two ways of looking at the issue.

1) The combination of factors, from time to cost to return, basically means that console development is the only justifiable focus for developing the so-called "serious" or hardcore games. In programming and development terms, you always work towards optimizing for the lowest common denominator rather than the other way round - after all, in terms of audience reception, it doesn't really make sense for marketing to promote a game as a PC game that will be ported to consoles later, when the consoles represent a much larger share of the pie and are much easier to develop for anyway. A PC-centric development strategy might please the gaming-PC-owning minority, but that in and of itself will not pay the bills half as well as a top-selling console title in most cases. Beyond this, a console-centric development strategy is generally a much better way of seeing a title hit the shelves (and thus the bills and risks of development covered) sooner. So, I feel the point here is that one of the most obvious answers as to why high-end GPUs may not see their full potential realized all the time boils down to how the bulk of games are developed and why it makes sense that they are developed this way.

It's a simple matter of economics...

... and economic sense dictates that you play to the most likely source of you getting a positive return on capital.

But you could look at the other sides of things...

2) The hardware and the games that the hardware runs are not one and the same, nor should they necessarily be. For a GPU developer to develop games ignores two fundamental tenets of business: 1) always play to your strengths; 2) avoid placing yourself in a situation where the risks of participation, or of creating barriers to entry, far outweigh the potential benefits.

The first point is that developing entertainment software is not a core function or strength of either AMD or nVidia. For them to go down that path would imperil them in a multitude of ways - firstly because of the financial risks involved. As consumers, it is easy to underestimate the manpower, time and money it takes to make a modern-day blockbuster. While devs may not always release exact figures, it's safe to say that making something on the scale of, say, Battlefield 3 is a huge undertaking, and all those programmers, modellers, artists and marketing people need to be paid. That kind of money generally isn't going to be recouped from the high-end-GPU-toting niche PC crowd...

And getting the hardware manufacturers to make their own games might even be akin to economic and political suicide, because somewhere along the line a conflict of interest is going to arise. Given that they are doing just fine developing hardware that other companies (software developers) work with, why would they benefit from creating a situation where their product portfolio threatens to become a closed box? As it is, it makes much more sense to provide the platform and the hardware to run it on than to try to make games themselves.

You could argue that innovative games don't have to be blockbusters, but then we reach the greatest contradiction and also, in my opinion, the answer to it all... the hardware itself is not the problem.

What Z0mB13 seems to be suggesting sounds more like a desire for more technology demos than for games per se, doesn't it? After all, if what you really want to see is the hardware getting maxed out, then a tech demo is really where it's at.

Preview...

Truthfully, if you buy any high-end GPU, you aren't really looking for bang-for-buck, right? Realistically, anyone who is has missed the point; there are much better value propositions out there, and trying to pin the responsibility of purchase justification on software developers is a little unfair, isn't it? Shifting the buck to the hardware developers is also stretching it a little. At the end of it all, if AMD or nVidia tells you that with this GPU you will have the "XXXXXXX most awesome gaming experience ever", it's time to realize that the marketing department is doing its work rather than delivering the gospel truth about the hardware. You buy a top-end GPU because you would like to experience what the cutting edge of technology can deliver, but that doesn't mean anyone is obliged to deliver the other component of that experience. I wouldn't go so far as to call it "willy waving" universally, because some of the most extreme purchasers of such hardware that I know don't even publicize the fact that they have bought it. However, what I can at least opine is that when purchasing a piece of top-end kit, it's important to appreciate the manufacturer's perspective: to them, the top-end adopters are really the first ones who help fund the R&D of tomorrow's mainstream PC, console and mobile parts, and who are willing to pay the premium that such a preview should entail.

As one from the 8086 days myself, I find that when our generation of gamers/hardware enthusiasts rants, we sometimes get our gamer and hardware-enthusiast sides mixed up. I hope this post goes a little way towards calming things down.

-neOnEPIC
 
Personally, if this is actually two full 7970s and not 7950s, then I will buy one, as I would only need one to run a 30" monitor (which I currently own); I'd water cool it for silence and be done with it. I'd do the same if Nvidia had a 590 successor out first. Also, I'm not sure if it was said, but the 7990 is supposedly delayed till Computex (June).
 