Some HD7970 reviews

Wow, that's great, I'm very happy - great technology. Blimey, my mind is skippin' to the point of: who actually created the 28nm microchip, eh, and let the marketing giant play with it? I think "LOVELY"

Well done AMD, and I bet nVidia is spending their money on higher-grade 28nm components, making it another round where quality over quantity goes to those who wait. Good score for AMD again lol, nice
 
Anyone know what scaling is like with 2 cards? Would be epic if it were like the 6870s.

It's most likely the bomb - that is, until Nvidia get their cards out. No Nvidia fanboy, but I'm getting Nvidia next time I upgrade :)
 
Consumer attraction: GTX 580 prices coming down, I hope - or is it too late for that, with what I think is about two weeks until the 7XXXs are available?!
 
Hi,

A fairly strong showing from AMD here; this is good, as it will hopefully keep NV honest.

Doubtless this would make a great card for anyone upgrading from an older model, though I'd imagine less interest from those already running fairly modern cards - especially if they have a pair of them.

A few things surprised me though, based on the reviews I've read:

i) Quite an aggressive base clock - looks like AMD are pushing these fairly hard from the get-go.

ii) Surprisingly power hungry - I expected less from 28nm. Several benches I've seen show these cards pulling nearly 400w while playing Metro 2033. Is it just me, or is that really rather high?

iii) Stock coolers seem to be "adequate" but no more than that - aftermarket will fix this, reducing temps and noise.

iv) An unexpectedly high price point - in USD at least - maybe a testament to the increased manufacturing costs rather than AMD taking advantage of being on top performance-wise. It seems to go against the grain; usually AMD would price very aggressively. Let's see how prices from both sides shift in the new year... possible bargains to be had.

Overall I'm quite pleased with this release. I do, however, look forward to "proper" reviews of retail versions using proper drivers. I still remember certain reviews panning older cards (from both teams) where, for some totally unknown reason, they'd used drivers far, far older than those available at release.

I really can't get over how much power these cards use, IF the reporting is correct. I mean, I have a pair of overclocked 570s, but when I had just one I did some power tests using a wall-socket meter. From what I recall the card did not seem to pull its rated ~220w at standard clocks, and only began to creep up when overclocked to 900MHz+, seeing around 250w max.

I do wonder - and can people please comment if this is rubbish or not - whether, when overclocking during reviews, the PSU might sometimes be pushed out of its efficiency zone, making the cards appear to draw more than they would if the supply were in its sweet spot. Personally, using said wall-socket power meter, I've seen the same system pull LESS from the wall after upgrading to a better PSU. This often goes hand-in-hand with a new PSU letting you up your overclock, etc.
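To put rough numbers on what I mean (these figures are made up purely for illustration, not measurements from any review): the wall meter reads the DC load divided by the PSU's efficiency, so the same load inside the case can look quite different at the socket.

# Illustrative sketch only - hypothetical loads and efficiencies, not review data.
def wall_draw(dc_load_w, efficiency):
    """AC power read at the wall for a given DC load and PSU efficiency (0-1)."""
    return dc_load_w / efficiency

# Hypothetical system: a ~250w GPU plus ~150w for the rest of the machine.
system_dc_load = 250 + 150  # watts actually delivered by the PSU

for eff in (0.75, 0.82, 0.90):  # e.g. an old unit vs roughly Bronze vs roughly Gold
    print(f"{eff:.0%} efficient PSU -> ~{wall_draw(system_dc_load, eff):.0f}w at the wall")

# 75% -> ~533w, 82% -> ~488w, 90% -> ~444w: same components, same clocks, yet the
# meter reading swings by ~90w from PSU efficiency alone - which is why a better PSU
# can show LESS at the wall, and why socket readings overstate what the card itself uses.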

Looking forward to Tom's "proper" review in the new year.

On a final note, any of the recent generation of high-end GPUs are staggeringly powerful when it comes to gaming. This, and doubtless NV's new stuff, just pushes the silly factor that bit higher - which is great :)
I think there are few of us who really need such hardware to get that perfect 60fps in our favourite games, but there's nothing wrong with good, old-fashioned LUST for tech lol.

Cheers,

Scoob.
 
i) Quite an aggressive base clock - looks like AMD are pushing these fairly hard from the get-go.

I think that would just be a case of making them faster than the competition and pushing them as hard as you need to in order to take the lead. No doubt they've held plenty back to respond with should Nvidia get their skates on!

ii) Surprisingly power hungry - I expected less from 28nm. Several benches I've seen show these cards pulling nearly 400w while playing Metro 2033. Is it just me, or is that really rather high?

It is. I think, tbh, that performance is always going to come at a price. I think the main part of the 28nm 'improvement' is simply allowing the cores to run cooler and thus be pushed harder. But everything comes at a cost. I'm sure that if it were possible to push a 6970 core to the same level of performance it would probably use more power, but alas, these gains are not going to be spent on efficiency, as performance is always what sells a card. They could quite easily downclock the card to perform at around GTX 570 level (and use less power), but that's not why people want a new GPU. They want it faster.

iii) Stock coolers seem to be "adequate" but no more than that - aftermarket will fix this, reducing temps and noise.

Don't quote me on this, Scoob, but I'm certain I read somewhere that AMD basically don't want to have to make full cards, so are leaving that part in the hands of the board partners (MSI, Asus etc). That's how it should be, tbh. Let AMD come up with the technology and then give companies free rein to design their cooling and so on. This works tenfold really, because it means the "sticker brigade" (smaller, lower-end companies who were just buying boards and slapping their logos on) will now have to compete. Asus' DirectCU and Matrix cards are sterling examples, as are MSI's Twin Frozr cards. All of these push things on and make sure that every last drop of performance is squeezed from the cores. Otherwise we would simply get stock cards with a 40MHz overclock and pay an extra £80 for it, like we used to *glares at XFX*. It's high time the board partners started earning their keep, and not just running a production line to put on stickers, IMO.

iv) An unexpectedly high price point - in USD at least - maybe a testament to the increased manufacturing costs rather than AMD taking advantage of being on top performance-wise. It seems to go against the grain; usually AMD would price very aggressively. Let's see how prices from both sides shift in the new year... possible bargains to be had.

Bulldozer fail / taking the piss. That's all it is. AMD have had serious troubles over the past few months. First they laid off 10% of their staff, and then they had a complete shuffle in their marketing department, firing the guy responsible for selling Bulldozer. Fact is, even the most ruthless, nasty salesman couldn't have sold Bulldozer, because performance stats don't lie. It's kind of hard convincing a market to buy an inferior product, no matter how much crap you talk. AMD know they can charge for a product if performance matches the hype. It sort of matches the hype here, and the bottom line is that it is, even if only by 1%, faster than a 580. They are no doubt going to take that fact and run with it, making people pay for the pleasure.

I really can't get over how much power these cards use, IF the reporting is correct. I mean, I have a pair of overclocked 570s, but when I had just one I did some power tests using a wall-socket meter. From what I recall the card did not seem to pull its rated ~220w at standard clocks, and only began to creep up when overclocked to 900MHz+, seeing around 250w max.

GPUs have used that much power for a while now. The problem is: who in the market (the end users) wants cards that are slower yet use hardly any power? People want performance. So as soon as they come up with a way to make a card perform better, they are going to use it - i.e. overclock that core and wring every last ounce of performance out of the card to make it a market leader. Power does not come into that equation, because no one will care. All they will care about is saying "ner ner na ner ner, I have the fastest GPU in the world, ner ner na ner ner".

I do wonder - and can people please comment if this is rubbish or not - whether, when overclocking during reviews, the PSU might sometimes be pushed out of its efficiency zone, making the cards appear to draw more than they would if the supply were in its sweet spot. Personally, using said wall-socket power meter, I've seen the same system pull LESS from the wall after upgrading to a better PSU. This often goes hand-in-hand with a new PSU letting you up your overclock, etc.

I doubt it. Modern PSUs have to meet tough criteria. There are hardly any power guzzlers around now, if any at all - as with Bulldozer, if something sucks, people are going to find out. When a PSU I owned was reviewed and a certain load put on it, the total consumption was 602w. They said that two years ago, on an old BFG 1000w PSU, the same hardware used 756w. It's all about 80 Plus now - if a PSU doesn't carry the logo, no one will buy it.
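For what it's worth, those two readings are roughly consistent with efficiency alone (the 87% figure and the assumption of an identical load below are illustrative guesses, not measured values):

# Rough back-of-the-envelope check - assumed numbers for illustration only.
new_wall, old_wall = 602.0, 756.0   # watts at the socket, same hardware and load
assumed_new_eff = 0.87              # plausible guess for a modern 80 Plus unit

dc_load = new_wall * assumed_new_eff     # ~524w actually delivered to the components
implied_old_eff = dc_load / old_wall     # efficiency the old BFG unit would need to be

print(f"Implied DC load: ~{dc_load:.0f}w")
print(f"Implied efficiency of the old PSU: ~{implied_old_eff:.0%}")
# -> roughly 69%, i.e. the whole 154w gap at the wall can be explained by efficiency alone.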

On a final note, any of the recent generation of high-end GPUs are staggeringly powerful when it comes to gaming. This, and doubtless NV's new stuff, just pushes the silly factor that bit higher - which is great :)
I think there are few of us who really need such hardware to get that perfect 60fps in our favourite games, but there's nothing wrong with good, old-fashioned LUST for tech lol.

Cheers,

Scoob.

Oddly, all of the 7 series' selling points may be the very reason people don't want the cards. Mind you, it seems AMD have bolted on everything including the kitchen sink, trying to appeal to the masses.

There will be an initial flurry of sales no doubt, but I still suspect that Fermi has some gas left in the tank, tbh. I really need to study the specs of the high-end Tesla cards by Nvidia to see just how much more is on them compared to, say, a GTX 580. I would strongly imagine they haven't quite shot their bolt just yet, and could very easily respond with a card equally as fast as the 7970, if not faster.

If that happens we will see huge price drops on older Fermi cards (570 and 580), and AMD had better hope and pray that they have something left to unleash, or they will be in big trouble.

Can you imagine the headlines now? "Nvidia deliver a knockout blow with two-year-old tech. What does that say about Tahiti?"
 
Witcher 2 :)

Oh, that game that was so buggy when it released that it couldn't even run at low settings on a GTX 470?

How silly of me :D


Sorry man, I'm just very disillusioned and dejected about the gaming market in general at the moment. You would think that, with all of this incredible hardware we have, games would have moved on to a different level. Instead we are about to get Dirt 4, whose main addition is some destruction derby mode that Psygnosis delivered in 1997.

Great.
 