Nvidia GTX 595?

Dewinte

New member
Do you think they'll really do it?


Would it be driven mainly through fear of competition from a dual top ATI 6xxx card, and wanting to be on top? I don't see a single 3D vision surround card being a motivation for Nvidia to do it, more just a bonus for the end user...

Story link
 
Were the cores in the 295 as hot as the 580?

I wouldn't put anything past Nvidia tbh. Heat and power consumption just don't seem to matter to them
 
Were the cores in the 295 as hot as the 580?

I wouldn't put anything past Nvidia tbh. Heat and power consumption just don't seem to matter to them

I have a second PC here with a Palit 295 in it. I ran the Heaven benchmark on it and it peaked at 102 Celsius (215.6 Fahrenheit), not overclocked at all, and the air temp was cold.

It ran really hot, so I wouldn't be surprised if Nvidia do make it.
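If anyone wants to log their own peaks during a benchmark run, here's a minimal sketch. It assumes you pipe per-second readings from `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader,nounits -l 1` into a file; the sample readings below are made up for illustration:

```python
import csv
import io

def peak_temp_c(readings_csv: str) -> float:
    """Return the highest GPU temperature (Celsius) from logged readings,
    one plain number per line (nvidia-smi csv,noheader,nounits style)."""
    temps = [float(row[0]) for row in csv.reader(io.StringIO(readings_csv)) if row]
    return max(temps)

def c_to_f(celsius: float) -> float:
    """Convert Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return celsius * 9.0 / 5.0 + 32.0

# Made-up per-second readings collected during a Heaven run:
sample = "87\n95\n102\n99\n"
peak = peak_temp_c(sample)
print(peak, c_to_f(peak))  # 102.0 215.6
```

That also double-checks the conversion above: 102 C really is 215.6 F.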
 
Isn't it just risking a PR problem if they put out an ultra-hot card? After the GTX 480 as well, it really wouldn't be worth it unless by some magic it ran cool enough and was the monster people would expect it to be. After how good the 580 launch has been for them, and with people loving the 460s, why risk screwing it up haha.
 
You gotta love Nvidia for trying. But I find it funny when they make the claim that a "real gamer/enthusiast" wants nothing but performance and is willing to accept high temps/power draw in order to have the best card.
 
Yeah, cos an enthusiast will take ANY component and clock the crap out of it to make it hot and high-performance while using loads of watts.

... maybe they decided to cut out the middle man, seeing as the enthusiast is a dying breed
 
Yeah, cos an enthusiast will take ANY component and clock the crap out of it to make it hot and high-performance while using loads of watts.

... maybe they decided to cut out the middle man, seeing as the enthusiast is a dying breed

The enthusiast is a dying breed only because there's no real need for it right now, with so many games being crappy console ports. Why can't consoles have computer ports like the old days? It sucks. The only game I have that seems to max out my video card without turning the AA up is Dawn of War 2, probably because it pre-renders so much.

Look at all the games that are used to benchmark video cards.

Alien vs Predator

Crysis Warhead

FarCry 2

HAWX 2

Metro 2033

(Pulled from the latest Gigabyte GTX480 SOC Review on OC3D)

All these games run fine at 8x AA and 1920x1200 besides Metro 2033, and even that doesn't drop below 30 FPS, so it's still playable at max settings without maxing out the hardware. Most enthusiasts are running 2x GTX 460s, which benchmark faster than a 480. It seems there is nothing out there besides folding that will max out our cards, so I guess we can game and fold at the same time now.


Consoles have to run a game on...

Xbox 360 specs.

CPU: 3.2 GHz PowerPC Tri-Core Xenon

Memory: 512 MB of GDDR3 RAM clocked at 700 MHz

Graphics: 500 MHz ATI Xenos

Sorry about the rant, just grumpy about how crappy the COD console ports have become. I guess that comes from consoles that are as old as the hills (2005).
 
You gotta love Nvidia for trying. But I find it funny when they make the claim that a "real gamer/enthusiast" wants nothing but performance and is willing to accept high temps/power draw in order to have the best card.

I completely agree with that tbh. I wish both GPU manufacturers would make single-core cards and then a dual-GPU version of every GPU they offer, for example a dual-5670 card, a dual-5770 card, etc. It would make it cheaper and easier for those who wanted to buy two, take up one slot, and create less heat overall.

I still for the life of me don't get why they don't do that. The more the merrier, right? And at least then game coders would be sort of forced into adding more multi-GPU support into their games.

or the people likely wanting to buy these cards

Absolutely.
 
The enthusiast is a dying breed only because there is no real need to do so right now due to so many games being crappy console ports...

Absolute cop-out! Everything needs overclocking, and should be! Such words would never have been uttered in the past. Those melting their PCs never really needed the extra hertz to play something silly like Quake/HL/BF.


If there was an obstacle created by the manufacturer, the enthusiast would hack around it. Rip off that heat sink and make your own ghetto cooling.

It's certainly a long gone era.

lol
 
Absolute cop-out! Everything needs overclocking, and should be! Such words would never have been uttered in the past. Those melting their PCs never really needed the extra hertz to play something silly like Quake/HL/BF.


If there was an obstacle created by the manufacturer, the enthusiast would hack around it. Rip off that heat sink and make your own ghetto cooling.

It's certainly a long gone era.

lol

Maybe it's because most people who now call themselves enthusiasts want the clocks for a reason. If that reason isn't there, they stop; for some the reason is purely benching, and if they don't bench they don't chase the highest of everything, as it's just a number. Only really round numbers are praised widely on their own: say you hit 4GHz vs 4.2GHz and most don't care, but say you hit 5GHz and all of a sudden it's OMG, though very few even try. As I said, yggdrasil has a point in that most people now only want to OC or mod for a purpose and not for the sake of it, which also fits your view that the enthusiast is dying. You're both on similar lines.

__________

@AlienALX Providing a dual card is a whole other manufacturing process, especially if you want it stable, and especially when the one making it is someone you hand a design spec to, such as Foxconn. You can't assume it'd be as simple as putting two cores on one board, because you don't know how the firm's production process is organised, especially when the resellers want to add their own modifications to the design. Widening the product range as you suggest brings large cost problems. Those are only overcome when the profit you stand to make justifies it and/or the branding/marketing hype does, e.g. "We made the GTX 595, so we are the fastest"; consumer X thinks "I can't afford the GTX 595, but I'll go for what I can afford from Nvidia, as they are good enough to have made the fastest." That example works with people who don't read reviews or research their purchases.

My opinion is that if they were to do what you suggest, it'd only be on the best midrange cards. So Nvidia would dual the 580, as it's their top card, and dual the 460, as it's the best-selling midrange card; the same for ATI/AMD, who dual the 5870 for the 5970 and would then dual the 5770. They couldn't justify going dual on any of the other cards, because those between the 460 and 580 won't sell enough and aren't worth bragging rights in dual config. Making a dual card from more than two models in the range is impractical cost-wise, and even making those two may be impractical, as the dual 460, for example, may end up far too cost-inefficient.
 
Yeah lots of fair points.

I'm being pretty tongue-in-cheek, even though I appreciate that a "4GHz OC" is taken as a given for many a CPU these days; it's achieved so routinely that it's almost become a settling point, and fair comment to that.

It's a long way from the old days of purchasing a mid-range Athlon and using craft to OC it to the ceiling of the range. Much of the craft is now kept basic.

It's admirable to purchase a midrange graphics card and make it almost reach the performance of its bigger brother. But I'd imagine the majority would just want to purchase that bigger brother and get upset that it costs too much. That ain't enthusiast to me.

We do live in a more "I want it now" society mind.
 
That's a really good point actually: most people now, instead of getting their mid-range up to the performance of the top, will just buy the top.

Granted, the money that goes into computing has risen, and the money most spend on it has risen; most "enthusiasts" now have the money not to need a lesser-powered unit, and consider those with lower-powered ones to just be OCing to gain performance they couldn't afford. Ah well, eh! I'm one of the people who don't know a whole lot and buy what is already powerful haha... but then I ask it for more because I want to. There are also limits to what people will do, because they feel that by buying the i7 over the i5, for example, they can't afford to break it and the warranty, so they don't push it as far as they'd push the i5.

Maybe it's not that enthusiasts are dying; it's that enthusiasts are being diluted and blurred by the numbers of people doing similar things! It's really not uncommon to find people willing to pay for serious hardware, and they can almost be included in the "enthusiast" category by some manufacturers. So yeah, maybe enthusiasts have increased in absolute numbers but decreased as a proportion of those within the world of higher-end computing!
 