Laptop GPU

annetonne

New member
Aha.

Cool beans.

Back on topic: the best card you can mash into a laptop at the moment is probably the GeForce Go 7900 GTX, and while it's a good card, it's not quite what's needed to have a really good time in Cyrodiil. My dilemma is that even when the X1900XTX (or higher) is made mobile, big cards like this are unlikely ever to be available in anything other than 17" or maybe 15" laptops.

Say I was interested in getting down to around the 12" or 14" range (student life and all – I need to be as mobile as Freakazoid). I would then run an extra (bigger) screen at home to fully enjoy the wonders of the gaming world.

How might the interesting development described in your write-up affect my problem? Is there any chance the unified architecture will fit better into laptops, or should I just quit dreaming and accept that if I want to play Oblivion properly in the near future (hopefully before summer) it isn't going to happen on a 12" or 14" screen (or on a laptop at all, for that matter)?

Great review by the bye.

Your insatiable inquirer,
 
Well, by all accounts the new cards run hotter and require more power than the current mobile chips.

However, with the industry moving onto smaller fabrication processes, I would assume both nVidia and ATI will be doing their best to get their next gen onto people's laps for high-end gaming.

Once again it's a case of "wait and see", but I am confident that ATI and nVidia won't rest on their laurels and will be pushing the very lucrative mobile computing market pretty soon :)
 
Sounds promising.

If your foresight tells you that this is a case of ‘wait and see’ I guess that’s all there is to do.

Thanks again for all the help and hope you’re feeling better.
 
There will always be something better coming out, otherwise the companies would go out of business. My guess is that we will see the desktop and mobile chips becoming more and more divergent. If you look at the 9xxx and 5xxx series ATI and nVidia cards, the desktop parts were virtually identical to the mobile parts, but in the latest generation ATI has been unable to release a mobile X1900.
 
Yeah, but...

Nagaru said:
My guess is that we will see the Desktop and Mobile chips becoming more and more divergent.

Hey – glad someone else felt like sharing their opinion. I've been thinking that too. But isn't it impossible to predict anything like this with an entirely new way of building GPUs on its way?

Read this great write-up if you don't have a clue what we're talking about.

Sincerely
 
I am not sure what you mean by

But isn't it impossible to predict anything like this with an entirely new way of building GPUs on its way?

But the way I see it, both manufacturers are talking about heat levels exceeding the total power consumption of most "laptops" these days, and I see no reason for them to lie. Therefore I see the only option, at least in the vast majority of cases, being either downclocking or smaller cores. Downclocking is probably the better option because the voltage can be dropped as well. I think the unified shader architecture that ATI is going to be using is a good bet due to its extra flexibility.
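The reason dropping the voltage along with the clock is the better option can be sketched with the standard dynamic-power approximation for CMOS logic, P ≈ C·V²·f: power falls linearly with clock but with the square of voltage. The numbers below are purely illustrative, not real GPU specs.

```python
# Rough sketch of why downclocking plus a voltage drop beats downclocking alone.
# Uses the standard dynamic-power approximation P = C * V^2 * f.
# All values are made-up illustrative figures, not real GPU specifications.

def dynamic_power(capacitance, voltage, freq_hz):
    """Dynamic CMOS power approximation: P = C * V^2 * f (watts)."""
    return capacitance * voltage ** 2 * freq_hz

C = 1e-9  # effective switched capacitance in farads (illustrative)

baseline = dynamic_power(C, 1.3, 650e6)        # desktop-style clock and voltage
clock_only = dynamic_power(C, 1.3, 450e6)      # downclock alone
clock_and_volt = dynamic_power(C, 1.1, 450e6)  # downclock plus lower voltage

print(f"baseline:            {baseline:.2f} W")
print(f"downclock only:      {clock_only / baseline:.0%} of baseline")
print(f"downclock + voltage: {clock_and_volt / baseline:.0%} of baseline")
```

With these made-up numbers, downclocking alone cuts power to roughly 69% of baseline, while also dropping the voltage takes it to roughly 50% – the V² term does most of the work.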
 
Kempez said:
And it's not an entirely new way of building the GPU: rather it's an architecture change

I understand now. From what I have read, DirectX 10 can only bring more complication, not less. Wait, I forgot, let's all bow to the all-holy Microsoft again. :worship:

:noobsign:
 
Nagaru said:
I understand now. From what I have read, DirectX 10 can only bring more complication, not less. Wait, I forgot, let's all bow to the all-holy Microsoft again. :worship:

:noobsign:

In all honesty, DX10 sounds like rather a good idea. Unified shaders make sense, and MS seems to be going the right way.

What I mean is that, yes, it's a redesign of the layout of the shader pipes and the actual chip architecture, but the chips are still being made on silicon using an 80nm fab process.

Keep your :noobsign: to yourself ;)
 