APU & discrete card comparison

Filby83

New member
Not so much a "vs" thread, but given the new consoles both have AMD APUs, there are things I really want to know.

So, the 8-core Jaguar APU: that's two 4-core modules on the same die, right? Not a single 8-core?

This is where I get mixed up between CPU cores and GPUs: how many cores does, say, the AMD 7970 have?

How does a company go about getting, say, a CPU plus a 7870 on one die and come up with an APU at 28 nm?

How do they keep it cool without it sounding like a jet engine? I know they could run the cores at 2 GHz but cut them down to 1.6 GHz to help with this, but still.

How does the difference between GDDR5 and DDR3 come into play? And if GDDR5 is so good, why don't home PCs just run that instead of standard RAM?


Looking at cores, speeds, RAM and everything, the new consoles are going to be better than the current ones for sure, but really not all that good compared to today's PCs. So what's all the talk of "just wait until they get to grips with them", as if three years down the line they're going to be knocking out games like they're running on Titans or something?
 
That's a lot of questions there.

The APUs in the next-gen consoles are an 8-core CPU and a GPU on a single die.

28 nm is the manufacturing process; the die size can be as large as needed. For example, Intel's Sandy Bridge and Sandy Bridge-E CPUs use the same 32 nm process but have different die sizes: the Extreme processors have a larger die to fit the additional cores inside the processor.
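To put rough numbers on that: the process sets how densely transistors pack, and the die size sets the total transistor budget. A quick sketch using Intel's published figures for the two chips mentioned (treat the exact densities as approximate):

```python
# Same 32 nm process, different die sizes: density stays roughly
# constant, so a bigger die simply buys more transistors.
chips = {
    # name: (transistors in millions, die area in mm^2) -- published figures
    "Sandy Bridge 4C":   (995,  216),
    "Sandy Bridge-E 6C": (2270, 435),
}

for name, (mtrans, area) in chips.items():
    density = mtrans / area  # millions of transistors per mm^2
    print(f"{name}: {density:.1f}M transistors/mm^2")
```

Both land at roughly 5 million transistors per mm², which is the point: the 32 nm process fixes the density, and the bigger Sandy Bridge-E die is simply how the extra cores get paid for.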

As for cooling, I believe it's simpler than on previous consoles. First, there's a single APU to cool rather than a separate CPU and GPU, which makes the heatsink much easier to design. Also, with more modern CPU and GPU tech there will be much less heat output compared to the older hardware.

As for GDDR5 vs DDR3: GDDR5 is faster but has latency trade-offs (which can be worked around). GDDR5 also runs hotter than DDR3 and uses more power.
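The latency point is worth unpacking: CAS latency is counted in clock cycles, so a faster clock with more cycles per access can come out to a similar absolute delay. A sketch with illustrative timings (the GDDR5 CAS figure is an assumption, not a console spec):

```python
def latency_ns(cas_cycles, clock_mhz):
    """Absolute CAS latency in nanoseconds from cycles and memory clock."""
    return cas_cycles / (clock_mhz / 1000.0)

# Illustrative timings: GDDR5 runs a much faster clock but needs more
# cycles per access, so the absolute latency lands in the same ballpark.
ddr3  = latency_ns(11, 800)    # DDR3-1600 CL11 @ 800 MHz memory clock
gddr5 = latency_ns(15, 1375)   # GDDR5-5500, assumed CL15 @ 1375 MHz

print(f"DDR3:  {ddr3:.2f} ns")
print(f"GDDR5: {gddr5:.2f} ns")
```

So the "latency issue" is mostly about cycle counts, not wall-clock time, which is why it's manageable in practice.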
 

Lots of questions, lots of answers :P

The Jaguar APU is a true 8-core, I believe, built from two four-core modules, with both an integer and a floating-point scheduler for each individual core. Without going into too much detail, shared floating-point hardware is where the argument about whether AMD's FX CPUs are true 8-cores comes from.

The 7970 has 2048 shader ALUs, so you could in theory call it a 2048-core GPU. If you see a quad-core APU for sale anywhere, though, that refers to the number of CPU cores the APU has.
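The reason those 2048 "cores" aren't comparable to CPU cores is the execution model: shader ALUs run the same instruction across many data elements in lockstep rather than each running its own program. A toy pure-Python sketch of that model (the `lanes` grouping stands in for a wavefront of shader ALUs; nothing here is real GPU code):

```python
# Toy illustration of the GPU execution model: one instruction applied
# to many data elements in lockstep, in fixed-size groups of "lanes".
def gpu_style_map(op, data, lanes=8):
    out = []
    for i in range(0, len(data), lanes):
        wavefront = data[i:i + lanes]         # one group scheduled together
        out.extend(op(x) for x in wavefront)  # same op on every lane
    return out

print(gpu_style_map(lambda x: x * x, list(range(10))))
# -> [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

A CPU core can branch and run anything independently; a shader ALU is cheap precisely because it only does this kind of lockstep work, which is why you can fit 2048 of them on one die.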

The 28 nm figure is the size of the transistors used to make the APU. Realistically there isn't a hard limit on the number of transistors they can use, so the chip itself can be as big as they want, provided they can power it and keep it cool.

They can do that in many ways; in this case they will have lowered the clock speed of both the GPU and CPU, which in turn lets them greatly lower the voltage used to power the chip. With an efficient air cooler they should be able to keep the chip nice and cool without needing a massive fan.
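The clock-plus-voltage trick works because dynamic power scales roughly with C·V²·f: the frequency drop is linear, but the voltage drop counts twice. A sketch with assumed voltages (the 1.10 V and 0.95 V figures are illustrative, not console specs):

```python
def dynamic_power_ratio(f_new, f_old, v_new, v_old):
    """Relative dynamic power from P ~ C * V^2 * f (capacitance cancels)."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Assumed operating points: 2.0 GHz @ 1.10 V scaled down to 1.6 GHz @ 0.95 V.
ratio = dynamic_power_ratio(1.6, 2.0, 0.95, 1.10)
print(f"~{ratio:.0%} of the original dynamic power")
```

Under those assumptions a 20% clock cut buys roughly a 40% power cut, which is why a modest downclock makes quiet cooling so much easier.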

GDDR5 is much more expensive, for a start. That's why you don't see 16GB graphics cards on the market; not that there would be much need for them. There are other technical reasons, I'm sure, but I don't know enough about them to give you an informed answer.

The thing with consoles is that you don't have to make a game work on lots of different graphics cards, only one. Developers are therefore able to get a lot more performance out of the hardware compared to PC GPUs, since the code they write can be far more tightly optimised.
 
True 8-core CPU with a GPU somewhere between a 7850 and a 7870 in performance. We know very little about how the architecture on the die is laid out. "Jaguar" is one of AMD's low-power designs, used in both next-gen consoles, and with AMD's help it's very flexible. I'm fairly certain it originates from a newer architecture AMD has been working on.

28 nm is the lithography, or in other words the size of the transistors. A smaller process means better power efficiency and more raw power in the same area, but packing in more transistors needs a bigger die (the physical chip), which in turn reduces profitability and makes yields much harder. Yield is the percentage of manufactured chips that actually work versus those that are defective. This is reportedly a problem for the Xbox One at the moment, and part of why its launch has been pulled back in some countries (roughly half of the PS4's 32 launch countries, as I recall): too many chips are being made that don't actually work.
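The die-size/yield link can be made concrete with the standard first-order Poisson yield model, Y = e^(−D·A): defects land randomly on the wafer, so the bigger the die, the more likely at least one defect kills it. The defect density here is an assumed figure for illustration, not a real fab number:

```python
import math

def poisson_yield(defects_per_cm2, die_area_mm2):
    """First-order Poisson yield model: Y = exp(-D * A)."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defects_per_cm2 * area_cm2)

# Assumed defect density of 0.5 defects/cm^2 on a maturing process.
for area in (100, 200, 350):
    print(f"{area} mm^2 die: ~{poisson_yield(0.5, area):.0%} yield")
```

Under that assumption, going from a 100 mm² die to a 350 mm² die drops yield from around 60% to under 20%, which is why big APU dies are so expensive to make.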

Rumour is both next-gen consoles will run at 1.6 GHz. The Xbox One is running at that speed, but I'm fairly certain the PS4 runs at a higher clock, simply to put it over the edge in CPU power and bragging rights (which is also bragging rights for marketing).

Power consumption is less of an issue: they'll draw less power, produce less heat, and therefore make less noise, mainly because of the much denser transistors (over 10x the count of the old consoles, I believe).

GDDR5 is MUCH faster than DDR3. The PS4 has a max theoretical bandwidth of 176 GB/s compared to the Xbox One's 68.3 GB/s, quite a big difference! The Xbox One's eSRAM brings it up to around 102 GB/s, though in practice the eSRAM won't be enough to keep performance on par with the PS4, since 32 MB plus roughly 50% less GPU power is far too much of a gap to take on.
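Those bandwidth figures fall straight out of transfer rate times bus width. A quick check (transfer rates and the shared 256-bit bus width are the commonly reported specs for both consoles):

```python
def bandwidth_gbps(transfer_mtps, bus_bits):
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return transfer_mtps * (bus_bits / 8) / 1000.0

# PS4:      GDDR5 at 5500 MT/s on a 256-bit bus.
# Xbox One: DDR3  at 2133 MT/s on a 256-bit bus.
print(f"PS4:      {bandwidth_gbps(5500, 256):.1f} GB/s")
print(f"Xbox One: {bandwidth_gbps(2133, 256):.1f} GB/s")
```

With the bus width identical, the whole gap comes from GDDR5's much higher transfer rate.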

Optimization is key. Marketing is the winner.

Edit: the Xbox One will have only 5 GB of DDR3 RAM available for games; the other 3 GB is reserved for the OS, which will further widen the gap between them.
 
So, how long are we looking at until we get a true powerhouse APU, say even a full-on 4770K/780-class APU? And how would that work, given that AMD makes both proper CPUs and GPUs while Intel really only does CPUs with crummy in-house integrated graphics?


And the final question: if game devs now have 8 cores to play with in the consoles, and consoles are the lowest common denominator, can we now expect 8-core usage in games as the norm?
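For what "using 8 cores" means in practice: an engine splits independent per-frame work (physics, AI, particles, and so on) into chunks and fans them out across the cores. A minimal sketch using Python's standard library (the `simulate_chunk` workload is a made-up stand-in, not real engine code):

```python
# Minimal sketch of fanning work out across CPU cores, the way an
# engine might split independent per-frame jobs across worker threads.
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(entities):
    # Stand-in for per-entity work (a physics step, an AI tick, ...).
    return sum(e * e for e in entities)

if __name__ == "__main__":
    # Split 8000 "entities" into 8 chunks, one per core.
    chunks = [list(range(i, i + 1000)) for i in range(0, 8000, 1000)]
    with ProcessPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(simulate_chunk, chunks))
    print(sum(results))
```

The catch the thread alludes to is that not all game work splits this cleanly: anything with dependencies between chunks needs synchronisation, which is why engines took years to scale past a few cores.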
 
That's a VERY long way off from that type of performance. We don't know how it will work out, except that integration will eventually take over, with no more discrete GPUs and everything on a single die.

Game engines need to evolve before anything really great can stand out and make full, proper use of 8 cores and hyperthreading.
 
The trouble I can see is that games have evolved.

The Xbox One and PS4 no longer have the wow factor that past consoles did. Or maybe I was just ignorant of the PC world back then, but when the PS1 and the Xboxes came along I was like, wow.

Now I see Battlefield 4 and CoD number three million, and they all look like last gen. They look nice, but not wow like when I first saw the trailer for Battlefield: Bad Company 2 and thought, "I'm playing that game!!!!"

I played Crysis 2 on Xbox 360 and spent hours trying to convince myself how good the PC version looked, and even to me, someone addicted to how things look, it was "yeah, it looks better, but not 100% better". Same with BF3. That just leaves better frames per second.

So how long now until we get photo-real graphics, like playing real life?
 
Photorealism is doable in multi-million dollar (or pound) labs. At the consumer level I'd say 10 years, give or take 2. Video? Give that an extra 2 on top of stills.
 