Just saw this video :O

I'm not that excited just yet. If the recent string of games is anything to go by, though, it won't be long now.

Implementing multiple core support really isn't that hard. It was just game devs saying "Well, consoles don't support it, so why spend more than ten seconds on something that won't have any financial benefit to us at all?"

That's set to change now though.
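To put the "isn't that hard" bit in perspective: just spawning a thread per core really is a few lines these days. A minimal C++11 sketch (update_chunk is a made-up stand-in for whatever per-frame work a game would farm out):

```cpp
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// Hypothetical stand-in for real per-frame work (AI, physics, particles...).
void update_chunk(int chunk_id) {
    std::printf("updating chunk %d\n", chunk_id);
}

int main() {
    // One worker per logical core; fall back to 2 if the count is unknown.
    unsigned n = std::max(2u, std::thread::hardware_concurrency());

    std::vector<std::thread> workers;
    for (unsigned i = 0; i < n; ++i)
        workers.emplace_back(update_chunk, static_cast<int>(i));

    for (auto& t : workers)
        t.join();  // wait for every worker before carrying on
}
```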

When I become a game dev I'll personally come back to you and tell you how hard it really is :p Until then... you stand to be corrected lol
 
lol :P I think it's more about the "profit" than the "complexity". When it suddenly becomes profitable, which it looks like it's set to do, I'm sure they will all be jumping to support 8 cores ^_^
 
Well yes, no doubt, but the more cores/threads that need to be properly (or correctly) coded for, the more complex it gets. It can be done and doesn't take too long, but the money factor, as you stated, is probably the reason why it hasn't been done; it wasn't a necessity.
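A minimal sketch of where that complexity comes from: the moment several cores touch the same data, you need synchronisation or you have a data race. The shared score counter here is invented purely for illustration; with a plain int instead of std::atomic, the final total would come out wrong on most runs:

```cpp
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

// Invented shared state: a score several threads bump at once.
// With a plain `int` these increments would be a data race (undefined
// behaviour); std::atomic makes each one indivisible.
std::atomic<int> score{0};

void award_points(int count) {
    for (int i = 0; i < count; ++i)
        score.fetch_add(1, std::memory_order_relaxed);
}

int main() {
    std::vector<std::thread> workers;
    for (int i = 0; i < 8; ++i)  // pretend we're on an 8-core chip
        workers.emplace_back(award_points, 100000);

    for (auto& t : workers)
        t.join();

    std::printf("score = %d\n", score.load());  // reliably 800000 only thanks to the atomic
}
```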
 
When I become a game dev I'll personally come back to you and tell you how hard it really is :p Until then... you stand to be corrected lol

I already know, dude. A friend of mine used to program emulators and added dual CPU support for me back in 2002; it took him a few hours.

Most of it will come with the game engine, so it's a question of whether or not they implement it.
 
Yes, dual CPU support... now add 6 more cores and threads. From my understanding, emulators are worthless nowadays for game engines. Why try to mirror hardware when you can access it directly?
 
They only need to do it once.

I'm going to test a few games that I suspect support any number of cores (like RAGE) when I get mine in :)
 
What you need to understand is that designing and implementing a multi-threaded game engine takes a lot of time and is a mammoth task. Not to mention the engine API will most likely be more indirect, and it will take game programmers longer to learn it and then use it to their advantage.
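To make the "more indirect API" point concrete, here's a toy sketch of the sort of job system an engine might expose. It's purely illustrative, not how Frostbite, Unreal or any named engine actually works: game code submits jobs to a pool sized to the machine's core count rather than calling the work directly.

```cpp
#include <algorithm>
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Toy job system: game code submits work instead of calling it directly,
// and a pool of workers (one per logical core) drains the queue.
class JobSystem {
public:
    JobSystem() {
        unsigned n = std::max(1u, std::thread::hardware_concurrency());
        for (unsigned i = 0; i < n; ++i)
            workers_.emplace_back([this] { run(); });
    }

    ~JobSystem() {
        {
            std::lock_guard<std::mutex> lock(m_);
            done_ = true;
        }
        cv_.notify_all();
        for (auto& t : workers_) t.join();  // remaining jobs drain first
    }

    void submit(std::function<void()> job) {
        {
            std::lock_guard<std::mutex> lock(m_);
            jobs_.push(std::move(job));
        }
        cv_.notify_one();
    }

private:
    void run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();  // run outside the lock so workers don't serialise
        }
    }

    std::vector<std::thread> workers_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    JobSystem pool;  // sized to however many cores the machine has
    for (int i = 0; i < 16; ++i)
        pool.submit([i] { std::printf("job %d done\n", i); });
}  // destructor waits for everything to finish
```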
 
Hence why they cost millions to develop. And then on top of that there's a learning curve people have to get through.
 
What you need to understand is that designing and implementing a multi-threaded game engine takes a lot of time and is a mammoth task. Not to mention the engine API will most likely be more indirect, and it will take game programmers longer to learn it and then use it to their advantage.

As I said before though, it only needs to be done once. DICE have already done their work on Frostbite, which is why it will support anything you throw at it. You could say they did that a bit prematurely, but who will be laughing now they've just been told they're about to code for consoles that support 8 cores?

Same goes for id Tech 5. It was one of the very few games to take everything I threw at it with my Quad SLI/Surround rig and work flawlessly. Yeah, it was only DX10, but it was still one of the most beautiful games I have ever seen. Shame it was a bit bland and went somewhat unnoticed.

90% of game developers do not use their own engines. Case in point: Bethesda clambered into bed with Carmack and used id Tech 5 for Skyrim. They will also no doubt use it for Fallout 4, or whatever it's going to be named. Skyrim also sees the 8350 win against the 3570k. I will, of course, put that into writing once my rig is set up.

When the PS4 and the new Xbox launch you will see new engines (most certainly an Unreal Engine of sorts) that all support lots of cores natively.

However, what I suspect has happened here is something as simple as the hotfixes which have allowed Piledriver to work properly, as it should. I did read a bit about the scheduling being out of whack, caches being flushed that shouldn't have been, and all of those things leading to degraded performance.

So tbh? This debate about cores is a non-debate. Even if it cost $5m to implement, you can bet your rosy ass cheeks it'll be implemented once these new consoles are with us.

Hence why they cost millions to develop. And then on top of that there's a learning curve people have to get through.

Of course! But in the world of games it's all about going big or going home. No real red-blooded gamer will play inferior titles, and people will flock to those that have had the time put in and the money spent.

See also: Battlefield 3. Whilst it may not be my game of choice, I have to admit it's a technical masterpiece.
 
Hence why they cost millions to develop. And then on top of that there's a learning curve people have to get through.

Is that an addition to my post or are you telling me that? lol

EDIT:

90% of game devs is a bold statement. lol
Skyrim uses Bethesda's Creation Engine.
 
Addition, and I agree, 90% is a bold statement.

AlienALX - yes, they need to program the game engine once, BUT then they still have to program the consoles/PC to use all those cores. Yes, it will be easier and quicker, but it still needs to be done.
 
90% isn't far off. Most games these days just use some iteration of the Unreal Engine. It's very rare to see a company that codes the engine AND makes the games using that engine.

Seriously, very few game developers are 100% "in house".
 
Name some who do use their own engine.
 
Bethesda's Creation (even though it's linked with Gamebryo). id Tech 5 (even though id don't really make games any more). DICE's Frostbite. And possibly Square Enix (Hitman SD and Tomb Raider).

Other than that? I can't think of any more.
 
Well this is just it. If the 8350 can turn the tables on a 3570k *now* then imagine what will happen when games always use all 8 cores :o

From where I stand, to me personally (this is just an opinion), AMD's breakthrough CPU is not the 8350, it's the 8320.

Simply because Intel offer no cheaper K solution than the 3570k, and the 8320 can do everything its big brother can do.

Sure, the 8350 comes faster out of the box, but the success always comes from lower in the ranks, from the little CPU that could.

Mind you let's face it, the 8350 doesn't exactly cost a million dollars does it?

Actually, the 8350 doesn't have 8 cores. It has 4 cores, with each core having two smaller threads/cores in it. Much like Intel's Hyper-Threading, but more efficient and better in real-time use. Still, AMD has done a mighty fine job on a budget CPU that pretty much kills Intel in gaming. :D
Still, Bulldozer was shit. :p
But Piledriver. Wow.
 
They're clusters, similar to the PS3's CPU. The one that was never utilised fully, thanks to the 360 and developers being generally lazy.

There are four clusters, each containing two cores. They're different to how Intel's work, though.

BD is only about 15% slower than PD, which, yeah, is quite a lot, but it's more the fixes and so on that have sorted out how it's used.
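Whatever you call them, modules or cores, software mostly just sees a count of logical processors reported by the OS. A quick sketch of the sort of query a game would make (on an 8350, Windows reports 8):

```cpp
#include <cstdio>
#include <thread>

int main() {
    // On an FX-8350 (four modules, two integer cores each) the OS
    // schedules each module's cores separately, so this prints 8.
    std::printf("logical cores: %u\n", std::thread::hardware_concurrency());
}
```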
 
Bethesda's Creation (even though it's linked with Gamebryo). id Tech 5 (even though id don't really make games any more). DICE's Frostbite. And possibly Square Enix (Hitman SD and Tomb Raider).

Other than that? I can't think of any more.

You're too focused on larger companies.
 
Well yes, given that they are able to afford the time and money it takes to design their own engines.

It's the smaller companies who use stuff like the Unreal Engine or something pre-made, and they make up the majority, I would have thought.
 
http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen

You guys really need to read that. It's very interesting reading. Most notably:

We approached a number of developers on and off the record - each of whom has helped to ship multi-million-selling, triple-A titles - asking them whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. Bearing in mind the historical dominance Intel has enjoyed, the results are intriguing - all of them opted for the FX-8350 over the current default enthusiast's choice, the Core i5 3570K.

I can really see this turning into a core race, similar to an arms race. How gutted are Intel going to be to have to sell 6-8 core CPUs for less than the stupid prices they charge now?

Ladies, Gentlemen, AMD are back.
 