AMD's Navi architecture referenced in Linux drivers

Better be perfect, otherwise I fear it will be very far behind Nvidia.

Agree there man. Given the disappointment Vega turned out to be, they either need something along the lines of what Ryzen turned out to be, or maybe they will concede and focus on console gfx. TBH I think AMD are beaten in terms of graphics, and I reckon Nvidia have more held back.
 
We need mGPU, or whatever the heck it is called, working first. I do hope that AMD don't go making more tech for 5 years' time now. If they do, they deserve to go out of business.
 
Maybe, just maybe, they have been doing to GPUs what they did to CPUs :D Now that would be sweet.
 
0% chance. They have just "glued" Polaris together. From what I recall, that is all it is: just multiple Polaris cores that rely on DX12 mGPU, or whatever it is called. Nvidia are making theirs like that too, so expect serious coreage within the next decade.

My worry is that AMD will do that now, before, for example, the 7970 (and every GPU they have made since) even does what it's good at.
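For anyone wondering what "DX12 mGPU" actually refers to: it is DirectX 12's explicit multi-adapter model, where the application itself enumerates the GPUs (or the linked nodes a driver exposes for a multi-die board) and splits work between them, instead of the driver hiding it behind CrossFire/SLI. A rough sketch of the enumeration side, assuming a Windows D3D12 toolchain (link d3d12.lib and dxgi.lib); illustrative only, not code from anyone in the thread:

[CODE]
// Sketch: enumerate adapters and query DX12 multi-adapter capabilities.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            continue;  // no D3D12 support on this adapter

        // A node count above 1 means the driver presents several physical
        // GPUs (potentially several dies) behind one logical device.
        UINT nodes = device->GetNodeCount();

        // CrossNodeSharingTier reports how freely resources can be shared
        // between those nodes.
        D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                    &opts, sizeof(opts));

        wprintf(L"Adapter %u: %s  nodes=%u  sharing tier=%d\n",
                i, desc.Description, nodes,
                static_cast<int>(opts.CrossNodeSharingTier));
    }
    return 0;
}
[/CODE]

The catch, and why "we need mGPU working first" is fair, is that every engine has to do that scheduling itself; nothing comes for free from the driver.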
 
Um no. Have you ever even looked at the architecture of any AMD design? It's not glued together. The closest thing to being glued is the interposer, and even then that's stretching it.

As for future architectures, it is highly likely we are moving towards that. It has more benefits than cons.
 
Did you miss the opening " and closing ", dude? I was being sarcastic and borrowing Intel's phrase for IF.
 
Just remember, guys, that sarcasm isn't read well in text, especially given the fact that people from all over the world (mostly Europe) use this forum. (Sarcasm) is a much better indicator of sarcasm than quotation marks.
 
That has absolutely no impact on sarcasm, and your post in general does not convey any sense of sarcasm. You kept going and explained your thinking on how future fabrics are going to be.
 
If I had left it without the " then you might have a point. The very fact I surrounded it with " was me trying to emphasise that it is "glued" together. If you don't understand that, it's your lookout, mate, TBH.
 
As I said in my previous post, sarcasm is not read well in text. Quotation marks do not mark sarcasm; quotation marks highlight quotes.

The best practice for sarcasm is to put (sarcasm) directly at the end of a sarcastic statement. This puts it beyond any shadow of a doubt that you are being sarcastic.

As I said before, people from all over the world use this forum, many of whom do not speak English as their first language. Sarcasm is difficult to read at the best of times, never mind for those who are reading it in a different language.
 
Context never reads well either, TBH. Now imagine being ASD and having the context hurdle to jump as well...
 
Has nothing to do with it. It was the furthest thing from sarcasm. I can read sarcasm over a post just as well as anybody here. Heck, me and Dice do it all the time without " or (sarcasm). It's not difficult to convey such a message for someone who speaks English, but his post had no indication of sarcasm. You don't call it sarcasm when you are explaining your line of thinking in agreement with the statement. Sure, his " implies him taking a dig at Intel, and I got the reference, but to call it sarcasm in the context of the post? Definitely not.

Point is, a multi-die chip is not far away. And to think any previous design has this layout is wrong; Zen was the first to have it. It's successful, and now everybody is going to know it has serious potential. More R&D will be put into it, and people will come up with better designs.
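For what it's worth, you can see that layout for yourself on Linux: the sysfs topology files show how a multi-die part like Zen groups its cores into packages and dies. A rough sketch, illustrative only (the die_id file only exists on newer kernels and is printed as -1 where absent):

[CODE]
// Sketch: dump CPU topology from Linux sysfs to see package/die/core grouping.
#include <fstream>
#include <iostream>
#include <string>

// Read one integer from a sysfs file; return -1 if missing/unreadable.
static int read_id(const std::string& path) {
    std::ifstream f(path);
    int v = 0;
    if (f >> v) return v;
    return -1;
}

int main() {
    for (int cpu = 0;; ++cpu) {
        std::string base =
            "/sys/devices/system/cpu/cpu" + std::to_string(cpu) + "/topology/";
        int pkg  = read_id(base + "physical_package_id");
        int core = read_id(base + "core_id");
        if (pkg < 0 || core < 0) break;      // ran past the last CPU
        int die = read_id(base + "die_id");  // -1 on kernels without die_id
        std::cout << "cpu" << cpu << ": package " << pkg
                  << ", die " << die << ", core " << core << "\n";
    }
    return 0;
}
[/CODE]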
 
So now you are on the internet telling people how they are actually thinking and feeling.

Srs get the sand out of your vagina.

Tole - I was being sar-car-stik ;)
 
AMD's next top end single GPU NEEDS to be at least as good as the 1080 Ti. If not, I fear Radeon may well go under, leaving AMD to do CPUs only and leaving Nvidia as the sole GPU vendor on the market. We would then see 1070-style cards costing the same as a 1080 Ti.
 
They have the cards stacked against them. Pun intended ^_^

They really do need to hit it out of the park.
 
Well, the idea is sound. It's like the dawn of something really big, because look back to single-core CPUs and then where we are now.

Pretty cool that in ten years I may be talking crap on forums using a quad-core GPU with four tiny little dies smashing out games.

Actually fella, I don't really agree that it needs to match the 1080 Ti. It just needs to be reasonably decent, with no BS spread around it, and it can't cost the earth or chow down on power. Vega could have been good without HBM, at a much cheaper price and with less power draw. The 480 and so on were quite the success IMO, even though a lot of people disagree. No BS, no hype, none of that, just a decent little GPU for a great price (remember how it gave Nvidia a slap and brought down the prices of the 970 and 980?).

Not everyone games with a super high end GPU. In fact, hardly anyone does. Most people use mid-range stuff.
 