Quick News

I'm still confused as to whether rumours are allowed to be posted or not, but OK.

Not confusing at all. Really straightforward and has nothing to do with posting rumours in the quick news section.
I am simply saying I don't believe the story you posted is true, because it basically says "hey, we heard from nothing and no one that after Navi we get a next-gen architecture, based off an old roadmap that is extremely vague and has no content about this next-gen architecture."

I don't see how that is believable. Seems like clickbait.
 
OK, that's fine then. I got the impression you didn't think rumours should ever be posted, but if that's not the case then I understand.
 
It is OK to post rumours here.

Also, the slide that says "Next Gen" on the 7nm+ roadmap is from CES, so it's as recent as the GPU roadmap gets.

If they are already developing a new GPU architecture, this will be Raja's baby, though it will be David Wang who brings the design to fruition.

Whether or not this is true is hard to say. In some ways it makes sense for AMD to design a new architecture, as there is the possibility that they are simply finding GCN too difficult to scale. The question is whether AMD want to make some fundamental changes to GCN or build something new based on what they learned with GCN.

Just look at it this way: AMD brought back Jim Keller in August 2012 to design Zen, and he left in September 2015 after the design was basically done, with a product release in 2017.

Raja becomes the head of the newly formed RTG in September 2015, leaves in late 2017. Imagine if Raja was working on something new, perhaps a heavily reworked GCN or a totally new microarchitecture, a release in 2020 would seem accurate. Remember that it takes a lot of time before a totally new design can be made into a product.

This makes a lot of assumptions, but it is possible.
 
Taken from another forum; original post:

Nvidia has discontinued GP102 production; the Titan X Pascal, Titan Xp and 1080 Ti have now reached EOL.

Nvidia GA104 reportedly already in production since February, probably over the last week.

Nvidia to unveil the GA104 chip based on the Ampere architecture and announce GTX 2070 and 2080 cards at GTC 2018, 26-29 March 2018.

Ampere GTX 2070 and GTX 2080 with GDDR6, launch date 12 April 2018?

https://translate.google.com/transl...104-seit-februar-produktio&edit-text=&act=url
 
... I doubt this is true, as it would mean that Nvidia wasn't creating new GPU chips for a while and effectively caused their own GPU shortage in the mining boom.

While Nvidia does need to march forward with new products, I don't see why they would want to do so ATM.

And people will complain. And then those people will go and buy their new GPU.
 
I thought I had covered that here, but I think it may have been somewhere else.

Recently I read an article about mining (well, a couple of months back) and it asked "If mining is so good why don't AMD and Nvidia just up production and make loads more?".

The answer was pretty complicated, but I will do my best to explain. OK, so basically if they both did that then they'd make a rod for their own backs. Firstly, if mining dies then they are going to be stuck with loads and loads of stock. That's fine, but it will be worth a crapload less than before. It also prevents them from carrying out any plan of releasing their next gen, because they are still stuck holding the baby. Which means that they would have to sell that stock at rock-bottom prices, taking a loss, just to get rid of it. Then the market gets flooded with old stock on the cheap and people all buy that.

For example, after the 400 series Nvidias were given the heave-ho and Nvidia moved to the 500 series, there was loads of cheap stock flying around. Instead of buying a 500 series card I bought a GTX 470 for £170, less than half of what it launched for. I put up with the helicopter noise for as long as I could, then whacked a £25 Zalman cooler on it and hey presto, it was easily as fast as a 570. The 480 was also very popular at the knock-down price of £220.

Not only that, retailers don't want to be stuck with loads of old stock when the new cards come out because all of the people who buy every new Nvidia GPU (and there are many) will want the new Nvidia GPU right away.

So it's a fine balance, one that is hard to get right without losing money. And that is why both of them (and more importantly AMD, who could go bust in such an event) are just making pathetic amounts: enough that they all sell and make a decent profit, but not so many that they are risking their shirt and being stuck holding the baby.

IIRC Nvidia are now all out of Titan Xp Star Wars cards in the USA. I doubt that they will be replenished. This means only one thing, IMO, and that is that Ampere is near. (I predicted this weeks ago off the back of stuff I found out just by playing detective for a few hours. Most notably: why are Scan and others marking the 1080 Ti as EOL? Why not just OOS? They are potentially ruining the chances of selling someone a GPU if they say it's EOL. OOS? You would go back every other day to check. EOL? You wouldn't bother.)

Why does everyone assume it's the 2080 series? Did I miss something? Why not 1180? Each series has jumped by 100, with the exception of the 880M.

Also, I think it's smart to exhaust their supplies of Pascal. It means they are not stuck with surplus stock. They have sold all chips at full cost/full profit.

Yup, 7, 8*, 9, 10, 20? Doesn't make sense :D

*The 800 series was laptop-only.
 
Because 2080 looks better than 1180. If they go with 2080 then they can start moving towards 3080, 4080, etc., which is a cool shift. The reality is, I'm just calling it that as a placeholder. I have no idea what it's actually going to be called. I didn't think they'd call the 1080 a 1080. I thought it sounded silly, but that's what they went with, so what do I know? :p
 
It may be neither tbh. Neither sounds very appealing to me.

IMO they should do away with numbers and name their cards like AMD do. I like that much better.
 
I thought they were going to do that with Pascal. It seemed like the perfect opportunity to try something new. They had a massive market share and pretty much everyone wanted to or was using Nvidia. They could have named them NvidiaMacPascalFace and people would have bought them.
 
lmao. "Overpriced McDickface" has a certain ring to it :D

But yeah, 2080? I doubt it. "What GPU you got, mate?" "A twenty eighty!" It just sounds a bit smeggy.
 
I quite liked the old AMD Southern Islands naming scheme. Tahiti/Pitcairn, etc.

It's actually so easy. It could go, for example, like this: Ampere One, Ampere Two, and so on. IDK why they give them stupid numbers; there's just no point.
 
Yeah, I think it's a lot easier to market processors or electronics with numbers. I'm surprised AMD stuck with their codename, Vega. I think it works really well, especially with the 56 and 64, but Nvidia are likely to continue with their boring nomenclature as it's simpler.
 