Nvidia rumored to release their GTX 1080 GPU in May

WYP

News Guru
Nvidia is rumored to be releasing its GTX 1080 GPU in May, which, judging by the leaked specifications, will have 8GB of GDDR5X memory and be powered by a single 8-pin power connector. More powerful Pascal-based GPUs are expected to be released later.
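For context, a single 8-pin connector caps the card's power budget: the PCIe spec allows up to 150W from an 8-pin connector plus 75W from the x16 slot. A quick sketch of that arithmetic (the connector count is from the rumor; the per-source limits are PCIe spec values):

```python
# Rough PCIe power-budget arithmetic (spec ceilings, not measured draw).
PCIE_SLOT_W = 75   # PCIe x16 slot: up to 75 W
PIN6_W = 75        # 6-pin auxiliary connector: up to 75 W
PIN8_W = 150       # 8-pin auxiliary connector: up to 150 W

def board_power_ceiling(six_pin=0, eight_pin=0):
    """Upper bound on board power from the slot plus auxiliary connectors."""
    return PCIE_SLOT_W + six_pin * PIN6_W + eight_pin * PIN8_W

print(board_power_ceiling(eight_pin=1))             # rumored GTX 1080: 225 W
print(board_power_ceiling(six_pin=1, eight_pin=1))  # GTX 980 Ti (6+8-pin): 300 W
```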

Read more on Nvidia's rumored GTX 1080 GPU.
 
If it is coming out with GDDR5X, it will be extremely disappointing. VR devices are here, and they need every bit of power the new GPUs can give them. HBM would have been much better for getting favorable results.

*Sigh* Once again the end user will get screwed... if nVidia uses the slightly cheaper (for them) GDDR5X RAM on their high-end GPUs while charging the end user a premium price.

If this is the case, one word comes to mind. <Fill in the curse word of your choosing here for nVidia>

I think I will be going with an AMD GPU this time; it seems they are updating their drivers regularly now, which was my only real gripe with them in the first place.
 
If it is coming out with GDDR5X, it will be extremely disappointing. VR devices are here, and they need every bit of power the new GPUs can give them. HBM would have been much better for getting favorable results.

*Sigh* Once again the end user will get screwed... if nVidia uses the slightly cheaper (for them) GDDR5X RAM on their high-end GPUs while charging the end user a premium price.

If this is the case, one word comes to mind. <Fill in the curse word of your choosing here for nVidia>

I think I will be going with an AMD GPU this time; it seems they are updating their drivers regularly now, which was my only real gripe with them in the first place.

I think NV isn't doing anything wrong here. HBM has nearly no benefit compared to a decently decked-out GDDR5 or 5X memory interface. I can imagine seeing HBM on the Ti versions.
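The back-of-envelope bandwidth numbers support that: peak bandwidth is just the per-pin data rate times the bus width. A quick sketch (the GDDR5 and HBM figures are the shipping 980 Ti and Fury X configurations; the GDDR5X line assumes the rumored 10 Gb/s chips on a 256-bit bus):

```python
# Peak memory bandwidth in GB/s = per-pin rate (Gb/s) * bus width (bits) / 8
def bandwidth_gbs(rate_gbps, bus_bits):
    return rate_gbps * bus_bits / 8

print(bandwidth_gbs(7, 384))   # GTX 980 Ti, GDDR5:  336.0 GB/s
print(bandwidth_gbs(1, 4096))  # Fury X, HBM1:       512.0 GB/s
print(bandwidth_gbs(10, 256))  # rumored GDDR5X card (assumed 10 Gb/s, 256-bit): 320.0 GB/s
```

So a 256-bit GDDR5X card lands in the same ballpark as the 980 Ti despite a narrower bus, and as the Fury X showed, HBM's extra headroom rarely turns into frames at this tier.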
 
If it is coming out with GDDR5X, it will be extremely disappointing. VR devices are here, and they need every bit of power the new GPUs can give them. HBM would have been much better for getting favorable results.

*Sigh* Once again the end user will get screwed... if nVidia uses the slightly cheaper (for them) GDDR5X RAM on their high-end GPUs while charging the end user a premium price.

If this is the case, one word comes to mind. <Fill in the curse word of your choosing here for nVidia>

I think I will be going with an AMD GPU this time; it seems they are updating their drivers regularly now, which was my only real gripe with them in the first place.

Why don't you just wait and see?
You are too quick to bash Nvidia.
We have seen the Fury X: it has HBM, but it only gets a few more fps at most from game to game than the 980 Ti, which has GDDR5, and this is GDDR5X we're talking about.
So just hold back your bashing.
And it's not just GDDR5X; it's also a new architecture plus 2GB more VRAM than the 980 Ti.
And just like the guy before me said, HBM has no real benefit here; it would raise prices and lower production capacity for almost no gain.
VR, you say, or 4K?
They're far from being standard and profitable enough for manufacturers to build around.
That's why you have flagships, and maybe the 1080 Ti will have HBM, but again, no real need yet.
 
Remember that this isn't Big Pascal; there was no point where we really expected HBM on anything other than Big Pascal.

There is still the Titan-tier card and the 980 Ti-segment card to come above the 1080, provided that the 1080 is the actual name.
 
If this picture is to be believed, then it also won't use NVLink and will instead use the normal SLI fingers.

Link



Honestly, if it hasn't got HBM2 it's not a big deal, as HBM doesn't offer much more performance in games compared to GDDR5.

If the new architecture were being held back by the VRAM, then I think nVidia would be using HBM2 on all cards, but I am guessing it's not, so it won't/shouldn't be an issue.

Either way, we won't know until the cards release and some games release that take advantage of them, unlike most current games, which from my experience just use resources for the sake of it rather than for any real benefit.
 
VR devices are here, and they need every bit of power the new GPUs can give them.

VR devices might be here, but the support and content aren't, so it is still very much a niche (and very expensive) product and not where the dollars are, and probably won't be for a year or two. And that's only if the games industry doesn't treat it like the film industry treats 3D cinema (which IMO is a bit more of a recycled gimmick than VR) and just slap "VR support" on games with a mere modicum of enhanced interactivity.
 
Having used HBM, I would rather have a card with GDDR5X.

8GB is not enough for the potential life of a Pascal/Polaris card, though.
 
If this picture is to be believed, then it also won't use NVLink and will instead use the normal SLI fingers.

Link



Honestly, if it hasn't got HBM2 it's not a big deal, as HBM doesn't offer much more performance in games compared to GDDR5.

If the new architecture were being held back by the VRAM, then I think nVidia would be using HBM2 on all cards, but I am guessing it's not, so it won't/shouldn't be an issue.

Either way, we won't know until the cards release and some games release that take advantage of them, unlike most current games, which from my experience just use resources for the sake of it rather than for any real benefit.

That is a picture of a 980 Ti chip; it has GM200-310-A1 written on it.
 
I look forward to seeing what it is capable of. Let's just hope that the price point is better than the 980's ever was.
 
That is a picture of a 980 Ti chip; it has GM200-310-A1 written on it.


I might be blind here, but where does it say that on the picture? I have zoomed in on it and didn't see anything.

The only time I see 980 Ti is when I try to save it, and then it shows as an EVGA 980 Ti SC.
 
Why does everyone keep repeating the rumor about GDDR5X? It won't even hit volume production until summer. The card won't have GDDR5X, and if it does, expect super low quantities and chips that don't overclock well due to lower yields.
 
lol What if Nvidia pulls an AMD and the first gen is just rebranded 900 series cards lol

If that is what it's going to be, I'm going to be laughing my a** off lol

As I have said, don't buy into the hype. And what are people doing? Buying into the hype.
 
I don't understand why so many of these video cards have TWO DisplayPorts... I don't know a single person (even enthusiasts) who cares about DisplayPort; they all just use HDMI. HDMI is ubiquitous, and they don't see any advantage to using DisplayPort instead.

You know what they do want? TWO HDMI PORTS!
 
I don't understand why so many of these video cards have TWO DisplayPorts... I don't know a single person (even enthusiasts) who cares about DisplayPort; they all just use HDMI. HDMI is ubiquitous, and they don't see any advantage to using DisplayPort instead.

You know what they do want? TWO HDMI PORTS!

I think you may have that backwards... DP has higher bandwidth available for higher resolutions and higher refresh rates.
 
I don't understand why so many of these video cards have TWO DisplayPorts... I don't know a single person (even enthusiasts) who cares about DisplayPort; they all just use HDMI. HDMI is ubiquitous, and they don't see any advantage to using DisplayPort instead.

You know what they do want? TWO HDMI PORTS!

HDMI is overrated in my book. To me, HDMI = TVs and DisplayPort = computers.
 
I don't understand why so many of these video cards have TWO DisplayPorts... I don't know a single person (even enthusiasts) who cares about DisplayPort; they all just use HDMI. HDMI is ubiquitous, and they don't see any advantage to using DisplayPort instead.

You know what they do want? TWO HDMI PORTS!

Check out some 4K monitors; they love DisplayPort.
 
lol What if Nvidia pulls an AMD and the first gen is just rebranded 900 series cards lol

If that is what it's going to be, I'm going to be laughing my a** off lol

As I have said, don't buy into the hype. And what are people doing? Buying into the hype.

Do you mean what if nVidia pull an nVidia and do a refresh of Maxwell, just like they did with Fermi and Kepler? Because personally I think that would be excellent. The 970 and 980 look quite uncompetitive at the present time, and if what is now the 980 Ti became the 1080 at around the £400 mark, I'd be very happy. The 5XX and 7XX series high-end cards were excellent; personally I'd like to see that again, as Maxwell in its best forms has always been on the expensive side.

JR
 
I don't understand why so many of these video cards have TWO DisplayPorts... I don't know a single person (even enthusiasts) who cares about DisplayPort; they all just use HDMI. HDMI is ubiquitous, and they don't see any advantage to using DisplayPort instead.

You know what they do want? TWO HDMI PORTS!

Then you don't know any enthusiasts.

DP is very popular in the enthusiast world, as you can have very high refresh rate, high-res panels, e.g. 1440p @ 165Hz, and with DP 1.3 we'll have 4K at 120Hz, something HDMI cannot do.
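A rough sanity check of that 4K at 120Hz claim, counting pixel payload only (real links also spend bandwidth on blanking intervals):

```python
# Uncompressed video payload vs. usable link bandwidth, in Gb/s.
def video_gbps(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

need   = video_gbps(3840, 2160, 120)  # 4K @ 120 Hz, 24-bit color: ~23.9 Gb/s
hdmi20 = 18.0 * 8 / 10                # HDMI 2.0: 18 Gb/s raw, 8b/10b coding -> 14.4 usable
dp13   = 32.4 * 8 / 10                # DP 1.3: 32.4 Gb/s raw, 8b/10b coding -> 25.92 usable

print(f"needed {need:.1f}, HDMI 2.0 {hdmi20:.1f}, DP 1.3 {dp13:.1f}")
# needed 23.9, HDMI 2.0 14.4, DP 1.3 25.9 -> only DP 1.3 has the headroom
```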
 