Nvidia is rumoured to be releasing a GTX 1070 Ti?

1080FTG = 1080 Filling The Gap :lol:

I think what NVIDIA should be doing right now is dropping maybe a 1050 or 1060Ti and targeting the low-to-mid bracket at a comfortable price, as they would sell like hot cakes and steal the bang-for-buck thunder from AMD; the 1070 and 1080 are already proven cards in their tier.
 
Vega56 OC seems to be doing fine up against 1070 OC.

Even more so since people have started flashing their BIOS to 64 clockspeeds.

And yet that changes nothing. People are still going to buy the 1070. Lower power consumption. Lower heat. "Nvidia". Basically sums it up.
 

The only way that will change is if AMD produce cheap, great GPUs like they did with Ryzen. When the prices are the same, or close, why on earth would you buy from what is, to you, an unknown company?

The only way to win is to set off people's tight alarms and make the deal so good that they will switch. That is what has happened with Ryzen.

They had that with Polaris. People were getting right behind them and cheering on the pics and posters they were making. Polaris was awesome (note was). Small, cheap to make, high yields, everyone can afford a decent GPU for the first time in years = land grab. Then they went in the complete opposite direction with Vega. That is what I just don't understand. AMD were back to doing what they are good at and they seemed to let it go to their head.

A Vega die with GDDR and we would seriously have been onto something.
 

From what we've been told, though, Vega without HBM2 would not have been possible. They were stuck with HBM whether they wanted to use it or not.

Vega 56 has only marginally (5-10%) better performance-per-watt than Polaris, and Vega 64 is only around 980Ti levels of performance-per-watt. That's appalling in my eyes.

That's called an OC 1070.

Yeah, to me even Vega 56 is a bit of a failure. Reviewers seem to like Vega 56 a lot, but I don't see why. Sure it slightly beats a reference 1070 on average, but a year late and with a significantly higher TDP. A non-reference 1070 with a 2050MHz overclock (very attainable for pretty much all 1070s) should be able to beat a Vega 56 in most tests, even when the 56 is overclocked itself. And even if it can't, it'll be far more efficient (and it's older and could see a price reduction).

No matter how I look at it, Nvidia is winning in the mid-high (1070), high-end (1080), and enthusiast sector (1080Ti; even the Titan X isn't that bad). I wouldn't recommend high-end Freesync monitors any more as AMD just can't make high-end GPUs to warrant them. You're better off paying the premium for Gsync because at least you're getting the graphics cards you want at the right time. You're paying for them, but at least they're good cards.

I don't think Vega is a good architecture for gamers, and that means neither is high-end Freesync. A £300 1080p Freesync monitor with a £300 RX 580, yep, very good value for money. A £500 1440p Freesync panel with a £500-600 Vega 56/64? Nope. If Navi were eight months away, I'd say it would be. But investing in Freesync is now too much of a gamble. You'll be stuck with a mediocre GPU while the competitor is sailing away.

I so wish Nvidia supported Freesync. Maybe they're making so much on Gsync that it's better than selling more GPUs.
 
From what we've been told, though, Vega without HBM2 would not have been possible. They were stuck with HBM whether they wanted to use it or not.

Then I don't understand why they are boasting of a mobile APU with Vega, which will use DDR4, I would imagine.

I'm no expert on GPU design but alarm bells are ringing, Willy*. Also see**

*that is a line from Lock, Stock and Two Smoking Barrels. Willy is the guy's name, I am not calling you a phallus :D

** Unless of course they are lying and it's not really Vega in the APU but rebranded Polaris.
 

I suspect that's because the APUs will have significantly lower clock speeds and not be a full Vega 10 GPU. AMD had to use HBM2 on Vega 56 and 64 because of its efficiency. Vega is so overpowered (read: thirsty) that without HBM it would be drawing too much power (not that it isn't already). Vega 11 might not use HBM2 for the same reasons. It'll be a much slower GPU and therefore would not need HBM's efficiency benefits. Just a theory.
 
Vega is a feat of engineering. It really is. Fury was too. The only issue is that unless you can get it loaded up and doing what it should, it is crap: slow and inefficient.

Vega should be incredible. Instead it isn't. Mostly because it has too much of what we don't need right now, and not enough of what we do.

If I could never buy another GPU again and had a choice I would pick Vega for sure. It will shine... One day.
 

Vega is undeniably an incredibly advanced and potentially amazing architecture, but unless games are developed with it in mind, how far will its wings spread? I think of it this way: why would game developers code for Vega when only 1% of their clientele (a possible number until Vega 11 comes out) use Vega? Why not code their games for the 99% who use Nvidia and other AMD GPUs that are completely different? AMD engineers said themselves how vastly different Vega was to Polaris, Hawaii, and Fiji, so it even makes sense to code the games for them rather than Vega.

As for drivers, I personally don't think drivers have that much room left in the tank. I think games like BF1 are about as high as Vega is going to get, which means the 1080Ti is still the absolute king. Whatever progress Vega is going to make, I think it'll be very appreciated for games where it clearly is suffering, but it won't be enough. Doom performance hasn't improved much in the months since it was first benchmarked back in January. As you said, it could take years before games actually start to take advantage of Vega, and that just isn't enough for me personally.

I have freesync and I bought a 1080.

I also know others who have done the same. A lot of people are using Nvidia/Freesync setups because it's actually becoming harder to find 'gaming' monitors that are affordable without Freesync, and since AMD cards are struggling to compete in the high-end right now, a lot of people are using Nvidia 1070s, 980Tis, 1080s, and even 1060s.

How have you found it coming from an RX 480?
 
:D

 

Ships tomorrow.
I don't notice the difference with Freesync on/off. So honestly it's just crap like Gsync imo. All marketing. I'm just stuck with it because I can't find anything else that's 144Hz without one or the other. And I'm not counting the Korean stuff. I won't risk it.

Honestly I would just like to sell my monitor and get a lovely 21:9 IPS display. Even 100Hz would be fine. 144Hz is not a big difference.
 

No it really isn't.

Just because you can't see the difference doesn't mean others can't.
 
You won't. Show me the difference between 144Hz with and without Freesync/Gsync.

Well I physically can't as I'm on another continent so that's a bit of a stupid thing to ask for ^_^

You can't see the difference, that's fine, but stop trying to lay down the law like it's fact. I can see the difference between Freesync/G-Sync on/off, and so too can many others.

I can't ride a fully manual dirt bike so no one can !!!!!!!!!!!!
 

Until someone comes out with a physically definitive test between 144Hz and 144Hz with adaptive sync on, it's just all in your head. The measured differences are insignificant. It's like trying to argue you can see the difference between a 1ms and a 2ms monitor. You can't, and no one argues that. But noooo, people have to argue over adaptive sync. It really only benefits lower framerates, especially with AMD's LFC technology. That is an easily visible difference. But at uber-high framerates it loses its usefulness. The higher you go, the less difference you can actually get between them. That is a fact. Input lag can't even be argued, as it has no effect on it last I checked.
This is all of course with Vsync off and no driver limits placed. That would certainly change something.
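
To put some very rough numbers on that (my own illustration, not anything from a test): the window in which adaptive sync can even make a difference is at most one refresh interval, and that interval shrinks as the refresh rate goes up. A quick Python sketch:

# Rough illustration only (an assumption, not measured data): with Vsync off,
# a tear or mistimed frame can persist for at most one refresh interval,
# so the worst-case window adaptive sync could smooth out shrinks as the
# refresh rate rises.
for refresh_hz in (60, 100, 144):
    frame_time_ms = 1000.0 / refresh_hz  # one refresh interval in milliseconds
    print(f"{refresh_hz:>3} Hz -> worst-case window ~{frame_time_ms:.1f} ms")

# Prints roughly:
#  60 Hz -> worst-case window ~16.7 ms
# 100 Hz -> worst-case window ~10.0 ms
# 144 Hz -> worst-case window ~6.9 ms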

Poor analogy btw with dirtbikes.
 