Nvidia 16GB RTX 3070 Ti has been spotted inside Lenovo's Legion T7 system

That's got to be a typo on Lenovo's part. No way would Nvidia be providing this card to anyone at this time; they want the base 3070 sold first, after all.
 
But the card makes sense. We know an 8GB limit may be tough for laptops, because people often keep them for five years. And even in the desktop space people are moaning about the 8GB limit.
 
This kind of leak, then, is far worse for Nvidia than any leak about the standard 3070, 3080 and 3090 cards.

We have just been told that early adopters are about to be caught out and will instantly no longer have the latest and greatest.

I said it before, but I really like this industrial-looking twin-fan design. It's a beautiful, simple and elegant card, and decently priced too.
 
Ampere cards might appear too good to be true, but they are not as good a deal as they look. At all.

They are just a pre-emptive strike against the inevitable: the new consoles. Jensen knows no one is going to pay three times what a console costs for a GPU that does the same thing. The 3080 costs more than both forthcoming consoles, and will achieve the same: 4k/60. He even mentioned this in the longer video I watched, the one where he wasn't just taking things out of the oven and hiding GPUs behind spatulas.

Sorry, but VRAM is VRAM. I learned that lesson with 4GB of HBM. It doesn't matter how fast it is.

The actual cards he was planning to release? They are the ones with the higher VRAM. They will come in later, for more money, once the plebs have pounced.

COD MW, maxed out at 1440p, RT on, shadows all maxed, plus the new stuff they added: 9.5GB of VRAM used at 1440p.

That is why you should not all get too excited about the 3070. Maybe at 1440p it will cut it for a while, but all of that "much faster than the 2080 Ti" talk is nonsense once you run out of VRAM.

Nothing has changed but the model numbers. We are just back to Pascal is all.

The 1070 was £400-odd; the 3070 is £400-odd. Only in all that time the VRAM hasn't changed, lmao.

The 3080 replaces the 1080 Ti at the same price, and the 3090 is Titan money. Nothing has changed *at all*, apart from the fact that at least two new consoles are coming (the Series S was proven yesterday in code on the Xbox One X) and that they will still be much cheaper than these GPUs.
 
Just curious what you are benchmarking exactly. I just see the corner of a black screen?

And you seem to get 50 FPS more than any other review, including OC3D's own benchmarks on the same settings.

https://www.overclock3d.net/reviews...y_modern_warfare_rtx_raytracing_pc_analysis/4

It's COD MW, everything maxed, RT on etc. I even did a video of it with a minimum FPS of 138, but it seems it was deleted. It was running around 150 FPS without the overclock; that shot is with some of it applied.

The game installed a 20GB update before I played it, BTW, and there were new shadow options etc. for me to use.

So it's possible the game has improved since launch, or something else has changed, but that is what I got.

The reason it's black is that you try taking a pic with your left hand whilst controlling the game with your mouse. LOL.
 
And guess what, the game would run without issues with less VRAM. Allocating memory != needing memory.
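For what it's worth, if anyone wants to see what the overlays are actually reporting, here is a minimal sketch using Nvidia's pynvml bindings (assuming they are installed; the GPU index is a placeholder). The point is that the "used" figure is whatever has been allocated, which is not the same as what a game strictly needs:

import pynvml

# Minimal sketch: report what the driver says is allocated on the GPU.
# What monitoring tools show is memory *reserved* by processes, not the
# amount a game would strictly need to run without stutter.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU; adjust if needed
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

gib = 1024 ** 3
print(f"total: {info.total / gib:.1f} GiB")
print(f"used (allocated across all processes): {info.used / gib:.1f} GiB")
print(f"free: {info.free / gib:.1f} GiB")
pynvml.nvmlShutdown()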
 
Guess what? That's the same crock that was sold to me when I bought a Fury X.

8GB is not going to be enough, especially not for 4k, which is what the next-gen consoles promise. Maybe with heavily reduced textures and settings? Yeah, maybe.

Remember, the whole launch so far has all been about double the performance at 4k.

Anything less than 4k? It's pretty clear that you don't need any of these cards if you have a decent 20-series card, even more so if you have a high-end one.

It won't change anything.
 
I think you are forgetting the 240/360Hz monitor race.
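For context, here is the frame-time budget those refresh rates imply; this is just simple arithmetic, nothing vendor-specific:

# Frame-time budget at a given refresh rate, in milliseconds.
for hz in (60, 144, 240, 360):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 144 Hz -> 6.94 ms, 240 Hz -> 4.17 ms, 360 Hz -> 2.78 ms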
 
https://www.youtube.com/watch?v=SximHQ2RDgg&feature=youtu.be

There you go. I'm hitting FPS that the OC3D article says are not possible. It can only be one thing: the game has been improved.


Everything is on and as high as it will go.
 
Yeah, as I said before, I have no interest in 4k. I think 1440p is beautiful enough; I'd rather stay at 1440p and push those higher frame rates.

Absolutely. It's amazing how many people have completely lost the plot over these new GPUs tbh.

I'm chasing one right now (a 2080 Ti under water). If it comes off I will post about it :)

BTW, going back to COD: the level etc. is irrelevant. The screenshot last night was not even in London but a very busy train depot, and the FPS were identical. I don't think I have seen less than 130 yet.
 
Ah yes, the Fury X, a 4GB card without compression tech. That's entirely different.
In addition, the faster IO they're pushing probably makes GPU memory swapping have even less of an impact. You won't run into issues in current-gen games, and in the future you might have to run "high" or even "medium" textures instead of ultra. Sounds fairly reasonable to me.
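To make that concrete, here is a toy sketch of the kind of budget-based fallback an engine might do. The tier names and pool sizes are entirely made up for illustration; no real engine is being quoted here:

def pick_texture_quality(vram_budget_gb: float) -> str:
    # Hypothetical texture pool sizes per quality tier, highest first.
    tiers = [("ultra", 10.0), ("high", 7.0), ("medium", 4.5), ("low", 2.5)]
    for name, pool_gb in tiers:
        if pool_gb <= vram_budget_gb:
            return name   # pick the highest tier that fits the budget
    return "low"          # fall back to the smallest pool

print(pick_texture_quality(8.0))    # -> "high"  on an 8GB card
print(pick_texture_quality(16.0))   # -> "ultra" on a 16GB card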
 
These are all things that totally don't matter yet, and won't until the consoles start using the same tech. And from what I have read and been told it's only the PS5 that will use it.

At the same time, none of the games that exist today use it.

If you wanted to know the catch with these "lower" prices? That is it. And it's why cards with much more VRAM are being heavily rumoured right now. Only, they won't be anywhere near as cheap; they never are.
 
This 3070 Ti leak just goes to prove that anyone who buys the new cards when they launch is a moron, because you can guarantee, as sure as your feet point at the ground, that you're going to get burned; Nvidia will have Tis and Supers ready and waiting.
 
Does anyone remember the 760 Ti? It was an OEM-only version of the 760.


2GB GDDR6X chips aren't available yet, so I wouldn't be surprised if this was an OEM-flavoured 3070 Ti with GDDR6.
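A quick back-of-the-envelope on why the chip density matters, assuming the card keeps a 256-bit bus like the regular 3070 (that bus width is my assumption, not something from the leak):

# Each GDDR6/GDDR6X device has a 32-bit interface, so a 256-bit bus
# takes 8 chips (ignoring clamshell mode, which doubles the chip count).
bus_width_bits = 256
bits_per_chip = 32
chips = bus_width_bits // bits_per_chip

for density_gb in (1, 2):   # 8Gb (1GB) and 16Gb (2GB) parts
    print(f"{chips} chips x {density_gb}GB = {chips * density_gb}GB total")
# 8 chips x 1GB = 8GB total
# 8 chips x 2GB = 16GB total -> 16GB needs 2GB parts, or 16 x 1GB in clamshell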
 
Or the GTX 555 they launched to Alienware only?

So yes, this could well be an OEM-only card, like the Ryzen 3900 being an OEM-only CPU.
 
On the claim that the 3080 will only match the consoles at 4k/60: the 3080 is likely going to offer more than that. The 2080 Ti has been doing 4k/60 for the last 18 months, so it stands to reason the 3080 will offer more like 4k/90.

Someone from Nvidia recently spoke about VRAM usage when asked whether the current limits of 8 and 10GB would be enough, and I felt convinced by his reasoning. He mentioned being in touch with game developers; these are the people building next-gen console games, not you or I. You could absolutely damn Nvidia and say they just want you to buy a GeForce 4000 series card when those are released, or to upgrade from an 8GB 3070 to a 16GB 3070, but it wouldn't make sense for Nvidia to do that.
 