Nvidia Pascal Titan Rumored to be coming as early as April

Because the 1080 will be considerably cheaper than the 980 Ti and probably use less power too.

The 970 and 980 weren't that much faster than the Titan Black (in fact the 970 was about 5% slower), but the Titan Black started at a minimum of £699 and the 970 launched at £350 or so...

That, erm, slightly annoyed me, having paid nearly £1400 just months prior.

Moving on to the Titan P, or whatever stupid name they give it this time...

If it launches in April then that's a serious slap in the face for people who have stumped up over £700 for a Titan X. Seriously, if they are going to release cards quicker than crap through a goose at those prices then even the really crazy people (and that's putting it politely) who are buying them will be annoyed.

Having said all of that, I shall now wind my neck back in and go back to not really caring. I'm done handing over grands for PC hardware, and I have a really bad feeling that I got conned buying these Fury X cards too. Even though they shouted "4GB will be enough", I have a really horrible feeling that within months it won't be.

As soon as PS4 VR comes along I am ditching PC gaming and going over to the dark side. It's just so much cheaper, and even though games are not as pretty, they perform a darn sight better on the derped hardware of a console than they do on an equivalent PC (even though we were told that once the new consoles came out they would be x86, so "PC ports" (spits at the ground at that phrase) would be better than ever).

Yeah, right.

Yeah, 30FPS at 900P is sooooo much better than 100FPS at 1440P with increased visuals.......... ^_^
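
For scale, here's the back-of-the-envelope maths on that comparison (a rough sketch; it assumes "900P" means 1600x900 and only counts rendered pixels per second):

```python
# Back-of-the-envelope pixel throughput: 900p/30FPS vs 1440p/100FPS.
# Assumes 900p = 1600x900 and 1440p = 2560x1440.
console = 1600 * 900 * 30     # ~43.2 million pixels/second
pc      = 2560 * 1440 * 100   # ~368.6 million pixels/second

print(f"console: {console / 1e6:.1f} Mpx/s")
print(f"PC:      {pc / 1e6:.1f} Mpx/s")
print(f"ratio:   {pc / console:.1f}x")   # ~8.5x the pixel throughput
```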

Having played Halo 5 recently, I could never go back to console gaming. The low resolutions, the constant graphical pop-in and a sluggish 30 FPS? No thanks, not for me, but each to their own.

Oh, and PSVR is rumoured to be around twice the price of a PS4 and have inferior specs to the Oculus Rift. So yeah, not a good move.
 

It's a stable platform, though. It will also likely have far more games than any of the split-faction PC VR technologies.

I know the quality of games is worse on a console. I've known that since '97, when I started taking PC gaming seriously. But I'm growing sick of the entry price and the fact that we get about four games a year, one of which might be half decent.

It's BS, dude. For all the hype and trumpeting about PC gaming, we get one decent game a year if we're lucky.
 

True. I'll see how this year goes; if it's abysmal then I might switch over to console too, mainly because everyone I know who plays games frequently, or plays the co-op games I like, is on Xbox and PlayStation.
 

I'm really thinking I'm just gonna let my PC grow old and die, tbh. Don't get me wrong, I still like tinkering and stuff, but nowhere near as much as I used to.

It's a hobby I have really enjoyed for many years, but I think I am just getting tired of the costs involved.

It just alarms me that companies like Nvidia can keep churning out technology so fast. It's also ridiculous, because it gives game devs no time to sit down and code for that technology properly, so we get bodged rubbish that doesn't work properly and doesn't support anything (the last four or five games don't work with Crossfire and apparently never will).

I'm just sick of being made to feel that I have to have something which in reality will do absolutely sod all before it's obsolete. PC gaming for me for the past couple of years has been exactly this: sitting on a forum chatting with people because there's nothing decent to play.

It's not been helped by Tomb Raider. The company boasted about how it would work with everything, yet on launch day, the most critical day of all, it doesn't work in Crossfire whatsoever and can't even be forced to, and SLI is as buggy as hell, with flashing textures and god knows what else.

And I feel really, really let down by that. Give it a few days (AMD are working on it as we speak; a driver with Crossfire support is coming Monday, I think) and I might rethink, but right now I have just about had enough of it.

Yeah, the Xbone is slower and yeah, it's not as pretty (but I have seen it running and it's not exactly that much worse than the PC version either), but it works and I could have completed it by now.

Instead I've handed over a grand to AMD and it doesn't even work yet. DX12 will supposedly fix all of this, but something I have learned from PC gaming is that pretty much all promises will be completely broken and the machine rolls on, eating money and stopping for no one.

I.e. by the time DX12 even becomes a thing, my Fury X cards will be something to laugh at rather than actually get used.

If this news is accurate and Nvidia really do have full-fat Pascal ready for launch in April, then they will have stooped to new lows, IMO.

It's stupid, dude. For years and years I couldn't afford high-end hardware and made do with mid-range crap after mid-range crap. Then all of a sudden I find myself being able to afford it and man, what a complete rip-off high-end hardware is, and what a let-down. You pay over a grand and within 11 months it's worth £250 if you are lucky.

Going back through my collection and playing Crysis @ 4k has really taught me a very valid lesson, dude. Gaming just isn't moving on. Not if it's done properly, anyway.
 
Anyone who bought a DX12 card and jumped on Windows 10 with the impression that DX12 games and content would immediately follow is simply a newb.

Whining about an API that is still in its infancy is pointless and pathetic; games that are being released now were under development with DX11 for as long as 2-3 years. So boo hoo if your games are not DX12 yet, and boo hoo if you think you wasted money on shiny new hardware. It's called development for a reason; we won't see any proper AAA DX12 titles for a good couple of years yet.

We went through all this with DX9 and DX10; there is always a transitional period. So be patient, relax and support the devs, and the hardware will shine.
 

1. DX11 launched and we had the cards and our first DX11 game almost immediately.

2. When I bought two Fury X cards, I was told by every reviewer around the world that the 4GB of HBM was more than enough.

3. DX10 never really took off; there was only a tiny handful of games for it. I'm not expecting that from DX12.
 

I never understood how it was supposed to be equivalent to a higher quantity of GDDR5. 4GB is 4GB (or 3.5GB). Otherwise surely everyone would just have one IC clocked at 5000GHz and an effectively infinite amount of memory.

If you just ignore reviews and 'the news', buy a nice shiny thing and play with it, then you'll be happy. Also, never monitor your FPS or core temperature in-game.

JR
 
DX11 was unveiled in late 2008. One year later it was released together with Win7. After another year, the first actual games got released. It means we have to wait just a bit more, buds.
 

Well, the theory with the 4GB of HBM was that it could load textures in faster, so it should perform around the same as 6GB of GDDR5. However, when a game is bloated (as all of this console slop probably will be) and wants to load it all at once, you get problems.
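
To put rough numbers on that theory, here's a quick sketch using the commonly quoted specs (4096-bit HBM on the Fury X vs 384-bit GDDR5 on a 980 Ti; treat the figures as approximate):

```python
# Peak memory bandwidth from bus width and per-pin data rate.
# Commonly quoted specs; treat the figures as approximate.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8   # bits/s -> bytes/s

fury_x   = bandwidth_gb_s(4096, 1.0)   # HBM1: 4096-bit @ 1 Gbps -> 512 GB/s
gtx980ti = bandwidth_gb_s(384, 7.0)    # GDDR5: 384-bit @ 7 Gbps -> 336 GB/s
print(f"Fury X HBM:   {fury_x:.0f} GB/s")
print(f"980 Ti GDDR5: {gtx980ti:.0f} GB/s")

# The catch: bandwidth shortens a texture swap, it doesn't remove it.
# Even at full speed, streaming 1 GB back in costs real time against a
# 16.7 ms frame budget at 60 FPS.
print(f"best-case 1 GB reload: ~{1000 / fury_x:.1f} ms")
```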

To be fair, I have only seen this be an issue in one game, and that's BLOPS 3, but that game is graphically awful anyway. And given that I play it in co-op, it's still better than Left 4 Dead 2, which looks donkey's years old now.

I would ignore reviews and the news, but if you do, you pay a heavy price with PC stuff and especially PC games, i.e. you buy them and they don't work :D

I'm not bitter about it, just disappointed. The new wave of better-looking games we were supposed to get once the consoles launched hasn't materialised either, with games launched last week looking hardly any better than Crysis, which was launched in 2007.

You run 4k, don't you, JR? Be sure to give it a go when you can find the time. It's absolutely stunning.


On the DX11 timeline: the Radeon 5870 launched in September 2009 along with DiRT 2, which was DX11 (DiRT 2 was released on the 8th of September, in time for the GPU launch).

September 10, to be exact. So if DX11 was unveiled in late '08, then within a year it was released and there were games using it. I clearly remember two or three more in early '10 too.

Just because DX12 has only just been released (in Windows 10) does not mean that game devs have not had it for ages.
 

On 4k: I have an original Swift which I use most of the time with Tri-SLI 780s, and a baby 25" 1440p for work/LANs. Just got a 32" 4k IPS AOC from work too, which I intend to use with Lightning at events this year. Once imersa has HOF finished (Tri-SLI 980 Tis) then we may swap, so he has the 4k and I take the 34" ultrawide AOC for Lightning. Either way, I should get some time to play with lots of different resolutions and densities over the summer.

Is Crysis 3 better looking than the original? I liked the forest setting of your screenshots; I could certainly flail around and attempt to shoot things in that environment. Can't wait to check The Vanishing of Ethan Carter out on them all; I think that might be my eye-candy benchmark.

JR
 
Crysis 3 is obviously more technically advanced, but I still think the original holds its own. 3 may look more 'Hollywood', but I like the theme of the original better.
 
Crysis has indeed remained a great-looking game. I also agree that engine advancements since then have appeared incremental. Many harder-to-run titles that draw two or three times as much VRAM only appear slightly more detailed on the surface. Physics engines have somewhat plateaued as well, though I'm sure there are hundreds of developers who would heavily disagree with that, and that's something to consider. Maybe we've hit a point where the improvements are just not as evident as they used to be, and it's becoming harder to warrant the continued investment.

I personally don't like where many engines and games have gone. A recent exception to this would be Mad Max. That game was easy to run, looked exactly as it should, contained fun gameplay, and supported various hardware. Max Payne was another example of a PC game done right. It was a very good-looking game that ran well (for the most part) and felt at home on PC. Metal Gear Solid, Dishonored, Shadow of Mordor, Bioshock Infinite, Spec Ops: The Line, Deus Ex: these are some of my favourite games, and none of them are as demanding or beautiful as the new Tomb Raider, Syndicate, The Witcher 3, Crysis 3, etc. Crysis 3 isn't even that good.

As an example of something very different, Assassin's Creed is an artistically beautiful game with a very powerful engine. But poor hardware support and rampant bugs detract from the experience. Also, the game is repetitive and tired. I see it the same way I see films like Man of Steel and Avatar. They're great to look at, but they miss the mark on what is important.

Another 'type' of game is GTA V and The Witcher 3. They're well-made PC 'ports' that are demanding and good-looking, yet they are scalable. With a Fury at 1440p, I have to turn a lot of settings down in GTA V to hit my preferred frame rate of around 60-90 FPS. A fast-paced competitive game like GTA V needs to run at 60 FPS or more; otherwise you may as well just play on console. The graphical fidelity is not worth the huge price increase, the additional headaches, and the fewer players in the servers. There are so many more maps and players on PS4 for GTA Online. But it's worth it, in my opinion. And Rockstar created a game that has enough superfluous settings to allow that kind of performance. Other games expect you to play at 30 FPS all the time, and that's patronising and ignorant.
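
For reference, the arithmetic behind those frame-rate targets (just frame-time budgets, nothing more):

```python
# Frame-time budget at common FPS targets: how long the whole frame
# (simulation + render) has to fit into.
for fps in (30, 60, 90, 100):
    print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")
# 30 FPS leaves 33.3 ms; 60 FPS halves that to 16.7 ms, which is why
# settings have to come down so far to hold it.
```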
 
GTA V, to me at least, was quite boring. I mean sure, it looks nice, but that was where it ended. The single-player game (which I had never even touched before; I just free-roamed in all of the others, which has now gotten old) was completely insipid and slow. Christ, that mission at the docks where you need to slowly drive that truck and load the containers was skin-crawlingly bad!

So I ventured off a bit; no fun to be had there any more. I had exhausted the roaming thing in San Andreas. Planes, helicopters, etc. Nothing really new there.

I totally agree about Max Payne. That game was a complete triumph, both visually and in the way that pretty much everyone could max it out on sensible GPUs.

Tomb Raider was another example. Whilst initially there were a few hiccups, both AMD and Nvidia soon had it sorted via drivers, and it turned out to be a wonderful experience.

Loading up ROTTR, though? I was a little disappointed. Now granted, I only watched the slideshow for about a minute as she trudged through the snow, but it did not make my jaw drop even at 4k (which was going to be the major difference and bonus of going to ROTTR from TR). Maybe once there are some nice stable drivers from AMD that may change; I am really hoping it's fixed and doesn't become a lost cause like Just Cause 3.

BTW, I'm not trying to take anything away from GTA Online. I tried it, it wasn't for me, I moved on.

The Witcher 3 had serious issues at launch. No Crossfire support, poor SLI support, derped Kepler (which I happened to be running) and so on all took away from the enjoyment. In the end? I got so sick of having issues with it that, like sour milk, I went off it and never got into it. THAT is how badly a poorly released game can affect me. Like anything new, I want to play it on day one. Before anyone plays the SLI card: it wasn't just SLI. I had issues with SLI, Gsync *and* Kepler, so I didn't stand a chance.

Project CARS was also very poor on AMD, and the devs tried to shift that onto AMD by saying AMD had not approached them. But the way I see it? It's their game; why would THEY not approach AMD to make sure it was sorted? Trying to pass the blame onto AMD because their game ran like crap was a cop-out to me.

If it were my game and I was proud of it (like back when I used to work on an emulator and code front ends for the games), I would care about how my game runs on people's computers. I wouldn't just not bother with AMD hardware and then blame AMD for not approaching me.

And that, at its root, is the problem with PC gaming. No one cares; it's all about the money and doing as little as is humanly possible in order to obtain that money. And the more that continues, the less we get, and the more stripped-out and cut-down it will be. I went to buy ROTTR on DVD. Guess what? They have now skipped that part, because it costs money to produce discs, so now they don't have that expense.

How much was ROTTR on Steam? Oh yeah, £39.95.

So why, a couple of years back, could I buy the latest PC game *on DVD* for £25 or so? Because there was no licensing fee, right? OK, so why the heck am I being charged £39.95 when I get absolutely nothing physical, natural or man-made, for the nearly forty quid I just paid?

And see, hardware developers are starting to cash in on this now too.

"Hey every one, look at this Maxwell full fat.... Shiny ! Giss £700 for it, it will enrich your gaming experience and make it better".

So we are sold something with the good old-fashioned sales patter, then six months later they completely forget about this technology they sold us for a damn fortune and start beating the war drums again, with the same old rubbish about how this new hardware is going to change EVERYTHING.

Yeah, right.

It should not be possible for me to load up a game made in 2007 and for it to be completely comparable to, or in some cases much better than, a game made NINE YEARS later.

We've seen an awful lot of hardware in nine years.
 
I can't argue with any of that, and I don't think I want to.

Regarding The Witcher 3, I take back what I said. You are right; it was a controversial release. I have just seen it run well and look good on both AMD and Nvidia. I forgot about the 960 beating the 780, and the SLI/Crossfire issues.

I would love to see DVDs return. I have 4MB download speeds, so it can take a long time to download a 40GB game. And they're getting bigger all the time. I also love to see my physical collection grow. I miss that aspect, and it's the reason I still buy Blu-rays even with the onset of torrents and streaming services. It's also why I never sold my 250+ DVD collection when Blu-rays became the next big thing, and why I'll not sell my Blu-rays when 4K discs become a thing. I just like seeing them there. They're not worth much to others, but they're worth a lot to me even if I don't watch them any more. There is sentimental value to them.
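
For anyone wondering how long that actually is, a quick sketch of the arithmetic (reading "4MB" generously as 4 MB/s; if it's really 4 Mbit/s, it's eight times worse):

```python
# How long a 40 GB download takes at a given sustained speed.
def hours(size_gb: float, speed_mb_per_s: float) -> float:
    return size_gb * 1024 / speed_mb_per_s / 3600   # GB -> MB -> s -> h

print(f"40 GB at 4 MB/s:   {hours(40, 4):.1f} hours")    # ~2.8 hours
print(f"40 GB at 4 Mbit/s: {hours(40, 0.5):.1f} hours")  # ~22.8 hours
```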

The ridiculous BS that the media (I believe Tweaktown are one of the worst purveyors of this) spew with new GPU architectures is indeed frustrating. How many news titles about new GPUs have we seen saying "These new GPUs are set to be the best we've ever seen!"? Of course they bloody are. Nvidia and AMD are hardly going to release a flagship weaker than what they already have out. The 970 was a genuinely exciting GPU for me, but they ruined it with the VRAM thing. The Fury X was an exciting GPU, but it underperformed. And while the 980 Ti was successful, if you invested £1300 in SLI 980 Tis and water-cooled them just a few months ago, then by waiting six months you could have had a more powerful setup for the same price. That is a chronically flawed system that, in particular, 'Titan owners' are perpetuating. I don't know how many Titan X owners were quietly disgruntled when the 980 Ti came out. "Damn, I could have saved £500 just by waiting a few months... Ah well, I still have Titan X... Ooooooo, Pascal rumours!!"

I don't play games upon release. The last time I pre-ordered and played a game on release (GTA V), it was a disaster. My 970 was faulty, but it took ages for me to find that out. I also didn't know that my connection wouldn't be able to handle the game and that I would be disconnecting all the time. I won't be playing TW3, Fallout 4, Unity, Syndicate, Just Cause 3, Deus Ex, Rise of the Tomb Raider, Rainbow Six, The Division or Far Cry 4 for a long time. I only played the 2013 Tomb Raider and Far Cry 3 last year. I'm probably going to play Bioshock Infinite next.
 

The Witcher 3 is definitely not a PC port. It was developed on PC first. Even the engine was built around PC for its first two versions and updated to RedEngine3 to support the consoles.
 
Doesn't make it any clearer? It's just a different way of saying ported to PC.

I'm sorry, I didn't mean to be short. I've just had a really sh*tty few hours and I'm fed up.

I used the quote marks to refer to how they either weren't a port (The Witcher 3) or were so well done that they never felt like a port even though they were (GTA V). But ultimately, many people consider them ports. I was drawing attention to how gamers associate games with consoles, even if they began on PC.
 

3.5 RAM + 0.5 Potato.
970 fo' Lyfe

*Disclaimer: they still work just fine.
 

Games are not ported to PC.

Think of it like this: games are written as modules. So you have modules for the sound, modules for the graphics, modules for any menus, etc. Then you write the code that executes them and you compile it. That's about as simple as I can make it.

So just because they may use elements of a console game to create a PC version doesn't mean there is any 'porting' going on. In fact, porting is probably the absolute worst word you can use because it's completely incorrect.

Try 'compile'.
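
A toy sketch of that idea (module names and structure are made up for illustration, not taken from any real engine):

```python
# Toy "engine" layout: one shared game module, per-platform backends.
# Building a PC version means compiling the same game code against the
# PC backend, not translating ("porting") the console build.
import sys

class PCRenderer:
    def draw(self) -> None:
        print("drawing via a PC graphics API")

class ConsoleRenderer:
    def draw(self) -> None:
        print("drawing via a console graphics API")

def game_loop(renderer) -> None:
    # The game code itself is identical on every platform.
    renderer.draw()

# A real build would select the backend at compile time;
# we fake that choice here with a runtime check.
backend = PCRenderer() if sys.platform.startswith("win") else ConsoleRenderer()
game_loop(backend)
```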
 