The Witcher 3: Wild Hunt, first look.

Apparently the 700-series Kepler cards are derped. So basically a GTX 960 outperforms a GTX 780 in this game. Nvidia claim it's because tessellation is better on the Maxwell series.

So basically an all-out fail, starting with the devs and going right down to the heart of Nvidia.

All I gotta hope now is that they fix it, but it looks doubtful. Talk about trying to use a game to create more sales.

Well they can fuck off. I don't want their shit.

Wow, I didn't know that Kepler had been gimped by this game. Sort of glad I have two 970s... sorry you're having such problems though, dude. You'd think Nvidia would have done something about this, with it only being the last gen of cards.
 
It's no great loss to me, this isn't my sort of game any way. Now if it were Fallout 4 I'd be in tears.
 
Pardon le Français, but this game is le f$*#ing s*#t.

With two Titan Blacks I can get 30 FPS at medium settings. If I disable everything (blur, motion blur, anti-aliasing) I get about 42 FPS.

Problem is it then looks like dog poo. I'm so, so glad I only paid £20 for this game. GTAV it ain't.

Why didn't they just delay it for a few months to get it right, like Rockstar did with GTAV? Even Nvidia, who should be right behind this game, are to blame here. Apparently Kepler performance is derped.

Just an all out mess really.

It's a sad day when SLI Titan Blacks get trounced by a 750 Ti, but to be fair Nvidia borked this game the day before release with their "Game Ready" drivers. As many, many people have already discovered, roll back to the previous release and performance is fantastic.

Gotta agree with you on the extra time to polish the engine. It was ready two months before the release date, and they did little in that time to work harder on the main core of the game.

On a side note, I really wish the foul language on these forums would cease a little; it's becoming all too frequent and there is very little need for it.
 
I tried it before updating the driver and sadly it was the same.

This game has really changed my plans. There I was, all set to go to the newer cards, but this game has made me feel I don't want to bother. I spent £1,390 not even ten months ago on two cards that I thought would last two years. Yes, only two years. I'm not stupid; I know how it works.

But ten months? Seriously, are they having a ******* laugh? Do Nvidia really think that I would spend £1,400 to play one game? Then in ten months (this time of year is rife for releases) buy another one or two?

It's getting beyond a joke now and it's time for me to stop. Very sad, if I'm honest. But I saw the supposed release price for AMD's new card ($849) and they can **** off too.

I don't know whose idea it was to start doing what they're doing but it will cost them big time in the end.

I'm out. I'm literally out. I can't even play a game on £1390 worth of graphics cards.

I wouldn't even mind if the graphics were good, but tbh? That woman in the very first scene looks like a fucking RealDoll. If you don't know what they are, have a poke on Google (and a LOL). They've overdone the realism and the people look completely synthetic.

I'm not playing the game any more. Once I get my new motherboard I'm going to fit that and then I am done.

I don't know what makes me angrier in all honesty. Nvidia ass raping people with their disgusting prices or the mugs that pay those prices and have encouraged this bullshit.

And yeah, I was one of those utter mugs too, but I figured I would at least get a couple of years' grace before they ****** me over. Ten months!!
 
I guess it's that same sad old story again: what should have been a PC mega game turned into a dumbed-down, consoles-are-more-important port. They should have stuck with what made them who they are, the loyal PC followers, and just ported it to console later. They backed the wrong horse this time and the GPU manufacturers are left trying to pick up the pieces again and taking the flak.
 
To be fair I wasn't expecting that much anyway. I've already had Christmas this past month, what with GTAV coming out, running superbly at 4k and taking my breath away. There was also the rather excellent Dying Light, which I put many an hour into, and then of course pCARS, which is a rather nice little racing game.

I never played the first Witcher and it really isn't my sort of game. I ended up with The Witcher 2 (I can't remember how) for free when it launched; I had a GTX 470 and it ran like poo on that. It really was awful and I had to lower the settings drastically. The only card better at that time was the GTX 480, and it wasn't really any better on that, if my memory is working.

I played it later on more modern hardware and put, ooh, about three or four hours into it. Very slick and some superb visuals, but I'm not really a medieval type. I much prefer nuclear holocaust :)

So yeah, I kinda knew that it wouldn't run at 4k very well given how the second instalment fared, so I really wasn't expecting too much. In all honesty I probably wouldn't have played it for long anyway; I only bought it because of the game drought just before we had this recent brace.

PC gaming was dead until about two months back. Hopefully this October we will get a bunch more :)
 
The amount of rage on the W3 makes me LOL so hard...

It's getting beyond a joke now and it's time for me to stop. Very sad, if I'm honest. But I saw the supposed release price for AMD's new card ($849) and they can fuck off too.

I'm out. I'm literally out. I can't even play a game on £1390 worth of graphics cards.

I don't know what makes me angrier in all honesty. Nvidia ass raping people with their disgusting prices or the mugs that pay those prices and have encouraged this bullshit.

And yeah, I was one of those utter mugs too, but I figured I would at least get a couple of years' grace before they fucked me over. Ten months!!

So you're complaining about Nvidia ass raping people, for spending £1,400 on GPUs... but you were the one who spent that much? Joke's on you for even spending that much on GPUs. Why would you ever spend that much on GPUs alone? You will never get your money's worth out of them...

I guess it's that same sad old story again: what should have been a PC mega game turned into a dumbed-down, consoles-are-more-important port. They should have stuck with what made them who they are, the loyal PC followers, and just ported it to console later. They backed the wrong horse this time and the GPU manufacturers are left trying to pick up the pieces again and taking the flak.

No offense, dude, but things like this make me laugh. The good ol' "dumbed down for consoles" point is misused in this case. They changed the renderer they used so they didn't have to make a completely different one for the consoles; that explains the difference in graphics. They instead made a new one that would work on all three platforms and made development easier. That's not dumbing down. That's just smart and logical thinking. Why make three different ones when they can use one? It saves time, resources, and most importantly money. The loyal PC followers? You mean they should make ONE game for only their current fan base? What money can be made off that with a game of this scale? That's right: none. They even said themselves, this game is only the way it is because making it for consoles allowed them to make enough money for the project to be funded. Otherwise it wouldn't have happened. So would you rather this game, which is easily a GOTY candidate, or none at all?

To be clear: this is not a console port, and saying so is just laughable. The ENGINE itself, REDengine 3, was a redesigned REDengine 2 (which was a PC-only engine) that was turned into a multi-platform-capable engine. It's still a PC-originated engine. They just tuned and enhanced the hell out of it for PC and then made it capable of running on consoles.
 
So you're complaining about Nvidia ass raping people, for spending £1,400 on GPUs... but you were the one who spent that much? Joke's on you for even spending that much on GPUs. Why would you ever spend that much on GPUs alone? You will never get your money's worth out of them...

Yeah, I will get my money out of them. I'll simply refuse to buy anything else until I feel they have served a fair duration of time. The rest of my rig is still very much current; I mean, I can easily swat aside the 5820K and 5930K due to their crap clocks, so there's nothing that needs to be replaced.

I estimated two years for 4k and if I'm being honest every game I have runs very smoothly at 4k, thanks to Gsync. Of all the games I own it's literally this one that is completely broken, so I don't really feel tempted to spend money on the problem. That never works in computers.

After two years? I will either turn the settings down or, as is the norm, bag a pair of AMD cards for around £200 each. Look at the 290X right now. It would easily be a good stand-in at 4k for my Titans, and they're selling for peanut money.

I'll wait until the 390x or whatever AMD decide to call it can be bought as a pair for £500.
 
Well, I put my 3970X back in. Clocked it to 4.7GHz and guess what? The FPS haven't changed at all and this game still runs like crap. I've literally not gained any FPS at all.
 
Is this with the Titan Blacks? As I am running this on the latest drivers with 2x 970s and everything cranked up, and it doesn't go below 60 FPS at all.
 
I'm only using a 4440 and a 750 Ti (rolled-back drivers) and averaging 45 FPS on medium settings. Very strange how a 3970X and a Titan Black struggle with it. Last time I saw anything like this was with UT3: my old Athlon 4400 and ATI X1300 Pro would nail that game, yet my mate running a Q6600 and 8800 GTS struggled with it.
 
It's because it's derped on Kepler dude. It really is as simple as that.

I disabled the hair and oddly it still looked the same. Got the game to around 43 FPS with the new patch. I think that was on high too.

It's no bother really. I'm only just setting up the second heist in GTAV with Trevor so I've got loads left to go before I'll be looking for something else to play :)
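For anyone else wanting to toggle the hair outside the menus, I believe the HairWorks switch also lives in the user.settings file (Documents\The Witcher 3\user.settings on my install). The exact key names below are from memory, so treat this as a rough sketch and check your own file before editing:

[Rendering]
HairWorksLevel=0
HairWorksAALevel=0

Set HairWorksLevel back to its original value to turn it on again, and keep a backup of the file just in case.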
 
It's more the drivers than the game. This game isn't CPU limited, but it prefers 4 cores over 6. http://www.techspot.com/review/1006-the-witcher-3-benchmarks/page5.html

While it's not a big difference, it still is one. It does utilise cores heavily, but 4 faster cores is what it wants. If you want something to rage at, rage at Nvidia. Pretty much every other card works flawlessly.
 
Yeah I know dude :)

I actually truly believe that Nvidia deliberately did this to help those with 780s finally decide to give in and go to a 970.

Plus it's come literally just before AMD release their new cards.

Thing is, I don't have 780s; I have Titan Blacks with plenty of VRAM, so I'll just give the game a miss.

Seriously, with GTAV being as good as it is, and looking and running at 4k how it does, I'm more than happy.

CPU-wise it seems I chose well. A 3970X that will do an easy 4.7GHz fully 24/7 stable and 4.9 totally bench stable; I won't need another one for years. It's actually faster than nearly all 5820Ks, as fast as a good percentage of the 5930Ks I've seen, and I paid £300 for it nearly a year ago!

I can't see me upgrading any time soon.
 
I find using Adaptive V-Sync with Nvidia and triple buffering on makes the experience with most games much more enjoyable. A few titles can cause some nasty stuttering though.
 
Right it's time I updated this thread on the latest.

So since getting one Fury X the game runs beautifully at 4k with mostly high settings and Hairworks off. It has also stopped stuttering almost completely and feels far smoother than it did before. I suspect that SLI coupled with the demands of 4k and Gsync stepping in was the culprit.

I don't usually like this sort of game. I'm far more a wastelands/zombies kinda fella but I must admit with every two hour session I put in this game is drawing me in more and more.

Noticed a few funny things so far. Firstly the characters up close (and more so partially naked) look like Realdolls. Even funnier though there's a scene with the dark haired woman and it looked like she had a pube on her neck :D I wish I had got a screeny now.

But yes, overall I must say I am very pleased with this game and it's beautifully done. Unless it ever becomes too hard I have no doubt I will stick it out 'til the finish.
 
Only played a little of it so far but I've enjoyed it. Playing through Fallout 3 + NV is my priority ATM though ;)

As to your stuttering, G-Sync would actually have helped; it would have simply been down to SLI micro-stutter problems.
 
Thing is, micro-stutter is all but eliminated on Nvidia cards thanks to built-in hardware. This is why they released FCAT: because they knew that only AMD would look bad lol.

I did not have any stutter before at 1080p so the only culprit was Gsync. I read that sometimes it does make things worse.

Glad you're enjoying the Fallout games, they really are quite wonderful.
 