Quick News

PCPer were honest about that, though. It's all speculation at this point, but personally I find it very interesting speculation. This CPU fascinates me and makes me want to learn more about Intel's core architecture for comparison's sake. It won't change any purchasing decisions for me, since I'm going Ryzen for sure in the near future, once the memory stuff gets sorted out.

I know they were. Many people just read one line and take it as fact, is all.
 

Agreed! Those people are stupid and must be ignored. :) Hard to do though! Ryzen has broken the internet and revealed all sorts of ignorant short-sighted stupidity, even from popular tech reviewers. It's kinda sad really.
 
It's not really educated, though; it's all speculation. The person who made these discoveries over at PCPer doesn't even know the whole story, other than that there is some latency between CCX modules. That was the only thing he could prove with data. Everything else is just what people think, but really only CPU architects would know the whole issue, and only devs would know what it implies for how you program for it. Like I said, there's not a lot to guess about until we get some dev who can explain what's going on: whether there's a real issue, or whether it's just that they optimise for Intel and need to add Zen, or whatever.
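For anyone wondering what "latency between CCX modules" actually means in practice, here's a minimal sketch of the kind of core-to-core ping-pong test that surfaces it, assuming Linux and g++ -pthread. The core numbers are an assumption too: on an 8-core Ryzen with SMT off, cores 0-3 would sit on one CCX and 4-7 on the other, so compare a 0<->1 run against a 0<->4 run.

```cpp
// Minimal sketch of a core-to-core "ping-pong" latency test, the same kind of
// idea behind the cross-CCX numbers. Build with: g++ -O2 -pthread pingpong.cpp
// Core numbers 0 and 4 are an assumption (different CCXs, SMT off).
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>
#include <pthread.h>  // pthread_setaffinity_np (Linux)
#include <sched.h>    // cpu_set_t, CPU_ZERO, CPU_SET

static std::atomic<int> flag{0};

static void pin_to_core(int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

int main() {
    const int iters = 1'000'000;

    std::thread t([&] {
        pin_to_core(4);  // assumed: a core on the second CCX
        for (int i = 0; i < iters; ++i) {
            while (flag.load(std::memory_order_acquire) != 1) {}  // wait for ping
            flag.store(0, std::memory_order_release);             // pong back
        }
    });

    pin_to_core(0);      // assumed: a core on the first CCX
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < iters; ++i) {
        flag.store(1, std::memory_order_release);                 // ping
        while (flag.load(std::memory_order_acquire) != 0) {}      // wait for pong
    }
    auto end = std::chrono::steady_clock::now();
    t.join();

    double ns = std::chrono::duration<double, std::nano>(end - start).count();
    std::printf("avg round trip: %.1f ns\n", ns / iters);
    return 0;
}
```

Run it once with both threads pinned to the same CCX and once across CCXs; the gap between the two averages is the latency everyone is arguing about.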

You know, you're probably right. I just think it's fun to speculate.
 

I understand. Sorry if it came off edgy. I'm just completely tired of this thing popping up everywhere Zen is mentioned and people spilling out BS. Drives me nuts. I usually don't care, but stupid people get my blood boiling, so I probably came off edgy. I didn't intend to.
It is fun to speculate, but personally I feel we can't really do that right now with next to no information.
 

I really didn't like it when AMD fanboys were throwing around the term Asynchronous Compute when the Fury X came out. They had no idea what it was, yet were spouting about it like they were well-versed in it. They all said it would mean Nvidia's demise since they didn't include it on a hardware level to the same degree as AMD, but Nvidia remains the top dog after three years. I understand it can be really, really annoying when people talk about things they have no clue about. I do it from time to time as well. :D
 

I did a lot of reading about async compute back when they released the 7970. I still believe they should have gone for smaller dies and higher clocks, even to this day. It's a Fermi-style kitchen sink if it's not going to be used.
 


To be fair, async compute is going to be world-changing, and back then AMD did have the edge with the Fury because of HBM. The problem is I think people expected async to be an overnight thing, like devs were just gonna dig into their code and rewrite everything as if it were as simple as ticking a box (see the rough sketch at the end of this post).

I just think some people don't understand all the nuts and bolts that go on under the hood of games, and yes, we can scream horrible console port a lot of the time, or badly optimised.

Sometimes they are terrible ports, sometimes they are badly optimised. But it's not an easy thing to be the best of the best in everything. That's why we have awesome games companies and devs we look up to and others we laugh at. It ain't easy being good, and good talent ain't easy to get.
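To put the concept in concrete terms, here's a rough CPU-side sketch of what "async" buys you: two independent workloads overlapping instead of running back to back. On the GPU this would be a graphics queue plus a separate compute queue; render_frame and simulate_particles are made-up stand-ins here, not real engine code.

```cpp
// Rough CPU-side analogy for the async compute idea: two independent
// workloads overlapping instead of queueing one behind the other.
// render_frame / simulate_particles are hypothetical stand-ins.
#include <future>
#include <cstdio>

static long render_frame() {            // stand-in for the "graphics" work
    long acc = 0;
    for (long i = 0; i < 50'000'000; ++i) acc += i % 7;
    return acc;
}

static long simulate_particles() {      // stand-in for the "compute" work
    long acc = 0;
    for (long i = 0; i < 50'000'000; ++i) acc += i % 5;
    return acc;
}

int main() {
    // "Async": kick off the compute work, do the graphics work meanwhile,
    // and only sync at the point where the result is actually needed.
    auto compute = std::async(std::launch::async, simulate_particles);
    long gfx = render_frame();
    long sim = compute.get();
    std::printf("frame %ld, particles %ld\n", gfx, sim);
    return 0;
}
```

The catch, and why it was never an overnight win, is that your workloads have to actually be independent: restructuring an engine so the compute pass doesn't depend on the frame in flight is the hard part, not the submission itself.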
 

That's right. People complain about poorly optimised games as if Ubisoft or EA can just click the 'optimise' button and we can all suddenly hit 60 FPS at 1080p with a GTX 960.
 


Well, that's the other side of the coin, isn't it? Graphics companies and engine creators come up with all these newfangled things that make games that little bit better (most of the time): TressFX, lighting, shading, fine detail... hell, now we even have make-grass-act-real effects from Nvidia.

People wanna turn that stuff on to see all the shinies, but then are shocked when it causes a performance hit, like all those effects are somehow free.
 

TressFX was probably the least taxing of all of them, and it worked very well on Nvidia after a patch. It was awesome, too. How many other games used it? None, AFAIK.

Things like that are not the problem. The problem is lack of time. Coding a half-decent game can take years, and by the time it launches it's already outdated (see also Fallout 4, which took a very long time to code due to all of the quests and tiny nuances that needed coding in). Bethesda slapped on the Gamedoesn'twork after it launched, and it shows.

If, for example, Nvidia slowed down launching GPUs, then devs would not have faster hardware to excuse sloppy code. Now yeah, I'm not going to be too hard on them, because they do what they can in the timescale provided to them. Most game devs now work under a corporation, so they're being pushed like crazy to get the games out there.

However, let me use an example here: Dead Island. It initially launched looking incredible (the graphics, that is; the gameplay was like a B movie, which I loved and others hated). Anyway, gameplay aside, the game looked utterly stunning when it launched and was very easy to run. Then recently they updated it with bang-up-to-date graphics that are, at times, breathtaking, and again, pish easy to run.

So it can be done. However, they've obviously had lots of time working on it to make it run like that, and I feel the issue is compounded by the fact that hardware moves on at five hundred miles an hour.

I can safely say, though, that 1. it's one of the best-looking games I own and 2. it flies along regardless of what GPU I feed it.
 
Yeah, but you're touching on the area of engine vs. engine a bit. What one engine can do with ease, another engine makes you drive the effects through with a freaking sledgehammer.

Hell, even different versions of the same engine can have drastically different needs when it comes to effects.
 
Just noticed that the 10-hour trial is up for Mass Effect: Andromeda. The funky thing is it's not in Origin Access; it's actually on the store page. So check the actual store page on Origin, scroll down, and you will see the trial download there.

Don't know how you want to report this. I have Origin Access, so others without it might want to test, but it looks like everyone gets the 10-hour trial that carries over.

Edit: had a friend check; it's only Origin Access, but you still get it from the store page.
 
I think if a 1600X (or the cheaper 6-core) can hit the same gaming scores as an R7 1800X, then even if a 7700K hits higher numbers, I'll just go for Ryzen and be content. I might splurge for the Crosshair and have a nice motherboard for the next three years, one with a muted colour scheme that can be adapted to whatever I want.
 

If I go Ryzen, I want an X370 ITX board... just don't think it is gonna happen :(
 

Another issue with ITX that stops me from adopting it is how starved of air the GPU can become. I've chosen a case that's still really compact and sleek, but supports E-ATX motherboards and cards as big as the ASUS Strix.
 