Quick News

Well, I guess that's an improvement. Graphics cards have become more powerful since Syndicate came out; it's been two years, and Pascal and Vega are here now.

The most important thing is that it's a good game without loads of bugs. Unity and Syndicate are gorgeous to look at, so I don't mind that they were hard to run.

Also note the changes to the game itself: no more insane NPC counts like in Unity and Syndicate. To say the least, there's less going on on screen, which will help the game run better.

It will be interesting to see how the game looks across all the platforms, Xbox One vs. Xbox One X, etc.

One of the big problems with the last two games, especially Unity on consoles, was the lack of raw CPU performance on both platforms. It is pretty clear that the game was developed with stronger consoles in mind; Ubisoft were no doubt disappointed in the specs of the PS4 and Xbox One at launch.

As for the PC system requirements, the easy thing to say is that the hardware baseline for the game is unchanged (Xbox One specs), so a huge increase in the PC minimum system requirements is not expected. The resolution and framerate targets might change, but the base specs should remain similar.
 
Quite a hideous design, actually :blink:
I don't think RGB lighting suits white sticks as well as black ones. Even then, slapping the Vengeance logo where it is makes it look obtrusive.

Yeah, I'm with Warchild; they are not attractive-looking RAM modules, not to me. They look like cheap Star Wars kids' toys from an Argos catalogue.

I'm starting to think the B in RGB stands for barf. I don't like them either.

I just call them "flashy lights", but that's my own opinion. I wouldn't want an amusement arcade in my room ^_^

All valid points, and in my humble opinion, for white-styled products such as these the grating along the top needs a different design. The black sticks look great regardless of LED colour, but the white ones need a different top design. Maybe Corsair could bring out some replaceable tops, seeing as you can just slide them off.

As said, because the tops of the Vengeance RGB memory simply slide off, Corsair could bring out replaceable tops just like they have for the Dominator Platinums.

Something a bit more solid like this would look better IMO.


[Image: E5ym3rQ.jpg]




Well, I guess that's an improvement. Graphics cards have become more powerful since Syndicate came out; it's been two years, and Pascal and Vega are here now.

The most important thing is that it's a good game without loads of bugs. Unity and Syndicate are gorgeous to look at, so I don't mind that they were hard to run.

Still haven't completed Black Flag since my save got wiped due to a bug, and I've only played 30 minutes of Unity and Syndicate, although I do want to go through them as I really like the AC series.
 
Also note the changes to the game itself: no more insane NPC counts like in Unity and Syndicate. To say the least, there's less going on on screen, which will help the game run better.

It will be interesting to see how the game looks across all the platforms, Xbox One vs. Xbox One X, etc.

One of the big problems with the last two games, especially Unity on consoles, was the lack of raw CPU performance on both platforms. It is pretty clear that the game was developed with stronger consoles in mind; Ubisoft were no doubt disappointed in the specs of the PS4 and Xbox One at launch.

As for the PC system requirements, the easy thing to say is that the hardware baseline for the game is unchanged (Xbox One specs), so a huge increase in the PC minimum system requirements is not expected. The resolution and framerate targets might change, but the base specs should remain similar.

Yeah, the inclusion of enormous numbers of NPCs didn't really impress me about the earlier AC games. It's fine in one or two places, at one particular moment, to see large crowds just to witness the scale of it and be immersed, but having it in every open square takes away from the majesty of it. I remember when Hitman: Blood Money came out and there was one map with a lot of NPCs on screen at the same time. It was amazing, but the novelty wore off quickly, and the rest of the maps were more closed in and tightly woven. That's the way it should be done, in my opinion, or the way Origins seems to be doing it.
 
OK, look, I never post links to this site, mostly because some people don't like it. This time, however, I must.

https://www.bit-tech.net/features/asus-interview-andrew-wu-rog-motherboard-pm/1/

It's an interview with Andrew Wu of Asus. May I just point this out...

bit-tech: Can you go into more technical detail about why the new CPUs are not backwards-compatible with Z270 motherboards?

Andrew: Actually, it depends on Intel’s decision.

bit-tech: So it’s not a physical limitation? Intel said it was to do with power delivery.

Andrew: Not really. It [the power delivery] makes a little bit of difference, but not much.

bit-tech: So what are they referring to – the 20 or so unused pins from before?

Andrew: Yes.

bit-tech: So if you wanted and Intel let you, you could make Z270 compatible?

Andrew: Yes, but you also require an upgrade from the ME [Management Engine] and a BIOS update. Intel somehow has locked the compatibility.

I rest my case.
 
bit-tech: So if you wanted and Intel let you, you could make Z270 compatible?

Andrew: Yes, but you also require an upgrade from the ME [Management Engine] and a BIOS update. Intel somehow has locked the compatibility.

I said this ages ago: a software and firmware update is all that's really needed. But I got shot down and called various names, not on here but in a Facebook comment.

Greed is all it comes down to.
 
I said this ages ago: a software and firmware update is all that's really needed. But I got shot down and called various names, not on here but in a Facebook comment.

Greed is all it comes down to.

Well you now have definitive proof from the horse's mouth :)
 
I said this ages ago: a software and firmware update is all that's really needed. But I got shot down and called various names, not on here but in a Facebook comment.

Greed is all it comes down to.

You made a sensible comment on Facebook... you were just asking for trouble posting there. It's cancer for any public debate.

Anyway... it's an intentional way to push the stock value up. Force more sales and the shareholders are happy.
 
But you read what you want to read. The Asus guy is asked if he could make it work, and he says yes, but it's locked in the ME. They don't talk about the physical pins.
If Intel sticks to two CPU generations at a time, and the next gen is 8-core, then it could come down to the fact that they needed the change to make the socket forward compatible.

Yes, yes, I get it: Intel are bad, yada yada yada. But you are basically discussing stuff you know nothing about. Short answers to questions that could have been elaborated on show that he is not opening up.
 
But you read what you want to read. The Asus guy is asked if he could make it work, and he says yes, but it's locked in the ME. They don't talk about the physical pins.
If Intel sticks to two CPU generations at a time, and the next gen is 8-core, then it could come down to the fact that they needed the change to make the socket forward compatible.

Yes, yes, I get it: Intel are bad, yada yada yada. But you are basically discussing stuff you know nothing about. Short answers to questions that could have been elaborated on show that he is not opening up.

Only we do know something about it. For example, when Intel came out and said it was due to power delivery: we all know that decent Z270 boards have more than enough phases and can deliver more than enough power for two more Kaby cores.

So yes, some of it was based on facts. If 8 phases could run an 8-core FX, they could sure as eggs run a 6-core/12-thread Kaby, because that's basically what this is.

Anyway, it gets pedantic when it really isn't. It's very clear cut, IMO. "Could it work, Mr Asus guy, guy who knows more than most will ever dream of knowing?" "Why yes, yes it would."
 
Then I go and read the link.
Funny that you decided not to bring up the next few lines as well? I'll just leave it at that.

bit-tech: The 20 previously unused pins that you mentioned, what are they now used for?
Andrew: Many of them are used for power control. It's possible that these are in preparation for the high-core count processors.
 
Then I go and read the link.
Funny that you decided not to bring up the next few lines as well? I'll just leave it at that.

bit-tech: The 20 previously unused pins that you mentioned, what are they now used for?
Andrew: Many of them are used for power control. It's possible that these are in preparation for the high-core count processors.

That is irrelevant for the reasons I just mentioned, i.e. if 8 phases could push a 250 W+ Piledriver, they could easily handle going from four Kaby cores to six.

Maybe some time in the very distant future (because it will probably take Intel another decade to manage another two-core enema) it may make a tiny bit of difference. But the fact remains: there was no reason, at least not for now, not to release this 6-core chip on Z270. None.

Edit: the bit you posted is a moot point.

bit-tech: So it’s not a physical limitation? Intel said it was to do with power delivery.

Andrew: Not really. It [the power delivery] makes a little bit of difference, but not much.

They had already covered that.
 
That is irrelevant for the reasons I just mentioned, i.e. if 8 phases could push a 250 W+ Piledriver, they could easily handle going from four Kaby cores to six.

Maybe some time in the very distant future (because it will probably take Intel another decade to manage another two-core enema) it may make a tiny bit of difference. But the fact remains: there was no reason, at least not for now, not to release this 6-core chip on Z270. None.

Edit: the bit you posted is a moot point.

bit-tech: So it’s not a physical limitation? Intel said it was to do with power delivery.

Andrew: Not really. It [the power delivery] makes a little bit of difference, but not much.

They had already covered that.

Let's make a thread, guys. We shouldn't flood Quick News with this.
 
Then I go and read the link.
Funny that you decided not to bring up the next few lines as well? I'll just leave it at that.

bit-tech: The 20 previously unused pins that you mentioned, what are they now used for?
Andrew: Many of them are used for power control. It's possible that these are in preparation for the high-core count processors.

Don't start getting into "you didn't say this or that", etc. It's annoying to read and gets a little childish after a while. There's no need for it.

Make a thread if what someone didn't say is that important to you...
 
Let's make a thread, guys. We shouldn't flood Quick News with this.

Honestly, man, there is no point. There's nothing left to argue about. We may never get conclusive proof about the soldering and so on, but this is conclusive, to me at least. The biggest, most BA board maker in the business says it would work fine if Intel didn't block it; that's about all she wrote, really.

After that it just becomes nonsense (pretty much like the solder-gate thread, which I will put my hands up to!).
 
I don't know if I was fully sold on all the points he was making. It's been two days since I watched it, though, so I can't remember what they were. Sometimes Jim is right; sometimes he's glaringly wrong. Rarely does he ever admit that in video form. He might correct himself in a comment, which is great, but sadly most won't see that.
 
I don't know if I was fully sold on all the points he was making. It's been two days since I watched it, though, so I can't remember what they were. Sometimes Jim is right; sometimes he's glaringly wrong. Rarely does he ever admit that in video form. He might correct himself in a comment, which is great, but sadly most won't see that.

Each motherboard manufacturer has been leaving MCE (Multi-Core Enhancement) set to auto, which means on. This is an Intel feature. What it then does is overclock the CPU to its full single-core boost rate on all cores, so 4.7 GHz in this case. So when reviewers reviewed the "stock" 8700K, it was actually receiving a substantial boost.

But this is great, you may think, as it saves people overclocking, right? Well, first of all it skews reviews and makes the 8700K sound far better than it actually is, and secondly, Intel came out with a statement a few days back saying they were only guaranteeing the single-core boost frequency and the base frequency, IIRC. In other words, if Intel knock out a dodgy batch of CPUs that don't clock as high as the review samples, then these speeds will be lower.

Just to prove how right Jim was...

https://www.youtube.com/watch?v=zi-zU2p2ykc&t=2s
 
Each motherboard manufacturer has been leaving MCE (Multi-Core Enhancement) set to auto, which means on. This is an Intel feature. What it then does is overclock the CPU to its full single-core boost rate on all cores, so 4.7 GHz in this case. So when reviewers reviewed the "stock" 8700K, it was actually receiving a substantial boost.

But this is great, you may think, as it saves people overclocking, right? Well, first of all it skews reviews and makes the 8700K sound far better than it actually is, and secondly, Intel came out with a statement a few days back saying they were only guaranteeing the single-core boost frequency and the base frequency, IIRC. In other words, if Intel knock out a dodgy batch of CPUs that don't clock as high as the review samples, then these speeds will be lower.

Just to prove how right Jim was...

https://www.youtube.com/watch?v=zi-zU2p2ykc&t=2s

It's not the MCE debacle I'm talking about; that's obviously an unfair and unwise thing to do, since those with inferior cooling will be held back without realising it. They'd have significantly lower performance numbers without knowing why. It's his other points (again, I can't remember them, sorry) that he made in the video, as well as his seemingly constant attacks on Intel and Nvidia, that I sometimes doubt.
 
It's not the MCE debacle I'm talking about; that's obviously an unfair and unwise thing to do, since those with inferior cooling will be held back without realising it. They'd have significantly lower performance numbers without knowing why. It's his other points (again, I can't remember them, sorry) that he made in the video, as well as his seemingly constant attacks on Intel and Nvidia, that I sometimes doubt.

TBF, if AMD mess up he does say so. And yeah, the MCE thing sucks, especially as Intel will not guarantee any speeds at all. You could end up with a right minger, kinda like the C? D? batch i7 920. Absolute pants.
 