Ubisoft claims that Assassin's Creed Origins' DRM now has "no perceptible effect" on game performance

I don't believe them. According to some Steam reviews, the game uses 60% of the CPU just in the main menu. Something else is going on here.
 
Imagine if Ubisoft installed a CPU miner into your game, making them money while your CPU suffers at 100% load.
 
What's wrong with using a lot of CPU resources? It's like people expecting 10% usage during a 3DMark physics run.
Games get more demanding, and that helps push things forward: as we start to use more threads, devs can do more. The title is just demanding. That could be by design, with all the elements the game has, or just not enough optimization in certain areas, which seems to be the issue, since I've seen people say the game struggles in heavily populated areas.
 
60% of the CPU in a menu? What's so demanding in a menu? And how come consoles can run the game without performance issues on their weak CPUs? I'm actually kind of used to Ubisoft's games relying too much on the CPU (Assassin's Creed 3 was another similar case), but this is just ridiculous.

> Imagine if Ubisoft installed a CPU miner into your game, making them money while your CPU suffers at 100% load.
The worst thing is that it wouldn't surprise me if they tried that in the future.
 
You would be surprised how much power a menu can use. I remember Digital Foundry using AC Unity's start menu as a fan-noise stress test for early PS4/Xbox One consoles.

Smarter developers lock the framerate in opening game menus; otherwise the menu will use whatever resources it can, hit insane framerates, and make GPUs squeal/hum.
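None of us can see what engines actually do, but the menu frame-limiting trick described above is simple enough to sketch. Here's a minimal, hypothetical Python example (the cap value, the `run_menu` name, and the stub `render` are all mine, not from any real engine): the loop sleeps away whatever is left of each frame's time budget instead of spinning flat out.

```python
import time

TARGET_FPS = 30                    # hypothetical menu cap
FRAME_BUDGET = 1.0 / TARGET_FPS    # ~33.3 ms per frame at 30fps

def run_menu(frames=5, render=lambda: None):
    """Render loop with a sleep-based frame limiter.

    Without the sleep, the loop spins as fast as the hardware allows,
    which is why uncapped menus can peg a CPU/GPU.
    """
    frame_times = []
    for _ in range(frames):
        start = time.perf_counter()
        render()                               # draw the menu (stub here)
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # Hand the leftover budget back to the OS instead of burning it.
            time.sleep(FRAME_BUDGET - elapsed)
        frame_times.append(time.perf_counter() - start)
    return frame_times

avg = sum(run_menu()) / 5
print(f"average frame time: {avg:.3f}s")  # close to the 0.033 s budget
```

Real engines use higher-precision timers and vsync rather than `sleep`, but the idea is the same: once the frame's work is done, wait instead of rendering again.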
 
As WYP said, the menu is no different from the actual game. It's displaying everything in real time, just like the gameplay. The difference is that pretty much all games lock the menu framerate to avoid using unnecessary power. So it's not a big deal; you spend 30 seconds in the menu anyway.

As for consoles, they are capped at 30fps anyway. They are designed for such a thing, so their CPUs are already maxed out. It's no different; you would see 100% utilization if consoles had a task manager.
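For what it's worth, the utilization figure a task manager shows is just CPU time consumed divided by wall-clock time, normalised by core count. A rough illustrative sketch in Python (the `cpu_percent` helper is my own, not a real API; the busy loop stands in for game work):

```python
import os
import time

def cpu_percent(busy_seconds=0.2):
    """Estimate this process's CPU usage the way a task manager does:
    CPU time consumed / wall time elapsed, normalised by core count.
    """
    cpu_start, wall_start = time.process_time(), time.perf_counter()
    deadline = wall_start + busy_seconds
    while time.perf_counter() < deadline:
        pass                                   # busy loop standing in for game work
    cpu_used = time.process_time() - cpu_start
    wall_used = time.perf_counter() - wall_start
    return 100.0 * cpu_used / wall_used / os.cpu_count()

# One pegged thread reads as roughly (100 / core count)%; a game
# saturating every core would read close to 100%.
print(f"{cpu_percent():.1f}% of total CPU")
```

So "100% CPU" just means the game keeps every core busy for the whole frame, which is exactly what a console title designed around a fixed 30fps budget does.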
 
So you people think that there's nothing wrong with a game that uses 100% of the CPU, doesn't fully utilize the GPU, and can barely reach 60fps on high-end hardware?


Did you bother reading Ubisoft's statement? They think that 100% CPU utilization is OK to ensure steady 30fps gameplay. On what planet is this normal? Either something else is going on in the background, or this game didn't go through any kind of optimization at all before it was released.
 
It's not using 100% according to your previous post; it was 60%. Now you're exaggerating it.
I don't think you are following along well here, hence your confusion.
If you read the statement, you would know they were talking about 30fps as the minimum target. The other mention was about being seamless across both minimum and recommended specs.


The game runs okay on Nvidia but poorly on AMD. Whether that's on Ubi or AMD, we don't know.
We aren't saying it's okay to have poor performance, but it is definitely being exaggerated, especially by you. The menu thing isn't even a big deal; they can cap it in a hotfix. The game itself performs fine until heavily populated areas, which just happen to be many areas of the game.
 
It's using 100% during gameplay. It's using 60% in menus.

When I play The Witcher 3 it's at 90%. What is so wrong about 100%? Do you want it to use less? Heck, when I play World of Warships I am at 60%, and that is far from a complex game. I play Total War, which is a CPU-heavy game, and it uses anywhere from 60-90% and hits 100% during turn times. Civ uses a lot as well. I don't understand why you are upset over it being 100%. Sure, it's not running as high an FPS as other games, but you never really expected it to, did you? I can't remember the last AC that ran at super high framerates. It's a console game.
 
Really? Because when I play The Witcher 3 on my i5-4590, it doesn't go beyond 70% on all four cores at 1080p. The GPU works really hard though, as it should. Hell, I don't think I remember any game that is this CPU intensive. AC3 was, sure, but that game at least benefited from high IPC. AC:O doesn't seem to care even if you have a 5GHz CPU; it's always going to hit 100% during gameplay.
And by some reports, it's causing framerate drops, overheating, and BSODs, and on top of that the game is not utilizing the GPU fully. That is a sign of either no optimization at all or some other issue. If the game wasn't optimized, then at this stage I'm skeptical that anything can be done about it. It would be far better if the issue truly is just the DRM.
 
100% usage for 30fps on what hardware? On a console targeting 30fps this is ideal; otherwise the hardware isn't being used to its full potential.
 