Assassin's Creed dev thinks industry is dropping 60 fps standard

Status
Not open for further replies.
It's 900p :(:(

Let's do 640x480 at 30fps! Mario 3 looked fine!

Having said that, if you said 480p some would think it's high def!
 
Ultimately I blame AMD and the mediocre toss they put out. Sony et al. buy their shit because it's cheap. Devs have to make do. And before they spend loads of money on doing a decent port to hardware that's clearly more than capable of running 60fps and beyond, they'll try to market a shitty port as "cinematic" to save money.

AMD, go shove your Mantle and shitty APUs up your arse.

After reading this post, you sir have lost all credibility. How the hell is it AMD's fault that Sony and MS both asked for nearly the same product? All AMD did was make it and sell it to them.

Consoles (more so the PS4) can do 1080p 60fps; it's just going to take time. At release, both were still running beta dev kits, not finalized ones. Over time the software will improve, making it easier.

Also, your last sentence is quite funny. Consoles don't use Mantle; you clearly know nothing on the matter. It's also a custom APU, not a straight-up desktop one. Again, you know nothing. Just stop trying. Just because one big AAA game couldn't do it doesn't mean you get to go on and on in two different threads bashing (or "defending", as you say).
 
I don't think this issue is all that clear-cut. The resolution thing bothers me and feels like a cop-out, but not so much the framerate.

There is definitely a case for framerates giving a certain feel and texture to the way a game looks and plays. Straight up, the more complex a scene is to render, the slower it's going to be on a given piece of hardware. I recently upgraded to a GTX 980 and played Black Flag, and I honestly do not like how the game feels to play at 60fps. It feels fake and loose somehow. I would much rather cap my framerate at 30 and crank up quality and effects, keeping a consistent 30, than go for a variable high fps.

Of course I don't feel like that about all games. Some definitely look, play and feel better at 60 or more; case in point, FPS games. Unless input lag and subtle aids are added to smooth camera movement, 30fps for shooters is pants. It definitely depends on the game, and I'm in agreement about Assassin's Creed and the devs.

I do think it's disappointing that it's a hardware limitation rather than an artistic vision, but I also feel we've come to expect too much from console hardware. It's not holding the industry back, because the game isn't going to be limited in the same way on PC; it's just not feasible to put hardware into consoles powerful enough to do that when the PCs they're being compared to are monsters costing several times more.
 
The hardware needed to do 1080p 60fps in a properly optimised game really is not that expensive.
The 8-core Jaguar processor they chose is just feeble. It always was, and I have no idea why they chose it; a 6-core FX CPU would have been better.
Then the GPU they chose is ALMOST just about good enough.
The PC version of that GPU would have been just about good enough. The mobile version would have been a little underpowered, but could probably have been used well with optimisation. And the one they actually put in, even in the PS4, which is slightly better than the Xbox One's, is just an underpowered version of the underpowered version.

Then you have the shared memory. There's a lot of it and it's GDDR, so it's fast, but I'd imagine it would have been better with dedicated memory for the system and the GPU.
With the bulk quantities they order, they aren't paying full retail price for these parts. So yes, it would have been a bit more expensive per console to put in slightly better hardware, but I can't possibly imagine it would have been that much more.
It's something that annoyed me about these consoles from the start: they overestimated what the components could do. Later, with API advancements and updates ("Xbox One should get DX12" and so on), they'll probably be able to do a bit better. But they should have been set up to do it now, and then get better still, not set up so that later they can hopefully do what they should have done from the start.

You know something is wrong when you can build a PC at retail prices for the same price as a console (no OS, terrible case, barely adequate but adequate PSU) and it performs the same as or better than the console.
That should not happen, and it can only happen because of the specs.

The 30fps debate I cannot get into.
I would rather play at 1024x768 at a constant 60fps than 1080p at 30fps.
It's the ONLY reason I upgrade my GPU. All I want is a constant 60fps; it's a smoother, better experience.
I can't even imagine playing Assetto Corsa at 30fps.
Even in my RTS games, like the Total War series, I have to have a constant frame rate and do my best to get to 60fps by changing settings.
You could argue you don't need high frame rates in those types of games, but you do really if you want to play them properly.
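The frame-budget side of this is plain arithmetic: at a given target rate, a game has a fixed number of milliseconds to finish each frame, which is why turning settings down (or resolution, as above) is how you stay at a constant 60. A quick sketch; the `frame_budget_ms` helper is just for illustration:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at the given rate."""
    return 1000.0 / fps

# A constant 60fps means finishing every frame in under ~16.7 ms;
# dropping to 30fps doubles the budget to ~33.3 ms per frame.
for fps in (30, 60, 144, 300):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```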

Real life is an infinite number of frames per second. I'm pretty sure humans can perceive over 200 frames per second, and do so all day every day, but our brain filters out lots of the stuff we don't need to know about until there's an emergency life-or-death situation (that's when people say things like "everything went in slow motion / black and white"), because right then your brain says, OK, we need every single bit of information we can get out of this situation so we can save ourselves.

That's just what we see and what our brain chooses to report back to us. So on that point, all games should be over 200fps.
But when it comes to input, fluidity and the general feel of the game, I do say you need a minimum constant 60fps, and if I can I will set 60fps with triple-buffered v-sync too.
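Capping to a constant rate like that is conceptually just a frame limiter: render, then wait out the rest of the frame budget. This is a crude sleep-based sketch of the idea only, not how v-sync or triple buffering is actually implemented by any driver; `run_capped` and its parameters are made up for illustration:

```python
import time

def run_capped(target_fps: int, frames: int, render=lambda: None) -> None:
    """Crude fixed-rate loop: render, then sleep off the remainder of
    the frame budget so frames are delivered at an even pace."""
    budget = 1.0 / target_fps
    deadline = time.perf_counter() + budget
    for _ in range(frames):
        render()
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        deadline += budget  # next frame's target, keeps pacing even

# 10 frames at a 50fps cap should take roughly 0.2 seconds.
start = time.perf_counter()
run_capped(50, 10)
print(f"elapsed: {time.perf_counter() - start:.3f}s")
```

Real limiters track the deadline rather than sleeping a fixed amount, as above, so a slow frame doesn't push every later frame off schedule.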

30fps is barely acceptable in any game. The original PlayStation was doing 60fps in racing games and fighting games like Tekken.

The subjective feel of 24-25fps is something we have been conditioned to, and you would NEVER find it in real life.
30fps is just horrible: it neither emulates the conditioned 24fps nor allows the fluidity of 60fps+.
30fps is like walking around all day blinking 100+ times a second. The more fps you have, the better it will be.
And until we push for a 300fps average, we will never fully have the feeling of reality in gaming.
 
The fps debate isn't as simple as that though, Shambles. A lot of it comes down to personal preference and what the game is doing. Yes, a low frame rate isn't true to life or realistic in that sense, but it can have a huge impact on how we perceive what we're seeing.

I saw The Hobbit in HFR 3D, and whilst the image popped, it gave the film a quality that I really hated. It made the sets look like sets, the make-up look like make-up and the costumes look like costumes, at least to me. I wouldn't go to see another movie in high frame rate for that reason. So it's not as simple as "realism", because while a high framerate gives a truer-to-life perception, there are cases against it. I don't want my fantasy adventure to look so real that I can clearly see it's fake. I want it to look amazing, cinematic and cool. For films that means 24fps, and for some games that means 30fps.
 
I don't want my fantasy adventure to look so real that I can clearly see it's fake. I want it to look amazing, cinematic and cool. For films that means 24fps, and for some games that means 30fps.

You're really going to compare games and films? REALLY?
 
I saw The Hobbit in HFR 3D, and whilst the image popped, it gave the film a quality that I really hated. It made the sets look like sets, the make-up look like make-up and the costumes look like costumes

So your argument is:
30fps looks worse, so it's better, because if it looks less realistic you can't tell that it's fake? :confused:

versus everyone else's argument of:
60+ fps is better because it looks and feels better, and they just need to make the games look more realistic now.

I really don't know how people can argue that displaying individual pictures at (if we're honest) about 10% of the rate they really should be displayed at is better.

Back to your movie thing.

If the sets weren't made so poorly, the make-up wasn't make-up and was actual faces, and the costumes weren't costumes at all, it would have been fine??

Because that has nothing to do with the FPS.

Now let me just make one point about film vs games.
I can watch a film up to the point where a CGI explosion goes off, or they put in some CGI bullet and so on. Then I have to turn it off; I can't stand that. I much preferred it when they just set off 500 gallons of petrol ("club me with a whale if you like for killing the planet").
But in a game it's perfectly fine. It fits in the game.
So you can't compare one to the other on aesthetics or fps.
And even then, I still stand by my point that everything should be 300+ fps.
Tricking your mind into seeing a series of images as a flowing occurrence is all fine, but why would you say "this is the minimum you need for that to happen, so let's use that as our standard"?
It should be getting progressively closer to what the eye can actually cope with.
And this "yeah, let's just use 30fps" debate is just going to hinder progress if they get enough people on board.
 
So your argument is:
30fps looks worse, so it's better, because if it looks less realistic you can't tell that it's fake? :confused:

versus everyone else's argument of:
60+ fps is better because it looks and feels better, and they just need to make the games look more realistic now.

No, my argument is that different framerates give a different feel to the image. I said earlier that 60fps is much better for certain types of games. I'm not attacking anything, just explaining that it's not clear-cut because it's subjective.

People are just acting like they're butthurt over what I'm saying. I've got a high-end system and a PS4, so I'm not trumpeting either 'side' and don't have any agenda aside from voicing my own opinion on the matter, which I believe is the point of this topic: "what do you think".
 
It's not like I compared a running shoe to a piece of lettuce, for God's sake.

Apples and oranges. Both are vaguely the same (fruit / visual media) but they're different in experience and interaction. :)

The Hobbit looked awful when they did that high-fps version (it looked more like a soap opera), I'll give you that.

Low fps does not make games more cinematic, however. If you want a game to look more cinematic, you create that through textures, filters et al. Case in point: Skyrim. Skyrim at 30fps is not any more cinematic than Skyrim at 60fps. It's the core material, not the FPS, that creates the feeling. If it did make games more cinematic, you'd be talking about 24fps, as that's cinema's fps, not 30. :)
 
No, my argument is that different framerates give a different feel to the image. I said earlier that 60fps is much better for certain types of games. I'm not attacking anything, just explaining that it's not clear-cut because it's subjective.

People are just acting like they're butthurt over what I'm saying. I've got a high-end system and a PS4, so I'm not trumpeting either 'side' and don't have any agenda aside from voicing my own opinion on the matter, which I believe is the point of this topic: "what do you think".

I get your point and it's valid. FPS is very subjective.

The main issue with games as opposed to films is that the image changes based on input. I like fluid animations based on a smaller timestep, but again it's subjective.
 
I really do want to argue that it's nothing to do with how a game looks; it's all to do with how it feels.
Assetto Corsa at 30fps would suck. Skyrim at 30fps would suck. Any modern game in my library would suck.
They would not feel as fluid, they would not be as responsive, and you would notice that it visually doesn't flow right (even if it's not stuttering). But that really is not the main thing; it's how it feels and plays, and how immersive it will be.
And games really are about immersion.
I wish someone would make a film at 60fps all the way through, not using a high-def camera, so you don't see that the make-up is make-up.
Don't tell people they did it, and see what people think of it.
I'm sure they wouldn't really notice it as a bad thing.
 
Haters gonna hate, players gonna play, devs gonna dev (kinda), and gamers are going to complain about it on the internet. Fact.

I think a lot of people just buy console games pretty blind and honestly don't care; they're more likely to be impressed by fidelity than FPS. I hate it when people make up resolutions, though; that's BS. Nobody has a 1600x900 TV, so nobody will ever see it natively, and that would annoy me far more than low fps. GTA V looks far superior on a 1440p display (exactly double the native render resolution) than on a 1080p display, irrespective of pixel density.
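The "exactly double" point is just integer scaling: a 2560x1440 panel is an exact 2x of a 1280x720 render in both axes, so each source pixel maps to a clean 2x2 block, whereas on a 1920x1080 panel the same image needs a blurry fractional 1.5x stretch. A small sketch of that check; `integer_scale` is a made-up helper:

```python
def integer_scale(render, display):
    """Return the integer scale factor if the display is an exact
    multiple of the render resolution in both axes, else None."""
    (rw, rh), (dw, dh) = render, display
    if dw % rw == 0 and dh % rh == 0 and dw // rw == dh // rh:
        return dw // rw
    return None

print(integer_scale((1280, 720), (2560, 1440)))  # 2 -> clean 2x2 pixel blocks
print(integer_scale((1280, 720), (1920, 1080)))  # None -> fractional 1.5x scale
```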

JR
 
The fps debate isn't as simple as that though, Shambles. A lot of it comes down to personal preference and what the game is doing. Yes, a low frame rate isn't true to life or realistic in that sense, but it can have a huge impact on how we perceive what we're seeing.

I saw The Hobbit in HFR 3D, and whilst the image popped, it gave the film a quality that I really hated. It made the sets look like sets, the make-up look like make-up and the costumes look like costumes, at least to me. I wouldn't go to see another movie in high frame rate for that reason. So it's not as simple as "realism", because while a high framerate gives a truer-to-life perception, there are cases against it. I don't want my fantasy adventure to look so real that I can clearly see it's fake. I want it to look amazing, cinematic and cool. For films that means 24fps, and for some games that means 30fps.

The only reason you perceived the film differently is that you are used to 24fps movies. By that logic we should still be watching movies in black and white at 320x240, because HD and colour take away the cinematic feeling and make it easier to spot CGI, etc.
Fucking ridiculous argument.
 
I think someone should make this debate pop up on my Facebook so I can share it, see if anyone wants to come and join in, and get more opinions on the matter.
 
The only reason you perceived the film differently is that you are used to 24fps movies. By that logic we should still be watching movies in black and white at 320x240, because HD and colour take away the cinematic feeling and make it easier to spot CGI, etc.
Fucking ridiculous argument.

Lol no. When colour came about, it looked better; when HD came, it looked better; when 4K came out, it looked even better. When a high-FPS film came out, it did not look better (according to the majority).
 
Lol no. When colour came about, it looked better; when HD came, it looked better; when 4K came out, it looked even better. When a high-FPS film came out, it did not look better (according to the majority).

I think that's because of the rates they used more than it being a higher rate.
24fps is the minimum we need, so we cope with that.
48fps is just a multiple of that; it has no particular relevance, and there will be a better frame rate for film, but that's not really the point. That may be the argument devs are using, but it has nothing to do with capping games other than being a convenient excuse.

I don't like CGI in movies; can't stand it, can't watch it. They didn't say "well, let's go back to doing it properly because it looks better" (that would cost money); they said people will get used to it after a while and we can try to make it look better.
Just because they did one film with a ridiculous frame rate they pulled out of their hoo-has, now devs can argue games should be 30fps (because it's easier for them).
It's infuriating.
No one in the history of gaming had ever said 60fps makes a game look worse than 30fps before.
It has always been "the more fps the better", for good reason.

This debate makes me want to rage, lol.
 
Apples and oranges. Both are vaguely the same (fruit / visual media) but they're different in experience and interaction. :)

The Hobbit looked awful when they did that high-fps version (it looked more like a soap opera), I'll give you that.

Low fps does not make games more cinematic, however. If you want a game to look more cinematic, you create that through textures, filters et al. Case in point: Skyrim. Skyrim at 30fps is not any more cinematic than Skyrim at 60fps. It's the core material, not the FPS, that creates the feeling. If it did make games more cinematic, you'd be talking about 24fps, as that's cinema's fps, not 30. :)

I realise cinema fps is 24 and referred to that in an earlier post.

Perhaps cinematic is the wrong word to use, but we're just debating semantics really. What I'm trying to get at is that whilst it might not be more cinematic, 30 gives a very different feel to 60, which can be born of a technical limitation but can also be good. It's hard to debate why I prefer Black Flag at 30 over 60; I prefer the feel, and I can't quantify that.

The fact is, though, that whilst I understand why many would not agree with me, most people are just arguing because a higher number is obviously better to them, be it resolution, framerate, etc. All I'm saying is that it's not a universal truth.
 
I don't like CGI in movies; can't stand it, can't watch it. They didn't say "well, let's go back to doing it properly because it looks better" (that would cost money); they said people will get used to it after a while and we can try to make it look better.
Just because they did one film with a ridiculous frame rate they pulled out of their hoo-has, now people can argue games should be 30fps (because it's easier for them).
It's infuriating.
No one in the history of gaming had ever said 60fps makes a game look worse than 30fps before.
It has always been "the more fps the better", for good reason.

I prefer games to be at least 60FPS; actually, I struggle to play at anything less.

I will say that the comment that games don't look any better at 60FPS might be right, but to me they play like shit below it. The other comment, about 60FPS being hard to achieve on consoles, is true though, and it completely depends on the type of game and what other systems it is simulating.
 