I was going to write this to my blog. Sadly, it seems I have (in my infinite wisdom) forgotten the login details to my blog, and it uses an email account I no longer have access to. Ah, the marvels of modern living. I would email Blogspot, but usually when you attempt such a thing you are either met with someone who does not comprehend the English language or, worse still, isn't even human.
It all kind of reminds me of the scene in RoboCop 2 where he is reading Miranda to a corpse. You may as well be talking to a cadaver for all the good it will do you, trying to communicate with a machine that only really wants to follow you around the net so it knows how best to advertise to you.
Anyway, without further rambling, let us continue with the subject at hand.
A pretty crappy Christmas.
Yeah, I am aware I have pinched the line from an episode of South Park. You will have to forgive me, as my brain is currently overwhelmed by all of the terminal boredom that set in over the festive period.
I refer, of course, to this year's hardware line-up for our festive period. Last year I was brimming with excitement as I was in the midst of building a computer. It was nothing over the top, just an i7 950 with triple-channel memory and so on, but it had me all excited. It was worth doing too, considering there were a lot of new games around and a buzz going around PC gaming. That set me up for the cold winter months, and I then went ahead and gamed solidly until spring came and it was worth leaving the house again.
Last Christmas (I gave you my heart?) was a very exciting time for the computing masses. It hailed new hardware (Sandy Bridge) and had its dramatic ups and downs (Sandy Bridge: up, down, up) and so on. There were new graphics cards capable of smashing any game we wanted into submission, as well as actual new games on the horizon for said hardware to come into its own.
There were competitions, an overload of reviews and more than enough to sate the appetite of the constantly hungry PC gaming crowd.
And then slowly over the course of the year we discovered that all of this new hardware we had invested in wouldn't come into its own.
What I mean is, aside from all the overclocking and synthetic shenanigans, Sandy Bridge delivered no more than the existing range of i7 processors. And the existing range of i7 processors delivered no more than the range of i7 processors that came before them. Maybe if you were into image editing, video editing or something of that nature you could see Sandy Bridge as a worthwhile upgrade, but in gaming land, in gaming town, it made hardly any difference to PC gaming at all, apart from frame rates the eye cannot see.
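To put a number on that: 120 FPS is about 8.3 ms per frame and 140 FPS is about 7.1 ms. On the bog-standard 60 Hz monitors most of us game on, which only draw a new frame every 16.7 ms, both are invisible headroom.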
Still, if you were building a new PC then it was definitely the way to go, but it hardly offered any incentive to those who had existing i7 computers, or even those who bought the i7 920 way back in November of 2008.
Once again the new processors (Sandy Bridge i7 and i5 models) were pitted against the ones before them, and once again they came out on a synthetic high, but back in gaming land and gaming town they made no difference. And they still don't. Go and look at comparisons and you will simply find the same old games and the same old tests being used to portray how fast a new processor is.
The thing is, we had already done all of that before. We looked at the same graphs, running the same tests, when we made the switch from Core 2 Duo/Quad (insert CPU name here) and discovered how capable the i7 920 was. And when we look at what the Sandy Bridge processors can do it's all the same, only slightly quicker, but doing the same thing.
Now call me a cynical old bastard (I am, and I'm bloody proud of it too), but if a CPU can already run a certain test or operation, it is good enough for the task, no? Do we really care that much if it's three milliseconds faster at encoding a video? Do we really get that excited?
So, the bottom line is (and if you don't agree that's fine, but you're wrong) that a three-year-and-three-month-old processor is not worth replacing with its current "cousin".
Which, if you think about it, is quite impressive really. There isn't a game or application known to man that a three-year-old CPU cannot run. Therefore, unless you can come up with an excuse, you'd be pretty silly replacing it.
Sadly, in reality it's not quite as impressive as it appears. The depressing reality is that software is not moving forward. Hardware can move forward at a rate of knots, and does, but without the software to show it off properly it's not even worth having really, is it?
So, since buying my i7 950 and figuring out how to overclock it (and then realising there was no point and putting it back to stock speed), you'll have to excuse me for not pissing in my proverbial pants over the Sandy Bridge episode. I did not ride the wave of hype, I didn't have to experience the ups and downs, and I didn't sit and faff around in my BIOS overclocking it to make my games go faster at speeds I couldn't see anyway.
Other than that, we received nothing that was new (in the real sense), and so we had to wait for the software.
Did it come?
Well, no, it didn't. It was another year of sequels as usual. We got Crysis 2 (which turned out to be a visual feast but ultimately a crap game, which is of course the important bit!), we got Battlefield 3, which technically, even at its peak, does nothing more than Battlefield 2 (even in multiplayer!), and we got Call of Duty: Modern Snorefare 3.
The diamond in the rough, of course, was Skyrim, but even that was a sequel. Again you will have to excuse me for not needing to rush off to the bathroom, change my pants and give myself a sponge down. The problem, of course, is that if you're not into spells and magic (and all of that sort of stuff) then Skyrim was far less impressive. Personally I've never watched a Harry Potter movie, nor a Lord of the Rings movie, nor any other medieval sort of stuff, because it doesn't interest me at all.
You say Star Wars, I say Indy Jones.
So that left three major titles that were all definitely sequels to the titles we had been fed the year before. Hardly the stuff of knicker-pissing legend, is it?
So that leads us up to two weeks ago, which is where it gets funniest of all.
A new range of graphics processors! Wahoo?
Well again, not really.
First of all, AMD started beating the war drum. They started to spread rumours (like a ten-year-old child does) of how their new cards were going to be twice the speed of Nvidia's current offerings. People started to get very excited all of a sudden! Again you will have to excuse the fact that I'm a cynical old bastard, but the first thing out of my mouth, even before the charts began making their way to our screens, was:
Oh goody gumdrops! I bet they'll be benchmarking all of the games we already have to show how fast they are!
And I was right. The charts and graphs came (though to be honest AMD really shouldn't hire monkeys to draw their graphs, as they look resoundingly poo) showing just how fast we could play all of our existing games, that were mostly shit in the first place, at FPS counts the likes of which we have never seen!
Jesus, if that was enough to make any man go and break out his rubber piss-proof panties then that's got to be it, no?
Actually no.
Look, sorry again for being a miserable old twat, but I don't find the prospect of playing through a pile of games that were mostly shit the first time around, at frame rates my eyes cannot tell apart from what the graphics card I've had for months already manages, remotely appealing.
I'm not going to turn this into a boring review of all of the games that they're using to show how powerful their cards are, but let's just say they range from "Mildly shit" to "Totally shit" and leave it there.
So other than being able to play the games we are already able to play what do the new Radeons offer?
You may have noticed that all of the resolutions they are being tested at are higher than 1080p. There's a method to this, but basically that's about all these cards truly have on offer.
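It's worth doing the sums here: 2560x1600 is 4,096,000 pixels per frame, against 1920x1080's 2,073,600. That's nearly double the pixels to push, and it's about the only workload that makes these cards break a sweat.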
If you game at 2560x1600 then you're in for a treat (or at that other strange 2560x1440 res). So basically the idea is this.
Go out, buy a 27" monitor to replace the 24" monitor you may already have.
Next, go home and burn a hole in your credit card by paying the proposed £450+ for a Radeon 7970.
Get out your old games, install them, and then sit and play through them all again, one by one, at this new resolution you could not play them at before!
There is, however, one major flaw in that logic.
Firstly, the games are all pretty shit. Secondly, even if they weren't, we would still have to clear our minds of all of the "Magic Moments" we had during those games and pretend that we haven't played them before. Now, this approach worked quite well for me in Mario 64 on my second play-through, but I have to admit that by play-through 4 the "collect all 120 stars to find Yoshi on the roof of the castle" bit was beginning to grate pretty badly.
But that's Mario 64. And no game released since then is worth putting the time into, as they're all just sequels of sequels.
So is that it? Is that what we have to look forward to?
Of course not! Because Nvidia will respond! And when they do we can see Crysis go even faster, or Battlefield 3 actually hit 69 FPS at a resolution of 2560x1600! Amazing, right?
No. It is not.
All of this new hardware is a total waste of time.
Today someone linked me to a quad-CrossFire test of the new Radeon 7970 under LN2. WOW! Hardly. These Radeons in quad-CrossFire managed to push out a GPU score of over 90,000 points!
Impressed? No.
What good does a Vantage score do you? Does it let you do something that other people cannot do with their existing computers? Would it really make you want to own these cards just so you can replicate the results? Would you truly see £450 as a worthwhile investment just to run a synthetic benchmark and put out a score that everyone else with a similar PC can achieve?
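If it isn't obvious why that number is so hollow, here's a toy sketch in Python of what every synthetic benchmark boils down to (purely illustrative; this is not how Vantage actually works, and the "score" formula is invented):

    import time

    def synthetic_score(iterations=5_000_000):
        # Burn cycles on busywork that resembles no real game or application.
        start = time.perf_counter()
        total = 0
        for i in range(iterations):
            total += (i * i) % 97  # arbitrary arithmetic, chosen only to take time
        elapsed = time.perf_counter() - start
        # Dress the elapsed time up as an impressive-looking number.
        return int(iterations / elapsed / 100)

    print("Synthetic score:", synthetic_score())
    # A bigger number means the loop finished faster. That is all it means.
    # It does not let you run a single game or application you couldn't before.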
Once again it's Groundhog Day. Once again this new hardware is being tested with old software in order to woo us and make us feel that we have no other option than to spend money on it.
It's not going to work.
This all feels eerily reminiscent of the 1980s. You see, what happened was that home computing was born. The Sinclair Spectrum, Commodore 64, Amstrad CPC and all of the others came along. Then the games began to appear, and we all went crazy for it. The problem was the hardware. It was coming along so quickly that machines were being outdated before the software for them even appeared. The fight started out quickly and fiercely, but soon enough the user realised he was being had. So he simply didn't buy into the new hardware, and the entire market collapsed from within; an implosion, if you will.
The end result was that Sinclair sold its computer business to Alan Sugar's Amstrad (who came along and licked up the sloppy seconds), while Acorn (Sinclair's direct competition) only escaped going under by being bailed out by Olivetti.
It all slowed down then. Basically, game creators were forced to work with what they had, and game sales chugged on.
The problem as I see it now is that all of this new hardware relies on a secondary gimmick. The gimmicks are quite widespread, but mainly go as follows.
1. In order to have GPU processing power like the world has never seen, thou must have two graphics cards.
The problem
The problem, of course, is software yet again. In order for this methodology to work you need drivers that support the games you intend to play. So if step one is a tick (drivers), then you move on to the game. The vast majority of games on the market today do not support more than one GPU and never will. The reasons for all of this are obvious, so I won't point them out.
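To see why that driver box is so hard to tick, here is a rough sketch in Python of the per-game profile lookup that SLI and CrossFire setups live and die by (all names here are invented; real driver internals are vastly more complicated than this):

    # Hypothetical per-game multi-GPU profile table. Only titles the driver
    # team has hand-tuned ever get an entry; everything else falls through.
    MULTI_GPU_PROFILES = {
        "bf3.exe": {"mode": "AFR", "gpus": 2},      # alternate-frame rendering
        "crysis2.exe": {"mode": "AFR", "gpus": 2},
    }

    def choose_render_mode(game_exe, installed_gpus):
        profile = MULTI_GPU_PROFILES.get(game_exe)
        if profile is None or installed_gpus < 2:
            # No profile means your expensive second card sits there idle.
            return {"mode": "single", "gpus": 1}
        return {"mode": profile["mode"], "gpus": min(installed_gpus, profile["gpus"])}

    print(choose_render_mode("skyrim.exe", 2))
    # -> {'mode': 'single', 'gpus': 1}  (no profile, so one GPU does nothing)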
So right away POS (point of sale) number one doesn't look too promising, really. Let's move on to reason number two at once!
2. 3D gaming, 3D in general.
Firstly, a significant chunk of the population cannot see stereoscopic 3D at all, commonly due to conditions such as a lazy eye (amblyopia). That means they are simply not going to be interested in buying a graphics card to play games in a way they cannot see. Secondly, a good number of those who can see it will either get headaches or suffer from motion sickness and/or nausea, meaning, again, you will have to count them out.
3D gaming is also far from perfect, and things like accuracy and smoothness go out of the window, making a good number of games unplayable.
So if Nvidia think this is a worthwhile business model, I look forward to seeing the receivers heading to their offices.
3. Larger screens.
Again, like all of the other things being used as selling points, larger screens are not essential to the gaming experience. And they don't turn games into different games so that the investment feels worthwhile. Once again you are going to be limited, as what most people don't know (or just ignore) is that the games you play at 1600p use the same textures as the games you play at 1080p, only stretched. This is, again, because of the mass market. No gaming company is going to sit down and spend time and money on a minority audience; business does not work that way.
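If you want to see the "same textures, only stretched" point for yourself, here's a trivial sketch in Python with Pillow, using a generated noise image as a stand-in for a game texture (the sizes are illustrative). Upscaling cannot conjure detail the source asset never had:

    from PIL import Image

    # Stand-in for a game's one-and-only texture asset: 512x512 of noise.
    texture = Image.effect_noise((512, 512), 64)

    # Roughly how much screen the same surface covers at 1080p vs 1600p.
    at_1080p = texture.resize((600, 600), Image.LANCZOS)
    at_1600p = texture.resize((900, 900), Image.LANCZOS)

    # Both are interpolated from the same 512x512 pixels. The bigger one
    # covers more screen but contains zero extra detail.
    at_1080p.save("surface_at_1080p.png")
    at_1600p.save("surface_at_1600p.png")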
So that's about the gist of it really. I would love to say that 2012 is going to be a great year for computer gaming and technology, but it isn't. I think the most exciting part of 2012 will come down to the drama of watching manufacturers go out of business.
If the software can catch the hardware and make full use of it (note, IF, and a bloody big IF at that) then maybe we will see some exciting times ahead.
But I strongly doubt it.