
alienware

I was going to write this to my blog. Sadly it seems I have (in my infinite wisdom) forgotten the login details to my blog, and it uses an email account I no longer have access to. Ah, the marvels of modern living. I would email Blogspot, but usually when you attempt such a thing you are either met with someone who does not comprehend the English language or, worse still, isn't even human.

It all kind of reminds me of the scene in RoboCop 2 where he is reading Miranda to a corpse. You may as well be talking to a cadaver for all the good it will do you, trying to communicate with a machine that only really wants to follow you around the net so it knows how best to advertise to you.

Anyway, without further ramblings let us continue with the subject at hand.

A pretty crappy Christmas.

Yeah, I am aware I have pinched the line from an episode of South Park. You will have to forgive me, as my brain is currently overwhelmed with all of the terminal boredom that set in over the festive period.

I refer, of course, to this year's hardware line-up for our festive period. Last year I was filled with excitement as I was in the midst of building a computer. It was nothing over the top, just an i7 950 with triple-channel memory and so on, but it had me all excited. It was worth doing too, considering there were a lot of new games around and a buzz going around over PC gaming. That set me up for the cold winter months, and I then gamed solidly until spring came and it was worth leaving the house again.

Last Christmas (I gave you my heart…) was a very exciting time for the computing masses. It heralded new hardware (Sandy Bridge) and still had its dramatic ups and downs (Sandy Bridge: up, down, up) and so on. There were new graphics cards capable of smashing any game we wanted into submission, as well as actual new games on the horizon for said hardware to come into its own.

There were competitions, an overload of reviews, and more than enough to sate the appetite of the constantly hungry PC gaming crowd.

And then slowly over the course of the year we discovered that all of this new hardware we had invested in wouldn't come into its own.

What I mean is, aside from all the overclocking and synthetic shenanigans, Sandy Bridge delivered no more than the existing range of i7 processors. And the existing range of i7 processors delivered no more than the range of i7 processors that came before them. Maybe if you were into image editing, video editing or something of that nature you could see Sandy Bridge as a worthwhile upgrade, but in gaming land, in gaming town, it made hardly any difference to PC gaming at all, apart from frame rates the eyes cannot see.

Still, if you were building a new PC then it was definitely the way to go, but it hardly offered any incentive to those who had existing i7 computers, or even those who bought the i7 920 way back in November of 2008.

Once again the new processors (Sandy Bridge i7 and i5 models) were pitted against the ones before them, and once again they came out on a synthetic high, but back in gaming land and gaming town they made no difference. And they still don't. Go and look at comparisons and you will simply find the same old games and the same old tests being used to portray how fast a new processor is.

The thing is, we had already done all of that before. We looked at the same graphs, running the same tests, when we made the switch from Core 2 Duo/Quad (insert CPU name here) and discovered how capable the i7 920 was. And when we look at what the Sandy Bridge processors can do it's all the same, only slightly quicker or better but doing the same thing.

Now call me a cynical old bastard (I am, and I'm bloody proud of it too), but if a CPU can already run a certain test or operation, it is good enough for the task, no? Do we really care that much if it's three milliseconds faster at encoding a video? Do we really get that excited?

So, the bottom line is (and if you don't agree that's fine, but you're wrong) that a three-year-and-three-month-old processor is not worth replacing with its current "cousin".

Which, if you think about it, is quite impressive really. There isn't a game or application known to man that a three-year-old CPU cannot run. Therefore, unless you can come up with an excuse, you'd be pretty silly replacing it.

Sadly in reality it's not quite as impressive as it appears. The depressing reality is that software is not moving forward. Hardware can move forward at a rate of knots, and does, but without the software to show it off properly it's not even worth having really, is it?

So since buying my i7 950 processor and figuring out how to overclock it (and then realising there was no point and putting it back to stock speed), you'll have to excuse me for not pissing in my proverbial pants over the Sandy Bridge episode. I did not ride the wave of hype, I didn't have to experience the ups and downs, and I didn't sit and faff around in my BIOS overclocking it to make my games go faster at speeds I couldn't see anyway.

Other than that we received nothing that was new (in the real sense), and so we had to wait for the software.

Did it come?

Well, no, it didn't. It was another year of sequels as usual. We got Crysis 2 (which turned out to be a visual feast but ultimately a crap game, which is of course the important bit!), we got Battlefield 3, which technically, even at its peak, does nothing more than Battlefield 2 (even in multiplayer!), and we got Call of Duty: Modern Snorefare 3.

The diamond in the rough of course was Skyrim, but even that was a sequel. Again you will have to excuse me for not needing to rush off to the bathroom to change my pants and give myself a sponge down. The problem of course is that if you're not into spells and magic (and all of that sort of stuff) then Skyrim was far less impressive. Personally I've never watched a Harry Potter movie, nor a Lord of the Rings movie, nor any other medieval sort of stuff, because it doesn't interest me at all.

You say Star Wars, I say Indy Jones.

So that left three major titles that were all definitely sequels to the titles we had been fed the year before. Hardly the stuff of knicker-pissing legend, is it?

So that leads us up to two weeks ago, which is where it gets funniest of all.

A new range of graphics processors! Wahoo?

Well again, not really.

First of all AMD started beating the war drum. They started to spread rumours (like a ten-year-old child does) of how their new cards were going to be twice the speed of Nvidia's current offerings. People started to get very excited all of a sudden! Again you will have to excuse the fact that I'm a cynical old bastard, but the first thing out of my mouth, even before the charts began making their way to our screens, was -

Oh goody gumdrops! I bet they'll be benchmarking all of the games we already have to show how fast they are!

And I was right. The charts and graphs came (though TBH AMD really shouldn't hire monkeys to draw their graphs, as they look resoundingly poo) showing just how fast we could play all of our existing games, that were mostly shit in the first place, at FPS counts the likes of which we have never seen!

Jesus, if that was enough to make any man go and break out his rubber piss-proof panties, then that's got to be it, no?

Actually no.

Look, sorry again for being a miserable old twat, but I don't find the prospect of playing through a pile of games that were mostly shit the first time around, at frame rates my eyes cannot tell apart from those of the graphics card I've had for months, remotely appealing.

I'm not going to turn this into a boring review of all of the games that they're using to show how powerful their cards are, but let's just say they range from "Mildly shit" to "Totally shit" and leave it there.

So other than being able to play the games we are already able to play what do the new Radeons offer?

You may have noticed that all of the resolutions they are being tested at are higher than 1080p. There's a method to this, but basically that's about all these cards truly have on offer.

If you game at 2560x1600 then you're in for a treat (or that other strange something by 1440 res). So basically the idea is this.

Go out, buy a 27" monitor to replace the 24" monitor you may already have.

Next, go home and burn a hole in your credit card by paying the proposed £450+ for a Radeon 7970.

Get out your old games, install them, and then sit and play through them all again one by one at this new resolution you could not play them at before!

There is, however, one major flaw in that logic.

Firstly, the games are all pretty shit. Secondly, even if they weren't, we would still have to be able to clear our minds of all of the "Magic Moments" we had during those games and pretend that we haven't played them before. Now this approach worked quite well for me in Mario 64 on my second play-through, but I have to admit that by play-through 4 the "collect all 120 stars to find Yoshi on the roof of the castle" bit was beginning to grate pretty badly.

But that's Mario 64. And no game released since it is worth putting the time into, as they're all just sequels of sequels.

So is that it? Is that what we have to look forward to?

Of course not! Because Nvidia will respond! And when they do we can watch Crysis go even faster, or Battlefield 3 actually hit 69 FPS at a resolution of 2560x1600! Amazing, right?

No. It is not.

All of this new hardware is a total waste of time.

Today someone linked me to a Quadfire test of the new Radeon 7970 under LN2. WOW! Hardly. These Radeons in Quadfire managed to push out a GPU score of over 90,000 points!

Impressed? No.

What good does a Vantage score do you? Does it let you do something that other people cannot do with their existing computers? Would it really make you want to own these cards just so you can replicate the results? Would you truly see £450 as a worthwhile investment just to run a synthetic benchmark and put out a score that everyone else with a similar PC can achieve?

Once again it's Groundhog Day. Once again this new hardware is being tested with old software in order to woo us and make us feel that we have no other option than to spend money on it.

It's not going to work.

This all feels eerily reminiscent of the 1980s. You see, what happened was home computing was born. The Sinclair Spectrum, Commodore 64, Amstrad CPC and all of the others came along. Then the games began to appear, and we all went crazy for it. The problem was the hardware. It was coming along so quickly that machines were being outdated before the software for them even appeared. The fight started out quickly and fiercely, but soon enough the user realised he was being had. So, he simply didn't buy into the new hardware and the entire market collapsed from within; an implosion, if you will.

The end result was that Sinclair Research sold out to Alan Sugar's Amstrad (who came along and licked up the sloppy seconds) and Acorn (Sinclair's direct competition) had to be rescued by Olivetti.

It all slowed down then. Basically game creators were forced to work with what they had, and game sales chugged on.

The problem as I see it now is that all of this new hardware relies on a secondary gimmick. The gimmicks are quite widespread, but mainly go as follows.

1. In order to have GPU processing power like the world has never seen, thou must have two graphics cards.

The problem

The problem of course is software yet again. In order for this methodology to work you need drivers that will support the games you intend to play. So if step one is a tick (drivers) then you move onto the game. It is a fact that 90% of games on the market today do not support more than one GPU and never will. The reasons for all of this are obvious so I won't point them out.

So right away selling point number one doesn't look too promising really. Let's move on to reason number two at once!

2. 3D gaming, 3D in general.

Firstly, a vast chunk of the population cannot even see 3D, due to a condition called lazy eye. So that means they are simply not going to be interested in buying a graphics card to play games in a way they cannot see. Secondly, a vast number of those who can actually see it will either get headaches or suffer from motion sickness and/or nausea, meaning, again, you will have to count them out.

3D gaming is also far from perfect, and things like accuracy and smoothness go out of the window, making a good number of games unplayable.

So if Nvidia think this is a worthwhile business model, I look forward to seeing the receivers heading to their offices.

3. Larger screens.

Again, like all of the other things being used as selling points, larger screens are not essential to the gaming experience. And they don't turn games into different games so that the investment feels worthwhile. Once again you are going to be limited, as what most people don't know (or just ignore) is that the games you are playing at 1600p are using the same textures as the games you are playing at 1080p, only stretched. This is, again, because of the mass market. No gaming company is going to sit down and spend time and money on a minority audience; business does not work that way.

So that's about the gist of it really. I would love to say that 2012 is going to be a great year for computer gaming and technology, but it isn't. I think the most exciting part of 2012 will come down to the drama of watching manufacturers go out of business.

If the software can catch the hardware and make full use of it (note, IF, and a bloody big IF at that) then maybe we will see some exciting times ahead.

But I strongly doubt it.
 
How has the high end ever been any different? You pay a load more for a 3% increase because you are an enthusiast, not because it makes financial sense. Gaming right now is bigger than it has ever been. More money is spent on gaming now than on activities people used to do (like going to the movies, going out, etc.).

The new games aren't holding your interest, but sales would suggest you are a minority. I agree with you that games don't have the same feel or don't leave you with that... feeling... for lack of a better term. I think that comes down to our age. You grow out of things you used to love. I still find a gem of a game every now and then, but not at the level I used to.
 
Erm, it's taken you till now to notice this, dude?

Gaming has been stagnant since the current gen of snot boxes came out. Hell, even those who were known for pushing the envelope of PC games have gone back and made softer engines for them (Crysis 1 on consoles now with a new engine) so they can make some dosh too.

We are at the point now where this year's PC will be all that's needed for the next five or so years again, as the consoles will refresh with current PC spec and then games will be made for that hardware for the next few years. Same old, same old.

Edit: + for the post tho dude, good read
biggrin.gif
 
Games now are no better than they ever have been.

If I load up, say, Battlefield 3 in single player, it is technically no better (in fact, it's worse) than Half-Life 2. Visually it may be better, but that's not really important.

I had more fun out of Duke Nukem Forever than any other game this year. Quite simply, it had a good amount of single-player content with some classic ideas. I don't see pressing buttons in sequence as either a good idea or a new one. Simon did that in 1980-odd.

In fact, going back to Battlefield 3, the levels are actually very linear. So that's either down to a limitation with hardware (VRAM or CPU power) or because they couldn't be bothered.

Either way the hardware shouldn't be a limitation, and isn't. If they can't utilise what we have (GPUs so powerful that they can outpace a CPU) then we're in trouble.

Obviously I'm not going to sit here and post statistics or research actual game sales numbers, but quite obviously they have dropped compared to last year's. This is simply down to a recession. All I can do is make a very quick judgement on myself personally, and this year I purchased and played about four games for the entire year.

That's about a third of what I purchased and played the year before, and it simply boils down to what was available. Hardly anything.

As for new games? They're not new games, are they? They are repeats, re-runs and sequels. The last "new game" I can recall was Fallout 3. New Vegas was a sequel, but given that it was completely and utterly different to Fallout 3 (different location, people, graphics) and actually had some original content, I think I'll forgive it and say it was a worthwhile purchase.

Cast your mind back to 1998 or so when Half-Life came out. Then think about your expectations for gaming ten years on.

Is it there? Is it heck. If the best they can do is GPU-guzzling graphics then we are in a pretty sorry state of affairs.

Erm, it's taken you till now to notice this, dude?

Gaming has been stagnant since the current gen of snot boxes came out. Hell, even those who were known for pushing the envelope of PC games have gone back and made softer engines for them (Crysis 1 on consoles now with a new engine) so they can make some dosh too.

We are at the point now where this year's PC will be all that's needed for the next five or so years again, as the consoles will refresh with current PC spec and then games will be made for that hardware for the next few years. Same old, same old.

Edit: + for the post tho dude, good read
biggrin.gif

Nah it's just taken me this long to write about it
biggrin.gif


What's hilarious is that if you go to a site like IGN and load up the forthcoming games or "to be released" section there's bugger all there, really. They haven't even gotten around to removing the ones that came out in 2011.
 
Everyone wants something new. You always hear someone say, "This game brought nothing new to the table." I like old, solid gameplay mechanics. I hate open-world sandbox games (there are exceptions). I love linear shooters. Give me 1000 clones of Half-Life with better visuals and a compelling storyline and I am sold, just don't cut the game time down to 5 hours dammit!!

I don't always want something new and fresh. I like the PC because for most games I play I already know how to play them. I don't have to learn a new control scheme. I can jump right in the action and have fun right away.

You're right, hardware shouldn't be a limitation. Right now the only limitation is the almighty dollar.
 
Wow, I feel depressed and like topping myself after reading that.

Could you BE any more negative? (In a Chandler Bing voice.)

What is with all this "why do we need all these advancements if what we have now already does the job" and "all new hardware is a waste of time"?

By that logic we should just not bother to constantly push technology and performance to develop things that perform better than old technology while using fewer resources and costing less?

You are making out as though this is all new and it hasn't been going on for years.

If you already have something that does the job then why are you moaning? No one is forcing you to buy these new things. But for the people who don't have something that can already run things and are buying the new stuff, it IS worth it to them.

This is a forum full of people who like new tech and are interested in seeing how things develop and advance. Whether those advancements produce something that performs slightly better than before at a lower cost using less power, or something that offers a lot more performance over the last, it is all an advancement.

How can you moan about people benching this new stuff to see how many extra points they can get? Have you seen what this site is called? OVERCLOCK3D. It's what people who are into PCs, components and pushing hardware do.

You may as well be someone at a Star Trek convention telling everyone how much you like Star Wars better.

As for 3D: I play games in 3D, I watch films in 3D. Do they look amazing and does it add to the immersion of the gaming experience? -snip-, yes it does, a hell of a lot. 3D is amazing; only people who are affected by it or haven't experienced it say otherwise.

Sorry if it makes you feel funny or you are immune to it, but there are thousands that aren't, and should we just not bother with it, and not constantly make it better over time, just because of those people who don't like it? No we shouldn't, and I hope it continues, because I enjoy it.

The same goes for a larger screen: yes, it does make a game better. Obviously not in terms of gameplay or graphics, but gaming on a larger screen is more impressive and feels a lot more immersive. You just keep telling yourself it doesn't while I enjoy gaming on my large screen, which I know from personal experience is better than gaming on a smaller screen.

All I see from you is negativity. Sorry, but this forum is for people who enjoy new tech and talking about it; you don't seem to. If you don't, then why are you here? And if you do, at least be positive and discuss it and not bitch and moan.
 
Everyone wants something new. You always hear someone say, "This game brought nothing new to the table." I like old, solid gameplay mechanics. I hate open-world sandbox games (there are exceptions). I love linear shooters. Give me 1000 clones of Half-Life with better visuals and a compelling storyline and I am sold, just don't cut the game time down to 5 hours dammit!!

I don't always want something new and fresh. I like the PC because for most games I play I already know how to play them. I don't have to learn a new control scheme. I can jump right in the action and have fun right away.

You're right, hardware shouldn't be a limitation. Right now the only limitation is the almighty dollar.

If Half-Life 3 came out tomorrow I would no doubt sprout a terrible case of amnesia.

Sadly Valve are just being dicks as usual.
 
Nah, we all need a good vent from time to time, Sieb, and TBH this was much better to read than my rants about how much BF3 fails.
 
Actually I should point something out.

I spend about six hours a day, minimum, reading. I look for facts. As far as I am concerned, when facts are in play there really is no argument.

Boring? Yeah, I guess you could call it that. I'm a firm believer in science. Science can and will irrefutably prove things in a precise manner, leaving no room for argument. Well, unless you think you are God and can disprove the marvel of factual science.

Thus, it is fact that -

Hardly anyone uses multiple monitors, nor do game companies support them. In fact, I just finished reading an interview with "Bozz", who was part of EA's dev team for Dead Space, in which he confessed that whilst he would have relished the thought of doing a true multiple-GPU version of the game, it would have been utterly pointless. He refers back to what they call "units", and "units" are games sold.

They would only sell at best a couple of thousand units that took full advantage of quad SLI, and the costs would FAR outweigh the returns.

Again, those are statistically proven facts.

You can apply the same logic and science to dual-GPU gaming. The fact is that dual-GPU setups are a minority, and as such game coders just won't bother to code for them. A fact that is very easy to spot when playing Fallout 3, as no matter how many GPUs I have added it is still clearly only really using one.

This is because at the core it only understands, and has been coded to understand, one GPU. Again, fact.

Anything of that ilk is not going to become commonplace, ever. So, whilst they may be a novelty idea the reality is they will never catch on.

PhysX. Yet another one. It can only work on one brand of GPU, and that GPU sits in a PC. So, as sad as it is, it cannot and will not ever become a standard commonplace commodity. Game coders have nothing to gain by adding it all in, and the fact is that it can all be done by a CPU anyway (see also: Havok).

It's a gimmick. Something that a minority will latch onto, but not something everyone needs.

Moving on then we can clearly and safely add multiple screen gaming to the above. Thankfully there are hacks and workarounds, but inevitably it's not something that will become commonplace due to it costing a fortune and once again relying on support that just isn't there.

Power usage, being green.

Roughly speaking, a system drawing 400 watts at full load will use 1 kWh in 2.5 hours.

This is 1 unit of electricity.

Therefore in 24 hours it will use roughly 10 units.

At 8p per unit, that makes it 80p per 24 hours.


OK. So let's assume that you are gaming for twenty-four hours a day, with your system under constant 100% full load, drawing 400 watts. This isn't far off the average PC, even now. So, seeing as you haven't dropped dead (because it wouldn't take long, gaming 24 hours a day), let's say that a Radeon 6970 pushes that up to 440 watts. That machine would then cost 88 pence a day if it was under full 100% load all day.

Now. Let's use as an example that the new Radeon 7970 uses 40 watts less. That's about right (in fact, if anything it's being a bit kind) for a stock-clocked Radeon 7970. The fact is that those who buy them will overclock them as much as they can, yes? Given their price tag and the limited audience (who defy science, and indeed sense).

So yes, where was I? That rig will use 8p less a day, given a full 24-hour load. The thing is (as again common sense and fact dictates) no one games for 24 hours solid, day in, day out. They would quite obviously drop dead.

So let's assume that they spend four hours a day gaming (even I don't game that much) in the evening.

Let's work out how much they would save per day on that basis.

Four hours is 1/6 of 24 hours. So we need to take 1/6 of 8 pence, which is about 1.33 pence. That means the Radeon 7970 can save you 1.3 pence a day.

Now let's say you game every single day of the year for four hours. A year is 365.242199 days.

So, let's figure out the yearly savings of running a Radeon 7970 over a Radeon 6970.

365.242199 x 1.3 = 474.8 pence. So over the space of a year the Radeon 7970 can save us £4.74.

The Radeon 7970 costs a predicted £450. The Radeon 6970 costs £270. That means you would need to run the Radeon 7970 for nearly forty years before the power saving covered the £180 price difference.
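If you don't believe the sums, here's a quick back-of-the-envelope sketch in Python you can run yourself. The 400W/440W full loads, the 8p-per-unit tariff, the four hours a day and the £270/£450 prices are the same assumptions as above; nothing here is measured data.

# Payback time of a Radeon 7970 over a 6970 on electricity alone.
# All figures are the assumptions from the post, not measurements.
HOURS_PER_DAY = 4
PENCE_PER_UNIT = 8          # 1 unit = 1 kWh
DAYS_PER_YEAR = 365.242199

def yearly_cost_pence(load_watts):
    kwh_per_day = load_watts / 1000 * HOURS_PER_DAY
    return kwh_per_day * PENCE_PER_UNIT * DAYS_PER_YEAR

saving = yearly_cost_pence(440) - yearly_cost_pence(400)  # ~467p a year
price_gap = (450 - 270) * 100                             # £180, in pence
print(f"Saving per year: {saving:.0f}p")
print(f"Years to pay back: {price_gap / saving:.1f}")     # ~38.5 years

(It comes out a few pence under the £4.74 above because the post rounds the daily saving up to 8p before dividing by six; either way, you would be dead long before the card paid for itself.)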

So as I keep saying it's utterly stupid to argue with facts. Those are facts.

Given that I am a very fact-based person, there's no point in trying to convince me using techniques akin to brainwashing. The problem is that each and every time (thanks to modern technology) I will do my research, read about it, and then come to the table armed with facts.

And the fact is we are being conned and screwed.

Edit: sorry, I missed 3D. You can add that to the above. The fact that anyone with "lazy eye" can't see it is very, very telling. That means that even if they really wanted to, a huge portion of people can't use it, as they can't see it. So it will never EVER catch on.
 
Actually I confess that I feel really rather stupid for buying a 6970.

Fact is, there was absolutely nothing wrong with my GTX 470 other than the fact it would not run BF3 smoothly at the highest settings with 4x FSAA.

Like a complete pilchard I gained this utterly daft mentality that I had to tame the game, forgetting that I really couldn't stand it.

Money came into my hand and yeah, rush of blood: "F**K YOU, GAME, BE TAMED!" And then I got to sit down and realise what a stupid fool I was, because within an hour the game annoyed me considerably and since then I haven't even played it.

I don't like it. I don't even like, or agree with, war. I'm a pacifist
laugh.gif
 
It may be a fact that not many people game on more than one screen, but the fact is there are people that game on more than one screen. With more powerful cards coming out that are capable of running three screens with decent performance, more and more people will start using multiple monitors.

There are also people that game at 2560x1600 on a single screen, which again requires a more powerful card that can give decent performance at that res (*cough* 7970 *cough*). Again, not many people may be gaming at 2560x1600, but there are people who do, and the cheaper GPUs capable of running that res with decent performance get, the more people will start using it.

SLI/Crossfire is not effective in all games, but it is getting better and better (fact), and with cards scaling to almost 100% you are doubling the performance by doing it.

SLI/Crossfire is also cost-effective: if you have one 7970 for, say, two years and performance starts to drop because of more demanding games, then adding another is cheaper than buying the latest top-end card.

As for power usage, the fact that it uses less power is a fact; no matter by how much less, it is still less. And it's also not about the amount of electricity you save making back the cost of the card; it's about the card being more efficient and saving money on electricity non-stop, fact.

And your 3D argument is weak. I have a lazy left eye, and I'm actually 20% blind in that eye, yet 3D works fine for me. 3D has already caught on and it continues to grow; more and more games are being made for 3D, Batman: Arkham City for example, which in 3D looks absolutely amazing, and btw that is a fact.

I think you need to stop reading so much into facts and start looking into how these things play out in a real-life scenario. Black is black and white is white, that is a fact, but if I mix the two together you get grey, and that is where facts start to change depending on the situation.
 
Every single argument you are trying to make for the 7970 is a very small, weak, minority argument.

You are not using any facts, just what you feel and think.

Yes, it does have some uses. The problem of course is that none of them make it worth having. Not in a real-world, fact-based sense.

I have seen so many people try to justify their purchase using utterly stupid arguments that it's unbelievable.

The thing is, if I were not educated in all of the subjects you speak of then you might be able to catch me out. The problem of course is that, yet again due to my nature, I have tried and owned all of them.

As such I can truly say, with my hand on my heart, that all of the people that run

Crossfire

SLI

Surround / Eyefinity

3Dvision

Physx

And any other you can think of, who then say "I've had no problems!", are absolutely and utterly full, to the crammed point of exploding, with sh*t.

Quadfire? Been there, seen it, done it, got the T-shirt. In reality it was utterly terrible, because only one of the six or so games I tried with it supported it (leading back to my fact-based findings); thus, it's a pointless willy-waving synthetic benchmarking exercise.

yayforme.jpg


Don't worry, the rig was a mess and I'm proud of that. It was put together for one week only to prove a point. Point proven, it's rubbish.

Then onto Crossfire (dual)

newfan.jpg


This lasted me about six months, before I could stand the headaches no more and resorted to a single-GPU setup. The drivers were crap; even when it said Crossfire was working it wasn't, and I began to realise that people who are like me now are like that for a reason.

Moving on, quad SLI

cardsin.jpg


And Surround.

white.jpg


Which I am running right now. A complete novelty; it cost me £65 for a second GTX 295 and £135 for three monitors. So far I have managed to get about three of my games to run properly, and I wouldn't even call that properly. Fact is, the only ones that do are Dirt 2 and Dirt 3. All of the others have needed hacking around and come with consequences, i.e. parts of them do not work properly and you have to live with it.

Quad SLI, whilst far better than Quadfire, is still a complete load of smelly arse, and don't let anyone try and tell you different. I say that as a user and owner of it, speaking with brutal honesty. See also: Surround/Eyefinity. Add the two together? That rig gets switched on for an hour or two a day; I play Fallout 3 (which almost works quite well, though at times it stutters like a slag) and then turn it off.

3D Vision? Made me throw up after about thirty minutes. See, I have a condition that makes my eyes very sensitive to flickering and light. Thus, I couldn't have had it even if I wanted to.

I can and will sit here and talk about this all day. Again, I am armed with facts. You are armed with hyperbole, and that will never stand up in a fact-based argument.

If you would like me to continue debunking your reasons for wanting it (because if you used the word "need" I would call you rude names) then dude, be my guest. The problem of course is that you're not talking to an idiot who will believe it.
 
As such I can truly say, with my hand on my heart, that all of the people that run

Crossfire

SLI

Surround / Eyefinity

3Dvision

Physx

And any other you can think of, who then say "I've had no problems!", are absolutely and utterly full, to the crammed point of exploding, with sh*t.

I can honestly say, with my hand on my heart, that I have experienced no more issues because of SLI than with a single card. With all the FUD I heard before I went SLI I was slightly worried, but I can't think of one instance since I've been running SLI that it has caused me any more issues than running a single card. I don't game very often, that could be the reason, but when I do game my computer plays everything I throw at it fine.
 
But my argument is not about the 7970 itself, it's about what these better cards can offer and will improve on over time. I'm just using the 7970 as an example because it happens to be one of those cards.

And you can use facts all you want but they don't play into how these cards will change those facts.

Like I said, with the cards getting more powerful and it becoming cheaper to run a three-screen setup or a single monitor with a higher res, more and more people will start doing it because of it becoming cheaper. That is not a fact, but it is common sense.

And SLI/Crossfire performance is improving in games, that is a fact.

Scaling in SLI/Crossfire is improving, that is a fact.

You are arguing that the power draw isn't that much less, but that is not the point; the point is they do use less power, and that power draw will continue to get lower and lower.

You are also saying it's not worth buying these cards, and as I have said, if you already have something that can run games or whatever then no, it's not worth it, but if you are upgrading or buying new then yes, it is worth it, FACT

And as I said, your 3D argument is weak: "bawwww, it doesn't work for me and I don't like it so it's turd". Well, it does work for me and many other people, and we all love it.

New tech constantly comes, and you can keep sitting there moaning about the facts on why you shouldn't buy it, but the fact is there are people building their first PC who it will benefit, people upgrading from old PCs who it will benefit, and people who just want the latest and fastest who won't benefit, but it's what they do and it's what they enjoy.

Even if things are a minor improvement they are still an improvement, and will continue to improve; just having a negative attitude and moaning is not the way to look at it. The way to look at it is how these things will change over time, what they bring to the table and what can be improved on even more.
 
But my argument is not about the 7970 itself, it's about what these better cards can offer and will improve on over time. I'm just using the 7970 as an example because it happens to be one of those cards.

What it can do over time is also not a selling point. One thing I did not cover today was DirectX 11. So I will, now.

Firstly I would imagine a huge portion of people weren't even aware of this.

dx10t.jpg


Now, usually I would have asked you what DirectX 11 had going for it. And nigh on every single bloody person who has defended it or argued "FOR" it would have said tessellation. There it is, in black and white: glorious little lines and triangles. There's your tessellation.

Fact is? We were conned. We were promised all of this revolutionary new stuff in DX11 and it failed to materialise.

Why?

Well, most of it was already there (see the image above; I make that nearly four years old).

So OK. The defender of all that is DirectX would now say "The lighting and shadows!". But once again it was proven (using fact) that all of those lighting and shading techniques were achievable in DX10.

The proof? It was possible (with a bit of know-how) to hack the config files in Dirt 2 and enable the lighting and shadows. At that point any DirectX 10 card on the market could run all of the mentioned lighting and shadows.

Which leaves what? The tessellation. The amount of tessellation in Dirt 2 was absolutely and utterly pitiful.

DirectX 11 is a poor API. It was a poor API and it will continue to be a poor API. Like most of the stuff Microsoft create, it needs a titanic amount of power to get it shifting. See also: emulators coded in Visual Basic rather than C++ or Delphi. You will need twice the power to get it shifting, simply because that is how it is. It's bloatware.

And you can use facts all you want but they don't play into how these cards will change those facts.

Change them when? In five years, when id release Tech 28 and it actually uses DirectX 11? Kinda like how we finally have a real, true DX10 game now, coded from the ground up using DX10. How old is DX10 again?

Like I said, with the cards getting more powerful and it becoming cheaper to run a three-screen setup or a single monitor with a higher res, more and more people will start doing it because of it becoming cheaper. That is not a fact, but it is common sense.

And SLI/Crossfire performance is improving in games, that is a fact.

Scaling in SLI/Crossfire is improving, that is a fact.

Your argument holds about as much water as a perforated teapot. Quite simply, there are some enormous whacking great caveats that go against what you say.

1. Console gaming.

You see, when a game company codes up a game, they do it primarily for consoles. That will NEVER EVER EVER change, as the market is ten times larger than the PC gaming market. This means that anything added to make a PC game special needs to be added after the fact, making it an expensive venture. Not only that, but things such as SecuROM and StarForce need to be procured, and that all costs money. And then what happens? Well, a huge percentage of PC gamers will simply hack the game and steal it.

Thus, the PC is not, and never will be, the platform of choice for game coders to make money on. That will never, ever change. If a game makes 10 million dollars on a PC it will make 100 million dollars on a console. Again, that won't change. People like consoles for their widespread, easy-to-use nature. A PC is a pain at the best of times, because once again you must rely on variables and software.

After all of this comes dual GPU / three screens and so on. All told they make up about 0.1% of gamers.

I / You / We are the minority.

AMD do not make their fortunes out of the PC gamer. They make most of their money from things like Llano (in cheap PCs) as well as Fusion. Nvidia's new bread and butter will be Tegra, as it is a mass-selling, mass-marketed product and not a specialist product.

99% of computers sold are low-end, Dell-type, Celeron-based affairs.

This means that you are 1% of that 1% if you use multiple GPUs, and the same can be said for even more GPUs and screens. Companies will simply not bother to support those technologies because there is no money to be made by doing so.

So let's say your catch-22 argument about it did actually take hold, and every single PC gamer had four GPUs and three screens (because they would need to).

What do you think we would get then? I'll tell you.

MOAR SECOND HAND CONSOLE GAMES !111oneoenoneeleven.

Why? Because even if every PC gamer had four GPUs and three screens, we are still the minority!

You are arguing that the power draw isn't that much less, but that is not the point; the point is they do use less power, and that power draw will continue to get lower and lower.

And as the power draw and the process node (nm) get lower, that means what? That we can push and overclock them harder and higher, by the end of which we will save no electricity at all, because that's how it is. The 7970 uses 28nm technology, which means you can now overclock it to 1.3GHz and beyond. Look around you at the reaction to that; everyone is doing cartwheels!

However, sadly, they are failing to realise that at those clock levels there are no power savings at all. In fact, they will use more power than a GTX 580.

If the GPU makers said tomorrow "Look, we can provide you a GPU that uses about 10W at full load, but you can't overclock it and do stupid things with it to prove a point", you would lose interest in about ten seconds flat.
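For the curious, the reason the saving evaporates when you overclock is the old CMOS rule of thumb that dynamic power scales roughly with frequency times voltage squared. A rough Python sketch of that, with purely illustrative numbers (the 925MHz stock clock is real; the voltages are guesses, not measured 7970 figures):

# Classic CMOS approximation: dynamic power is proportional to f * V^2.
def relative_power(stock_mhz, oc_mhz, stock_volts, oc_volts):
    return (oc_mhz / stock_mhz) * (oc_volts / stock_volts) ** 2

# 925MHz -> 1300MHz with an assumed voltage bump from 1.05V to 1.20V:
print(f"{relative_power(925, 1300, 1.05, 1.20):.2f}x stock power")  # ~1.84x

So a card that saved 40 watts at stock is drawing comfortably more than the card it replaced once you wind it up to 1.3GHz.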

You are also saying it's not worth buying these cards and as I have said if you already have something that can run games or what ever then no it's not worth it but if you are upgrading or buying new yes it is worth it, FACT

No, they're not worth it. Not in any way, shape or form. Quite simply, the price they command for the performance they deliver doesn't match up with the cut-price cards you can get now. A 7970 costs XX% more than a 6970. It is also XX% faster. However, when you do the maths you find that it's actually very expensive for the performance it offers. No new GPU is ever worth the money unless it offers something truly new.

The last time we had something truly new (well, so we thought) was DX11. And, quelle surprise! The 5000-series Radeons ran DX11! So we fell for the hype.

This time, however? Well, considering there are multiple cards on the market that are all cheaper per FPS than the 7970 and in the real world can do everything it does, it's not worth buying for any reason at all other than waving your penis.

That's fact.
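If you want the price-per-frame point as actual arithmetic, here's a tiny sketch. Only the £270 and £450 prices come from above; the FPS figures (the 7970 assumed to be roughly 30% faster) are invented purely for illustration:

# Pence paid per average frame delivered; the FPS numbers are hypothetical.
def pence_per_fps(price_gbp, avg_fps):
    return price_gbp * 100 / avg_fps

print(f"6970: {pence_per_fps(270, 60):.0f}p per FPS")  # 450p
print(f"7970: {pence_per_fps(450, 78):.0f}p per FPS")  # 577p

Roughly 67% more money for roughly 30% more frames: on those assumed numbers, the dearer card is worse value per frame delivered, which is the whole point.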

And as I said, your 3D argument is weak: "bawwww, it doesn't work for me and I don't like it so it's turd". Well, it does work for me and many other people, and we all love it.

Aha, so because you think it's great, that means you can shove your fingers in your ears and not look around you?

I currently have, for example, 279 Facebook friends. Two of them have gone out and bought 3D TVs. I would imagine that both of them feel as you do. Sadly that isn't quite how it works, because when a product is flawed (whether you like it or not) it can never be a success.

And again, that's just fact.

New tech constantly comes, and you can keep sitting there moaning about the facts on why you shouldn't buy it, but the fact is there are people building their first PC who it will benefit, people upgrading from old PCs who it will benefit, and people who just want the latest and fastest who won't benefit, but it's what they do and it's what they enjoy.

Even if things are a minor improvement they are still an improvement, and will continue to improve; just having a negative attitude and moaning is not the way to look at it. The way to look at it is how these things will change over time, what they bring to the table and what can be improved on even more.

Ahhh, but you see, people are finally becoming wise to this. And already we are beginning to see the results.

You see, there was once this company called Best Buy. They figured that if they could mass-market and brainwash people, then sooner or later (like many corporations) they could sit down and tell people what it is they want.

Now of course what they mean by "what people want" actually equates to "What we want them to want". It's a common practice used by Apple and so on. Sadly it can only last for so long before people begin to realise they have been had.

http://www.forbes.com/sites/larrydownes/2012/01/02/why-best-buy-is-going-out-of-business-gradually/

Try reading that. It sounds awfully familiar to the PC gaming scene right now. Instead of actually listening to what people want, they keep giving them a load of old shit that is gimmicky and crap and designed to make them money.

Sadly, however, as purse strings tighten during this global recession people are beginning to say "Hey, what do I want with this completely pointless shit?"
 
Whatever you say, keep banging on. I'm sticking to my last post, and you can keep pulling facts out left, right and center and twisting things all you want, but that is all you are doing.

You are just coming across as a conspiracy theorist who thinks the government is listening to all our phone calls and drugging our water to keep us under mind control and big evil corporations are trying to scam us out of our money.

The facts are all this new tech is worth it, it will bring new things to the table at a lower price and it does offer benefits to people buying new and to people who are upgrading.
 
Whatever you say, keep banging on. I'm sticking to my last post, and you can keep pulling facts out left, right and center and twisting things all you want, but that is all you are doing.

You are just coming across as a conspiracy theorist who thinks the government is listening to all our phone calls and drugging our water to keep us under mind control and big evil corporations are trying to scam us out of our money.

The facts are all this new tech is worth it, it will bring new things to the table at a lower price and it does offer benefits to people buying new and to people who are upgrading.

Twisting things? How is stating the truth twisting things?

The bottom line is this.

You fall for marketing pap, I don't. It's as simple as that.

And no, I am not a conspiracy theorist either. Here come the accusations again! You seem to like doing that.

I just live in the real world, and I am fed up with being fed cold soup. Promises, pipe dreams. They never ever come to fruition.

And Christmas 2011 was the final straw. Here comes even more crap based on crap, relying on the future. The problem is that by the time any of this stuff comes into its own it will be, AS USUAL, woefully out of date.

As for this tech bringing anything new?

I tell you what, let's make this fair.

You tell me what technology has done for gaming over the past five years (and you can leave out all of the pointless gimmicks and tessellation) and I will listen. Tell me what there is to be excited about, and not just the hyperbole and crap of a company who want to take my money.

Starting. Now.
 