Dreaded ATI vs. Nvidia rant

Ari-M.

New member
First let me start by saying: I own BOTH ATI and Nvidia GPUs and have found both brands to be flawed in one way or another. I am not making this post as a fanboy of EITHER brand. They both serve a purpose.

However, I do want to point out one thing that always sets my RAGE button off. When comparing ATI to Nvidia, reviewers and end users always say the same thing: ATI = better price-to-performance ratio. This is usually accompanied by some gaming benchmarks. That is all fine and well, but here is something to consider.

For professional users, such as content producers and graphic artists, Nvidia is the only option. For example, the latest two Adobe CS releases have native support for CUDA GPU features, and the newest offering of Adobe Premiere uses the MPE (Mercury Playback Engine). The MPE ONLY supports GPU-accelerated video and rendering through Nvidia's CUDA architecture.

I find it curious that this is almost NEVER mentioned by users or reviewers. Even if you are a casual CS user, this represents MASSIVE performance benefits, so much so that I am not sure how ATI can not be developing some sort of similar GPU development language.

Like I said, I am not a fanboy of either brand. My gaming rig has 5770s in it. They offer a better price-to-performance ratio and don't generate the ungodly heat that my GTX 480s do in my production rig, granted.

Why, however, are these GPUs only considered/validated by gaming benchmarks and by gamers? I think it's grossly short-sighted to leave out the benefits of CUDA in professional environments.

I am not sure why modded rigs and custom builds are always seen as "gaming" systems. Just earlier today I was watching a gentleman do a video "showcase" of his EVGA SR-2 rig: quad SLI, dual 6-core Xeons, Koolance external rads, the whole lot.

He then went on to describe this as his gaming rig. In his showcase he even mentioned how his custom UPS and wiring couldn't handle the load of his two 1500-watt PSUs, so he had to have two dedicated 20-amp breakers installed for his system.

This seems insane to me. There would be ZERO use for a system like that as a "gaming" rig. It would, however, make an excellent development platform or production rig, especially with all of that CUDA power available.

I hope this isn't off base, out of context, or even off-topic. I am just curious about everyone's thoughts on when a "gaming" rig is just plain overkill. Also, why are so few content producers involved in overclocking and high-end component reviews? I can't be the only one who uses these high-end systems to make a living....

I am very curious to hear your thoughts on the subject....why is the fact that CUDA is the ONLY supported GPU development language for professional apps not a larger issue in the ATI vs. Nvidia debate?

Thanks for making it through my long-winded rant....

 
First let me start by saying: I own BOTH ATI and Nvidia GPUs and have found both brands to be flawed in one way or another. I am not making this post as a fanboy of EITHER brand. They both serve a purpose.

Indeed. Money talks and bullsh*t walks, IMO. I don't care who makes the best card; I care who makes the best card for MY money.

The argument used to be that ATI's drivers were pony (pony & trap, crap) and that Nvidia cards had far better drivers. I would agree with that too, right up until about a year ago when Crossfire just screamed off into the distance and Nvidia put out a driver that killed cards.

However, I do want to point out one thing that always sets my RAGE button off. When comparing ATI to Nvidia, reviewers and end users always say the same thing: ATI = better price-to-performance ratio. This is usually accompanied by some gaming benchmarks. That is all fine and well, but here is something to consider.

For professional users, such as content producers and graphic artists, Nvidia is the only option. For example, the latest two Adobe CS releases have native support for CUDA GPU features, and the newest offering of Adobe Premiere uses the MPE (Mercury Playback Engine). The MPE ONLY supports GPU-accelerated video and rendering through Nvidia's CUDA architecture.

Whilst I agree with what you are saying, you need to remember that graphics cards are made for gaming. There is a niche market for graphics cards that do more, but their primary usage is gaming. Nvidia's Tesla cards are basically nothing but a GTX 480 with all of the cores they promised present and working. They are just as hot and cost what, thousands? That's OK if that is what you need, but I would imagine most of the other companies have been careful in that department. ATI made the FireGL or whatever it was called, 3Dlabs made the Wildcats, and so on. But over the years many of those smaller companies have fallen by the wayside (3Dlabs, Intergraph, SGI, etc.) as the money all lies in gamers and convincing them to buy the latest and supposedly greatest cards.

No one goes through cards like a gamer does, especially one with an urge to display his manhood by what he has sitting in his PCIe slot. Manufacturers know this, and this is why they drip-feed a large series of cards so that people end up upgrading as many times as they can be tricked into.

I find it curious that this is almost NEVER mentioned by users or reviewers. Even if you are a casual CS user, this represents MASSIVE performance benefits, so much so that I am not sure how ATI can not be developing some sort of similar GPU development language.

Because gaming GPUs are not designed to be used for such things. For that you would need to buy a Tesla or what used to be known as a Quadro, or an SGI workstation, or an Intergraph, or a FireGL-based box, and so on. Those are graphics workstations designed for design professionals. Nvidia actually stated that they were holding the real, true, unhindered GTX 480 back for its Tesla series of professional cards. And there are magazines and places you can go to find out things like that. But yeah, Radeon and GeForce cards are simply not designed, pushed or marketed at graphics professionals.

Like I said, I am not a fanboy of either brand. My gaming rig has 5770s in it. They offer a better price-to-performance ratio and don't generate the ungodly heat that my GTX 480s do in my production rig, granted.

Well, if you really wanted to treat money as no object you would be using the Tesla cards. Fact is that 480s are probably more than good enough.

Why, however, are these GPUs only considered/validated by gaming benchmarks and by gamers? I think it's grossly short-sighted to leave out the benefits of CUDA in professional environments.

They're just not aimed at the professional market, simply and purely because they are not considered to be professional cards, mate. It really is as simple as that. They're being sold as gaming cards to gamers. If we opened the pages of Custom PC (or Maximum PC where you are) and saw a load of figures about how well a card could crunch data or render a wireframe etc., we would likely yawn and be put off for life. Now yes, it's short-sighted, but then so are humans. We are an incredibly fickle bunch, and aiming what we want square at our faces usually tends to work. I don't even know if ATI bother with the FireGL range any more, tbh. The money is all in gaming. Why? Because a gamer is never satisfied. It's a bloody illness. That's why they throw those benchmarks in your face, to sow a seed of doubt. You then run along like a good little puppy and bench your machine, only to find - "oh no, my machine can't put out the scores those new cards do!! POKEMON TIME! GOTTA HAVE 'EM ALL!"

And then you go out and buy faster cards that also last about a week before you see magazines slating them and yours being shat on by newer, more expensive cards. That's how it works - on hardware sales. Not like a gaming console, where there are no upgrades or benchmarks.

I am not sure why modded rigs and custom builds are always seen as "gaming" systems. Just earlier today I was watching a gentleman do a video "showcase" of his EVGA SR-2 rig: quad SLI, dual 6-core Xeons, Koolance external rads, the whole lot.

He then went on to describe this as his gaming rig. In his showcase he even mentioned how his custom UPS and wiring couldn't handle the load of his two 1500-watt PSUs, so he had to have two dedicated 20-amp breakers installed for his system.

So that he could put out benchmarks and increase the size of his penis. Seriously, I am not joking. As I said, it's like an illness. The system described above probably never gets used for gaming, but more for whacking out immense benchmarks. If he can afford it and enjoys his hobby? Fair play. I mean, I used to do SPL competitions. We used to describe the builds as 'our car stereo', but who can drive along with four 4000 W 18" subs on full blast? You can't, but it's addictive and it becomes an obsession. Good old capitalism at its best.

This seems insane to me. There would be ZERO use for a system like that as a "gaming" rig. It would, however, make an excellent development platform or production rig, especially with all of that CUDA power available.

I hope this isn't off base, out of context, or even off-topic. I am just curious about everyone's thoughts on when a "gaming" rig is just plain overkill. Also, why are so few content producers involved in overclocking and high-end component reviews? I can't be the only one who uses these high-end systems to make a living....

Yup, you're indeed right. There would be absolutely bugger-all use for it as a gaming rig, simply because what he has done would be woefully unsuitable for gaming. First you would shrink due to the heat coming out of the back, and then the drivers would probably not be up to gaming anyway, given how complicated the setup is. Some people are just never happy with what they have. The manufacturers know this, and this is why some (I'm looking at you, Nvidia) rebrand GPUs and put them out under a new name. And people rush straight out to buy them.

I am very curious to hear your thoughts on the subject....why is the fact that CUDA is the ONLY supported GPU development language for professional apps not a larger issue in the ATI vs. Nvidia debate?

Thanks for making it through my long-winded rant....

Well, as I said, it is all kinda black and white. Gaming cards are made for gamers, professional cards are made for professionals (and priced to suit). I mean hell, a few years back a simple mod could turn any ATI or Nvidia card into its professional guise for about a tenth of the cost. Please consider that gamers would not even want a card that touted its prowess in professional graphics applications. They want buggy piles of rubbish like Crysis benchmarks for bragging rights. I mean hell, ATI are just about to launch an entire series of new cards just so they can claim back the fastest-single-GPU trophy.

So by now you should be catching on to the mentality that is used and abused. Also... to open another can of worms... graphics cards are the most unreliable, unstable thing in a PC. They don't come with ECC RAM, are not designed to be run 24 hours a day, and fail more than any other single component in a PC. And that's because the manufacturers basically gamble on people replacing them so quickly that they will never need the aforementioned (maximum stability, ECC and so on).

As I say, horses for courses. Gaming stuff is aimed at braggers and people who feel inadequate, and professional gear is aimed at professionals. That's why no gaming mags mention what ECC RAM does or what redundant PSUs are for - because we simply wouldn't care.
 

Very interesting take on the subject. Here is some irony if I may add it.

I am a professional user....multimedia production. I used to buy SGI (which, oddly, did use Nvidia's early chips) and Apple and all kinds of other odd bits of "professional" junk. Then a while back I got wise to something: it's the same junk, just marketed in a black box....instead of a chrome/LED monstrosity that looks like it beamed onto your desk from some future colony of druid phallus worshippers.

I had another major revelation with the release of the GTX 480. I was honestly considering buying a massive Quadro/Tesla render farm. Just for grins I went and bought a GTX 480, then looked into the reason it would not work as a CUDA/MPE engine for CS. Want to know what my research uncovered?

Adobe ships a basic txt file (think of it as registry permissions) that basically disables ANY card not listed within it. For example, the Quadros are supported, and so were a few GTX cards. The GTX 480 WAS NOT, and CS would not recognize it as a CUDA resource. So I simply deleted the txt/whitelist file, re-ran the CUDA sniffer utility from the command prompt...and blammo....the GTX 480 absolutely DEMOLISHES the Tesla/Quadro solution. Now, to be fair, it doesn't scale (no SLI), so when I am running multiple apps I have simply written a script to give each app access to a dedicated GTX 480. I also did the same with a 24 GB brick of RAM. Rather than buy a custom machine that utilized RAM in a better way than Win 7 does...I just allocated half the RAM to a RAM disk and moved all the paging files there.
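
If anyone wants to poke at the same thing, a rough sketch of the whitelist tweak looks something like this. Fair warning: the file name (cuda_supported_cards.txt), the install path, and the card string below are assumptions from memory of the CS5 setup, not gospel, so keep a backup; appending your card's name is the gentler route than deleting the file outright like I did.

```python
# Rough sketch of the CS5-era whitelist tweak described above - assumptions, not gospel.
# Assumed: the whitelist is "cuda_supported_cards.txt" in the Premiere Pro install
# folder, and the card string must match what the CUDA sniffer utility reports.
from pathlib import Path
import shutil

PREMIERE_DIR = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS5")  # example path
WHITELIST = PREMIERE_DIR / "cuda_supported_cards.txt"
CARD_NAME = "GeForce GTX 480"  # must match the name your card reports

def enable_card() -> None:
    if not WHITELIST.exists():
        print("Whitelist not found - nothing to do (or it was already removed).")
        return
    shutil.copy2(WHITELIST, WHITELIST.with_suffix(".bak"))  # keep a backup first
    lines = WHITELIST.read_text().splitlines()
    if CARD_NAME not in lines:
        lines.append(CARD_NAME)
        WHITELIST.write_text("\n".join(lines) + "\n")
        print(f"Added '{CARD_NAME}' to {WHITELIST}")
    else:
        print("Card is already whitelisted.")

if __name__ == "__main__":
    enable_card()
```

After that you re-run the CUDA sniffer from a command prompt, as above, and check whether MPE offers hardware acceleration.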

Same deal with the 980X. "Gaming" CPU or not, it still crunches numbers the same as a Xeon (OK, OK, smaller cache and some other interesting bus things).

So, long story short: I took "gaming" hardware and built a professional workstation out of it. It runs totally stable, it's got a healthy overclock on it, and it absolutely craps on any other system I have had, past or present. This includes SGI Octane and O2 systems, Onyx servers, 8-core Mac Pros, custom Fedora/Unix boxes, etc., etc. In 15 years of doing this for a living I have never had a system that performed so well for so little coin.

Let's put it into perspective. I have what most "gamers" would consider to be an astronomical rig. By gaming standards I have an e-p33n the size of a bus. For a gaming rig it's STUPID money.

Here is where it gets sweet, though. My last SGI system was well over $30k. My last Mac Pro was well over $7k. I have less than $4k in this new production rig and it stomps the living sh!t out of those other systems.

Yes, yes, we have Moore's law and all, and things get faster and cheaper, etc., etc....but I couldn't go out and buy a "professional" rig that could keep up with this thing. I have seen the stuff out there and been courted by the companies trying to sell it, including full on-site demos and fancy dinners and whatnot.

So my point is this: the hardware, underneath all the marketing and fancy lights and fan shrouds....is EXACTLY the same.

I am really wondering, then, how the general public can be so uninformed about this basic underlying principle. Are people really so wrapped up in their pissing contests that they are blind to the facts? Or is it deception from reviewers and end users alike, to corral more people into their camp? Like the ATI or Nvidia fanboys. Because, by god, if you tell an ATI fanboy that Nvidia actually builds a more capable card, he just says things like "well, yeah, maybe more capable if you want to cook eggs or heat your house." Facts are ignored. Same for the Nvidia fanboy: tell him that ATI builds a smarter, cheaper card and he says "well, yeah, if you don't care about PhysX and want to be a tightwad."

Facts are just ignored left and right. There is no better or best. There are tools to get jobs done. If the job is gaming, then buy a pair of 5770s and a 1055T on a Biostar motherboard and go play ANY game out there. Then, in a few years when you have a new generation of games, consider something else.

Here is an example of what I mean: Crysis is still used as THE benchmark for gaming. How old is Crysis again?

Why is that the extent of this hardware, though? This high-end "gaming" stuff can really be used to do productive things, not just to wank about with video games and synthetic benchmarks.

I think I already realize and agree with 100% of your feedback, but I am asking WHY that is the case?

Can end users really be so insecure about the size of their penis? Or do the reviews and the "industry" lead sheepish people to think that this stuff has no use beyond playing video games? I mean, c'mon, seriously. It's freaking video games!!!!
:lol:

Maybe I should come out with a gold-plated Monopoly board that has 4-way SLI and scores over 60,000 in 3DMark.

The saddest part is that kids can't afford this stuff, so it's us "grown men" arguing and warring over who makes a better graphics card to wank the night away playing games with. That is insane. I suppose pretty soon Mother Nature will wipe us pathetic bastards off the planet anyhow. Enjoy that e-p33n while it lasts, I guess.
 
Mate, without getting out my tinfoil hat and running around going "dibble dibble dibble" before being rushed off in a straitjacket - it's all a con. All of it.

I'm the guy here who likes to experiment and really push for answers.

About a month ago I did an experiment with my 5770s and my Crosshair II. When I put the pair of Radeons in there, the machine would get to Windows and then reboot. No BSOD, no message, nothing. Now, obviously the CH2 is an SLI-enabled board... hmm. Well, I did some digging around and everyone was saying that an SLI board will not run Crossfire because it has a special chip on it. Hogwash. In the end GenL (the PhysX hacker dude) and I discovered what was happening.

After many hours of poking we found out that the Striker II Extreme (790i SLI board by Asus) was able to run Crossfire with a BIOS made by HP (for that stunningly sexy Blackbird box they did with Voodoo DNA). After some probing we found out that all HP had done was hack the BIOS.

Turns out that the BIOS on an SLI board (and I would imagine on a Crossfire board) looks at the number of cards in the machine. It then looks at the VEN_ID (the PCI vendor ID) and, if it finds two "rogue" cards (ATI on an SLI board, or vice versa), it simply sends Windows a reset command when the drivers are loaded.

Now, this could easily be bypassed by decompiling the BIOS, but the problem is that there is now a CRC chip sitting right next door to the BIOS which checks the protection calls (checksums) in the BIOS. If it doesn't match 100%? You have a nice bricked board on your hands.
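
For anyone who wants to picture that vendor check, here is a toy sketch of the logic as described (not real firmware code, obviously); the only hard facts in it are the public PCI vendor IDs, 0x10DE for Nvidia and 0x1002 for ATI/AMD.

```python
# Toy model of the vendor-ID lockout described above - illustrative only, NOT firmware code.
# PCI vendor IDs are public: 0x10DE = Nvidia, 0x1002 = ATI/AMD.

NVIDIA, ATI = 0x10DE, 0x1002

def multi_gpu_allowed(licensed_vendor, card_vendor_ids):
    """Mimic the check: two or more cards whose vendor doesn't match the
    board's licensed multi-GPU vendor trips the lockout."""
    if len(card_vendor_ids) < 2:
        return True  # a single card is never policed
    return all(vid == licensed_vendor for vid in card_vendor_ids)

# Example: a pair of Radeons in an SLI-licensed (Nvidia chipset) board
if not multi_gpu_allowed(licensed_vendor=NVIDIA, card_vendor_ids=[ATI, ATI]):
    print("Lockout tripped: the firmware signals Windows to reset once the drivers load.")
```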

That Adobe trick does not surprise me in the slightest. As I said before, all a Tesla is is a 480 with all of the cores enabled (the maximum the chip can handle) and a few more stream processors, etc. (examples, not precise figures). Then they go and charge around two grand for it.

The 8800 Ultra was *nothing* but an 8800 GTX with a simple capacitor volt mod and a huge overclock. The 9800 GTX was nothing but an 8800 GTX with slightly more unlocked, the GTS 250 was nothing but a 9800 GTX - repeat ad nauseam.

As for the 980X? 100% bang on again. Intel have been doing that for years, mate. Back when AMD were running riot with the Athlon XP and then the FX series, Intel had no answer. No response, no reply. They were having their asses handed to them. So what to do? Cut down a Xeon ever so slightly, whack it in a Socket 478 package and bingo bongo! The Extreme Edition was born: a fast desktop CPU... that costs a grand.

At that time I had managed to bag the full internals of an SGI 550 from eBay (big Intel dual Slot 1 Xeon board, PSU, etc.) and was running dual 800 MHz P3 Xeons, simply because it cost hardly anything ($300) and took a flying poo on any desktop CPU setup. As long as you had an AGP slot you were laughing.
 

That is funny about the special chip thing. My mobo has NF200s on it (which are obviously an Nvidia product)....yet it can run both SLI and Crossfire....

The intentional "breaking" of products for the sake of a fancy sticker on a product box is absurd. It's extortion: pay a licensing fee for a box badge and we will "un-break" hardware that should already work.

Yeah, it's a circus, man. I'm really surprised that consumers are willing to play the game.

It's funny: my 10-year-old SGI Octane is still the most potent web server I have ever administered.....IRIX was a monster of an OS, with VRML native to it. Freaking awesome. Every bit of the GUI is a vector graphic, and VRML is native in the web server. I can't understand how they didn't manage to completely dominate the ISP/colo markets.

Oh yeah, I guess I am forgetting the astronomical "as new" price tags.

The saddest part about this whole scheme is that it's all the same hardware. The only things that really separate it are software implementations and the intentional "breaking" of versatility.

Then again, I guess everyone has to make a living somehow. It reminds me a lot of the automobile industry: if you don't have the latest revision of your make and model, you are a worthless pile of crap. Forget the fact that a vehicle is nothing more than a tool to get you from one place to another.

I think I am just going to skip all of this nonsense and get into the business of penile implants.
 
You're quite correct in many of your statements. It's noticeable, in the cases where you don't need precision, that cheaper hardware can cut the mustard when it comes to professional work. As long as you build your own kit, you can get away with this. Contracted suppliers, though, will thrust the expensive hardware down your throat in the leasing business. Why the heck do you need a high-end Quadro for graphics work? Well, because it's the only graphics workstation we lease and it's $10k a year..

The expensive cards do have their place. Adobe work isn't part of it, although of course they can do it. There is a lot of high-precision work that gaming cards just can't do, due to the hardware they lack. In broadcast this is more noticeable, but on a technical front you need that accuracy over just looks. The theory that you can take a gaming card and it'll do ALL the professional work is a fallacy, and vice versa. Although I'd say, in the majority of cases, you can get away with it.

Reviewing and information have been pretty bad for the last few years. It's almost as if the previously excellent standards we used to get just got copied along without moving forward. Times have changed, and many things should be taken into account in addition to the existing testing. I will say that there are some places that have tried, or did try, to push the differences - the thing being that in the current climate you have to be popular. Some of the blatant inaccuracies can be quite laughable. Speaking from a hardware-testing standpoint, how some sites come to the results they get can be mind-boggling. It almost appears sometimes that they do 'some' tests, then read another review, then mush the feelings together.

"AMD drivers are known to be crap" - well, they used to be, but you often find, as with drivers for anything else, that it's more down to the user's application than the actual drivers. "nVidia cards are blisteringly hot" - well, your copy & paste works really well, because a 480 can idle at 40 degrees on air whilst overclocked, and hit 75 under full load. "X is faster than Y by 15 frames!" - erm, they both pass 100 fps in your test game, which usually isn't a new one; what's the point of buying a new card for your old game when your old card already does over 100 fps? FOR NEW GAMES - exactly, so test the new games. Oh, and look, one of them will transcode your porn to your iPhone 10x faster than the other, and faster than the i7 you bought for gaming.

Nah, you just gotta be popular, get them fans in, get the ad-clicks hopefully.

The manufacturers of cards don't give a damn about performance. As long as they beat their target market segment by 5-10% they'll be happy. Long gone are the days of 25%+ increases because, well, the software and the creativity aren't increasing in line with the hardware. A large number of games are still installing DX9 on your Windows 7 and recommending a P4 as a baseline CPU. They don't use cores - they don't know how to. Reportedly they do, but they spread themselves across the cores without pushing past about 25% each (on a quad): four lots of 25% adds up to one core's worth of work, and that's where it stops. Futuremark can do it; they should share the knowledge. Another reason is that they just don't have to. Sales will come no matter what. As soon as a reportedly fastest card on Earth (from either camp) hits the shelf, it's sold out.

"Ugh, why rebadge a card, I'm pissed off" - no, you're stupid; as with any new purchase, you do your research. Yes, the cards are updated, but if you think you're going to see a difference of more than a few percent, you're not. The market was there, though, so the manufacturers filled that slot and sold the cards. The new lines get a new ID so that the older cards - which, after all, still sell - have a base for comparison. "But I want DX11" - don't we all. Next year, maybe. There are some titles out right now and the reviewers have started taking note, but Crysis and CoD are still the basis for comparison. "Well, they're popular!" - World of Warcraft is the most popular; are you going to test that? "But, but, look at the power consumption!" - mate, you've overclocked your CPU by how much? When was the last time you underclocked your CPU to save energy? When did you get so green? The first thing you did as an enthusiast was go into your BIOS and turn off all the power-saving features Intel designed into your CPU.
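
To make the core point concrete, here is a little sketch of my own (nothing from the thread): a single-threaded loop can only ever burn one core's worth of CPU, however much the OS shuffles it around, while the same work split across processes actually loads a quad.

```python
# Quick illustration of the "4 x 25% = one core" point above - my own sketch.
# A CPU-bound single-threaded loop tops out at one core's worth of work;
# splitting it across processes is what actually uses all four cores.
import multiprocessing as mp
import time

def burn(n):
    total = 0
    for i in range(n):      # pure-Python busy work, CPU bound
        total += i * i
    return total

N = 5_000_000

if __name__ == "__main__":
    start = time.perf_counter()
    burn(4 * N)                          # all the work in one process/thread
    single = time.perf_counter() - start

    start = time.perf_counter()
    with mp.Pool(processes=4) as pool:   # same total work, spread over 4 processes
        pool.map(burn, [N] * 4)
    multi = time.perf_counter() - start

    print(f"one process: {single:.2f}s   four processes: {multi:.2f}s")
```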

No, when you've been in the labs testing raw hardware and seeing what it can do without the shackles in place, you can get pretty complacent. Some things are shocking, others not. The end results on the shelves are often a pale imitation of the PCBs you've seen with all kinds of stuff soldered onto them. Shut the hell up, test it, return it.

In a few months there'll be the HD 6xxx season; it'll beat the 4xx series by.. 10-20% maximum (probably 20% in a select number of game titles). A number of months down the line you'll see the 5xx series (well, a continuation of the 4x5 series first) and it'll do the same, but with some other gimmick. One card will come out with two GPUs and cost a fortune, then another. Further down the line (reportedly late, but it's not really - just a case of mocking the one company for no real reason other than to justify the purchase of cheaper cards) the 5xx series will come out and do another 5-15%..... and on and on. Ally yourselves with whomever you will; it's all a big game and the manufacturers are laughing. The one making massive revenue, the other making double that revenue, and yet considered to be failing? Hmm, logical.

Crossfire/SLI - never worth it, IMO, not even dual-GPU cards. That's just my opinion, though. Just get a higher-end single card than the one you have. Already have the high-end cards? Well, if you've got cash to splash, knock yourself out.

You want something? Check your cash situation and act accordingly. Do you want low-, mid- or top-end performance? You bought a 5970 for a 19" monitor? If you're doing 100 fps, do you really think you're gonna notice 110? DX11? Next year I would say definitely; right now only if you have the cash - your existing DX10 card will be fine (or DX10.1, which has extra things that no one knows how to use properly). I wouldn't give too much credence to many of the reviews out there, with some exceptions of course.
 

Of course the GTX cards won't be adequate for SolidWorks or Avid (ICE) apps. However, that is not because of some difference in architecture; it's simply a case of driver implementation. Or should I say, the intentional "breaking" of permissions.

As far as the precision and build quality of "professional workstations" go....of course you are right. I mean, my 10-year-old SGIs still work flawlessly. They are built like tanks. That does seem pointless, though; the throughput of the system will be useless long before anything breaks or needs replacing.

I, for one, would welcome the adoption of low-voltage devices. I would gladly sacrifice some artificial numbers for smarter hardware - not just to be "eco friendly" or to "save money on power." The main reason I would push for low-power devices is so that software developers can get back to writing streamlined code, rather than writing redundant, bloated code on top of redundant, bloated code.

More efficient hardware would force real breakthroughs in development, just like there is obviously a practical limit to thrust or horsepower. OK, we have already figured out ways to make things faster by simply consuming more resources....fine, but that isn't very elegant. In fact, that is a caveman mentality.

I think the MHz war has just been replaced by the "core" war.....nothing has really changed....let's just keep piling more crap on top of crap....without examining the fundamental ways we handle the data.

Programs have gotten so complex and bloated that if you were to ask a single development engineer how his program actually works...he wouldn't be able to tell you. He would only be able to say, "my responsibility is to compile the chunks that are sent to me." OK, then who sends the chunks to you? Go ask the guy who sends the chunks and he will tell you the same thing: "I dunno, man, I just assemble these pre-configured scripts."

Then you get to the heart of the underlying resources and realize that it's a 25-year-old programming language developed by Sun. It's laughable. People are standing on the shoulders of code that was written to be limited, because of limited hardware resources and limited funding at the time of development.

It's just an ongoing cycle: pile arcane code on top of arcane code until no one actually knows what is going on. It seems that these ridiculous 10-20% increases in power are really only there to handle the extra bloat.

It's a decaying model. Pretty soon you won't be able to afford to keep up. The rate of turnover will be measured in weeks, not months. Then it will get to a point where it doesn't even make sense to have retail outlets anymore, because by the time you get to a store and get home with your packaged product, it will be totally unusable with the current driver sets you can get online.

Some brave company needs to step up, buck the current trend, and design a "system on a chip" - something with limited resources and limited power consumption. Then implement a sleek, usable interface, start developing their own software....and make their language open source.

Yeah, I know that is what everyone would like to believe Apple does/did.....but that was never the case. I am sure a lot of people are thinking, "well, you just described an iPad." Not really....the iPad is just another closed-ended system. You can only write so many useless apps for a device like that before people go, "OK....I need something bigger."

Like I said, I think a brave newcomer needs to come to market and provide something truly groundbreaking. If anything, this should be a worldwide, public effort. The rate at which consumer electronics are damaging the world's economies and environment is alarming. Mark my words: four years from now you won't be able to build ANY off-the-shelf computer that is worth a damn. The only things that will keep people going to their local Fry's or e-tailer....are flashy boxes and blinking lights.

Might as well go to Vegas and just stick your money on the card table.
 
Well, in terms of the architecture, a Quadro card, for one example, essentially reads and writes to onboard memory differently from its GeForce counterparts - in the same sense that the regular memory in your PC is no good in a server (or most servers), and vice versa. Your GTX cannot provide the same accuracy.

Is this worth the extra cash alone? Well, no, but the same goes for an HP server compared to a rigged-up desktop. Sure, you can run Server 2008 on both, but whilst you're warm and comfy with your music being shared on your home network, the HP has all kinds of onboard tech that provides safety for your business model. Worth the extra thousands? Hell no, but in the same breath, these, like Quadro cards, are sold to businesses where the extra cash is slapped on "cos we can".

The iPad: for me it's like the question of the iPhone versus my HTC. Both sexy, but it comes down to who I'd offer them to. If you don't like restraints and want control of your platform out of the box, you want an HTC (Android or Windows) - heck, the Windows one is bugged to hell out of the box and needs a firmware upgrade. For someone with little or no technical sense, the iPhone is ideal. Same here: both products have their markets and I'd not say one is 'better' than the other. The iPad is outstanding for those who need a no-messing-about tablet-esque thingy. If you're tech-minded, you can either adapt it yourself or buy an offering from HP/Asus/etc., each again with its own market, each good for its purpose. People suited to the iPad/iPod/iPhone are generally happy in their sexy sandbox.

The problem we've had for the last few decades is people coming out of universities with their IT-associated credentials and getting the top jobs in the industry. It's been said for years that they don't need to know this and that, because it's seen as unnecessary. Personally, I feel there is a massive difference between those who were taught about computers and how to use them, and those who had the aptitude before they went to university. If all the references within Microsoft back to Win 3.1 were lost, they'd probably be stuck. Twenty years ago they probably wouldn't have been.

I don't see new companies breaking through in any of the tech industries. The moment anything of note raises its head, it gets swallowed by one of the bigger companies. This is a model going all the way back to why we (mainly) use 8086-compatible machines as opposed to the, heaven forbid, better designs that were around at the time. It's cash, it's contracts, it's being able to sell things that don't work 100% and being able to patch the crap out of them (then release other patches to fix what the previous ones broke).

It's all down to cash at the end of the day. One thing is for sure, though: if the end of the planet depended on something powerful enough (the equivalent of running Crysis at 2000 fps) on a small, efficient PCB, at least four companies could do it. We'd probably all die due to the badly written software based on it, but the hardware would be possible. As a business model for milking cash, it just doesn't make sense.

Still, even with the end of the world on the line, someone would try to make a buck off it.
 

Well, the Quadro/Tesla/FirePro offerings do handle data streams in a different way, as you said. This is not due to hardware (for the most part)...it mostly comes down to driver implementation, plus the willingness to throw engineers at customer-specific problems. I have an older Quadro card for SolidWorks/Avid support, and when I call tech support, an engineer who speaks English answers. If I say, "hey, your current driver doesn't work for XYZ"....then usually there is an updated driver on the Quadro user site within a week. Sure, that is a godsend. I can't argue with that.

Maybe my point in all of this discussion is best mirrored by your statement: it's all about cash. As long as it's all about cash, we will continue to see bloated code, and horribly wasteful hardware to accompany it.

I do, however, feel that this is a dying model. There are a few publicly funded think tanks...that are working on system-on-a-chip architectures. I think this will be the saving grace of the IT industry.

Until that happens (I still think it will, within a four-year timeframe)...we are stuck playing the ATI vs. Nvidia war....or the "my PSU is bigger than your PSU" game.

I have honestly considered quitting my current line of work and going back to school for a degree in low-voltage electronics design. Not because I am interested in making a fortune - hell, money is just paper - but because it would be nice to contribute something to help people.

The only trouble is, as you said, the market is flooded with book-smart "IT" guys who actually have no concept of anything other than what they read in their Cisco manuals. I wonder how a guy who has a deeper understanding than that actually lands a position in semiconductor research.

 
The physical differences in the PCBs are one of the things that come through requiring testing. The AMD pro cards and the Quadros have been in and out, as well as their cheaper alternatives. The reaction to surrounding noise, or what I'd loosely call 'EMI', is one of the things to test for. A gaming card will not stand up to the same scrutiny - actually, they don't get requested for the same tests, as they're already considered to have failed them. It makes little difference to your game, but detailed models require that fewer errors are made (it's never perfect all the time) - and "fewer" here is measured in millions per second.

The broadcasters I do work for use Avid machines, some pushing 10 years old; in all honesty they're a pile of crap - but they do the job required. Loosely built around K7-era motherboard tech with a POS graphics card, being dragged kicking and screaming out of Windows 2000 - but hey, people in the UK get their news.

Low-power and green stuff just ain't likely to be part of the big picture. Sure, it's got a heck of a lot of uses and should really be what future systems are all about - yet right now our newest i7-based workstations and G5s still fit in the same size of tower case as they did decades ago, and the market is trying to push more and more 1500 W PSUs (that hardly anyone needs). Although, on that basis, when they noticed that not many people will buy these high-power substations, the prices of lower-power PSUs went up - ironic.

Things I like to see are the PCs the size of Mac minis. Acer and other manufacturers run lines of these with Ion and Atom - stuff the average person can really make use of very easily. Intel really flooded the market with the variety of CPUs this last socket generation - again, this was a cash thing. We reckoned it'd be cheaper at the low end; who were we kidding? They just adopted the Microsoft Premium/Ultimate/Professional/Home way of doing things.

The war on the GPU front is mostly in the media. It's just words; it doesn't mean anything. The odd power phrase makes people feel gooey about their purchases and about how much better off they can feel spending their cash. Then they can recycle these soundbites and phrases.

It's commendable to think about studying tech for the good of others or the industry. In all honesty, though, everything will fall back to the cash thing. I'd definitely do it for yourself, though.
 

The testing we have done in our various studios has shown that the better EFI/EMI/RFI stats are due to better transformer shielding. Usually via better mu-metal casings, better windings on the toroids (the doughnuts), or better epoxy immersion. Same concept as a high-current amplifier really. It's all about transformer damping and reducing ring oscillation.

We essentially ripped apart some Quadros, only to realize that they were just stacked PCBs with essentially the same chips as their "gaming" counterparts. The one thing we did discover was much, much better RAM layouts and higher-quality bins on the chips. Of course, we also noticed that the Quadro offerings were put through better QA. I am not sure how much all of that is really worth though. At the end of the day, the lifespan in a high-end production environment is really only 2 to 4 years now.

Yeah, old Avid rigs are massive piles of crap. I still have one running on NT. Avid also had a big issue with Apple when Apple released FCP. There was a period where Avid basically boycotted Apple systems and dropped all support. Interestingly, that is right around the time everyone in post moved to FCP. Oops, Avid!

For on-air stuff though, Avid is still king. Probably because no one wants the downtime of switching over to something better.
laugh.gif


I am not so much interested in "green" tech. I think green is just another marketing tool. What interests me about LV ICs is that they force a system to live within strict limits. Systems that are forced to live within strict limits by circumstance wind up developing smarter, better-written software packages. As I mentioned earlier, look at the elegance of assembly language for something like a CNC mill. It's vastly complex, yet it fits in a 500K package.

That is really what interests me about LV ICs - not the environmental aspects. I am not under some illusion that anything I could engineer would stop the burning of coal. That is just silly.

You know, I have been at points in my career where I have had more cash than I knew what to do with, and points where I have been so broke I had to move back in with my parents. So money isn't anything I am impressed or concerned with. It would be nice, though, to make some contributions to the education of society as a whole. The planet will evolve with or without us. It would be nice to figure out how to engineer some solutions that help that evolution, significant or not.

BTW what do you do in broadcast? I am in Production/Engineering (new media, audio, video)

wink.gif
 
We found that the different build on the pro cards meant they were able to maintain accuracy despite any outside interference, whilst gaming PCBs just can't and aren't expected to. Generally, gaming cards come without any shielding, or at least with something that looks like traditional shielding but doesn't actually do anything - kinda like some sound cards (something reviewers also don't test, which they should when they're paired with high-end gaming GPUs). A lot of the coverage is in the makeup of the PCB itself and the laying of the tracks.

Meh, I think budgets definitely come into it. I mean, I can rant about some of the Avid stuff being used being a pile of crap, and similarly the TGV gear, but in reality it's gear that's been out of date for a heck of a long time. Whilst their support is questionable, their newer gear is pretty good and would be much better for the users - but hey, suits deal with the contracts, just as suits let out a bunch of information that technically isn't correct and places like review sites latch onto it.

Spookily enough, I contract for a fairly well-known Broadcasting Corporation that happens to be British (I say contract - I've been there almost 5 years and will probably be there for the next 5). I don't like the term IT being near my name, and thankfully they renamed and re-jigged their department as "Technology", which means nothing to the users as they still call it IT when they want a computer thing and Engineering when it's audio/visual. IT for me should mean a department educated in the ways of using a computer - as opposed to a Computer department that just plain and simple knows the ins and outs of computers.

What I do for them is mainly fix stuff and help along those who spent 4 years or so learning what a PC can do after you switch it on. I also carry out electrical testing and advise on that front.

In other capacities, I/we get sent "stuff" with documents, essentially do what the paperwork says and asks for, then return the kit. This is mostly for fun, as you get your hands on some pretty mysterious gear, and it doesn't take up much of my time - with my BBC work only taking up 35 hours a week, it's pretty easy to fit in.
 

I have never done any testing on hard faults with the Quadros. I imagine they would be far superior to their GTX/GTS counterparts, simply because of the better transformer isolation. I suppose in a certain set of circumstances the GTX/GTS series would be unusable. You do still run into the issue of cosmic-ray bit flips, however. No, not being sarcastic. Even though the chances are virtually immeasurable, it would be troublesome to have a large render go down because of the lack of a checksum. For apps like Photoshop/Premiere though, which a lot of "gamers" use to edit their benchmark videos (LOL), the benefits of CUDA and MPE would be HUGE. I still find it odd that those users don't understand, or flat out disregard, the more capable Nvidia offerings.
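
Just to illustrate what I mean by a checksum catching a silent flip - this is only a rough Python sketch of the idea (the chunk size and the "render_master.mov" filename are made up, and this is not how Premiere/MPE actually handles it):

# Rough sketch: hash a big render in fixed-size chunks so a silent bit flip
# shows up as a digest mismatch later. Filename and chunk size are placeholders.
import hashlib

CHUNK = 64 * 1024 * 1024  # 64 MB per chunk - arbitrary choice

def chunk_hashes(path):
    """Return one SHA-256 digest per chunk of the file."""
    digests = []
    with open(path, "rb") as f:
        while True:
            block = f.read(CHUNK)
            if not block:
                break
            digests.append(hashlib.sha256(block).hexdigest())
    return digests

# Hash the render right after it finishes, then again before delivery.
# Any chunk whose digest changed got corrupted somewhere along the way.
before = chunk_hashes("render_master.mov")
after = chunk_hashes("render_master.mov")
flipped = [i for i, (a, b) in enumerate(zip(before, after)) if a != b]
print("corrupted chunks:", flipped if flipped else "none")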

I have noticed that the reference Nvidia 480s have a VERY strong PCB design. Very good layering and excellent trace layout. You do bring up a very good point though. As a test, I put a GTX 480 alongside one of my Pro Tools HD cards, just to see if there was any measurable noise added. I did some pretty comprehensive waveform analysis and found nothing below the typical 60 dB noise floor. However, here is where things got interesting: when comparing the raw bitstreams in a hex editor, we did notice some very odd blocks, which I will simply call errors for lack of a better description. Thank god for oversampling and the generally good buffering/checksums from the Digi hardware.
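
For anyone curious, here is roughly what I was doing by eye in the hex editor, as a quick Python sketch - the filenames and the 512-byte block size are placeholders, not my actual capture setup:

# Compare two raw capture files block by block and report the offsets that
# differ. Filenames and block size are illustrative placeholders.
def diff_blocks(path_a, path_b, block=512):
    bad_offsets = []
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        offset = 0
        while True:
            a = fa.read(block)
            b = fb.read(block)
            if not a and not b:
                break
            if a != b:
                bad_offsets.append(offset)
            offset += block
    return bad_offsets

mismatches = diff_blocks("capture_baseline.raw", "capture_with_gtx480.raw")
print(len(mismatches), "differing blocks; first few at offsets", mismatches[:5])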

It's a tough balancing act to run a profitable operation, especially in a saturated market. That was kind of my reasoning behind getting into LV IC design. It's a very small market as of yet. Lots of potential for growth. Not to mention it would be very cool to be involved in the developmental aspects. I mean, what better way to get your hands on cutting-edge hardware - developmental systems used to design developmental systems. Hah, the supercomputer that designs the supercomputer. I have seen/owned/gotten to play with some pretty insane toys in A/V production, but I have a feeling they pale in comparison to some of the stuff sitting in semiconductor research labs.

Like you said, either way it would be very cool to know it. Also, the program I have been looking into gives you full access to a prototyping lab. You should see some of the gear in there. A large-format broadcast console looks like a toy compared to some of the 3D prototyping gear in those labs.

I have always liked to leave the figuring-out-how-to-make-money part up to someone else. I am no good at that. I get money...I spend it. Pretty simple-minded when it comes to that.
biggrin.gif
 