Human V2.0

Jaster

New member
With the imminent release of the OCZ Neural Impulse Actuator, this is a huge jump towards the "Technological Singularity". For people who have never heard of it, this is simply the point at which computers equal the brain in terms of power, and also the point at which we understand how the brain works in its entirety. It's a point where our consciousness might be able to live on in machines, a point where all theology and morality may cease to exist. Heavy stuff, right?

Let's see if we can all come up with an idea or a theory, no matter how wacky, about how the future of computing is going to end up. Will it be our downfall, or our transcendence to another plane of existence? Are we going to enslave monkeys to run the power plants our equipment needs because we all actually live in cyberspace? Let's call this "OC3D Theory"... if it's any good we might even be able to get someone to credit or publish it... they publish any old crap... at least let's give people an ideology people can laugh out loud at...

For more information on the Singularity, watch this:

Human v2.0
 
The technological singularity is a hypothesised point in the future, variously characterized by the technological creation of self-improving intelligence, unprecedentedly rapid technological progress, or some combination of the two.

The idea of technology being self-improving isn't really helped or hindered by the OCZ Neural Impulse Actuator. (Or am I missing something?)

I guess it may be seen as technological progress.

From what I've read and seen, the interface is still quite basic. Especially looking at it in terms of evolution or Technological Singularity. It may be a step in the right direction. (the videos I've seen still show people using a mouse in conjunction with it)

The idea of Technological Singularity still seems unchanged and is still very much science fiction.

If anything, the evolution of computers is slowing down. Look at clock speeds as an example: we're now moving to more cores rather than higher clocks.
 
name='equk' said:
If anything, the evolution of computers is slowing down. Look at clock speeds as an example: we're now moving to more cores rather than higher clocks.

Newer chips do more work per clock cycle, allowing you to drop the actual clock speed but keep the same performance.

A 1.8GHz Core 2 Duo would beat a 3GHz Pentium 4 with ease. ;)
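The clock-speed-versus-per-clock-work point can be sketched with a toy throughput model. The IPC figures below are illustrative guesses for the sake of the sketch, not measured benchmarks:

```python
# Toy model: performance ≈ work-per-clock (IPC) × clock × cores.
# The IPC numbers here are illustrative assumptions, not measurements.
def throughput(ipc, ghz, cores=1):
    """Return a rough 'billions of instructions per second' figure."""
    return ipc * ghz * cores

pentium4 = throughput(ipc=1.0, ghz=3.0)           # long pipeline, low IPC
core2duo = throughput(ipc=2.5, ghz=1.8, cores=2)  # higher IPC, two cores

print(core2duo > pentium4)  # True under these assumed figures
```

Under these assumed numbers the lower-clocked chip comes out roughly three times ahead, which is the point: clock speed alone stopped being a useful yardstick.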
 
The NIA is designed on technology brought about by the monkey experiment in that documentary, which showed a direct link between our brains and a machine. Yes, it's a very basic approach, but it still works towards the essence of the TS; to say it is a work of science fiction is kind of naive, as I think all the scientists in the world working towards this end would agree. Also, you are looking at CPU speed as a measure of power; not so. Look at the CUDA project (NVIDIA's GPU computing platform). I think the best description of this was something as follows:

"Give a PC CPU a book to read and it will assign alternate pages to each core reading in sequential order, give a current GPU a book to read it will rip the book up throw it in the air and read every piece simultaneously"

Read the pieces on the Russian hacking software: current PC CPU technology is barely being touched in terms of hardware development versus software manipulation, and we are only just starting on applications designed for multiple cores. Abstract hardware layers for GPUs are where true "power" is harnessed, and I think most people on this forum will agree that it's their CPUs bottlenecking their systems and not their graphics cards. Next you'll be saying Moore's law is twaddle as well :D...
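The "book" analogy above can be sketched in plain Python, using a thread pool as a stand-in for the GPU's many lanes. This is only an illustration of the data-parallel idea, not actual CUDA; the book, page contents, and worker count are all made up for the sketch:

```python
from concurrent.futures import ThreadPoolExecutor

def read_page(page):
    # Stand-in for the per-page work: count the words on the page.
    return len(page.split())

# A toy "book" of 100 pages; page i holds 2 * (i + 1) words.
book = [f"page {i} " * (i + 1) for i in range(100)]

# CPU-style: one reader works through the pages in order.
sequential_total = sum(read_page(p) for p in book)

# GPU-style: tear the book up and hand every page out at once.
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel_total = sum(pool.map(read_page, book))

print(sequential_total == parallel_total)  # same answer either way
```

The result is identical; the only difference is whether the pages are read one after another or all at once, which is exactly the CPU/GPU split the quote describes.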

It's an interesting development and ideology, and the estimated date is still 2029... that hasn't changed in 6 years (the number is based on Moore's law, though)...
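The arithmetic behind a Moore's-law-style date can be sketched as compound doubling. The start year and the 2-year doubling period here are assumptions for the sake of the back-of-envelope, not figures from the thread:

```python
# Hypothetical back-of-envelope behind a Moore's-law-style estimate:
# assume usable compute doubles every 2 years (the doubling period is
# itself an assumption) and count the doublings from 2008 to 2029.
start_year, target_year = 2008, 2029
doubling_period_years = 2

doublings = (target_year - start_year) / doubling_period_years
growth = 2 ** doublings

print(f"{doublings:.1f} doublings -> roughly {growth:.0f}x the compute")
```

Small changes to the assumed doubling period swing the final factor enormously, which is why such estimates stay vague even when the headline year stays put.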
 
name='Toxcity' said:
Newer chips do more work per clock cycle, allowing you to drop the actual clock speed but keep the same performance.

A 1.8GHz Core 2 Duo would beat a 3GHz Pentium 4 with ease. ;)

True

But the jumps in computer evolution seem to be regressing more and more.

Con Kolivas said:
We're still plugging in faster CPUs, more RAM, bigger hard drives, faster graphics cards and sound cards just to service the operating system. Hardware driven innovation cannot be afforded by the market any more. There is no money in it. There will be no market for it. Computers are boring.

Computers of today may be 1,000 times faster than they were a decade ago, yet the things that matter are slower.

The standard argument people give me in response is "but they do so much more these days, it isn't a fair comparison". Well, they're 10 times slower despite being 1,000 times faster, so they must be doing 10,000 times as many things. Clearly the 10,000 times more things they're doing are all in the wrong place.
 
lolz, saving coursework onto an atom and taking it to college, MISS MISS I LOST MY ATOM! :o

(Atom being a real atom not intel's atom) ;)
 
name='Jaster' said:
It's an interesting development and ideology

I agree with that; it's possible that in the future mankind will develop self-improving machines. It's quite amazing to think about.

The OCZ thing proves that machines NEED input in order to work, and it doesn't make a computer any more viable as self-improving than plugging a keyboard or mouse into it does.
 
At the end of the day, computers themselves have been designing CPUs for nearly two decades (do you honestly think your average VLSI genius can pinpoint 1 billion transistors and measure voltage drops across the CPU at any given point?). However, the software is usually created by man, and the software is always the weakest link, letting developers become lazier and lazier; until hardware can write its own software, this situation isn't going to improve. Why do we see generations of games on one console? Because developers get better at working with it over time.

And face it, the IBM-compatible machine is a technology that was originally only designed to last for 5 years; something radical will need to be done to change the way we look at a computer, what true power is, and how to utilise it correctly. I really think CUDA and its developments may actually force the PC industry to change to a design that is less flawed, with better development, harking back somewhat to the Commodore Amiga era, when shareware and development were rife and exact, with good software innovation. Most programming courses now teach you in the first lesson that "you're not here to re-invent the wheel"; software engineering needs to be tackled in a big way.
 
name='Jaster' said:
I really think CUDA and its developments may actually force the PC industry to change to a design that is less flawed, with better development, harking back somewhat to the Commodore Amiga era, when shareware and development were rife and exact, with good software innovation.

I agree: if software could keep up with hardware development, things would be completely different. As you said (and as Con Kolivas says in the article I quoted), look at what a Commodore or an Amiga could do with such a low-spec system.

name='Jaster' said:
Most programming courses now teach you in the first lesson that "you're not here to re-invent the wheel"; software engineering needs to be tackled in a big way.

I had that :) haha

It would be good if software & software development could be accelerated hugely.

That's why I quoted Con Kolivas
 
The only way would be if machines could code themselves, as in the singularity; the logic of this can go round in circles for days. I think we're all in agreement, though, that something radical on the software front needs to be done. Even Microsoft want developers to start multi-core optimising. Lazy software engineers, too busy living it large and spending too much time in strip joints to do their work properly :D
 
name='PP Mguire' said:
I personally don't like the idea of computers thinking for themselves. I, Robot comes to mind as to why.

And I bet that's what God said when he created man (or the Supreme Being, as I prefer to think of him; or Geoff, as he lets me call him) :P
 
Well, if that's just evolution and you don't like it, that equates you to a person heckling people as they were burnt at the stake for heresy, when it was just science and knowledge... keep in mind the three laws... :D ... Seriously though, things are changing, and for the better... it's better to embrace than to deny... that's what the caveman who tried to stop the wheel realised when the wheel ran over him... :D
 
Artificial being = artilect... failsafes... it's what's needed. The AI may be near, but the robotics are decades off... unless the AI designs them... then we're up the creek without a paddle.
 