So I just watched Her and it gave me a deep thought. Will man ever create an artificial consciousness, or is it impossible? Anyone have any ideas?
Someone will, simply because they can. In fact, the more people who say they can't/shouldn't do it, the more likely it becomes. That's just human nature, seems to me.
I think that according to Moore's law, if you were to break down everything the human brain does into binary CPU tasks, then by 2035 the average desktop computer will have the "power" of the human brain.
Obviously a binary CPU doesn't exactly emulate the functions of the human brain and would have to be programmed in order to do so.
For a while universities were experimenting with "neural networks", which were computers that physically simulated how brains work (see Terminator 2: "My CPU is a neural net processor"). Since then, however, binary computers have gotten so much more powerful that it's just easier to simulate the function of a neural network in software.
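To illustrate that last point: "simulating a neural network in software" just means doing weighted sums and squashing functions in ordinary code. Here's a minimal sketch in Python; the weights are made-up illustrative values (hand-picked so the tiny network computes XOR), not anything a real system uses:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum passed through a sigmoid."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid activation

def tiny_network(x1, x2):
    """Two hidden neurons feeding one output neuron (computes XOR)."""
    h1 = neuron([x1, x2], [20.0, 20.0], -10.0)    # behaves roughly like OR
    h2 = neuron([x1, x2], [-20.0, -20.0], 30.0)   # behaves roughly like NAND
    return neuron([h1, h2], [20.0, 20.0], -30.0)  # OR AND NAND -> XOR

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(tiny_network(a, b)))
```

Real networks learn their weights from data instead of having them hand-picked, but the underlying computation is exactly this kind of arithmetic, which is why general-purpose CPUs can emulate it.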
Shit gets scary in the 2060-2070 range, when the average consumer-grade desktop CPU will have the computing power of every human brain on the planet combined. So if by then somebody has created true "living" software to go along with it, we're pretty much fucked.
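For what it's worth, you can run this extrapolation yourself. The sketch below is a back-of-the-envelope in Python; all the inputs are assumptions, not established figures (a doubling every ~2 years, a 2014 desktop at ~1e11 FLOPS, a human brain at ~1e16 ops/s, which is a commonly cited but contested estimate, and ~7 billion brains):

```python
import math

def year_reached(target_ops, start_year=2014, start_ops=1e11, doubling_years=2.0):
    """Year at which a doubling curve starting at start_ops hits target_ops."""
    doublings = math.log2(target_ops / start_ops)
    return start_year + doublings * doubling_years

ONE_BRAIN = 1e16              # assumed ops/s of one human brain
ALL_BRAINS = ONE_BRAIN * 7e9  # ~7 billion brains

print(round(year_reached(ONE_BRAIN)))   # one brain's worth of compute
print(round(year_reached(ALL_BRAINS)))  # every brain combined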
The problem with the "doomsday clock" is that it's based on 'nuclear stuff'.
The fact is, not all physicists actually support the idea of a "nuclear winter" style doomsday. Media sensationalism promoted it into the mainstream dialogue (due to a paper by Carl Sagan, who was already the beloved pop-culture TV scientist by then), but it is actually a broadly contested idea. The earliest papers on the topic based their projections of radioactive fallout on surface detonations, but tactical nuclear weapons have since been designed primarily for air detonations, which leave nowhere near as high a concentration of local fallout as a surface detonation would; the fallout would instead be spread relatively dilutely throughout the atmosphere. So the primary consequence of a nuclear war for people outside the "hot zones" might at worst be increased cancer rates and cancelled travel plans.
And when you just calculate the raw energy of the situation, we're pumping more energy into the atmosphere by burning fossil fuels each year than we would if we "dropped the bomb" (and "dropping the bomb" would significantly reduce the future use of fossil fuels, to put it lightly). So other than killing lots of people, things would actually balance out in the long run.
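The raw-energy claim can be checked with quick arithmetic. The figures below are rough public estimates I'm plugging in for illustration, not numbers from the post: global fossil-fuel primary energy of roughly 4.5e20 J/year, and a total world arsenal of roughly 2,500 megatons of TNT (closer to Cold War peak than today's stockpile):

```python
MT_TNT_JOULES = 4.184e15                 # energy released by 1 megaton of TNT
arsenal_joules = 2500 * MT_TNT_JOULES    # whole assumed arsenal, ~1e19 J
fossil_joules_per_year = 4.5e20          # assumed annual fossil-fuel energy

ratio = fossil_joules_per_year / arsenal_joules
print(f"Fossil fuels release roughly {ratio:.0f}x the arsenal's energy per year")
```

Under those assumptions, annual fossil-fuel combustion releases on the order of forty times the energy of the entire arsenal, so in pure energy terms the comparison in the post holds up (climate effects of soot and CO2 are a separate question from raw joules).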
So the bottom line is, the doomsday clock is full of shit. There are far more effective ways we could potentially end our civilization that it doesn't take into account.
By the time we create AI, the Reapers will come and kill us all.
Even we know we are the cancer of this planet. It wouldn't take long for an A.I. to figure that out and kill us all. Haven't you seen those movies?
I am personally working on the code that will make this a reality.
Give it 20 years and we will be working out the final kinks.
Call me Cassandra
Actually, scientists have recently been working on a 100% conductive two-dimensional tin-based structure for potential computing applications (stanene).
And they theorize the possibility of modifying the design to maintain that 100% conductivity at temperatures as high as 100°C.
http://www.extremetech.com/extreme/1...-silicon-chips
Really quite interesting stuff.
That's a good question and won the thread.
On topic: the future of AI is understanding the human brain. There are many fields researching it (artificial neural networks, computational neuroscience, cognitive science, and so on). We may have reasonably good AI in the future, but as good as our brain? If we do not wipe ourselves out, we will eventually get there at some point.
The problem is you have absolutely no way of knowing what the AI would value, what it would think, or what it would do. That's why the capability of the AI will be severely limited in most (if not all) cases. So you can make a packaging robot, and as much as it might learn about other things, it will still only have tools effective at packaging things.