Transhumanism should be treated the way drugs should be treated: freely available as a product/service to those who want it. Some people tend to forget they have the right not to do or take something. Let's bring back personal responsibility!
What if he didn't have a choice?
- - - Updated - - -
Whenever I see this question pop up, I always think about the first man who managed to bang the right two rocks together to make a spark and start a fire. Another caveman stamps it out, saying "NO!!!! That's not ours, it's the work of the gods!"
Millions of years later... man is still sitting around doing whatever, no fire, no tech advances... the sun is growing and dying. Some lizard is sitting on a branch and goes, "Darn humans, if only they'd played with fire they could have saved us all."
I agree, but let's say that Russia, China, and the USA are equally far along in creating AI.
Progress will be kept top secret. And no one will risk letting the others get it first.
That means hoping for countries to be careful with this won't work out. They will just try to get it out before the others do.
And if they somehow decide to cooperate, other, less cooperative countries might catch up. And whoops, suddenly North Korea has invisible tanks and planes and nano supersoldiers who can move through walls.
This also applies to quantum computers.
Whoever gets it first will become the ultimate superpower.
It's a race to dominate the future and there will only be one winner.
I wonder when we will genetically engineer the first cat girl maid!
True path of science!
When the consequences of your choices are to either remain part of a relevant society, or to be cast away entirely as a trash-like being... You don't have much of a choice, do you?
"So, here is that thing that costs *way more than you can afford*, which will increase your mental and physical capabilities way beyond what normal humans can do. But sure, you're also free to not do it."
That doesn't sound good.
I don't think it is playing god. Just taking the next step. I mean, it's not a whole lot different than the guy who first thought up how to splint a broken bone to make it heal faster. Or the guy who figured out that if you run a lot, it gets easier to run. All these things could have been left to "god" to work out. Now we can put a chip into someone's brain to make them see. The bar has moved. The basic idea is the same.
Bro, download my brain into a cybernetic human body ASAP. Let's get this shit started.
On AI,
I think it would be a grave mistake to create a human-like AI. Not only would it be dangerous to have such a potentially powerful entity 'gifted' with human qualities like greed and hate, but it could also be unethical (how would a human mind react to being put in a silicon box?).
As long as we don't give the AI these human qualities, or the survival instinct (self-preservation) itself, I think we'll be safe. However, some would argue that an AI may reach a certain point where it begins to 'evolve' on its own, and who's to say it wouldn't develop a survival instinct?
"In order to maintain a tolerant society, the society must be intolerant of intolerance." Paradox of tolerance
This is the sort of thing I don't understand.
https://biohackinfo.com/news-weather...aguas-surgery/
It's also an unlikely scenario. Things that cost way more than people can afford don't become part of the fabric of society until the price comes down to a more reasonable number. You can see that with VR: there have been VR rigs of various types for as long as I've been alive, but they were bulky and expensive and impractical to the point where nobody really actually used them. It's only recently that the technology has reached a point where it's obtainable by the average person, and it's /still/ taking years to slowly work its way into households.
And transhumanism does exist right now in various forms, too. There are bionic feet and arms, and rigs that translate brainwaves into electric signals to remotely control things. None of these are even close to the point where a sane person would willingly replace a working limb with them, but that's a question of refining technology and bringing down costs more than inventing something completely new at this point.
Technology doesn't impose a world upon the people, people accept technology into their world (or don't, sometimes). A split world between transhuman haves and base human have-nots only occurs if people /make/ it happen that way, and that has way more to do with the people than it does with the technology.
It honestly doesn't tend to be religious people so much as environmentalists. Say you genetically engineer a tree to grow in a more aesthetically pleasing way. Say it turns out that came with a fatal flaw, unforeseen until maturation. Say that it's not a sterile, one-and-done lifeform like many current-day GMO cultured plants. Say that its seeds get out into the wild. Say they get into an ecosystem and totally upend it by affecting the local food chain, or are far worse for people with allergies to that kind of tree, or simply lead to natural crossbreeding that kills off the natural version, which also turned out to be used in the production of some medicine or something. Now try to get every seed/plant back and be 100% certain you've contained it. It's not as hard as trying to recapture every particle of a manmade virus or something, but still. It's a very firm genie-out-of-the-bottle scenario. One that drives environmentalists crazy, and honestly it's a genuine worry that should lead to very strict containment and development precautions.
Compared to that, human augmentation is at worst going to be another gun-law situation, if you can get an arm strong enough to take a guy's head off or something.
Sure, they have the right not to do something. It's just that there are game-theory-esque reasons that even the concept of equality of opportunity will break down if only some people are able to access this kind of advancement, and a lot of our social fabric is based on the (imaginary) belief in that kind of fairness. I'd predict more reckless and risk-prone behavior in such a system.
- - - Updated - - -
I don't see the basic idea as the important part (that's like pointing out that climate change has always existed). The part I'm wondering about is when that bar has moved too far, and when it reaches some kind of critical threshold.