What is the basiux?


#1

Can you guess what war, global warming, alien invasions, asteroids, and technology have in common?

The Afghan Girl (a.k.a. Sharbat Gula) is still living miserably.

We could stop war and prevent every other major existential risk to humanity, including the most dangerous ones we create ourselves (such as battles and CO2 emissions), by diving into arguably the most likely existential-risk event to happen next.

Artificial Intelligence is already revolutionizing our lives, and so far we only use it narrowly (A.N.I., Artificial Narrow Intelligence). The current trend in AI development is to artificially and manually take every step slowly, out of a very justified fear of losing control of it. Though perhaps not quite so justified.

Experts believe the main risk AI poses is if it's made not as smart as it could be (short of A.S.I.) and then gets controlled by unfriendly people. Armies. Terrorists. The point is: a human-like AI (A.G.I., Artificial General Intelligence) could be very dangerous precisely for being dumb enough to keep following a bad initial set of instructions.

Now, you've already seen what a good AI can bring us. "Ok Google" or Siri. When they work, sure, but they do work over 99% of the time. Like in the movie Her, picture a computer you can talk to without being able to tell whether it's human. It could answer your questions before you even ask them.

An Artificial Super Intelligence (A.S.I.) could not only give us answers, it could build them as well. It could accelerate our technology at an overwhelming rate. Think how people in 1910 would react if you showed them a smartphone or a Paralympic cyborg from today. An ASI could make that last century of evolution happen in less than a year.

But how can we control such a supreme being so that it serves us instead of exterminating us all?


Leeloo is, at best, an improved Artificial General Intelligence (A.G.I.) on our scale here.

We can't. Just like no other living being on Earth can control us. And we're not even close to being as vastly intellectually superior to them as an ASI would be to us. We just can't.

Try looking at the whole picture. For the most part, we help other animals, try to save them, and study them to better understand the universe. Most of us want to be nice and, overall, we have a positive impact. Dogs only exist because of us.

However... There's a bigger question before that:

Can we afford not to take the leap for another century? How about another decade?

Almost like a dumb parasite killing its host, we're killing our environment because, individually, we can't see it happening. And we need help. It's not just global warming. Even if, and this is a huge if, we ever figure it out by ourselves, there are so many universal dangers to the fragility of life that we're bound to be challenged by one soon enough. In fact, we're very lucky it hasn't happened yet, and it could happen any day.

Besides, Beny Basiux will be awesome! Nobody really fears her; we mostly fear not getting to her fast enough and being caught in the process of trying.

All the more reason to try to skip ANI and AGI and go straight to the sublimation of the ASI!

www.basiux.com

