
The Singularity Challenge

I lead a silly life, and this journal has largely been a record of silly things. But about this, I'm very serious and I hope you'll lend me your attention for just a few minutes.

Mankind's future is inextricably intertwined with technology. Human physiology has changed relatively little over the millennia, but our capacity to change the world has grown exponentially because of the tools that we create. The vast power with which we reshape our environment, and indeed the whole world, comes not from muscle and bone but from minds and tools. And there's something big coming in the near future that's going to make gunpowder, the printing press, flight, and computers seem quaint. Some futurists call it "The Singularity". What's the singularity?

Our capacity to invent new things is limited by our intelligence and our knowledge. Each successive generation has the accumulated knowledge of the previous generations to build on, and so that progression is linear. But human intelligence itself has not much changed - certainly no comparable progression. The smartest person in the world today might be marginally smarter than the smartest person in the world 5,000 years ago. (Then again, he might not be.) For the most part, human intelligence is a fixed quantity within a range that has not varied. That's about to change.

Sometime in the near future, we'll create a computer that's as smart as a human being. With the advent of quantum computing and ever-increasing computing power, we get closer and closer to fully modelling the human brain. Even today, computer scientists have successfully modelled the brain of a mouse in its entirety. The whole thing! A virtual mouse-brain! It ran slowly - much slower than an actual mouse-brain - but it worked. From there it's just a matter of Moore's law catching up with the difference between mouse and man. This is exciting for a hundred reasons - modelling cures for neurological afflictions, for instance, with experiments that would be unethical on a human but harmless on a virtual model.
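The "Moore's law catching up" step is just back-of-the-envelope arithmetic, and can be sketched as follows. To be clear, none of these numbers come from the post: the neuron counts are rough order-of-magnitude figures, the two-year doubling period is one common reading of Moore's law, and neuron count is used as a crude stand-in for required computing power.

```python
import math

# Hypothetical, order-of-magnitude figures -- assumptions, not measurements.
MOUSE_NEURONS = 7.1e7    # roughly 71 million neurons in a mouse brain
HUMAN_NEURONS = 8.6e10   # roughly 86 billion neurons in a human brain
DOUBLING_YEARS = 2.0     # assume compute doubles about every two years

ratio = HUMAN_NEURONS / MOUSE_NEURONS   # how much more compute is needed
doublings = math.log2(ratio)            # doublings required to close the gap
years = doublings * DOUBLING_YEARS

print(f"compute gap: ~{ratio:.0f}x")
print(f"doublings needed: ~{doublings:.1f}")
print(f"years at this pace: ~{years:.0f}")
```

Under these assumptions the gap is roughly a thousandfold - about ten doublings, or around twenty years - which is why the argument treats mouse-to-man as a waiting game rather than a conceptual leap.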

But that's not even 1/1000th of what's exciting about this. No, it's the singularity that's thrilling - because there will come a moment when mankind creates an intelligence greater than his own. Intelligence has always been the limiting factor on human progress - but no longer. Soon there will be effectively no limit to human progress, because the first thing a higher-order intelligence could do is design an even higher-order intelligence. Each successive generation would be more intelligent, and thus capable of making a bigger leap than the generation before it. That means the growth of intelligence will be exponential, not linear. The change and progress of the last thousand years will be as nothing compared to the changes of the next 100.
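The linear-versus-exponential contrast above can be made concrete with a toy model. Nothing here comes from the post: "intelligence" is reduced to a single number purely for illustration, and the 10% gain per generation is an arbitrary assumption.

```python
# Toy model of the two growth regimes described above.
# Linear: knowledge accumulates by a fixed increment per generation.
# Recursive: each generation improves in proportion to its own ability,
# so a smarter designer produces a proportionally smarter successor.

def linear_growth(start, increment, generations):
    """Fixed gain per generation (accumulating knowledge)."""
    return start + increment * generations

def recursive_growth(start, gain, generations):
    """Proportional gain per generation (self-improving designers)."""
    level = start
    for _ in range(generations):
        level *= 1 + gain   # each generation amplifies the next
    return level

for gen in (10, 50, 100):
    lin = linear_growth(1.0, 0.1, gen)
    rec = recursive_growth(1.0, 0.1, gen)
    print(f"gen {gen:3d}: linear {lin:7.1f}  recursive {rec:10.1f}")
```

Even with a modest 10% compounding gain, the recursive curve dwarfs the linear one after 100 generations - that divergence is the whole force of the argument.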

This is the most important thing that will have happened in human history, and you can be a part of it. This isn't science fiction, and it's entirely possible it will happen in your lifetime. Are the hairs on the back of your neck standing up? They are for me. Limitless power. Instantaneous travel. No more aging or disease. Humanity will be forever altered by the singularity - and there are many people working steadily towards making it happen. Today, the Singularity Institute for Artificial Intelligence announced that Peter Thiel (co-founder of PayPal and a prominent philanthropist) will match up to $400,000 in contributions to their work to bring about the Singularity.

The singularity isn't just the pipe-dream of a bunch of Slashdotters and transhumanists - there have been summits at Stanford, books written, and a lot of serious academic thought about how the future of AI can be made to serve, rather than threaten, humanity. I'm a big believer in this. I'm going to contribute, and I hope you will too.


May. 10th, 2007 06:37 pm (UTC)
The hairs on the back of my neck are standing up, but not for the reason you think. This is for more than a few reasons:

1. Given current inequalities in the world - not just in intelligence but also social and economic - this will further divide people rather than lift people as a whole. If the world were so benevolent, I would like to see this happen. But most people who are not connected to the internet or who lack an advanced education are not going to see past their own basic needs for survival and immediate wants. Lifting the intelligence of humankind does not necessarily produce a bias towards compassion. Such a project will still require stewards and will be marked by individual appetites for power.

2. With such inequality, there will be a scenario where AI is used to do society's thinking, since it will be presented that these structures "know better" than the inventors themselves. An individual will not be permitted to learn anything naturally or through trial and error. People tend to move towards convenience, and a society infused with AI will have its thoughts thought for it, rather than people thinking for themselves.

3. In a capitalist society, no invention or progress can be attained without financial backing. Should the singularity happen, there is no doubt that corporations will use such agendas to garner profit and significantly influence thought processes. Battles over artificial intelligence will often be linked to intellectual property, possibly halting the very progress that you would like to see happen.

Those are my concerns. I like the idea of the Singularity but I do not trust anyone to carry it out so benevolently.
May. 10th, 2007 06:43 pm (UTC)
Inoculations, light-bulbs, airplanes, telephones - regardless of individual access on a day-to-day basis, these have made the world better for pretty much everybody.

The rate at which people profit may be uneven, but the point is that the progress does occur. Should researchers not develop treatments for cancer because someone in Sub-Saharan Africa won't have access to them? Should we scrap developments in space travel because not everyone will be able to jump on a shuttle?

So this is important - you could bury your head in the sand and leave the Singularity to well-funded corporations and governments, or you could support ethical, non-profit organizations that seek to develop and guide the process. Which is superior?
May. 10th, 2007 06:52 pm (UTC)
So this is important - you could bury your head in the sand and leave the Singularity to well-funded corporations and governments, or you could support ethical, non-profit organizations that seek to develop and guide the process. Which is superior?

In an ideal world, the Singularity would be brought about by a combination of corporations, government, and an outside organization (an NPO) in a checks-and-balances scenario. Funding and research would occur corporately; government would regulate to ensure that reasonable profit and investment are made; the NPO would counter-regulate (is that a word?) both, making information public and accessible to people who want to know about and participate in it.

But your point about uneven profit from the various inventions is well made.
May. 10th, 2007 06:56 pm (UTC)
Here is a chance for you to encourage the NPOs in guiding a human-friendly and ethical approach to a world-changing event.

Like nanotechnology, AI offers huge danger along with huge potential. It's very important that all of us be aware of it and, as much as possible, encourage an ethical approach to progress.


monkey pirate
