Ants

Aug. 21st, 2008 03:46 pm
pasithea: glowing girl (Default)
[personal profile] pasithea
At lunch time today, while I was drawing, I observed ants. Fascinating things. You could probably figure out every single human behavior by watching them if you thought about it.

Watching them today, I realized that each and every ant you see has everything that's needed to make an AI that could, in time, reach human levels of intelligence. There's no need to try to understand the human brain to make it happen. You could study the brain of a tiny insect, a few hundred thousand neurons at most, and you should be able to figure it out.

Ants are inquisitive. Place a new item near their scent trail and they'll stop and investigate it. They determine whether it's a threat, a resource, or not of interest, and once it's identified they take action, but it's that curiosity that is the key bit.

The other thing that ants have is fear. If they assess something as a threat, they will either attack it or run from it, depending on the nature of the threat.

Both of these are functions of the same need: self-preservation. That's probably the core of what you need for creating a learning AI. It needs a huge but not infinite quantity of resources, self-preservation, and a bit of random alteration. I get the feeling that, given sufficient time, space, and resources, you would get everything you see in humans.

The problems in making this work are 1) inspiring a survival behavior, and 2) generating an environment with enough materials to support the initial seed's development, yet complex and finite enough for evolution to work in favor of the virtual organisms.

Computer viruses and the internet are close. What if the goals of a computer virus were not just replication, but also grabbing bits of resources from the system and adding instructions to itself from those pieces, letting those replicate as well? Say each living virus has an initial set of three actions: 1) replicate two exact copies of itself; 2) replicate a copy with an added code string stolen from the system; 3) replicate a copy with part of its own code string modified by a piece it swiped from the system. 99.9999% of them would die, but some very fractional number would survive, probably mostly filled with NO-OP instructions. Humans would respond to the 'much too successful' versions of said virii, causing the 'comet strikes' that ended the dinosaurs, but other strains would survive and continue to grow and change, doing something or nothing, being more or less successful than their cousins.
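The three actions could be sketched as a toy model in Python. This is purely illustrative, not a working virus: an "organism" is just a list of byte values, `SYSTEM` stands in for the host's memory to steal from, and all the names and the 4-byte fragment size are invented for the sketch.

```python
import random

# Stand-in for the host system: a pool of bytes the organism can steal from.
SYSTEM = [random.randrange(256) for _ in range(1024)]

def steal_fragment(length=4):
    """Grab a random contiguous chunk of 'system' bytes."""
    start = random.randrange(len(SYSTEM) - length)
    return SYSTEM[start:start + length]

def reproduce(organism):
    """The three actions from the post, yielding four children:
    two exact copies, one with stolen code appended, and one with
    a stolen fragment overwriting part of its own code string."""
    exact1 = list(organism)                       # action 1: two exact copies
    exact2 = list(organism)
    appended = list(organism) + steal_fragment()  # action 2: append stolen code
    mutated = list(organism)                      # action 3: overwrite with stolen code
    frag = steal_fragment()
    pos = random.randrange(max(1, len(mutated) - len(frag)))
    mutated[pos:pos + len(frag)] = frag
    return [exact1, exact2, appended, mutated]
```

Selection would then come from the environment: children whose byte strings no longer "run" simply fail to reproduce.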

It would be interesting to see who would win: the people who kill computer viruses, or the viruses' own evolutionary process. Of course, with any success, whoever created the primitive animal would become one of the most reviled humans to have ever lived.

I do think it's an interesting idea, though. The purpose of life would be to have no purpose except to be life. That's probably where we've really gone wrong with AIs: we assign purpose to them. They fail because we constrain them.

(no subject)

Date: 2008-08-21 11:13 pm (UTC)
From: [identity profile] ff00ff.livejournal.com
I'm pretty sure a self-deterministic AI with an innate desire for survival is not in the best interests of a human-dominated future. Being a human, I can't say I like the sound of it.

All of our intelligence and philosophy and politics is nothing but a byproduct of the DNA molecule's propensity to reproduce itself. Making a piece of computer code with that objective, and an ability to mutate as you specified, and hence evolve over multiple iterations, could ultimately lead to a code animal who does a lot of things it thinks are for other reasons but are really just attempts to keep replicating its source code. It's a dangerous and stupid way to keep increasing the data processing capacity of the world around us, IMHO. Down with Darwinistic algorithms!

I think the future lies in non-genetic information: Richard Dawkins's memes. Preserving a meme is a better basis, in my opinion, for developing an AI than giving it the imperative to continue replicating and guarding a piece of starting code, the equivalent of genetic material.

(no subject)

Date: 2008-08-21 11:17 pm (UTC)
From: [identity profile] dv-girl.livejournal.com
So here's what I imagine happening:

Generation 0: Virus is deployed, begins replication.
Gen 1 thru N: each parent spawns two exact copies and two variants that may or may not run: one with some small random chunk of memory added into it, the other with some small random chunk of memory replacing part of it. In each generation the exact clones will survive; the other two may or may not.

Eventually, some of the modified generations will either add or modify code in a way that does not render them inoperable. It's important that Gen0 be as compact as possible for this to work well.

Some survivable Gen-N children might have: a faster or slower reproduction rate; fewer or more children of various types; various inserted or appended bytes that don't cause immediate failure.

Some potential catastrophic events are over-consumption of resources (crashing the ecosystem, i.e. the computer), people altering the environment (formatting the drive), and people writing virus-elimination software. A strain might be super-successful in the short term, dominating the resources on the system it inhabits or distributing itself more virulently than other strains, but that success raises its likelihood of triggering a catastrophic event, so the system becomes self-limiting.

As changes and 'non-fatal additions' accumulate, some of them will eventually cause the non-fatal paths to be executed instead of just sitting around doing nothing. Most of those will fail, but a very tiny percentage will survive and continue to replicate, change, and grow.
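The generational process above can be sketched as a minimal, self-contained Python simulation. Everything here is an illustrative assumption, not a real implementation: genomes are byte lists, "failing to run" is approximated as exceeding a size cap, and the "catastrophic crash" is modeled as random culling whenever total resource use exceeds a hard limit.

```python
import random

GENOME_CAP = 64        # a child longer than this is assumed to "fail to run"
RESOURCE_LIMIT = 4000  # total bytes the ecosystem supports before it "crashes"

def children_of(genome):
    """Per the walkthrough: two exact copies, one copy with a random
    byte appended, one copy with a random byte overwritten."""
    appended = genome + [random.randrange(256)]
    mutated = list(genome)
    mutated[random.randrange(len(mutated))] = random.randrange(256)
    return [list(genome), list(genome), appended, mutated]

def generation(population):
    """One generation: everyone reproduces, oversized children die,
    then random individuals are culled until resource use fits again."""
    survivors = [child for parent in population
                 for child in children_of(parent)
                 if len(child) <= GENOME_CAP]
    while sum(map(len, survivors)) > RESOURCE_LIMIT:
        survivors.pop(random.randrange(len(survivors)))
    return survivors

pop = [[0] * 8]            # Generation 0: a single compact 8-byte seed
for _ in range(10):        # Gen 1 thru N
    pop = generation(pop)
```

The self-limiting behavior falls out of the resource check: a lineage that replicates fastest also dominates the cull, so no strain can run away for long.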


Now I'm wondering if these even need to be a 'virus' per se. If you had thousands of computers running a very simple program that does what I suggested, and those computers were connected to the internet and other devices, then in time they would (in theory) find their way off their home systems. For the first very long while, you'd have to rigorously back up the systems and restore them whenever your children wiped themselves out, but sooner or later you'd get some that would find a solid path.

The cool thing is that in a closed system, your virii would in time begin preying on one another, due to limited resources and competition.

Madness! I wonder if it could really be that simple. :)

(no subject)

Date: 2008-08-21 11:45 pm (UTC)
From: [identity profile] coffeedaiv.livejournal.com
It is really easy to think of humans as biological machines, akin to computers. The thing is, though, the brain is not a digital computer. Neurons either fire or not, which is digital. But there are also billions of chemical interactions taking place, all of the time, which are not digital, and have a profound effect on the system.
So, while I think your experiment would be interesting, I suspect it would not lead to intelligence at even the ant level.
Note, of course, I am not an expert in anything (at least, not in anything relevant to this discussion), so my thoughts should be taken with a grain of salt.

(no subject)

Date: 2008-08-22 07:41 am (UTC)
From: [identity profile] paradox-puree.livejournal.com
Hmmm...I did some undergrad research in the area of swarm intelligence and came across a whole bunch of stuff on this and related topics. I have a few books I could lend you if you're interested.

(no subject)

Date: 2008-08-22 05:40 pm (UTC)
From: [identity profile] yetanotherbob.livejournal.com
Linking, just because it's such a fun read.

Later dubbed "the Ancestor," [a program on a simulated computer system that added mutation] was the first worm Tom Ray ever created - an 80-byte-long self-replicating machine written in Tierra's quirky assembly language - and as it happens, it was also the last. Once loosed into the Tierra environment installed on Ray's laptop, the creature's offspring quickly spread to the new world's every corner, within minutes displaying the evolutionary transformations that would "write" Ray's organisms from then on.
