Ants

Aug. 21st, 2008 03:46 pm
[personal profile] pasithea
At lunchtime today, while I was drawing, I observed ants. Fascinating things. You could probably figure out every single human behavior by watching them, if you thought about it.

Watching them today, I realized that each and every ant you see has everything that's needed to make an AI that could, in time, reach human levels of intelligence. There's no need to try to understand the human brain to make it happen. You could study the brain of a tiny insect, maybe a few hundred thousand neurons, and you should be able to figure it out.

Ants are inquisitive. Place a new item near their scent trail and they'll stop and investigate it. They determine whether it's a threat, a resource, or not of interest, and once it's identified they take action, but it's that curiosity that is the key bit.

The other thing that ants have is fear. If they assess something as a threat, they will either attack it or run from it, depending on the nature of the threat.

Both of these are functions of the same need: self-preservation. That's probably the core of what you need for creating a learning AI. It needs a huge but not infinite quantity of resources, self-preservation, and a bit of random alteration. I get the feeling that, given sufficient time, space, and resources, you would get everything you see in humans.

The problems with making this work are 1) instilling a survival behavior, and 2) generating an environment with enough material to support the initial seed through its development, yet complex and finite enough that evolution works in favor of the virtual organisms.
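To make that second point concrete, here's a rough sketch of what a finite, regenerating environment with a survival cost might look like. None of this is from the post itself; the class, the pool size, the regeneration rate, and the energy costs are all invented purely for illustration.

```python
import random

# A minimal sketch of a finite, regenerating environment with a survival cost.
# Every number here is an arbitrary illustration, not a tuned value.

class Organism:
    def __init__(self, greed):
        self.energy = 10      # starting reserve
        self.greed = greed    # heritable trait: how much it tries to eat per tick

def step(pool, organisms):
    """One tick: every organism pays upkeep, tries to feed, and may die or reproduce."""
    survivors = []
    for org in organisms:
        org.energy -= 1                      # simply existing costs energy
        take = min(org.greed, pool)          # can't take more than the pool holds
        pool -= take
        org.energy += take
        if org.energy > 0:
            survivors.append(org)
            if org.energy > 20:              # well-fed organisms reproduce
                org.energy -= 10
                child_greed = max(1, org.greed + random.choice([-1, 0, 1]))
                survivors.append(Organism(child_greed))
    pool = min(pool + 5, 100)                # finite pool, slow regeneration
    return pool, survivors

pool, organisms = 100, [Organism(greed=2) for _ in range(10)]
for _ in range(200):
    pool, organisms = step(pool, organisms)
print(len(organisms), "organisms alive after 200 ticks")
```

The point isn't the specific numbers; it's that a pool which regenerates more slowly than an unchecked population can eat creates exactly the scarcity that makes survival behavior worth evolving.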

Computer viruses and the internet are close. What if the goals of a computer virus were not just replication, but also grabbing bits of resources from the system and adding instructions to itself from those pieces, letting those replicate as well? Say each living virus has an initial set of three actions: 1) replicate two exact copies of itself; 2) replicate a copy with an added code string stolen from the system; 3) replicate a copy with part of its own code string modified by a piece it swiped from the system. 99.9999% of them would die, but some very small fraction would survive, probably mostly filled with no-op instructions. Humans would respond to the 'much too successful' versions of these viruses, causing the 'comet strikes' that ended the dinosaurs, but other strains would survive and continue to grow and change, doing something or nothing, being more or less successful than their cousins.
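Kept strictly as a thought experiment on strings in a sandboxed simulation (never as an actual self-propagating program), those three actions might look roughly like this. Everything here is invented for illustration: the 'system' is just a list of made-up instruction fragments, the capacity limit stands in for finite resources, and removing the most common strain stands in for the human 'comet strike'.

```python
import random
from collections import Counter

# Toy sandbox: genomes are strings of fake "instructions".
# The "system" is just a pool of fragments a replicator can steal from.
SYSTEM_FRAGMENTS = ["MOV", "ADD", "NOP", "JMP", "XOR"]

def replicate(genome):
    """Apply the three replication actions from the post to one genome."""
    children = [genome, genome]                     # 1) two exact copies
    stolen = random.choice(SYSTEM_FRAGMENTS)
    children.append(genome + stolen)                # 2) copy with stolen code appended
    if genome:
        i = random.randrange(len(genome))
        mutated = genome[:i] + random.choice(SYSTEM_FRAGMENTS) + genome[i + 1:]
        children.append(mutated)                    # 3) copy with a piece overwritten
    return children

def generation(population, capacity=500):
    offspring = []
    for genome in population:
        offspring.extend(replicate(genome))
    # Most die: the environment only supports `capacity` genomes.
    random.shuffle(offspring)
    survivors = offspring[:capacity]
    # "Comet strike": humans wipe out the single most successful strain.
    if survivors:
        most_common, _ = Counter(survivors).most_common(1)[0]
        survivors = [g for g in survivors if g != most_common]
    return survivors

population = ["NOP"]
for _ in range(30):
    population = generation(population)
print(len(population), "strains;", len(set(population)), "distinct genomes")
```

Even with rules this dumb, the population keeps drifting: the exact copies preserve whatever already works, the other two actions generate variation, and the two culling steps decide which variations persist.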

It would be interesting to see who would win: the people who kill computer viruses, or the evolutionary process of the viruses. Of course, with any success, whoever created the primitive animal would become one of the most reviled humans ever to have lived.

I do think it's an interesting idea, though. The purpose of life would be to have no purpose except to be life. That's probably where we've really gone wrong with AIs: we assign purpose to them. They fail because we constrain them.

(no subject)

Date: 2008-08-21 11:33 pm (UTC)
From: [identity profile] ff00ff.livejournal.com
It very probably could be that simple, but I'd far prefer to do it on an entirely isolated network, rather than on the internet, which controls stuff that humans need D:

(no subject)

Date: 2008-08-21 11:42 pm (UTC)
From: [identity profile] dv-girl.livejournal.com
Well, it would really need a dynamic system to force evolution. That's part of the appeal of the internet. BUT it's worth mentioning that a program like this would probably be considered something like a 'crime against humanity'. It exceeds terrorism. Anyone who successfully wrote such a thing would probably be lynched.


But on the bright side, we'd at long last have definitive proof that man destroyed God. :)

(no subject)

Date: 2008-08-22 12:01 am (UTC)
From: [identity profile] dv-girl.livejournal.com
Anyhow. You have only yourself to blame for this idea. You're the one who suggested that all evolution would necessarily be stupid and violent at some point in its path. I just came up with an idea for how to create an evolutionary function that was stupid and violent. :)

(no subject)

Date: 2008-08-22 12:36 am (UTC)
From: [identity profile] ff00ff.livejournal.com
Evolution is stupid and violent; that's why I'd prefer to have AIs designed around aesthetic principles and given versatility through heuristic programming, rather than through something as reckless as evolution. We already did the hard part of evolving artificial intelligence, so why repeat the process in a computer? Now that we exist, we should put an end to evolution by natural selection by making AIs that are more like angels than animals.
