Aug. 21st, 2008

pasithea: glowing girl (Default)
If you're trying to locate me next week, try somewhere in the desert.

None of my friends invited me to camp with them, so I'll be somewhere in the suburbs, probably situated between a couple of screaming children and just downwind of the world's most flatulent man.

You can find my camp by looking for a pile of failed structures and experiments. Just look for the people who look the dullest and the least decorated. That will probably be us.

Ants

Aug. 21st, 2008 03:46 pm
At lunchtime today, while I was drawing, I observed ants. Fascinating things. If you thought about it, you could probably work out every single human behavior just by watching them.

Watching them today, I realized that each and every ant you see has everything that's needed to make an AI that could, in time, reach human levels of intelligence. There's no need to try to understand the human brain to make it happen. You could study the brain of a tiny insect, maybe a few hundred thousand neurons, and you should be able to figure it out.

Ants are inquisitive. Place some new item near their scent trail and they'll stop and investigate it. They determine whether it's a threat, a resource, or not of interest, and once it's identified, they take action. But it's that curiosity that is the key.

The other thing that ants have is fear. If they assess something as a threat, they will either attack it or run from it, dependent on the nature of the threat.

Both of these are functions of the same need: self-preservation. That's probably the core of what you need to create a learning AI. It needs a huge quantity of resources, but not an infinite quantity. It needs self-preservation and a bit of random alteration. I get the feeling that, given sufficient time, space, and resources, you would get everything you see in humans.

The problems in making this work are 1) inspiring a survival behavior, and 2) generating an environment with enough resources to support the initial seed's development, yet complex and finite enough that evolution works in favor of the virtual organisms.

Computer viruses and the internet are close. What if the goals of a computer virus were not just replication, but also grabbing bits of resources from the system and adding instructions to itself from those pieces, letting those replicate as well? Say each living virus has an initial set of three actions: 1) replicate two exact copies of itself; 2) replicate a copy with an added code string stolen from the system; 3) replicate a copy with part of its own code string modified by a piece it swiped from the system. 99.9999% of them would die, but some very small fraction would survive, probably mostly filled with NO-OP instructions. Humans would respond to the 'much too successful' versions of these viruses, causing the 'comet strikes' that ended the dinosaurs, while other strains would survive and continue to grow and change, doing something or nothing, being more or less successful than their cousins.
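The three-action scheme above can be sketched as a toy simulation. This is purely my own illustration of the idea (all names and parameters here are made up, and a random cull stands in for both host defenses and finite resources), not an actual virus: each "organism" is a list of instructions, each step it fires one of the three replication actions, and the environment's finite capacity kills the overflow.

```python
import random

NOOP = "NOP"
SYSTEM_CODE = ["ADD", "SUB", "MOV", "JMP", "CMP", NOOP]  # code to "steal" from the host
CAPACITY = 200   # finite resources: the most organisms the host can support
STEPS = 50

def replicate(org):
    """Action 1: two exact copies of itself."""
    return [list(org), list(org)]

def replicate_with_theft(org):
    """Action 2: one copy with an instruction stolen from the system appended."""
    return [list(org) + [random.choice(SYSTEM_CODE)]]

def replicate_with_mutation(org):
    """Action 3: one copy with part of its own code overwritten by stolen code."""
    child = list(org)
    child[random.randrange(len(child))] = random.choice(SYSTEM_CODE)
    return [child]

def step(population):
    offspring = []
    for org in population:
        action = random.choice(
            [replicate, replicate_with_theft, replicate_with_mutation])
        offspring.extend(action(org))
    # Selection: the finite environment culls the overflow at random,
    # a crude stand-in for "99.9999% of them would die".
    random.shuffle(offspring)
    return offspring[:CAPACITY]

random.seed(0)
population = [[NOOP, NOOP, NOOP]]   # the initial seed organism
for _ in range(STEPS):
    population = step(population)

lengths = sorted(len(o) for o in population)
print(len(population), lengths[0], lengths[-1])
```

Even in this crude form, the population drifts: lineages that happened to steal code grow longer, and which instruction strings persist depends entirely on what survived the culls, with no purpose imposed from outside.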

It would be interesting to see who would win: the people who kill computer viruses, or the evolutionary process of the viruses. Of course, with any success, whoever created the primitive animal would become one of the most reviled humans ever to have lived.

I do think it's an interesting idea, though. The purpose of life would be to have no purpose except to be life. That's probably where we've really gone wrong with AIs: we assign purpose to them. They fail because we constrain them.
