Sensory Organelles in AI and Player

This question comes up specifically in the context of me trying to prototype a learning AI, but it’s also broader than that. The AI needs to be given a set of senses, but what should those be? And how do we balance that set against the senses the player inherently has in the game (sight, sound, etc.)? Do we want to add additional organelles that provide additional senses? How do we get the player to add redundant ones, like the photoreceptors the AI versions of their cell would require?

It would be interesting, though maybe controversial, to limit some of the player’s inherent senses at the start of the game. You can’t see, but you can feel vibrations. You build a chemoreceptor and can see some chemical gradients. You build an early eye and get a narrow view cone.

Might have to be an optional setting?

Definitely needs to be optional.

Btw, have you looked at the current AI code? I think it would be very beneficial for the discussion if someone provided a summary of what it does (it does now use the chemoreceptor if the species has it).

Based on the comments in the code:

If nothing is engulfing me right now, see if there’s something that might want to hunt me

If this microbe is out of ATP, pick an amount of time to rest

Follow received commands if we have them

If there are no threats, look for a chunk to eat

If there are no chunks, look for living prey to hunt

Otherwise just wander around and look for compounds, or if this organism is sessile, sit there

I will note that the word “look” here just means magically getting the closest matching thing within a max range.
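Condensed, the whole thing is a priority chain. Here is a minimal sketch of that ordering in C# (hypothetical names and structure, not the actual MicrobeAI.cs code):

```csharp
// Hypothetical condensation of the priority chain described above;
// the real MicrobeAI.cs differs in names and details.
public enum AiAction { Flee, Rest, FollowCommand, EatChunk, HuntPrey, Wander, SitStill }

public class MicrobeAiSketch
{
    public bool BeingEngulfed;   // something is engulfing us right now
    public bool PredatorNearby;  // "look": closest match within a max range
    public bool OutOfAtp;
    public bool HasCommand;
    public bool ChunkNearby;
    public bool PreyNearby;
    public bool IsSessile;

    public AiAction Decide()
    {
        // Each rule only runs if every rule above it declined to act.
        if (!BeingEngulfed && PredatorNearby)
            return AiAction.Flee;
        if (OutOfAtp)
            return AiAction.Rest;
        if (HasCommand)
            return AiAction.FollowCommand;
        if (ChunkNearby)
            return AiAction.EatChunk;
        if (PreyNearby)
            return AiAction.HuntPrey;

        return IsSessile ? AiAction.SitStill : AiAction.Wander;
    }
}
```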

I also see nothing about the chemoreceptor in MicrobeAI.cs. Well, looking more, there’s nothing in anything related to the microbe either. Are you sure about that?

I was 90% sure that someone worked on it. So maybe it is in a branch? Does anyone remember? From a quick look, all I could find was a PR making the auto-evo algorithm take the chemoreceptor into account:

I’d be really, really wary of limiting the player’s visual or auditory capacities. Those would be pretty inconvenient steps to take for the sake of realism in a game built around seeing and hearing what surrounds the player, and they would turn off a lot of people. It would just seem exhausting to deal with.

The only thing I remember seeing akin to this, which I thought could be a cool thing to see, is one of Uniow’s concepts, where the player starts with a limited color palette that gradually saturates and expands as they evolve more advanced eyes. A very simple multicellular animal would still have the ability to see important colors like green and brown, and it adds room for a sense of progression in the environment.

Yep, that’s the issue. It seems more interesting as a setting or a mod than as a key feature. The real question is balancing the AI and the player in a realistic way.

True. To be fair, microbial intelligence/behavior generally follows the run-and-tumble method, where cells basically move around in squiggles for as long as they detect a specific compound, then randomly change direction and repeat. So a lot of this conversation will apply more in the multicellular/aware stages. Perhaps the question we should be asking in the microbial stage is how to encourage auto-evo to evolve different forms of behavior, such as focused/sessile/brave? Maxonovien is already doing similar work, and it appears to have produced at least the appearance of microbes with different levels of sensory capacity.
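For reference, run and tumble is simple enough to sketch in a few lines. A minimal illustration (purely hypothetical, nothing from the codebase):

```csharp
using System;

// Hypothetical run-and-tumble sketch: keep swimming straight while the
// sensed compound concentration is improving, otherwise "tumble" to a
// random new heading and try again.
public class RunTumbleSketch
{
    private readonly Random random = new Random();
    private float lastConcentration;

    public float Heading; // radians; the caller moves the cell along this

    public void Step(float concentrationHere)
    {
        // Tumble whenever the gradient is flat or getting worse.
        if (concentrationHere <= lastConcentration)
            Heading = (float)(random.NextDouble() * 2 * Math.PI);

        lastConcentration = concentrationHere;
    }
}
```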

I also remember seeing someone attempting to have chemoreceptors affect the AI in-game, although I can’t find it right now for some reason.

Another issue is the huge variety of microbes. Some stentors, for example, appear to make basic decisions (Making Decisions Without a Brain - YouTube). One of the many ideas behind my prototype is a basic genetic neural network. The really simple ones I’m talking about do a great job of simulating basic cause-and-effect responses while evolving with the organism. It might require some limits on size to keep it basic and realistic, but I think that has promise. But what sensory information do I feed into it? Just the chemical gradients? The current AI gets a really unrealistic amount of info, but I guess that’s needed in a game with a player.

As a side note, with only a gradient sense and maybe some touch senses, it will likely do what you describe (run and tumble). It will just provide more variety in the forms of run and tumble to suit each cell’s needs.
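To make the input question concrete, here’s roughly the kind of tiny genetic network I mean, with only gradient and touch inputs (the input set and all names are assumptions for illustration, not a spec):

```csharp
using System;

// Minimal genetic neural network sketch: a single layer mapping a few
// sensory inputs to movement outputs, with weights that mutate between
// generations. The chosen inputs (gradients + touch) are an assumption.
public class GeneticBrainSketch
{
    // Inputs: glucose gradient x/y, toxin gradient x/y, touch contact flag.
    private const int Inputs = 5;

    // Outputs: forward thrust and turn rate.
    private const int Outputs = 2;

    public float[,] Weights = new float[Outputs, Inputs];

    public float[] Think(float[] senses)
    {
        var outputs = new float[Outputs];
        for (int o = 0; o < Outputs; ++o)
        {
            float sum = 0;
            for (int i = 0; i < Inputs; ++i)
                sum += Weights[o, i] * senses[i];
            outputs[o] = (float)Math.Tanh(sum); // squash into [-1, 1]
        }

        return outputs;
    }

    // "Reproduction": copy the brain with small random weight changes.
    public GeneticBrainSketch Mutate(Random random, float rate = 0.1f)
    {
        var child = new GeneticBrainSketch();
        for (int o = 0; o < Outputs; ++o)
            for (int i = 0; i < Inputs; ++i)
                child.Weights[o, i] = Weights[o, i] + (float)(random.NextDouble() * 2 - 1) * rate;

        return child;
    }
}
```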

Chemical gradients, and perhaps something like an awareness of other cells the cell can engulf/poison/attack, plus an awareness of other cells which can engulf/poison/attack it? This would be particularly useful for organisms like stentors, pathogenic bacteria, or small prey items.
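Encoded as network inputs, that could look something like this (hypothetical field names, just to illustrate the idea):

```csharp
// Hypothetical sensory vector: chemical gradients plus the relative
// position of the nearest cell we can engulf and the nearest cell that
// can engulf us. Zeroed entries mean "nothing detected in range".
public struct SensoryInput
{
    public float GlucoseGradientX, GlucoseGradientY; // chemical senses
    public float NearestPreyDx, NearestPreyDy;       // cell we can engulf/poison/attack
    public float NearestThreatDx, NearestThreatDy;   // cell that can engulf/poison/attack us

    public float[] ToArray() => new[]
    {
        GlucoseGradientX, GlucoseGradientY,
        NearestPreyDx, NearestPreyDy,
        NearestThreatDx, NearestThreatDy,
    };
}
```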

But how would we pull that off realistically? If we had a color photoreceptor, maybe it could learn the colors of its enemies. Would that create a selective pressure to evolve camouflage, as well as mimicry of other cells?

I think we need someone from the theory team here.

For the sake of clarity, chemoreceptors come into play here: Thrive/MicrobeAI.cs at master · Revolutionary-Games/Thrive · GitHub

It only affects searching for compounds, and I guess this is a good sign that there should have been a comment there.

Ah, I read through once and then Ctrl-F’d “Chemo”, and nothing came up. That does need a comment 🙂

I also tried searching for that, but then also looked for where the AI checks whether a cell can use chemoreception, which it doesn’t seem to do. I think it would make the AI appear to get better over time if that special smelling were skipped unless a cell has a chemoreceptor.
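Roughly what I mean, sketched out (the property and helper names here are assumptions, not the actual Thrive API):

```csharp
// Hypothetical sketch of gating the "smell" behaviour on the organelle,
// so cells without a chemoreceptor fall back to blind wandering.
public static class ChemoreceptionGateSketch
{
    public interface ISmellingMicrobe
    {
        bool HasChemoreceptor { get; }                 // assumed property
        (float X, float Y)? NearestDetectedCompound(); // assumed helper
    }

    public static (float X, float Y)? FindCompoundTarget(ISmellingMicrobe microbe)
    {
        // Without the organelle, skip the special smelling entirely;
        // the caller then falls back to plain wandering.
        if (!microbe.HasChemoreceptor)
            return null;

        return microbe.NearestDetectedCompound();
    }
}
```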

We discussed that in the PR: Ai uses chemoreceptors by adQuid · Pull Request #3093 · Revolutionary-Games/Thrive · GitHub

Ah yeah, I totally forgot about this. All of this could have been prevented with just a few more comments in the AI class… and yet some people who call themselves programmers are against comments, when well-written ones would save so much time overall.

Nice, we got that figured out. I do think we’ve drifted quite far from the original question, though.