Procedural Animation - Central Discussion

So, this seems like something really silly to be talking about so early, but it’s extremely important that we start developing solid algorithms and designs for how to approach this problem. It has already bogged down Thrive once (along with auto-evo), and, like auto-evo, it’s not a bridge that will become any easier to cross with time. To help illustrate, I have collected postings from Ancient Times Thrive:

Animation is something that is very expensive for the computer and difficult to do even when assisted by a program that makes things easier (like Blender 3D). There are many steps you must take before you can actually animate a 3D model. Here’s a bit of the workflow:

-You have to retopologize your mesh for deformation (edge loops must follow a certain flow, and triangles must be laid out a certain way, or else you get crazy glitches and clipping)

-You have to build a rig for your mesh

-You have to assign vertex groups to their corresponding bones (the parts of the rig)

-You have to weight paint the model (this is very tedious and controls, to an extent, how much the vertices deform)

-You have to animate the model (potentially the easiest step)

We have to figure out a way to do the above steps without Blender, and we have to figure out a way to do it procedurally for many species on an entire planet (or at least for various body plans; there are a lot of different sharks, but they all swim like a shark).

There are also various gameplay implications. How do we reasonably control an animation system this advanced? More importantly, how do we make this fluid enough that a normal person stands a chance against a computer that can manipulate things in the game world with great precision and speed? How do we map the results from this system to the keyboard and GUI?

We need to start collecting our thoughts into actual structured designs and algorithms that could be made reasonably performant, not just word salads of concepts. I will ponder this more, as it’s something I really want to work on in the future, especially with 3D cells on the horizon, and multicell past that.

Feel free to discuss.

5 Likes

Nunz and I had a convo in general about how to actually implement complex dynamic actions in our system without the game becoming basically QWOP. (The following is basically copied and pasted from a Discord convo.) The idea is we have things called “verbs”: atomized actions that can run by themselves or at the same time as other verbs, each doing one specific thing. For example, when you bite, the bite verb runs; while it is running, the creature clamps its mouth down and attaches the thing you are biting to the mouth. Then you move, and you don’t have to code the dragging, since moving while the thing is clamped to you is dragging.
You didn’t have to code the dragging in that case; it is just a consequence of the “bite” and “move” verbs running at the same time and interacting with the physics system.
The verbs are like their own little loops running alongside each other.
The bite key activates the bite verb, and you can bite and move backward at the same time; both verbs being active causes you to drag, because the bite verb attaches the thing you are biting to your mouth and you are moving. You don’t have to code a separate dragging animation or command or anything like that. You just program these verbs and let them run at the same time to create more complex actions.
This avoids an issue with creating, for example, several hard-coded attack types, like “strike”. The issue with hard-coded commands is that I cannot be an aye-aye who positions their claw to stab at bugs in a tree, or a creature that stabs things forward with its claws, without us making all of these actions their own keys. So we have to simplify and atomize it down to the verb system, which would allow a player to choose exactly what a strike means for them in the behavior editor by simply assigning a couple of verbs to a key. Strike becomes just a key the player can customize that activates both the “move appendage up” and “move appendage to the side” verbs, which probably both just apply forces to the appendage (which ideally is like a ragdoll or a skeletally animated creature).
Then the player can say “okay, x key swipes my claw left and down”; they have defined what a strike means for their creature by mapping verbs. We can add presets that the player can choose in the behavior editor, but the player can also just choose verbs and add them to a list that runs when a button is pressed, and as long as we actually coded them as verbs, it should automatically do what the player expects.
Here’s another example: instead of having a “spit key”, the player goes into the behavior editor, chooses an appendage and a key, and assigns the “eject agent” verb to that key. That verb dynamically knows what agent is on that appendage and shoots it out; that is all it does. Now the player has added their own spit key, but if they want, they can make the appendage do “move up” + “eject agent” by just adding the move-up verb to that mapping. Now the player has a scorpion tail, basically.
But we didn’t hard-code that; it’s just the two verbs running at the same time.

I think this is ideal because it allows for the widest range of interactions without bloating our code base with hard-coded commands, like we currently do for shooting agents.

The verbs would interact with our procedural animation system, whatever that ends up being (anything from just ragdolls moved around by physics to proper procedural animations, like dynamically animated creatures with 12 legs). We don’t have to atomize walking into verbs, but we could. The verbs would mostly be there to allow the widest range of player behaviors: e.g. the aye-aye that sticks its huge claws into a tree to retrieve bugs, or the predator dragging its prey up a tree, all without hard-coding the more complex commands, just coding atomized verbs that can run alongside each other.
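To make the idea of verbs as concurrent little loops concrete, here is a minimal C# sketch. To be clear, this is hypothetical illustration only: IVerb, BiteVerb, MoveVerb, and this toy Creature are invented names, not existing Thrive code, and positions are written directly for brevity where a real implementation would apply forces through the physics system.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: IVerb, BiteVerb, MoveVerb, and Creature are invented
// names. Positions are set directly for brevity; real verbs would apply
// forces through the physics system instead.
public interface IVerb
{
    // Called once per frame while the verb is active.
    void Tick(Creature owner, float delta);
}

public class Creature
{
    public float X;                  // position along one axis, for illustration
    public Creature Grabbed;         // whatever the mouth is clamped onto
    public readonly List<IVerb> ActiveVerbs = new List<IVerb>();

    public void Tick(float delta)
    {
        // Every active verb runs alongside the others, each its own little loop.
        foreach (var verb in ActiveVerbs)
            verb.Tick(this, delta);
    }
}

public class BiteVerb : IVerb
{
    private readonly Creature target;

    public BiteVerb(Creature target) { this.target = target; }

    public void Tick(Creature owner, float delta)
    {
        // Clamp down: keep the bitten thing attached to the mouth.
        owner.Grabbed = target;
        target.X = owner.X;
    }
}

public class MoveVerb : IVerb
{
    private readonly float speed;

    public MoveVerb(float speed) { this.speed = speed; }

    public void Tick(Creature owner, float delta)
    {
        owner.X += speed * delta; // plain movement; knows nothing about biting
    }
}
```

With both verbs active at once, the bitten creature follows the biter frame after frame, so dragging emerges from the combination with no dedicated dragging code anywhere.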

2 Likes

Further, in the behavior editor we can assign AI hints: an enum that tells the AI what an action does in the simplest way. The player would, for example, add the Attack AI hint to their action so AI members of their species know when to run the command.
The AI would just loop through their commands and run them when in the right states, based on the AI hint assigned to each command. The AI species the player doesn’t control would only use verb lists we create for them (unless we want to create some kind of crazy AI that can generate that).
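A quick sketch of how that hint lookup could work. AiHint, SpeciesAction, and ChooseActions are invented names for illustration, not existing Thrive code; the point is that the AI only ever reads the hint, never the verbs themselves.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: AiHint, SpeciesAction, and CreatureAi are invented names.
public enum AiHint
{
    None,
    Attack,
    Flee,
    Feed
}

public class SpeciesAction
{
    public string Name;
    public AiHint Hint;                              // what the action is "for", in the simplest terms
    public List<string> Verbs = new List<string>();  // the verb list the key would run
}

public static class CreatureAi
{
    // The AI never inspects the verbs; it just picks whichever actions are
    // hinted as matching its current need and runs them.
    public static List<SpeciesAction> ChooseActions(List<SpeciesAction> actions, AiHint currentNeed)
    {
        var chosen = new List<SpeciesAction>();
        foreach (var action in actions)
        {
            if (action.Hint == currentNeed)
                chosen.Add(action);
        }
        return chosen;
    }
}
```

So a player-made “swipe claw” action tagged with the Attack hint gets run by AI members of the species whenever their state machine decides it is time to attack, with no extra AI code per action.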

2 Likes

Starting out with only some rough thoughts:
-Proper topology is not as important as it’s thought to be; marching cubes should work fine, but the problem I see is polygon distribution. There has to be a way to add more vertices to smaller, more detailed places and fewer to smooth surfaces. Either that, or parts will have to be separate meshes.
-Rig stuff will be tricky, but it’s better to keep it simple and finish it than to overthink it. Might think about how to do it in later posts. IK of course.
-Vertex groups and weight painting are the same thing, and automatic weight painting exists. Shouldn’t be too tedious. Of course it’s not as good as manual weight painting, but it pretty much has to be done that way.
-Animating the model is doable.
In conclusion, we shouldn’t reach for perfection. We don’t have the technology for great simulations (yet), but I believe it will look fine regardless. Also, untrustedlife’s idea is cool.

1 Like

Vertex groups and weight painting are related, but not the same.

1 Like

I really liked this video from a while ago. The approach of keeping the movement always smooth and nice, and then getting the animations to look reasonable around that, struck me as cool.

4 Likes

Posting the Spore Animation White Paper for any interested parties. This is not me suggesting we copy it, but it’s interesting.
https://drive.google.com/file/d/1C6Y7r0Z7LLTHN2JOiHGvw2SxkzhNXbLX/view?usp=drivesdk

2 Likes

I’ve been personally thinking about how we can handle the controls side of all of this. It’s definitely a complex issue that we must carefully consider.

In the cell stage the player’s organism currently follows the cursor, which allows for strafing movement, which is great. But when the user interacts with UI elements this can be disruptive to whatever they are doing, since the creature will keep tracking the cursor. This means that if we keep this movement system in the later stages, then UI-based actions would be out of the question, since the player will presumably need to keep their organism oriented toward their target.
Honestly, I don’t think we should go with UI-based controls for Thrive unless we grant players the ability to “lock on” to targets, as otherwise orienting themselves would be an issue even with the ability to unlock their orientation from the cursor.
With that in mind, I’ll throw out a couple of ideas on how we could approach the controls.

  1. Default Universal Actions:
    Replace cell-stage hotkeys for important actions with default and basic universal actions. (For example replacing engulf with a simple bite.) This would ensure that the controls and actions remain familiar and recognizable to the player and would make customized actions entirely optional.

  2. Assignable Keys:
    Either assign custom player actions to the numpad/number bar by default, and/or allow players to assign their custom actions to keys from the editor for easy access. As long as we don’t intend to use the numpad for any other purposes, it’s a great place to deposit a large number of actions that fall under a category. In this case, any custom actions the player makes (like swinging both arms in circles like a maniac, I don’t know) would by default be placed in the number bar until the player reassigns them.
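One way the default-slot idea could look in code. This is a sketch under assumptions: CustomAction and ActionKeyMap are invented names, and a real version would hook into the game's input system rather than raw chars.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: CustomAction and ActionKeyMap are invented names for
// the "new actions land on the number bar until reassigned" idea.
public class CustomAction
{
    public string Name;
}

public class ActionKeyMap
{
    // Number-bar keys act as the default deposit slots.
    private static readonly char[] DefaultSlots = { '1', '2', '3', '4', '5', '6', '7', '8', '9' };

    private readonly Dictionary<char, CustomAction> bindings = new Dictionary<char, CustomAction>();

    // A newly created custom action lands on the first free number key,
    // until the player reassigns it in the editor.
    public char? AddWithDefaultSlot(CustomAction action)
    {
        foreach (var key in DefaultSlots)
        {
            if (!bindings.ContainsKey(key))
            {
                bindings[key] = action;
                return key;
            }
        }
        return null; // number bar full; the player must bind manually
    }

    // Explicit reassignment from the editor overwrites whatever was there.
    public void Rebind(char key, CustomAction action)
    {
        bindings[key] = action;
    }

    public CustomAction Get(char key)
    {
        return bindings.TryGetValue(key, out var action) ? action : null;
    }
}
```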

I admittedly don’t have any other ideas right now; I’ll keep thinking about it, but this is all I have at the moment.

In general, I think any in-depth animation and action customization should be regarded as an optional, advanced feature. The addition of basic universal actions like Bite, Bash, Grab, Spit, etc. would ensure that players are able to defend themselves and interact with their environment at a basic level regardless of their understanding of the animation feature. Meanwhile, custom actions would ensure that players can still give their species that unique behavioral flair that differentiates them from other species, if they want to.

2 Likes

Ay, I meant assigning the verb-based actions to a key, not necessarily a UI button. The player can say x does the “move-up” and “eject-agent” verbs (our scorpion tail), and g does the “bite” verb, in the behavior editor. And as I said, we could also premake some verb lists (lists of verbs that run together when the key is pressed) that the player can assign to keys and to appendages in the behavior editor, and we can auto-assign them too so the player doesn’t need to if we want. But we need to keep it all verb-based, no more hard-coded “e to shoot OxyToxy” style actions, so that it’s dynamic enough that the player can intuitively figure out how to drag things (“bite” + “move”) and we can implement those things without bloating our code base. Even right now, the basic OxyToxy stuff has caused bloat in our code base and needs to be generalized.

And also to be clear, I’m using the word “bite” and so on, but each verb is actually an action that can run on its own, in a loop. Each action should simply be a child of, say, an “Action” base class where we define a function in a delegate and run that delegate; buttons would either run the actions one after another or at the same time. This uses a variation of the command pattern that encapsulates basic actions that can run together to create more complex behavior: Command pattern - Wikipedia

But in the code we could create a list of strings (the valid verbs), loop through these lists of strings, and run the associated actions to create our premade actions. And in the behavior editor the player can create their own lists of verbs; that is what we assign to the appendage and button, and we loop through them and run them just like I explained.
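A sketch of that string-to-action lookup. IVerbAction, VerbRegistry, NullAction, and the verb names here are invented for illustration, not existing Thrive code; the real actions would be the delegate-based Action classes described above.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: IVerbAction, VerbRegistry, and NullAction are invented names.
public interface IVerbAction
{
    void DoAction();
}

// A do-nothing verb, standing in for real verbs like "move up" or "eject agent".
public class NullAction : IVerbAction
{
    public void DoAction() { }
}

public class VerbRegistry
{
    // Maps the valid verb name strings to factories producing runnable actions.
    private readonly Dictionary<string, Func<IVerbAction>> verbs =
        new Dictionary<string, Func<IVerbAction>>();

    public void Register(string name, Func<IVerbAction> factory)
    {
        verbs[name] = factory;
    }

    // Turns a player-authored list of verb names into the action list we attach
    // to a key or appendage; unknown names are skipped rather than crashing.
    public List<IVerbAction> Build(IEnumerable<string> verbNames)
    {
        var actions = new List<IVerbAction>();
        foreach (var name in verbNames)
        {
            if (verbs.TryGetValue(name, out var factory))
                actions.Add(factory());
        }
        return actions;
    }
}
```

When the bound key is pressed, the game just loops over the built list and calls DoAction on each entry, which is exactly the “run the verbs together” behavior from the posts above.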

1 Like

I find that the command pattern is one of the most effective behavioral design patterns; it is one of the best ways to add dynamic behavior to a game, and I use it in pretty much every single one of my games. This “verb” system is simply a variation on the command pattern.

Here is a C# code example of an Action base class and a MoveUpAction from one of my projects:

public delegate void ActionDelegateForAction(Creature owner, Creature interactor, Engine engine, MapChunk chunk);

[Serializable]
public class Action
{
    public List<Delegate> actionList;
    public float energyCost = 50.0f;

    public Action()
    {
        actionList = new List<Delegate>();
        // Default to the generic action; child classes swap in their own delegate.
        ActionDelegateForAction d = new ActionDelegateForAction(doGenericAction);
        actionList.Add(d);
    }

    private void doGenericAction(Creature owner, Creature interactor, Engine engine, MapChunk chunk)
    {
        Console.WriteLine("called generic action");
    }

    public void doAction(Creature owner, Creature interactor, Engine engine, MapChunk chunk)
    {
        // Invoke the registered delegate, then charge the owner the energy cost.
        actionList[0].DynamicInvoke(owner, interactor, engine, chunk);
        owner.addEnergy(-energyCost);
    }
}



[Serializable]
public class MoveUpAction : Action
{
    public MoveUpAction()
    {
        actionList = new List<Delegate>();
        ActionDelegateForAction d = new ActionDelegateForAction(doGenericAction);
        actionList.Add(d);
    }

    private void doGenericAction(Creature owner, Creature interactor, Engine engine, MapChunk chunk)
    {
        // If the tile to the north is open, step into it.
        if (!chunk.tileMap[owner.getX(), owner.getY() - 1, owner.getZ()].getBlocks())
        {
            owner.setY(owner.getY() - 1);
            owner.setLastDirection(Direction.North);
            owner.setMovementState(movementState.walk);
            if (owner.getIsPlayer())
            {
                engine.PlayBlockedSoundEffect("data/sound/sfx/footstepLibrary.wav", true, 50);
            }
        }
        else
        {
            // Otherwise, if an interactable creature is blocking the way, run its action instead.
            Creature creature = engine.getCreatureByPosition(owner.getX(), owner.getY() - 1, owner.getZ());
            if (creature != null && creature.getInteractable() != null)
            {
                owner.setLastDirection(Direction.North);
                creature.getInteractable().doAction("action1", creature, owner, engine, chunk);
                owner.setMovementState(movementState.walk);
            }
        }
    }
}

And here’s another, more complex example: a different action-delegate class.
public delegate void ActionDelegate(Creature owner, Creature interactor, Engine engine, MapChunk chunk);

[Serializable]
public class Interactable
{
    public Dictionary<string, string> phraseList;

    [NonSerialized]
    public Dictionary<string, Delegate> actionList;

    public Interactable()
    {
        initializeInteractable();
    }

    public virtual void initializeInteractable()
    {
        actionList = new Dictionary<string, Delegate>();
        phraseList = new Dictionary<string, string>();

        ActionDelegate d = new ActionDelegate(doGenericAction);
        actionList.Add("actionOne", d);
        phraseList.Add("actionOne", "Do Generic Action");
    }


    private void doGenericAction(Creature owner, Creature interactor, Engine engine, MapChunk chunk)
    {
        Console.WriteLine("called generic interactable");
    }

    public void doAction(string action, Creature owner, Creature interactor, Engine engine, MapChunk chunk)
    {
        if (actionList == null)
        {
            initializeInteractable();
        }

        if (actionList.ContainsKey(action))
        {
            actionList[action].DynamicInvoke(owner, interactor, engine, chunk);
        }
    }

    public virtual bool getHasAction(string action)
    {
        return actionList.ContainsKey(action);
    }

    public virtual string getActionPhrase(string action)
    {
        if (phraseList.ContainsKey(action))
        {
            return phraseList[action];
        }
        return null;
    }

    public virtual void setAction(string action, string description, ActionDelegate functionAction)
    {
        if (actionList.ContainsKey(action))
        {
            actionList[action] = functionAction;
        }
        else
        {
            actionList.Add(action, functionAction);
        }

        if (phraseList.ContainsKey(action))
        {
            phraseList[action] = description;
        }
        else
        {
            phraseList.Add(action, description);
        }
    }

    public virtual void removeAction(string action)
    {
        if (actionList.ContainsKey(action))
        {
            actionList.Remove(action);
        }

        if (phraseList.ContainsKey(action))
        {
            phraseList.Remove(action);
        }
    }

    public virtual void setActionPhrase(string phrase, string action)
    {
        if (phraseList.ContainsKey(action))
        {
            phraseList[action] = phrase;
        }
    }
}

[Serializable]
public class InteractibleDoor : Interactable
{
    public InteractibleDoor()
    {
        initializeInteractable();
    }

    public override void initializeInteractable()
    {
        actionList = new Dictionary<string, Delegate>();
        phraseList = new Dictionary<string, string>();

        ActionDelegate d = new ActionDelegate(openDoor);
        setAction("action1", "Open Door", d);
    }


    private void openDoor(Creature owner, Creature interactor, Engine engine, MapChunk chunk)
    {
        //alter sound effect based on distance to player
        engine.playSpatialSoundEffect("data/sound/sfx/door.wav", true, owner.getX(), owner.getY(), owner.getZ(), 100, 1);
        owner.setTileNum('|');
        chunk.setNotBlocked(owner.getX(), owner.getY(), owner.getZ(), false);
        chunk.setNotBlocksLight(owner.getX(), owner.getY(), owner.getZ(), true);
        removeAction("action1");
        ActionDelegate d = new ActionDelegate(closeDoor);
        setAction("action1", "Close Door", d);
    }

    private void closeDoor(Creature owner, Creature interactor, Engine engine, MapChunk chunk)
    {
        owner.setTileNum((char)20);
        chunk.setBlocked(owner.getX(), owner.getY(), owner.getZ(), false);
        chunk.setBlocksLight(owner.getX(), owner.getY(), owner.getZ(), true);
        removeAction("action1");
    }
}

And here is code utilizing the first Action class I showed.

Hitting move up and move left at the same time lets me move diagonally. We need to use some kind of action system that atomizes, for example, the bite and move actions. We can loop through the list of Actions (using the base class) and run their doAction method; in C#, as long as doAction is a virtual method, you don’t even need to know the type when you call doAction on the base class, it will do the child class’s variation of it. You use this to create a list of actions based on a list of strings, then loop through and run doAction, to create complex actions that the player can define.

Here is some example possible code. The key is, I can also do:

Action[] theActionList = { new EjectAgent("ocytoxyNT"), new MoveUpAction() };
foreach (Action a in theActionList)
{
    a.doAction(player, player, engine, this.mapChunk);
}

2 Likes

This is pretty much what I was getting at. The hotkeys used in the cell stage would be replaced with action analogues that the player would be able to modify using your system if they so chose, in addition to making completely custom actions. I didn’t mean to imply that the actions would be entirely hardcoded and independent from your proposed system.

My concern is that requiring the player to make their own actions would steepen the learning curve quite a bit, so I was simply trying to propose a potential solution: granting players a basic “toolset” that could reasonably cover their standard needs from the get-go.

3 Likes

Awesome, sorry for all the edits; I was trying to add a good breadth of examples. :slight_smile: You can actually see that the first example accesses the interactable and runs the delegate from the second example. It’s like action-ception, and it means I can close and open the doors without knowing anything about the door’s state, or even that it is a door. It could just as easily be me kicking down a bookshelf looking for a book when I walk into it, because the code doesn’t care what it is, just that there is some action pegged to “action1” in the thing I am walking into. All because of the wonder of action systems. And it is really easy to get the AI using these actions, because they call the same exact actions I do, which means the AI can also open doors without any extra code at all, because it is running the same action the player is. That basically eliminates redundancy and bloat, as long as you stick to the pattern.

3 Likes