Sensor-driven behaviour tree for AI combat, navigation and tasks in Unity using C#

Purpose

The aim of the project is to create a dynamic AI system that produces realistic and engaging behaviour for the player to interact with and enjoy. Specifically, the project delivers a dynamic combat-behaviour system for NPCs within a 3D environment in Unity. It is initially planned for personal use and later expansion, with the option of publishing to the Unity Asset Store.

Research

As part of the project, research was carried out into the advantages and disadvantages of various design methodologies and technologies.

To meet the project requirements, the differences and benefits of hierarchical state machines, decision trees and behaviour trees were explored. From this research, it became clear that a behaviour tree was the right fit, paired with a layered (hierarchical) state machine for the agent states.
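To make the comparison concrete, the sketch below shows the core of a typical behaviour tree: nodes that report a status each tick, combined by composite nodes. This is a minimal illustration rather than the project's actual API; the type names (Node, NodeStatus, Sequence, Selector) are assumptions.

```csharp
// Minimal behaviour tree core: each node returns a status per tick,
// and composite nodes combine their children. Names are illustrative.
public enum NodeStatus { Success, Failure, Running }

public abstract class Node
{
    public abstract NodeStatus Tick();
}

// Sequence: runs children in order, stopping at the first non-success.
public class Sequence : Node
{
    private readonly Node[] children;
    public Sequence(params Node[] children) { this.children = children; }

    public override NodeStatus Tick()
    {
        foreach (var child in children)
        {
            var status = child.Tick();
            if (status != NodeStatus.Success)
                return status; // Failure or Running propagates up.
        }
        return NodeStatus.Success;
    }
}

// Selector: runs children in order, stopping at the first non-failure.
public class Selector : Node
{
    private readonly Node[] children;
    public Selector(params Node[] children) { this.children = children; }

    public override NodeStatus Tick()
    {
        foreach (var child in children)
        {
            var status = child.Tick();
            if (status != NodeStatus.Failure)
                return status;
        }
        return NodeStatus.Failure;
    }
}
```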

After performing some market research, it appeared that, although there is a range of more complex packages on the Unity Asset Store, the project's USP would be a broader template usable across many types of game: FPS, third- or first-person games, RPGs or even RTS games. Ideally, the code would be extensible to suit the needs of other projects implementing AI, behaviour-driven systems.

Progression

Currently, there is an agent that can navigate a NavMesh baked onto a surface of a scene. The agent has sensors that can detect whether an observed or heard agent is a friend, a foe or an unimportant entity in the environment. Further work has gone into a grouping system that collates agents into a group (a better word for this is still required), where each group has a Blackboard component. This blackboard contains the details of the group members, the maximum group size, a destination target and the states of the group.

The project currently uses a hierarchical state model. Each agent has an Alertness state describing its overall aggression or demeanour: relaxed, cautious, alert or engaged. This state generally alters the movement speed and the direction of travel. Beyond that, the project also has various behavioural states such as seek, travel, patrol, pursue, charge, group and guard, among many more. These behaviours dictate the actions performed by the agent and the group.
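A rough sketch of how the alertness layer and group blackboard described above might fit together; the class and field names here are hypothetical placeholders, not the project's actual components.

```csharp
using System.Collections.Generic;
using UnityEngine;

// The four alertness levels described above. Illustrative only.
public enum Alertness { Relaxed, Cautious, Alert, Engaged }

// Shared group blackboard: members, capacity, destination and state,
// so agents read cached data instead of recomputing it every frame.
public class GroupBlackboard : MonoBehaviour
{
    public int maxMembers = 5;
    public List<GameObject> members = new List<GameObject>();
    public Vector3 destination;
    public Alertness groupAlertness = Alertness.Relaxed;

    public bool HasSpace => members.Count < maxMembers;
}
```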

The project aims to present a realistic and dynamic NPC system that is unique on each playthrough (using non-deterministic composite tasks), by crafting a task system that prioritises jobs but may occasionally randomise a task slightly, giving each run a different feel. For example, an agent may approach an obstacle completely differently: perhaps choosing to go back and around a strong enemy agent rather than engaging, or deciding to look for nearby allies before returning to the enemy.
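One common way to achieve this kind of non-determinism in a behaviour tree is a composite that evaluates its children in a shuffled order. A minimal sketch, reusing the illustrative Node and NodeStatus types from earlier:

```csharp
using System;
using System.Linq;

// Random selector: tries children in a shuffled order each tick it is
// entered, so the same situation can resolve to different actions.
public class RandomSelector : Node
{
    private readonly Node[] children;
    private static readonly Random rng = new Random();

    public RandomSelector(params Node[] children) { this.children = children; }

    public override NodeStatus Tick()
    {
        // Visit a randomly ordered copy of the children, then behave
        // like a normal Selector: first non-failure wins.
        foreach (var child in children.OrderBy(_ => rng.Next()))
        {
            var status = child.Tick();
            if (status != NodeStatus.Failure)
                return status;
        }
        return NodeStatus.Failure;
    }
}
```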

Desired features + stretch goals:

  • Dynamic waypoint navigation
    • Ability to create a patrol circle around the agents, anywhere on the NavMesh
  • Prioritised task system
    • Assignment of tasks to groups, then breaking those down and distributing them to specific agents
    • Including: attacking a specific enemy (limited number at once), investigating a noise or grouping
  • Global grouping system
    • Limited number of agents per group
    • Create a new group when required
    • Delete empty group
    • Blackboard, to reduce the need for expensive calculations on each frame
    • Role-based slot assignment in groups
  • Sensor-based environment perception and information gathering
    • Communication bubble, vision and potentially hearing (a minimal vision sketch follows this list)
  • Behaviour-tree-influenced actions
  • Agent stats and features
    • Health, courage, speed, attack speed, strength, leadership, etc.
  • Multiple triggers
    • Visual/communication (enemy or ally)
    • Destination arrival
    • Agent death (enemy or ally)
    • Agent movement (enemy or ally, causing sound)
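
To illustrate the sensor-based vision mentioned in the feature list, here is a minimal field-of-view check. The radius, angle and layer-mask values are placeholder assumptions, not the project's tuned settings.

```csharp
using UnityEngine;

// Simple vision cone: an agent "sees" a target if it is within range,
// inside the view angle, and not blocked by an obstacle. Values are
// placeholders for illustration.
public class VisionSensor : MonoBehaviour
{
    public float viewRadius = 15f;
    public float viewAngle = 110f;
    public LayerMask obstacleMask;

    public bool CanSee(Transform target)
    {
        Vector3 toTarget = target.position - transform.position;
        if (toTarget.magnitude > viewRadius)
            return false;

        // Target must lie inside the forward-facing cone.
        if (Vector3.Angle(transform.forward, toTarget) > viewAngle * 0.5f)
            return false;

        // Line of sight: no obstacle between agent and target.
        return !Physics.Raycast(transform.position, toTarget.normalized,
                                toTarget.magnitude, obstacleMask);
    }
}
```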