AI

From OniGalore
Revision as of 22:29, 16 September 2011 by Loser (talk | contribs) (Filling in cross-links to sections)

Foreword

Motivational video: http://www.youtube.com/watch?v=8Okn7u_-oVs

An action game like Oni needs artificial-intelligence-driven characters ("bots") and some sort of interaction between these characters, and between character and world. The goal is to make the A.I. behave at least a bit like a human being, with behavior resembling that of real-life humans. That means giving the A.I. abilities such as moving through the game world believably, or seeing, hearing and reacting to events in the game world. Different games deal with these problems in various ways, but in this article we will take a look at Oni and its A.I.

This page serves two purposes: to give an overview of Oni's A.I. and to help modders with A.I. tweaking.

Pathfinding and movement - "I think I will consult a map for a while..."

File:Pathfinding grid.JPG
An example of a pathfinding grid.

Oni's A.I. has two ways of moving a character through the game world - pathfinding-based and "vector-based" movement (the latter is our working term; the proper one is unknown).

Pathfinding is used when an A.I.-driven character needs to travel from one place in the level to another. Examples of pathfinding-based A.I. movement are:

  • patrol paths - A.I. driven character moves in a pre-designed fashion, see PATR
  • alarm running - A.I. character is requested to go to given console and use it, see section Alarm behavior
  • pursuit of target - A.I. character loses direct sight of the enemy, see section Pursuit of enemy
  • running to weapons in order to pick them up, see section Combat behavior
Pathfinding uses the A* (A-star) algorithm to plan a route for the A.I. within the game world.
Human players see the walls and obstacles between their current location and the goal location, so they can adapt their behavior thanks to the best computer known to man - the human brain. The A.I., on the other hand, has to be told where it can go and where it cannot (wall, pit, obstacle etc.). Since A* is a graph search algorithm, it needs some sort of graph conveniently mapping the environment for the A.I.'s purposes. That graph is the pathfinding grid, which can be seen in-game by setting ai2_showgrids = 1 (see picture on the right).
If the A.I. has to travel through the level and pathfinding is used, a "mark" is set at the final destination and the grid mentioned above is consulted to get the shortest available route. After this process, the A.I. is continually fed with "traversal nodes" - temporary marks set in the AKVAs the A.I. is currently running through. The A.I. keeps running directly towards these marks, so when it needs to turn a corner, a number of these nodes are generated to make it round the corner.
Pathfinding grids are tied to AKVAs and are loaded into memory only while they serve some purpose (a character is inside the AKVA volume). More detailed info about the various types of pathfinding grid tiles and their effects on pathfinding can be found at OBD talk:AKVA/0x24.
There used to be a short modding initiative to alter pathfinding grids manually, but it died off due to the extreme amount of labor required. Currently, OniSplit can import levels and generate pathfinding grids.
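Both debugging overlays mentioned in this article are toggled through console variables. Assuming a developer-enabled build of Oni, they can be entered in the in-game console (or a BSL script) like this:

```
ai2_showgrids = 1    # draw the pathfinding grids in-game
ai2_showsounds = 1   # visualize sound spheres (see the senses section)
```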


Modding hints: even when the A.I. has to turn sharp corners, pathfinding still picks the shortest possible route. That usually results in the A.I. having trouble turning corners at full running/sprinting speed - it stops at the corner, turns, runs past the corner, turns again to finish the maneuver, and continues on its way. This is caused by the low rotation speed assigned to the character in its ONCC file.
In the ONCC xml file there is <RotationSpeed> under <AIConstants>. The default is 1, but 1.5 works much better without causing any visible negative influence on the game. A rotation speed of 2.0 is the recommended maximum: the game's pathfinding is designed for speed 1.0, and with too high a rotation rate new pathfinding problems appear instead of the old ones being erased (mainly the A.I. colliding with obstacles).
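As a sketch, the relevant fragment of an ONCC xml file (as exported by OniSplit) might look like this; the surrounding structure is abbreviated, and other children of <AIConstants> are omitted:

```xml
<AIConstants>
    <!-- default is 1.0; 1.5 turns corners noticeably better,
         values above 2.0 may introduce new pathfinding collisions -->
    <RotationSpeed>1.5</RotationSpeed>
</AIConstants>
```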


Vector movement (our working term) is used when:

  • A.I. driven character enters close combat state and goes into melee
  • A.I. driven character is dodging gunfire, be it firingspreads or projectiles
  • A.I. driven character is requested to run away from enemy by special setting in CMBT profile
In this mode the A.I. is not given an exact destination to move to. Instead it is given a "desire" to move in some direction. This mode still uses the pathfinding grid, but in a slightly different way than pathfinding does, so glitches occur - mainly unexpected collisions and ignored tiles (the A.I. runs over an edge and falls to its death).


Unexpected collision movement happens when the A.I. runs on its own towards some destination and suddenly hits an obstacle. Ideally that should never happen, as the pathfinding grid should prevent it, but still... Anyway, when a collision happens, the game decides what to do according to the currently selected way of moving the character:

  • Pathfinding - the game makes the A.I. take a few steps back, rotate a bit, and then lets it continue its journey. There is a limit to how many unexpected collisions an A.I. can undergo before it gives up on its destination as unreachable.
  • Vector - according to the angle between the A.I.'s facing and the obstacle, the game decides either to let the character slide along the obstacle or to stop it completely until the direction of the A.I.'s movement vector changes.


Cognitive abilities of A.I. characters ... "COME TO YOUR SENSES!"

File:VisionFields.JPG
An example of Blue Striker's central and peripheral vision.
File:SoundSpheres.JPG
Three examples of sound spheres.

When talking about life-like A.I., the senses have to be taken into account: sight, hearing, taste, smell, touch. Unfortunately, Oni emulates only sight and hearing.

Sight is important. Without it, the only way for the A.I. to recognize an enemy is to be attacked by that enemy first. Unlike the majority of action games, Oni's A.I. characters have two vision fields:

  • Central - enemy seen by central vision = enemy recognized and attacked.
  • Peripheral - enemy seen by peripheral vision = alert raised to low and the A.I. goes into pursuit mode with the given enemy, see section Pursuit of enemy


Modding hints: parameters of the vision fields are stored in ONCC, under <VisionConstants>. The central vision field's distance has to be greater than the peripheral vision field's distance, otherwise peripheral vision detection will be broken (it will not detect anything). Since peripheral vision makes the A.I. pursue the enemy but not attack it, it can be set quite large in order to surprise lousy stealth players. But be warned - nobody likes a cheating A.I. ^_^.
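Schematically, this looks roughly as follows in the ONCC xml. Note that only <VisionConstants> is a tag name confirmed above; the child element names below are placeholders for the real ones found in an exported file, and the distances are example values:

```xml
<VisionConstants>
    <!-- placeholder element names; the central distance
         must exceed the peripheral distance -->
    <CentralVisionDistance>40</CentralVisionDistance>
    <PeripheralVisionDistance>30</PeripheralVisionDistance>
</VisionConstants>
```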


Hearing is crucial in Oni. The majority of A.I. character interaction is done via the sound system. Oni recognizes these types of sound (in brackets: the visualization color when ai2_showsounds is set to 1):

  • Unimportant (blue) - is ignored to some extent if the A.I.'s alert level is lull, but after some period of time the A.I. will register it.
  • Interesting (green) - causes alert rise to low level.
  • Danger (yellow) - causes alert rise to medium level.
  • Melee (orange) - causes alert rise to medium level but A.I. reacts a bit differently than in case of Danger sound.
  • Gunfire (red) - causes alert rise to high level.
For more info about A.I. alert levels, see section Reactions on stimuli. These sound types are used by impact effects (ONIE), and the danger sound type can be set in a particle as a "danger radius".


Modding hints: the sound system and ONIE are mighty tools if a modder knows how to use them. Even doors can have a sound sphere of any of the types listed above attached, so a modder can create doors which draw the attention (sound type interesting) of nearby A.I. characters when manipulated.
Another example is a workaround to get the A.I. alerted by dead bodies:
Set the death particle in ONCC to a special custom-made one - this particle will keep emitting a custom impact effect for a given time. That custom impact effect will have no sound or visual effect attached, but will be set to be audible to the A.I. as gunfire within a large radius (200 units).


Reactions on stimuli a.k.a. "What was that?!"

In real life, each human being is unique in their reactions. And even for one person, reactions differ - a man does not react the same way when calm as when nervous or suspicious. And what about Oni?

In Oni, the reaction to a stimulus is based on the type of stimulus (see previous section) and on the A.I.'s alert level. Oni A.I. characters have these alert levels:

  • Lull
  • Low
  • Medium
  • High
  • Combat
The level of alert can be set via BSL as ai2_setalert ai_name/chr_index desired_level_of_alert. Alert levels play the role of "behavior modifiers": the A.I. reacts to seen or heard allies/enemies according to its own alert level. The alert level can be increased by:
  • hearing a sound which causes a rise of the alert level (for detailed info read the previous section)
  • a special rise of the alert level from Lull to Low happens when an enemy is seen by peripheral vision or when an enemy causes too many unimportant sounds
  • a special rise of the alert level to Combat happens when the A.I. is hit by the enemy, when an alarm is triggered (see OBD:BINA/OBJC/CONS), or via the script functions ai2_tripalarm, ai2_makeaware or ai2_attack (see BSL:Functions).
The alert level also affects the movement mode of A.I.-driven characters. There are six movement modes - creep, walk, walk_noaim, run, run_noaim, by_alert_level. The "_noaim" movement modes force an armed A.I. character not to aim its weapon (so the A.I. walks or runs with the gun in hand but does not aim it). The movement mode can be forced via the BSL command ai2_setmovementmode. The alert level affects the movement mode only if the movement mode is set to by_alert_level. In that case:
  • Lull and Low alerts use walk_noaim
  • Medium, High and Combat use run
If a patrol path (PATR) is assigned to a character, it can override the movement mode for patrol path purposes and, for example, force a Lull-alert character to run with an aimed weapon. However, when the A.I. is disturbed and starts pursuing an ally/enemy, the movement mode is again chosen by the corresponding alert level (provided the movement mode is set to by_alert_level).
The level of alert can decrease via BSL ai2_setalert or over time; the exact location of the timers is unknown (maybe hardcoded).
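As an illustration, the two commands above could be combined in a script like this (the character name guard_1 is hypothetical, and the exact spelling of the alert-level argument should be checked against BSL:Functions):

```
# raise guard_1 to medium alert; with movement mode
# by_alert_level, he now switches from walk_noaim to run
ai2_setalert guard_1 medium

# alternatively, force a movement mode regardless of alert
ai2_setmovementmode guard_1 creep
```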


The next component of A.I. reaction logic is its awareness of an ally/enemy. Oni distinguishes between ally (friendlythreat) and enemy (hostilethreat). Oni A.I. characters always know whether a disturbance was caused by an ally or an enemy. That means if the player is playing as standard Konoko and shoots, and some Striker gets alerted, the Striker knows which character (from CHAR) shot and which team that character belongs to, and reacts accordingly (but more about that later). Oni defines four levels of awareness:

friendlythreat
  • Definite - the A.I. is 100% sure where the ally stands and ignores him. Achieved by seeing the ally with the central vision field, by being hit by his gun by accident, or by being told of his presence via the BSL commands ai2_makeaware, ai2_tripalarm or ai2_attack.
  • Strong - the A.I. has a strong awareness of the ally's presence, but cannot pinpoint his exact location. Caused by all the sound types described in the previous section. Also, if the ally leaves the central vision field and <FriendlyThreatDefiniteTimer> runs out, the A.I. decreases the level of awareness from definite to strong.
  • Weak - the A.I. has a weak awareness of the character's presence. Caused either by seeing the ally with peripheral vision or by <FriendlyThreatStrongTimer> running out.
  • Forgotten - the A.I. has encountered some ally, but over time forgot about his presence. <FriendlyThreatWeakTimer> ran out.
hostilethreat
  • Definite - the A.I. is 100% sure where the enemy stands and will go attack her/him. Achieved by seeing the enemy with the central vision field, by being hit by her/him (or her/his gun), or by being told of her/his presence via the BSL commands ai2_makeaware, ai2_tripalarm or ai2_attack.
  • Strong - the A.I. has a strong awareness of the enemy's presence, but cannot pinpoint her/his exact location. Caused by all the sound types described in the previous section. If the enemy manages to get out of the central vision field and <HostileThreatDefiniteTimer> runs out, the A.I. decreases the level of awareness from definite to strong. If the A.I. is allowed to investigate, the corresponding pursuit behavior is executed (more about this later).
  • Weak - the A.I. has a weak awareness of the enemy's presence. Caused either by seeing the enemy with peripheral vision or by <HostileThreatStrongTimer> running out. If the A.I. is allowed to investigate, the corresponding pursuit behavior is executed (more about this later).
  • Forgotten - the A.I. has encountered some enemy, but over time forgot about her/his presence. <EnemyThreatWeakTimer> ran out.
When an A.I. character is freshly spawned, it has had no contact with other characters (tabula rasa). Through sounds and vision, the A.I. learns about the presence of other characters and reacts to them according to team affiliation, alert level and awareness. Even when an A.I. character "forgets" an ally/enemy, it does not completely erase their existence. For example, the startle behavior shown when an A.I. character sees an enemy is played only once per particular enemy. Even after forgetting, the A.I. won't play the startle animation the next time it sees that enemy, but goes directly to attacking him.
Friendly warning: the awareness level "Forgotten" IS NOT equal to the ai2_forget command. Ai2_forget clears the A.I. character's memory back to the tabula rasa state.


Pursuit of enemy alias "I will find you..."

Everything we need was explained in the sections above, so let's take a look at pursuit behavior. Pursuit behavior in Oni emulates the situation where a man knows "somebody is here", but does not see anybody.

Oni deals with this problem by using pursuit behaviors. In Oni there are following types of pursuit behavior:

  • None - simply nothing
  • Forget - A.I. goes into forgotten awareness with this enemy
  • GoTo - the A.I. moves to the source of the disturbance and executes a timer-based 600-frame Look behavior (600 frames = 10 seconds)
  • Wait - the A.I. simply stands and waits
  • Look - A.I. rotates on spot and looks all around
  • Move - unimplemented, probably was meant to be similar to Look, but with character randomly moving around
  • Hunt - unimplemented, probably was meant to make A.I. purposefully look around for the enemy
  • Glance - A.I. does not rotate whole body, only rotates its head

Plus there are three types of lost behavior (used when the A.I. had definite contact with the enemy but the enemy managed to get away):

  • ReturnToJob - the A.I. returns to its job
  • KeepLooking - the A.I. does not return to its job but keeps using the last used pursuit behavior
  • FindAlarm - the A.I. tries to switch to alarm behavior (see section Alarm behavior); if it does not succeed, the A.I. returns to its job.


In Oni, a character which is spawned in the level is a character profile from CHAR file. This CHAR profile contains links to:

  • ONCC file - character class, e.g. Konoko or Comguy_1
  • CMBT file - combat profile for this character
  • MELE file - melee profile for this character
  • PATR file - OPTIONAL, patrol path
  • NEUT file - OPTIONAL, neutral behavior (for example civilians giving player some powerup)
Pursuit behavior is executed only within the pursuit range, which is the parameter <PursuitDistance> in the CMBT profile attached to the A.I. character. If anything suspicious happens outside of this range, the A.I. only looks in the direction of the disturbance for a short time, then ignores it and returns to its job.
Pursuit behavior greatly depends on the level of alert. There are four parameters regarding alert levels in the CHAR profile:
  • Initial - A.I. is spawned with this level of alert
  • Minimal - minimal alert level this A.I. can have
  • JobStart - Alert level of the A.I. when it starts some job (most typically a patrol)
  • Investigate - from this alert level upwards, the A.I. will tend to pursue any suspicious sound it hears or any hostile peripheral contact it makes.


Also in CHAR there are five fields regarding pursuit behavior:
  • Lull/Low alert level strong awareness behavior - in xml labeled as <StrongUnseen>
  • Lull/Low alert level weak awareness behavior - in xml labeled as <WeakUnseen>
  • Medium/High/Combat alert level strong awareness behavior - in xml labeled as <StrongSeen>
  • Medium/High/Combat alert level weak awareness behavior - in xml labeled as <WeakSeen>
  • Lost - behavior when A.I. had definite contact (confirmed enemy) but lost it, in xml labeled as <Lost>
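Put together, the pursuit section of a CHAR profile might look roughly like this in xml. The five tag names are the ones listed above; the enclosing element name is a placeholder, and the values are just one sensible combination drawn from the behaviors described earlier:

```xml
<!-- enclosing element name is hypothetical -->
<Pursuit>
    <StrongUnseen>GoTo</StrongUnseen>
    <WeakUnseen>Glance</WeakUnseen>
    <StrongSeen>GoTo</StrongSeen>
    <WeakSeen>Look</WeakSeen>
    <Lost>ReturnToJob</Lost>
</Pursuit>
```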


If, inside the pursuit range, an A.I. character registers a sound or peripheral vision contact while its alert level is below the level set in the Investigate parameter, this A.I. will only look in the direction of the disturbance for a short time, then ignore it and return to its job. However, if the A.I.'s alert level is high enough, the character goes into pursuit mode. That means the A.I. gains strong awareness of the enemy (only weak in case of peripheral contact), goes to the source of the disturbance (the center of the sound sphere in case of a sound, or the spot where the enemy stood in case of peripheral eye contact) and performs pursuit mode Look for 10 seconds. After this initial Look mode expires, the A.I. starts the pursuit behavior prescribed in its CHAR profile.
The length of each pursuit behavior execution depends on the HostileThreat timers (ONCC file) as described in the section above.



Modding hints: Pursuit mess and weird glitches

Watch this YouTube video.
Pursuit mode is quite buggy, so when altering this behavior, there are several important things a modder has to keep in mind.
  • Pursuit distance parameter in CMBT is important in deciding whether the character should be more a pursuer or more a guard.
  • Within the pursuit distance, peripheral vision pursuit runs in a league of its own. When an A.I. character sees an enemy with its peripheral vision, it always goes into weak awareness mode and either just glances in the direction of this enemy (if its alert level was below the investigate level) or moves to the spot and performs the weak investigation pursuit behavior (if its alert level was at the investigate level or higher).
  • Remember there are TWO sets of pursuit behavior:
    • strong and weak pursuit for lull/low alert levels. In xml these are called <StrongUnseen> and <WeakUnseen>; those names simply reflect how many changes were made to this part of the game.
    • strong and weak pursuit for medium/high/combat levels, in xml called <StrongSeen> and <WeakSeen>.
  • Of all available pursuit behaviors, GoTo, Look, Glance, Forget and Wait can be effectively used in the CHAR pursuit behavior parameters. Hunt and Move are not finished code-wise.
  • Remember that the whole pursuit behavior is glitched, and the parameters from CHAR are more often ignored than actually used. This glitch is somehow connected with sight. When the sight of A.I. characters is turned off via ai2_blind = 1, the whole pursuit mechanic works as it should. However, the moment an A.I. character is allowed to see, then when pursuit should be performed it gets:
    • bugged so that the pursuit values from CHAR are ignored if pursuit mode was called while the character was idle or moving on a patrol path. The A.I. character performs a basic GoTo + timer-based Look for the whole duration of the pursuit, for both strong and weak awareness. It looks so-so, but it is definitely a glitch.
    • bugged so that one GoTo + timer-based Look is called, but the timer for Look does not decrement, so the A.I. simply stands and stares. This happens when pursuit mode is called while the A.I. stands at a job location on a patrol path. And well... it looks really bad.
    • in roughly 10% of cases the pursuit performs correctly. It is quite random, but it looks like the A.I. has to be disturbed again while going for the source of the previous disturbance. And then Oni writes "pursuit mode Hunt not yet implemented" into the console ^_^.
  • The best way to deal with the pursuit sight issue is either to ignore it (so characters often get glitched) or to use a BSL script to create a small self-looping function where ai2_blind is set to 1 for at least one second (60 frames), then set to 0 again. This can cause funny moments with the A.I. completely ignoring an enemy right in front of it, but it silently fixes these pursuit problems.
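A sketch of that self-looping workaround in BSL might look as follows. The function name and the pause between passes are hypothetical; only the one-second blind window is the fix described above:

```
func void pursuit_fix(void)
{
    ai2_blind = 1     # blind all A.I. - pursuit logic resets correctly
    sleep 60          # keep them blind for one second (60 frames)
    ai2_blind = 0     # restore sight
    sleep 600         # hypothetical pause before the next pass
    fork pursuit_fix()  # loop by re-launching ourselves
}
```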


Alarm behavior - "She's everywhere!"

CONTROL CONSOLE.png
ALARM CONSOLE.png
DATA CONSOLE.png
Control Alarm Data

In order to make the A.I. look more human, it is a good idea to grant it the ability to interact with the world the same way the player does. In Oni that means, for example, giving the A.I. the option to use consoles.

BSL scripting provides a command to make an A.I. go and use a console - ai2_doalarm ai_name/index number_of_console. This way an A.I. can be told to use any console. But there is also a method to make an A.I. character use a console completely on its own.
In order to use those mechanics, there has to be a console which has the ALARM CONSOLE flag set (see OBD:BINA/OBJC/CONS). In XML this flag has the string IsAlarm. Such a console can then be used by A.I. characters without scripting.
Next, there are Alarm parameters in CMBT profiles which define A.I. character's alarm behavior:
  • Search distance - a perimeter around the character where engine checks for any console with ALARM CONSOLE flag. In xml it is named <SearchDistance>.
  • Ignore distance - a perimeter around A.I. character where this A.I. character (which is currently executing alarm behavior) acknowledges enemies. Enemies outside of this perimeter are ignored by the A.I. character. In xml it is named <EnemyIgnoreDistance>.
  • Attack distance - a perimeter around A.I. character where this A.I. character (which is currently executing alarm behavior) temporarily stops its run for the console and attacks enemies if they are within this range and A.I. character sees them with central vision field. In xml it is named <EnemyAttackDistance>.
  • Damage threshold - in xml named <DamageThreshold>. The time interval for which the A.I. character keeps awareness of an enemy who attacked it. If this enemy crosses the Attack distance perimeter, the A.I. temporarily stops its run for the alarm and immediately attacks this enemy (it does not have to see him with the central vision field).
  • Fight timer - in xml called <FightTimer>, duh. The number of frames for which the A.I. character should fight the enemy before attempting to resume its run for the alarm.
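Collected in one place, the alarm block of a CMBT profile might look like this in xml. The five tag names are the ones listed above; the enclosing element name is a placeholder and all values are examples only (distances in world units, timers in frames):

```xml
<!-- enclosing element name is hypothetical; values are examples -->
<Alarm>
    <SearchDistance>60</SearchDistance>
    <EnemyIgnoreDistance>40</EnemyIgnoreDistance>
    <EnemyAttackDistance>20</EnemyAttackDistance>
    <DamageThreshold>120</DamageThreshold>
    <FightTimer>240</FightTimer>
</Alarm>
```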


A couple of notes regarding target console:
  • The console must be in "activated" mode. If the console is in deactivated or used mode, the alarm behavior will not be executed.
  • The console must be directly accessible. Oni's A.I. cannot use consoles in order to open a path to its point of interest (the target console in this case). Still, "directly accessible" can mean all the way across the level ^_^.


The logic for alarm running is set; now how to trip it? There are three ways to make a character run for and use a console without BSL. In all of them the console must have the ALARM CONSOLE flag set and must lie within the Search distance perimeter:
  • in CMBT: "If no gun" behavior set to "Run for Alarm". In xml it is the RunForAlarm string inside <NoGunBehavior>. With this parameter set, the A.I. character will attempt to run for and use a console when it does not have a loaded weapon or spare clips in its inventory (if it has spare clips, it reloads and continues shooting). If there is no alarm console nearby, the A.I. will switch to Melee.
  • in CMBT: the range behaviors (Long, Medium, Short, MediumRetreat, LongRetreat) can be set to "Run for Alarm". DO NOT confuse this with the "If no gun" behavior above! This behavior is never used in retail Oni, and there is quite a good reason why: if there is no usable console within the Alarm search distance radius, the A.I. character simply stands and stares even when an enemy is visible in its central vision field. On top of that, even when a console is present, this method of invoking alarm behavior fights with the desire to go after the enemy, so the A.I. ends up acting silly.
  • in CHAR: Lost behavior "FindAlarm" - when an A.I. character makes definite contact with an enemy and the enemy then manages to escape, the A.I. executes its "Lost" behavior (see section Pursuit of enemy). This behavior is typically set to ReturnToJob, but FindAlarm works well with no known issues. If no alarm console is found within the search distance, the A.I. returns to its job.
As already mentioned, alarm behavior can also be tripped by the BSL command ai2_doalarm. In that case the alarm behavior is executed, but this time the console is set by the command instead of being looked for via the Search distance. Also, via ai2_doalarm an A.I. can be made to use any console, not only those with the ALARM CONSOLE flag.
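For example, a scripted alarm run could look like this (the character name is hypothetical; the console number must match one placed in the level):

```
# send guard_1 to run for console number 3 and use it
ai2_doalarm guard_1 3
```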


Modding hints: the ability of the A.I. to use consoles is an excellent tool for increasing the challenge.

  • Overall, alarm running can interfere with close combat and cause A.I. glitches. To get rid of this, set a melee override at close or medium range (see CMBT).
  • In BSL a modder can easily distinguish whether a console was used by the player or by an A.I., thanks to the chr_is_player(ai_name) function. All that is required is a BSL function triggered when the console is used. Here is an example where the console triggers a function called console1_used:
func void console1_used(string ai_name)
{
 if(chr_is_player(ai_name))
 {
  # some code in case the player used the console
 }
 else
 {
  # some code in case an A.I. used the console
 }
}
  • Another way of using the alarm mechanics is to create a feeling of cooperation - the player has to achieve something, and to do so needs an A.I.-driven sidekick who goes after the required console and uses it.
  • In an extreme case, an A.I. character can even be made to go through the level on its own, moving from one console to another. That can be used to create chase levels - the A.I. character has to use a certain number of consoles while the player is required to stop it from doing so. Thanks to the alarm behavior, the task of tripping alarm consoles can be fully completed by the A.I.; no scripting needed.


Combat behavior : "Hokey religions and ancient weapons..."