We may soon enter a world where NPCs do not simply execute lines of code, but form, or at least appear to form, complex relationships with other NPCs even when we are away.
Game NPC AI has so far been fairly rudimentary: enemies and followers react to the environment in real time and make decisions based on your playstyle, but a recent paper (PDF) by researchers at Stanford University and Google Research describes an impressive generative agent architecture.
The paper, titled "Generative Agents: Interactive Simulacra of Human Behavior," describes a clever way to make NPCs behave in believable, spontaneous ways.
To test the capabilities of the generative framework they designed, the researchers created a small open world similar to "The Sims." After seeding each NPC with a personality description, they dropped the little agents into the world and watched them interact with each other in surprisingly complex ways.
The "intention" given to one NPC was to plan a Valentine's Day party, which she did. She ran around inviting NPC friends, and the day before the party she and her best friend decorated the venue. The friend's backstory included a secret crush on another character, and she invited his girlfriend to the party.
"The social behaviors of spreading rumors, decorating, inviting each other, arriving at the party, and interacting at the party were initiated by the agent architecture," the paper states.
The generative architecture allows NPCs not only to "perceive" their surroundings, but also to build a believable understanding of their world through "a comprehensive record of the agent's experience," called a memory stream. By reflecting on retrieved memories and planning their actions accordingly, they can form complex relationships and coordinate in large groups.
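To make the memory stream concrete: the retrieval step the paper describes scores each memory on three factors, namely recency (an exponential decay since the memory was last accessed), importance (a 1-10 rating the language model assigns when the memory is stored), and relevance (embedding similarity to the current situation), and surfaces the top-scoring entries. Here's a minimal Python sketch of that scoring; the class and function names are illustrative, not taken from the paper's codebase.

```python
import math
import time
from dataclasses import dataclass, field


@dataclass
class Memory:
    text: str                # natural-language observation, e.g. "Isabella is decorating the cafe"
    importance: float        # 1-10; in the paper, the LLM assigns this score when the memory is stored
    embedding: list[float]   # vector embedding of `text`, from any embedding model
    last_accessed: float = field(default_factory=time.time)


def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def retrieve(stream: list[Memory], query_embedding: list[float],
             k: int = 5, decay: float = 0.995) -> list[Memory]:
    """Return the k memories that best balance recency, importance, and relevance."""
    now = time.time()

    def score(m: Memory) -> float:
        hours_idle = (now - m.last_accessed) / 3600
        recency = decay ** hours_idle                      # exponential decay since last access
        importance = m.importance / 10                     # normalize the 1-10 rating to [0, 1]
        relevance = cosine(m.embedding, query_embedding)   # similarity to the current situation
        return recency + importance + relevance            # the paper weights all three equally

    top = sorted(stream, key=score, reverse=True)[:k]
    for m in top:
        m.last_accessed = now   # retrieving a memory refreshes its recency
    return top
```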
Underlying this system is a large language model (think ChatGPT) that generates the actions. As the paper puts it, "large language models encode a wide range of human behavior" from their training data, and so can be used to narrate believable actions and conversations. The researchers aren't claiming these are literally thinking agents; in a sense, the characters exist to narrate a reality, and they're prone to certain kinds of errors, like embellishment.
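The generation step, then, is essentially prompt assembly: a summary of who the agent is, its retrieved memories, and the current situation are composed into a prompt, and the model's reply is treated as the agent's next action or line of dialogue. A hedged sketch, assuming a generic call_llm function standing in for whatever chat-completion API you use (it is not part of the paper's code):

```python
def next_action(agent_summary: str, memories: list[str],
                situation: str, call_llm) -> str:
    """Assemble the agent's state into a prompt; the completion becomes its next action."""
    prompt = (
        agent_summary
        + "\n\nRelevant memories:\n"
        + "\n".join(f"- {m}" for m in memories)
        + f"\n\nCurrent situation: {situation}\n"
        + "In one sentence, what does the agent do next?"
    )
    return call_llm(prompt)


# Example with a stub model, so the sketch runs without an API key:
print(next_action(
    "Isabella Rodriguez is a cafe owner who loves hosting events.",
    ["Isabella is planning a Valentine's Day party at the cafe.",
     "Maria said she would help decorate."],
    "It is the afternoon before the party.",
    call_llm=lambda prompt: "Isabella starts decorating the cafe with Maria.",
))
```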
The paper points out that one of the main ethical concerns with such "believable proxies of human behavior" is the possibility of people falling in love with NPCs. Or, as they put it: "forming parasocial relationships with generative agents even when such relationships may not be appropriate."
"Despite being aware that the generating agents are computational entities, users may anthropomorphize them or attach human-like emotions to them," the paper warns.
The researchers' suggested mitigations include ensuring that NPCs "explicitly disclose their nature as computational entities," and that developers align the agents' values so they don't "behave in ways that are inappropriate given the context, such as responding to confessions of love."
Humans will pack-bond with literally anything, whether it's an AI that clearly discloses it isn't human or an inanimate object with an arrangement of grooves that resembles a face. (Believe it or not, this is already a problem.)
I say bring on the human-AI relationships, but that's a topic for another day.
If this keeps up, "emergent social dynamics" for game NPCs seem likely in the not-too-distant future. One day you'll re-enter the village you looted and find mothers in mourning clothes holding funerals for their loved ones. The chickens will remember every time you yelled "FUS" at them.
While you wait, you can watch a replay of the simulation demo that accompanies the paper.