AI NPC Experimentation

Not since the transition into 3D graphics have I been so excited for a new technology.

My development is now completely intertwined with various AI assistants, namely ChatGPT and Leonardo AI, but when I started talking to my NPC characters I became enthralled. This is much more than a gimmick. Done right, it will be industry-shattering.

Recording voice with a webcam microphone, converting it to a text string, sending that to ChatGPT for a response, then converting the response to an AI voice, all in near real time. That is what I've been working on all through April this year, and I think I've cracked it.
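
That round trip can be sketched as a simple loop. Everything below is a hypothetical outline, not my actual implementation: the three service functions are stand-ins for whichever speech-to-text, chat, and text-to-speech providers you wire in.

```python
# Hypothetical sketch of the voice round trip: mic -> text -> chat -> voice.
# The three service functions are stubs standing in for real STT/LLM/TTS calls.

def speech_to_text(audio: bytes) -> str:
    """Placeholder: send mic audio to a speech-to-text service."""
    return "Hello, who are you?"  # stubbed transcription

def chat_response(prompt: str, history: list[dict]) -> str:
    """Placeholder: send the transcript to a chat model such as ChatGPT."""
    return "I'm the station engineer. What do you need?"  # stubbed reply

def text_to_speech(text: str) -> bytes:
    """Placeholder: synthesize the reply with an AI voice."""
    return text.encode("utf-8")  # stubbed audio buffer

def npc_turn(mic_audio: bytes, history: list[dict]) -> bytes:
    """One full turn: transcribe, respond, speak, and keep the history."""
    transcript = speech_to_text(mic_audio)
    history.append({"role": "user", "content": transcript})
    reply = chat_response(transcript, history)
    history.append({"role": "assistant", "content": reply})
    return text_to_speech(reply)
```

Keeping the growing `history` list is what lets the NPC hold a conversation rather than answer each line cold.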

The trick is to train the AI.

How the AI responds is determined by the prompts you give it. Obviously. But there is so much that can happen before and after the response. Namely:

  1. Events that occur in the world can trigger the AI NPC

  2. Memories can be implanted into the AI NPC

  3. Personality can be shaped for each AI NPC

  4. The AI NPC can be a character you can visually see, or a voice, like a narrator
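
One way the four ideas above can fit together, as a minimal sketch (the field and class names are my own for illustration, not from the project), is to rebuild the system prompt from the NPC's state before every exchange:

```python
from dataclasses import dataclass, field

@dataclass
class AINPC:
    """Minimal sketch: an NPC whose system prompt is rebuilt from its state."""
    name: str
    personality: str                                         # 3. shaped per NPC
    memories: list[str] = field(default_factory=list)        # 2. implanted
    recent_events: list[str] = field(default_factory=list)   # 1. world triggers

    def system_prompt(self) -> str:
        """Fold personality, memories, and fresh events into one prompt."""
        parts = [f"You are {self.name}. {self.personality}"]
        if self.memories:
            parts.append("You remember: " + "; ".join(self.memories))
        if self.recent_events:
            parts.append("Just now: " + "; ".join(self.recent_events))
        return "\n".join(parts)

# A voice-only character (4.) works the same way; it just has no model on screen.
narrator = AINPC(
    name="the Narrator",
    personality="You are an omniscient, dry-witted narrator.",
    memories=["The player ignored the locked door earlier."],
)
narrator.recent_events.append("The power just failed on deck 3.")
```

A world event is then just an append to `recent_events`, and an implanted memory is an entry the character was never actually told in conversation.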

Along with these experiments, the biggest breakthrough I had was on my project “Entanglement Theory”, where the player investigates a mysterious space station with a group of AI NPCs. In this simulation, I was able to add group AI conversations and dynamics by applying a series of pre- and post-processing steps to the player’s input and the NPCs’ output.
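
To make the pre/post-processing idea concrete, here is a hedged sketch of one possible approach: the keyword-scoring heuristic and function names are my own invention, not the game's actual logic. Pre-processing picks which NPC should answer the player's line; post-processing tags the reply so the right voice and portrait play.

```python
def pick_speaker(player_line: str, npcs: dict[str, list[str]]) -> str:
    """Pre-processing: choose the NPC whose topic keywords best match
    the player's line. npcs maps each NPC name to its keywords."""
    words = set(player_line.lower().split())
    scores = {name: len(words & set(keywords)) for name, keywords in npcs.items()}
    return max(scores, key=scores.get)  # ties fall to the first NPC listed

def postprocess(reply: str, speaker: str) -> str:
    """Post-processing: tag the reply with its speaker before voicing it."""
    return f"[{speaker}] {reply.strip()}"

crew = {
    "Engineer": ["reactor", "power", "engine"],
    "Doctor": ["injury", "medbay", "blood"],
}
speaker = pick_speaker("what happened to the reactor", crew)
line = postprocess("It overloaded.", speaker)
```

A real version would score with the model itself rather than keywords, but the shape is the same: wrap every exchange in game logic on both sides of the LLM call.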

And still, I feel I’m only scratching the surface.

Once players get over the “AI is taking jobs away from devs and therefore I’m boycotting any game that uses AI” thing, this kind of AI NPC interaction will become common, even standard, among all games.

I was skeptical of VR, even as an early contractor at Oculus, but this is something else entirely. This will take the medium much, much further.
