
So I’ve been spending a lot of time designing an abstracted ‘body’ system for my robits in Null Operator and have stumbled upon a difficulty that I really should have seen coming. Namely, that the ‘worlds’ we assemble in ready-made engines like Unity and Unreal are so exceptionally player-centric, so bolted down by assumptions about how one might want to utilize these engines’ systems, that every npc in our games is worse off for it.

What do I mean by this?

Let’s talk about a few senses for a moment. When building my brain/body model, one of the things I’ve been trying to emphasize is the sort of automatic physiological reactions an agent has to environmental stimuli. Changes in light level, sudden sounds, composition of atmosphere. In my side-prototype this was all abstracted; there was no connection to an actual ‘Unity scene’. Problems arose, however, when I started trying to migrate my work over to my Null Operator agents, which are situated in an actual 3-dimensional Unity scene. It turns out that getting even coarse data structures up and running to feed satisfying data into this system is proving to be a nightmare.
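For context, the abstracted version amounted to little more than a handful of stimulus channels that the body model sampled on a slow tick. A minimal sketch of that shape — every name here is hypothetical, not lifted from the actual prototype:

    // Hypothetical sketch of an abstracted stimulus channel -- none of
    // these names come from the real Null Operator codebase.
    public enum StimulusKind { Light, Sound, Atmosphere }

    public struct Stimulus
    {
        public StimulusKind Kind;
        public float Intensity;    // normalized 0..1
        public float DeltaPerSec;  // rate of change; sudden spikes drive reflex reactions
    }

    public interface ISenseChannel
    {
        // Polled by the body model at a low tick rate.
        Stimulus Sample();
    }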

Sound

How does one go about making an agent ‘hear’ things?

Well, if you’re using Unity, it’s quite an expensive, pain-in-the-ass proposition from what I can tell so far. For starters, we can’t use the seemingly-useful-by-its-name audio listener. We only get one of those, and it’s for the player (sensibly, as it’s for actually sending data to the sound system). But what if we don’t need that? What if we just want to get, say… the volume of a single sound effect as it’s playing? Not a coarse event that an audio source has started playing, but the actual volume of that clip, maybe broken down into some frequency ranges.

We can use GetSpectrumData (weirdly not shown on the AudioSource doc page, but it still exists here; fucking 4 years later there’s still no good documentation on its usage). This looks promising! Now from my own experience, this call is a touch pricey, but done at a low tick speed, intelligently on demand, we could theoretically create some sort of sound-in-the-environment manager to track relevant sound volumes for agents to be aware of.
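To make that concrete, here’s a rough sketch of what such a manager might look like: poll each registered AudioSource at a low tick rate and collapse its spectrum into a few coarse bands agents can query. The band split and the 10 Hz tick are arbitrary choices of mine, not anything from the (nonexistent) documentation:

    // Rough sketch of a "sound-in-the-environment manager": sample each
    // tracked AudioSource's spectrum on a slow tick and bucket it into
    // low/mid/high energy for agents to read.
    using System.Collections.Generic;
    using UnityEngine;

    public class SoundEnvironmentManager : MonoBehaviour
    {
        const int SpectrumSize = 256;  // must be a power of two, 64..8192
        readonly float[] spectrum = new float[SpectrumSize];

        public readonly List<AudioSource> Tracked = new List<AudioSource>();

        // Per-source low/mid/high energy, rebuilt each tick.
        public readonly Dictionary<AudioSource, Vector3> Bands =
            new Dictionary<AudioSource, Vector3>();

        void Start()
        {
            // 10 Hz is plenty for "did something just get loud near me?"
            InvokeRepeating("Tick", 0f, 0.1f);
        }

        void Tick()
        {
            foreach (var src in Tracked)
            {
                if (src == null || !src.isPlaying) continue;
                src.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

                float low = 0f, mid = 0f, high = 0f;
                for (int i = 0; i < SpectrumSize; i++)
                {
                    if (i < 16) low += spectrum[i];       // crude band split
                    else if (i < 96) mid += spectrum[i];
                    else high += spectrum[i];
                }
                Bands[src] = new Vector3(low, mid, high);
            }
        }
    }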

Now we just need to apply the rolloff curve from the audio source and compare our posi.. wait.

Fucking really.

None of the audiosource rolloff curve data is available via code?

Thanks Unity.

Now this isn’t insurmountable, but if we want our sound response to match what, say… the player hears, we’re going to have to roll our own custom data type, and more than likely hand-match the curves for every damn sound effect. A colossal waste of time.
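For illustration, that hand-rolled data type would probably end up as something like a parallel AnimationCurve per clip, evaluated against distance ourselves — with every curve eyeballed to match what the AudioSource is actually doing:

    // Sketch of a hand-rolled rolloff type. The attenuation curve has to
    // be manually matched, clip by clip, since the AudioSource's own
    // curve isn't readable from code.
    using UnityEngine;

    [CreateAssetMenu(menuName = "Audio/AgentRolloff")]
    public class AgentRolloff : ScriptableObject
    {
        public AudioClip clip;
        public float maxDistance = 50f;
        // x: 0 = at the source, 1 = at maxDistance; y: attenuation 0..1.
        public AnimationCurve attenuation = AnimationCurve.Linear(0f, 1f, 1f, 0f);

        public float VolumeAt(Vector3 sourcePos, Vector3 agentPos, float sourceVolume)
        {
            float t = Vector3.Distance(sourcePos, agentPos) / maxDistance;
            return sourceVolume * attenuation.Evaluate(Mathf.Clamp01(t));
        }
    }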

Smell

This one might sound a touch odd, as we as players can’t exactly smell our game environments, but I think it’s worth talking about. Why? Because a good percentage of typical game agents, be they animals or monsters, have a significantly better sense of smell than humans do. What would we need to model this?

Air is one of those things that we’re still quite far from being able to simulate in a game-space at anything even approaching realistic levels, as it would necessitate a volumetric data-set. What saddens me in this case is that in Unity we have an almost perfect data type that would be stunningly useful for coarse, inhomogeneous volumetric data sets, but it’s been hard-coded into a single use-case. Which one, you might ask?

Light Probes!

If you haven’t played with them, light probes in Unity are a FANTASTIC data structure for representing baked lighting for dynamic objects. Each probe contains a spherical harmonic, which basically boils down to a 3×9 array of floats (9 coefficients per color channel) that somehow get magically turned into a rough representation of illumination from all directions. I don’t even pretend to understand the math.

These probes are arrayed in an editor-time generated set of tetrahedra that allows for getting the interpolated values of roughly the nearest/most relevant 4 probes for a given point in space. They’re a bit of a pain to set up, but damn are they effective, and I have years of practice using them.
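The interpolation, at least, is queryable at runtime, which incidentally is enough for the ‘light level’ sense mentioned earlier. A sketch — the luminance-weighting shortcut here is my own assumption, not any official ‘light sense’ API:

    // Sample the baked probe field at an agent's position and treat the
    // coefficient-0 term of each SH color channel as a rough ambient
    // intensity. This interpretation is my own shortcut.
    using UnityEngine;
    using UnityEngine.Rendering;

    public static class AgentLightSense
    {
        public static float AmbientAt(Vector3 position)
        {
            SphericalHarmonicsL2 sh;
            LightProbes.GetInterpolatedProbe(position, null, out sh);
            // Rec. 709 luminance weights over the RGB ambient terms.
            return 0.2126f * sh[0, 0] + 0.7152f * sh[1, 0] + 0.0722f * sh[2, 0];
        }
    }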

The thing is, this type of structure (the probes, the connection graph, the caching of position to the last cell, etc.) would be TREMENDOUSLY useful for all sorts of volumetric data, both static and manipulable. But, like so many wonderful systems in Unity..

We have no access to it.

I toyed with the idea of just hijacking the probe constants for other purposes (and writing a baker for them), but while Unity exposes the actual probe array’s data, it exposes nothing about the connectome of the graph, so there’s no way to actually, say… pass data along probes, or crawl the graph in any meaningful way. Plus, giving up light probes is giving up one of the most powerful tools in the Unity lighting arsenal..
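To spell out exactly where the hijacking idea hits the wall — the probe array itself is reachable, the tetrahedral graph is not:

    // What we can touch: the raw probe positions and SH data.
    // What we can't: the tetrahedra connecting them.
    using UnityEngine;
    using UnityEngine.Rendering;

    public static class ProbeHijack
    {
        public static void Inspect()
        {
            LightProbes probes = LightmapSettings.lightProbes;
            Vector3[] positions = probes.positions;            // readable
            SphericalHarmonicsL2[] data = probes.bakedProbes;  // read/writable

            Debug.Log(positions.Length + " probes, " + data.Length + " SH sets");
            // ...but there's no adjacency/tetrahedra accessor, so crawling
            // or flood-filling "smell" along the graph is a dead end.
        }
    }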

A Point, Somewhere

I wanted to write up this little piece to address two main points that I would love to see discussed more.

Firstly, I would really love it if there was someone at Unity whose responsibility was to comb over the API, especially the parts that have been around for half a decade, with an eye for exposing things to code (even if only in a read-only capacity) that reeeally should have been exposed from the get-go. It’s immensely frustrating to have a data structure that can be set up just fine in the editor, and no interface to it once in play mode, even if that interface has to be an optimized, baked-down, alternate abstraction of that data. Something is better than nothing.

Secondly, I think it’s about time we started peeling like… fucking 5 percent of our energies, at least, away from the graphics arms race, and started trying to build some better base data structures for our agents to utilize. I’ve heard no end of talk about how ‘dumb’ agents feel in all of our rpgs. Maybe it’s time we admitted that a state-machine, a navmesh, and a bit of raycasting isn’t going to give us the sort of nuanced behavior that a truly interesting agent would exhibit.

I spoke last week about how a mind model needs some sort of body model to be truly compelling for an agent. So too does it need some sort of sensory spectrum to feed it. A huge chunk of what’s needed is already sitting there in these engines.

It just needs to be made a little more open.

Please?

4 Responses to “Situated Bodies”

  1. Kirk

    This is really interesting to me – a body implies a world. But as it turns out, there is no world, only the reflection of the world as the player sees it. The problem isn’t just that the NPCs are for you, and not for themselves, but that they can’t be for themselves, because they don’t have equal access to the same world as you.

    Maybe this is why stealth games are so compelling. The world, in some shallow way, acts as mediator between you and the NPCs. When they hear something, it means that something actually happened that had meaning to something other than you.

    • anton

      I think this is also the reason that the only ‘new’ game I’ve put a substantial amount of time into recently is NeoScavenger. The world speaks loudly in that game’s systems, to the point where weather and temperature meld with the game’s hunger/thirst/pain model in a way that affects player and agent equally. It’s the first game where I’ve tracked another scavenger for multiple in-game days after fighting them, and watched as they weakened, dropped their heavy sack and broken shoes, then finally came upon them having expired (likely from hunger and blood loss). That these exact same events have happened to me as a player (being followed by armed scavengers, wounded and unable to find food) has made it such a compelling space to explore.

  2. Ruber Eaglenest

    It seems you should support an engine like GODOT, which is completely open source.

    • anton

      The Godot project does seem interesting. Though this would require a long post (and one likely to get some folks quite mad at me), in general I don’t distinguish between the problems of interfacing with corporate software and those of most open source projects. They’re both fiefdoms at the end of the day, and as someone lacking the exceptionally low-level coding ability to modify engine code, I am left in the position of asker/beggar regardless of context.

