Snerble.VRC.TouchControls

Mod for VRChat that lets touch zones on avatars drive expression parameters.

I made this because one day I got to thinking about how avatars could have things like triggers on them that interact with the parameter system. Since everything goes through the parameter system, all of its usual limitations still apply, so there's essentially no harm in adding this. I then spent about two days prototyping the mod and another week fine-tuning it.

I don't really like working with this system since it's so awkward to add the probes and sensors to avatars. The only way to get this data into the game (without sneaking scripts past the avatar validation system) is to put all the data in the object names, which already complicates things since readability goes to shit. I made a bunch of Unity editor scripts that add sensor and probe components and help configure everything, but they have to be removed before you upload the avatar, so that sucks. I also didn't bother making it so that adding the script to an existing sensor object re-initializes it from the existing data; instead it just wipes the object and resets it to default values.
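
To make the name-encoding idea concrete, here is a minimal sketch of how a sensor's settings could be packed into and parsed back out of a GameObject name. The "TC_Sensor" prefix, the '|' separator, and the field layout are assumptions for illustration only; they are not the actual format this mod uses.

```csharp
using System.Globalization;
using UnityEngine;

// Hypothetical sketch: encode a sensor's settings in its GameObject name so they
// survive avatar upload without any custom scripts. The "TC_Sensor" prefix, the
// '|' separator and the field order are illustrative assumptions, not the mod's
// actual naming scheme.
public readonly struct TouchSensorData
{
    public readonly string Parameter; // expression parameter to drive
    public readonly float Radius;     // sensor radius in meters

    public TouchSensorData(string parameter, float radius)
    {
        Parameter = parameter;
        Radius = radius;
    }

    // Pack the settings into an object name, e.g. "TC_Sensor|HeadPat|0.08".
    public string ToObjectName() =>
        string.Join("|", "TC_Sensor", Parameter,
            Radius.ToString(CultureInfo.InvariantCulture));

    // Try to recover the settings from an object name at runtime.
    public static bool TryParse(GameObject obj, out TouchSensorData data)
    {
        data = default;
        string[] parts = obj.name.Split('|');
        if (parts.Length != 3 || parts[0] != "TC_Sensor")
            return false;
        if (!float.TryParse(parts[2], NumberStyles.Float,
                CultureInfo.InvariantCulture, out float radius))
            return false;
        data = new TouchSensorData(parts[1], radius);
        return true;
    }
}
```

The upside of a scheme like this is that the configuration rides along with the avatar through the normal upload pipeline; the downside is exactly the readability problem described above.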

Future plans

None. With the recent announcement of Avatar Dynamics, this system will be essentially obsolete. I mean, when I read about "buttons on avatars" I just yelled "holy fucking shit", because that is exactly what I wanted to achieve with this mod. Originally I was going to make this system interact with colliders, but that is exactly how Avatar Dynamics are going to work, so yeah, this project is dead in the water.

Not that I am complaining though! I'm glad this stuff is being implemented by the VRChat devs. I once heard that no form of scripting would ever be added to avatars, and what frustrated me was that avatars had zero interactivity. That's one of the reasons the Dynamic Penetration System is quite popular; there's a real demand for avatar interactivity. However, I'm not sure Avatar Dynamics will have 'sensors' quite like mine. I suspect Avatar Dynamics will be more trigger-based than parameter-based, which pretty much limits it to toggle effects. This mod, on the other hand, also supports float parameters that are driven by how much the sensors and probes overlap. If you made this work across avatars, you could imagine it somewhat replacing the Dynamic Penetration System. That being said, I personally haven't used this mod for anything other than boolean toggles.
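
As a rough illustration of how a float parameter could be driven by overlap, the sketch below maps the distance between a probe and a sensor to a 0–1 value with a linear falloff. The falloff curve and the names used here are assumptions for the example, not the mod's actual math.

```csharp
using UnityEngine;

// Hypothetical sketch: derive a 0-1 float from how deeply a probe sits inside a
// spherical sensor. The linear falloff is an assumption; the mod's real curve
// and update loop may differ.
public static class OverlapDriver
{
    // Returns 1 when the probe is at the sensor's center and 0 at or beyond its radius.
    public static float ComputeOverlap(Vector3 probePosition, Vector3 sensorPosition, float sensorRadius)
    {
        float distance = Vector3.Distance(probePosition, sensorPosition);
        return Mathf.Clamp01(1f - distance / sensorRadius);
    }
}
```

A value like this would then be written into the avatar's float expression parameter each frame, while a boolean toggle only needs a simple threshold check.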

Whatever the case may be, I simply hope that the VRChat developers will add more features to avatars. Although there's a lot you can do with SDK3, many things are unnecessarily complicated simply because the only way to add any behaviour is through animations. The Unity animation system is dreadfully painful to work with; my biggest gripe is that sub-state machines are purely visual. If they behaved like regular animation states on the outside but like state machines on the inside, so many things could be simplified, most notably a huge reduction in transition conditions (and I hate setting up conditions with how the Unity UI works). Working with sounds is something that absolutely has to be simplified; playing one-off sounds is essential for avatars with firearms, and I tried working around this limitation but couldn't get a consistent result. The avatar SDK is in desperate need of more features if the devs are never going to add a form of scripting.