Avatar 3.0

Tutorial written by .necro

Overview

Welcome to our Avatars 3.0 Tutorial.

You can easily jump around from section to section using the little menu on the right side of the page.

We hope you find this page helpful; we spent a lot of time putting it together.

If you have any feedback or questions, feel free to reach out to us through our Discord server.

The link is at the top of the page, or use invite code dpuxmxr.

We’re always open to constructive criticism!

Setting up the view position

The view position can now be adjusted manually with its transform, like any other game object, rather than by slowly tweaking values over and over like you're used to. I personally try to place the view position right between the eyes, just beneath the skin.

Visemes

You'll start by setting the Mode option to whichever applies to your model; for mine (and most models) this will be Viseme Blend Shapes. Then you just drag in whichever mesh has the blendshapes you want to use for lip sync, which in our case is the Body mesh.

Eye Tracking

The new eye tracking works a bit differently from what we're used to with CATS.
You still create the eye bones using CATS, but you no longer set up eye movement or blinking inside of CATS. Eye movement and blinking are now configured in Unity, and we can edit them in-engine with these sliders.

The reason for still creating the bones with CATS is to avoid potential eye orientation issues with pre-made models (like MMD and TDA).

Calm/Excited decides how often your eyes move around and blink. Turning this up to the max makes your eyes dart around and blink constantly; turning it all the way down makes your eyes move around much less and blink far less often.

Shy/Confident decides how often your eyes look at other people. Turning this up to the max means you'll almost always be staring at people, with your eyes wandering only a little around them; turning it all the way down makes you look away from people more often than at them.

Unless this is just a quirk of the few avatars I've tested with, these changes don't seem to show up for you in a mirror, but everyone else can see them.

Once we have those settings decided, we’ll define what our eye bones are and then we’ll set up the rotation states of the eyes. These are used for defining how far your eyes move and what directions they’ll look in.

Finally, we'll set up blinking in the Eyelids section. This has a rather noteworthy change: blinking now uses a single blendshape that blinks both eyes, rather than the two (one per eye) it required before.

There are now two more blendshape slots besides Blink, but we won't be using them for this model. They are generally used for positioning your eyelids when looking up or down. They aren't required, so we'll just leave them as -none-.

Animations

Now that we have everything else set up, we can start setting up Animations.

Animations live in the "Playable Layers" section of the avatar descriptor; these are all the different animators, each serving a different purpose.

Whenever you click any of these buttons, it replaces that default animator with whatever animator you decide to throw into it.

The animators activate the animation clips you're used to making, but based on certain conditions that you now get to define.

Gestures are now created in a pretty different way.

Before, we would make a single animation clip, and that clip would essentially replace the entire gesture you put it on. In a lot of cases this got annoying: if you just wanted a gesture that closed your eyes and smiled, you had to make that animation and also recreate the hand pose you were overriding. Otherwise your hand would make an awkward claw pose whenever you closed your eyes and smiled.

Now you can actually tell an animator to activate an animation upon making a gesture (or other triggers), rather than only being able to replace what the entire gesture did as a whole. (In other words, the animation bound to the hand does NOT replace the hand's pose.)

Another reason for this change is that there is now an actual priority system in place when creating animations for your avatar, and animations placed under specific layers can now be seen in mirrors, unlike before.

So those are kind of the reasons for all of these different “playable layers”.
Now you make animation clips and put them under the appropriate layer, and while we won’t be using all the different layers for this tutorial, I’ll quickly go over their purposes and what they do for future reference.

Animation Layers

Base is used for animations like walking, running, jumping, crouching, and crawling. Generally you wouldn’t change this unless you’re remaking all of the above animations.

Additive is used for animations that are like add-ons to animations in the Base layer, so for example, a breathing animation that slightly shifts your chest up and down, or maybe a slight constant head bobbing animation.
Additives are specifically only for humanoid rig bones, so things like doing an ear twitch animation for some fox ears on an avatar is not what this layer is for.

Gestures are fairly self-explanatory: this is where you put your animations for hand gestures, though you can also animate other limbs here. When animating under this layer, we create an Avatar Mask object in the Project tab that defines which parts of the body we want to animate, then use it for that animation. That way, while our animation plays, the rest of the body keeps animating: anything animated under this layer leaves the Additive and Base layer animations running. You would also use this layer for non-humanoid bone animations, like animating your ears. We'll cover this more later in the tutorial.

Action is for animations that completely override everything else. Think of these like how Emotes used to work: you press the button and all your current animations stop while it does its thing.

FX is where a lot of the cooler animation stuff resides.

This is where you would animate things like your blendshapes, activating gameobjects, changing materials, particle systems, all of that stuff. You do not animate your hands or anything like that in this layer.

All bone movement is done on the other layers, NOT this one.

The last 3 layers are completely undocumented as of writing this, so I’m unsure of their purpose. I have not personally messed around with them yet.

Expressions

You may have noticed these new additions to the descriptor as well, and been confused about what they're for.

These are going to be required for the next few steps, so we’ll start by creating these objects in our Project tab and I’ll explain afterwards.

The ExpressionsMenu object is used for making your very own custom menu system for your avatar that can be used for activating/deactivating different animations you choose.

You can create your own sub menus inside of the main expressions menu, and create even more buttons inside of those menus.

The ExpressionParameters object is used for defining values that you can use in animators that you can change in the ExpressionsMenu. You can then animate things based on what their value is currently at.

So, for example, we can create an int (short for integer), then make a button that sets that int to 5, then make an animation that turns on whenever the int is at 5.
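To make the idea concrete, here's a rough Python sketch of that flow. This is pseudocode imitating the behavior, NOT VRChat's actual API; the parameter name "MyInt" and the state names are made up for illustration.

```python
# A menu button writes a value into a named parameter, and the
# animator checks that value to decide whether an animation plays.

parameters = {"MyInt": 0}  # all custom parameters start at 0

def press_button():
    # the button's "Value" field is set to 5
    parameters["MyInt"] = 5

def animator_state():
    # transition condition: MyInt Equals 5
    return "smile animation" if parameters["MyInt"] == 5 else "idle"

press_button()
print(animator_state())  # smile animation
```

That's the whole trick behind most of what follows: menus change parameters, animators react to parameters.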

Animating Blendshapes

First things first: duplicate your model, just like the old days.

Turn off your old avatar just to make sure you don’t accidentally animate the wrong one.

Make sure you have your model selected and open the animation tab, create a new animation clip, and name it whatever you want.

Now when we make this animation, we no longer need to include the hand gesture; we only need it to animate our blendshapes.
Set up your blendshapes in the animation, and don't forget that the animation should only be 2 frames long.

Once you’re done with the animation, go back to the project folder and we’ll create an animator controller.

You don't need to worry about the name here, but to keep things consistent and easier to remember, you should just name it "FX" or "[your avatar name here] FX".

Once created, open it up by double clicking on it.

This will be a pretty new process since animators were pretty rarely used on avatars for most people, so I’ll explain a bit as we go.

In case you’ve never opened an animator controller before, you can move the view around by holding your middle mouse button, and zoom in/out by scrolling.

You should notice these 3 nodes:

Think of these like checkpoints in a linear map.

You can create more nodes here that do different things, and you can connect each of them with these little lines called “transitions”.

Entry is where the animator will start at once the avatar is loaded.

We can connect the Entry to another state or animation and by default that would mean as soon as the animator starts up it will go into that state or animation that you connected it to.
Right now, it’s not connected to anything so the animator will do nothing by default when it starts.

Exit is obviously where the animator stops, but in VRChat using it just makes the animator restart at Entry, effectively doing nothing noticeable to you in MOST cases.

Any State means that anything you connect to it can be activated at any point in this map, so you could be halfway through the map and still have something happen without some strange round-about loop to get back to that part.

Now that we've covered that a bit, we should add some parameters for whatever you may want to do. Here is the list of built-in parameters, each of which does its own thing.


Each of these is a parameter you can activate animations off of. For example:

AFK turns True whenever you take your headset off your head, detected through your VR headset's proximity sensor. This means you can have a literal AFK animation that triggers upon taking your headset off, no button pressing involved. For our specific use, though, we'll just be adding GestureLeft and GestureRight.

These parameters' values change depending on what gesture you make, and they are now tracked separately per hand. This means we can have one animation activate when you do Victory on the right hand, and a completely different animation for Victory on the left hand.

Again, just like the custom parameter we created, these MUST be written EXACTLY the same and have the same value type, and the names are case-sensitive.

Now that both of those parameters are set up in the animator, we should create a layer to work off of. Go back to the Layers tab in the top left corner.

Think of layers like layers in image editing/drawing, except for animation.
They're just here to separate things and keep the animator neat, orderly, and easy to work with.

I’ll be making one named Blendshapes, you can name yours whatever you want.

Once you create your layer, you'll have to go into its settings and change the Weight from 0 to 1.

The Weight setting is kind of like opacity for the layer: the lower the weight, the less the animations in that layer affect things.

With all that setup done for the animator, we can now actually set up the animation itself. This is where a lot of the creativity can happen.

We’ll start by right clicking in the grid anywhere, and hitting Create State, then Empty.

What we just created is basically what our animator will do right when it starts, which right now is nothing. This matters because the first state you create is automatically connected to the Entry node, which would cause whatever you added to instantly be played by the animator. So we add an Empty state here for Entry to connect to instead, as a default that does nothing at first. You can rename this state, but I don't bother.

So let’s make it do something now with our animation we made previously.

Drag your animation clip onto the grid, right click on New State, click Make Transition, then click on the animation you just dragged in.

Now if you click on the transition line you just made, you'll notice you can change some of its settings in the Inspector. This is where we change things like how fast it goes into the animation and, more importantly, when it will go into it.

Add a new condition for the transition by clicking the + button at the bottom right.

You’ll notice by default it’ll put in the first parameter you entered in on the animator, and it will set the animation to activate if that parameter is greater than 0.

We want to change this obviously, but how do we activate the animation based off of our current gesture?

GestureRight and GestureLeft use these values:

0 means you're not making ANY gesture, and 2 means you're using "HandOpen". (For reference, the full set in the VRChat docs is: 0 = Neutral, 1 = Fist, 2 = HandOpen, 3 = FingerPoint, 4 = Victory, 5 = RockNRoll, 6 = HandGun, 7 = ThumbsUp.)

So for this tutorial, we will change the condition to check if “GestureRight”, is equal to 2.

We should also make a transition going back into New State; otherwise this animation would work perfectly, but once we turned it on, it could never turn off until you reloaded the avatar.

So now we make another transition from our animation clip back to New State, add a condition to it, and set it to check that GestureRight is not equal to 2, so that whenever we aren't making HandOpen it returns back to normal.
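Conceptually, the pair of transitions we just set up behaves like this. This is a hedged Python sketch of the animator logic, not actual VRChat code, and "EyesClosed" is a hypothetical name for our clip:

```python
# GestureRight is 0 with no gesture held and 2 for HandOpen.
def fx_layer_step(state, gesture_right):
    if state == "New State" and gesture_right == 2:
        return "EyesClosed"   # transition: GestureRight Equals 2
    if state == "EyesClosed" and gesture_right != 2:
        return "New State"    # transition: GestureRight NotEqual 2
    return state              # no condition met: stay where we are

state = "New State"
state = fx_layer_step(state, 2)   # make HandOpen -> eyes close
state = fx_layer_step(state, 0)   # release      -> back to normal
```

Without that second transition, nothing ever maps back to "New State", which is exactly the stuck-on bug described above.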

Also, if you want a more instantaneous gesture, I recommend changing the Exit Time of the transitions to around 0.3.

This isn’t required, but to me it feels a bit weird with the small delay in-game using the default exit time. You can also simply turn off the exit time entirely for a completely instant gesture.

Great, so now we have a working animation clip that closes our eyes on HandOpen.

But how do we toggle gameobjects like particle systems or meshes now?


I’ll show you how to do that as well in the next section.

Button Setup

Before we jump into making Animation Toggles, let’s go over creating buttons in the Expressions Menu.

The reason to do this in the Expressions Menu and not a Gesture is so we don't toggle our object on and off every time we make a gesture; instead, it's done through a button in our avatar's menu.

Start by going into the ExpressionsParameters object you created earlier in this tutorial, and set parameter 4 to "AudioTrigger".

You can name this whatever you want, but the name is important for use later.

You can change the parameter's variable type on the side; in most cases you'll just want Int.
Float simply allows decimal values rather than only whole numbers. This is useful in some cases, but for most everyday VRC animations you won't need a parameter to be a float / have decimals.

Now we should go into the ExpressionsMenu object, and hit add control.

You should then see this:

This is basically the settings for the button that we’ve just created.
The name is what the button’s going to show up as in-game.

You’ll also notice you can change the button’s icon, you can put any image you want in here and it will show up in-game.

Parameter is which parameter the button will change; once you select one, its Value field will show up.

Value is what we want the button to change the parameter's value to. By default, all of our custom parameters equal 0 until a button changes their value.

The type is what kind of button it will be, there are a lot of options here that are very useful for activating different kinds of animations. Each button type changes the value in a different way, and can be used differently.

I’ll be showing some in-game footage alongside the debug menu to show you exactly how these buttons work, because they seem pretty obvious at first glance but how they actually modify the parameters isn’t really as obvious.

You’ll notice on the left side I have a bunch of parameters set, these are just here to show you how the different buttons actually modify their values in real-time.

The default is "Button", which means that when you click it, it sets the parameter's value to whatever you define for 1 second, then resets it to 0 once that second has passed. I have the parameter "Button" mapped to this button.

This effectively stops you from spamming the button quickly since it turns on for one second, then turns off.

We could use this for our animation, but it wouldn’t allow us to spam it off and on quickly if we wanted to.

Toggle is fairly obvious, and it's what we're going to use for this animation. It sets the parameter's value to what we define and doesn't change it back to 0 until we hit it again, letting us turn it off and on as fast as we want. I have the parameter "ToggleButton" mapped to this button.
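As a rough sketch of the difference, here's how a Toggle modifies its parameter on each press. Python pseudocode with made-up names, not VRChat code:

```python
def press_toggle(params, name, value=1):
    # Toggle: flips between the chosen value and 0 on each press,
    # with no built-in cooldown (unlike the 1-second Button type)
    params[name] = 0 if params[name] == value else value

params = {"ToggleButton": 0}
press_toggle(params, "ToggleButton")  # -> 1, toggle is on
press_toggle(params, "ToggleButton")  # -> 0, off again immediately
```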

For the sake of showing off the capabilities of the new system though, I will explain the other buttons as well.

Sub Menu is basically what the name implies: a button that leads to another menu. You just stick a new ExpressionsMenu object into it, and bam, now you've got a menu inside of a menu. It also has the interesting feature of changing a parameter's value while that particular menu is open, which could be fun: you could make flames surround you whenever you open the menu used for casting a fireball, for example. I have the parameter "MenuOpened" mapped to this button.

Two Axis Puppet is an interesting type of button that lets you drag a cursor along two axes. To use it in animations, you would use float parameters and put them in Parameter Horizontal and/or Parameter Vertical.

For this example, I’m using 2 float parameters mapped to it named “2AxisVertical” and “2AxisHorizontal” so the value can slowly rise/lower.

This also can activate a parameter by simply opening the button itself, you’ll see this as I have “MenuOpened” mapped to it as well.

Basically, when your cursor is in the middle, the value is 0. The further you move in one direction, the higher the value gets, until you hit the edge of the button and it reaches 1. Going the opposite direction, it drops lower and lower until -1.

You can use an Int here, but in most cases you wouldn't want to: an Int can't hold decimals, so instead of rising smoothly toward 1 or dropping toward -1, it just snaps from one whole value to the next.
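The center-to-edge mapping can be sketched like this. This is an approximation in Python to show the -1..1 range; the exact curve VRChat uses isn't documented here, and the function name is made up:

```python
def two_axis_value(offset, radius):
    """Map the cursor's offset from center (per axis) to a float.

    offset: signed distance from the middle of the puppet
    radius: distance from the middle to the edge
    """
    return max(-1.0, min(1.0, offset / radius))

print(two_axis_value(0, 100))    # 0.0  (centered)
print(two_axis_value(100, 100))  # 1.0  (at the edge)
print(two_axis_value(-50, 100))  # -0.5 (halfway the other way)
```

This is why floats are the natural fit here: every in-between cursor position produces an in-between value.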

Four Axis Puppet is very similar to the Two Axis Puppet, but it doesn't go into negative values; instead, it lets you change 4 different parameters at the same time. The farther you push in a given direction (up, down, left, or right), the more that direction's parameter increases.

I have 4 different float parameters mapped to this named “4AxisUp”, “4AxisDown”, “4AxisLeft”, and “4AxisRight”.

This also can activate a parameter by opening the button, you’ll see this as I have “MenuOpened” mapped to it as well.

And finally we have the Radial Puppet. This one is pretty cool, it’s a literal circle that you can rotate kind of like a volume knob for a speaker. You can use this for all sorts of things, like literally being a volume slider for an Audio Source, or being an opacity slider for certain meshes, or even being a color picker for a material. This works best with float parameters as well, since it’s a gradual increase. I have a float parameter mapped to this named “Radial”.

It also has the capability of changing a parameter upon being opened, which I have mapped to “MenuOpened”.

Animation Toggle

So first things first: we need to make two animations.

We should make one animation clip with the gameobject active, and another with it inactive. In this case I'm going to use an AudioSource, but this obviously applies to any gameobject.

Now that we have our 2 animations and our toggle button we created in the last section of the tutorial, we can get started.

Change the Parameter of our button to our custom parameter we made earlier. In my case this would be “AudioTrigger”.

By default the button will change the value to 1 when we click on it, so we won’t change anything.

Now we go back into our FX animator, and create a new layer for this animation to keep things tidy and easy to use. Since this is a separate animation for turning a gameobject on and off, we should keep it separate from the blendshapes so things don’t get super cluttered later on when we add more animations.

I’ll be naming the layer “Audio”, but again, you can name it whatever you want.
Be sure to set the weight of this layer to 1 like before as well, otherwise the animations on it won’t work.

We should also add in our custom parameter we made earlier, and again when you add this in it must be the EXACT same name and value type as you defined in the ExpressionsParameters object. In my case, it’s “AudioTrigger” and it’s an Int.

Now throw in the animation clip for the gameobject being turned OFF first, then throw in the one for turning it ON. The reason we do this in that exact order is because we want the OFF animation to be the default state, so by default the gameobject is turned off, and then we transition to it turning on when we hit the button.

Now make a transition from the OFF animation to the ON animation, and add a condition to that transition checking that our custom parameter (in my case, AudioTrigger) Equals 1.

Then make a transition from the ON animation back to the OFF animation, and do the same again, but set the condition to NotEqual 1.

The final step will now be simply inserting this animator we’ve been working with, into the avatar descriptor. Back on the Playable Layers section of the descriptor, click on Default Non-Transform, and drag our FX animator into it.

Good! Now we have a toggle button that turns a gameobject on and off, and we know how to use our avatar’s blendshapes.

“BUT NECRO” you yell at your screen
“HOW DO I MAKE MY HANDS DO THE THINGS”

Worry not, fellow degenerate. We'll give your favorite waifu some hand gestures in the next section below!

Hand Gestures

If you skipped ahead to this point: blendshapes will not animate under this animator, even if set up the same way as previously covered. You MUST animate your blendshapes in the FX animator we covered before, then bind them to a hand. This is covered in the "Animating Blendshapes" section.

We’ll start by making our animation for just the hands, I recommend using MuscleAnimationEditor for this.

Once we have our hand animation set up, we should set up an animator for the animation.
Luckily for us, VRChat actually left some sample animators in the SDK that we can just work off of.
If you look in VRCSDK/Examples3/Animation/Controllers, you’ll see a bunch of animators here. The animators you’re looking for are vrc_AvatarV3HandsLayer or vrc_AvatarV3HandsLayer2.

“vrc_AvatarV3HandsLayer” is for male animations

“vrc_AvatarV3HandsLayer2” is for female animations

Double click on the one you want and see if it’s the right one. It should look like this:

So once you’ve found the correct animator, we should duplicate this animator in our Project tab. You can name the duplicate whatever you want, but I recommend naming it Gestures or *youravatarnamehere*Gestures so that you don’t get it confused with the other animators.


Once you have the animator duplicated and renamed, open up the duplicate and click on the Left Hand layer. This is the default set-up for how our gestures work on AV3.0, it’s pretty basic.
So now all we need to do is select what gesture we want to override in the animator, in my case I’m going to replace RockNRoll.


Once you have that selected in the animator, in the top right you’ll see “Motion”, we can just drag our gesture animation clip into that slot and now it’s been effectively replaced!


Keep in mind that since we are in the Left Hand layer, this animation will only work on the left hand. If you want to do this on both hands or just the right hand, you just need to click on the right hand layer and do the same thing we just did in that layer instead.


Now we just go drag our Gestures animator into the Gestures slot for our avatar descriptor and it’s done!


Congratulations, now you can upload the model!