
#mmd
Published: 2017-08-08 03:51:34 +0000 UTC; Views: 820; Favourites: 9; Downloads: 12
Description
Tda face left, Fatal Frame Miu right. (It turns out that a lot of my model downloads are all fucked up from me editing them and saving over accidentally...) Tda's not unique. Every anime face MMD model looks like shit from this angle.
Zombie Eyes: Part of this is the deep-set eyes designed to track the viewer. If the viewer is below, the eyes appear to be tracking what they cannot see. An auto-blink morph would be handy here. I don't know if it would be enough.
Nose: I have no idea if this nose style is necessary. Anime is not known for big noses. Hadn't noticed this before, will have to compare to a few other models.
Necks: We all know necks are bad places. Beginners think it's all screwed-up textures on head swaps. Typical anime exaggerates wide faces and slender necks, and this is the angle where it shows poorly. Even if you're careful about your normals, this is a steep shift. Smooth the verts until it works. Deformation is always going to be a problem and SDEF won't work here; try a shape-correcting morph instead. Not to mention that many editing techniques will leave the normal-illiterate with a sharp edge here that needs fixing (not shown).
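To make "fixing the normals" at the neck seam concrete, here is a minimal sketch (plain Python with numpy; the array layout, function name, and tolerance are assumptions, not tied to PMX Editor or any particular tool) of the usual fix: average the normals of vertices that share a position, so the seam shades smoothly instead of showing a hard edge.

import numpy as np

def smooth_seam_normals(positions, normals, tolerance=1e-4):
    """Average the normals of vertices that sit at (nearly) the same position.

    positions: (N, 3) array of vertex positions
    normals:   (N, 3) array of vertex normals; a smoothed copy is returned
    """
    normals = normals.copy()
    # Group vertices by quantized position so coincident seam verts land together.
    keys = np.round(positions / tolerance).astype(np.int64)
    groups = {}
    for i, key in enumerate(map(tuple, keys)):
        groups.setdefault(key, []).append(i)
    for indices in groups.values():
        if len(indices) < 2:
            continue  # not a duplicated seam vertex, nothing to average
        avg = normals[indices].sum(axis=0)
        length = np.linalg.norm(avg)
        if length > 0.0:
            normals[indices] = avg / length  # same normal for every duplicate
    return normals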
Comments: 7
Taemojitsu [2019-06-02 09:47:44 +0000 UTC]
It should be possible to write a custom shader for eyes that use the camera-auto-tracking "osoroshii ko" eye setting to rotate the eyes toward the camera (after moving them back to where they'd be without this facial morph). You'd start by getting the default or current position through the head's angle, though the relationship might be different for different models. A little complicated, but I think it would work. If a shader like this was widely used, it could affect model development positively.
The default eyes aren't supposed to track the viewer; I think Tda-style's oval eyes just make it harder to tell whether the model is looking at the camera. But the option of bringing the irises into the center of the eye, which definitely creates the zombie-eye look, probably makes model creators too lazy to make the iris very close to the eyelid. If it's broken for some cases, it's easier to assume it'll be broken for all cases.
I think I've also read that you can mimic the eye's natural appearance by using inward-pointing cone-shaped irises or something, so light from above causes brightness on the bottom iris; this isn't being used in MMD but it would also look bad at steep (?) angles.
vasilnatalie In reply to Taemojitsu [2019-06-08 00:25:12 +0000 UTC]
Oh, I know eyes don't track by default. Tda's eye trick does, though. Or at least, it's responsible for part of this problem. The eyes appear to track downward from this view, because they're deep-set, and that appearance is what gives it the zombie eyes.
I'm not familiar with "osoroshii ko." What is that?
I've actually made a vertex shader that does eye tracking. That was a long time ago; I could probably make it a lot better these days-- I think I was using lerp to interpolate between quaternions, because I didn't know what spherical lerp was for. I made a few bone markers to get the eye's current axes and then transformed them toward the camera, or any other arbitrary point, based on those. Basically, you want two eye-end bones, and possibly two eye-top bones if you want to limit roll; you don't really care about the head, unless you want to implement angle limits, which would be a good idea even though I didn't.

It's a vertex shader, and it was irritating to adapt to different pixel shaders-- I kept having to rewrite it any time I wanted to use a new effect. That's the main issue with this: you need to adapt the vertex shader to every single effect. (And mark the eye material appropriately, so that the shader can recognize what needs to track and what doesn't.) But it really did add a lot to a model to have it track-- even while just moving the camera to pose from a different angle, it was a little bizarre, but very cool.

(I also did some IK-based tracking, which wasn't as good, but easier to implement. Unfortunately, MMD's IK algorithm bugs out sometimes, its minimization of YXZ Euler angles leads to poor angle limits, and it's a pain to make a proxy for the camera to give the eye IK something to OP (outside parent) to.)
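For readers who want to see the shape of the math being described: this is not vasilnatalie's actual shader, just a rough CPU-side sketch in plain Python with numpy, where the bone positions, the crude angle limit, and the function names are all invented for illustration. The core idea is a clamped look-at rotation applied to the eye vertices around the eye center.

import numpy as np

def look_at_rotation(eye_pos, eye_end_pos, target_pos, max_angle_deg=30.0):
    """Rotation matrix that swings the eye's current forward axis
    (eye center -> eye-end marker) toward a target point, clamped to a limit."""
    fwd = eye_end_pos - eye_pos
    fwd /= np.linalg.norm(fwd)
    to_target = target_pos - eye_pos
    to_target /= np.linalg.norm(to_target)

    cos_a = np.clip(np.dot(fwd, to_target), -1.0, 1.0)
    angle = min(np.arccos(cos_a), np.radians(max_angle_deg))  # crude angle limit

    axis = np.cross(fwd, to_target)
    n = np.linalg.norm(axis)
    if n < 1e-8:
        return np.eye(3)  # already looking at the target
    axis /= n

    # Rodrigues' formula: rotation about 'axis' by 'angle'.
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def track_eye_vertices(vertices, eye_pos, eye_end_pos, camera_pos):
    """Rotate eye vertices about the eye center so the eye looks at the camera."""
    R = look_at_rotation(eye_pos, eye_end_pos, camera_pos)
    return (vertices - eye_pos) @ R.T + eye_pos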
In terms of getting good-looking irises, the way most MMD models are made, I'm not sure that cone irises would matter-- it's all very abstract and cartoony anyways. But a real iris is a little bit like a gem-- semitranslucent, with specular from a complicated underlying structure. Specular can improve irises a lot, to the point that you don't need fancy textures. Consider [linked image], where I'm actually rendering an interior mesh to a different render target to generate specular. Something like that would be ideal for irises.
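As a rough illustration of the "specular can improve irises" point: a generic Blinn-Phong highlight sketched in plain Python, not the render-target trick described above; the parameter values and vector conventions are arbitrary assumptions.

import numpy as np

def iris_specular(normal, light_dir, view_dir, shininess=64.0, strength=0.6):
    """Blinn-Phong style specular highlight for one iris surface point.

    All inputs are unit-length numpy vectors, with light_dir and view_dir
    pointing away from the surface. Returns a scalar highlight intensity
    to add on top of the iris texture color.
    """
    half_vec = light_dir + view_dir
    half_vec /= np.linalg.norm(half_vec)
    n_dot_h = max(float(np.dot(normal, half_vec)), 0.0)
    return strength * n_dot_h ** shininess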
Taemojitsu In reply to vasilnatalie [2019-07-12 18:51:25 +0000 UTC]
osoroshii ko, 'おそろしい 子!' or '恐ろしい 子!', is what Tda calls the eye-tracking expression. "Scary child"
I don't know too much about the advanced math. When you say, "using lerp to interpolate between quaternions, didn't know what spherical lerp was for", I know what lerp is because I looked it up, and then thought it was the coolest thing so I made sure to use it. I'm supposed to know what quaternions are because they were a topic when I was in Academic Decathlon in high school, but I don't; and I don't know what spherical lerp is either. I do know that linear interpolation is often not appropriate, but that's about it.
If you mean, 'convert position into an angle and use linear interpolation on the angle', then I think I understand that but it would take me about a day to implement that short description.
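For what it's worth, the "convert position into an angle and use linear interpolation on the angle" idea can be sketched in a few lines of plain Python; the function names and angle conventions here are invented for illustration, not taken from MME.

import math

def direction_to_yaw_pitch(dx, dy, dz):
    """Turn a direction vector into yaw (around Y) and pitch (around X), in radians."""
    yaw = math.atan2(dx, dz)
    pitch = math.atan2(dy, math.hypot(dx, dz))
    return yaw, pitch

def lerp_angle(a, b, t):
    """Linearly interpolate between two angles, taking the short way around the circle."""
    diff = (b - a + math.pi) % (2.0 * math.pi) - math.pi
    return a + diff * t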
The reason you would use head angle is so that you don't have to modify models to use the shader with them. I'm using GNU/Linux with Wine, so it was enough trouble just to get MME working, and I'm not interested in trying PMX Editor in case it doesn't work and I end up fighting it out of stubbornness (or, even if it does work, it's another skill set to learn). As the saying goes, when you only have a hammer everything looks like a nail, and right now effect files are my hammer.
I don't think you should need to adapt the eye shader to different effects. Just use the same eye shader for all models; maybe with a few different versions or a control object to match the brightness of other shaders. Only customize the eye shader if there's an obvious mismatch.
I decided I want to always have a proxy for the camera if I do any original camera motions. It makes it easier to do a stereoscopic video; it lets you have persistent cameras, and probably eliminates a 1-frame camera duplication at 60 fps; and it makes it much easier to do minor camera shaking and random movements. (Although physics-based shaking would be a little easier than copying and pasting motion frames for an entire video.) If more people did this, it would offset that disadvantage of IK-based eye tracking, but the other disadvantages would remain.
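A possible procedural alternative to hand-pasting shake frames, sketched in plain Python: the amplitude, smoothing factor, and function name are invented, and this only generates smoothed random offsets that could be baked onto a camera proxy.

import random

def camera_shake(num_frames, amplitude=0.05, smoothing=0.85, seed=0):
    """Per-frame (x, y) offsets for a camera proxy: random jitter run through
    a simple low-pass filter so the shake isn't pure frame-to-frame noise."""
    rng = random.Random(seed)
    x = y = 0.0
    offsets = []
    for _ in range(num_frames):
        x = smoothing * x + (1.0 - smoothing) * rng.uniform(-amplitude, amplitude)
        y = smoothing * y + (1.0 - smoothing) * rng.uniform(-amplitude, amplitude)
        offsets.append((x, y))
    return offsets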
Don't know if understanding Listing's law would help in writing a vertex shader for eyes.
"rendering an interior mesh to a different render target to generate specular"
I've thought a bit about ways to condense some lighting calculations in MMD with an intermediate texture, the way other applications do, but with the restriction that you can't render to texture cubes in MMD so you have to have some other mapping like a sphere; but I don't understand what you mean here or what the advantage would be.
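One common single-texture alternative to a cube map is an equirectangular (latitude/longitude) layout. A minimal sketch in plain Python of mapping a unit direction to UV coordinates; this is generic math, not an MME-specific recipe.

import math

def direction_to_equirect_uv(dx, dy, dz):
    """Map a unit direction vector to (u, v) in [0, 1] x [0, 1] using an
    equirectangular layout, one way to store environment data in a single
    2D texture when cube maps aren't available."""
    u = 0.5 + math.atan2(dx, dz) / (2.0 * math.pi)
    v = 0.5 - math.asin(max(-1.0, min(1.0, dy))) / math.pi
    return u, v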
vasilnatalie In reply to Taemojitsu [2019-07-24 19:15:32 +0000 UTC]
> I'm supposed to know what quaternions are because they were a topic when I was in Academic Decathlon in high school, but I don't; and I don't know what spherical lerp is either.
They're worth knowing eventually. Quaternions are an alternate way of describing rotations that don't cause interpolation artifacts the way that Euler angles (x/y/z angles) do. Slerp is spherical lerp, which is used to interpolate between two different quaternions.
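A minimal slerp sketch in plain Python with numpy, assuming (w, x, y, z) unit quaternions, for anyone who wants to see how it differs from plain lerp:

import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions.

    Unlike plain lerp, this keeps a constant angular velocity along the arc
    between the two rotations, which is why it avoids interpolation artifacts.
    """
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    dot = float(np.dot(q0, q1))
    if dot < 0.0:            # flip one quaternion to take the short way around
        q1, dot = -q1, -dot
    if dot > 0.9995:         # nearly identical: fall back to normalized lerp
        result = q0 + t * (q1 - q0)
        return result / np.linalg.norm(result)
    theta = np.arccos(dot)
    return (np.sin((1.0 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)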
You could use the head bone, I suppose-- you lose convergence, and you have to make some (reasonable) assumptions about the default orientation of the model.
Re different shaders, think about Raycast. How many different render targets does it use? If you hide the eyes from Raycast, then they don't show up in reflections and they don't shadow anything. If you want to do eye tracking in Raycast, you have to edit the vertex shader in every single place that it shows up, and hope that there isn't anything weird it does that isn't in the vertex shader (like parallax).
Taemojitsu In reply to vasilnatalie [2019-11-25 00:47:41 +0000 UTC]
vasilnatalie In reply to TayAyase [2017-08-09 14:22:23 +0000 UTC]
My devious plan is to avoid that angle