![](../images/banner%20800%20200.jpg)
![](../images/w00small.png)
![](../images/w1small.png)
![](../images/w2small.png)
![](../images/w3small.png)
![](../images/w4small.png)
![](../images/w5small.png)
![](../images/w6small.png)
![](../images/w7small.png)
![](../images/w8small.png)
![](../images/w9small.png)
![](../images/w10small.png)
![](../images/w11small.png)
![](../images/w12small.png)
![](../images/w13small.png)
![](../images/w14small.png)
![](../images/w15small.png)
![](../images/w16small.png)
![](../images/w17small.png)
![](../images/w18small.png)
![](../images/w19small.png)
![](../images/w20small.png)
Propose a final project.
My proposal for the final project is to build a digital puppet tool: a controller that lets me drive a rigged model or skeleton in animation software such as MotionBuilder, Maya, or Blender. The idea is to map sensor input to blendshape, rotation, and translation values that move the puppet. The focus will be on facial movement, such as the mouth and head rotation.
Puppet head: one distance sensor will control the mouth, and a second one will control head rotation.
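The core of that mapping is remapping a raw distance reading onto the range a blendshape or rotation channel expects. A minimal Python sketch of the idea (the sensor ranges in centimetres and the output ranges are hypothetical calibration values, not measured ones):

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Clamp a raw sensor reading, then linearly remap it to a target range."""
    value = max(in_min, min(in_max, value))  # clamp to the sensor's useful range
    t = (value - in_min) / (in_max - in_min)  # normalize to 0..1
    return out_min + t * (out_max - out_min)

# Hypothetical calibration: hand at 2-10 cm -> mouth-open blendshape 0.0-1.0
mouth_open = map_range(4.0, 2.0, 10.0, 0.0, 1.0)   # -> 0.25

# Second sensor: 2-20 cm -> head yaw of -45..45 degrees
head_yaw = map_range(11.0, 2.0, 20.0, -45.0, 45.0)  # -> 0.0 (centered)
```

The same function can feed any channel in the animation software; only the output range changes per target (0..1 for blendshapes, degrees for rotation).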
![](../images/digitalpubbet.png)
Extra: 3-4 slider sensors to control expressions (blendshapes).
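For the sliders, each one would drive one expression blendshape. A small sketch of that step, assuming 10-bit ADC readings (0-1023, as on a typical microcontroller analog input) and made-up blendshape names:

```python
def sliders_to_blendshapes(readings, names, adc_max=1023):
    """Map raw slider ADC readings (0..adc_max) to named blendshape weights (0..1)."""
    return {
        name: min(max(r, 0), adc_max) / adc_max  # clamp, then normalize
        for name, r in zip(names, readings)
    }

weights = sliders_to_blendshapes([0, 512, 1023], ["smile", "frown", "browUp"])
# e.g. {'smile': 0.0, 'frown': ~0.5, 'browUp': 1.0}
```

These weights could then be written to the rig's blendshape channels each frame over serial or a network socket.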
![](../images/blendMotionbuilder.png)