With less than a week until the firm "pencils down" date, I'm feeling a little disappointed in where my project is. I'm at the point where mesh morphs are operational (albeit buggy), with what I consider to be a solid and simple API:
mesh = soy.models.Mesh()
mesh.size = 6
mesh[0] = face  # a soy.atoms.Face object
# repeat for mesh[1] through mesh[5]

clone = mesh.clone()
# clone is a Mesh object that can be rendered in its own right,
# if it is bound to a body
# change the face and vertex data for clone[0] through clone[5]

target = soy.models.Target(mesh)
morph = mesh.morph(clone, 0.5)  # mesh.morph(variantMesh, delta) spins off a soy.atoms.Morph object
target.morphs.add(morph)
# now you bind target to a soy.bodies.Body, and when its render() method is
# called, it will apply all its morphs at their given deltas
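For context, applying a morph at a given delta presumably amounts to a linear blend between the base mesh's vertex positions and the variant's. Here is a minimal sketch of that idea; the interpolation formula and the helper names are my own assumptions, not the actual soy.atoms.Morph internals:

# Sketch: how a morph delta could blend base and variant vertex positions.
# The linear-interpolation formula and these helper names are assumptions,
# not the actual soy.atoms.Morph internals.

def blend_vertex(base, variant, delta):
    """Interpolate one (x, y, z) position: delta=0 gives base, delta=1 gives variant."""
    return tuple(b + delta * (v - b) for b, v in zip(base, variant))

def apply_morphs(base_positions, morphs):
    """Apply each (variant_positions, delta) pair in sequence to the base positions."""
    result = list(base_positions)
    for variant_positions, delta in morphs:
        result = [blend_vertex(b, v, delta)
                  for b, v in zip(result, variant_positions)]
    return result

# Usage: a single vertex moved halfway toward its variant position.
print(blend_vertex((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), 0.5))  # (1.0, 0.0, 0.0)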
Rendering has not yet been implemented for Mesh, and the back-end will get more complicated once we optimize it. Basically, we have to maintain the public vertex ordering while shuffling vertices around on the back-end so that OpenGL can render faces that share a material consecutively (switching between materials needlessly is costly). This is already handled by the Mesh getters and setters, but not yet honored by clone(), soy.atoms.Morph, or the Target constructor.
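To make that ordering constraint concrete, here is a rough sketch of the kind of bookkeeping involved: the public API keeps the original face ordering stable, while the render path walks faces grouped by material so material switches only happen once per group. The class and attribute names here are illustrative assumptions, not pysoy's actual implementation:

# Sketch: preserve the public face ordering while the render path walks faces
# grouped by material. Names and structure are illustrative assumptions.

class ReorderedMesh:
    def __init__(self, faces):
        # faces: list of (material, vertex_data) tuples in public order.
        self._public_faces = list(faces)
        self._rebuild_render_order()

    def _rebuild_render_order(self):
        # Group public face indices by material so rendering can draw all
        # faces of one material before switching to the next.
        by_material = {}
        for public_index, (material, _) in enumerate(self._public_faces):
            by_material.setdefault(material, []).append(public_index)
        # Flattened back-end order: material-grouped list of public indices.
        self._render_order = [i for group in by_material.values() for i in group]

    def __getitem__(self, index):
        # Getters and setters always use the stable public ordering.
        return self._public_faces[index]

    def __setitem__(self, index, face):
        self._public_faces[index] = face
        self._rebuild_render_order()

    def render(self):
        current_material = None
        for public_index in self._render_order:
            material, vertex_data = self._public_faces[public_index]
            if material is not current_material:
                # A material switch happens at most once per material group.
                current_material = material
                # ... bind material here ...
            # ... emit vertex_data here ...

The clone(), Morph, and Target code paths would all need to respect the same public-to-back-end mapping, which is why they are the remaining pieces.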
Odds are this work won't be anywhere near complete by pencils down, but I expect to have something more fully functional by PyCon.