Avatar-to-local space constraint
This would essentially hook into (or, more likely, replace) the original constraints system that SL already has with a more complete implementation. The existing system was only ever half-implemented, leaving us with one that can barely be used for anything.
Examples: Holding an oversized object (e.g. a skateboard, a large box, etc.), coughing (hand-to-mouth), rubbing stomach (canned animations do not handle this consistently, mostly resulting in clipping)
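To make the clipping point concrete, here is a minimal Python sketch of why a canned animation clips while an avatar-to-local constraint does not. All names (`mMouth`, the joint positions) are hypothetical illustrations, not SL's actual skeleton or API: the canned pose bakes a hand target for one reference shape, whereas the constraint derives it from this avatar's own mouth joint.

```python
from dataclasses import dataclass

@dataclass
class Skeleton:
    # hypothetical: joint positions in avatar-local space, different per shape
    joints: dict

REFERENCE_MOUTH = (0.0, 1.60, 0.10)  # baked from the animator's own shape

def canned_hand_target(_skel):
    # canned animation: always the baked position -> clips on other shapes
    return REFERENCE_MOUTH

def constrained_hand_target(skel, offset=(0.0, 0.0, 0.03)):
    # avatar-to-local constraint: derive the target from this avatar's joint
    mx, my, mz = skel.joints["mMouth"]
    ox, oy, oz = offset
    return (mx + ox, my + oy, mz + oz)

# a taller shape than the animator's reference
tall = Skeleton(joints={"mMouth": (0.0, 1.85, 0.12)})
print(constrained_hand_target(tall))  # follows the taller shape's mouth
print(canned_hand_target(tall))       # stuck at the reference shape's mouth
```

The same principle covers the hand-on-hip and hand-to-mouth cases above: the target is recomputed from the live skeleton instead of being frozen into the animation data.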
Avatar-to-world space constraint
In order to create the feeling of being in a dynamic world, avatars need to be able to interact with their surroundings. Think about modern video games - pretty much all of them allow for interaction with world elements (pulling levers, steering at the helm of a ship, driving, etc.), and the player can see their avatar physically making these things happen - this is key. Players need to be able to see their avatars manipulating the world around them.
Examples: A car's steering wheel, a bicycle's pedals and handlebars, a physical item colliding with a world-space item (like a tennis racket that can hit objects back over the net), or more general avatar-to-world interactions (e.g. an avatar pulling a lever or playing chess/checkers).
Avatar-to-avatar space constraint
We don't want to stay in local space forever. We need to be able to mix spaces to compete with other worlds.
Examples: Handshakes and holding hands, hugs, high fives, picking up other characters, dancing, combat (e.g. boxing, fistfighting), brushing someone's hair, headpats, handing an avatar an object, passing a drink to a friend, etc.
Object-to-hand (VR-like object manipulation)
This is the kind of manipulation that benefits users of VR; as such, the Puppetry project marks a major milestone on the road to having VR support within Second Life. In many respects this is similar to the aforementioned avatar-to-world constraints, but it places the avatar in control of the movement, as opposed to the object.
Examples: Taking an apple off a shelf, throwing a ball, etc.
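As a rough sketch of the avatar-in-control idea (2D rigid transforms for brevity; the poses and the grip offset are made-up illustration, not SL data), a held object's world pose can simply be recomputed each frame from the hand bone's world pose plus a fixed grip offset:

```python
import math

def compose(parent, child):
    # 2D rigid transforms as (x, y, angle): express child in parent's frame
    px, py, pa = parent
    cx, cy, ca = child
    return (px + cx * math.cos(pa) - cy * math.sin(pa),
            py + cx * math.sin(pa) + cy * math.cos(pa),
            pa + ca)

# hypothetical: hand bone pose in world space, updated by the animation system
hand_world = (2.0, 1.0, math.pi / 2)
# hypothetical: the apple sits 10 cm along the palm axis while gripped
grip_offset = (0.1, 0.0, 0.0)

# object-to-hand: the apple follows the hand, not the other way around
apple_world = compose(hand_world, grip_offset)
```

The inversion relative to avatar-to-world constraints is just which side of this composition is authoritative: here the hand drives and the object follows.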
Constraints in Second Life are very limited due to poor implementation and lack of access to their API.
IK/FK switching in relation to world-space locations is a much-wanted feature and is definitely part of Puppetry's mandate.
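A minimal sketch of what IK/FK switching involves, using analytic two-bone IK in 2D and a naive linear blend between the authored FK pose and the solved IK pose. This is illustrative only - a real system would work in 3D and blend rotations along the shortest arc, and none of these names come from SL's codebase:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Analytic two-bone IK: shoulder/elbow angles so the chain reaches (tx, ty)."""
    d = math.hypot(tx, ty)
    d = max(abs(l1 - l2), min(l1 + l2, d))  # clamp the target into reach
    # law of cosines for the elbow bend
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(ty, tx) - math.atan2(l2 * math.sin(elbow),
                                               l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def fk(l1, l2, shoulder, elbow):
    """Forward kinematics: end-effector position for the given joint angles."""
    ex = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    ey = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return ex, ey

def blend(fk_pose, ik_pose, w):
    """IK/FK switching: w=0 keeps the authored FK pose, w=1 is full IK."""
    return tuple(f * (1 - w) + i * w for f, i in zip(fk_pose, ik_pose))
```

Ramping `w` up as a world-space target becomes relevant (and back down afterwards) is what lets an authored animation hand control over to a constraint without popping.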
We, the users, need to be able to track the actual bone locations of other avatars and objects in 3D space; otherwise we cannot do meaningful avatar-to-avatar or avatar-to-object interactions.
Although Second Life already contains a constraint system, its implementation was never completed, and it has been left in a half-finished state for more than a decade. In that time, the technology powering constraints in competing software has advanced well beyond what the existing system is capable of, even if its implementation were finished. This, combined with the relative inaccessibility of the existing system (only available to users who are able to manually modify SL’s .anim format), has meant that the system has gone completely unused. As a result, SL is full of animations which aren’t compatible with most avatars (at least, those that deviate from whichever reference shape the animator used), making clipping extremely common.
That isn’t the entire story, however. Under the hood, Second Life uses constraints in several of its internal animations. One such example is the default idle (standing) animation, where the avatar will sometimes raise their hand to their hip - and this works on nearly any shape, as the arm uses a constraint to avoid clipping into the hip (a local-to-local constraint). Another example is the animation used when an avatar is editing an object, wherein the avatar’s left arm will track the object being manipulated (a local-to-world constraint).
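Those two internal animations suggest a simple data model: a constraint whose target lives either in another bone's space or in world space. Here is a hypothetical Python sketch - the bone names follow SL's `mBone` naming, but the record layout and the resolver are assumptions for illustration, not SL's actual .anim constraint format:

```python
from dataclasses import dataclass

@dataclass
class Constraint:
    source_bone: str     # the bone being constrained
    target_space: str    # "local" (another bone) or "world" (a fixed point)
    target: object       # bone name for "local", (x, y, z) point for "world"

def resolve_target(constraint, skeleton_joints, avatar_origin):
    """Return the constraint target as a world-space position (sketch)."""
    if constraint.target_space == "local":
        # local-to-local: e.g. hand-on-hip in the default stand animation
        jx, jy, jz = skeleton_joints[constraint.target]
        ox, oy, oz = avatar_origin
        return (ox + jx, oy + jy, oz + jz)
    # local-to-world: e.g. the left arm tracking the object being edited
    return constraint.target

# hypothetical joint positions in avatar-local space, and the avatar's origin
joints = {"mPelvis": (0.12, 0.0, 1.0)}
hip_rub = Constraint("mWristRight", "local", "mPelvis")
edit_track = Constraint("mWristLeft", "world", (5.0, 3.0, 1.2))

hip_target = resolve_target(hip_rub, joints, (10.0, 10.0, 0.0))
edit_target = resolve_target(edit_track, joints, (10.0, 10.0, 0.0))
```

Note the asymmetry: the local-to-local target moves with the avatar wherever it walks, while the local-to-world target stays pinned regardless of where the avatar stands - exactly the difference between the hip and editing examples above.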