iPi mocap fingers software

I recently picked up the Xbox One Kinect plugin. I have a couple of questions, if anyone has an answer. After a few days of research, testing, and asking questions, I have it up and running. Applying motion data right to a character that has already been loaded with both a voice script and facial animation is fantastic.

While doing the research I found that iPi also has the ability to use Xbox 360 as well as Xbox One sensors, and that you can set up several motion capture sensors to get higher-accuracy tracking. It can capture fingers and even facial expressions. That being said, the steps to get captures in that software and then load them into iClone take a bit of work and mean crossing platforms such as Blender.

The downfall of the iClone plugin is that it only captures one view. The outcome is very glitchy, with small jerks and twitches in the arms, wrists, etc. Is there a way to smooth those glitches? What is the path moving forward with the iClone plugin? Will there be options to add more sensors, or to capture more movement, like fingers for example? Thanks in advance for the read.

There won't be any more development of the Kinect plugin. I know I've read this on the forum at least a few times (but Peter can chime in to confirm). While it's *possible* RL will answer you, I'm guessing not. If you are totally sold on that route, you will either have to accept the limitations it has or try other outside solutions (like iPi Soft).
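On the question of smoothing the glitches: beyond whatever jitter-removal the capture tool itself offers, one common last-resort fix is to run each rotation channel of the exported motion through a small moving-average filter before bringing it back in. Here is a minimal sketch in Python; the function name and default window size are my own, and "channel" here just means one joint angle sampled once per frame:

```python
def smooth_curve(values, window=5):
    """Centered moving-average filter for one animation channel
    (e.g. a wrist rotation angle per frame). The window is clamped
    at the clip boundaries, so the output keeps the input's length."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo = max(0, i - half)          # clamp window start at frame 0
        hi = min(len(values), i + half + 1)  # clamp window end at last frame
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# Example: a single-frame twitch of 10 degrees gets spread out and damped.
noisy = [0, 0, 10, 0, 0]
smoothed = smooth_curve(noisy, window=3)
```

Widening the window removes more of the twitching but also softens fast, intentional motion, so it is usually worth filtering only the joints that actually jitter (wrists and forearms, in this case) rather than the whole skeleton.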