My first attempt at two-handed control of a 3D application. It actually feels pretty natural. Still need more gestures set up for brush size, strength, zoom, pan, etc.
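For a rough idea of what "two-handed control" can mean in practice, here's a minimal sketch of mapping two tracked hand positions to camera controls. The function name and the zoom/pan mapping are hypothetical illustrations, not Kinect SDK calls or my actual gesture set.

```python
import math

def two_hand_controls(left, right):
    """Map two tracked 3D hand positions (x, y, z in meters) to
    hypothetical camera controls: hand separation drives zoom,
    the midpoint of the hands drives pan."""
    lx, ly, lz = left
    rx, ry, rz = right
    # Distance between the hands -> zoom factor (spread hands to zoom in)
    separation = math.dist(left, right)
    # Midpoint of the two hands -> pan offset in screen space
    mid_x = (lx + rx) / 2.0
    mid_y = (ly + ry) / 2.0
    return {"zoom": separation, "pan": (mid_x, mid_y)}

# Hands 0.5 m apart, centered 1 m up in front of the sensor
print(two_hand_controls((-0.25, 1.0, 2.0), (0.25, 1.0, 2.0)))
```

The nice property of distance- and midpoint-based gestures is that they're invariant to which hand is which, so the tracker doesn't need to label left vs. right reliably.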
It’s better suited to kiosks and short-use situations. Mudbox was just for fun, and I certainly couldn’t get the same level of precision with this method as with a mouse or tablet. Just fer kicks!
Though I will say that traditional sculptors do that all the time. Great arm workout! I think once the Kinect or a sequel to it has the resolution and intelligence to track fingers individually, this’ll get really exciting! … said me in 2010, before the Leap Motion was available.
Using the iPad as a trackable virtual motion-capture camera. Currently working on getting the frame rate and interactivity speed up to par. Also working on a real-time OpenGL 3D viewer on the iPad that accesses the data stream from Blade.
By doing this, the iPad does the heavy work and any number of viewers can use the virtual camera system with just an iPad or iPhone.
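To give a sense of how lightweight a virtual-camera stream can be: each sample is essentially just a position and an orientation. Below is a hedged sketch of packing one camera pose into a compact binary payload (e.g. for UDP). The seven-float wire format is my own illustration; Blade's actual protocol is not shown here.

```python
import struct

# Hypothetical wire format for one camera sample: position (x, y, z)
# plus an orientation quaternion (w, x, y, z), packed little-endian
# as seven 32-bit floats -- 28 bytes per sample.
POSE_FORMAT = "<7f"

def pack_pose(position, quaternion):
    """Serialize a camera pose into a fixed-size binary payload."""
    return struct.pack(POSE_FORMAT, *position, *quaternion)

def unpack_pose(payload):
    """Deserialize a payload back into (position, quaternion) tuples."""
    values = struct.unpack(POSE_FORMAT, payload)
    return values[:3], values[3:]

payload = pack_pose((1.0, 2.0, 0.5), (1.0, 0.0, 0.0, 0.0))
print(len(payload), unpack_pose(payload))
```

At 28 bytes per pose, even a 60 Hz stream is under 2 KB/s per viewer, which is why any number of iPads or iPhones can subscribe to the same camera feed cheaply.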
The system now works without a mocap stage or any external tracking. Everything is performed on the iPad, including absolute position and orientation.
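Tracking position and orientation entirely on the device typically means fusing the gyroscope (accurate short-term, but drifts) with the accelerometer (noisy, but drift-free for tilt). A complementary filter is one common way to blend the two; the sketch below illustrates that general technique, not the actual iPad code.

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyro angular rates (rad/s) with accelerometer-derived tilt
    angles (rad) into a drift-corrected angle estimate.

    Integrating the gyro gives smooth short-term motion; blending in a
    small fraction of the accelerometer angle cancels long-term drift."""
    angle = accel_angles[0]  # seed the estimate from the accelerometer
    history = []
    for rate, accel_angle in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * accel_angle
        history.append(angle)
    return history

# Device held still at a 10-degree-equivalent tilt: estimate stays put
print(complementary_filter([0.0] * 5, [0.175] * 5, dt=0.01)[-1])
```

Full 3D orientation uses the same idea with quaternions instead of a single angle, but the filter structure is identical.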
Using two monitors, Maya’s real-time, high-quality stereo display capabilities, and two front-surface mirrors at 45-degree angles, I’ve built a super-high-resolution stereoscopic viewer. A really easy way to get an idea of what a 4K 3D effect might look like on a budget. (Of course, a single monitor and some 3D glasses would work too.)
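For anyone tuning a rig like this, the key number is on-screen parallax: how far apart a point lands in the left and right views. The standard relationship below holds for a stereo pair converged at a chosen plane; it's general stereo geometry, not anything Maya-specific, and the example numbers are just for illustration.

```python
def screen_parallax(interaxial, convergence, depth):
    """Parallax of a point at `depth` for a stereo camera pair with the
    given interaxial separation, converged at `convergence` (same units).

    Zero at the convergence plane (appears on the screen), positive
    behind it, and approaching `interaxial` as depth goes to infinity."""
    return interaxial * (depth - convergence) / depth

# ~6.5 cm interaxial, converged at 200 cm:
print(screen_parallax(6.5, 200.0, 200.0))  # point on the screen plane
print(screen_parallax(6.5, 200.0, 400.0))  # point behind the screen
```

Keeping the far-distance parallax at or below the viewer's eye separation is what keeps the stereo comfortable, which is why the interaxial is the first thing to adjust when the depth range of the scene changes.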
I installed this in the viewing window of our mocap stage. One can then look into the display through semi-silvered mirrors mounted in front of the glass and get a real-time 3D view of the CG characters overlaid on the actual actors on the stage.