2 capacitive-sensing foil strips (head, and around the back and sides)
1 capacitive-sensing wire ironed into the front fur, with connector
1 piezo element to right of mouth
1 microphone to left of mouth
1 or 2 LED-lit monochrome LCD panels for eyes?
4 AA batteries
1 IR emitter/detector pair above eyes?
1 momentary switch in tail
1 geared motor driving
Ever since I was 5, I wanted to be a Disney animator. Life led me elsewhere career-wise, and I’m very happy for it, but this project came along as a freelance gig and I HAD to jump on it!
All of the modeling, texturing, lighting, animation, physics, rendering, and compositing were done by me. It was a one-month job, and there are plenty of things I’d like to change, but the client (Disney) was happy with it and it had to be live by the deadline. So there it is.
This is an ongoing project of mine, which I intend to 3D print in the near future. It’s a segmented digital display, but driven mechanically. Each square segment has one tooth placed in a unique position, which allows the coded teeth of the circular gears to push it outward.
Pushing the segments out in the correct combinations (the part I’ve designed here) builds the digits of the time as physical blocks protruding from the surface of the display cylinder.
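To make the encoding idea concrete, here’s a rough Python sketch, assuming a standard 7-segment layout per digit. The real display uses square segments and coded gear teeth, so the segment names and patterns below are just for illustration, not the actual geometry.

# Rough sketch of the push-segment encoding idea (not the actual gear geometry).
# Each digit maps to the set of segments that should protrude, 7-segment style;
# a gear "coded" for that digit carries a tooth at each of those positions.

SEGMENTS = ["top", "top-right", "bottom-right", "bottom",
            "bottom-left", "top-left", "middle"]

# Standard 7-segment patterns, indexed into SEGMENTS above.
DIGIT_PATTERNS = {
    0: {0, 1, 2, 3, 4, 5},
    1: {1, 2},
    2: {0, 1, 6, 4, 3},
    3: {0, 1, 6, 2, 3},
    4: {5, 6, 1, 2},
    5: {0, 5, 6, 2, 3},
    6: {0, 5, 6, 4, 2, 3},
    7: {0, 1, 2},
    8: {0, 1, 2, 3, 4, 5, 6},
    9: {0, 1, 2, 3, 5, 6},
}

def gear_teeth_for_digit(digit):
    """Return the segment positions where the coded gear needs a tooth."""
    return sorted(DIGIT_PATTERNS[digit])

def display_time(hours, minutes):
    """Which segments protrude for each of the four digit positions."""
    digits = [hours // 10, hours % 10, minutes // 10, minutes % 10]
    return [gear_teeth_for_digit(d) for d in digits]

if __name__ == "__main__":
    for pos, teeth in enumerate(display_time(12, 34)):
        print(f"digit slot {pos}: push segments {teeth}")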
I run at night, while the wee babies are dreaming of a more fit daddy, and the woodland creatures are doing their best to scare me from the roadside (and they do).
I have my iPhone with me for tunes, but I felt that I didn’t look stupid enough, so I came up with a chest-mounted ground projection rig. It both illuminates the path (and armadillos) directly in front of me and displays information, such as mileage, time, maps, etc.
A custom app I wrote constantly samples the gyro and accelerometer and counter-adjusts the position and rotation of the projected image. The result is less sway and bounce in the image on the ground while running. I intend to integrate some sort of pattern recognition into the motion detection so that I can predict and adjust the image more accurately based on the average motion over time.
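For what it’s worth, here’s a minimal Python sketch of the counter-adjustment idea. The actual thing is an iPhone app reading the gyro and accelerometer; the filter constant and gains below are made-up illustration values, not the ones in my app.

# Minimal sketch of the counter-adjustment idea: low-pass the motion samples,
# then shift and rotate the projected frame the opposite way.
from dataclasses import dataclass

@dataclass
class ImageTransform:
    x_offset: float   # pixels, horizontal shift of the projected frame
    y_offset: float   # pixels, vertical shift of the projected frame
    rotation: float   # degrees, roll correction

class Stabilizer:
    def __init__(self, alpha=0.15, pixels_per_g=120.0):
        self.alpha = alpha                # low-pass factor (0..1), illustration value
        self.smoothed_accel = (0.0, 0.0)  # running average of lateral/vertical g
        self.smoothed_roll = 0.0          # running average of roll, degrees
        self.pixels_per_g = pixels_per_g  # made-up gain from g to pixels

    def update(self, accel_x_g, accel_y_g, roll_deg):
        """Feed one accelerometer/gyro sample; return the counter-transform."""
        ax, ay = self.smoothed_accel
        ax += self.alpha * (accel_x_g - ax)
        ay += self.alpha * (accel_y_g - ay)
        self.smoothed_accel = (ax, ay)
        self.smoothed_roll += self.alpha * (roll_deg - self.smoothed_roll)

        # Move and rotate the image opposite to the measured body motion.
        return ImageTransform(
            x_offset=-ax * self.pixels_per_g,
            y_offset=-ay * self.pixels_per_g,
            rotation=-self.smoothed_roll,
        )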
I like it. It could be more comfortable to wear, but that’s for version 0.2.
I recently needed to power 12 camera flashes and didn’t want to burn through batteries. Since there was no DC power input jack, and I didn’t want to void any warranties, I designed and printed some powered battery compartment “slugs”.
The final version has a DC power jack integrated into the print for quicker connections.
This is a test scan of my daughter, captured with the six-camera face-scanning prototype rig I built. I believe firing multiple cameras simultaneously is the only way to capture a child in motion (at least in 3D). Even with time-of-flight or structured-light cameras, there will be motion blur or expression changes before the capture can be completed accurately. Photogrammetry also provides higher resolution (14MP in this case) than the other methods. I’m not keen on blasting infrared laser light at my kid at close range, either.
I can see 3D scanning booths becoming an important addition to public spaces in the near future, as people want to get their likeness into games, 3D IDs, and animations, and as printable souvenirs. The depth-from-stereo-pairs tech is there and consistent today. Building such camera rigs into mobile units can be as cheap as this prototype (under $1000), or up to $10-20k, depending on the number and quality of cameras.
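As a rough illustration of the depth-from-stereo-pairs idea (not the multi-camera photogrammetry pipeline my rig actually uses), here’s a generic OpenCV sketch; the file names and block-matching parameters are placeholders.

# Generic depth-from-stereo-pair sketch using OpenCV block matching.
import cv2

left = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching; numDisparities must be a multiple of 16.
matcher = cv2.StereoSGBM_create(minDisparity=0,
                                numDisparities=128,
                                blockSize=5)

# compute() returns fixed-point disparity scaled by 16.
disparity = matcher.compute(left, right).astype("float32") / 16.0

# Normalize for viewing; with a calibrated rig you would reproject to 3D instead.
view = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX)
cv2.imwrite("disparity.png", view)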
I had a thought recently… “What if light bulbs could provide us with information, and not just light?” And so, I sketched up the “InfoLight”.
The idea is basically a pico projector in a bulb form factor. I simplify it to an LED, an LCD, and a lens here, but really, that’s a pico projector. The bulb would connect to Wi-Fi, or to a smartphone via Bluetooth LE 4.0, and display simple information and data on demand or ambiently all the time.
I want one. So I made a very crude version for weather, shown in the above video. I have little need for weather info, since I have a smartphone and windows in my house, but the idea still intrigues me.
Here are the guts: an Arduino, a Bluetooth LE module, an LED, and a weather-icon mask cut from cardboard. An iPhone app rotates the mask to project the weather based on the forecast.
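Here’s a rough Python sketch of the phone-side logic only: map a forecast condition to a mask angle and pack it into a one-byte command to send over BLE. The icon order, angles, and command format are made up for illustration and aren’t what the actual app uses.

# Sketch of the control logic: forecast condition -> mask rotation -> BLE byte.
ICON_ORDER = ["sunny", "partly_cloudy", "cloudy", "rain", "snow"]
DEGREES_PER_ICON = 360 // len(ICON_ORDER)   # 72 degrees between icons

def mask_angle(condition):
    """Angle (degrees) to rotate the cardboard mask so `condition` is projected."""
    return ICON_ORDER.index(condition) * DEGREES_PER_ICON

def ble_command(condition):
    """One byte the app would write over BLE: the angle / 2, so 0-180 fits in a byte."""
    return bytes([mask_angle(condition) // 2])

if __name__ == "__main__":
    print(mask_angle("rain"), ble_command("rain"))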
A script I wrote, with a GUI, that lets users draw over 3D models in Maya’s viewport for feedback and critique. The scene with notes can be opened by artists, and the feedback can be viewed or saved out to JPGs. Useful for fast turnaround on models, textures, animation, etc.
More to come in future revisions. It’s a work in progress.
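For reference, here’s a tiny sketch of just the “save feedback out to JPGs” part, using Maya’s Python commands module. The function name and output path are illustrative; the actual tool wraps this in a GUI along with the draw-over notes.

# Capture the current viewport (with any draw-over notes visible) to a JPG.
import maya.cmds as cmds

def save_feedback_jpg(output_path="C:/feedback/review_note.jpg"):
    """Playblast a single frame of the active viewport to an image file."""
    current_frame = int(cmds.currentTime(query=True))
    cmds.playblast(frame=[current_frame],
                   format="image",
                   compression="jpg",
                   completeFilename=output_path,
                   viewer=False,
                   showOrnaments=False,
                   percent=100,
                   forceOverwrite=True)
    return output_path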