Having an Xbox 360 Kinect and a MakerBot is a perfect combination for scanning and printing 3D models. Typically, scanning an object is a complex process (although we have seen some DIY 3D scanners in the past). With this combination, however, the process can be simplified without losing too much detail.
The "Ultimate Battlefield 3 Simulator" is built from hi-def video projectors on a 360-degree screen, motion tracking, an omnidirectional treadmill, and 12 paintball guns.
For more information about the collaborators who contributed to this simulator project:
MSE Weibull [Omni-Directional Treadmill and Motion Tracking]
Igloo Vision [Projection and 360-degree Dome]
aps events and media Ltd [Visual Manipulation]
Extra Dimensional Technologies [Ambient Lighting]
Running in the Halls [Kinect Hack]
Robo Challenge [Paintball Marker Control System]
Yoyotech [Gaming PC]
The latest version gives developers access to a new set of data, including raw sensor data, as well as new APIs to query data and results from the Kinect’s natural user interface.
Because it is still in beta, the SDK APIs may change along the way, but this is something that developers are well aware of. Finally, some stability issues have been addressed – especially crashes that happen when the PC goes into sleep mode (a classic!). With each iteration of the Kinect for Windows SDK, we’re getting a little bit closer to getting some Kinect goodness on our PCs. Which applications would you like to see first?
You can find out more information about the latest version of Kinect for Windows SDK over at Microsoft.
The Kinect is basically asking to be hacked into a Minority Report-style interface. We’ve seen some similar Kinect hacks in the past, but this MIT project used the device to track individual fingers instead of the user’s entire skeletal structure.
Hopefully this Kinect hack is a glimpse into the future of interactive gesture-based interfaces. Although not exactly like the 3D multiple-screen interface in Minority Report, this is a step in the right direction. The detection process distinguishes hands and fingers at 30 frames per second to allow for real-time operation. Get the code to play with it yourself.
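The MIT code itself isn't reproduced here, but the core idea of picking out fingers in a depth image can be sketched in a few lines: keep only pixels close to the camera (the hand), then count the separate runs of "near" pixels along a scanline crossing the fingertips. This is a simplified, hypothetical stand-in for the real pipeline, which also segments hands from the full frame and tracks them over time:

```python
def count_fingers(scanline, near_mm=600):
    """Count contiguous runs of 'near' pixels in one row of a depth
    image (values in millimeters). Each run is treated as one finger
    crossing the scanline; the background lies farther away."""
    fingers = 0
    in_run = False
    for depth in scanline:
        if depth < near_mm:       # pixel belongs to the hand
            if not in_run:
                fingers += 1      # start of a new finger-width run
                in_run = True
        else:
            in_run = False        # gap between fingers / background
    return fingers

# Three fingers crossing the scanline against a ~900 mm background:
row = [900, 900, 500, 510, 900, 480, 495, 900, 900, 520, 530, 900]
print(count_fingers(row))  # → 3
```

Running something like this on every row of every frame is cheap enough to sustain 30 fps, which is one reason depth thresholding is such a popular first stage in Kinect hand-tracking hacks.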