Like any trade, once you learn the ropes of 3D modeling and do it enough, it becomes second nature. Certain college programs even dare to think they can teach it to you in just a semester or two. If you’re one of those people who find all that clicking rather tedious, or you simply want to get in on the ground floor of a new kind of control scheme for 3D modeling, Purdue University has created a new gesture tool just for you.
To fulfill your Minority Report reference for the day, the new tool, called Shape-It-Up, uses a Kinect and some specialized algorithms to sense gestures that you make in the air with your hands. The tool then interprets these gestures and applies them to a 3D modeling program, resulting in a 3D model you created via your best Tom Cruise impersonation, rather than with the tried-and-true mouse. The intent of Shape-It-Up, aside from delivering a new method of 3D-modeling input, is to make it feel like you’re actually molding the model with your bare hands, similar to molding clay. The Purdue team envisions the new tool being used in any field where 3D modeling is used, such as video game development, engineering design, or architecture.
In the demonstration above, it’s clear that it’s quite easy to create and manipulate the 3D model, though the tool doesn’t yet look sophisticated enough to, for example, painlessly create the intricate details of a main character’s face in a current-gen, triple-A game. However, it does look like it can easily handle less important items, such as environmental details.
The system works through a set of predefined gestures, but they’re more complex and intuitive than the phrase “predefined gestures” makes it seem. It’s not like the Wii’s remote (before Wii MotionPlus), where you just kind of waggle in a certain way, the game interprets the motion no differently than a button press, and then an in-game action activates that doesn’t mimic the waggle at all. In effect, Shape-It-Up first recognizes a certain hand gesture — similar to how the Wii recognizes a certain kind of waggle — but that then triggers a more fluid gesture-recognition mode that allows the user to manipulate the 3D model more intricately. In a simple sense, think of it like a mouse’s click-and-drag: you click to initiate the action, then can fluidly manipulate the drag however you wish.
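That two-phase idea — a discrete trigger pose that switches the system into a continuous manipulation mode — can be sketched as a tiny state machine. The sketch below is purely illustrative and is not Purdue’s actual implementation; the frame format, the “two fists” grab pose, and all names are assumptions invented for the example.

```python
# Hypothetical two-phase gesture handler, illustrating the click-and-drag analogy.
# Phase 1: recognize a discrete trigger pose (like a mouse click).
# Phase 2: stream hand positions continuously into the model (like a drag).

IDLE, MANIPULATING = "idle", "manipulating"

class GestureSession:
    def __init__(self):
        self.state = IDLE
        self.deformations = []  # positions recorded while manipulating

    def is_grab_pose(self, frame):
        # Placeholder classifier: treat two closed fists as the "grab" trigger.
        return frame.get("left_hand") == "fist" and frame.get("right_hand") == "fist"

    def update(self, frame):
        if self.state == IDLE:
            # Phase 1: discrete recognition, analogous to pressing the mouse button.
            if self.is_grab_pose(frame):
                self.state = MANIPULATING
        elif self.state == MANIPULATING:
            if not self.is_grab_pose(frame):
                # Releasing the pose ends the manipulation, like releasing a drag.
                self.state = IDLE
            else:
                # Phase 2: continuous, fluid manipulation driven by hand positions.
                self.deformations.append(frame["positions"])

# Feed a short sequence of made-up Kinect frames through the session.
session = GestureSession()
frames = [
    {"left_hand": "open", "right_hand": "open"},                       # ignored
    {"left_hand": "fist", "right_hand": "fist", "positions": (0, 0)},  # trigger
    {"left_hand": "fist", "right_hand": "fist", "positions": (1, 2)},  # deform
    {"left_hand": "open", "right_hand": "open"},                       # release
]
for f in frames:
    session.update(f)
```

The key design point the article describes is that the trigger gesture doesn’t map to a canned action; it only opens a channel through which the subsequent hand motion flows freely into the model.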
Since all it really takes to create a 3D-printed object (once you have the 3D printer and printing material, that is) is a 3D model of that object, Shape-It-Up also works as a fun way to create 3D-printed items. Simply create your wacky 3D model using Shape-It-Up, feed the model data into whatever 3D printing program you prefer, then let the printer have at it. You can even nonchalantly brag about how you made that weird mushroom figurine or flower vase with your bare hands, perhaps subtly flexing while you do so.
Currently, there’s no word on when — or if — Shape-It-Up will hit the market, but considering the team envisions the system being used by game designers and architects, it’s safe to assume we’ll be hearing about a public release at some point.
Read More: http://www.extremetech.com/