3D-printed Items That Recognize How A User Is Interacting With Them

By: April Carson

Scientists have developed a new method for 3D printing mechanisms that detect how much force is applied to them. Because these structures are printed as a single piece of material, designers can rapidly prototype interactive input devices, such as joysticks or hand controllers, and test their responsiveness without fabricating an entire finished object first.

Custom software makes the process of creating these interactive devices easier. The researchers integrated electrodes into structures made from metamaterials, which are materials divided into a grid of repeating cells. They also created an editing program that guides users through assembling these building blocks into new devices, while allowing individualized layouts to achieve the desired behavior.

"We've created a metamaterial that can sense how many degrees someone has turned the door handle. Our work enables you to customize your mechanism for whatever needs it may have."

"What's really unique about our design," says co-lead author Jun Gong, "is that we're able to detect not only rotation but also its magnitude, meaning whether someone rotated by just one degree or all the way up to 180!"

Gong, a graduate student in EECS at MIT, wrote the paper alongside fellow lead authors Olivia Seow and Cedric Honnet, who are also affiliated with the MIT Media Lab.

Other coauthors include Jack Forman of Harvard University and associate professor Stefanie Mueller of CSAIL. The research will be presented next month at the Association for Computing Machinery Symposium on User Interface Software and Technology.

The project’s most exciting feature is the ability to integrate sensing directly into the material structure of an object. This could enable intelligent environments in which objects sense how people interact with them: a chair or couch made from smart material could detect when someone sits down and respond by turning on the lights or the TV in a home, or even collect data about the sitter's body for researchers to analyze later.

When the user applies pressure to a metamaterial object, some of the flexible interior cells stretch or compress. The researchers took advantage of this by developing “conductive shear cells,” which are flexible cells with two conductive walls and two nonconductive walls. The conductive walls serve as electrodes.

When a user applies pressure to the metamaterial mechanism — pushing a joystick handle or pressing buttons on a controller, for example — the conductive shear cells stretch or compress, and the distance and overlapping area between opposing electrodes change. Those changes can be measured with capacitive sensing and used to compute the magnitude, direction, and rotation of the applied force.
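To make the capacitive principle concrete, here is a minimal sketch, not from the MetaSense paper, that models a shear cell's opposing walls as a parallel-plate capacitor (C = εA/d): compressing the cell narrows the electrode gap, which raises the capacitance in a measurable way. All values and function names are illustrative assumptions.

```python
# Illustrative model of capacitive sensing in a conductive shear cell.
# The geometry and numbers are hypothetical, not from the actual device.

EPSILON_0 = 8.854e-12  # vacuum permittivity, in farads per meter

def capacitance(area_m2, gap_m):
    """Parallel-plate approximation: C = epsilon_0 * A / d."""
    return EPSILON_0 * area_m2 / gap_m

# Same electrode area at rest vs. under pressure: the gap narrows,
# so the measured capacitance rises.
c_rest = capacitance(area_m2=1e-4, gap_m=2e-3)
c_pressed = capacitance(area_m2=1e-4, gap_m=1e-3)

# The relative change in capacitance serves as a proxy for applied force.
delta = (c_pressed - c_rest) / c_rest
print(delta)  # halving the gap doubles C, so the relative change is 1.0
```

In a real device the relationship between deformation and capacitance is calibrated rather than derived, but the direction of the effect is the same: more compression, more capacitance.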

The researchers created a metamaterial joystick with four conductive shear cells arranged around the base of the handle, one for each direction (up, down, left, and right). As the user moves the handle, the distance and overlapping area between opposing conductive walls change, so the device can sense both the direction and the strength of each applied force. An embedded microprocessor processes those values and uses them to update the movement of a graphical object on an attached display.
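The four-cell layout above can be sketched in code. This is a hypothetical mapping, not the authors' firmware: the cell arrangement follows the article, but the arithmetic that turns four normalized readings into a 2D direction and magnitude is our own assumption.

```python
# Hypothetical sketch: convert four shear-cell readings into a joystick
# vector. Each reading is a normalized capacitance change in [0, 1].
import math

def joystick_vector(up, down, left, right):
    """Return (dx, dy, magnitude) from the four opposing cell readings."""
    dx = right - left          # opposing cells cancel when balanced
    dy = up - down
    return dx, dy, math.hypot(dx, dy)

# Pushing the handle mostly to the right deforms the "right" cell most.
dx, dy, force = joystick_vector(up=0.1, down=0.1, left=0.0, right=0.8)
print(dx, dy)  # 0.8 0.0 -> a purely rightward push
```

Differencing opposing cells is a common trick in sensor design: it cancels out deformation that affects all cells equally, such as someone pressing straight down on the handle.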

The researchers would like to strengthen MetaSense's algorithms in the future to allow for more complex devices. They also envision mechanisms with hundreds or thousands of conductive shear cells: embedding that many cells within a large mechanism could provide high-resolution, real-time visualizations of how a person is interacting with every side of the device. "Eventually, we want to build devices that can see in six dimensions: up/down, back/forth, left/right, and twist," says Gong.


