Or, with the flick of the wrist, you'll "throw" photos and videos from your iPod touch onto your desktop Mac's display.
Or you'll gently blow a file to a close friend.
This user-interface wackiness – and more – is described in "Intuitive, gesture-based communications with physics metaphors", a patent application filed in January of last year and published this Thursday.
The lead inventor listed in the filing, Brett Bilbrey, is the senior manager of Apple's Technology Advancement group – and as he says in his LinkedIn profile after giving his title: "And as you can imagine, that is about all I can say about that."
Indeed. The only way we mere mortals can glimpse what may be going on behind Cupertino's brushed-aluminum curtain is through patent applications such as this one – and, by the way, you can join in the fun at the United States Patent and Trademark Office's search page each and every Thursday, when applications are published.
This week's look-see turned up one of the more offbeat UI ideas we've seen in some time. The basic notion is that a device's "one or more onboard motion sensors" could detect a gesture such as tipping or flipping, then animate onscreen items according to a "physics metaphor" – ejecting them from one device and landing them on another with which the first shares a network connection.

In addition to being activated by motion sensors – think accelerometers or gyroscopes – flicking and tossing could also be triggered by gestures on a trackpad or touch-sensitive display. You might, for example, shove photos one by one off your iPad and onto a friend's.
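The filing doesn't publish an algorithm, but the motion-sensor side of the idea can be sketched roughly. Here is a minimal, hypothetical example of spotting a "flick" in a stream of accelerometer samples by looking for a sudden jump in acceleration magnitude; the function name and threshold are inventions for illustration, not anything from the patent:

```python
def detect_flick(samples, threshold=2.5):
    """Return True if consecutive acceleration magnitudes jump sharply,
    suggesting a quick flick rather than ordinary handling.

    samples: sequence of (x, y, z) accelerometer readings in g.
    threshold: invented jerk threshold; a real device would tune this.
    """
    def magnitude(s):
        x, y, z = s
        return (x * x + y * y + z * z) ** 0.5

    mags = [magnitude(s) for s in samples]
    # A flick shows up as a large change in magnitude between samples.
    return any(abs(b - a) > threshold for a, b in zip(mags, mags[1:]))


# Steady handling (device resting flat) vs. a sudden jolt:
steady = [(0.0, 0.0, 1.0)] * 5
flick = [(0.0, 0.0, 1.0), (0.0, 0.0, 1.1), (3.0, 1.0, 4.0)]
```

A real implementation would also debounce, filter out gravity, and decide the flick's direction before ejecting anything onscreen.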
Our particular favorite method of moving on-screen objects is described as follows: "A user can initiate transmission of a selected file by generally aligning the device with the target device and then blowing air across the display of the device. One or more microphones on the device can detect the sound of moving air and the direction of airflow. The direction of airflow can be used to infer the intent of the user to identify a target device."
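One plausible (and entirely hypothetical) way to realize that description: compare the sound energy arriving at two microphones and treat the louder side as the direction of airflow. The function, microphone layout, and threshold below are assumptions for illustration only:

```python
def airflow_direction(left_samples, right_samples, min_level=0.2):
    """Compare RMS energy at a left and right microphone.

    Returns 'left' or 'right' to indicate the inferred airflow direction,
    or None if neither channel is loud enough to suggest someone blowing.
    min_level is an invented noise floor, not a value from the filing.
    """
    def rms(samples):
        return (sum(s * s for s in samples) / len(samples)) ** 0.5

    left, right = rms(left_samples), rms(right_samples)
    if max(left, right) < min_level:
        return None  # too quiet: probably no airflow at all
    return 'left' if left > right else 'right'


# A strong signal on the left channel suggests air moving toward the left:
direction = airflow_direction([0.5, -0.5, 0.5, -0.5], [0.1, -0.1, 0.1, -0.1])
```

An actual product would presumably also match the spectral signature of wind noise, so that ordinary speech doesn't send your files flying.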

Alternatively, an object might simply dissolve "like a tablet in water", then rematerialize on the target device. On that device, objects could be made to appear as if they are floating above other on-screen elements until their transfer is accepted.
Sounds could accompany the transfers, as well. Examples given in the filing include "the sound of liquid pouring, a tablet fizzing, gas through a valve, a sci-fi teleporter, or other sound that audibly represent the transfer of a material from one point to another."
Gestures could also be used to initiate actions other than data transfer. One example provided is for a user to lift a device "skyward in a gesture that symbolizes uplifting a torch" in order to initiate a network connection.
