In an apparent bid to extend the use of its multi-touch technology to devices other than iPhone, Apple in a new patent filing discusses the concept of a software-based multi-touch gesture dictionary that would let users assign custom multi-touch finger movements to computer actions.
"Many expect that the use of multi-finger, touch-sensitive user interfaces will become widely adopted for interacting with computers and other electronic devices, allowing computer input to become even more straightforward and intuitive," Apple wrote in the January 3, 2007 filing.
"Users of these multi-touch interfaces may make use of hand and finger gestures to interact with their computers in ways that a conventional mouse and keyboard cannot easily achieve. A multi-touch gesture can be as simple as using one or two fingers to trace out a particular trajectory or pattern, or as intricate as using all the fingers of both hands in a complex sequence of movements reminiscent of American Sign Language."
The Cupertino-based electronics maker further explains that each motion of the hands and fingers, whether simple or complex, would convey a specific meaning or action for the computer or electronic device to act upon at the user's behest:
"The number of multi-touch gestures can be quite large because of the wide range of possible motions by fingers and hands. It is conceivable that an entirely new gesture language might evolve that would allow users to convey complex meaning and commands to computers and electronic devices by moving their hands and fingers in particular patterns."
To manage the new language, Apple's patent proposal calls for a "dictionary of multi-touch gestures" that is interactively presented to a user of a computer system having a multi-touch user interface. In one embodiment, the company said the dictionary may take the form of a dedicated computer application that identifies a chord (e.g., a combination of fingers, thumbs, and/or other hand parts) presented to the multi-touch interface by the user and displays a dictionary entry for the identified chord.
A dictionary entry may include, for example, visual depictions of one or more motions that may be associated with the chord, along with the meanings of the gestures comprising the identified chord and the various motions.
"The visual depictions may take the form of motion icons having a graphical depiction of the motion and a textual description of the meaning of the gesture," Apple wrote. "The visual depictions may also take the form of animations of the one or more motions. The application could also identify one or more motions of the chord by the user and provide visual and/or audible feedback to the user indicating the gesture formed and its meaning."
In another embodiment, Apple said a dictionary application can run in the background while other applications on the computer system are used. In this scenario, if a user presents a chord associated with a gesture without a motion completing the gesture, the dictionary application can present a dictionary entry for the presented chord.
Apple explains: "As in other embodiments, the dictionary entry may include visual depictions of one or more motions and meanings of the gestures comprising the identified chord and the various motions. Also as in other embodiments, the visual depictions may take the form of motion icons or animations of the motions. A user guided by the dictionary entry may perform a motion completing a gesture, and the system may execute a meaning of the gesture and may also provide visual and/or audible feedback indicating the meaning of the gesture. "
In yet another approach to its dictionary concept, Apple describes an interactive computer application that allows a user to assign meanings to multi-touch gestures. The computer application may display a dictionary entry (like those described above, for example) and accept inputs from the user to assign a meaning to one or more of the gestures in the dictionary entry.
Apple further explained that the application could be used to assign meanings to gestures that have no default meanings selected by a system designer, or to change the meanings of gestures that do have designer-assigned defaults. The application, the company added, may also include program logic to present motions that are easier to perform in a different form from those that are more difficult. Alternatively, the more difficult motions may not be displayed at all.
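Under the same assumptions, assigning or overriding a gesture's meaning and hiding the more difficult motions could be sketched as follows:

```swift
// Sketch of user-assignable meanings and difficulty filtering, using the
// hypothetical Chord type from the earlier sketch. Not Apple's implementation.

struct CustomizableMotion {
    let motionDescription: String
    var meaning: String          // may start as a designer default or be empty
    let isDifficult: Bool        // flagged by the system designer
}

struct CustomizableEntry {
    let chord: Chord
    var motions: [CustomizableMotion]

    // Assign (or change) the meaning of one motion in this entry.
    mutating func assign(meaning: String, toMotion description: String) {
        if let i = motions.firstIndex(where: { $0.motionDescription == description }) {
            motions[i].meaning = meaning
        }
    }

    // Present easier motions normally; difficult ones can be omitted entirely.
    func displayableMotions(hidingDifficult: Bool) -> [CustomizableMotion] {
        hidingDifficult ? motions.filter { !$0.isDifficult } : motions
    }
}
```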
Finally, Apple explains that the gesture dictionary application could be triggered by events other than the presentation of a chord:
"These events may include hand parts hovering over a multi-touch surface, audible events (for example, voice commands), activation of one or more buttons on a device, or applying a force and/or touch to a force and/or touch sensitive portion of a device. These events may correspond to chords and invoke a dictionary entry corresponding to such a chord. Alternatively or additionally, these events may correspond to other groupings of gestures not based on chords, such as custom dictionary entries. In yet another variation, the event triggering a gesture dictionary application may not correspond to a gesture grouping at all. In these cases, a dictionary index may be invoked, allowing a user to select from a plurality of dictionary entries."
According to the filing -- credited to Apple employees John Elias, Wayne Westerman, and Myra Haggerty -- the multi-touch gesture dictionary concept could be deployed on a notebook computer, tablet computer, handheld computer, personal digital assistant, media player, mobile telephone, or other consumer electronics devices of a similar nature.