News Column

Patent Issued for Collaborative Gesture-Based Input Language

June 26, 2014



By a News Reporter-Staff News Editor at Computer Weekly News -- From Alexandria, Virginia, VerticalNews journalists report that a patent by the inventors Ouyang, Yu (Cambridge, MA); Li, Yang (Palo Alto, CA), filed on September 20, 2011, was published online on June 10, 2014.

The patent's assignee for patent number 8751972 is Google Inc. (Mountain View, CA).

News editors obtained the following quote from the background information supplied by the inventors: "Some known computing devices may include various hardware and software elements that may enable performance of a variety of tasks in response to user input. Conventionally, a user may interact with or otherwise activate the hardware and software elements through one or more interfaces (such as a keyboard, a mouse, a touch screen, etc.) by selecting one or more pre-configured graphical and/or hardware buttons (such as an icon, a switch, etc.).

"For example, a user may use an Internet-enabled mobile device that browses the World Wide Web. The user may manually enter a Uniform Resource Locator (URL) for a desired webpage using a virtual keyboard or similar input device of the mobile device. The process of entering the URL may be complicated by the size or configuration of the input device. More specifically, the utility of virtual keyboards and other input devices designed for use with a mobile device (e.g., a smartphone) is often limited by the relatively small physical dimensions of the mobile device itself, negatively impacting the user experience.

"One conventional solution is to implement gesture-based shortcuts that may be used to control a mobile device. A gesture (such as a pattern traced by a fingertip on a touch screen or other presence-sensitive device) may be detected by a mobile device. The detected gesture may be identified by the device and matched to one or more predefined shortcuts corresponding to one or more actions and/or operations performed by the device. In some instances, a computing device may graphically present to the user a list of the shortcuts associated with the detected gesture for selection by the user. Based at least in part on a received user selection, the computing device may then perform the one or more actions/operations corresponding to the selected shortcut (e.g., open a web browser application and connect to a specific location, execute a program, etc.).
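The article does not show how a traced gesture is matched to predefined shortcuts, but a minimal sketch of one common approach (template matching over resampled, scale-normalized traces, in the spirit of unistroke recognizers) might look like the following. The template names, point count, and ranking scheme are illustrative assumptions, not details from the patent:

```python
import math

def normalize(points, n=16):
    """Resample a gesture trace to n evenly spaced points (by path length)
    and scale it into a unit box, so traces drawn at different sizes and
    speeds can be compared point-to-point."""
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    resampled = []
    for i in range(n):
        target = total * i / (n - 1)
        # Find the segment containing the target arc length.
        j = max(k for k, d in enumerate(dists) if d <= target)
        j = min(j, len(points) - 2)
        seg = (dists[j + 1] - dists[j]) or 1.0
        t = (target - dists[j]) / seg
        x = points[j][0] + t * (points[j + 1][0] - points[j][0])
        y = points[j][1] + t * (points[j + 1][1] - points[j][1])
        resampled.append((x, y))
    # Scale into a unit bounding box.
    xs, ys = zip(*resampled)
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    return [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in resampled]

def match_shortcuts(trace, templates, top_k=3):
    """Rank predefined shortcut templates by mean point-to-point distance
    to the detected trace; the top candidates would be shown to the user."""
    query = normalize(trace)
    scored = []
    for action, template_trace in templates.items():
        tpl = normalize(template_trace)
        dist = sum(math.hypot(qx - tx, qy - ty)
                   for (qx, qy), (tx, ty) in zip(query, tpl)) / len(query)
        scored.append((dist, action))
    return [action for _, action in sorted(scored)[:top_k]]
```

For example, with a horizontal-stroke template mapped to "open_browser" and a vertical-stroke template mapped to "compose_email", a roughly horizontal trace would rank "open_browser" first.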

"In some instances, however, a user may not be willing to spend much effort to associate the gestures and shortcuts necessary to define a custom, user-specific gesture-based language. Further, the more gestures and shortcuts exist, the less likely it may be that the user will be able to recall the various custom-defined gestures and associated shortcuts. In other words, user-defined gesture-based languages may not scale. As the user-defined gesture-based language develops, the number of gesture and shortcut associations may reach the thousands, hundreds of thousands, or more, such that the user may not be able to recall the various gestures and associated shortcuts of the gesture-based language. When a user is unable to recall the various gestures and associated shortcuts, the user may stop using or otherwise abandon the gesture-based language and, instead, rely upon the keyboard for manual input."

As a supplement to the background information on this patent, VerticalNews correspondents also obtained the inventors' summary information for this patent: "In one example of the disclosure, a method includes receiving, by a computing device, data representative of a gesture detected by a presence-sensitive screen of the computing device, identifying, by the computing device, a shortcut associated with the gesture, and providing for display, by the computing device, data representative of the shortcut. The shortcut corresponds to an action to be performed by the computing device. Identifying the shortcut comprises accessing at least a portion of an aggregated group of gesture-shortcut associations determined based at least in part upon prior user input from at least one other user.

"In another example of the disclosure, a method includes receiving, by a server and from a plurality of computing devices, data representative of a group of gestures detected by the plurality of computing devices, and receiving, by the server, data representative of one or more shortcuts associated with the group of gestures from the plurality of computing devices, wherein each of the shortcuts corresponds to an action performed by at least one of the plurality of computing devices. The method further includes aggregating, by the server, the data representative of the group of gestures and the data representative of the associated shortcuts received from the plurality of computing devices based at least in part on detected similarities between at least one of 1) the group of gestures and 2) the associated shortcuts, and defining, by the server, a gesture-shortcut language based at least in part on the aggregated data, wherein the gesture-shortcut language includes at least a portion of the aggregated data representative of the group of gestures and associated shortcuts.
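The server-side claim — aggregating gesture and shortcut reports from many devices by similarity, then defining a gesture-shortcut language from the aggregate — can be sketched as a simple clustering-and-voting pass. The similarity test is deliberately left as a caller-supplied predicate, since the patent text does not specify one; everything here is an assumed simplification:

```python
from collections import Counter

def aggregate_language(reports, similar):
    """Group gesture reports from many devices into clusters of mutually
    similar gestures, then map each cluster's representative gesture to
    its most frequently associated shortcut.

    reports: iterable of (gesture, shortcut) pairs from many devices.
    similar: predicate deciding whether two gestures match.
    """
    clusters = []  # each entry: (representative_gesture, Counter of shortcuts)
    for gesture, shortcut in reports:
        for rep, votes in clusters:
            if similar(rep, gesture):
                votes[shortcut] += 1
                break
        else:
            clusters.append((gesture, Counter({shortcut: 1})))
    # The "gesture-shortcut language": representative -> majority shortcut.
    return {rep: votes.most_common(1)[0][0] for rep, votes in clusters}
```

With exact string matching as the similarity predicate, three "circle" reports split two-to-one between "open maps" and "open browser" would define circle → open maps; a real system would use a geometric similarity measure like the one sketched earlier.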

"In another example of the disclosure, a computer-readable storage medium is encoded with instructions that, when executed, cause one or more processors of a computing device to perform operations, the operations including receiving data representative of a gesture detected by a presence-sensitive screen of the computing device. The operations further include identifying a shortcut associated with the gesture, wherein the shortcut corresponds to an action to be performed by the computing device, and wherein identifying the shortcut comprises accessing at least a portion of an aggregated group of gesture-shortcut associations determined based at least in part upon prior user input from at least one other user, and providing for display data representative of the shortcut.

"In another example of the disclosure, a computer-readable storage medium is encoded with instructions that, when executed, cause one or more processors of a computing device to perform operations, the operations including receiving data representative of a group of gestures detected by a plurality of computing devices, and receiving data representative of one or more shortcuts associated with the group of gestures, wherein each of the shortcuts corresponds to an action performed by at least one of the plurality of computing devices. The operations further include aggregating the data representative of the group of gestures and the data representative of the associated shortcuts received from the plurality of computing devices based at least in part on detected similarities between at least one of 1) the group of gestures and 2) the associated shortcuts, and defining a gesture-shortcut language based at least in part on the aggregated data, wherein the gesture-shortcut language includes at least a portion of the aggregated data representative of the group of gestures and associated shortcuts.

"In another example of the disclosure, a device comprises at least one processor, a network interface, and a language development module. The network interface is configured to receive data representative of gestures detected by a plurality of computing devices and receive data representative of one or more shortcuts associated with the gestures, wherein each of the shortcuts corresponds to an action performed by at least one of the plurality of computing devices. The language development module is operable by the at least one processor to aggregate the data representative of the group of gestures and the data representative of the associated shortcuts received from the plurality of computing devices based at least in part on detected similarities between at least one of 1) the group of gestures and 2) the associated shortcuts, and to define a gesture-shortcut language based at least in part on the aggregated data, wherein the gesture-shortcut language includes at least a portion of the aggregated data representative of the group of gestures and associated shortcuts.

"The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims."

For additional information on this patent, see: Ouyang, Yu; Li, Yang. Collaborative Gesture-Based Input Language. U.S. Patent Number 8751972, filed September 20, 2011, and published online on June 10, 2014. Patent URL: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.htm&r=1&f=G&l=50&s1=8751972.PN.&OS=PN/8751972RS=PN/8751972

Keywords for this news article include: Software, Google Inc.

Our reports deliver fact-based news of research and discoveries from around the world. Copyright 2014, NewsRx LLC





Source: Computer Weekly News







