Apple exploring motion-adaptive iPhone with video chat

By Sam Oliver

Patent filings from Apple continue to offer clues as to where the company may take its iPhone interface in the coming years. One new application depicts a version of the handset with a front-facing video camera and a software interface capable of adjusting itself for more precise interaction when the user carrying the phone is in motion.

While the former discovery hints at the inevitable adoption of video conferencing capabilities by the iPhone in the coming years, the adaptive software interface concept could materialize much sooner. It would improve a user's accuracy in making touch selections by increasing the size of user interface elements on the touchscreen when it's determined that the user is operating the device while jogging or engaged in some other kind of motion-based activity.

For example, Apple notes that some users may use their iPod touch or iPhone while out for a run or hustling between business meetings. At these times, it may be desirable to simultaneously place a call by making a selection from the contact list, or to change to a different music track by making a selection in an album list. However, these simple tasks can be more challenging on the go than when stationary, because they require the user to divert attention from their primary activity to make an accurate selection on what could be a sweat-slicked or jittering display.

To solve this problem, the Cupertino-based company proposes an updated version of iPhone software that can detect when the device is in motion and then compare the detected degree of motion to one or more predetermined "signatures of motion." The iPhone software could then adjust itself by enlarging selection areas on the screen to a degree suitable for the current motion of the device and user.
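Neither the filing nor Apple's description specifies how that comparison would be implemented, but the logic is straightforward to sketch. The Swift snippet below is a minimal illustration of the concept, not Apple's method: it reads the accelerometer through Core Motion, buckets the measured degree of motion against hypothetical "signature" thresholds, and derives a scale factor for touch targets. The type names, threshold values, and scale factors are all assumptions for the sake of the example.

```swift
import CoreMotion

// A minimal sketch of the concept, not Apple's implementation: classify
// accelerometer readings against predetermined motion "signatures" and
// derive a scale factor for touch targets.
enum MotionSignature {
    case stationary, walking, jogging

    /// Hypothetical thresholds on the magnitude of user acceleration, in g's.
    static func classify(magnitude: Double) -> MotionSignature {
        switch magnitude {
        case ..<0.05: return .stationary
        case ..<0.3:  return .walking
        default:      return .jogging
        }
    }

    /// How much to enlarge touch targets for this degree of motion (assumed values).
    var uiScale: Double {
        switch self {
        case .stationary: return 1.0
        case .walking:    return 1.25
        case .jogging:    return 1.5
        }
    }
}

let motionManager = CMMotionManager()
motionManager.accelerometerUpdateInterval = 0.1
motionManager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    // Deviation of the acceleration magnitude from gravity (~1 g) serves
    // as a crude stand-in for the patent's "degree of motion"; a real
    // implementation would smooth this over a window of samples.
    let magnitude = abs((a.x * a.x + a.y * a.y + a.z * a.z).squareRoot() - 1.0)
    let signature = MotionSignature.classify(magnitude: magnitude)
    print("Scale touch targets by \(signature.uiScale)")
}
```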

"For example, if the user wishes to view the contact information for 'John Adams,' the user touches the display over the area of the row for the contact 'John Adams,'" Apple says. "While the device is moving, the motion of the device can be detected. The device can change the size of the rows of the contacts in the contact list application to give the user a larger target area for each contact. For example, the height of a row can be increased. This gives the user a larger touch area with which to select a contact. In some implementations, the height of the toolbar can be increased as well."

Similarly, the size of elements on the iPhone's home screen could be adjusted after the same detections, in most cases enlarging application icons based on the degree of detected motion. The same concept could also carry over to the on-screen keyboard, the company adds, where each key could be enlarged for more accurate text input while on the go.
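As one hypothetical way to wire that up, the scale factor derived above could feed straight into a table view's row height, enlarging each contact's touch area as the quoted passage describes. In this UIKit sketch, `currentMotionScale` is an assumed property updated by the motion classifier:

```swift
import UIKit

class ContactListController: UITableViewController {
    /// Assumed to be updated from a motion classifier like the one sketched above.
    var currentMotionScale: CGFloat = 1.0 {
        didSet {
            // Re-lay out the rows whenever the degree of motion changes.
            tableView.beginUpdates()
            tableView.endUpdates()
        }
    }

    private let baseRowHeight: CGFloat = 44.0

    // Enlarge each contact row's touch area in proportion to motion.
    override func tableView(_ tableView: UITableView,
                            heightForRowAt indexPath: IndexPath) -> CGFloat {
        return baseRowHeight * currentMotionScale
    }
}
```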

The 16-page filing, submitted in November 2007, also suggests that interface elements -- such as an array of home screen icons -- could shift their position on the screen based on predictions of where the user may touch, though the need for such adjustments isn't entirely clear from Apple's description.

"The shift moves the target touch areas of the display objects to a different position. In some implementations, the new position is a predetermined distance from the original position," the company says. "In some other implementations, the new position is determined by the device based on a prediction of where the user will touch the touch-sensitive display if the user wanted to select the user interface element while the device is in motion."

The filing is credited to Apple employee John Louch. For those interested, the front-facing video camera depicted in Apple's illustrations is the element labeled "180" near the proximity and ambient light sensors.