Apple's interest in allowing users to record better video on their iPhone was revealed this week in a newly published patent application discovered by AppleInsider. Entitled "Accelerometer/Gyro-Facilitated Video Stabilization," it describes how an iPhone might use motion sensing data to compensate for any jittering in a recorded video.
The filing notes that software-based video stabilization already exists and can improve the perceptual quality of a video sequence, but not without consequences. For example, current stabilization techniques can use up "considerable resources," which can be particularly detrimental on a portable, battery-powered device like an iPhone.
In addition, while advanced algorithms can help offset shakiness in a video, they can sometimes generate incorrect motion estimates that fail to improve video quality at all.
But now consumer devices like the iPhone include gyroscopes and accelerometers, providing motion data to software on the device. Even this data isn't a perfect solution for video stabilization, however, due to "noise" in the sensor readings, Apple said.
"Motion detection devices can provide metadata that indicates motion effects of a camera during video capture, however, even though the motion detectors provide data relating to global motion of the camera, the level of shakiness between frames is often comparable to the noise level of the motion detector data," the filing reads. "Such high level of the noise in data prohibits (direct use) of accelerometer data in video stabilization."
Apple's solution would be to selectively control the motion stabilization feature, only adjusting and improving the video captured when the system determines it is necessary. This determination would be made by comparing motion sensor data to a pre-set threshold.
"Based on the determination, motion stabilization may be suspended on select portions of a captured video sequence," the application states.
The described system could go frame by frame in a captured video, comparing the file with synced motion detection data. By doing this, an iPhone could determine if and when image normalization processes are required, only applying the effect when necessary.
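The selective approach described above can be illustrated with a minimal sketch. All names, the threshold value, and the placeholder stabilization step below are hypothetical illustrations, not drawn from the patent itself; the point is simply that per-frame motion readings are compared against a pre-set threshold, and the (expensive) stabilization pass is skipped whenever the motion falls below it.

```python
# Hypothetical sketch of threshold-gated video stabilization.
# Names and values are illustrative, not from Apple's filing.

SHAKE_THRESHOLD = 0.15  # hypothetical motion-magnitude threshold


def stabilize_frame(frame):
    """Placeholder for an image-warping step; here it just tags the frame."""
    return {**frame, "stabilized": True}


def selective_stabilize(frames, motions, threshold=SHAKE_THRESHOLD):
    """Apply stabilization only where synced motion data exceeds the threshold.

    `frames` and `motions` are assumed to be aligned frame by frame,
    as the patent describes syncing video with motion-detector data.
    """
    out = []
    for frame, motion in zip(frames, motions):
        if abs(motion) > threshold:
            out.append(stabilize_frame(frame))
        else:
            # Below the threshold: skip the costly correction,
            # saving battery and CPU on a portable device.
            out.append(frame)
    return out
```

The gating check is what distinguishes this from always-on software stabilization: frames whose sensor readings stay under the threshold pass through untouched, conserving the "considerable resources" the filing mentions.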
The proposed invention, made public this week, was first filed in April 2010. It is credited to Yuxin Liu, Xiaojin Shi, James Oliver Normile, and Hsi-jung Wu.