Apple may be preparing a response to the version of Kinect that Microsoft is developing for Windows Phone handsets. Still unnamed, the Cupertino company has reportedly patented a three-dimensional interface that would allow users to manage the contents and functions of the iPhone, iPad and iPod Touch through proximity sensors, gesture commands and the now-classic accelerometer, or position detector.
The interface, glimpsed via the Patently Apple site, shows a kind of cube with one wall left open, giving the user a view of its contents. Within this virtual space, elements would be placed as objects that the user could manipulate from a distance.
As the patent describes it, the size of the iPhone's screen prevents fingers from managing content as it is conceived in this multidimensional space, so a sensor that interprets movements would be the most appropriate way to work with the system.
For now it is difficult to predict when we might see this interface in action on Apple devices, although it seems fairly clear that not every handset on the market will be compatible. It can be deduced that the firm's single-core devices (iPhone 3GS, iPhone 4, iPad) would lack the resources to run this system, given the specifications it would presumably require.
It is even in doubt whether the iPhone 4S and iPad 2, with their dual-core A5 processors, would be up to the task. What we can glimpse is the possibility that future devices, such as the iPhone 5 or iPad 3, will carry graphics hardware capable of running the interface revealed in these patents. As has been rumored, the next Apple terminals would come equipped with new GPUs, as well as quad-core processors able to supply the power a system like this would demand.