Apple will improve Vision Pro control thanks to this patented feature

The Californian company's new spatial computing and mixed reality headset, Apple Vision Pro, is closer than ever. Although it is set to launch publicly in 2024, that has not stopped Apple from continuing to develop improvements and file patents for new features. Want to know all the details? We explain them in this article.

It was the patent-tracking outlet Patently Apple that surfaced the documents registered with the United States Patent and Trademark Office. The most important of them, and the one we explain here, is called "Activate actionable items with hand gestures".

Apple Vision Pro's interactivity is based on taking elements of physical reality and combining them with components of the digital environment. Using our hands, we can perform gestures that the headset detects through its sensors, and we can also place digital interface elements on physical surfaces for full control. Now, with this invention, Apple wants to go one step further.

Detect objects and activate them

The basic process described in the patent consists of "receiving one or more images of the physical environment from the image sensor". From there, the detected elements are linked to various actions that can be triggered if we execute the correct command, which in this case is a hand gesture.

The images included in the patent show a physical environment containing different types of objects. While wearing the Apple Vision Pro headset, virtual switches and markers are placed on these elements, and we can select them ourselves to perform different tasks, as sketched below.
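As a rough illustration of the idea, here is a minimal Swift sketch of what mapping detected physical objects to gesture-triggered actions could look like. This is not Apple's actual implementation and none of these types come from visionOS APIs; the names (`DetectedObject`, `GestureActionRegistry`, and so on) are hypothetical.

```swift
import Foundation

// Hypothetical sketch: associate objects detected in the camera images with
// actions that fire when the user performs a matching hand gesture.

enum HandGesture {
    case pinch
    case tap
    case swipe
}

struct DetectedObject {
    let identifier: String       // e.g. "lamp", "speaker"
    let position: SIMD3<Float>   // location in the physical environment
}

struct ActionableItem {
    let object: DetectedObject
    let gesture: HandGesture
    let action: () -> Void       // task to run when the gesture is recognized
}

final class GestureActionRegistry {
    private var items: [ActionableItem] = []

    // Attach a virtual "switch" to a detected object.
    func register(_ item: ActionableItem) {
        items.append(item)
    }

    // Called when the headset's sensors recognize a gesture aimed at an object.
    func handle(gesture: HandGesture, on objectID: String) {
        for item in items
        where item.object.identifier == objectID && item.gesture == gesture {
            item.action()
        }
    }
}

// Usage: a pinch on the detected lamp toggles it.
let registry = GestureActionRegistry()
let lamp = DetectedObject(identifier: "lamp", position: .init(0.5, 1.0, -2.0))
registry.register(ActionableItem(object: lamp, gesture: .pinch) {
    print("Toggling the lamp")
})
registry.handle(gesture: .pinch, on: "lamp")
```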


This is, therefore, patented software; in the case of a mixed reality headset, that means visionOS. A diagram of the Vision Pro is also shown, along with the locations of all the sensors that capture elements of our physical surroundings, as well as the operational field of view available for direct interaction.


One practical example shown is a phone menu from which we can select the contact we want and place a call directly. This also hints at the depth of interaction between different Apple devices: a Mac, iPad, or Apple Watch can already make phone calls when paired with an iPhone on the same wireless network.

For now, all we know is what the patent describes, and there is still some time before these developments come to fruition. One of the key dates for the mixed reality headset is February 2024; from then on we will learn in more detail how these interactions are implemented.
