Apple has filed two patent applications describing an augmented reality system that overlays a real-time view of the surroundings with rich location data, using the iPhone's camera, onboard sensors, and communications hardware. The applications are titled “Registration between actual mobile device position and environmental model” and “Federated mobile device positioning.” If implemented, they would bring augmented reality navigation to the iPhone.
Apple Augmented Reality Navigation
The system determines the user's location from GPS, sensor data, Wi-Fi signal strength, and other signals. The app then downloads a three-dimensional model of the surrounding area, consisting of image data for nearby buildings, wireframes, and points of interest. Sensor data alone, however, may not be enough to align the model precisely with the real world.
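The patents do not specify how the positioning signals are combined. One common approach is an inverse-variance weighted average, so that more precise sources dominate the fused fix; the sketch below illustrates the idea with hypothetical names and data.

```python
# Hypothetical sketch of fusing multiple positioning signals.
# The patents do not describe an algorithm; inverse-variance weighting
# is one standard technique. All names and values are illustrative.

def fuse_position_estimates(estimates):
    """Combine (lat, lon, accuracy_m) fixes from GPS, Wi-Fi, etc.

    Each fix is weighted by 1 / accuracy^2, so tighter sources
    contribute more to the fused position.
    """
    total_weight = 0.0
    lat_sum = 0.0
    lon_sum = 0.0
    for lat, lon, accuracy_m in estimates:
        w = 1.0 / (accuracy_m ** 2)
        total_weight += w
        lat_sum += w * lat
        lon_sum += w * lon
    return lat_sum / total_weight, lon_sum / total_weight

# Example: a coarse Wi-Fi fix and a much tighter GPS fix.
fixes = [
    (37.3320, -122.0310, 50.0),  # Wi-Fi: ~50 m accuracy
    (37.3318, -122.0312, 5.0),   # GPS: ~5 m accuracy
]
lat, lon = fuse_position_estimates(fixes)
```

The fused fix lands much closer to the GPS estimate, since its weight is a hundred times larger than the Wi-Fi fix's.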
Apple proposes overlaying a virtual wireframe on the live video from the iPhone's camera. Users can bring the 3D model and the live feed into alignment with onscreen gestures such as tap-and-drag and pinch-to-zoom, achieving an accuracy that sensors alone cannot provide.
Users can also issue voice commands such as “move left” and “move right” to match up the images. Once one or more points are properly aligned, the wireframe can be “locked in,” calibrating the augmented view.
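Both the gestures and the voice commands described above amount to small adjustments of a transform applied to the wireframe overlay before it is locked in. A minimal sketch of that idea, with entirely hypothetical names (the patents describe the interactions, not an implementation):

```python
# Hypothetical sketch: gestures and voice commands both reduce to
# adjustments of a 2D overlay transform, frozen once "locked in".

class OverlayTransform:
    """Offset and scale applied to the wireframe atop the live feed."""

    def __init__(self):
        self.dx = 0.0     # horizontal offset in screen points
        self.dy = 0.0     # vertical offset
        self.scale = 1.0  # zoom factor
        self.locked = False

    def drag(self, dx, dy):
        if not self.locked:
            self.dx += dx
            self.dy += dy

    def pinch(self, factor):
        if not self.locked:
            self.scale *= factor

    def voice(self, command, step=5.0):
        # "move left" / "move right" nudge the overlay a fixed step.
        if command == "move left":
            self.drag(-step, 0.0)
        elif command == "move right":
            self.drag(step, 0.0)

    def lock_in(self):
        # Freeze the calibration once the user confirms alignment.
        self.locked = True

t = OverlayTransform()
t.drag(12.0, -4.0)    # tap-and-drag
t.pinch(1.1)          # pinch-to-zoom
t.voice("move left")  # audible command
t.lock_in()
```

After locking in, further gestures are ignored, which matches the patent's notion of the calibration being fixed until recalibrated.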
Another figure shows users placing their hands in the live view area to grab parts of the virtual image and interact directly with the wireframe model, repositioning its parts with a special set of gestures. This option requires object-recognition technology to determine where and how a user's hand is interacting with the scene in front of the iPhone's camera.
Beyond user input, the device can compensate for yaw, pitch, roll, and other movements to estimate its position in space relative to the live world view. Once the lock is made and the images are calibrated, the augmented reality program can overlay useful location data onscreen.
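Keeping the overlay registered as the device rotates means counter-applying the measured attitude to the wireframe. The sketch below shows the idea for yaw only, as a plain 2D rotation; a real system would use the full 3x3 attitude matrix from the motion sensors. Names are illustrative, not from the patents.

```python
import math

# Hypothetical sketch: counter-rotate an overlay point by the device's
# measured yaw so the wireframe stays fixed to the world, not the screen.
# Shown in 2D; a real system would apply the full 3x3 attitude matrix.

def compensate_yaw(x, y, yaw_radians):
    """Rotate an overlay point by -yaw so it stays world-fixed."""
    c = math.cos(-yaw_radians)
    s = math.sin(-yaw_radians)
    return (c * x - s * y, s * x + c * y)

# If the device yaws 90 degrees, a world-fixed point that was at (1, 0)
# in overlay coordinates should now be drawn at (0, -1).
x, y = compensate_yaw(1.0, 0.0, math.pi / 2)
```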
An ‘X-ray vision’ feature is also expected to accompany this augmented reality navigation on the iPhone. However, it is not yet known whether iOS 8 will include it.
What do you think?
Will the iPhone’s augmented reality navigation be helpful to users? What other AR apps do you like? Share your insights in the comments below.
Image source – www.designboom.com