According to reports, iPhone users will soon be able to share augmented reality experiences using tools Apple is expected to launch next week, with only limited personal data sent to servers. Over the last few years, the augmented reality market has grown immensely, and big names in the industry are competing to make their mark. Apple’s rival Google is also developing augmented reality tools, and both companies hope to draw more software developers to their respective platforms.
How Will the Augmented Reality System Work on Phones?
With this system, users will be able to share data and see the same virtual object in the same physical space on their own devices. Developers note that such systems raise privacy concerns, since they involve scanning personal spaces such as people’s homes.
Reports say that, partly to address this privacy concern, Apple has designed its two-player system to work directly from one phone to another. This approach differs from Google’s, where players must scan their space and send the data to be stored in the cloud.
The exact details of how the AR system will work are still not available, and whether it will support more than two players is not yet clear. A phone-to-phone approach, however, will likely face technical limitations. With three or more players, for example, it could be difficult to handle the case where the player who started the game stops taking part.
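To make the originator-dropout problem concrete, here is a minimal toy model in Python. It is purely illustrative and every name in it is hypothetical; it simply assumes (as the reporting suggests) that in a phone-to-phone session the originating device holds the shared reference map of the space.

```python
# Toy model (all names hypothetical) of a phone-to-phone AR session,
# illustrating the originator-dropout problem described above.

class PeerSession:
    """A shared AR session negotiated directly between phones."""

    def __init__(self, originator):
        # Assumption: the originator's device holds the reference map
        # of the physical space that the other phones align against.
        self.originator = originator
        self.players = [originator]

    def join(self, player):
        self.players.append(player)

    def leave(self, player):
        self.players.remove(player)
        # With two players this is a clean end, but with three or more
        # the remaining phones lose their shared reference frame when
        # the originator drops out.
        if player == self.originator:
            return "session lost: originator held the shared map"
        return "session continues"


session = PeerSession("alice")
session.join("bob")
session.join("carol")
print(session.leave("alice"))  # the hard case with three or more players
```

In this sketch a non-originator can leave without consequence, but the originator’s departure strands everyone else, which is exactly the multi-player case the reports flag as difficult.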
Competition between Google and Apple
Both Apple and Google have made augmented reality a prime focus. Last year, Apple launched its first set of tools that let software developers create AR apps. With these tools, augmented reality became available on many phones without hardware changes.
Google subsequently dropped its plan for an AR system that required special sensors and instead began building tools for this technology that work on ordinary phones.
At its developer conference, Google introduced tools called Cloud Anchors, which require the first player to scan their environment and upload raw mapping data to Google’s servers. The data is then converted into a rough representation of that player’s space. Other players scan their own environments and send the data to the servers, which match the phones up so that everyone sees the same virtual object in the same physical environment.
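The hosting-and-resolving flow described above can be sketched as a toy server model in Python. This is not the Cloud Anchors API itself; all class and method names are hypothetical, and the "matching" step is a stand-in for the real visual-feature alignment.

```python
# Toy sketch (all names hypothetical) of the Cloud Anchors flow:
# the host uploads mapping data, the server keeps a rough
# representation, and other players resolve against it.

class CloudAnchorServer:
    def __init__(self):
        self.anchors = {}

    def host_anchor(self, anchor_id, raw_mapping_data):
        # The server converts the raw scan into a rough representation
        # of the host's space (here, just a deduplicated feature list).
        self.anchors[anchor_id] = {"map": sorted(set(raw_mapping_data))}
        return anchor_id

    def resolve_anchor(self, anchor_id, player_scan):
        # Another player's scan is matched against the stored map so
        # both phones agree on the same physical location.
        stored = self.anchors[anchor_id]["map"]
        overlap = set(stored) & set(player_scan)
        return len(overlap) > 0  # matched if the scans share features


server = CloudAnchorServer()
server.host_anchor("table-demo", ["corner", "lamp", "rug"])
print(server.resolve_anchor("table-demo", ["lamp", "window"]))  # True: scans overlap
```

The point of the sketch is the data flow: both players’ scans pass through the server, which is why Google’s approach stores mapping data in the cloud while Apple’s phone-to-phone approach avoids that step.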
Apple, by contrast, does not store any raw mapping scans of a player’s space in the cloud. Google says the scans it stores in the cloud are discarded after seven days.
We will have to wait and see what happens when Apple finally launches phone-to-phone augmented reality sharing.
Article Source: https://bit.ly/2JtCABS
Image Source: https://www.iphonefaq.org/archives/976512