Multiplayer AR
Creating Multiplayer AR Experiences with MultiSet VPS SDK for Unity
The MultiSet Visual Positioning System (VPS) SDK for Unity empowers developers to build immersive, multiplayer Augmented Reality experiences. A core component of this is the ability for all users who localize within the same physical map to share a unified coordinate system. This synchronization allows for the seamless interaction of users and virtual objects in a shared AR space. Once this shared coordinate system is established, developers can leverage existing Unity networking solutions, such as Netcode for GameObjects or third-party assets like Photon, to broadcast and stream player coordinates across a local or global network, depending on the application's requirements.
Establishing a Shared Coordinate System
The MultiSet SDK achieves a shared coordinate system through its "MapSpace" GameObject. When a user's device successfully localizes within a pre-scanned map, the SDK transforms the MapSpace GameObject to align with the physical environment. As a result, every user localized in the same map has a MapSpace with the same real-world position and orientation.
This MapSpace then acts as the anchor for all AR content. By placing all shared virtual objects and player representations as children of the MapSpace GameObject, developers can ensure that these elements appear in the same real-world location for all users.
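A minimal sketch of this parenting pattern, assuming a `mapSpace` reference assigned in the Inspector (the class and method names here are illustrative, not part of the SDK):

```csharp
using UnityEngine;

public class MapSpaceContentSpawner : MonoBehaviour
{
    [SerializeField] private Transform mapSpace; // assign the MapSpace GameObject

    // Spawn a shared object at a map-relative position.
    public GameObject SpawnShared(GameObject prefab, Vector3 localPosition)
    {
        // Instantiating as a child of MapSpace means the object's
        // localPosition refers to the same real-world spot for every
        // user localized in the same map.
        GameObject instance = Instantiate(prefab, mapSpace);
        instance.transform.localPosition = localPosition;
        return instance;
    }
}
```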
Player Coordinate Transformation
While each player's AR camera will have its own unique world coordinates within their respective Unity scenes, these can be transformed into a local coordinate system relative to the shared MapSpace. This local position is what should be synchronized across the network. By doing so, even if players start their AR experience from different points within the map, their relative positions within the shared AR space will be consistent.
The key to this transformation is to calculate the local position of the player's AR camera with respect to the MapSpace. This can be achieved using Unity's Transform.InverseTransformPoint method. This method takes a world position (in this case, the AR camera's position) and converts it into a local position relative to the specified transform (the MapSpace).
```csharp
Vector3 cameraRelative = mapSpace.transform.InverseTransformPoint(Camera.main.transform.position);
```

This cameraRelative vector can then be broadcast over the network to other players. On the receiving end, each client can use this local position to place a representation of the remote player within their own MapSpace, ensuring that all players see each other in the correct relative positions.
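The receiving side can be sketched as follows; `mapSpace` and `remotePlayerAvatar` are assumed references, and either approach below places the remote player correctly:

```csharp
using UnityEngine;

public class RemotePlayerPlacer : MonoBehaviour
{
    [SerializeField] private Transform mapSpace;          // the MapSpace GameObject
    [SerializeField] private Transform remotePlayerAvatar; // avatar for the remote player

    // Called by your networking layer when a remote player's
    // map-relative position arrives.
    public void OnRemotePositionReceived(Vector3 remoteLocalPos)
    {
        // Option A: parent the avatar under MapSpace and set its local position.
        remotePlayerAvatar.SetParent(mapSpace, false);
        remotePlayerAvatar.localPosition = remoteLocalPos;

        // Option B: convert the map-relative position back to this
        // client's world space with TransformPoint.
        // remotePlayerAvatar.position = mapSpace.TransformPoint(remoteLocalPos);
    }
}
```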
Sample Script: PlayerPositionManager
The following C# script demonstrates how to obtain the current AR camera's position relative to the MapSpace and provides a basic structure for managing player positions in a multiplayer AR scene.
Instructions:
1. Create a new C# script named PlayerPositionManager.
2. Attach this script to a GameObject in your scene, for example, a "GameManager" or the player's root object.
3. In the Unity Inspector for this script, drag your scene's main AR Camera to the arCamera field and the MapSpace GameObject to the mapSpace field.
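A minimal sketch of such a script, under the assumptions above (the arCamera and mapSpace fields are assigned in the Inspector; the networking call is a placeholder for your chosen solution):

```csharp
using UnityEngine;

public class PlayerPositionManager : MonoBehaviour
{
    [SerializeField] private Camera arCamera;    // the scene's main AR Camera
    [SerializeField] private Transform mapSpace; // the MultiSet MapSpace GameObject

    // Map-relative position of the local player's AR camera.
    public Vector3 LocalPlayerMapPosition =>
        mapSpace.InverseTransformPoint(arCamera.transform.position);

    // Map-relative rotation of the local player's AR camera.
    public Quaternion LocalPlayerMapRotation =>
        Quaternion.Inverse(mapSpace.rotation) * arCamera.transform.rotation;

    void Update()
    {
        // Broadcast LocalPlayerMapPosition / LocalPlayerMapRotation here
        // via your networking layer (e.g., Netcode for GameObjects or Photon).
    }
}
```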