Multiplayer Sample
A turnkey sample for building a shared-AR experience on top of the MultiSet SDK. Two or more participants localize against the same MultiSet Map (or MapSet) and see each other's live pose in a shared coordinate space. When a teammate walks behind a real-world wall, the humanoid avatar automatically swaps to a skeleton silhouette so you always know where they are.
The sample supports two deployment shapes:
Mobile ↔ Mobile — the Unity app installed on two mobile devices (iOS or Android, in any combination) on the same Wi-Fi network.
iOS host ↔ Meta Ray-Ban client — the Unity app on iOS as host, the MultisetWearable Xcode app (from wearable-vps-samples) as a wearable client that streams video from Meta Ray-Ban glasses. (Wearable peer discovery uses Apple MultipeerConnectivity, so the Unity host must be iOS for this flow.)
Scene
Assets/MultiSet/Scenes/MultiplayerSample/MultiPlayerSample.unity
Key components in the scene:
SingleFrameLocalizationManager
Localizes the device against a MultiSet Map / MapSet.
MultiplayerManager
Sends and receives pose updates over the MultipeerConnectivity bridge and spawns remote-player visuals.
MultisetMultipeerBridge
Thin C# wrapper around the native iOS MultipeerConnectivity plugin. Must live on a GameObject named exactly MultisetMultipeerReceiver.
NetworkUI
UI for Start Host / Start Client, name entry, and host IP.
LocalizationSuccessDataHandler
Notifies the MultiplayerManager when the device has successfully localized.
MapMeshColliderSetup
Adds MeshColliders to the loaded map mesh and hides it from the camera so it can be used for line-of-sight raycasts (drives the skeleton-through-walls effect).
See the SingleFrameLocalizationManager API reference for details on the localization component used here.
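The exact-name requirement on MultisetMultipeerReceiver is typical of native iOS plugins that deliver events via UnitySendMessage, which looks the receiving GameObject up by name. A minimal sketch of that pattern, assuming a hypothetical callback method name (the bridge's real method names may differ):

```csharp
using UnityEngine;

// Sketch only: native plugins commonly call
// UnitySendMessage("MultisetMultipeerReceiver", "<method>", payload),
// which is why the GameObject name must match exactly.
public class MultipeerReceiverExample : MonoBehaviour
{
    void Awake()
    {
        // The native side finds this object by name, so guard against typos.
        Debug.Assert(gameObject.name == "MultisetMultipeerReceiver",
            "Rename this GameObject or native callbacks will be dropped.");
    }

    // Invoked from native code via UnitySendMessage (method name is illustrative).
    void OnPeerMessage(string payload)
    {
        Debug.Log($"Multipeer payload: {payload}");
    }
}
```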
One-time Setup
Before running the sample for the first time:
Credentials — open MultiSetConfig (in Assets/MultiSet/…) and set your clientId and clientSecret from the MultiSet dashboard.
Map — on the SingleFrameLocalizationManager component in the MultiPlayerSample scene, enter either:
a mapCode (single-map localization), or
a mapsetCode (multi-map localization).
The same code must be used on every device that joins the session.
Layer — add a new user layer named exactly CollisionMesh (Edit → Project Settings → Tags and Layers). The MapMeshColliderSetup script assigns the loaded map mesh to this layer so the remote avatar can switch to its skeleton representation when occluded by the real world.
Build & install — build the scene for your target platform (iOS or Android) and install the app on each participating device.
All devices must be on the same Wi-Fi network so they can reach each other over LAN. The client enters the host's IP address to connect — no pairing code is required.
Flow 1 — Two Mobile Devices
Use this flow when both participants run the Unity app. The two devices can be any mix of iOS and Android — the transport is Unity Netcode over UTP, so the platforms interoperate.
Launch the app on both devices and open the MultiPlayerSample scene.
On each device, enter a player name.
On Device A (host): tap Start Host.
On Device B (client): enter Device A's IP address in the input field, then tap Start Client. Once connected, the status text will read "Connected to host".
Point each device at the mapped space and localize (the SingleFrameLocalizationManager handles this — trigger a localization from the scene's UI). Both devices must localize against the same map before pose sharing begins.
Once both devices are localized, each device renders the other player's avatar at their real-world position. If the other player walks behind a physical wall that is part of the map mesh, their avatar automatically swaps to a skeleton silhouette, so they remain visible through occlusion.
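The skeleton swap comes down to a line-of-sight test against the hidden map mesh on the CollisionMesh layer. A sketch of that check, assuming illustrative component and field names (the sample's own avatar scripts may be structured differently):

```csharp
using UnityEngine;

// Illustrative occlusion check. Only the CollisionMesh layer name comes from
// the sample's setup; everything else here is an assumption.
public class AvatarOcclusionExample : MonoBehaviour
{
    public Transform localCamera;     // this device's AR camera
    public Transform remoteAvatar;    // the other player's avatar root
    public GameObject humanoidVisual; // shown when line of sight is clear
    public GameObject skeletonVisual; // shown when the map mesh blocks the view

    void Update()
    {
        int mask = LayerMask.GetMask("CollisionMesh"); // layer created in setup
        Vector3 toAvatar = remoteAvatar.position - localCamera.position;

        // If the ray from the camera to the avatar hits the (invisible) map
        // mesh first, the avatar is behind real-world geometry.
        bool occluded = Physics.Raycast(localCamera.position,
                                        toAvatar.normalized,
                                        toAvatar.magnitude, mask);

        humanoidVisual.SetActive(!occluded);
        skeletonVisual.SetActive(occluded);
    }
}
```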
Finding the host's IP:
iOS — Settings → Wi-Fi → (i) next to the network → IP Address.
Android — Settings → About phone → Status (or Settings → Network & internet → Wi-Fi → (network) → View more).
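Since the transport is Unity Netcode over UTP, the Start Host / Start Client buttons plausibly map onto the standard Netcode calls below. This is a sketch under that assumption; the sample's NetworkUI wiring and port may differ.

```csharp
using Unity.Netcode;
using Unity.Netcode.Transports.UTP;
using UnityEngine;

// Sketch of a host/client toggle with Netcode for GameObjects over UTP.
public class NetworkUiExample : MonoBehaviour
{
    const ushort Port = 7777; // Netcode's default UTP port (assumed here)

    public void StartHost()
    {
        NetworkManager.Singleton.StartHost();
    }

    public void StartClient(string hostIp)
    {
        // hostIp is the address the user types into the input field.
        var transport = NetworkManager.Singleton.GetComponent<UnityTransport>();
        transport.SetConnectionData(hostIp, Port);
        NetworkManager.Singleton.StartClient();
    }
}
```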
Flow 2 — Meta Ray-Ban Glasses (Wearable Client)
Use this flow when one participant is wearing Meta Ray-Ban glasses and the other is holding an iOS device running the Unity app.
Roles are fixed in this configuration:
The Unity app always acts as Host, and must run on iOS (wearable-peer discovery uses Apple MultipeerConnectivity, which is unavailable on Android).
The MultisetWearable Xcode app (iOS) always acts as Client. It streams video from the paired Meta Ray-Ban glasses, localizes on the phone, and forwards the glasses' pose to the Unity host.
Prerequisites
Clone and build the companion iOS app:
Build and install MultisetWearable on an iPhone that is paired with Meta Ray-Ban glasses.
In the MultisetWearable app settings, enter the same mapCode / mapsetCode and the same clientId / clientSecret used in the Unity scene.
Steps
Unity device — open the MultiPlayerSample scene, enter a host name, and tap Start Host.
Wearable device — launch MultisetWearable, pair your Meta Ray-Ban glasses, and from the landing page open Multiplayer Demo.
Tap Join Session. The wearable app browses the local Wi-Fi for the Unity host and connects automatically (no IP entry required — MultipeerConnectivity handles discovery).
On the wearable device, tap Start Streaming & Localize. The glasses begin streaming video to the phone, and the phone localizes that stream against the MultiSet map.
Once localized, the glasses' pose is forwarded to the Unity host at ~20 Hz. The Unity device now renders the Meta Ray-Ban user's avatar at their real-world position, with the same occlusion / skeleton behavior as Flow 1.
As the glasses wearer moves through the space, the wearable app re-localizes every ~1 s to keep the pose fresh, and the Unity host continuously updates the avatar.
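The two cadences above (~20 Hz pose forwarding, ~1 s re-localization) can be sketched as simple timers. All method names in the comments are hypothetical, not the sample's real API:

```csharp
using UnityEngine;

// Illustrative timing loop for the wearable flow: forward the latest pose at
// ~20 Hz and trigger a fresh localization roughly once per second.
public class PoseForwardingExample : MonoBehaviour
{
    const float SendInterval = 1f / 20f; // ~20 Hz pose updates
    const float RelocalizeInterval = 1f; // re-localize every ~1 s
    float nextSend, nextRelocalize;

    void Update()
    {
        if (Time.time >= nextSend)
        {
            nextSend = Time.time + SendInterval;
            // SendPoseToHost(latestGlassesPose); // hypothetical
        }
        if (Time.time >= nextRelocalize)
        {
            nextRelocalize = Time.time + RelocalizeInterval;
            // RequestLocalization(); // hypothetical
        }
    }
}
```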
Related
Multiplayer AR — concepts behind the shared coordinate system (MapSpace) that this sample builds on.
Unity sample — Assets/MultiSet/Scenes/MultiplayerSample/
Meta Ray-Ban companion app — wearable-vps-samples, iOS target MultisetWearable.