Stumbled here by accident? Start with the first part!
Welcome to the second part of our WebXR journey. In the previous part we set up our development environment and walked through our base template.
In this part we dive into the plane detection feature.
Remember: you can always run the code associated with this article and follow along using
npm start --part=2
What is plane detection?
Plane detection in WebXR identifies and maps flat surfaces in the user's environment for augmented reality applications. This allows virtual objects to be realistically placed on floors, walls, or tables. It enhances AR experiences by ensuring seamless integration of virtual content with the physical world.
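At the raw WebXR API level (outside Babylon.js), plane detection is requested as an optional session feature named "plane-detection". Here is a minimal sketch; the helper name and the pared-down type are our own, and the commented-out requestSession call shows how it would be used in a browser:

```typescript
// Sketch: building the session options that ask for plane detection.
// "plane-detection" is the feature descriptor defined by the WebXR Plane
// Detection Module; everything else here is illustrative.

type SessionInitSketch = { optionalFeatures: string[] };

function buildPlaneDetectionSessionInit(): SessionInitSketch {
  // Requesting the feature as *optional* lets the session still start on
  // devices that cannot detect planes.
  return { optionalFeatures: ["plane-detection"] };
}

// In the browser:
// const session = await navigator.xr?.requestSession("immersive-ar", buildPlaneDetectionSessionInit());
```

Babylon.js hides this from us: enabling the feature through its features manager (shown below) takes care of requesting it on the session.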
Prerequisites
For this part we assume that you went through the Assisted Space Setup on the Meta Quest 3. The assistant can be found under Settings -> Physical Space -> Space Setup.
Assisted Space Setup
The Assisted Space Setup feature on the Meta Quest 3 enhances the virtual reality experience by allowing the headset to interact with your physical environment.
3D Environmental Scanning: When activated, the device performs a quick 3D scan of your surroundings.
Object Recognition: It identifies and creates representations of surfaces and objects, like walls, tables, and furniture.
Spatial Interaction: This allows virtual content to collide with or hide behind these real-world objects, making the XR experience more immersive.
Automatic Activation: The feature runs automatically when you launch apps that use mixed reality features, but it can also be accessed manually from the settings.
The Assisted Space Setup is especially useful in mixed reality applications where interaction with the physical environment is key to the experience.
Registering the plane detection feature
Feature Management
The Plane Detection feature is enabled via the WebXRFeaturesManager. This is done by calling enableFeature(WebXRFeatureName.PLANE_DETECTION, "latest").
addFeaturesToSession() {
if (this._xr === null) {
return;
}
this._fm = this._xr.baseExperience.featuresManager;
try {
this._xrPlanes = this._fm.enableFeature(WebXRFeatureName.PLANE_DETECTION, "latest") as WebXRPlaneDetector;
} catch (error) {
console.log(error);
}
}
Performing the Plane Detection
Plane detection uses the WebXR API to recognise real-world surfaces. By accessing the camera data and sensors of the device, it can identify different types of planes, such as horizontal and vertical surfaces.
Events such as onPlaneAddedObservable, onPlaneUpdatedObservable, and onPlaneRemovedObservable are used to respond to changes in the detected plane landscape. These events control the creation, updating, and removal of the meshes that represent the physical planes.
this._xrPlanes.onPlaneAddedObservable.add((plane: IWebXRPlaneWithMesh) => {
mat = new StandardMaterial("mat", this._scene);
mat.alpha = 0.25;
mat.diffuseColor = Color3.Random();
this.initPolygon(plane, mat);
});
onPlaneAddedObservable: When a new plane is detected, this observable fires our handler. A new StandardMaterial is created with some level of transparency (alpha = 0.25) and a random diffuse colour. The initPolygon function is then called to create a mesh for this plane.
this._xrPlanes.onPlaneUpdatedObservable.add((plane: IWebXRPlaneWithMesh) => {
if (this._planes[plane.id].material) {
mat = this._planes[plane.id].material as StandardMaterial;
this._planes[plane.id].dispose(false, false);
}
const some = plane.polygonDefinition.some(p => !p);
if (some) {
return;
}
this.initPolygon(plane, mat!);
});
onPlaneUpdatedObservable: This listens for updates to existing planes. If the plane's mesh already has a material, that material is saved for reuse and the outdated mesh is disposed. If the updated polygonDefinition contains any invalid (falsy) points, the update is skipped; otherwise initPolygon is called to recreate the mesh. This ensures the mesh is always up-to-date with the latest plane data.
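The validity check inside the update handler can be sketched in isolation. The helper name hasInvalidVertex and the minimal point type below are our own; in Babylon.js, polygonDefinition actually holds Vector3 instances:

```typescript
// Sketch of the guard used in the update handler: a plane update is
// skipped when its polygonDefinition contains a missing (falsy) vertex.

type PointSketch = { x: number; y: number; z: number };

function hasInvalidVertex(points: Array<PointSketch | null | undefined>): boolean {
  // Array.prototype.some returns true as soon as one vertex is falsy.
  return points.some((p) => !p);
}

hasInvalidVertex([{ x: 0, y: 0, z: 0 }, null]); // true - update would be skipped
hasInvalidVertex([{ x: 1, y: 2, z: 3 }]);       // false - safe to rebuild the mesh
```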
this._xrPlanes.onPlaneRemovedObservable.add((plane: IWebXRPlaneWithMesh) => {
if (plane && this._planes[plane.id]) {
this._planes[plane.id].dispose();
}
});
onPlaneRemovedObservable: This listens for when a plane is no longer detected and disposes of the corresponding mesh to free up resources.
if (this._xr !== null) {
this._xr.baseExperience.sessionManager.onXRSessionInit.add(() => {
this._planes.forEach((plane: Mesh) => plane.dispose());
while (this._planes.pop());
});
}
The code checks that the _xr object (representing the XR experience) is not null and adds a listener for the XR session's initialization. This listener disposes of all plane meshes, effectively resetting the plane representations when a new XR session starts.
The complete code
createPlaneMeshesFromXrPlane(): void {
interface IWebXRPlaneWithMesh extends IWebXRPlane {
mesh?: Mesh;
}
let mat: Nullable<StandardMaterial>;
if (this._xrPlanes === null) {
return;
}
this._xrPlanes.onPlaneAddedObservable.add((plane: IWebXRPlaneWithMesh) => {
this._debug && console.log("plane added", plane);
mat = new StandardMaterial("mat", this._scene);
mat.alpha = 0.25;
mat.diffuseColor = Color3.Random();
this.initPolygon(plane, mat);
});
this._xrPlanes.onPlaneUpdatedObservable.add((plane: IWebXRPlaneWithMesh) => {
if (this._planes[plane.id].material) {
mat = this._planes[plane.id].material as StandardMaterial;
this._planes[plane.id].dispose(false, false);
}
const some = plane.polygonDefinition.some(p => !p);
if (some) {
return;
}
this.initPolygon(plane, mat!);
});
this._xrPlanes.onPlaneRemovedObservable.add((plane: IWebXRPlaneWithMesh) => {
if (plane && this._planes[plane.id]) {
this._planes[plane.id].dispose();
}
});
if (this._xr !== null) {
this._xr.baseExperience.sessionManager.onXRSessionInit.add(() => {
this._planes.forEach((plane: Mesh) => plane.dispose());
while (this._planes.pop());
});
}
}
Mesh Visualisation
For each detected plane, a mesh is created that provides a visual representation of the plane in the virtual world. These meshes are equipped with materials created using StandardMaterial
and are coloured with Color3.Random()
for visual distinction.
initPolygon(plane: IWebXRPlane, material?: StandardMaterial): Mesh {}
The function takes a plane and an optional material.
plane.polygonDefinition.push(plane.polygonDefinition[0]);
Adds the first point of the polygon definition to the end again, closing the contour of the polygon.
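The closing step can be illustrated in isolation; the helper name closePolygon is our own, and unlike the push above it does not mutate its input:

```typescript
// Sketch: a polygon contour is closed by repeating its first vertex at the
// end. Non-mutating variant of plane.polygonDefinition.push(...[0]).
function closePolygon<T>(points: T[]): T[] {
  return [...points, points[0]];
}

closePolygon(["a", "b", "c"]); // ["a", "b", "c", "a"]
```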
const polygonTriangulation = new PolygonMeshBuilder(plane.xrPlane.orientation, plane.polygonDefinition.map((p) => new Vector2(p.x, p.z)), this._scene);
Create a new PolygonMeshBuilder object, using plane.polygonDefinition to define the shape of the polygon. The map function converts each point in plane.polygonDefinition to a Vector2, using the x and z properties of each point.
const polygon = polygonTriangulation.build(false, 0.01);
Build the polygon mesh using the PolygonMeshBuilder object. The false argument means that the mesh is not updatable; the 0.01 argument is the depth (thickness) of the mesh.
polygon.createNormals(false);
Creating the normals for the polygon. Normals are vectors perpendicular to the surface of the mesh, used for lighting calculations. The false argument means that the normals are not updatable.
if (material) {
polygon.material = material;
}
Assign a material to the polygon, if one is provided. A material defines the appearance of the mesh.
polygon.rotationQuaternion = new Quaternion();
Initialize the rotation of the polygon using a quaternion. Quaternions are a way to represent rotations in 3D space.
polygon.checkCollisions = true;
polygon.receiveShadows = true;
Enable collisions and shadows for the polygon.
plane.transformationMatrix.decompose(polygon.scaling, polygon.rotationQuaternion, polygon.position);
Decompose the transformation matrix of the plane into scaling, rotation, and position components, and apply them to the polygon.
this._planes[plane.id] = (polygon);
Adds the polygon to the _planes collection, keyed by the plane id.
return polygon;
Returns the polygon.
The complete code
initPolygon(plane: IWebXRPlane, mat?: StandardMaterial): Mesh {
plane.polygonDefinition.push(plane.polygonDefinition[0]);
const polygonTriangulation = new PolygonMeshBuilder(plane.xrPlane.orientation, plane.polygonDefinition.map((p) => new Vector2(p.x, p.z)), this._scene);
const polygon = polygonTriangulation.build(false, 0.01);
polygon.createNormals(false);
if (mat) {
polygon.material = mat;
}
polygon.rotationQuaternion = new Quaternion();
polygon.checkCollisions = true;
polygon.receiveShadows = true;
plane.transformationMatrix.decompose(polygon.scaling, polygon.rotationQuaternion, polygon.position);
this._planes[plane.id] = (polygon);
return polygon;
}
Adding Plane Detection to the scene
async createScene(): Promise<Scene> {
...
this.createPlaneMeshesFromXrPlane();
return this._scene;
}
Finally, we call our createPlaneMeshesFromXrPlane function when creating the scene.
Conclusion
This article details the implementation of plane detection in WebXR, specifically focusing on its integration within the Meta Quest 3's Assisted Space Setup. It describes how the WebXR API is utilized to identify real-world surfaces, allowing for the placement of virtual objects in an augmented reality environment. Key features include the creation, updating, and removal of meshes representing detected planes, enhancing the realism and interactivity of the mixed reality experience.
In the third part of this series we're focusing on Meshes & Materials. Stay tuned.