WebXR AR occlusion, lighting, interaction and physics with three.js
AR-specific shader code is injected into the materials using the three.js `onBeforeCompile` callback.

To run the code, clone the repository (`gh repo clone tentone/enva-xr`), install dependencies with `npm install`, and start the code running with `npm run start`.

The `ARRenderer` object is responsible for most of the work required to set up the AR scene. It receives a configuration object that indicates which WebXR features should be enabled (e.g. `depth-estimation`, `hit-test`, `lighting`).

To use three.js materials in AR, the `AugmentedMaterial.transform()` method should be used to transform regular materials into AR materials.

The `LightProbe` object replicates the environment lighting and the main light source position and direction. Internally it contains a three.js `LightProbe` and a `DirectionalLight` with shadow casting enabled by default.

```typescript
const renderer = new ARRenderer({
	depthSensing: true,
	depthTexture: true,
	lightProbe: true
});
```
```typescript
// Create a regular three.js material and transform it into an AR material.
let material: any = new MeshPhysicalMaterial({color: (Math.random() * 0xFFFFFF)});
material = AugmentedMaterial.transform(material);

// Box that casts and receives shadows from the estimated lighting.
let box = new Mesh(new BoxGeometry(), material);
box.receiveShadow = true;
box.castShadow = true;
renderer.scene.add(box);

// Light probe that replicates the environment lighting.
const probe = new LightProbe();
renderer.scene.add(probe);

// Floor plane used to receive shadows from the scene.
const floor = new FloorPlane();
renderer.scene.add(floor);

// Called on every AR frame.
renderer.onFrame = function(time: number, renderer: ARRenderer) {
	box.rotation.x += 0.01;
};

renderer.start();
```
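The example above omits its import statements. A minimal sketch of what they might look like, assuming `ARRenderer`, `AugmentedMaterial`, `LightProbe`, and `FloorPlane` are exported from the `enva-xr` package root (the module path and export names are assumptions, not confirmed by this README):

```typescript
// Hypothetical imports for the example above; the "enva-xr" module
// path and export names are assumed, check the package entry point.
import {Mesh, BoxGeometry, MeshPhysicalMaterial} from "three";
import {ARRenderer, AugmentedMaterial, LightProbe, FloorPlane} from "enva-xr";
```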
Regular three.js materials can be converted into AR materials with the `AugmentedMaterial.transform()` method. The `MeshPhysicalMaterial` material should be used alongside PBR assets.

`VoxelEnvironment` provides a probabilistic voxel-based model that maps the environment from depth data and is updated every frame.

The environment geometry can also be represented with the simpler `FloorPlane` or `Planes` objects.
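As a rough sketch of how the environment objects might be used, following the same pattern as `FloorPlane` in the example above (the constructor arguments and exact API are assumptions, not taken from this README):

```typescript
// Hypothetical usage sketch: the VoxelEnvironment constructor signature
// is assumed here; consult the enva-xr source for the real API.
const environment = new VoxelEnvironment();
renderer.scene.add(environment);
```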