
This package is no longer supported, as Expo Go's built-in AR support has been discontinued. It may be revived in the future if a new native package is created.

expo-three-ar

Tools for using three.js to build native AR experiences with Expo. This library is iOS only.

This library is a side-project and should not be considered production ready

Installation

yarn add three expo-three-ar

Usage

Import the library into your JavaScript file:

import * as ThreeAR from 'expo-three-ar';

Enabling AR

  • expo-gl: call AR.startAsync(gl) after GLView.onContextCreate has been called (see the sketch below).
  • expo-graphics: add the isArEnabled={true} prop.
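
For the expo-gl path, a minimal sketch of the wiring might look like the following. Only the "call AR.startAsync(gl) after onContextCreate" ordering comes from the note above; the GLView usage and the AR import from the expo package are assumptions from this era of Expo Go.

import React from 'react';
import { GLView } from 'expo-gl';
import { AR } from 'expo'; // assumption: AR was exposed by the `expo` package at the time
import * as ThreeAR from 'expo-three-ar';

export default function ARScreen() {
  const onContextCreate = async (gl) => {
    // Start the AR session once the GL context exists.
    await AR.startAsync(gl);

    // ...create the renderer, scene, and ThreeAR.Camera, then kick off the render loop.
  };

  return <GLView style={{ flex: 1 }} onContextCreate={onContextCreate} />;
}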

API

new BackgroundTexture(renderer: WebGLRenderingContext)

Extends a THREE.Texture that reflects the live video feed of the AR session. Usually this is set as the .background property of a THREE.Scene to render the video feed behind the scene's objects.

// Render the live camera feed behind everything in the scene
scene.background = new BackgroundTexture(renderer);

new Camera(width: number, height: number, zNear: number, zFar: number)

Extends a THREE.PerspectiveCamera that automatically updates its view and projection matrices to reflect the AR session camera. width and height specify the dimensions of the target viewport to render to, and zNear and zFar specify the near and far clipping distances respectively. The returned camera has its updateMatrixWorld and updateProjectionMatrix methods overridden so that its transform and projection follow the AR session's state (i.e. the device's orientation) automatically.

// viewport width/height & zNear/zFar
const camera = new Camera(width, height, 0.01, 1000);
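
A per-frame sketch, assuming a renderer, scene, and gl context have already been created with expo-gl; the camera pulls its pose from the AR session automatically, so no manual matrix updates are needed here.

// Inside your frame loop (e.g. driven by requestAnimationFrame):
renderer.render(scene, camera);
gl.endFrameEXP(); // flush the expo-gl frame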

new Light()

A THREE.PointLight that updates its color and intensity based on ARKit's estimate of the room lighting.

renderer.physicallyCorrectLights = true;
renderer.toneMapping = THREE.ReinhardToneMapping;

const arPointLight = new Light();
arPointLight.position.y = 2;
scene.add(arPointLight);

// You should also add a THREE.DirectionalLight for shadows
const shadowLight = new THREE.DirectionalLight();
scene.add(shadowLight);
// If you want to move the light (you likely will), you also need to add the light's `target` to the scene.
// shadowLight.position defines one end of the light vector, and target.position defines the other.
scene.add(shadowLight.target);

...
// Call this every frame:
arPointLight.update();

new MagneticObject()

A THREE.Mesh that sticks to surfaces. Use this as a parent for models that you want to attach to surfaces.

const magneticObject = new MagneticObject();
magneticObject.maintainScale = false; // When true, the mesh is scaled up/down to preserve its apparent size regardless of distance.
magneticObject.maintainRotation = true; // When true, the mesh orients itself to face the camera.

// screenCenter is a normalized value = { 0.5, 0.5 }
const screenCenter = new THREE.Vector2(0.5, 0.5);
...

// Call this every frame to update the position.
magneticObject.update(camera, screenCenter);

new ShadowFloor()

A transparent plane that extends THREE.Mesh and receives shadows from other meshes. This is used to render shadows on real-world surfaces.

renderer.gammaInput = true;
renderer.gammaOutput = true;
renderer.shadowMap.enabled = true;

const shadowFloor = new ShadowFloor({
  width: 1,
  height: 1,
  opacity: 0.6, // The opacity of the shadow
});

new Points()

A utility object that renders all the raw feature points.

const points = new Points();
// Then call this each frame...
points.update();

new Planes()

A utility object that renders all the ARPlaneAnchors.

const planes = new Planes();
// Then call this each frame...
planes.update();

AR Functions

Three.js calculation utilities for working with ARKit. Most of these functions are used for calculating surfaces. Check whether MagneticObject() has what you need before digging into these. You can also check out this example provided by Apple.

hitTestWithFeatures(camera: THREE.Camera, point: THREE.Vector2, coneOpeningAngleInDegrees: number, minDistance: number, maxDistance: number, rawFeaturePoints: Array)

Props

  • camera: THREE.Camera
  • point: THREE.Vector2
  • coneOpeningAngleInDegrees: number
  • minDistance: number
  • maxDistance: number
  • rawFeaturePoints: Array<any>

hitTestWithPoint(camera: THREE.Camera, point: THREE.Vector2)

Props

  • camera: THREE.Camera
  • point: THREE.Vector2
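
A minimal usage sketch, reusing the normalized screen coordinates shown for MagneticObject above; the shape of the returned hits is an assumption, as it is not documented here.

// Hypothetical: hit test against the world under the center of the screen.
const screenCenter = new THREE.Vector2(0.5, 0.5);
const hits = ThreeAR.hitTestWithPoint(camera, screenCenter);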

unprojectPoint(camera: THREE.Camera, point: THREE.Vector2)

Props

  • camera: THREE.Camera
  • point: THREE.Vector2

hitTestRayFromScreenPos(camera: THREE.Camera, point: THREE.Vector2)

Props

  • camera: THREE.Camera
  • point: THREE.Vector2

hitTestFromOrigin(origin: THREE.Vector3, direction: THREE.Vector3, rawFeaturePoints: ?Array)

Props

  • origin: THREE.Vector3
  • direction: THREE.Vector3
  • rawFeaturePoints: ?Array<any>

hitTestWithInfiniteHorizontalPlane(camera: THREE.Camera, point: Point, pointOnPlane: THREE.Vector3)

Props

  • camera: THREE.Camera
  • point: THREE.Vector2
  • pointOnPlane: THREE.Vector3

rayIntersectionWithHorizontalPlane(rayOrigin: THREE.Vector3, direction: THREE.Vector3, planeY: number)

Props

  • rayOrigin: THREE.Vector3
  • direction: THREE.Vector3
  • planeY: number

convertTransformArray(transform: Array): THREE.Matrix4

Props

  • transform: number[]
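
A hedged sketch of applying a converted transform to a mesh; rawTransform is assumed to be the flat transform array provided by an ARKit anchor or hit-test result.

// Hypothetical: apply a converted ARKit transform directly to a mesh.
const matrix = ThreeAR.convertTransformArray(rawTransform);
mesh.matrixAutoUpdate = false; // we manage the matrix manually
mesh.matrix.copy(matrix);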

positionFromTransform(transform: THREE.Matrix4): THREE.Vector3

Props

  • transform: THREE.Matrix4

worldPositionFromScreenPosition(camera: THREE.Camera, position: THREE.Vector2, objectPos: THREE.Vector3, infinitePlane = false, dragOnInfinitePlanesEnabled = false, rawFeaturePoints = null): { worldPosition: THREE.Vector3, planeAnchor: ARPlaneAnchor, hitAPlane: boolean }

Props

  • camera: THREE.Camera
  • position: THREE.Vector2
  • objectPos: THREE.Vector3
  • infinitePlane: boolean = false
  • dragOnInfinitePlanesEnabled: boolean = false
  • rawFeaturePoints: any = null
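
A usage sketch based on the documented return shape; the mesh and camera are assumed to already exist, and worldPosition is treated as possibly undefined when nothing is hit (an assumption).

// Hypothetical: move an existing mesh to wherever the screen-center ray lands in the world.
const screenCenter = new THREE.Vector2(0.5, 0.5);
const { worldPosition, hitAPlane } = ThreeAR.worldPositionFromScreenPosition(
  camera,
  screenCenter,
  mesh.position
);
if (worldPosition) {
  // hitAPlane indicates whether the hit landed on a detected plane.
  mesh.position.copy(worldPosition);
}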

positionFromAnchor(anchor: ARAnchor): THREE.Vector3

Props

  • anchor: { worldTransform: Matrix4 }
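
A short sketch, assuming anchor comes from the AR session's anchor list and has the { worldTransform } shape listed above.

// Hypothetical: place a mesh at a detected anchor's position.
const anchorPosition = ThreeAR.positionFromAnchor(anchor);
mesh.position.copy(anchorPosition);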

improviseHitTest(point, camera: THREE.Camera): ?THREE.Vector3

Props

  • point: THREE.Vector2
  • camera: THREE.Camera