Part II: Detecting Planes 🛫
The Basics
In the previous part, we went through the basics of ARKit and the core classes that help with building AR experiences. In this part, we’ll look at more exciting stuff: detecting flat surfaces in your scene. Around you. Or anywhere really.
Let’s get right into it.
The Exciting Stuff: Plane Detection
Let’s quickly go over the basics of building an AR experience:
- Create an ARWorldTrackingConfiguration object
- Create an ARSession
- Bind the ARSession to a ARSCNView
- Run the ARSession with the configuration object
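As a quick refresher, a minimal version of that setup might look something like this. It's just a sketch: the sceneView outlet is an assumption on my part, and ARSCNView creates and owns its own ARSession, which covers the "bind" step for you:

```swift
import ARKit
import UIKit

class ViewController: UIViewController {
    // Assumed to be hooked up in a storyboard (or created in code)
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // ARSCNView owns its ARSession, so we only need to run
        // that session with a world-tracking configuration
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}
```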
ARKit has the ability to detect planar surfaces. In principle these aren’t limited to horizontal surfaces, but horizontal planes are all we can detect at this point. Support for vertical surfaces, I believe, will be added in the future. Oh, the glorious future.
Code
To enable plane detection, we simply set a property on the configuration object indicating the type of planes the session should detect:
```swift
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
```
The planeDetection property on ARWorldTrackingConfiguration is an option set with a single value, .horizontal (ARPlaneDetectionHorizontal), at the time of writing. By default, plane detection is turned off.
Now you’d expect to see horizontal surfaces around you in the session. But you don’t. Let’s see why.
Show them planes
If plane detection is enabled on the configuration object and the session is run, ARKit analyses feature points, detects horizontal planar surfaces in the scene, and adds ARPlaneAnchor objects to the session.
Let’s look into what that logic actually looks like.
An ARSession is able to track an ARAnchor’s position and orientation in 3D space. When plane detection is enabled, ARPlaneAnchor objects (a subclass of ARAnchor) get added to the session whenever a flat surface is detected.
This information is available to the session, but it isn’t visible to the user unless we make it so. The ARPlaneAnchor objects contain some useful properties like alignment, center, and extent, which can be used to add a visual plane to the scene.
Every time an ARPlaneAnchor is added, the session notifies your ARSessionDelegate, ARSCNViewDelegate, or ARSKViewDelegate. Let’s do this with SceneKit, since we’re already familiar with ARSCNView.
Most often, the view controller that handles your ARSCNView would be your ARSCNViewDelegate, and you can implement methods on it that will be called by the session when it needs information.
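In code, that wiring might look something like this (again assuming a sceneView outlet on the view controller):

```swift
import ARKit
import UIKit

class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // The view controller receives the ARSCNViewDelegate callbacks,
        // including the renderer(_:didAdd:for:) method we implement below
        sceneView.delegate = self
    }
}
```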
In our case, at any point during the session, whenever an ARAnchor is added to the session, ARSCNView adds an empty SCNNode at the anchor’s position and notifies the delegate with this method:
```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor)
```
We simply check if the anchor added is an ARPlaneAnchor, and use the anchor’s properties to add a visual plane onto the scene by adding it to the empty node.
Code
There are two parts to showing detected plane surfaces:
- Check if the anchor that was added is an ARPlaneAnchor
- Call addPlane() if the anchor added is an ARPlaneAnchor
```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let anchor = anchor as? ARPlaneAnchor else { return }
    // Here `anchor` is an ARPlaneAnchor
    addPlane(for: node, at: anchor)
}
```
For a detected ARPlaneAnchor, we add a plane, which is an SCNNode:
```swift
func addPlane(for node: SCNNode, at anchor: ARPlaneAnchor) {
    // Create a new node
    let planeNode = SCNNode()

    let w = CGFloat(anchor.extent.x)
    let h: CGFloat = 0.01
    let l = CGFloat(anchor.extent.z)

    // Box geometry with a minimum height
    let geometry = SCNBox(width: w, height: h, length: l, chamferRadius: 0.0)

    // Translucent white plane
    geometry.firstMaterial?.diffuse.contents = UIColor.white.withAlphaComponent(0.5)

    // Attach the geometry to the node
    planeNode.geometry = geometry

    // Set position
    // Use the anchor's `center` property to place the node
    planeNode.position = SCNVector3(
        anchor.center.x,
        anchor.center.y,
        anchor.center.z
    )

    // Keep a reference to the plane you're adding, so you can update it later
    planes[anchor] = planeNode

    // Add the plane node to the empty node ARKit added for this anchor
    node.addChildNode(planeNode)
}
```
With that, you should have planes being detected and displayed in the scene around you.
I prefer to abstract the plane logic to its own class. You can take a look at my Plane implementation here.
That’s not all folks. We still have one thing left:
Once planes are detected in the scene, ARKit keeps analysing the feature points for better results, so a plane that was already detected can get a better result over time; by result I mean its boundaries and position can become more accurate. To handle this and update the UI accordingly, we need a reference to the planes that we add to our session:
```swift
var planes = [ARPlaneAnchor: SCNNode]()
```
I use a dictionary to keep a reference to the planes when they’re added to the session, so I can refer back to the plane that needs updating using its anchor.
When boundaries for existing planes are updated as you get better input from the camera, the following delegate method is called:
```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor)
```
Code
- Updating an existing plane node
```swift
func updatePlane(for anchor: ARPlaneAnchor) {
    // Pull the plane that needs to get updated
    guard let plane = planes[anchor] else { return }

    // Update its geometry to match the refined extent
    if let geometry = plane.geometry as? SCNBox {
        geometry.width = CGFloat(anchor.extent.x)
        geometry.length = CGFloat(anchor.extent.z)
        geometry.height = 0.01
    }

    // Update its position to match the refined center
    plane.position = SCNVector3(
        anchor.center.x,
        anchor.center.y,
        anchor.center.z
    )
}
```
So, now that we have a function that updates the plane’s geometry and position for an ARPlaneAnchor, it can be called when the delegate is notified of an update:
```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    // Verify the updated anchor is an ARPlaneAnchor
    guard let anchor = anchor as? ARPlaneAnchor else { return }

    // Update the plane
    updatePlane(for: anchor)
}
```
And with that you should have an understanding of how planes are detected and updated in an ARSession.
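To tie things together, here’s a rough sketch of what abstracting the plane logic into its own class could look like. The class name and structure are my own, not necessarily the Plane implementation linked above:

```swift
import ARKit

// A small SCNNode subclass that owns the plane geometry and knows how
// to update itself from an ARPlaneAnchor. One possible abstraction,
// not necessarily the implementation linked earlier.
class Plane: SCNNode {
    init(anchor: ARPlaneAnchor) {
        super.init()

        let geometry = SCNBox(width: CGFloat(anchor.extent.x),
                              height: 0.01,
                              length: CGFloat(anchor.extent.z),
                              chamferRadius: 0.0)
        geometry.firstMaterial?.diffuse.contents = UIColor.white.withAlphaComponent(0.5)
        self.geometry = geometry
        position = SCNVector3(anchor.center.x, anchor.center.y, anchor.center.z)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // Called from renderer(_:didUpdate:for:) when ARKit refines the anchor
    func update(for anchor: ARPlaneAnchor) {
        if let box = geometry as? SCNBox {
            box.width = CGFloat(anchor.extent.x)
            box.length = CGFloat(anchor.extent.z)
        }
        position = SCNVector3(anchor.center.x, anchor.center.y, anchor.center.z)
    }
}
```

With something like this, the dictionary could store Plane nodes directly, and the two delegate methods shrink to creating or updating a Plane.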
Moving on
You should pretty much be able to write an AR app that detects flat surfaces around you. But what good is it if you can’t put stuff in 3D, you say? We’ll see how to do exactly that in the next part. Feel free to leave any feedback you have.
Index of the series of ARKit posts:
- Part I: Introducing ARKit
- Part II: Detecting Planes
- Part III: Adding 3D content to your scene
- Part IV: Lighting with SceneKit
That’s all folks.