The Reference Book of RealityKit

MLBoy
17 min read · Oct 18, 2021


RealityKit is Apple's framework for making AR apps on iOS.
In this article, you can learn how to use RealityKit from the basics to advanced features.
Please use it as an introduction and an index.

(If you have never created an app, read “Create your first app”.)

Contents

Displaying AR screen with RealityKit

Displaying AR contents

Machining the surface of an object

Position, Orientation, Scale

Placing contents with anchors

Demo

Animations

Lighting

Physics And Collisions

Real-world understanding with the LiDAR sensor

Audio

Tap and Gestures

Custom Component

Share AR with multiple devices

Create a scene with Reality Composer and read it from code

Register and receive events

The relationship between ARKit and RealityKit, and points to note

RealityKit-Sampler

Hand Interaction

Displaying AR screen with RealityKit

By placing an ARView, you can display content rendered by RealityKit.
Implementation

UIKit
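A minimal sketch of the UIKit setup (the view controller name is my own):

```swift
import UIKit
import RealityKit

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Create an ARView that fills the screen and add it to the view hierarchy.
        let arView = ARView(frame: view.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)
    }
}
```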

SwiftUI
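A minimal sketch of the SwiftUI setup, wrapping ARView in a UIViewRepresentable (type names are my own):

```swift
import SwiftUI
import RealityKit

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        ARView(frame: .zero)
    }
    func updateUIView(_ uiView: ARView, context: Context) {}
}

struct ContentView: View {
    var body: some View {
        ARViewContainer().ignoresSafeArea()
    }
}
```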

* Don’t forget to add a Camera Usage Description to Info.plist.

Subclassing ARView or ARViewController for SwiftUI

When using ARView with SwiftUI, it is convenient to subclass ARView and display it with UIViewRepresentable, or to display a view controller that contains an ARView with UIViewControllerRepresentable.

UIViewControllerRepresentable

How to display a UIViewController as a SwiftUI view.
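A sketch of the view controller approach (the type names are my own):

```swift
import SwiftUI
import UIKit
import RealityKit

// A view controller that owns an ARView, as in the UIKit example above.
class ARViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let arView = ARView(frame: view.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)
    }
}

struct ARViewControllerContainer: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> ARViewController {
        ARViewController()
    }
    func updateUIViewController(_ uiViewController: ARViewController, context: Context) {}
}
```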

Displaying AR contents

Display objects in RealityKit, such as boxes and spheres.

Box

RealityKit box mesh

Implementation
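A minimal sketch (the 10 cm size and the plane anchor are arbitrary choices, and an existing arView is assumed):

```swift
let mesh = MeshResource.generateBox(size: 0.1)   // 10 cm cube
let model = ModelEntity(mesh: mesh, materials: [SimpleMaterial()])
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(model)
arView.scene.addAnchor(anchor)
```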

Plane

RealityKit plane mesh

Implementation
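A minimal sketch (sizes are arbitrary):

```swift
// width/depth generates a plane lying flat in the X–Z plane;
// use generatePlane(width:height:) for an upright plane instead.
let mesh = MeshResource.generatePlane(width: 0.2, depth: 0.2)
let model = ModelEntity(mesh: mesh, materials: [SimpleMaterial()])
```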

Sphere

RealityKit sphere mesh

Implementation
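A minimal sketch (the radius is arbitrary):

```swift
let mesh = MeshResource.generateSphere(radius: 0.05)   // 5 cm radius
let model = ModelEntity(mesh: mesh, materials: [SimpleMaterial()])
```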

Text

RealityKit text mesh
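A minimal sketch; note that the font size works in meters here:

```swift
let mesh = MeshResource.generateText(
    "Hello",
    extrusionDepth: 0.01,              // thickness of the 3D text
    font: .systemFont(ofSize: 0.1),    // roughly 10 cm tall glyphs
    containerFrame: .zero,
    alignment: .center,
    lineBreakMode: .byWordWrapping
)
let model = ModelEntity(mesh: mesh,
                        materials: [SimpleMaterial(color: .white, isMetallic: false)])
```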

USDZ model

You can show complex shapes in 3D.

Dinosaurs USDZ model (converted from .obj).

Implementation
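A minimal sketch (“toy_robot” is a placeholder for a .usdz file added to your app target):

```swift
let model = try! Entity.loadModel(named: "toy_robot")
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(model)
arView.scene.addAnchor(anchor)
```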

You can find some USDZ samples on Apple's sample page.

Asynchronous loading

Loading many high-quality models at once blocks the app.
To avoid that, you can use asynchronous loading.
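A sketch using the Combine-based loader (an existing arView and a placeholder model name are assumed):

```swift
import Combine

var cancellable: AnyCancellable?

cancellable = Entity.loadModelAsync(named: "toy_robot")
    .sink(receiveCompletion: { completion in
        if case .failure(let error) = completion {
            print("loading failed:", error)
        }
    }, receiveValue: { model in
        // Called when the model is ready, without blocking the app.
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(model)
        arView.scene.addAnchor(anchor)
    })
```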

Machining the surface of an object

Add color, texture, and pattern to the surface of an object.

Simple Material

A material that is affected by real-world lighting, including reflections.

RealityKit simple material. The color is blue and metallic.

Implementation
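A minimal sketch matching the image above:

```swift
// roughness 0 = glossy, 1 = matte.
let material = SimpleMaterial(color: .blue, roughness: 0.0, isMetallic: true)
let model = ModelEntity(mesh: .generateBox(size: 0.1), materials: [material])
```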

Unlit Material

A material that is not affected by physically based rendering. It doesn’t get dark even in a dark place.

RealityKit unlit material

Implementation
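A minimal sketch:

```swift
let material = UnlitMaterial(color: .orange)   // renders at full brightness
let model = ModelEntity(mesh: .generateSphere(radius: 0.05), materials: [material])
```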

Occlusion Material

A material that shows the camera image through AR objects, hiding whatever is rendered behind it.

Implementation
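A minimal sketch:

```swift
let material = OcclusionMaterial()
let box = ModelEntity(mesh: .generateBox(size: 0.2), materials: [material])
// AR content rendered behind this box is hidden; the camera image shows instead.
```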

Place a box with occlusion material in front of the robot

Image Texture

You can paste the image on the surface of the object.

Paste the dolphin image on the box mesh

Implementation

Import an image as a texture resource and attach it to an unlit material.
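A sketch using the RealityKit 2 (iOS 15) material API (“dolphin” is a placeholder asset name):

```swift
var material = UnlitMaterial()
let texture = try! TextureResource.load(named: "dolphin")
material.color = .init(tint: .white, texture: .init(texture))
let model = ModelEntity(mesh: .generateBox(size: 0.15), materials: [material])
```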

However, if you load the image by name from the Xcode asset catalog (Assets.xcassets) as above, it may end up darker than the original image.
In that case, loading it by URL improves the result.
Place the image in the bundle’s regular folder hierarchy instead of the asset catalog and load it by URL.

UIImages and remote URLs cannot be read directly by TextureResource, so save them locally first and then load them by URL.
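A sketch of the temporary-save workaround (the file name is arbitrary):

```swift
// Save a UIImage to a temporary file, then load it as a texture by URL.
func textureResource(from image: UIImage) throws -> TextureResource {
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("texture.png")
    try image.pngData()!.write(to: url)
    return try TextureResource.load(contentsOf: url)
}
```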

The aspect ratio of the image gets distorted depending on the size of the mesh.
For example, to size the box from a fixed width according to the image’s aspect ratio, you can do the following.
In the case of a box, it is not possible to keep the aspect ratio on every face; you have to give it up on either the sides or the top and bottom.
The code below gives it up on the top and bottom.
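A sketch with a fixed width (“dolphin” is a placeholder image):

```swift
let image = UIImage(named: "dolphin")!
let aspect = Float(image.size.height / image.size.width)
let width: Float = 0.3   // fixed width in meters
// Height follows the image's aspect ratio and depth matches the width,
// so the upright faces keep their proportions; the top and bottom distort.
let mesh = MeshResource.generateBox(width: width,
                                    height: width * aspect,
                                    depth: width)
```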

A bit of trivia: when you apply a texture from code to a mesh created with Reality Composer, the image may come out flipped, or only part of it may be applied.

Video Material

You can paste the video on the surface of the object.

Paste the rat video on the box mesh

Implementation

Initialize AVPlayer and attach it to VideoMaterial.
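A minimal sketch (“rat.mp4” is a placeholder video in the app bundle):

```swift
import AVFoundation

let url = Bundle.main.url(forResource: "rat", withExtension: "mp4")!
let player = AVPlayer(url: url)
let material = VideoMaterial(avPlayer: player)
let model = ModelEntity(mesh: .generateBox(size: 0.2), materials: [material])
player.play()
```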

To size the mesh from a fixed width according to the video’s aspect ratio, you can use the following method.
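One way to do it, reading the natural size of the video track (a sketch):

```swift
let asset = AVURLAsset(url: url)
if let track = asset.tracks(withMediaType: .video).first {
    // Apply the preferred transform so rotated videos report the right size.
    let size = track.naturalSize.applying(track.preferredTransform)
    let aspect = Float(abs(size.height) / abs(size.width))
    let width: Float = 0.3   // fixed width in meters
    let mesh = MeshResource.generateBox(width: width,
                                        height: width * aspect,
                                        depth: width)
}
```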

Set material for USDZ model

You can paste the RealityKit material on the surface of the USDZ model.

Paste the blue simple material on the surface of the USDZ

Implementation
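A sketch that replaces every material slot on the loaded model (“toy_robot” is a placeholder):

```swift
let model = try! Entity.loadModel(named: "toy_robot")
let material = SimpleMaterial(color: .blue, isMetallic: true)
if var modelComponent = model.model {
    // Swap each existing material for the new one, keeping the slot count.
    modelComponent.materials = modelComponent.materials.map { _ in material }
    model.model = modelComponent
}
```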

Set video material for USDZ model

Paste the flame video material on the surface of USDZ

Implementation

Entity position, orientation, scale

You can change the position, orientation, and scale of the object in code.

Position

Set the position in local coordinates (the origin of the parent entity is the origin).

Implementation
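For example (model is an existing entity):

```swift
// 20 cm above the origin of the parent entity.
model.position = SIMD3<Float>(0, 0.2, 0)
```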

Set the position in the world coordinates (the camera position at the time of starting the application is the origin).

Implementation
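For example:

```swift
// Half a meter in front of the world origin (relativeTo: nil means world space).
model.setPosition(SIMD3<Float>(0, 0, -0.5), relativeTo: nil)
```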

Orientation

Implementation
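For example:

```swift
// Rotate 45 degrees around the Y axis (angles are in radians).
model.orientation = simd_quatf(angle: .pi / 4, axis: SIMD3<Float>(0, 1, 0))
```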

Scale

Implementation
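For example:

```swift
// Double the size on every axis.
model.scale = SIMD3<Float>(2, 2, 2)
```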

SIMD3

It is often used for coordinates.
It can be initialized with simd_make_float3() or an array literal.

Implementation
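For example:

```swift
let a = simd_make_float3(0, 0.1, -0.5)
let b: SIMD3<Float> = [0, 0.1, -0.5]   // array-literal initialization
```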

Placing contents with anchors

You can fix AR content to a real-world feature point to make it look real.
The object stays anchored as the camera moves, and if the anchor moves, the object follows it.
With a RealityKit anchor entity, the model entity appears as soon as the anchor is found.

Horizontal Plane Anchor

A horizontal surface such as a desk or floor can be used as an anchor.
This gives you the feeling of actually placing an AR object.

Detect the desk with a horizontal anchor and place the box

Implementation

If you want to use plane anchors, run an AR session with ARWorldTrackingConfiguration.
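A sketch that runs the session and anchors a box to a horizontal surface (bounds and sizes are arbitrary):

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal]
arView.session.run(config)

// Requires roughly 20 cm x 20 cm of detected surface before anchoring.
let anchor = AnchorEntity(plane: .horizontal, minimumBounds: [0.2, 0.2])
anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.1),
                            materials: [SimpleMaterial()]))
arView.scene.addAnchor(anchor)
```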

Vertical Plane Anchor

Vertical planes such as walls, doors, and display surfaces can be anchored.

Place the box on the display with vertical anchors

Implementation
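A minimal sketch:

```swift
let anchor = AnchorEntity(plane: .vertical)
anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.1),
                            materials: [SimpleMaterial()]))
arView.scene.addAnchor(anchor)
```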

Image Anchor

Images can be anchored, such as posters and magazine covers.

Recognize the image of the girl on the display and place the box

In the demo image, the girl image displayed on the computer is used as the anchor.

Implementation

  1. Create an AR resource group: in “Assets.xcassets”, click the “+” button at the bottom left → “AR and Textures” → “AR Resource Group”, then drag and drop the image you want to use as an anchor.
  2. Click the anchor image and register the physical width and height of the image in the right pane.

3. Create an anchor with the AR resource folder name and image name.
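A sketch (“AR Resources” and “girl” are placeholders for your resource group and image names):

```swift
let anchor = AnchorEntity(.image(group: "AR Resources", name: "girl"))
anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.05),
                            materials: [SimpleMaterial()]))
arView.scene.addAnchor(anchor)
```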

You can also create the following effects from the anchor image.

Video that pops out of the computer screen

Set the first frame of the video as an image anchor, paste the video onto a box entity the same size as the anchor image, and play it.

Implementation

  1. Prepare the video box.

2. Play the video when the ARSession finds the anchor.

Object Anchor

You can use a pre-scanned object as an anchor.

Place the box with the plant pot as an anchor

Implementation

  1. Scan the object you want to use as an anchor with ARKit’s object-scanning function. You can create an .arobject file by running the Apple sample app.

Scan the object you want to use as an anchor.

2. Register the generated .arobject file in an AR Resource Group (the procedure is the same as for the image anchor).

Drag and drop the .arobject file into the AR resource group.

3. Create an object anchor with the registered resource group name and the .arobject name.
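A sketch (“AR Resources” and “plant_pot” are placeholder names):

```swift
let anchor = AnchorEntity(.object(group: "AR Resources", name: "plant_pot"))
anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.05),
                            materials: [SimpleMaterial()]))
arView.scene.addAnchor(anchor)
```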

Try wrapping the object in a translucent sphere.

Wrap the object in a translucent sphere.

Face Anchor

You can detect a person’s face and use it as an anchor.

Place a skeleton on the face anchor

Implementation

To use the face-targeted anchor entity, run an ARView session with ARFaceTrackingConfiguration.

By default, the face geometry is used for occlusion and the anchor follows the face.
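A minimal sketch:

```swift
import ARKit

arView.session.run(ARFaceTrackingConfiguration())

let anchor = AnchorEntity(.face)
anchor.addChild(ModelEntity(mesh: .generateSphere(radius: 0.03),
                            materials: [SimpleMaterial()]))
arView.scene.addAnchor(anchor)
```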

Body Anchor

RealityKit can detect the human body and use it as an anchor.

Place the airplane model 1m above the body anchor

Implementation

To use the body anchor, run an ARView session with ARBodyTrackingConfiguration.

The body anchor follows the body.
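A minimal sketch (“toy_biplane” is a placeholder USDZ):

```swift
import ARKit

arView.session.run(ARBodyTrackingConfiguration())

let anchor = AnchorEntity(.body)
let airplane = try! Entity.loadModel(named: "toy_biplane")
airplane.position = SIMD3<Float>(0, 1.0, 0)   // 1 m above the body origin
anchor.addChild(airplane)
arView.scene.addAnchor(anchor)
```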

AR Anchor

You can create an anchor entity from an ARKit AR anchor.
You can also access the AR anchor’s properties, such as the positions of the head and hands relative to a body anchor, or the facial movements of a face anchor.
Anchor entities created from AR anchors follow the position updates of those AR anchors.
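A sketch of wrapping incoming ARKit anchors in ARSessionDelegate (an accessible arView is assumed):

```swift
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for arAnchor in anchors {
        let anchorEntity = AnchorEntity(anchor: arAnchor)
        arView.scene.addAnchor(anchorEntity)
        // The anchor entity now follows position updates of arAnchor.
    }
}
```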

Animations

You can animate moving, rotating, and scaling.
You can also play the animation built into the USDZ.

Move

Move to the side animation

Implementation
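A minimal sketch:

```swift
var transform = model.transform
transform.translation += SIMD3<Float>(0.2, 0, 0)   // 20 cm to the side
model.move(to: transform, relativeTo: model.parent, duration: 2.0)
```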

Rotation

150 degree rotation animation

Implementation
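A minimal sketch:

```swift
var transform = model.transform
transform.rotation = simd_quatf(angle: 150 * .pi / 180,   // 150 degrees
                                axis: SIMD3<Float>(0, 1, 0))
model.move(to: transform, relativeTo: model.parent, duration: 2.0)
```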

Enlargement / reduction

3x magnified animation

Implementation
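A minimal sketch:

```swift
var transform = model.transform
transform.scale = SIMD3<Float>(3, 3, 3)   // 3x magnification
model.move(to: transform, relativeTo: model.parent, duration: 2.0)
```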

From RealityKit 2, the .move method appears to replace the .moveCharacter method.

Play the animation embedded in the USDZ

Playing the baked animation in the airplane USDZ file. (The capture is a GIF, so it looks choppy here, but it actually plays smoothly.)

The animation embedded in the USDZ file can be recalled and played.

Implementation

Use Entity.load() to load the USDZ with the animation, add the entity to the scene, and then play the animation.
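A sketch (“toy_drummer” is a placeholder USDZ that contains a baked animation):

```swift
let entity = try! Entity.load(named: "toy_drummer")
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(entity)
arView.scene.addAnchor(anchor)

// Play the first animation baked into the file, repeating forever.
if let animation = entity.availableAnimations.first {
    entity.playAnimation(animation.repeat())
}
```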

Lighting

RealityKit reflects the brightness of the environment by default, but you can also add three types of light entities (light components).

Point Light

It emits light evenly in all directions, like a household light bulb.

Place the point light slightly in front of the box.

Implementation
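A sketch (intensity and radius values are arbitrary; anchor is an existing anchor entity):

```swift
let light = PointLight()
light.light.color = .white
light.light.intensity = 50_000            // lumens
light.light.attenuationRadius = 2.0       // reach in meters
light.position = SIMD3<Float>(0, 0, 0.3)  // slightly in front of the box
anchor.addChild(light)
```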

Directional Light

It emits uniform light in a certain direction.

Place the directional light slightly in front of the box.

Implementation
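A sketch (the intensity value is arbitrary):

```swift
let light = DirectionalLight()
light.light.intensity = 5_000   // lux
anchor.addChild(light)
// Place slightly in front of the box and aim back at it.
light.look(at: [0, 0, 0], from: [0, 0, 0.3], relativeTo: anchor)
```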

Spotlight

A light that illuminates a cone-shaped volume, like a typical stage spotlight.

Place the spotlight slightly in front of the box.

Implementation
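A sketch (angle and intensity values are arbitrary):

```swift
let light = SpotLight()
light.light.intensity = 50_000   // lumens
light.light.innerAngleInDegrees = 30
light.light.outerAngleInDegrees = 60
light.light.attenuationRadius = 3.0
anchor.addChild(light)
light.look(at: [0, 0, 0], from: [0, 0, 0.3], relativeTo: anchor)
```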

Physics And Collisions

RealityKit can express physical actions such as bouncing when entities collide with each other or receiving gravity.

Interaction between a plate and a sphere with physical action

Implementation

Add physics bodies and collision shapes to the entities.
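A sketch of the plate-and-sphere setup (sizes are arbitrary):

```swift
// A static board that stays in place.
let board = ModelEntity(mesh: .generateBox(size: [0.5, 0.02, 0.5]),
                        materials: [SimpleMaterial()])
board.generateCollisionShapes(recursive: true)
board.physicsBody = PhysicsBodyComponent(mode: .static)

// A dynamic sphere that falls onto the board under gravity.
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.05),
                         materials: [SimpleMaterial()])
sphere.position = SIMD3<Float>(0, 0.5, 0)
sphere.generateCollisionShapes(recursive: true)
sphere.physicsBody = PhysicsBodyComponent(mode: .dynamic)
```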

PhysicsBodyMode

.dynamic:

This type can apply forces to other dynamic bodies by moving. It also moves in response to the forces it receives.

.kinematic:

This type can apply forces to other dynamic bodies by moving. It does not move in response to the forces it receives.

.static:

This type doesn’t move. It applies forces to other dynamic bodies only when a collision occurs. It does not move in response to the forces it receives.

An entity with a .dynamic body will fall under gravity if its weight is not supported by another entity.

Collision detection

You can detect collisions between entities with CollisionComponent.

Play video material triggered by collision between sphere and board.

Implementation

Listen to CollisionEvents with Combine.
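A sketch (keep the returned Cancellable alive for as long as you need the events; sphere is the entity from above):

```swift
import Combine

var subscription: Cancellable?

subscription = arView.scene.subscribe(to: CollisionEvents.Began.self,
                                      on: sphere) { event in
    print("\(event.entityA.name) collided with \(event.entityB.name)")
}
```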

If you want to detect a collision of a particular entity…

* If an entity has a PhysicsBodyComponent, the collision will not be detected unless one of the colliding entities has a .dynamic body. If the entity has no PhysicsBodyComponent, the collision can be detected with the CollisionComponent alone.

By the way, when you end the ARView session and transition to another view, cancel the Cancellable explicitly; otherwise the subscription is never released, collisions keep being detected, and memory usage grows.

Real-world understanding with the LiDAR sensor

On LiDAR-equipped devices, the real-world shape can be captured in detail, so AR can be made to look more realistic with the following effects (see the sketch after this list).

Occlusion

The AR object is hidden behind the real object.

Receives Lighting

AR objects cast shadows on the real floor.

Physics

AR objects physically interact with real objects.

Collision

AR objects can collide with the reconstructed real-world mesh.
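A sketch of enabling these effects through scene understanding (requires a LiDAR device):

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
config.sceneReconstruction = .mesh   // LiDAR-based mesh of the surroundings
arView.session.run(config)

arView.environment.sceneUnderstanding.options.formUnion(
    [.occlusion, .receivesLighting, .physics, .collision]
)
```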

Audio

Audio playback

Implementation

Load the sound source with AudioFileResource and pass it to the entity’s prepareAudio method, which returns an AudioPlaybackController. Play the sound through that controller.

You can also load the sound source from a URL with AudioFileResource.load(contentsOf: url).
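A minimal sketch (“sound.mp3” is a placeholder file in the app bundle):

```swift
let audio = try! AudioFileResource.load(named: "sound.mp3",
                                        inputMode: .spatial,
                                        shouldLoop: false)
let controller = model.prepareAudio(audio)
controller.play()
```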

AudioFileResource.InputMode

.nonSpatial : Sounds the same regardless of position.

.spatial : The sound changes depending on the distance and direction between the device and the entity.

.ambient : The sound changes depending only on the direction between the device and the entity.

Tap and Gestures

Entity Hit Test

ARView can detect entities along the ray extending from the user’s tap.

Apply force when tapping an entity.

Implementation

  1. Detect the user’s tap on the ARView with a UITapGestureRecognizer and get the entity along the ray.

2. The entity must have a collision shape to be found by the hit test.
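A sketch of both steps, inside the view controller that owns the arView:

```swift
func setUpTapGesture() {
    let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
    arView.addGestureRecognizer(tap)
}

@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    let location = recognizer.location(in: arView)
    // entity(at:) only returns entities that have a collision shape.
    if let entity = arView.entity(at: location) as? ModelEntity {
        entity.addForce([0, 2, 0], relativeTo: nil)   // push the entity upward
    }
}
```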

Detect intersection with a plane by Ray Cast

You can detect the intersection between a detected plane and the ray extending from where you tap the display.
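A sketch that places a sphere where the tap ray intersects a horizontal plane:

```swift
@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    let location = recognizer.location(in: arView)
    if let result = arView.raycast(from: location,
                                   allowing: .estimatedPlane,
                                   alignment: .horizontal).first {
        let anchor = AnchorEntity(world: result.worldTransform)
        anchor.addChild(ModelEntity(mesh: .generateSphere(radius: 0.03),
                                    materials: [SimpleMaterial()]))
        arView.scene.addAnchor(anchor)
    }
}
```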

Move, Rotate, and Enlarge / Reduce an Entity with Gestures

By installing ARView’s built-in gesture recognizers (subclasses of UIGestureRecognizer) on the ARView for each entity, you can perform the following gesture operations on that entity:

Move with a pan (drag) gesture (in the X–Z plane)
Rotate with a two-finger rotation gesture (around the Y axis)
Enlarge / reduce with a pinch gesture

Box to move / rotate / enlarge / reduce with gesture.

Note: if you attach a .dynamic PhysicsBodyComponent to an entity, the move and rotate gestures will not work (only scale works).

With only the installation shown below, move, rotate, and enlarge / reduce are reflected in the model entity. As with other gesture recognizers, you can also add an @objc target method to read the entity’s movement distance and scale amount.
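A sketch of the installation (the collision shapes are required for the gestures to find the entity):

```swift
model.generateCollisionShapes(recursive: true)
arView.installGestures([.translation, .rotation, .scale], for: model)
```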

Custom Component

You can create a struct that conforms to the Component protocol and give an entity your own logic and data.

Implementation
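A sketch of a custom component (the component name and its field are my own):

```swift
// A custom component carrying per-entity data.
struct RotationSpeedComponent: Component {
    var radiansPerSecond: Float
}

// Register the component once, e.g. at app startup.
RotationSpeedComponent.registerComponent()

// Attach it to an entity and read it back.
model.components[RotationSpeedComponent.self] = RotationSpeedComponent(radiansPerSecond: .pi)
if let component = model.components[RotationSpeedComponent.self] {
    print(component.radiansPerSecond)
}
```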

Share AR with multiple devices

When you view the AR screen on multiple devices, you can share the AR world by placing entities in the same locations and having them perform the same actions.

Reflecting operations from an iPod touch onto an iPhone.

Collaborative Session

You can instantly share anchor positions, entity component states, physical states, and more across multiple devices.

There are many networking options for the sharing service, but here we use Apple’s MultipeerConnectivity framework.

Implementation

1. Add Local Network (Bonjour services) to the app

Add a Local Network Usage Description and Bonjour services to Info.plist.

Write your service name as a String under Bonjour services in Info.plist.
This service name will be the identifier for the app’s interaction.
The Bonjour service name must be at most 15 ASCII characters: lowercase letters, numbers, and hyphens.
Let’s say “my-mc-service”.

2. Connect with other peers

3. Start Collaborative Session

The anchors and entities are now shared by multiple devices.
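A rough sketch of steps 2 and 3 (multipeerSession and sendToAllPeers are placeholders for your own MultipeerConnectivity wrapper):

```swift
import ARKit
import MultipeerConnectivity

// Enable collaboration when running the session.
let config = ARWorldTrackingConfiguration()
config.isCollaborationEnabled = true
arView.session.run(config)

// ARSessionDelegate: forward collaboration data to the connected peers.
func session(_ session: ARSession,
             didOutputCollaborationData data: ARSession.CollaborationData) {
    guard let encoded = try? NSKeyedArchiver.archivedData(
        withRootObject: data, requiringSecureCoding: true) else { return }
    multipeerSession.sendToAllPeers(encoded)   // placeholder helper
}

// When data arrives from a peer, hand it back to the local session.
func receivedData(_ data: Data, from peer: MCPeerID) {
    if let collaborationData = try? NSKeyedUnarchiver.unarchivedObject(
        ofClass: ARSession.CollaborationData.self, from: data) {
        arView.session.update(with: collaborationData)
    }
}
```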

Tips for enabling Collaborative Sessions as soon as possible and sharing positions accurately

• Bring the devices as close together as possible and point them at similar angles so that they see the same scenery.

• Move the devices around to map the surroundings until ARFrame.worldMappingStatus becomes .mapped.

• Keep the local distance between an anchor’s origin and its child entities small (this reduces deviation).

• Create an anchor for each entity whenever possible.

• However, if you want the relative distance between entities to stay as accurate as possible, attach them to a single anchor.

• Receive AR anchor position updates in the delegate method and update the position of the corresponding anchor entity.

Entity Change Permission

Only the owner of an entity can propagate changes to that entity to other devices.

The owner of an entity is the device that created it.
By transferring ownership, entity changes can be propagated everywhere, even from devices that were not the original owner.

To take over ownership, a device that is not the current owner sends an ownership request.

Implementation

Entity owners can set whether to allow ownership when requested.
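A sketch of both sides of the transfer:

```swift
// On a device that is not the current owner: request ownership first.
entity.requestOwnership { result in
    if result == .granted {
        entity.position.y += 0.1   // this change now propagates to peers
    }
}

// On the owning device: allow transfers automatically when requested.
entity.synchronization?.ownershipTransferMode = .autoAccept
```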

You can specify not to share an entity during a sharing session.

This entity is now only visible on the owner’s device.
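One way to do this is to remove the entity's SynchronizationComponent (a sketch):

```swift
// Without a synchronization component, the entity is not shared with peers.
entity.synchronization = nil
```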

Anchors for other devices

An ARParticipantAnchor gives you the location of another device and the AR session ID that is unique to that device.

Attach a text entity to the participant anchor on the iPod touch.

Implementation

You can identify the display name of the anchored device by exchanging AR session IDs over MultipeerConnectivity and associating each ID with the peerID that has the display name as a property.
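A sketch in ARSessionDelegate (the attached content is up to you):

```swift
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors {
        guard let participant = anchor as? ARParticipantAnchor else { continue }
        let anchorEntity = AnchorEntity(anchor: participant)
        // Attach content here, e.g. a text entity with the peer's display name.
        arView.scene.addAnchor(anchorEntity)
    }
}
```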

Create a scene with Reality Composer and read it from code

You can create an AR scene with a graphical user interface and incorporate it into your app.
You can add objects, anchors, animations, sounds, and more to your scene and import it as an .rcproject file into your Xcode project.

Play the scene in Reality Composer
Play scenes created with Reality Composer with ARView

Create a scene with Reality Composer

To create a Reality Composer project, open it from Xcode → Open Developer Tool → Reality Composer.
Alternatively, you can create one from Xcode’s File → New menu.

Also, if you open the .rcproject file in Xcode and click “Open in Reality Composer”, the file opens in Reality Composer, and edits made there are immediately reflected in the Xcode project.

You can select the anchor where you want to place the content.
(One anchor can be selected for each scene)

Anchors to choose from in Reality Composer

You can add preset models.
USDZ models can also be added by dragging and dropping.

Preset shapes that can be selected with Reality Composer

You can set the position, size, angle, surface color, texture, physics, and collision characteristics of the model.

Reality Composer Object Settings

You can set the behavior of objects such as animation in sequence.

You can set the following behaviors:

Emphasis (comical animation such as somersault)
Display (can be set such as fade-in)
Hide (can be set such as fade out)
Move / Rotate / Enlarge / Reduce (Absolute / Relative)
Apply force (calculates physical action)
Orbit (around other objects)
Change the scene
Play sound (preset sound can be downloaded)
Play environmental sounds
Play music
stand-by
USDZ animation playback
Notification in Xcode

You can set the following start triggers for each behavior sequence:

Tap
Scene start
When the camera approaches
Object collision
Notification from code

Reality Composer behavior settings
Play and check the behavior sequence.

Read the Reality Composer project file in Xcode

The scene is loaded and played as follows.

Implementation

Play a scene created with Reality Composer with ARView.
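For a project named Experience.rcproject containing a scene named “Box”, Xcode generates a loader like this (the generated names depend on your project):

```swift
let boxScene = try! Experience.loadBox()
arView.scene.addAnchor(boxScene)
```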

Access entities in the Reality Composer scene from the code

Access with the entity name set in Reality Composer.

Implementation
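A sketch (“Steel Box” is whatever name you gave the object in Reality Composer):

```swift
if let box = boxScene.findEntity(named: "Steel Box") {
    box.scale *= 2   // manipulate it like any other entity
}
```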

Trigger behavior from code

Select Notification from the Reality Composer behavior settings.
Access the behavior by name from the code.

Implementation
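If the notification trigger is named “Tapped”, the generated code can be posted like this (the property name follows your trigger name):

```swift
boxScene.notifications.tapped.post()
```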

Register and receive events

You can execute code when an event occurs by subscribing to the specific event on ARView.Scene in advance.
For example, to be notified when an anchor is pinned to the scene:

Implementation
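A sketch of subscribing to the anchored-state event (keep the Cancellable alive):

```swift
import Combine

var subscription: Cancellable?

subscription = arView.scene.subscribe(to: SceneEvents.AnchoredStateChanged.self) { event in
    if event.isAnchored {
        print("anchor pinned to the scene:", event.anchor.name)
    }
}
```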

You can receive the following events in RealityKit.

Scene Event

SceneEvents.Update:

An event triggered once per frame interval that you can use to execute custom logic for each frame.

SceneEvents.AnchoredStateChanged:

An event triggered when the anchored state of an anchoring entity changes.

AnimationEvents.PlaybackCompleted:

The event raised when an animation reaches the end of its duration.

AnimationEvents.PlaybackLooped:

The event raised when an animation loops.

AnimationEvents.PlaybackTerminated:

The event raised when an animation is terminated, regardless of whether it ran to completion.

AudioEvents.PlaybackCompleted:

Audio playback completed.

CollisionEvents.Began:

An event raised when two objects collide.

CollisionEvents.Updated:

An event raised on every frame when two objects are in contact.

CollisionEvents.Ended:

An event raised when two objects, previously in contact, separate.

A picture book that pops up when the image anchor is activated

You can use Combine to receive events in ARView.

Implementation

An event subscription of type Cancellable keeps a reference in memory until it is explicitly canceled; this puts pressure on memory, so cancel it after use.

Relationship between ARKit and RealityKit and Points to note

RealityKit is built on ARKit and can be used on its own, but if you want to use various tracking configurations or ARSessionDelegate, you need to explicitly import ARKit, then configure and run the session yourself.

At that time, there are some things to be aware of.

Texture light reflection

Running ARKit’s ARWorldTrackingConfiguration with the default settings dims the light reflections of RealityKit materials.

RealityKit alone
Running a session with ARWorldTrackingConfiguration

To prevent this, set environmentTexturing in ARWorldTrackingConfiguration to .automatic.
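A minimal sketch:

```swift
let config = ARWorldTrackingConfiguration()
config.environmentTexturing = .automatic
arView.session.run(config)
```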

Plane detection settings

If you explicitly run a WorldTrackingConfiguration through ARKit and use a plane-target anchor entity in RealityKit, you must set planeDetection on the configuration before the anchor entity can be placed on a plane. This setting is not required when using RealityKit alone, but it is required when the session is run through ARKit.
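A minimal sketch:

```swift
let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal, .vertical]
arView.session.run(config)
```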

RealityKit sample code collection has been released

RealityKit sample code collection “RealityKit-Sampler” has been released as open source.
You can get the source code on GitHub. You can build with Xcode.

This is a collection of sample code that incorporates the functions of RealityKit in an easy-to-understand manner.

RealityKit-Sampler

Contents of RealityKit-Sampler

Put the box

The simplest way to use the ModelEntity and AnchorEntity.

Things you can learn with Put the box:

ARView in SwiftUI, Scene, Entity, Anchor, MeshResource, Material.

Gigant Robots

Use USDZ models and animations.

Things you can learn with Gigant Robots:

USDZ, Animation

Big Monitor

How to select a video from your album and paste it as a texture.

Things you can learn with Big Monitor:

VideoMaterial, SceneEvent

Building blocks

How to place objects of different shapes and colors.

Things you can learn with Building blocks:

Ray Cast, Hit Test, Handle Gestures, Physics, Collision, TextureResource

Speech Balloon

Things you can learn with Speech Balloon:

Face Anchor, ARSessionDelegate, Deal with RealityComposer

Special Move

Interaction between the body and AR objects.

Things you can learn with Special Move:

Body Anchor

Face Cropper

Detect a face, then crop it.

Things you can learn with Face Cropper:

Image Anchor

AR Hockey

A multi-device AR game.

Things you can learn with AR Hockey:

Collaborative Session

Hand Interaction

AR with Vision Framework.

Things you can learn with Hand Interaction:

AddForce, Use with Vision

Thank you for reading🐣


Request for work:

rockyshikoku@gmail.com
