Developing for ARKit 1.5 update using Unity ARKit Plugin

Posted by Zerone羽 on 2018-03-28

Originally published February 16, 2018

Unity’s ARKit plugin now supports the new augmented reality (AR) features Apple announced in their Spring 2018 Update.  The launch of ARKit with iOS 11 put AR into the hands of hundreds of millions of iPhone and iPad users. With the introduction of ARKit 1.5, Apple aims to provide developers with the tools to power more immersive AR experiences that better integrate with the world.

With Unity’s updated ARKit plugin, developers can now take full advantage of the new ARKit 1.5 tools to create a new generation of AR apps. In addition to horizontal surfaces, you can now detect vertical surfaces, place virtual objects on them, and map irregularly shaped surfaces with more accuracy. Find and recognize the position of real world 2D images and turn them into AR experiences, such as bringing artwork to life. Other new features include relocalization, video formats, auto-focus and the ability to reset the world origin.

Download the updated plugin from Bitbucket and read below to start developing with the ARKit Spring 2018 update using Unity.

Setup

Requirements

  * Unity 2017.1 or later
  * Xcode 9.3 or later with the iOS 11.3 SDK
  * An ARKit-supported iOS device running iOS 11.3 or later

Steps

  1. Start Unity Editor 2017.1 or above
  2. Open the “unity-arkit-plugin” project that you downloaded from Bitbucket
  3. Follow the instructions below to load up the example scenes or create your own scenes
  4. Build your scene in Unity to create an Xcode project
  5. Open the generated Xcode project, then build and run it on your ARKit-supported device

Vertical Planes

One of the most popular new features is vertical planes. When the original ARKit was released, only horizontal planes were detected, and many developers clamored for vertical plane detection. With this update, Apple has answered this call and provided vertical plane detection. In Unity, you can see this as an extra entry in the UnityARPlaneDetection enum, along with another entry for detecting both vertical and horizontal planes at the same time. We also added the “vertical” entry in the ARPlaneAnchorAlignment enum so that you can identify the orientation of each plane anchor detected.
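As a minimal sketch of how this looks in code, a world-tracking session could be started with both detection modes enabled. The configuration field names below are taken from the plugin as we understand it, so treat them as assumptions if your version differs:

```csharp
using UnityEngine.XR.iOS;  // Unity ARKit Plugin namespace

// Minimal sketch: start a world-tracking session that detects both
// horizontal and vertical planes (field names assumed from the plugin).
public static class PlaneDetectionExample
{
    public static void StartSession()
    {
        ARKitWorldTrackingSessionConfiguration config =
            new ARKitWorldTrackingSessionConfiguration();
        config.planeDetection = UnityARPlaneDetection.HorizontalAndVertical;
        config.alignment = UnityARAlignment.UnityARAlignmentGravity;
        config.enableLightEstimation = true;

        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfig(config);
    }
}
```

In the example scenes this configuration is built for you by UnityARCameraManager, driven by the values you set in the Inspector.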

Instead of a new example to show this feature, our original “Assets/UnityARKitPlugin/Examples/UnityARKitScene/UnityARKitScene.scene” can be used along with a slight modification of UnityARCameraManager.cs to allow your configuration to detect both horizontal and vertical planes.

In the scene hierarchy, select the ARCameraManager GameObject and select Horizontal And Vertical in the Inspector window. Plane detection can now detect vertical planes as well.

Here is an example of both kinds of planes being found on a device. Voila! Now you can put up virtual doors and windows on your walls, as well as hang artwork to decorate them, among other things.

Plane boundaries and detail mesh

With this update, instead of just plane centers and rectangle extents, ARKit now provides a more detailed visualization of the boundary of the plane and also a detailed mesh of the shape of the flat surface detected.

The implementation of this is via an ARPlaneGeometry item that is returned as part of an ARPlaneAnchor. Further examination of this item will show a structure similar to ARFaceGeometry which you will remember from ARKit Face Tracking for iPhone X.

You will be given an array of vectors which represent the boundary points, which can be plugged into a LineRenderer to give a representation of the boundary of the flat surface that has been detected. You will also be given an array of vertices, texture coordinates at those vertices, and a list of indices for the triangles of the detail mesh that represents the shape of the surface.

To see how to use this information, have a look at Assets/UnityARKitPlugin/Examples/ARKit1.5/UnityARPlaneMesh/UnityARPlaneMesh.scene. This scene is set up just like our original UnityARKitScene above, but the GeneratePlanes GameObject references the ARKitPlaneGeometry prefab instead of the debugPlanePrefab.

Have a look at the ARKitPlaneGeometry prefab in the Inspector. This new prefab has both a LineRenderer and a MeshRenderer, and takes the information about the ARPlaneGeometry boundary and shape from the ARPlaneAnchor and plugs them into those renderers. See how this is actually done in ARKitPlaneMeshRender.cs.
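A hedged sketch in the spirit of ARKitPlaneMeshRender.cs: the boundary and detail mesh of an ARPlaneGeometry are fed into a LineRenderer and a MeshFilter. The geometry field names here are assumptions based on the plugin's API:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Hedged sketch: render the boundary and detail mesh that ARKit 1.5
// provides for each plane anchor (geometry field names assumed).
public class PlaneGeometryRender : MonoBehaviour
{
    public MeshFilter meshFilter;
    public LineRenderer lineRenderer;

    public void UpdateFromAnchor(ARPlaneAnchor anchor)
    {
        ARPlaneGeometry geom = anchor.planeGeometry;

        // Detail mesh describing the shape of the detected flat surface.
        Mesh mesh = meshFilter.mesh;
        mesh.Clear();
        mesh.vertices = geom.vertices;
        mesh.uv = geom.textureCoordinates;
        mesh.triangles = geom.triangleIndices;
        mesh.RecalculateNormals();

        // Boundary outline of the plane.
        Vector3[] boundary = geom.boundaryVertices;
        lineRenderer.loop = true;
        lineRenderer.positionCount = boundary.Length;
        lineRenderer.SetPositions(boundary);
    }
}
```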

Now build the above scene onto your device and look for oddly shaped flat surfaces to see if they actually show their shape.

Image Anchors

This is arguably the most important and most complex new feature: it allows you to detect specific images (or markers) in the scene and create an anchor at that spot that describes the position, size and orientation of the marker. Read this section of the ARKit documentation to understand this feature better.

Then open up Assets/UnityARKitPlugin/Examples/ARKit1.5/UnityARImageAnchor/UnityARImageAnchor.scene to follow along.

In Unity, we need to set up the reference images and image sets, and allow our configurations to reference an arbitrary image set for detection. To achieve this, we use two new types of assets: ARReferenceImage and ARReferenceImagesSet. These assets can be created from the Assets/Create/UnityARKitPlugin menu.

You can create an ARReferenceImage asset for each image you want your configuration to detect. Fill in the asset with a reference to your actual image asset in the Image Texture field, the physical size of the image being detected in the scene, and the name you want to reference this image with. Here we examine an example asset in our scene folder.

Create one of these for each image you want to detect in the scene. Then make a collection of them by creating an ARReferenceImagesSet with references to each of the ARReferenceImage assets you need the collection to contain. You should also put in a resource group name that we will use to refer to this image set in the AR configuration. Increase the size and add more reference images if needed.

Now in the ARCameraManager GameObject in your scene, put in a reference to the ARReferenceImagesSet that you want to detect in that scene; here, ARImagesSet_UnityLogo is put in the Detection Images field.

This configuration will now look for all the ARReferenceImages that are included in the ARReferenceImagesSet you have referenced. When any of the images in the set are detected, you will get events that add, update and remove the ARImageAnchor associated with that image. We use a script called GenerateImageAnchor to put up the appropriate prefab instance when the particular ARReferenceImage is detected. See the GenerateImageAnchor.cs code to see how we leverage the ARImageAnchor events and the ARReferenceImage referenced in each resulting anchor.
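A hedged sketch in the spirit of GenerateImageAnchor.cs follows: subscribe to the ARImageAnchor events and spawn a prefab when the matching reference image is found. The event and field names are assumptions based on the plugin's conventions:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Hedged sketch: spawn a prefab at the anchor of a detected reference
// image, and keep it in sync as the anchor updates (names assumed).
public class ImageAnchorSpawner : MonoBehaviour
{
    public ARReferenceImage referenceImage;  // image we want to react to
    public GameObject prefabToGenerate;      // spawned at the image anchor

    private GameObject imageAnchorGO;

    void Start()
    {
        UnityARSessionNativeInterface.ARImageAnchorAddedEvent += AddImageAnchor;
        UnityARSessionNativeInterface.ARImageAnchorUpdatedEvent += UpdateImageAnchor;
        UnityARSessionNativeInterface.ARImageAnchorRemovedEvent += RemoveImageAnchor;
    }

    void AddImageAnchor(ARImageAnchor anchor)
    {
        if (anchor.referenceImageName == referenceImage.imageName)
        {
            Vector3 position = UnityARMatrixOps.GetPosition(anchor.transform);
            Quaternion rotation = UnityARMatrixOps.GetRotation(anchor.transform);
            imageAnchorGO = Instantiate(prefabToGenerate, position, rotation);
        }
    }

    void UpdateImageAnchor(ARImageAnchor anchor)
    {
        if (imageAnchorGO != null && anchor.referenceImageName == referenceImage.imageName)
        {
            imageAnchorGO.transform.position = UnityARMatrixOps.GetPosition(anchor.transform);
            imageAnchorGO.transform.rotation = UnityARMatrixOps.GetRotation(anchor.transform);
        }
    }

    void RemoveImageAnchor(ARImageAnchor anchor)
    {
        if (imageAnchorGO != null && anchor.referenceImageName == referenceImage.imageName)
            Destroy(imageAnchorGO);
    }
}
```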

Now build this scene to Xcode. Examine the Xcode project and you will notice that each ARReferenceImagesSet appears as an AR Resource Group named with the name you used in the corresponding asset.

To see how this works with this scene, print out a copy of the reference image PNG included in the scene folder, making sure the printed size matches the physical size specified on the marker’s ARReferenceImage asset. Then run this scene on the device and point the camera at the place where you put the printout of the marker. You should see an instance of the Axes prefab GameObject appear on your marker.

Relocalization

Previously, when you got a phone call or otherwise sent your ARKit app into the background, you would lose your world tracking info and everything would be out of position.  With this update, ARKit allows you to keep your world tracking information after an interruption, as long as you have not moved far from your previous position.  With this new feature comes a tracking state reason that designates the period of time it takes for ARKit to relocalize after an interruption: the ARTrackingStateReason enum now has an entry for ARTrackingStateReasonRelocalizing.

This feature is optional: you can make your ARKit app work as it did before this feature existed by setting the property UnityARSessionNativeInterface.ARSessionShouldAttemptRelocalization to false.

Open up Assets/UnityARKitPlugin/Examples/ARKit1.5/UnityARKitRelocalize/UnityARKitRelocalize.scene to see how this is done.  This scene allows you to toggle between the two modes, and shows the tracking state and tracking state reason as ARKit relocalizes. See the code in RelocalizationControl.cs for details.
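As a rough sketch of the two pieces involved, the snippet below toggles the opt-out property and watches the tracking state reason while ARKit recovers. The tracking-changed event and camera field names are assumptions based on the plugin:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Hedged sketch: opt in to relocalization and log while ARKit
// relocalizes after an interruption (event/field names assumed).
public class RelocalizationExample : MonoBehaviour
{
    void Start()
    {
        // Set to false to behave as before this feature existed.
        UnityARSessionNativeInterface.ARSessionShouldAttemptRelocalization = true;

        UnityARSessionNativeInterface.ARSessionTrackingChangedEvent += TrackingChanged;
    }

    void TrackingChanged(UnityARCamera camera)
    {
        if (camera.trackingReason ==
            ARTrackingStateReason.ARTrackingStateReasonRelocalizing)
        {
            Debug.Log("Relocalizing: return the device near its previous position.");
        }
    }
}
```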

Autofocus

For ARKit, the camera was originally set at infinite focus. With this update, you can choose between infinite focus and autofocus on your ARKit camera.  By default, the camera is set to autofocus if you have the latest update of ARKit on your device.  To change that, we have introduced a new boolean, enableAutoFocus, that you can set on your configuration before starting ARKit with it.

Open up Assets/UnityARKitPlugin/Examples/UnityARKitScene/UnityARKitScene.scene and examine the ARCameraManager GameObject, which now has an “Enable Auto Focus” checkbox that is passed down to the ARKitWorldTrackingSessionConfiguration (see UnityARCameraManager.cs). Build and run this scene with the checkbox enabled and disabled to see the difference between the two modes.
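If you build the configuration yourself rather than through UnityARCameraManager, the toggle looks roughly like this sketch (the enableAutoFocus field comes from the text above; the other fields are assumptions):

```csharp
using UnityEngine.XR.iOS;

// Minimal sketch: toggle autofocus before starting the session.
public static class AutoFocusExample
{
    public static void StartSession(bool useAutoFocus)
    {
        ARKitWorldTrackingSessionConfiguration config =
            new ARKitWorldTrackingSessionConfiguration();
        config.planeDetection = UnityARPlaneDetection.Horizontal;
        config.enableAutoFocus = useAutoFocus;  // false = original infinite focus

        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfig(config);
    }
}
```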

Video Formats

Another request from developers was to increase the resolution of the video that is displayed in an ARKit app.  This update provides higher resolution video for the AR app by default, and also allows you to select among the other video formats supported by the actual device being used.

To enumerate the supported video formats for a device, you can call the static method UnityARVideoFormat.SupportedVideoFormats(), which returns a list of UnityARVideoFormats, each of which includes width, height and frames per second.  Each UnityARVideoFormat also contains an IntPtr that you can use in the videoFormat field of your ARKitWorldTrackingSessionConfiguration to initialize your session with that video format.  By default, the highest possible resolution is used on the device.
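The enumeration and selection steps above can be sketched as follows; the UnityARVideoFormat field names are assumptions based on the plugin:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

// Hedged sketch: list the device's supported video formats and restart
// the session with one of them (field names assumed from the plugin).
public static class VideoFormatExample
{
    public static void UseFirstFormat()
    {
        List<UnityARVideoFormat> formats = UnityARVideoFormat.SupportedVideoFormats();
        foreach (UnityARVideoFormat f in formats)
        {
            Debug.Log(f.imageResolutionWidth + "x" + f.imageResolutionHeight +
                      " @ " + f.framesPerSecond + " fps");
        }

        ARKitWorldTrackingSessionConfiguration config =
            new ARKitWorldTrackingSessionConfiguration();
        config.videoFormat = formats[0].videoFormatPtr;  // IntPtr for the chosen format

        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfig(config);
    }
}
```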

Open up Assets/UnityARKitPlugin/Examples/ARKit1.5/UnityARVideoFormats/UnityARVideoFormats.scene to see how this is used.  Build and run on device to see a list of buttons corresponding to the video formats available on the device.  Select one of the buttons to reset the session with a new configuration that contains the selected video format.

Set World Origin

This update also introduces functionality for your AR session to set the world coordinate system to a new position and orientation.  As you may know, when an AR session is started, the device’s original position and orientation is used as the origin for the world coordinate system for the ARKit world tracking.  In some cases, you will want to reset the world coordinate system for ARKit world tracking based on some real world reference point.  You can use UnityARSessionNativeInterface.GetARSessionNativeInterface().SetWorldOrigin() and give it a Unity transform which will be used as the origin for the newly set world coordinate system.
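Hooked up to a UI button, the call described above might look like this minimal sketch:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Minimal sketch: reset the ARKit world coordinate system to a chosen
// transform, e.g. from a UI button handler.
public class SetWorldOriginExample : MonoBehaviour
{
    public Transform newOrigin;  // real-world reference point in the scene

    public void OnSetWorldOriginPressed()
    {
        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .SetWorldOrigin(newOrigin);
    }
}
```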

Please open Assets/UnityARKitPlugin/Examples/ARKit1.5/UnityARSetWorldOrigin/UnityARSetWorldOrigin.scene and build it to try it out.  Pressing the “Set World Origin” button will reset the world coordinate system using the device’s current world transform.  You should notice that all the existing anchors get updated to their new positions relative to the new world origin, but any virtual objects that have not been anchored will remain at their original positions relative to the device’s original world coordinate system (they will appear to have moved).

Hit Test Result Types

To support the notion of vertical planes and the detail geometry of the planes, there are now two more entries in the ARHitTestResultType enum.  They are ARHitTestResultTypeEstimatedVerticalPlane and ARHitTestResultTypeExistingPlaneUsingGeometry.  See ARHitTestResultType.cs for descriptions of these.
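A hedged sketch of using the new geometry-aware result type in a hit test follows; the HitTest signature and result fields are assumptions based on the plugin:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

// Hedged sketch: hit test against the detailed geometry of detected
// planes using the new result type (API shapes assumed from the plugin).
public static class HitTestExample
{
    public static void HitTestCenter()
    {
        ARPoint point = new ARPoint { x = 0.5, y = 0.5 };  // normalized screen point

        List<ARHitTestResult> results =
            UnityARSessionNativeInterface.GetARSessionNativeInterface().HitTest(
                point, ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingGeometry);

        foreach (ARHitTestResult result in results)
        {
            Vector3 position = UnityARMatrixOps.GetPosition(result.worldTransform);
            Debug.Log("Hit plane geometry at " + position);
        }
    }
}
```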

Go Forth and Develop

These are some of the interesting new features of ARKit that have been released with this update. For more information on these features, please consult Apple’s ARKit documentation. We hope the simplicity of the Unity ARKit Plugin helps you improve your ARKit apps with these new features.  Looking forward to seeing your updates!  Remember, this update is still in beta, so you will have to wait for its release before you can publish your updated app to the App Store.  Queries should be directed to our forums.
