Developing for ARKit 1.5 update using Unity ARKit Plugin
Unity’s ARKit plugin now supports the new augmented reality (AR) features Apple announced in their Spring 2018 Update. The launch of ARKit with iOS 11 put AR into the hands of hundreds of millions of iPhone and iPad users. With the introduction of ARKit 1.5, Apple aims to provide developers with the tools to power more immersive AR experiences that better integrate with the world.
With Unity’s updated ARKit plugin, developers can now take full advantage of the new ARKit 1.5 tools to create a new generation of AR apps. In addition to horizontal surfaces, you can now detect vertical surfaces, place virtual objects on them, and map irregularly shaped surfaces with more accuracy. You can also find and recognize the position of real-world 2D images and turn them into AR experiences, such as bringing artwork to life. Other new features include relocalization, video formats, autofocus and the ability to reset the world origin.
Download the updated plugin from Bitbucket and read below to start developing with the ARKit Spring 2018 update using Unity.
Setup
Requirements
- iOS device that supports ARKit and that has the latest iOS 11.3 beta image from the Apple developer site installed.
- Mac with macOS 10.13 (High Sierra) or above.
- Unity version 2017.1 or above.
- Latest Xcode 9.3 beta from the Apple Developer website (requires macOS 10.13)
- Latest version of the “spring2018_update” branch of the Unity ARKit Plugin Bitbucket repo.
Steps
- Start Unity Editor 2017.1 or above
- Open “unity-arkit-plugin” project that you downloaded from Bitbucket
- Follow instructions below to load up example scenes or create your own scenes
- Build your scene in Unity to create an Xcode project
- Open the created Xcode project, then build and run it on your ARKit-supported device.
Vertical Planes
One of the most popular new features is vertical planes. When the original ARKit was released, it only detected horizontal planes, and many developers clamored for vertical plane detection. With this update, Apple has answered that call. In Unity, you can see this as an extra entry in the UnityARPlaneDetection enum; we also added another entry in case you want to detect both vertical and horizontal planes at the same time. Finally, we added a “vertical” entry to the ARPlaneAnchorAlignment enum so that you can identify the orientation of each plane anchor detected.
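If you would rather set this up from your own script, here is a minimal sketch of how the new enum value can be plugged into a session configuration. The configuration fields and session call mirror what UnityARCameraManager.cs does on the spring2018_update branch, and the script name is just illustrative, so verify the names against your version of the plugin:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;  // Unity ARKit Plugin namespace

public class VerticalPlaneSessionStarter : MonoBehaviour
{
    void Start()
    {
        // Sketch (not the shipped UnityARCameraManager.cs): ask ARKit to detect
        // both horizontal and vertical planes when the session starts.
        ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration();
        config.planeDetection = UnityARPlaneDetection.HorizontalAndVertical;
        config.alignment = UnityARAlignment.UnityARAlignmentGravity;
        config.enableLightEstimation = true;

        UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
    }
}
```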
Instead of a new example to show this feature, our original “Assets/UnityARKitPlugin/Examples/UnityARKitScene/UnityARKitScene.scene” can be used, along with a slight modification of UnityARCameraManager.cs, to let your configuration detect both horizontal and vertical planes.
In the scene hierarchy, select the ARCameraManager GameObject and choose Horizontal And Vertical in the Inspector window. Plane detection can now detect vertical planes as well.
Here is an example of both kinds of planes being found on a device. Voila! Now you can put up virtual doors and windows on your walls, as well as hang some artwork to decorate them, among other things.
Plane boundaries and detail mesh
With this update, instead of just plane centers and rectangle extents, ARKit now provides a more detailed visualization of the boundary of the plane and also a detailed mesh of the shape of the flat surface detected.
The implementation of this is via an ARPlaneGeometry item that is returned as part of an ARPlaneAnchor. Further examination of this item will show a structure similar to ARFaceGeometry which you will remember from ARKit Face Tracking for iPhone X.
You will be given an array of vectors which represent the boundary points, which can be plugged into a LineRenderer to give a representation of the boundary of the flat surface that has been detected. You will also be given an array of vertices, texture coordinates at those vertices, and a list of indices for the triangles of the detail mesh that represents the shape of the surface.
To see how to use this information, have a look at Assets/UnityARKitPlugin/Examples/ARKit1.5/UnityARPlaneMesh/UnityARPlaneMesh.scene. This scene is set up just like our original UnityARKitScene above, but the GeneratePlanes GameObject references the ARKitPlaneGeometry prefab instead of the debugPlanePrefab.
Have a look at the ARKitPlaneGeometry prefab in the Inspector. This new prefab has both a LineRenderer and a MeshRenderer; it takes the ARPlaneGeometry boundary and shape information from the ARPlaneAnchor and plugs it into those renderers. See how this is actually done in ARKitPlaneMeshRender.cs:

```csharp
public void UpdateMesh(ARPlaneAnchor arPlaneAnchor)
{
    planeMesh.vertices = arPlaneAnchor.planeGeometry.vertices;
    planeMesh.uv = arPlaneAnchor.planeGeometry.textureCoordinates;
    planeMesh.triangles = arPlaneAnchor.planeGeometry.triangleIndices;

    lineRenderer.positionCount = arPlaneAnchor.planeGeometry.boundaryVertexCount;
    lineRenderer.SetPositions (arPlaneAnchor.planeGeometry.boundaryVertices);

    // Assign the mesh object and update it.
    planeMesh.RecalculateBounds();
    planeMesh.RecalculateNormals();
}
```
Now build the above scene, run it on your device, and look for oddly shaped flat surfaces to see if they actually show their shape:
Image Anchors
This is arguably the most important and most complex new feature: it allows you to detect specific images (or markers) in the scene and create an anchor at that spot that describes the position, size and orientation of the marker. Read this section of the ARKit documentation to understand this feature better.
Then open up Assets/UnityARKitPlugin/Examples/ARKit1.5/UnityARImageAnchor/UnityARImageAnchor.scene to follow along.
In Unity, we need to set up the reference images and image sets, and allow our configurations to reference an arbitrary image set for detection. To achieve this, we use two new types of assets: ARReferenceImage and ARReferenceImagesSet. These assets can be created using the Assets/Create/UnityARKitPlugin menu.

Create an ARReferenceImage asset for each image you want your configuration to detect, and fill in the asset with a reference to your actual image asset in the Image Texture field, the physical size of the image as it will appear in the scene, and the name you want to reference this image by. Here we examine an example asset in our scene folder.

Create one of these for each image you want to detect in the scene. Then make a collection of them by creating an ARReferenceImagesSet with references to each of the ARReferenceImage assets you need the collection to contain (increase the Size field and add more reference images as needed). You should also put in a resource group name that we will use to refer to this image set in the AR configuration.
Now, in the ARCameraManager GameObject in your scene, add a reference to the ARReferenceImagesSet that you want to detect in that scene. Here, ARImagesSet_UnityLogo is put in the Detection Images field.
This configuration will now look for all of the ARReferenceImages included in the ARReferenceImagesSet you have referenced. When any image in the set is detected, you will get events that add, update and remove the ARImageAnchor associated with that image. We use a script called GenerateImageAnchor to place the appropriate prefab instance when a particular ARReferenceImage is detected. See the GenerateImageAnchor.cs code for how we leverage the ARImageAnchor events and the ARReferenceImage referenced in each resulting anchor:

```csharp
void Start ()
{
    UnityARSessionNativeInterface.ARImageAnchorAddedEvent += AddImageAnchor;
    UnityARSessionNativeInterface.ARImageAnchorUpdatedEvent += UpdateImageAnchor;
    UnityARSessionNativeInterface.ARImageAnchorRemovedEvent += RemoveImageAnchor;
}

void AddImageAnchor(ARImageAnchor arImageAnchor)
{
    Debug.Log ("image anchor added");
    if (arImageAnchor.referenceImageName == referenceImage.imageName) {
        Vector3 position = UnityARMatrixOps.GetPosition (arImageAnchor.transform);
        Quaternion rotation = UnityARMatrixOps.GetRotation (arImageAnchor.transform);

        imageAnchorGO = Instantiate<GameObject> (prefabToGenerate, position, rotation);
    }
}
```
Now build this scene to Xcode. Examine the Xcode project and you will notice that each ARReferenceImagesSet appears as an AR Resource Group named with the name you used in the corresponding asset:
To see how this works with this scene, print out a copy of the reference image PNG included in the scene folder, making sure its printed size matches the physical size specified for the marker. Then run this scene on the device and point the camera at the printed marker. You should see an instance of the Axes prefab GameObject appear on your marker:
Relocalization
Previously, when you got a phone call or otherwise sent your ARKit app into the background, you would lose your world tracking information and everything would be out of position. With this update, ARKit lets you keep your world tracking information after an interruption, as long as you have not moved far from your previous position. This new feature comes with a tracking state reason that designates the period of time it takes ARKit to relocalize after an interruption: the ARTrackingStateReason enum now has an entry for ARTrackingStateReasonRelocalizing.
This feature is optional: you can make your ARKit app work as it did before this feature existed by setting the property UnityARSessionNativeInterface.ARSessionShouldAttemptRelocalization to false.
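For example, here is a minimal sketch of opting in or out from your own code; the script and method names are illustrative, while the static property is the one named above:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class RelocalizationToggle : MonoBehaviour
{
    // Hypothetical helper: call this (e.g. from a UI toggle) with false to make
    // the app behave as it did before relocalization existed.
    public void SetRelocalization(bool shouldRelocalize)
    {
        UnityARSessionNativeInterface.ARSessionShouldAttemptRelocalization = shouldRelocalize;
    }
}
```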
Open up Assets/UnityARKitPlugin/Examples/ARKit1.5/UnityARKitRelocalize/UnityARKitRelocalize.scene to see how this is done. This scene lets you toggle between the two modes and shows the tracking state and tracking state reason as ARKit relocalizes. See the code in RelocalizationControl.cs for the details.
Autofocus
Originally, the ARKit camera was fixed at infinite focus. With this update, you can choose between infinite focus and autofocus for your ARKit camera. By default, the camera is set to autofocus if you have the latest update of ARKit on your device. To change that, we have introduced a new boolean, enableAutoFocus, that you can set on your configuration before starting ARKit with it.
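As a rough sketch, the boolean can be set on the configuration before running the session; the field name comes from the description above, and the rest of the configuration mirrors the earlier plane detection sketch:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class FixedFocusSessionStarter : MonoBehaviour
{
    void Start()
    {
        // Sketch: start world tracking with autofocus turned off, so the
        // camera keeps the original fixed (infinite) focus behavior.
        ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration();
        config.planeDetection = UnityARPlaneDetection.Horizontal;
        config.enableAutoFocus = false;

        UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
    }
}
```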
Open up Assets/UnityARKitPlugin/Examples/UnityARKitScene/UnityARKitScene.scene and examine the ARCameraManager GameObject, which now has an “Enable Auto Focus” checkbox that is passed down to the ARKitWorldTrackingSessionConfiguration (see UnityARCameraManager.cs). Build and run this scene with the checkbox enabled and disabled to see the difference between the two modes.
Video Formats
Another request from developers was to increase the resolution of the video displayed in an ARKit app. This update provides higher-resolution video for the AR app and also lets you select other video formats depending on the actual device being used.
To enumerate the supported video formats for a device, call the static method UnityARVideoFormat.SupportedVideoFormats(), which returns a list of UnityARVideoFormats, each of which includes width, height and frames per second. Each UnityARVideoFormat also contains an IntPtr that you can use in the videoFormat field of your ARKitWorldTrackingSessionConfiguration to initialize your session with that video format. By default, the highest possible resolution is used on the device.
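Putting that together, here is a rough sketch of enumerating the formats and restarting the session with one of them. The UnityARVideoFormat field names (videoFormatPtr, imageResolutionWidth, imageResolutionHeight, framesPerSecond) and the run options are assumptions based on the description above, so check them against UnityARVideoFormat.cs in your version of the plugin:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

public class VideoFormatPicker : MonoBehaviour
{
    void Start()
    {
        // Sketch: list the video formats this device supports.
        List<UnityARVideoFormat> formats = UnityARVideoFormat.SupportedVideoFormats();
        foreach (UnityARVideoFormat format in formats)
        {
            Debug.LogFormat("{0} x {1} @ {2} fps",
                format.imageResolutionWidth, format.imageResolutionHeight, format.framesPerSecond);
        }

        if (formats.Count > 0)
        {
            // Reset the session with a configuration that uses the first format.
            ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration();
            config.planeDetection = UnityARPlaneDetection.Horizontal;
            config.videoFormat = formats[0].videoFormatPtr;

            UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfigAndOptions(
                config,
                UnityARSessionRunOption.ARSessionRunOptionResetTracking |
                UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors);
        }
    }
}
```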
Open up Assets/UnityARKitPlugin/Examples/ARKit1.5/UnityARVideoFormats/UnityARVideoFormats.scene to see how this is used. Build and run it on a device to see a list of buttons corresponding to the video formats available on that device. Select one of the buttons to reset the session with a new configuration that contains the selected video format:
Set World Origin
This update also introduces functionality for your AR session to set the world coordinate system to a new position and orientation. As you may know, when an AR session is started, the device’s position and orientation at that moment are used as the origin of the world coordinate system for ARKit world tracking. In some cases, you will want to reset that coordinate system based on some real-world reference point. To do this, call UnityARSessionNativeInterface.GetARSessionNativeInterface().SetWorldOrigin() and give it a Unity transform, which will be used as the origin of the newly set world coordinate system.
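A minimal sketch of wiring this up, assuming a hypothetical script with a reference to whatever Transform you want to treat as the new origin:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class WorldOriginResetter : MonoBehaviour
{
    // Hypothetical reference: any Transform in the scene (for example, one
    // placed on a detected anchor) that should become the new world origin.
    public Transform newOrigin;

    public void ResetWorldOrigin()
    {
        // Make ARKit treat this transform's position and orientation as the
        // origin of the world coordinate system.
        UnityARSessionNativeInterface.GetARSessionNativeInterface().SetWorldOrigin(newOrigin);
    }
}
```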
Please open Assets/UnityARKitPlugin/Examples/ARKit1.5/UnityARSetWorldOrigin/UnityARSetWorldOrigin.scene and build it to try it out. Pressing the “Set World Origin” button resets the world coordinate system using the device’s current transform. You should notice that all existing anchors get updated to their new positions relative to the new world origin, but any virtual objects that have not been anchored stay at their original positions relative to the device’s original world coordinate system (they will appear to have moved).
Hit Test Result Types
To support the notion of vertical planes and the detailed geometry of planes, there are now two more entries in the ARHitTestResultType enum: ARHitTestResultTypeEstimatedVerticalPlane and ARHitTestResultTypeExistingPlaneUsingGeometry. See ARHitTestResultType.cs for descriptions of these.
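For instance, here is a sketch of a hit test against the detailed plane geometry, modeled on the plugin's existing hit test example; the script itself and the logging are illustrative, not part of the update:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

public class GeometryHitTest : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        // Convert the touch position to viewport coordinates for the hit test.
        Vector3 viewportPoint = Camera.main.ScreenToViewportPoint(Input.GetTouch(0).position);
        ARPoint point = new ARPoint { x = viewportPoint.x, y = viewportPoint.y };

        // Hit test against detected planes using their detailed geometry.
        List<ARHitTestResult> results = UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .HitTest(point, ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingGeometry);

        foreach (ARHitTestResult result in results)
        {
            Debug.Log("Hit plane geometry at " + UnityARMatrixOps.GetPosition(result.worldTransform));
        }
    }
}
```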
Go Forth and Develop
These are some of the interesting new features of ARKit released with this update. For more information on these features, please consult Apple’s ARKit documentation. Please leverage the simplicity of the Unity ARKit Plugin to improve your ARKit apps with these new features; we look forward to seeing your updates! Remember, this update is still in beta, so you will have to wait for its official release before you can publish your updated app to the App Store. Queries should be directed to our forums.