I spent a while thinking about how to start writing about ARKit (not that I was just daydreaming instead of studying), and decided the best place to begin is a walkthrough of the classes ARKit exposes.
```objc
// First, a look at the classes ARKit exposes
#import <ARKit/ARError.h>
#import <ARKit/ARSession.h>
#import <ARKit/ARConfiguration.h>
#import <ARKit/ARFrame.h>
#import <ARKit/ARCamera.h>
#import <ARKit/ARHitTestResult.h>
#import <ARKit/ARLightEstimate.h>
#import <ARKit/ARPointCloud.h>
#import <ARKit/ARAnchor.h>
#import <ARKit/ARPlaneAnchor.h>
#import <ARKit/ARFaceAnchor.h>
#import <ARKit/ARFaceGeometry.h>
#import <ARKit/ARSCNView.h>
#import <ARKit/ARSKView.h>
```
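Before going through the classes one by one, here is a minimal sketch of how the main pieces fit together: an ARSCNView owns an ARSession, and an ARConfiguration subclass tells that session what to track. The view controller name `ARDemoViewController` is made up for illustration; the ARKit calls themselves are the standard API.

```objc
#import <UIKit/UIKit.h>
#import <ARKit/ARKit.h>

// Hypothetical view controller for illustration; only the ARKit calls are real API.
@interface ARDemoViewController : UIViewController
@property (nonatomic, strong) ARSCNView *sceneView;
@end

@implementation ARDemoViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // ARSCNView owns its own ARSession and draws the camera feed behind SceneKit content.
    self.sceneView = [[ARSCNView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:self.sceneView];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // An ARConfiguration subclass describes what kind of tracking the session should run.
    ARWorldTrackingConfiguration *configuration = [ARWorldTrackingConfiguration new];
    configuration.planeDetection = ARPlaneDetectionHorizontal;
    [self.sceneView.session runWithConfiguration:configuration];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    // Stop tracking while the view is off screen.
    [self.sceneView.session pause];
}

@end
```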
### A quick rundown of the ARKit classes
##### ARError
As the name suggests, this defines the error codes an AR session can report.
```objc
typedef NS_ERROR_ENUM(ARErrorDomain, ARErrorCode) {
    /** Unsupported configuration. */
    ARErrorCodeUnsupportedConfiguration = 100,
    /** A sensor required to run the session is not available. */
    ARErrorCodeSensorUnavailable = 101,
    /** A sensor failed to provide the required input. */
    ARErrorCodeSensorFailed = 102,
    /** App does not have permission to use the camera. The user may change this in Settings. */
    ARErrorCodeCameraUnauthorized = 103,
    /** World tracking has encountered a fatal error. */
    ARErrorCodeWorldTrackingFailed = 200,
};
```
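When the session fails, one of these codes arrives wrapped in an NSError in the ARErrorDomain, via the ARSessionObserver callback `session:didFailWithError:`. A minimal sketch, assuming `self` adopts that protocol (for example through ARSCNViewDelegate); the handling shown is only illustrative.

```objc
// Inside the @implementation of a class that is the session's delegate
// and adopts ARSessionObserver.
- (void)session:(ARSession *)session didFailWithError:(NSError *)error {
    if (![error.domain isEqualToString:ARErrorDomain]) {
        return;
    }
    switch ((ARErrorCode)error.code) {
        case ARErrorCodeCameraUnauthorized:
            // The session cannot run without camera access; send the user to Settings.
            NSLog(@"Camera permission missing: %@", error.localizedDescription);
            break;
        case ARErrorCodeUnsupportedConfiguration:
            // e.g. a configuration this device cannot run.
            NSLog(@"Configuration not supported on this device: %@", error.localizedDescription);
            break;
        default:
            // Sensor or tracking failures: surface the error and let the user retry.
            NSLog(@"AR session failed: %@", error.localizedDescription);
            break;
    }
}
```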
#### Camera and Scene Details
##### ARFrame
An ARFrame is a video image, together with position-tracking information, captured as part of an AR session.
All *properties* of this class are readonly.
```objc
//---Accessing Captured Video Frames---
@property (nonatomic, readonly) CVPixelBufferRef capturedImage;
@property (nonatomic, readonly) NSTimeInterval timestamp;
@property (nonatomic, strong, readonly, nullable) AVDepthData *capturedDepthData;
@property (nonatomic, readonly) NSTimeInterval capturedDepthDataTimestamp;
//---Examining Scene Parameters---
@property (nonatomic, copy, readonly) ARCamera *camera;
@property (nonatomic, copy, readonly) NSArray<ARAnchor *> *anchors;
@property (nonatomic, strong, readonly, nullable) ARLightEstimate *lightEstimate;
@property (nonatomic, strong, readonly, nullable) ARPointCloud *rawFeaturePoints;
```
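The usual way to consume frames is through the session delegate. A minimal sketch, assuming `self` is the session's delegate and adopts ARSessionDelegate; it just reads a few of the properties listed above, and the logging is only illustrative.

```objc
// Inside the @implementation of the class set as session.delegate (ARSessionDelegate).
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    // Raw camera image for this frame, e.g. for custom rendering or Vision requests.
    CVPixelBufferRef pixelBuffer = frame.capturedImage;

    // Camera pose in world space as a 4x4 transform.
    simd_float4x4 cameraTransform = frame.camera.transform;

    // Ambient light estimate; nil unless light estimation is enabled on the configuration.
    if (frame.lightEstimate != nil) {
        NSLog(@"Ambient intensity: %.0f", frame.lightEstimate.ambientIntensity);
    }

    NSLog(@"Frame at %.3fs, buffer %p, camera x = %.2f",
          frame.timestamp, pixelBuffer, cameraTransform.columns[3].x);
}
```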
##### ARCamera
Describes the camera that captured a frame: its position and orientation in world space, its tracking state, and imaging parameters such as the projection matrix.
##### ARLightEstimate
An estimate of the ambient lighting in the captured scene (intensity and color temperature), exposed per frame through ARFrame's lightEstimate property.
##### ARDirectionalLightEstimate
The ARLightEstimate subclass used with face tracking; in addition to ambient intensity it estimates the primary light's direction and intensity.
END