How to Design a General-Purpose, Stable Audio/Video Framework with AVFoundation

By 欣東 (Xindong)

Preface

Following up on my previous post, a reading guide to 《AV Foundation開發祕籍——實踐掌握iOS & OS X應用的視聽處理技術》 (the Chinese edition of Learning AV Foundation), this article explains how to design a general-purpose, stable audio/video framework on top of AVFoundation.

Core Idea

An AVCaptureSession drives the capture task. AVCaptureDeviceInput objects define the capture inputs (the various cameras), the different Data output classes in AVFoundation deliver the captured data (metadata, video frames, audio frames), and an AVAssetWriter runs the write task that archives the audio/video data into a media file.
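In outline, the pipeline looks like the minimal sketch below; configuration details, threading, and error handling are deliberately omitted here and covered in the sections that follow.

// Minimal sketch of the capture pipeline described above.
AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Input: a camera device wrapped in an AVCaptureDeviceInput
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if ([session canAddInput:input]) [session addInput:input];

// Output: sample buffers (video frames) delivered to a delegate
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:videoOutput]) [session addOutput:videoOutput];

[session startRunning];
// The delegate receives CMSampleBufferRef frames and hands them to an
// AVAssetWriter, which archives them into a media file (see the write module).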

Features

Live video preview, recording and archiving, photo capture, camera switching, face detection, frame-rate configuration, and detailed camera configuration.

Source Code

github.com/caixindong/…

Design Details

Core modules: XDCaptureService & XDVideoWritter

XDCaptureService is the single entry point of the public API and the core class of the framework; it configures the audio/video inputs and outputs and schedules all the work. XDVideoWritter is the audio/video writing module; it provides the primitive write and archive operations and is not exposed publicly. The public API:

@class XDCaptureService;

@protocol XDCaptureServiceDelegate <NSObject>

@optional

// Service lifecycle
- (void)captureServiceDidStartService:(XDCaptureService *)service;
- (void)captureService:(XDCaptureService *)service serviceDidFailWithError:(NSError *)error;
- (void)captureServiceDidStopService:(XDCaptureService *)service;
- (void)captureService:(XDCaptureService *)service getPreviewLayer:(AVCaptureVideoPreviewLayer *)previewLayer;
- (void)captureService:(XDCaptureService *)service outputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

// Recording
- (void)captureServiceRecorderDidStart:(XDCaptureService *)service;
- (void)captureService:(XDCaptureService *)service recorderDidFailWithError:(NSError *)error;
- (void)captureServiceRecorderDidStop:(XDCaptureService *)service;

// Photo capture
- (void)captureService:(XDCaptureService *)service capturePhoto:(UIImage *)photo;

// Face detection
- (void)captureService:(XDCaptureService *)service outputFaceDetectData:(NSArray<AVMetadataFaceObject *> *)faces;

// Depth data
- (void)captureService:(XDCaptureService *)service captureTrueDepth:(AVDepthData *)depthData API_AVAILABLE(ios(11.0));

@end

@protocol XDCaptureServicePreViewSource <NSObject>
- (AVCaptureVideoPreviewLayer *)preViewLayerSource;
@end

@interface XDCaptureService : NSObject

// Whether to record audio; default is NO
@property (nonatomic, assign) BOOL shouldRecordAudio;
// Native iOS face detection; default is NO
@property (nonatomic, assign) BOOL openNativeFaceDetect;
// Camera position; default is AVCaptureDevicePositionFront (front camera)
@property (nonatomic, assign) AVCaptureDevicePosition devicePosition;
// Whether depth capture is supported; currently only the rear cameras of the 7 Plus / 8 Plus / X and the front camera of the X, and iOS 11+ is required
@property (nonatomic, assign, readonly) BOOL depthSupported;
// Whether to enable depth capture; default is NO
@property (nonatomic, assign) BOOL openDepth;
// Only these sessionPresets produce depth data: AVCaptureSessionPresetPhoto, AVCaptureSessionPreset1280x720, AVCaptureSessionPreset640x480
@property (nonatomic, assign) AVCaptureSessionPreset sessionPreset;
// Frame rate; default is 30
@property (nonatomic, assign) int frameRate;
// Temporary location of the recorded video; it is recommended to move the file after each recording
@property (nonatomic, strong, readonly) NSURL *recordURL;
// If preViewSource is set, no AVCaptureVideoPreviewLayer is created internally
@property (nonatomic, assign) id<XDCaptureServicePreViewSource> preViewSource;
@property (nonatomic, assign) id<XDCaptureServiceDelegate> delegate;
@property (nonatomic, assign, readonly) BOOL isRunning;
// Video encoding settings (affect the codec and size of the recorded video)
@property (nonatomic, strong) NSDictionary *videoSetting;

/// Advanced camera settings; leave untouched unless you have specific needs
// ISO (iOS 8+)
@property (nonatomic, assign, readonly) CGFloat deviceISO;
@property (nonatomic, assign, readonly) CGFloat deviceMinISO;
@property (nonatomic, assign, readonly) CGFloat deviceMaxISO;
// Lens aperture
@property (nonatomic, assign, readonly) CGFloat deviceAperture;
// Exposure
@property (nonatomic, assign, readonly) BOOL supportsTapToExpose;
@property (nonatomic, assign) AVCaptureExposureMode exposureMode;
@property (nonatomic, assign) CGPoint exposurePoint;
@property (nonatomic, assign, readonly) CMTime deviceExposureDuration;
// Focus
@property (nonatomic, assign, readonly) BOOL supportsTapToFocus;
@property (nonatomic, assign) AVCaptureFocusMode focusMode;
@property (nonatomic, assign) CGPoint focusPoint;
// White balance
@property (nonatomic, assign) AVCaptureWhiteBalanceMode whiteBalanceMode;
// Torch
@property (nonatomic, assign, readonly) BOOL hasTorch;
@property (nonatomic, assign) AVCaptureTorchMode torchMode;
// Flash
@property (nonatomic, assign, readonly) BOOL hasFlash;
@property (nonatomic, assign) AVCaptureFlashMode flashMode;

// Camera permission check
+ (BOOL)videoGranted;
// Microphone permission check
+ (BOOL)audioGranted;
// Switch between front and rear camera
- (void)switchCamera;
// Start the service
- (void)startRunning;
// Stop the service
- (void)stopRunning;
// Start recording
- (void)startRecording;
// Cancel recording
- (void)cancleRecording;
// Stop recording
- (void)stopRecording;
// Capture a photo
- (void)capturePhoto;

@end
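For orientation, here is a hypothetical call site; the view controller and its wiring are illustrative assumptions, not part of the framework:

// Hypothetical client: wire up the service, show the preview, observe recording.
@interface XDCameraViewController : UIViewController <XDCaptureServiceDelegate>
@property (nonatomic, strong) XDCaptureService *service;
@end

@implementation XDCameraViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    if (![XDCaptureService videoGranted]) {
        return; // camera permission is required
    }
    self.service = [[XDCaptureService alloc] init];
    self.service.shouldRecordAudio = YES;
    self.service.delegate = self;
    [self.service startRunning];
}

// Called once the internal AVCaptureVideoPreviewLayer has been created
- (void)captureService:(XDCaptureService *)service getPreviewLayer:(AVCaptureVideoPreviewLayer *)previewLayer {
    dispatch_async(dispatch_get_main_queue(), ^{
        previewLayer.frame = self.view.bounds;
        [self.view.layer addSublayer:previewLayer];
    });
}

- (void)captureServiceRecorderDidStop:(XDCaptureService *)service {
    NSLog(@"finished recording: %@", service.recordURL);
}

@end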

GCD Queue Separation

Starting audio/video capture and performing audio/video reads and writes on the main thread would block it, so these tasks must be dispatched onto background threads. We use GCD queues for this work. The framework is configured with three queues: sessionQueue, writtingQueue, and outputQueue. All three are serial queues, because audio/video operations are order-sensitive (they have timing requirements), and a serial queue guarantees that only one operation (configuration, write, read) runs at a time. sessionQueue schedules the starting of capture tasks, writtingQueue schedules writes so that data frames are archived into the file correctly, and outputQueue handles the delivery of output frames.

@property (nonatomic, strong) dispatch_queue_t sessionQueue;
@property (nonatomic, strong) dispatch_queue_t writtingQueue;
@property (nonatomic, strong) dispatch_queue_t outputQueue;

_sessionQueue = dispatch_queue_create("com.caixindong.captureservice.session", DISPATCH_QUEUE_SERIAL);
_writtingQueue = dispatch_queue_create("com.caixindong.captureservice.writting", DISPATCH_QUEUE_SERIAL);
_outputQueue = dispatch_queue_create("com.caixindong.captureservice.output", DISPATCH_QUEUE_SERIAL);

Audio/Video Capture

Initializing the capture session

sessionPreset specifies the resolution of the output video frames, e.g. 640×480.

@property (nonatomic, strong) AVCaptureSession *captureSession;

_captureSession = [[AVCaptureSession alloc] init];
_captureSession.sessionPreset = _sessionPreset;

Configuring the capture input

First obtain the input device: _cameraWithPosition returns the abstract representation of a camera. Because the TrueDepth (infrared) camera and the dual camera are only available through newer APIs, the method already handles backward compatibility. The device is then used to create the capture input, an AVCaptureDeviceInput.

@property (nonatomic, strong) AVCaptureDeviceInput *videoInput;

- (BOOL)_setupVideoInputOutput:(NSError **)error {
    self.currentDevice = [self _cameraWithPosition:_devicePosition];
    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_currentDevice error:error];
    if (_videoInput) {
        if ([_captureSession canAddInput:_videoInput]) {
            [_captureSession addInput:_videoInput];
        } else {
            *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2200 userInfo:@{NSLocalizedDescriptionKey:@"add video input fail"}];
            return NO;
        }
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2201 userInfo:@{NSLocalizedDescriptionKey:@"video input is nil"}];
        return NO;
    }

    // Stabilize the frame rate
    CMTime frameDuration = CMTimeMake(1, _frameRate);
    if ([_currentDevice lockForConfiguration:error]) {
        _currentDevice.activeVideoMaxFrameDuration = frameDuration;
        _currentDevice.activeVideoMinFrameDuration = frameDuration;
        [_currentDevice unlockForConfiguration];
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2203 userInfo:@{NSLocalizedDescriptionKey:@"device lock fail(input)"}];
        return NO;
    }
    // ……Other code
}

- (AVCaptureDevice *)_cameraWithPosition:(AVCaptureDevicePosition)position {
    if (@available(iOS 10.0, *)) {
        // AVCaptureDeviceTypeBuiltInWideAngleCamera: default wide-angle camera
        // AVCaptureDeviceTypeBuiltInTelephotoCamera: telephoto camera
        // AVCaptureDeviceTypeBuiltInDualCamera: rear dual camera
        // AVCaptureDeviceTypeBuiltInTrueDepthCamera: front TrueDepth (infrared) camera
        NSMutableArray *mulArr = [NSMutableArray arrayWithObjects:AVCaptureDeviceTypeBuiltInWideAngleCamera, AVCaptureDeviceTypeBuiltInTelephotoCamera, nil];
        if (@available(iOS 10.2, *)) {
            [mulArr addObject:AVCaptureDeviceTypeBuiltInDualCamera];
        }
        if (@available(iOS 11.1, *)) {
            [mulArr addObject:AVCaptureDeviceTypeBuiltInTrueDepthCamera];
        }
        AVCaptureDeviceDiscoverySession *discoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:[mulArr copy] mediaType:AVMediaTypeVideo position:position];
        return discoverySession.devices.firstObject;
    } else {
        NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        for (AVCaptureDevice *device in videoDevices) {
            if (device.position == position) {
                return device;
            }
        }
    }
    return nil;
}

Configuring the capture outputs

Depending on the features we need, we add different outputs to the capture session: AVCaptureVideoDataOutput to capture raw video frames, AVCaptureAudioDataOutput to capture audio data, and AVCaptureMetadataOutput to capture face data. Since configuring the audio output is much the same as configuring the video output, only the key video code is shown here. A few key design points:

1. Because of the camera sensor, the output video stream is rotated by 90 degrees, so we configure the rotation on the videoConnection that links us to the output.
2. Video frames (and audio frames) are delivered as CMSampleBufferRef values, and a frame may pass through several consumers (written to file, or handed up to the caller), so each consumer retains the sample buffer before processing it; this keeps every pipeline's reference to the frame independent (see _processVideoData).
3. To release temporaries promptly (the various operations on a video frame can need a fair amount of memory), the frame delivery to the caller is wrapped in an autorelease pool to prevent memory spikes.

@property (nonatomic, strong) AVCaptureVideoDataOutput *videoOutput;

- (BOOL)_setupVideoInputOutput:(NSError **)error {
    // ……Other code
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    _videoOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
    // Drop frames that arrive late
    _videoOutput.alwaysDiscardsLateVideoFrames = YES;
    dispatch_queue_t videoQueue = dispatch_queue_create("com.caixindong.captureservice.video", DISPATCH_QUEUE_SERIAL);
    // Set the data output delegate
    [_videoOutput setSampleBufferDelegate:self queue:videoQueue];
    if ([_captureSession canAddOutput:_videoOutput]) {
        [_captureSession addOutput:_videoOutput];
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2204 userInfo:@{NSLocalizedDescriptionKey:@"add video output fail"}];
        return NO;
    }
    self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    // The recorded video is rotated 90 degrees because of the camera sensor,
    // so set the orientation of the output video stream here
    if (_videoConnection.isVideoOrientationSupported) {
        _videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    return YES;
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate && AVCaptureAudioDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Callbacks may arrive on different threads
    if (connection == _videoConnection) {
        @synchronized(self) {
            [self _processVideoData:sampleBuffer];
        }
    } else if (connection == _audioConnection) {
        @synchronized(self) {
            [self _processAudioData:sampleBuffer];
        }
    }
}

#pragma mark - process Data

- (void)_processVideoData:(CMSampleBufferRef)sampleBuffer {
    // CFRetain ensures each consumer (file writing, frame delivery) holds its own reference to the sampleBuffer
    if (_videoWriter && _videoWriter.isWriting) {
        CFRetain(sampleBuffer);
        dispatch_async(_writtingQueue, ^{
            [_videoWriter appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        });
    }
    CFRetain(sampleBuffer);
    // Drain temporaries promptly to prevent memory spikes
    dispatch_async(_outputQueue, ^{
        @autoreleasepool {
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:outputSampleBuffer:)]) {
                [self.delegate captureService:self outputSampleBuffer:sampleBuffer];
            }
        }
        CFRelease(sampleBuffer);
    });
}

Configuring the still image output

The still image output exists to implement photo capture; through setOutputSettings we configure the format of the output image.

@property (nonatomic, strong) AVCaptureStillImageOutput *imageOutput;

- (BOOL)_setupImageOutput:(NSError **)error {
    self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSetting = @{AVVideoCodecKey: AVVideoCodecJPEG};
    [_imageOutput setOutputSettings:outputSetting];
    if ([_captureSession canAddOutput:_imageOutput]) {
        [_captureSession addOutput:_imageOutput];
        return YES;
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.image" code:-2205 userInfo:@{NSLocalizedDescriptionKey:@"add image output fail"}];
        return NO;
    }
}

// Photo capture
- (void)capturePhoto {
    AVCaptureConnection *connection = [_imageOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    __weak typeof(self) weakSelf = self;
    [_imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef _Nullable imageDataSampleBuffer, NSError * _Nullable error) {
        __strong typeof(weakSelf) strongSelf = weakSelf;
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
            if (strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(captureService:capturePhoto:)]) {
                [strongSelf.delegate captureService:strongSelf capturePhoto:image];
            }
        }
    }];
}
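Note that AVCaptureStillImageOutput is deprecated as of iOS 10 in favor of AVCapturePhotoOutput. A minimal sketch of the replacement follows; the _setupPhotoOutput method, the photoOutput property, and the reuse of error code -2205 are assumptions for illustration, not part of the framework:

// Hypothetical replacement for _setupImageOutput using AVCapturePhotoOutput
- (BOOL)_setupPhotoOutput:(NSError **)error {
    self.photoOutput = [[AVCapturePhotoOutput alloc] init]; // assumed property
    if ([_captureSession canAddOutput:_photoOutput]) {
        [_captureSession addOutput:_photoOutput];
        return YES;
    }
    *error = [NSError errorWithDomain:@"com.caixindong.captureservice.image" code:-2205 userInfo:@{NSLocalizedDescriptionKey:@"add photo output fail"}];
    return NO;
}

- (void)capturePhoto {
    AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey: AVVideoCodecJPEG}];
    [_photoOutput capturePhotoWithSettings:settings delegate:self];
}

#pragma mark - AVCapturePhotoCaptureDelegate (iOS 11+)

- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (!error) {
        UIImage *image = [UIImage imageWithData:photo.fileDataRepresentation];
        // Hand the image up to the delegate, as capturePhoto: does above
        if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:capturePhoto:)]) {
            [self.delegate captureService:self capturePhoto:image];
        }
    }
}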

Configuring the face metadata output

The key is to configure a face metadata output, AVCaptureMetadataOutput, with metadataObjectTypes set to AVMetadataObjectTypeFace. The captured face data covers every face in the current video frame, and from it you can extract each face's bounds, position, and rotation angles. One caveat: the raw face data is in the camera's coordinate system and must be converted to screen coordinates before the business layer can use it conveniently; see the face data output code below.
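The post shows only the delegate callback below; a minimal sketch of the output setup itself, assuming the conventions of the other _setup methods (the method name and error code here are assumptions), would be:

// Hypothetical setup for the face metadata output
- (BOOL)_setupMetadataOutput:(NSError **)error {
    self.metadataOutput = [[AVCaptureMetadataOutput alloc] init];
    if ([_captureSession canAddOutput:_metadataOutput]) {
        [_captureSession addOutput:_metadataOutput];
        // metadataObjectTypes can only be set after the output joins the session
        _metadataOutput.metadataObjectTypes = @[AVMetadataObjectTypeFace];
        [_metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
        return YES;
    }
    *error = [NSError errorWithDomain:@"com.caixindong.captureservice.metadata" code:-2206 userInfo:@{NSLocalizedDescriptionKey:@"add metadata output fail"}];
    return NO;
}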

@property (nonatomic, strong) AVCaptureMetadataOutput *metadataOutput;

- (void)captureOutput:(AVCaptureOutput *)output didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    NSMutableArray *transformedFaces = [NSMutableArray array];
    for (AVMetadataObject *face in metadataObjects) {
        @autoreleasepool {
            // Convert from device coordinates to the preview layer's (screen) coordinates
            AVMetadataFaceObject *transformedFace = (AVMetadataFaceObject *)[self.previewLayer transformedMetadataObjectForMetadataObject:face];
            if (transformedFace) {
                [transformedFaces addObject:transformedFace];
            }
        }
    }
    @autoreleasepool {
        if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:outputFaceDetectData:)]) {
            [self.delegate captureService:self outputFaceDetectData:[transformedFaces copy]];
        }
    }
}

Configuring the preview source

There are two options: either the caller provides the preview source by implementing the preview-source protocol, or the framework creates its own AVCaptureVideoPreviewLayer and uses it as the preview source.

if (self.preViewSource && [self.preViewSource respondsToSelector:@selector(preViewLayerSource)]) {
    self.previewLayer = [self.preViewSource preViewLayerSource];
    [_previewLayer setSession:_captureSession];
    [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
} else {
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    // Fill the whole screen
    [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:getPreviewLayer:)]) {
        [self.delegate captureService:self getPreviewLayer:_previewLayer];
    }
}
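For the first case, the host app supplies the layer itself. A hypothetical implementation of XDCaptureServicePreViewSource (the view controller is an assumption; the app would also set service.preViewSource = self before starting) could look like:

// Hypothetical host view controller acting as the preview source
@interface XDPreviewViewController : UIViewController <XDCaptureServicePreViewSource>
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end

@implementation XDPreviewViewController

- (AVCaptureVideoPreviewLayer *)preViewLayerSource {
    if (!_previewLayer) {
        // The service attaches its AVCaptureSession to this layer via setSession:
        _previewLayer = [[AVCaptureVideoPreviewLayer alloc] init];
        _previewLayer.frame = self.view.bounds;
        [self.view.layer addSublayer:_previewLayer];
    }
    return _previewLayer;
}

@end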

Handling foreground/background state changes

iOS's audio/video foreground/background mechanics are complex, with many lifecycle transitions. To guarantee the framework does the right thing in the right state, we decouple the state handling of reading frames from that of writing frames: each side maintains its own notification handling for state changes. The outer business layer never needs to observe AVFoundation notifications or handle video-stream state changes by hand. Notification setup for the read (capture) module:

// CaptureService and VideoWritter each manage their own lifecycle: the capture state
// and the writing state are decoupled, state transitions are handled inside the
// service, and the outer business layer never touches the video stream state by hand.
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_captureSessionNotification:) name:nil object:self.captureSession];

// Compatibility for iOS < 9: if the app is backgrounded before session start finishes,
// AVCaptureSessionRuntimeErrorNotification fires on returning to the foreground and the
// session must be restarted manually. Since iOS 9 the system caches the pending session
// start and replays it automatically on foregrounding, so no manual call is needed.
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_enterForegroundNotification:) name:UIApplicationWillEnterForegroundNotification object:nil];

#pragma mark - CaptureSession Notification

- (void)_captureSessionNotification:(NSNotification *)notification {
    NSLog(@"_captureSessionNotification:%@", notification.name);
    if ([notification.name isEqualToString:AVCaptureSessionDidStartRunningNotification]) {
        if (!_firstStartRunning) {
            NSLog(@"session start running");
            _firstStartRunning = YES;
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureServiceDidStartService:)]) {
                [self.delegate captureServiceDidStartService:self];
            }
        } else {
            NSLog(@"session resume running");
        }
    } else if ([notification.name isEqualToString:AVCaptureSessionDidStopRunningNotification]) {
        if (!_isRunning) {
            NSLog(@"session stop running");
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureServiceDidStopService:)]) {
                [self.delegate captureServiceDidStopService:self];
            }
        } else {
            NSLog(@"interrupted session stop running");
        }
    } else if ([notification.name isEqualToString:AVCaptureSessionWasInterruptedNotification]) {
        NSLog(@"session was interrupted, userInfo: %@", notification.userInfo);
    } else if ([notification.name isEqualToString:AVCaptureSessionInterruptionEndedNotification]) {
        NSLog(@"session interruption ended");
    } else if ([notification.name isEqualToString:AVCaptureSessionRuntimeErrorNotification]) {
        NSError *error = notification.userInfo[AVCaptureSessionErrorKey];
        if (error.code == AVErrorDeviceIsNotAvailableInBackground) {
            NSLog(@"session runtime error : AVErrorDeviceIsNotAvailableInBackground");
            _startSessionOnEnteringForeground = YES;
        } else {
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:serviceDidFailWithError:)]) {
                [self.delegate captureService:self serviceDidFailWithError:error];
            }
        }
    } else {
        NSLog(@"handle other notification : %@", notification.name);
    }
}

#pragma mark - UIApplicationWillEnterForegroundNotification

- (void)_enterForegroundNotification:(NSNotification *)notification {
    if (_startSessionOnEnteringForeground == YES) {
        // iOS < 9: the session start was interrupted by backgrounding, so restart it manually
        NSLog(@"restarting session after returning to foreground (iOS < 9 compatibility)");
        _startSessionOnEnteringForeground = NO;
        [self startRunning];
    }
}

Notification setup for the write module:

// The write module registers only for the notifications relevant to writing
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_assetWritterInterruptedNotification:) name:AVCaptureSessionWasInterruptedNotification object:nil];

- (void)_assetWritterInterruptedNotification:(NSNotification *)notification {
    NSLog(@"assetWritterInterruptedNotification");
    [self cancleWriting];
}

Starting & stopping capture

Start asynchronously to avoid blocking the main thread. Starting and stopping both run on the serial queue, which guarantees that abnormal cases such as a stop arriving while a start is still in progress cannot occur.

- (void)startRunning {
    dispatch_async(_sessionQueue, ^{
        NSError *error = nil;
        BOOL result = [self _setupSession:&error];
        if (result) {
            _isRunning = YES;
            [_captureSession startRunning];
        } else {
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:serviceDidFailWithError:)]) {
                [self.delegate captureService:self serviceDidFailWithError:error];
            }
        }
    });
}

- (void)stopRunning {
    dispatch_async(_sessionQueue, ^{
        _isRunning = NO;
        NSError *error = nil;
        [self _clearVideoFile:&error];
        if (error) {
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:serviceDidFailWithError:)]) {
                [self.delegate captureService:self serviceDidFailWithError:error];
            }
        }
        [_captureSession stopRunning];
    });
}

Switching cameras

Switching is more than just swapping the device: the old capture input must be removed and the new device input added. The videoConnection also changes when the camera is switched, so it has to be fetched again.

- (void)switchCamera {
    if (_openDepth) {
        return;
    }
    NSError *error;
    AVCaptureDevice *videoDevice = [self _inactiveCamera];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (videoInput) {
        [_captureSession beginConfiguration];
        [_captureSession removeInput:self.videoInput];
        if ([self.captureSession canAddInput:videoInput]) {
            [self.captureSession addInput:videoInput];
            self.videoInput = videoInput;
            // The videoConnection changes after switching cameras, so fetch it again
            self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
            if (_videoConnection.isVideoOrientationSupported) {
                _videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
            }
        } else {
            [self.captureSession addInput:self.videoInput];
        }
        [self.captureSession commitConfiguration];
    }
    _devicePosition = (_devicePosition == AVCaptureDevicePositionFront) ? AVCaptureDevicePositionBack : AVCaptureDevicePositionFront;
}

- (AVCaptureDevice *)_inactiveCamera {
    AVCaptureDevice *device = nil;
    if (_devicePosition == AVCaptureDevicePositionBack) {
        device = [self _cameraWithPosition:AVCaptureDevicePositionFront];
    } else {
        device = [self _cameraWithPosition:AVCaptureDevicePositionBack];
    }
    return device;
}

Recording

videoSetting configures the codec of the recorded video. The framework defaults to H.264, an efficient and widely used video codec (a future article will cover it in more detail; for now it is enough to know it is a common codec, and you can browse AVVideoCodecType for the other options). In XDCaptureService's startRecording method we initialize the write module, XDVideoWritter, which configures its encoders according to videoSetting.
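For reference, a plausible videoSetting for 720p H.264 output might look like this; the keys are standard AVFoundation keys, while the concrete values are illustrative assumptions rather than the framework's defaults:

// Illustrative H.264 encoder settings; tune dimensions and bitrate to your needs.
NSDictionary *videoSetting = @{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: @720,
    AVVideoHeightKey: @1280,
    AVVideoCompressionPropertiesKey: @{
        AVVideoAverageBitRateKey: @(2 * 1000 * 1000) // ~2 Mbps
    }
};
service.videoSetting = videoSetting; // 'service' is an XDCaptureService instance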

- (void)startRecording {
    dispatch_async(_writtingQueue, ^{
        @synchronized(self) {
            NSString *videoFilePath = [_videoDir stringByAppendingPathComponent:[NSString stringWithFormat:@"Record-%llu.mp4", mach_absolute_time()]];
            _recordURL = [[NSURL alloc] initFileURLWithPath:videoFilePath];
            if (_recordURL) {
                _videoWriter = [[XDVideoWritter alloc] initWithURL:_recordURL VideoSettings:_videoSetting audioSetting:_audioSetting];
                _videoWriter.delegate = self;
                [_videoWriter startWriting];
                if (self.delegate && [self.delegate respondsToSelector:@selector(captureServiceRecorderDidStart:)]) {
                    [self.delegate captureServiceRecorderDidStart:self];
                }
            } else {
                NSLog(@"No record URL");
            }
        }
    });
}

//XDVideoWritter.m
- (void)startWriting {
    if (_assetWriter) {
        _assetWriter = nil;
    }
    NSError *error = nil;
    NSString *fileType = AVFileTypeMPEG4;
    _assetWriter = [[AVAssetWriter alloc] initWithURL:_outputURL fileType:fileType error:&error];
    if (!_assetWriter || error) {
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
            [self.delegate videoWritter:self didFailWithError:error];
        }
    }
    if (_videoSetting) {
        _videoInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:_videoSetting];
        _videoInput.expectsMediaDataInRealTime = YES;
        if ([_assetWriter canAddInput:_videoInput]) {
            [_assetWriter addInput:_videoInput];
        } else {
            NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2210 userInfo:@{NSLocalizedDescriptionKey:@"VideoWritter unable to add video input"}];
            if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                [self.delegate videoWritter:self didFailWithError:error];
            }
            return;
        }
    } else {
        NSLog(@"warning: no video setting");
    }
    if (_audioSetting) {
        _audioInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:_audioSetting];
        _audioInput.expectsMediaDataInRealTime = YES;
        if ([_assetWriter canAddInput:_audioInput]) {
            [_assetWriter addInput:_audioInput];
        } else {
            NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2211 userInfo:@{NSLocalizedDescriptionKey:@"VideoWritter unable to add audio input"}];
            if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                [self.delegate videoWritter:self didFailWithError:error];
            }
            return;
        }
    } else {
        NSLog(@"warning: no audio setting");
    }
    if ([_assetWriter startWriting]) {
        self.isWriting = YES;
    } else {
        NSError *error = [NSError errorWithDomain:@"com.xindong.captureservice.writter" code:-2212 userInfo:@{NSLocalizedDescriptionKey:[NSString stringWithFormat:@"VideoWritter startWriting fail error: %@", _assetWriter.error.localizedDescription]}];
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
            [self.delegate videoWritter:self didFailWithError:error];
        }
    }
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_assetWritterInterruptedNotification:) name:AVCaptureSessionWasInterruptedNotification object:nil];
}

Recording works by calling XDVideoWritter's appendSampleBuffer: as frames are delivered, writing the data into a temporary file. Calling stopRecording invokes XDVideoWritter's stopWriting, which stops appending data and archives the temporary file as an MP4.

- (void)stopRecording {
    dispatch_async(_writtingQueue, ^{
        @synchronized(self) {
            if (_videoWriter) {
                [_videoWriter stopWriting];
            }
        }
    });
}

//XDVideoWritter.m
- (void)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CMFormatDescriptionRef formatDesc = CMSampleBufferGetFormatDescription(sampleBuffer);
    CMMediaType mediaType = CMFormatDescriptionGetMediaType(formatDesc);
    if (mediaType == kCMMediaType_Video) {
        CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        if (self.firstSample) {
            // Anchor the writer session to the timestamp of the first video frame
            [_assetWriter startSessionAtSourceTime:timestamp];
            self.firstSample = NO;
        }
        if (_videoInput.readyForMoreMediaData) {
            if (![_videoInput appendSampleBuffer:sampleBuffer]) {
                NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2213 userInfo:@{NSLocalizedDescriptionKey:[NSString stringWithFormat:@"VideoWritter appending video sample buffer fail error:%@", _assetWriter.error.localizedDescription]}];
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                    [self.delegate videoWritter:self didFailWithError:error];
                }
            }
        }
    } else if (!self.firstSample && mediaType == kCMMediaType_Audio) {
        if (_audioInput.readyForMoreMediaData) {
            if (![_audioInput appendSampleBuffer:sampleBuffer]) {
                NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2214 userInfo:@{NSLocalizedDescriptionKey:[NSString stringWithFormat:@"VideoWritter appending audio sample buffer fail error: %@", _assetWriter.error]}];
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                    [self.delegate videoWritter:self didFailWithError:error];
                }
            }
        }
    }
}

- (void)stopWriting {
    if (_assetWriter.status == AVAssetWriterStatusWriting) {
        self.isWriting = NO;
        [_assetWriter finishWritingWithCompletionHandler:^{
            if (_assetWriter.status == AVAssetWriterStatusCompleted) {
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:completeWriting:)]) {
                    [self.delegate videoWritter:self completeWriting:nil];
                }
            } else {
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:completeWriting:)]) {
                    [self.delegate videoWritter:self completeWriting:_assetWriter.error];
                }
            }
        }];
    } else {
        NSLog(@"warning : stop writing with unsuitable state : %ld", (long)_assetWriter.status);
    }
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureSessionWasInterruptedNotification object:nil];
}

Thanks

Thanks to everyone who has filed issues and given feedback on XDCaptureService; open source gets better with everyone's help!

Source: https://juejin.im/post/5c46d6bff265da613d7c5e69#comment
