Goal: use AVCaptureSession from AVFoundation to set the camera's resolution and frame rate (including high frame rates), switch between the front and back cameras, focus, handle screen rotation, adjust exposure, and more.
Before reading:
- For the underlying concepts, see the companion article: iOS Video Capture Overview (AVCaptureSession)
- Based on the AVFoundation framework
GitHub (with source code): iOS Video Capture in Practice (AVCaptureSession)
Jianshu: iOS Video Capture in Practice (AVCaptureSession)
Blog: iOS Video Capture in Practice (AVCaptureSession)
Juejin: iOS Video Capture in Practice (AVCaptureSession)
1. Setting Resolution and Frame Rate
1.1. Low-Frame-Rate Mode (fps <= 30)
When the required frame rate is at most 30 fps, resolution and frame rate are configured independently: one method sets the frame rate, another sets the resolution, and the two are not coupled.
- Setting the resolution
The method below sets the camera resolution. The available presets can be found in the API documentation; the largest currently supported is 3840x2160. If your camera never needs to exceed 30 fps, this method is all you need.
- (void)setCameraResolutionByPresetWithHeight:(int)height session:(AVCaptureSession *)session {
    /*
     Note: this method only supports frame rates <= 30 fps, because for higher
     rates we must use `activeFormat`, and `activeFormat` and `sessionPreset`
     are mutually exclusive.
     */
    AVCaptureSessionPreset preset = [self getSessionPresetByResolutionHeight:height];
    if ([session.sessionPreset isEqualToString:preset]) {
        NSLog(@"No need to set the camera resolution again!");
        return;
    }

    if (![session canSetSessionPreset:preset]) {
        NSLog(@"Can't set the sessionPreset!");
        return;
    }

    [session beginConfiguration];
    session.sessionPreset = preset;
    [session commitConfiguration];
}
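The helper getSessionPresetByResolutionHeight: used above is not shown in this excerpt. A minimal sketch, assuming the usual 2160/1080/720/480 tiers, might look like this:

// Hypothetical implementation of the helper used above: map a requested
// height to the matching AVCaptureSessionPreset tier.
- (AVCaptureSessionPreset)getSessionPresetByResolutionHeight:(int)height {
    switch (height) {
        case 2160:
            return AVCaptureSessionPreset3840x2160;
        case 1080:
            return AVCaptureSessionPreset1920x1080;
        case 720:
            return AVCaptureSessionPreset1280x720;
        default:
            return AVCaptureSessionPreset640x480;
    }
}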
- Setting the frame rate
The method below sets the camera frame rate; it only supports rates up to 30 fps.
- (void)setCameraForLFRWithFrameRate:(int)frameRate {
    // Only for frame rates <= 30
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [captureDevice lockForConfiguration:NULL];
    [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
    [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
    [captureDevice unlockForConfiguration];
}
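One caveat: if the requested duration falls outside the ranges reported by the current activeFormat, setActiveVideoMinFrameDuration: raises an exception. A sketch of a variant with that validation guard (the method name is hypothetical):

// Safer variant: verify the requested rate against the active format's
// supported ranges before applying it (out-of-range values raise an exception).
- (void)setCameraForLFRSafelyWithFrameRate:(int)frameRate {
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    BOOL supported = NO;
    for (AVFrameRateRange *range in captureDevice.activeFormat.videoSupportedFrameRateRanges) {
        if (frameRate >= range.minFrameRate && frameRate <= range.maxFrameRate) {
            supported = YES;
            break;
        }
    }
    if (!supported) {
        NSLog(@"Frame rate %d is not supported by the active format", frameRate);
        return;
    }

    if ([captureDevice lockForConfiguration:NULL]) {
        captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, frameRate);
        captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, frameRate);
        [captureDevice unlockForConfiguration];
    }
}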
1.2. High-Frame-Rate Mode (fps > 30)
To support high frame rates at a given resolution, such as 50, 60, or 120 fps, calling setActiveVideoMinFrameDuration and setActiveVideoMaxFrameDuration on their own (as in low-frame-rate mode) is not enough: Apple requires that these frame-duration setters be paired with the new resolution API, activeFormat, which selects a device format that actually supports the desired rate.
The new resolution API activeFormat is mutually exclusive with sessionPreset: using one invalidates the other. It is therefore advisable to adopt the high-frame-rate approach outright and retire the low-frame-rate one, to avoid compatibility problems.
With this change Apple merged the previously separate resolution and frame-rate settings into one. Instead of configuring each independently, we now configure them together, because each resolution supports only a certain range of frame rates and each frame rate only certain resolutions, and we must iterate over the device's formats to find a match. The old standalone methods are effectively deprecated in high-frame-rate mode. Choose according to your project: if you are certain it will never need more than 30 fps, the old approach remains simple and effective.
Note: once activeFormat is used, the resolution previously set through sessionPreset automatically changes to AVCaptureSessionPresetInputPriority, so any existing if statements that compare against canSetSessionPreset will stop working. If your project must support high frame rates, it is best to abandon the sessionPreset approach entirely.
+ (BOOL)setCameraFrameRateAndResolutionWithFrameRate:(int)frameRate andResolutionHeight:(CGFloat)resolutionHeight bySession:(AVCaptureSession *)session position:(AVCaptureDevicePosition)position videoFormat:(OSType)videoFormat {
    AVCaptureDevice *captureDevice = [self getCaptureDevicePosition:position];

    for (AVCaptureDeviceFormat *vFormat in [captureDevice formats]) {
        CMFormatDescriptionRef description = vFormat.formatDescription;
        float maxRate = ((AVFrameRateRange *)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;
        if (maxRate >= frameRate && CMFormatDescriptionGetMediaSubType(description) == videoFormat) {
            // Compare this format's dimensions against the requested resolution
            CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(description);
            if (dims.height == resolutionHeight && dims.width == [self getResolutionWidthByHeight:resolutionHeight]) {
                [session beginConfiguration];
                if ([captureDevice lockForConfiguration:NULL]) {
                    captureDevice.activeFormat = vFormat;
                    [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice unlockForConfiguration];
                } else {
                    NSLog(@"%s: lock failed!", __func__);
                }
                [session commitConfiguration];
                return YES;
            }
        }
    }

    NSLog(@"%s: no format matches frame rate %d and resolution height %f", __func__, frameRate, resolutionHeight);
    return NO;
}
+ (AVCaptureDevice *)getCaptureDevicePosition:(AVCaptureDevicePosition)position {
    NSArray *devices = nil;

    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }

    for (AVCaptureDevice *device in devices) {
        if (position == device.position) {
            return device;
        }
    }

    return nil;
}
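Likewise, getResolutionWidthByHeight: is referenced above but not defined in this excerpt. A plausible sketch, assuming the standard 16:9 widths (and 4:3 for 480):

// Hypothetical helper matching the dimension check above: return the
// expected width for a given capture height (16:9, except 4:3 for 480).
+ (int)getResolutionWidthByHeight:(CGFloat)height {
    if (height == 2160) return 3840;
    if (height == 1080) return 1920;
    if (height == 720)  return 1280;
    return 640; // 480 -> 640 (4:3)
}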
2. Switching Between Front and Back Cameras
Switching cameras looks simple, but it raises real problems in practice, because the front and back cameras of the same device support different resolutions and frame rates; switching from a supported configuration to an unsupported one fails. A concrete case:
On an iPhone X, the back camera supports up to (4K, 60 fps) while the front camera supports up to (2K, 30 fps). Switching from the back camera at (4K, 60 fps) to the front camera without handling this will fail and leave the program in a broken state.
Note
In the code below, pay attention to the line session.sessionPreset = AVCaptureSessionPresetLow;. When switching from back to front we must recompute the maximum resolution and frame rate the new input device supports, but we cannot compute them until the input has been added to the session. So we first set a deliberately low, universally acceptable preset so that the input can be added, then determine the device's maximum supported resolution and frame rate, and finally apply them.
- (void)setCameraPosition:(AVCaptureDevicePosition)position session:(AVCaptureSession *)session input:(AVCaptureDeviceInput *)input videoFormat:(OSType)videoFormat resolutionHeight:(CGFloat)resolutionHeight frameRate:(int)frameRate {
    if (input) {
        [session beginConfiguration];
        [session removeInput:input];

        AVCaptureDevice *device = [self.class getCaptureDevicePosition:position];

        NSError *error = nil;
        AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                               error:&error];
        if (error) {
            NSLog(@"%s: error:%@", __func__, error.localizedDescription);
            // Balance beginConfiguration before bailing out
            [session commitConfiguration];
            return;
        }

        // Example: the back camera may run at 4K while the front camera tops out
        // at 2K, so switching requires downgrading. We can't compute the new
        // camera's maximum supported resolution until its input has been added
        // to the session, so add it under a low preset first.
        session.sessionPreset = AVCaptureSessionPresetLow;
        if ([session canAddInput:newInput]) {
            self.input = newInput;
            [session addInput:newInput];
        } else {
            NSLog(@"%s: add input failed.", __func__);
            [session commitConfiguration];
            return;
        }

        int maxResolutionHeight = [self getMaxSupportResolutionByPreset];
        if (resolutionHeight > maxResolutionHeight) {
            resolutionHeight = maxResolutionHeight;
            self.cameraModel.resolutionHeight = resolutionHeight;
            NSLog(@"%s: Current max supported resolution height = %d", __func__, maxResolutionHeight);
        }

        int maxFrameRate = [self getMaxFrameRateByCurrentResolution];
        if (frameRate > maxFrameRate) {
            frameRate = maxFrameRate;
            self.cameraModel.frameRate = frameRate;
            NSLog(@"%s: Current max supported frame rate = %d", __func__, maxFrameRate);
        }

        BOOL isSuccess = [self.class setCameraFrameRateAndResolutionWithFrameRate:frameRate
                                                              andResolutionHeight:resolutionHeight
                                                                        bySession:session
                                                                         position:position
                                                                      videoFormat:videoFormat];
        if (!isSuccess) {
            NSLog(@"%s: Set resolution and frame rate failed.", __func__);
        }

        [session commitConfiguration];
    }
}
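getMaxSupportResolutionByPreset and getMaxFrameRateByCurrentResolution are project helpers that don't appear in this excerpt. Hypothetical sketches, assuming self.input and self.cameraModel hold the current input and settings:

// Hypothetical sketches of the two helpers used above: scan the new device's
// formats for the largest height / highest frame rate it can deliver.
- (int)getMaxSupportResolutionByPreset {
    int maxHeight = 0;
    for (AVCaptureDeviceFormat *format in self.input.device.formats) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        if (dims.height > maxHeight) {
            maxHeight = dims.height;
        }
    }
    return maxHeight;
}

- (int)getMaxFrameRateByCurrentResolution {
    float maxRate = 0;
    for (AVCaptureDeviceFormat *format in self.input.device.formats) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        if (dims.height != (int)self.cameraModel.resolutionHeight) {
            continue;
        }
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (range.maxFrameRate > maxRate) {
                maxRate = range.maxFrameRate;
            }
        }
    }
    return (int)maxRate;
}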
3. Handling Screen and Video Orientation
First we need to distinguish screen orientation from video orientation: one describes the device's orientation (UIDeviceOrientation), the other the captured video's orientation (AVCaptureVideoOrientation). For an AVCaptureSession to support screen rotation, the video must be rotated along with the screen.
Changes in screen orientation can be observed through the UIDeviceOrientationDidChangeNotification notification; we won't go into further detail here.
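For completeness, a minimal registration sketch (the selector name is hypothetical):

// Start generating device-orientation notifications and observe them.
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleDeviceOrientationChange:)
                                             name:UIDeviceOrientationDidChangeNotification
                                           object:nil];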
- (void)adjustVideoOrientationByScreenOrientation:(UIDeviceOrientation)orientation previewFrame:(CGRect)previewFrame previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    [previewLayer setFrame:previewFrame];

    switch (orientation) {
        case UIDeviceOrientationPortrait:
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortrait
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortraitUpsideDown
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeLeft:
            // Device landscape-left (home button on the right) corresponds to
            // video orientation landscape-right, and vice versa.
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeRight
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeRight:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeLeft
                                    videoOutput:videoOutput];
            break;
        default:
            break;
    }
}
- (void)adjustAVOutputDataOrientation:(AVCaptureVideoOrientation)orientation videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    for (AVCaptureConnection *connection in videoOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                if ([connection isVideoOrientationSupported]) {
                    [connection setVideoOrientation:orientation];
                }
            }
        }
    }
}
4. Focus
For focusing, manually setting the focus point deserves special attention, because the focus API only accepts a coordinate system whose top-left corner is (0,0) and bottom-right corner is (1,1). We therefore have to convert from the UIView coordinate system, and the conversion depends on several factors:
- Whether the video output is mirrored: the front camera, for example, may have mirroring enabled (the x coordinate is flipped).
- Whether the screen orientation puts the Home button on the right or on the left: with the button on the right the origin is the top-left corner; with it on the left the origin is the bottom-right corner.
- How the video is rendered: whether the aspect ratio is preserved or the frame fills the screen. Since screen shapes vary between phone models, the picture may be letterboxed with black bars or extend past the screen edges, and the focus point must be recomputed accordingly.
If we render with the AVCaptureVideoPreviewLayer that belongs to the AVCaptureSession, we can let captureDevicePointOfInterestForPoint compute the point automatically; its result accounts for all of the cases above. But if we render the frames ourselves, we must compute the focus point manually and handle every one of those cases. Both the automatic and the manual computation are shown below.
- (void)autoFocusAtPoint:(CGPoint)point {
    AVCaptureDevice *device = self.input.device;
    if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            // Guard the exposure calls as well: setting an unsupported point
            // of interest raises an exception.
            if ([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
                [device setExposurePointOfInterest:point];
                [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            }
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        }
    }
}
4.1. Computing the Focus Point Automatically
- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = [captureVideoPreviewLayer frame].size;

    if ([captureVideoPreviewLayer.connection isVideoMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }

    // Convert the UIKit coordinate to a point of interest in (0,0) ~ (1,1)
    pointOfInterest = [captureVideoPreviewLayer captureDevicePointOfInterestForPoint:viewCoordinates];

    // NSLog(@"Focus - auto: %@", NSStringFromCGPoint(pointOfInterest));
    return pointOfInterest;
}
4.2. Computing the Focus Point Manually
- If the screen's aspect ratio exactly matches the resolution's, simply map the coordinates into the (0,0) to (1,1) space.
- If the two ratios differ, the video's rendering mode must be taken into account. If the aspect ratio is preserved, black bars are left on screen and their length must be subtracted when computing the focus point; if the screen is filled while preserving the ratio, some pixels are cropped away and must be added back when computing the focus point.
- (CGPoint)manualConvertFocusPoint:(CGPoint)point frameSize:(CGSize)frameSize captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer position:(AVCaptureDevicePosition)position videoDataOutput:(AVCaptureVideoDataOutput *)videoDataOutput input:(AVCaptureDeviceInput *)input {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);

    if ([[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] isVideoMirrored]) {
        point.x = frameSize.width - point.x;
    }

    for (AVCaptureInputPort *port in [input ports]) {
        if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
            CGRect cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
            CGSize resolutionSize = cleanAperture.size;

            CGFloat resolutionRatio = resolutionSize.width / resolutionSize.height;
            CGFloat screenSizeRatio = frameSize.width / frameSize.height;
            CGFloat xc = .5f;
            CGFloat yc = .5f;

            if (resolutionRatio == screenSizeRatio) {
                // Aspect ratios match: map straight into (0,0) ~ (1,1)
                xc = point.x / frameSize.width;
                yc = point.y / frameSize.height;
            } else if (resolutionRatio > screenSizeRatio) {
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    // Filled: the sides are cropped, so add the cropped width back
                    CGFloat needScreenWidth = resolutionRatio * frameSize.height;
                    CGFloat cropWidth = (needScreenWidth - frameSize.width) / 2;
                    xc = (cropWidth + point.x) / needScreenWidth;
                    yc = point.y / frameSize.height;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    // Letterboxed: subtract the black bars above and below
                    CGFloat needScreenHeight = frameSize.width * (1 / resolutionRatio);
                    CGFloat blackBarLength = (frameSize.height - needScreenHeight) / 2;
                    xc = point.x / frameSize.width;
                    yc = (point.y - blackBarLength) / needScreenHeight;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            } else {
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    CGFloat needScreenHeight = (1 / resolutionRatio) * frameSize.width;
                    CGFloat cropHeight = (needScreenHeight - frameSize.height) / 2;
                    xc = point.x / frameSize.width;
                    yc = (cropHeight + point.y) / needScreenHeight;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    CGFloat needScreenWidth = frameSize.height * resolutionRatio;
                    CGFloat blackBarLength = (frameSize.width - needScreenWidth) / 2;
                    xc = (point.x - blackBarLength) / needScreenWidth;
                    yc = point.y / frameSize.height;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            }

            pointOfInterest = CGPointMake(xc, yc);
        }
    }

    if (position == AVCaptureDevicePositionBack) {
        if (captureVideoPreviewLayer.connection.videoOrientation == AVCaptureVideoOrientationLandscapeLeft) {
            pointOfInterest = CGPointMake(1 - pointOfInterest.x, 1 - pointOfInterest.y);
        }
    } else {
        pointOfInterest = CGPointMake(pointOfInterest.x, 1 - pointOfInterest.y);
    }

    // NSLog(@"Focus - manual: %@", NSStringFromCGPoint(pointOfInterest));
    return pointOfInterest;
}
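Tying it together, a tap-to-focus handler might convert the touch location and hand it to autoFocusAtPoint:. A sketch, assuming self.previewLayer holds the preview layer:

// Hypothetical tap handler: convert the touch point to the device's
// point-of-interest space, then focus there.
- (void)handleFocusTap:(UITapGestureRecognizer *)gesture {
    CGPoint viewPoint = [gesture locationInView:gesture.view];
    CGPoint devicePoint = [self convertToPointOfInterestFromViewCoordinates:viewPoint
                                                   captureVideoPreviewLayer:self.previewLayer];
    [self autoFocusAtPoint:devicePoint];
}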
5. Adjusting Exposure
If a UISlider is used as the control, the simplest approach is to give it the same range as the exposure bias, i.e. (-8 ~ 8); the value can then be passed straight through without conversion. With a gesture or some other control, adjust the mapping to your needs. This is straightforward, so no further explanation is given.
- (void)setExposureWithNewValue:(CGFloat)newExposureValue device:(AVCaptureDevice *)device {
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:newExposureValue completionHandler:nil];
        [device unlockForConfiguration];
    }
}
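Rather than hard-coding -8 ~ 8, the valid range can be read from the device itself. A hedged variant (the method name is hypothetical) clamps before applying:

// Safer variant: clamp the requested bias to the device's reported range
// before applying it.
- (void)setClampedExposureWithNewValue:(CGFloat)newExposureValue device:(AVCaptureDevice *)device {
    CGFloat clamped = MAX(device.minExposureTargetBias,
                          MIN(device.maxExposureTargetBias, newExposureValue));
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:clamped completionHandler:nil];
        [device unlockForConfiguration];
    }
}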
6. Torch Mode
- AVCaptureTorchModeAuto: automatic
- AVCaptureTorchModeOn: on
- AVCaptureTorchModeOff: off
- (void)setTorchState:(BOOL)isOpen device:(AVCaptureDevice *)device {
    if ([device hasTorch]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = isOpen ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
            [device unlockForConfiguration];
        }
    } else {
        NSLog(@"The device does not support torch!");
    }
}
7. Video Stabilization
Note: on some models and at some resolutions, rendering with this property enabled can cause problems (e.g. on the iPhone XS with custom rendering).
- (void)adjustVideoStabilizationWithOutput:(AVCaptureVideoDataOutput *)output {
    NSArray *devices = nil;

    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:self.cameraModel.position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }

    for (AVCaptureDevice *device in devices) {
        if ([device hasMediaType:AVMediaTypeVideo]) {
            // Check support for the mode we actually apply below
            if ([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeStandard]) {
                for (AVCaptureConnection *connection in output.connections) {
                    for (AVCaptureInputPort *port in [connection inputPorts]) {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if (connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"activeVideoStabilizationMode = %ld", (long)connection.activeVideoStabilizationMode);
                            } else {
                                NSLog(@"connection doesn't support video stabilization");
                            }
                        }
                    }
                }
            } else {
                NSLog(@"device doesn't support video stabilization");
            }
        }
    }
}
8. Adjusting White Balance
- temperature: adjusted via the correlated color temperature, which AVFoundation expresses in Kelvin (adjustment range here: -150 ~ 250)
- tint: adjusted via the tint value (adjustment range here: -150 ~ 150)
Note that before calling setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains: you must check that the AVCaptureWhiteBalanceGains values are within the valid range.
- (AVCaptureWhiteBalanceGains)clampGains:(AVCaptureWhiteBalanceGains)gains toMinVal:(CGFloat)minVal andMaxVal:(CGFloat)maxVal {
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain  = MAX(MIN(tmpGains.blueGain,  maxVal), minVal);
    tmpGains.redGain   = MAX(MIN(tmpGains.redGain,   maxVal), minVal);
    tmpGains.greenGain = MAX(MIN(tmpGains.greenGain, maxVal), minVal);
    return tmpGains;
}

- (void)setWhiteBlanceValueByTemperature:(CGFloat)temperature device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        if ([device lockForConfiguration:nil]) {
            AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
            CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
            AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
                .temperature = temperature,
                .tint        = currentTint,
            };

            AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
            CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
            deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];

            [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
            [device unlockForConfiguration];
        }
    }
}

- (void)setWhiteBlanceValueByTint:(CGFloat)tint device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        if ([device lockForConfiguration:nil]) {
            CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
            AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
            currentGains = [self clampGains:currentGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
            CGFloat currentTemperature = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].temperature;
            AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
                .temperature = currentTemperature,
                .tint        = tint,
            };

            AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
            deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];

            [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
            [device unlockForConfiguration];
        }
    }
}
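As a usage sketch, a slider callback could treat its value as an absolute color temperature in Kelvin and call the setter above. This assumes a hypothetical slider configured with, say, a 3000 ~ 8000 range, and self.input holding the current input:

// Hypothetical slider callback: apply the slider's value as an absolute
// correlated color temperature in Kelvin.
- (void)temperatureSliderChanged:(UISlider *)slider {
    [self setWhiteBlanceValueByTemperature:slider.value device:self.input.device];
}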
9. Screen Fill Modes
- AVLayerVideoGravityResizeAspect: preserve the aspect ratio; if the screen's aspect ratio differs from the video's, black bars are left on screen.
- AVLayerVideoGravityResizeAspectFill: preserve the aspect ratio while filling the screen, scaling to the smaller dimension; some pixels are sacrificed because they extend past the screen.
- AVLayerVideoGravityResize: stretch to fill the screen; no pixels are lost, but the picture is distorted.
- (void)setVideoGravity:(AVLayerVideoGravity)videoGravity previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer session:(AVCaptureSession *)session {
    [session beginConfiguration];
    [previewLayer setVideoGravity:videoGravity];
    [session commitConfiguration];
}