iOS Development Basics 107: Live Streaming on iOS

Posted by Mr.陳 on 2024-07-16

Live-streaming technology on iOS is mature, and a number of capable third-party frameworks make it easy for developers to add live-streaming features. The mainstream live-streaming SDKs currently include, but are not limited to:

  1. LFLiveKit: an open-source live-stream publishing (RTMP push) SDK.
  2. PLMediaStreamingKit: a one-stop audio/video solution provided by Qiniu Cloud.
  3. AliyunPlayer: an audio/video playback solution provided by Alibaba Cloud.
  4. Agora SDK: a large-scale real-time video communication solution provided by Agora.

The sections below walk through LFLiveKit and PLMediaStreamingKit in detail, with sample code for each.

I. LFLiveKit

1. Installing LFLiveKit

To use LFLiveKit, first add it to your project via CocoaPods.

Add the following line to your Podfile:

pod 'LFLiveKit'

Then run pod install.
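For reference, a minimal Podfile might look like the following sketch; the target name and deployment target are placeholders to adjust for your own project:

```ruby
# Podfile — 'YourApp' and the iOS version are hypothetical; match your project.
platform :ios, '12.0'

target 'YourApp' do
  pod 'LFLiveKit'
end
```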

2. Configuration and Usage

Import the LFLiveKit header:

#import <LFLiveKit/LFLiveKit.h>

Creating the Live Session

LFLiveKit offers a range of configuration options. Start by creating an LFLiveSession, the core class of LFLiveKit, which manages audio/video capture and stream publishing.

- (LFLiveSession*)liveSession {
    if (!_liveSession) {
        // Custom audio and video configuration
        LFLiveAudioConfiguration *audioConfiguration = [LFLiveAudioConfiguration defaultConfiguration];
        LFLiveVideoConfiguration *videoConfiguration = [LFLiveVideoConfiguration defaultConfiguration];
        
        _liveSession = [[LFLiveSession alloc] initWithAudioConfiguration:audioConfiguration videoConfiguration:videoConfiguration];
        _liveSession.delegate = self;
        _liveSession.preView = self.view; // Set the preview view
    }
    return _liveSession;
}

Requesting Permissions

On iOS, you must request camera and microphone permission before capturing. You also need the NSCameraUsageDescription and NSMicrophoneUsageDescription keys in your app's Info.plist, or the app will crash when capture starts. The permission-request code looks like this:

- (void)requestAccessForVideo {
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (status == AVAuthorizationStatusNotDetermined) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            if (granted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.liveSession setRunning:YES];
                });
            }
        }];
    } else if (status == AVAuthorizationStatusAuthorized) {
        [self.liveSession setRunning:YES];
    }
}

- (void)requestAccessForAudio {
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeAudio];
    if (status == AVAuthorizationStatusNotDetermined) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
            if (granted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.liveSession setRunning:YES];
                });
            }
        }];
    } else if (status == AVAuthorizationStatusAuthorized) {
        [self.liveSession setRunning:YES];
    }
}
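The permission prompts above only appear if the corresponding usage-description strings exist in Info.plist. A minimal sketch (the description texts are placeholders):

```xml
<key>NSCameraUsageDescription</key>
<string>Camera access is needed to capture live video.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone access is needed to capture live audio.</string>
```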

Starting the Live Stream

Configure an LFLiveStreamInfo object with your push URL and call startLive: to begin streaming.

- (void)startLive {
    LFLiveStreamInfo *streamInfo = [LFLiveStreamInfo new];
    streamInfo.url = @"rtmp://your_server/live_stream";
    [self.liveSession startLive:streamInfo];
}

- (void)stopLive {
    [self.liveSession stopLive];
}

Handling Live-State Changes

By implementing LFLiveSessionDelegate, you can monitor changes in the streaming state.

- (void)liveSession:(LFLiveSession *)session liveStateDidChange:(LFLiveState)state {
    switch (state) {
        // Handle each state transition as appropriate
        case LFLiveReady:
            NSLog(@"Ready to start live streaming");
            break;
        case LFLivePending:
            NSLog(@"Connecting...");
            break;
        case LFLiveStart:
            NSLog(@"Live streaming started");
            break;
        case LFLiveStop:
            NSLog(@"Live streaming stopped");
            break;
        case LFLiveError:
            NSLog(@"Live streaming error");
            break;
        case LFLiveRefresh:
            NSLog(@"Live streaming refreshing");
            break;
    }
}

Complete Example

A complete ViewController.m looks like this:

#import "ViewController.h"
#import <LFLiveKit/LFLiveKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController () <LFLiveSessionDelegate>
@property (nonatomic, strong) LFLiveSession *liveSession;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self requestAccessForVideo];
    [self requestAccessForAudio];
    // Note: the permission callbacks are asynchronous. In a real app, start
    // streaming only after permission is granted, e.g. from a button action.
    [self startLive];
}

- (LFLiveSession*)liveSession {
    if (!_liveSession) {
        LFLiveAudioConfiguration *audioConfiguration = [LFLiveAudioConfiguration defaultConfiguration];
        LFLiveVideoConfiguration *videoConfiguration = [LFLiveVideoConfiguration defaultConfiguration];
        
        _liveSession = [[LFLiveSession alloc] initWithAudioConfiguration:audioConfiguration videoConfiguration:videoConfiguration];
        _liveSession.delegate = self;
        _liveSession.preView = self.view;
    }
    return _liveSession;
}

- (void)requestAccessForVideo {
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (status == AVAuthorizationStatusNotDetermined) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            if (granted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.liveSession setRunning:YES];
                });
            }
        }];
    } else if (status == AVAuthorizationStatusAuthorized) {
        [self.liveSession setRunning:YES];
    }
}

- (void)requestAccessForAudio {
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeAudio];
    if (status == AVAuthorizationStatusNotDetermined) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
            if (granted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.liveSession setRunning:YES];
                });
            }
        }];
    } else if (status == AVAuthorizationStatusAuthorized) {
        [self.liveSession setRunning:YES];
    }
}

- (void)startLive {
    LFLiveStreamInfo *streamInfo = [LFLiveStreamInfo new];
    streamInfo.url = @"rtmp://your_server/live_stream";
    [self.liveSession startLive:streamInfo];
}

- (void)stopLive {
    [self.liveSession stopLive];
}

- (void)liveSession:(LFLiveSession *)session liveStateDidChange:(LFLiveState)state {
    switch (state) {
        case LFLiveReady:
            NSLog(@"Ready to start live streaming");
            break;
        case LFLivePending:
            NSLog(@"Connecting...");
            break;
        case LFLiveStart:
            NSLog(@"Live streaming started");
            break;
        case LFLiveStop:
            NSLog(@"Live streaming stopped");
            break;
        case LFLiveError:
            NSLog(@"Live streaming error");
            break;
        case LFLiveRefresh:
            NSLog(@"Live streaming refreshing");
            break;
    }
}
@end

II. PLMediaStreamingKit

1. Installing PLMediaStreamingKit

Install it with CocoaPods:

pod 'PLMediaStreamingKit'

After running pod install, import PLMediaStreamingKit wherever you need it:

#import <PLMediaStreamingKit/PLMediaStreamingKit.h>

2. Configuration and Usage

Creating the Streaming Session

PLMediaStreamingSession is the core class of this framework; it handles audio/video capture, encoding, and stream publishing.

- (PLMediaStreamingSession *)streamingSession {
    if (!_streamingSession) {
        PLVideoCaptureConfiguration *videoConfiguration = [PLVideoCaptureConfiguration defaultConfiguration];
        PLAudioCaptureConfiguration *audioConfiguration = [PLAudioCaptureConfiguration defaultConfiguration];
        
        PLVideoStreamingConfiguration *videoStreamingConfiguration = [PLVideoStreamingConfiguration defaultConfiguration];
        PLAudioStreamingConfiguration *audioStreamingConfiguration = [PLAudioStreamingConfiguration defaultConfiguration];
        
        _streamingSession = [[PLMediaStreamingSession alloc] initWithVideoCaptureConfiguration:videoConfiguration
                                                                     audioCaptureConfiguration:audioConfiguration
                                                                   videoStreamingConfiguration:videoStreamingConfiguration
                                                                   audioStreamingConfiguration:audioStreamingConfiguration
                                                                                        stream:nil];
        
        _streamingSession.delegate = self;
        _streamingSession.previewView = self.view;
    }
    return _streamingSession;
}

Checking and Requesting Permissions

As with LFLiveKit, we need to request camera and microphone permissions (and add the corresponding usage-description keys to Info.plist):

- (void)requestAccessForVideo {
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (status == AVAuthorizationStatusNotDetermined) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            if (granted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.streamingSession startCaptureSession];
                });
            }
        }];
    } else if (status == AVAuthorizationStatusAuthorized) {
        [self.streamingSession startCaptureSession];
    }
}

- (void)requestAccessForAudio {
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeAudio];
    if (status == AVAuthorizationStatusNotDetermined) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
            if (granted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.streamingSession startCaptureSession];
                });
            }
        }];
    } else if (status == AVAuthorizationStatusAuthorized) {
        [self.streamingSession startCaptureSession];
    }
}

Starting the Stream

Create a PLStream object containing the push URL and other configuration, then start streaming.

- (void)startStreaming {
    PLStream *stream = [PLStream new];
    stream.url = @"rtmp://your_server/live_stream";
    
    [self.streamingSession startWithStream:stream feedback:^(PLStreamStartStateFeedback *feedback) {
        if (feedback.state == PLStreamStartStateSuccess) {
            NSLog(@"Streaming Started Successfully");
        } else {
            NSLog(@"Failed to start streaming: %@", feedback.error.localizedDescription);
        }
    }];
}

- (void)stopStreaming {
    [self.streamingSession stop];
}

Handling Streaming-State Changes

By implementing the relevant PLMediaStreamingSessionDelegate methods, you can monitor changes in the streaming state.

- (void)mediaStreamingSession:(PLMediaStreamingSession *)session streamStatusDidUpdate:(PLStreamStatus *)status {
    NSLog(@"Stream status: %@", status);
}

- (void)mediaStreamingSession:(PLMediaStreamingSession *)session didDisconnectWithError:(NSError *)error {
    NSLog(@"Stream disconnected with error: %@", error.localizedDescription);
}

Complete Example

A complete ViewController.m might look like this:

#import "ViewController.h"
#import <PLMediaStreamingKit/PLMediaStreamingKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController () <PLMediaStreamingSessionDelegate>
@property (nonatomic, strong) PLMediaStreamingSession *streamingSession;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self requestAccessForVideo];
    [self requestAccessForAudio];
}

- (PLMediaStreamingSession *)streamingSession {
    if (!_streamingSession) {
        PLVideoCaptureConfiguration *videoConfiguration = [PLVideoCaptureConfiguration defaultConfiguration];
        PLAudioCaptureConfiguration *audioConfiguration = [PLAudioCaptureConfiguration defaultConfiguration];
        
        PLVideoStreamingConfiguration *videoStreamingConfiguration = [PLVideoStreamingConfiguration defaultConfiguration];
        PLAudioStreamingConfiguration *audioStreamingConfiguration = [PLAudioStreamingConfiguration defaultConfiguration];
        
        _streamingSession = [[PLMediaStreamingSession alloc] initWithVideoCaptureConfiguration:videoConfiguration
                                                                     audioCaptureConfiguration:audioConfiguration
                                                                   videoStreamingConfiguration:videoStreamingConfiguration
                                                                   audioStreamingConfiguration:audioStreamingConfiguration
                                                                                        stream:nil];
        
        _streamingSession.delegate = self;
        _streamingSession.previewView = self.view;
    }
    return _streamingSession;
}

- (void)requestAccessForVideo {
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (status == AVAuthorizationStatusNotDetermined) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            if (granted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.streamingSession startCaptureSession];
                });
            }
        }];
    } else if (status == AVAuthorizationStatusAuthorized) {
        [self.streamingSession startCaptureSession];
    }
}

- (void)requestAccessForAudio {
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeAudio];
    if (status == AVAuthorizationStatusNotDetermined) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
            if (granted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.streamingSession startCaptureSession];
                });
            }
        }];
    } else if (status == AVAuthorizationStatusAuthorized) {
        [self.streamingSession startCaptureSession];
    }
}

- (void)startStreaming {
    PLStream *stream = [PLStream new];
    stream.url = @"rtmp://your_server/live_stream";
    
    [self.streamingSession startWithStream:stream feedback:^(PLStreamStartStateFeedback *feedback) {
        if (feedback.state == PLStreamStartStateSuccess) {
            NSLog(@"Streaming Started Successfully");
        } else {
            NSLog(@"Failed to start streaming: %@", feedback.error.localizedDescription);
        }
    }];
}

- (void)stopStreaming {
    [self.streamingSession stop];
}

- (void)mediaStreamingSession:(PLMediaStreamingSession *)session streamStatusDidUpdate:(PLStreamStatus *)status {
    NSLog(@"Stream status: %@", status);
}

- (void)mediaStreamingSession:(PLMediaStreamingSession *)session didDisconnectWithError:(NSError *)error {
    NSLog(@"Stream disconnected with error: %@", error.localizedDescription);
}
@end
