Implementing asynchronous video encoding and decoding on Android with MediaCodec
I recently worked on a screen-mirroring project that needed to decode and display an H.264 video stream. The project is based on Android 7.0, and although the Android framework already provides a hardware codec solution built around MediaCodec, I ran into many problems using it in practice. This post summarizes those lessons for my own future reference, covering the following topics:
- TextureView vs. SurfaceView
- Introduction to MediaCodec
- Asynchronous codec implementation
- Synchronous codec implementation
TextureView vs. SurfaceView
During decoding we need a Surface on which to display the decoded frames. The two most commonly used View sources for a Surface are TextureView and SurfaceView. What is the difference between them?
SurfaceView: unlike an ordinary View, it owns its own Surface, and rendering to that Surface can happen on a separate thread. This is a big win for performance-sensitive applications such as games and video, because rendering does not block the main thread's event handling. The downside is that this Surface does not live in the View hierarchy, so its display is not controlled by View properties: it cannot be translated, scaled, or animated the way a normal View can, and transformations applied through the View hierarchy do not affect its content.
TextureView: it projects a content stream directly into the View, and can be used to implement features such as live preview. Unlike SurfaceView, it does not create a separate window in WindowManagerService; it is an ordinary View in the hierarchy, so it can be moved, rotated, scaled, and animated like any other View. Note that TextureView only works inside a hardware-accelerated window. The content stream it displays can come from the app's own process or from a remote process.
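To make the difference concrete, here is a minimal sketch (not part of this project's code) of how a decoder Surface is typically obtained from each view type. It is assumed to run inside an Activity after setContentView(); the ids R.id.surface_view and R.id.texture_view are hypothetical.

// SurfaceView owns its Surface; take it from the SurfaceHolder.
SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surface_view);
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        Surface decoderSurface = holder.getSurface(); // pass to MediaCodec.configure()
    }
    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }
    @Override
    public void surfaceDestroyed(SurfaceHolder holder) { }
});

// TextureView exposes a SurfaceTexture; wrap it in a Surface once available.
TextureView textureView = (TextureView) findViewById(R.id.texture_view);
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
        Surface decoderSurface = new Surface(st); // pass to MediaCodec.configure()
    }
    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture st, int width, int height) { }
    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture st) { return true; }
    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture st) { }
});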
Introduction to MediaCodec
MediaCodec is the component Android provides for developers to access the platform's low-level media codecs. As part of Android's low-level multimedia stack, it is typically used together with MediaExtractor, MediaSync, MediaMuxer, MediaCrypto, MediaDrm, Image, Surface, and AudioTrack; I will cover these components one by one in later posts. MediaCodec's workflow:
When MediaCodec starts working, it first hands an empty input buffer to the client. The client fills this buffer with the data to be processed and returns it to MediaCodec. Once MediaCodec has finished encoding or decoding the data internally, it fills an output buffer and hands it to the client for further handling. For encoding and decoding respectively, the flow is:
Encoding:
1. Fill the data to be encoded into an empty buffer provided by MediaCodec. How the buffer is obtained differs between the synchronous and asynchronous APIs.
2. The client notifies MediaCodec that the buffer has been filled.
3. MediaCodec hardware-encodes the data internally.
4. When encoding completes, the output is handed back to the client, which transmits it as needed.
Decoding:
1. Fill the encoded stream into a buffer and hand it to MediaCodec.
2. The client notifies MediaCodec that the buffer has been filled.
3. MediaCodec hardware-decodes the data internally.
4. The decoded frames are output to a Surface for display.
MediaCodec state management: a codec moves between three top-level states: Stopped (covering Uninitialized, Configured, and Error), Executing (covering Flushed, Running, and End-of-Stream), and Released. configure() takes it from Uninitialized to Configured, start() to Executing, stop() back to Uninitialized, and release() to Released.
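The sketch below (illustrative only, not code from this project) maps these state transitions onto the corresponding API calls, assuming an H.264 decoder at 640x480:

// Illustrative lifecycle sketch; error handling omitted for brevity.
void codecLifecycleDemo() throws IOException {
    MediaCodec codec = MediaCodec.createDecoderByType("video/avc"); // Uninitialized
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
    codec.configure(format, null, null, 0); // Configured
    codec.start();                          // Executing (Flushed until the first buffer is queued)
    // ... queue input buffers / drain output buffers while Running ...
    codec.stop();                           // back to Uninitialized; may be configured again
    codec.release();                        // Released; the instance can no longer be used
}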
Asynchronous codec implementation
The asynchronous encoder implementation is as follows:
package com.zdragon.videoio;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.util.Log;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.concurrent.ArrayBlockingQueue;

/**
 * This class is used to encode video frame data.
 * Created by zj on 2018/7/29.
 */
public class VideoEncoder {

    private final static String TAG = "VideoEncoder";
    private final static int CONFIGURE_FLAG_ENCODE = MediaCodec.CONFIGURE_FLAG_ENCODE;
    private final static int CACHE_BUFFER_SIZE = 8;

    private MediaCodec mMediaCodec;
    private MediaFormat mMediaFormat;
    private int mViewWidth;
    private int mViewHeight;

    private Handler mVideoEncoderHandler;
    private HandlerThread mVideoEncoderHandlerThread = new HandlerThread("VideoEncoder");

    // Raw input frames waiting to be encoded; the stream format must be I420.
    // Note: these queues are instance fields (the original made them static,
    // which would incorrectly share them across encoder instances).
    private final ArrayBlockingQueue<byte[]> mInputDatasQueue = new ArrayBlockingQueue<byte[]>(CACHE_BUFFER_SIZE);
    // Caches video frames that have already been encoded.
    private final ArrayBlockingQueue<byte[]> mOutputDatasQueue = new ArrayBlockingQueue<byte[]>(CACHE_BUFFER_SIZE);

    private MediaCodec.Callback mCallback = new MediaCodec.Callback() {
        @Override
        public void onInputBufferAvailable(@NonNull MediaCodec mediaCodec, int id) {
            ByteBuffer inputBuffer = mediaCodec.getInputBuffer(id);
            inputBuffer.clear();
            byte[] dataSources = mInputDatasQueue.poll();
            int length = 0;
            if (dataSources != null) {
                inputBuffer.put(dataSources);
                length = dataSources.length;
            }
            // An empty buffer (length 0) is queued back when no frame is pending;
            // production code should also pass a real presentation timestamp.
            mediaCodec.queueInputBuffer(id, 0, length, 0, 0);
        }

        @Override
        public void onOutputBufferAvailable(@NonNull MediaCodec mediaCodec, int id, @NonNull MediaCodec.BufferInfo bufferInfo) {
            ByteBuffer outputBuffer = mediaCodec.getOutputBuffer(id);
            if (outputBuffer != null && bufferInfo.size > 0) {
                byte[] buffer = new byte[outputBuffer.remaining()];
                outputBuffer.get(buffer);
                boolean result = mOutputDatasQueue.offer(buffer);
                if (!result) {
                    Log.d(TAG, "Offer to queue failed, queue is in full state");
                }
            }
            mediaCodec.releaseOutputBuffer(id, true);
        }

        @Override
        public void onError(@NonNull MediaCodec mediaCodec, @NonNull MediaCodec.CodecException e) {
            Log.d(TAG, "------> onError");
        }

        @Override
        public void onOutputFormatChanged(@NonNull MediaCodec mediaCodec, @NonNull MediaFormat mediaFormat) {
            Log.d(TAG, "------> onOutputFormatChanged");
        }
    };

    public VideoEncoder(String mimeType, int viewWidth, int viewHeight) {
        try {
            mMediaCodec = MediaCodec.createEncoderByType(mimeType);
        } catch (IOException e) {
            Log.e(TAG, Log.getStackTraceString(e));
            mMediaCodec = null;
            return;
        }

        this.mViewWidth = viewWidth;
        this.mViewHeight = viewHeight;

        mVideoEncoderHandlerThread.start();
        mVideoEncoderHandler = new Handler(mVideoEncoderHandlerThread.getLooper());

        mMediaFormat = MediaFormat.createVideoFormat(mimeType, mViewWidth, mViewHeight);
        mMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
        mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1920 * 1280); // ~2.5 Mbps
        mMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        mMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
    }

    /**
     * Queue a raw video frame for encoding.
     * @param needEncodeData an I420-format frame
     */
    public void inputFrameToEncoder(byte[] needEncodeData) {
        boolean inputResult = mInputDatasQueue.offer(needEncodeData);
        Log.d(TAG, "-----> inputEncoder queue result = " + inputResult + " queue current size = " + mInputDatasQueue.size());
    }

    /**
     * Get an encoded frame from the output queue.
     * @return an encoded frame; null when the queue is empty.
     */
    public byte[] pollFrameFromEncoder() {
        return mOutputDatasQueue.poll();
    }

    /**
     * Start the MediaCodec to encode video data.
     */
    public void startEncoder() {
        if (mMediaCodec != null) {
            // setCallback() must be called before configure() to enable asynchronous mode.
            mMediaCodec.setCallback(mCallback, mVideoEncoderHandler);
            mMediaCodec.configure(mMediaFormat, null, null, CONFIGURE_FLAG_ENCODE);
            mMediaCodec.start();
        } else {
            throw new IllegalStateException("startEncoder failed: has the MediaCodec been initialized correctly?");
        }
    }

    /**
     * Stop encoding video data.
     */
    public void stopEncoder() {
        if (mMediaCodec != null) {
            mMediaCodec.stop();
            mMediaCodec.setCallback(null);
        }
    }

    /**
     * Release all resources used by the encoder.
     */
    public void release() {
        if (mMediaCodec != null) {
            mInputDatasQueue.clear();
            mOutputDatasQueue.clear();
            mMediaCodec.release();
        }
    }
}
The asynchronous decoder implementation is as follows:
package com.zdragon.videoio;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.util.Log;
import android.view.Surface;

import java.io.IOException;
import java.nio.ByteBuffer;

/**
 * This class is used to decode video frame data and render it to a SurfaceTexture.
 * Created by zj on 2018/7/29.
 */
public class VideoDecoder {

    private final static String TAG = "VideoDecoder";
    private final static int CONFIGURE_FLAG_DECODE = 0;

    private MediaCodec mMediaCodec;
    private MediaFormat mMediaFormat;
    private Surface mSurface;
    private int mViewWidth;
    private int mViewHeight;

    private VideoEncoder mVideoEncoder;

    private Handler mVideoDecoderHandler;
    private HandlerThread mVideoDecoderHandlerThread = new HandlerThread("VideoDecoder");

    private MediaCodec.Callback mCallback = new MediaCodec.Callback() {
        @Override
        public void onInputBufferAvailable(@NonNull MediaCodec mediaCodec, int id) {
            ByteBuffer inputBuffer = mediaCodec.getInputBuffer(id);
            inputBuffer.clear();
            // Pull an encoded frame directly from the encoder's output queue.
            byte[] dataSources = null;
            if (mVideoEncoder != null) {
                dataSources = mVideoEncoder.pollFrameFromEncoder();
            }
            int length = 0;
            if (dataSources != null) {
                inputBuffer.put(dataSources);
                length = dataSources.length;
            }
            mediaCodec.queueInputBuffer(id, 0, length, 0, 0);
        }

        @Override
        public void onOutputBufferAvailable(@NonNull MediaCodec mediaCodec, int id, @NonNull MediaCodec.BufferInfo bufferInfo) {
            // The codec was configured with a Surface, so releasing the buffer with
            // render == true sends the decoded frame to the Surface directly;
            // there is no need to copy the data out.
            mediaCodec.releaseOutputBuffer(id, true);
        }

        @Override
        public void onError(@NonNull MediaCodec mediaCodec, @NonNull MediaCodec.CodecException e) {
            Log.d(TAG, "------> onError");
        }

        @Override
        public void onOutputFormatChanged(@NonNull MediaCodec mediaCodec, @NonNull MediaFormat mediaFormat) {
            Log.d(TAG, "------> onOutputFormatChanged");
        }
    };

    public VideoDecoder(String mimeType, Surface surface, int viewWidth, int viewHeight) {
        try {
            mMediaCodec = MediaCodec.createDecoderByType(mimeType);
        } catch (IOException e) {
            Log.e(TAG, Log.getStackTraceString(e));
            mMediaCodec = null;
            return;
        }
        if (surface == null) {
            // Release the codec instead of leaking it when no Surface is supplied.
            mMediaCodec.release();
            mMediaCodec = null;
            return;
        }

        this.mViewWidth = viewWidth;
        this.mViewHeight = viewHeight;
        this.mSurface = surface;

        mVideoDecoderHandlerThread.start();
        mVideoDecoderHandler = new Handler(mVideoDecoderHandlerThread.getLooper());

        mMediaFormat = MediaFormat.createVideoFormat(mimeType, mViewWidth, mViewHeight);
        mMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
        mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1920 * 1280);
        mMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        mMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
    }

    public void setEncoder(VideoEncoder videoEncoder) {
        this.mVideoEncoder = videoEncoder;
    }

    public void startDecoder() {
        if (mMediaCodec != null && mSurface != null) {
            mMediaCodec.setCallback(mCallback, mVideoDecoderHandler);
            mMediaCodec.configure(mMediaFormat, mSurface, null, CONFIGURE_FLAG_DECODE);
            mMediaCodec.start();
        } else {
            throw new IllegalStateException("startDecoder failed: please check that the MediaCodec was initialized correctly");
        }
    }

    public void stopDecoder() {
        if (mMediaCodec != null) {
            mMediaCodec.stop();
        }
    }

    /**
     * Release all resources used by the decoder.
     */
    public void release() {
        if (mMediaCodec != null) {
            mMediaCodec.release();
            mMediaCodec = null;
        }
    }
}
To verify that the encode/decode path is correct, I built a demo: on a Huawei Honor V8, open the camera preview, run the preview frames through the encoder and decoder, and display the result in a TextureView on the same screen. The implementation is as follows:
1. The UI layout: the first TextureView shows the camera preview, the second shows the decoded output.
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context="com.zdragon.videoio.MainActivity">

    <TextureView
        android:id="@+id/camera"
        android:layout_width="wrap_content"
        android:layout_height="0dp"
        android:layout_weight="1" />

    <TextureView
        android:id="@+id/decode"
        android:layout_width="wrap_content"
        android:layout_height="0dp"
        android:layout_weight="1" />

</LinearLayout>
2. The Activity implementation:
package com.zdragon.videoio;

import android.Manifest;
import android.content.pm.PackageManager;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.os.Bundle;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.Surface;
import android.view.TextureView;

import java.io.IOException;
import java.util.List;

public class MainActivity extends AppCompatActivity {

    private final static String TAG = "VideoIO";
    private final static String MIME_FORMAT = "video/avc"; // H.264

    private TextureView mCameraTexture;
    private TextureView mDecodeTexture;
    private VideoDecoder mVideoDecoder;
    private VideoEncoder mVideoEncoder;

    private Camera mCamera;
    private int mPreviewWidth;
    private int mPreviewHeight;

    private Camera.PreviewCallback mPreviewCallBack = new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] bytes, Camera camera) {
            // Convert YV12 (Y + V + U) to I420 (Y + U + V) by swapping the chroma planes.
            byte[] i420bytes = new byte[bytes.length];
            int ySize = mPreviewWidth * mPreviewHeight;
            int uvSize = ySize / 4;
            System.arraycopy(bytes, 0, i420bytes, 0, ySize);                   // Y plane
            System.arraycopy(bytes, ySize + uvSize, i420bytes, ySize, uvSize); // U plane
            System.arraycopy(bytes, ySize, i420bytes, ySize + uvSize, uvSize); // V plane
            if (mVideoEncoder != null) {
                mVideoEncoder.inputFrameToEncoder(i420bytes);
            }
        }
    };

    private TextureView.SurfaceTextureListener mCameraTextureListener = new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
            openCamera(surfaceTexture, width, height);
            mVideoEncoder = new VideoEncoder(MIME_FORMAT, mPreviewWidth, mPreviewHeight);
            mVideoEncoder.startEncoder();
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int width, int height) {
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
            if (mVideoEncoder != null) {
                mVideoEncoder.release();
            }
            closeCamera();
            return true;
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
        }
    };

    private TextureView.SurfaceTextureListener mDecodeTextureListener = new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
            Log.d(TAG, "decode texture size: " + width + " x " + height);
            mVideoDecoder = new VideoDecoder(MIME_FORMAT, new Surface(surfaceTexture), mPreviewWidth, mPreviewHeight);
            mVideoDecoder.setEncoder(mVideoEncoder);
            mVideoDecoder.startDecoder();
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int width, int height) {
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
            mVideoDecoder.stopDecoder();
            mVideoDecoder.release();
            return true;
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
        }
    };

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        // Only proceed when the camera permission was actually granted.
        if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            initView();
        }
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
            Log.i(TAG, "Camera permission granted");
            initView();
        } else {
            ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, 1); // 1 is an arbitrary request code
        }
    }

    private void initView() {
        mCameraTexture = (TextureView) findViewById(R.id.camera);
        mDecodeTexture = (TextureView) findViewById(R.id.decode);
        mCameraTexture.setSurfaceTextureListener(mCameraTextureListener);
        mDecodeTexture.setSurfaceTextureListener(mDecodeTextureListener);
    }

    private void openCamera(SurfaceTexture texture, int width, int height) {
        if (texture == null) {
            Log.e(TAG, "openCamera needs a SurfaceTexture");
            return;
        }
        mCamera = Camera.open(0);
        try {
            mCamera.setPreviewTexture(texture);
            Camera.Parameters parameters = mCamera.getParameters();
            parameters.setPreviewFormat(ImageFormat.YV12);
            List<Camera.Size> list = parameters.getSupportedPreviewSizes();
            for (Camera.Size size : list) {
                Log.d(TAG, "supported preview size: " + size.width + " x " + size.height);
            }
            // 640x480 is assumed to be in the supported list on the test device.
            mPreviewWidth = 640;
            mPreviewHeight = 480;
            parameters.setPreviewSize(mPreviewWidth, mPreviewHeight);
            mCamera.setParameters(parameters);
            mCamera.setPreviewCallback(mPreviewCallBack);
            mCamera.startPreview();
        } catch (IOException e) {
            Log.e(TAG, Log.getStackTraceString(e));
            mCamera = null;
        }
    }

    private void closeCamera() {
        if (mCamera == null) {
            Log.e(TAG, "Camera not open");
            return;
        }
        mCamera.stopPreview();
        mCamera.setPreviewCallback(null); // stop delivering frames before releasing
        mCamera.release();
    }
}
The final result is shown in the screenshot from the original post: with this pipeline, the decoded picture tracks the camera preview almost in real time.
For the synchronous encode/decode approach, refer to the Android API documentation; the MediaCodec reference includes detailed usage examples. One caveat with synchronous mode: behavior varies across device models, and while some devices play correctly, others never show a picture. In those cases a device-specific workaround is to delay the call to start() by about 200 ms when initializing the codec.
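For reference, here is a minimal sketch of the synchronous decode loop, following the pattern from the MediaCodec documentation rather than code from this project. Assumptions: mDecoder is a decoder that has already been configured with an output Surface and started, and nextEncodedFrame() is a hypothetical helper returning the next H.264 frame (or null when none is pending).

// A minimal synchronous decode loop sketch; error handling omitted.
private void syncDecodeLoop(MediaCodec mDecoder) {
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    boolean running = true;
    while (running) {
        int inIndex = mDecoder.dequeueInputBuffer(10000); // timeout in microseconds
        if (inIndex >= 0) {
            ByteBuffer inputBuffer = mDecoder.getInputBuffer(inIndex);
            inputBuffer.clear();
            byte[] frame = nextEncodedFrame(); // hypothetical data source
            int length = 0;
            if (frame != null) {
                inputBuffer.put(frame);
                length = frame.length;
            }
            mDecoder.queueInputBuffer(inIndex, 0, length, 0, 0);
        }
        int outIndex = mDecoder.dequeueOutputBuffer(bufferInfo, 10000);
        if (outIndex >= 0) {
            // render == true pushes the decoded frame to the configured Surface.
            mDecoder.releaseOutputBuffer(outIndex, true);
        } else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            Log.d("SyncDecode", "output format: " + mDecoder.getOutputFormat());
        }
        if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            running = false; // stop when the end-of-stream flag arrives
        }
    }
}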
Project source code download:
HTTP download:
SSH download:
git@github.com:RunningWay/android.git