[Overview] Camera, OpenGL, Video, Flutter and SurfaceView

Published by 張風捷特烈 on 2019-10-11

Getting to know a class is like making a friend; reading a piece of source code is like having a first-rate conversation;
reading a framework is like witnessing a school of thought; writing a program is like creating a life;
a Git commit is a record of growth. Life may not be all that beautiful, but it can be this noble. ---- 張風捷特烈


I. About SurfaceView

Video, camera, games, Flutter: wherever high-performance rendering is needed, you will find SurfaceView. If you want high-performance rendering, SurfaceView is a hurdle you cannot avoid, and it is also a key that opens the door to video. This article walks through the following minimal examples to give you an intuitive feel for SurfaceView:

[1]. Camera preview with SurfaceView
[2]. Camera2 preview with SurfaceView
[3]. GLSurfaceView in OpenGL
[4]. Combining Camera2 with OpenGL
[5]. Combining video playback with OpenGL
[6]. The relationship between Flutter and SurfaceView



1. Opening a Camera preview with SurfaceView

SurfaceView depends on the SurfaceHolder class, so the two always appear together. Camera's setPreviewDisplay method takes a SurfaceHolder as its parameter.
The SurfaceHolder's surface is not available immediately; you need a callback to know when it is created, changed, or destroyed, so you can act accordingly.
The callback interface is SurfaceHolder.Callback. For convenience the view implements it directly here; you could of course put it in a separate class.

For the detailed walkthrough, see: Android多媒體之Camera的相關操作


public class CameraSurfaceView extends SurfaceView implements SurfaceHolder.Callback {
    private Camera camera;

    public CameraSurfaceView(Context context) {
        this(context,null);
    }

    public CameraSurfaceView(Context context, AttributeSet attrs) {
        this(context, attrs,0);
    }

    public CameraSurfaceView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        getHolder().addCallback(this);// add a callback to the SurfaceView's SurfaceHolder
    }

    //----------------- SurfaceHolder.Callback overrides ----------------------
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
        camera.setDisplayOrientation(90);
        try {
            camera.setPreviewDisplay(holder);// hand the SurfaceHolder to the Camera
            camera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {

    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        camera.stopPreview();// stop the preview
        camera.release();// release the camera
    }
    
}
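To wire this into an app, the sketch below shows a minimal hosting Activity. It is my own addition, not from the original article: the class name is illustrative, and it assumes the CAMERA permission is declared in the manifest and already granted at runtime (the legacy Camera API fails otherwise).

import android.app.Activity;
import android.os.Bundle;

public class CameraPreviewActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Use the CameraSurfaceView defined above directly as the content view;
        // the preview starts as soon as its surface is created.
        setContentView(new CameraSurfaceView(this));
    }
}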

2. Using SurfaceView with Camera2

Camera2 is not a class named Camera2; it refers to the camera system under the camera2 package. It is quite involved to use,
but simplicity has its limits and complexity has its value, and its display path here still runs through a SurfaceHolder (more precisely, the Surface it holds).

For the detailed walkthrough, see: Android多媒體之Camera2的相關操作


public class Camera2SurfaceView extends SurfaceView implements SurfaceHolder.Callback {
    private Handler mainHandler;
    private String mCameraID;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;// the camera device
    private CameraCaptureSession mCameraCaptureSession;
    private Handler childHandler;

    private CameraDevice.StateCallback mStateCallback;

    private Semaphore mCameraOpenCloseLock = new Semaphore(1);// prevents the app from exiting before the camera is closed

    public Camera2SurfaceView(Context context) {
        this(context,null);
    }

    public Camera2SurfaceView(Context context, AttributeSet attrs) {
        this(context, attrs,0);
    }

    public Camera2SurfaceView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        getHolder().addCallback(this);// add a callback to the SurfaceView's SurfaceHolder
    }

    //----------------- SurfaceHolder.Callback overrides ----------------------
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        initHandler();// initialize the thread handlers
        initCamera();// initialize the camera
        try {
            if (ActivityCompat.checkSelfPermission(getContext(), Manifest.permission.CAMERA) !=
                    PackageManager.PERMISSION_GRANTED) {
                return;
            }
            mCameraManager.openCamera(mCameraID, mStateCallback, mainHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
 
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        if (mCameraDevice != null) mCameraDevice.close();// release resources
    }
    
    private void initCamera() {
        mCameraID = "" + CameraCharacteristics.LENS_FACING_FRONT;//後攝像頭
        //獲取攝像頭管理器
        mCameraManager = (CameraManager) getContext().getSystemService(Context.CAMERA_SERVICE);

        mStateCallback = new CameraDevice.StateCallback() {
            @Override
            public void onOpened(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice = camera;
                startPreview();
            }
            @Override
            public void onDisconnected(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }
            @Override
            public void onError(@NonNull CameraDevice camera, int error) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }
        };
    }

    private void initHandler() {
        HandlerThread handlerThread = new HandlerThread("Camera2");
        handlerThread.start();
        mainHandler = new Handler(Looper.getMainLooper());// main-thread Handler
        childHandler = new Handler(handlerThread.getLooper());// worker-thread Handler
    }

    /**
     * Start the preview
     */
    private void startPreview() {
        try {
            // create the CaptureRequest.Builder needed for the preview
            final CaptureRequest.Builder reqBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            // use the SurfaceView's surface as the target of the CaptureRequest.Builder
            reqBuilder.addTarget(getHolder().getSurface());

            // create a CameraCaptureSession, which manages preview and capture requests
            CameraCaptureSession.StateCallback stateCallback = new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                            if (null == mCameraDevice) return;
                            mCameraCaptureSession = cameraCaptureSession; // the camera is ready, start showing the preview
                            try {// show the preview
                                mCameraCaptureSession.setRepeatingRequest(reqBuilder.build(), null, childHandler);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }
                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {

                        }
                    };
            mCameraDevice.createCaptureSession(Collections.singletonList(getHolder().getSurface()),
                    stateCallback, childHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
}

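One note on mCameraID: building it as "" + CameraCharacteristics.LENS_FACING_FRONT only works because that constant happens to be 0, which is usually the back camera's id. A more robust approach, sketched below with a helper name of my own, is to ask the CameraManager for an id whose lens facing matches what you want:

// Hypothetical helper (not from the article): resolve a camera id by lens facing.
private String findCameraId(CameraManager manager, int lensFacing) {
    try {
        for (String id : manager.getCameraIdList()) {
            CameraCharacteristics chars = manager.getCameraCharacteristics(id);
            Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == lensFacing) {
                return id;// first camera whose facing matches
            }
        }
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
    return null;// no matching camera found
}

Inside initCamera() you could then write mCameraID = findCameraId(mCameraManager, CameraCharacteristics.LENS_FACING_BACK); instead of concatenating a constant.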

3. Using GLSurfaceView in OpenGL

GLSurfaceView, a subclass of SurfaceView, opens a door named OpenGL.
It implements the SurfaceHolder.Callback2 interface and needs a GLSurfaceView.Renderer to be passed in.


public class TriangleGLView extends GLSurfaceView implements GLSurfaceView.Renderer {
    private  Triangle mTriangle;
    
    public TriangleGLView(Context context) {
        this(context, null);
    }

    public TriangleGLView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setEGLContextClientVersion(2);// set up an OpenGL ES 2.0 context
        setRenderer(this);// set the renderer
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mTriangle=new Triangle();
        GLES20.glClearColor(1.0f, 0.0f, 0.0f, 1.0f);//rgba
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);// GL viewport
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // clear the color buffer and the depth buffer
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT| GLES20.GL_DEPTH_BUFFER_BIT);
        mTriangle.draw();
    }
}

OpenGL drawing has a reputation for being intimidating; below is the simplest case, drawing a triangle.
If you are interested, take a look at my OpenGL articles; read them carefully and you can basically get started:
Android多媒體之GL-ES戰記第一集--勇者集結
Android多媒體之GL-ES戰記第二集--謎團立方
Android多媒體之GLES2戰記第三集--聖火之光
Android多媒體之GLES2戰記第四集--移形換影
Android多媒體之GLES2戰記第五集--宇宙之光
Android多媒體之GLES2戰記第六集--九層之臺

public class Triangle {
    private FloatBuffer vertexBuffer;// vertex buffer
    private final String vertexShaderCode =// vertex shader source
            "attribute vec4 vPosition;" +
                    "void main() {" +
                    "  gl_Position = vPosition;" +
                    "}";
    private final String fragmentShaderCode =// fragment shader source
            "precision mediump float;" +
                    "uniform vec4 vColor;" +
                    "void main() {" +
                    "  gl_FragColor = vColor;" +
                    "}";
    private final int mProgram;
    private int mPositionHandle;// position handle
    private int mColorHandle;// color handle
    private final int vertexCount = sCoo.length / COORDS_PER_VERTEX;// number of vertices
    private final int vertexStride = COORDS_PER_VERTEX * 4; // 3*4=12 bytes per vertex
    // number of coordinates per vertex in the array
    static final int COORDS_PER_VERTEX = 3;
    static float sCoo[] = {   // in counter-clockwise order
            0.0f, 0.0f, 0.0f, // top
            -1.0f, -1.0f, 0.0f, // bottom left
            1.0f, -1.0f, 0.0f  // bottom right
    };
    // color, rgba
    float color[] = {0.63671875f, 0.76953125f, 0.22265625f, 1.0f};
    public Triangle() {
        // initialize the vertex byte buffer
        ByteBuffer bb = ByteBuffer.allocateDirect(sCoo.length * 4);// number of coordinates * 4 bytes per float
        bb.order(ByteOrder.nativeOrder());// use the device's native byte order
        vertexBuffer = bb.asFloatBuffer();// create a FloatBuffer from the ByteBuffer
        vertexBuffer.put(sCoo);// add the coordinates to the FloatBuffer
        vertexBuffer.position(0);// start reading from the first coordinate
        int vertexShader = loadShader(
                GLES20.GL_VERTEX_SHADER,// vertex shader
                vertexShaderCode);
        int fragmentShader = loadShader
                (GLES20.GL_FRAGMENT_SHADER,// fragment shader
                        fragmentShaderCode);
        mProgram = GLES20.glCreateProgram();// create an empty OpenGL ES program
        GLES20.glAttachShader(mProgram, vertexShader);// attach the vertex shader
        GLES20.glAttachShader(mProgram, fragmentShader);// attach the fragment shader
        GLES20.glLinkProgram(mProgram);// link the program so it can be executed
    }

    private int loadShader(int type, String shaderCode) {
        int shader = GLES20.glCreateShader(type);// create the shader
        GLES20.glShaderSource(shader, shaderCode);// add the shader source code
        GLES20.glCompileShader(shader);// compile
        return shader;
    }

    public void draw() {
        // add the program to the OpenGL ES environment
        GLES20.glUseProgram(mProgram);
        // get a handle to the vertex shader's vPosition member
        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
        // enable the handle to the triangle vertices
        GLES20.glEnableVertexAttribArray(mPositionHandle);
        // prepare the triangle coordinate data
        GLES20.glVertexAttribPointer(
                mPositionHandle, COORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false,
                vertexStride, vertexBuffer);
        // get a handle to the fragment shader's vColor member
        mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
        // set the triangle's color
        GLES20.glUniform4fv(mColorHandle, 1, color, 0);
        // draw the triangle
        GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
        // disable the vertex array
        GLES20.glDisableVertexAttribArray(mPositionHandle);
    }
}
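loadShader above never checks whether compilation succeeded, so a typo in the shader source fails silently and you just see a blank screen. Below is a small sketch of a checked variant that could live inside Triangle; it is my own addition, using the standard GLES20 query calls plus android.util.Log for the error output.

private int loadShaderChecked(int type, String shaderCode) {
    int shader = GLES20.glCreateShader(type);// create the shader
    GLES20.glShaderSource(shader, shaderCode);// add the source
    GLES20.glCompileShader(shader);// compile
    int[] compiled = new int[1];
    GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
    if (compiled[0] == 0) {
        // log the compiler error and return 0 so the failure is visible
        Log.e("Triangle", "Shader compile failed: " + GLES20.glGetShaderInfoLog(shader));
        GLES20.glDeleteShader(shader);
        return 0;
    }
    return shader;
}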

4. Using OpenGL with the camera

Now let's connect the dots: the camera needs a surface to render into, and GLSurfaceView is a SurfaceView, so they look like a perfect match. But good things take work, and it is not quite as simple as it seems...
In the CameraGLView class we create a SurfaceTexture and bind an OpenGL texture to it;
a Surface can then be constructed from that SurfaceTexture and handed to Camera2, and the path is complete.

public class CameraGLView extends GLSurfaceView  implements GLSurfaceView.Renderer {

    private CameraDrawer cameraDrawer;
    public SurfaceTexture surfaceTexture;
    private int[] textureId = new int[1];

    //----------------------------相機操作------------------------------
    private Handler mainHandler;
    private Handler childHandler;
    private String mCameraID;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;// the camera device
    private CameraCaptureSession mCameraCaptureSession;

    private CameraDevice.StateCallback mStateCallback;
    private Size mVideoSize;
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);// prevents the app from exiting before the camera is closed
    private Surface surface;


    public CameraGLView(Context context) {
        this(context,null);
    }

    public CameraGLView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setEGLContextClientVersion(3);// set up an OpenGL ES 3.0 context
        setRenderer(this);// set the renderer
    }

    private void initHandler() {
        HandlerThread handlerThread = new HandlerThread("Camera2");
        handlerThread.start();
        mainHandler = new Handler(Looper.getMainLooper());// main-thread Handler
        childHandler = new Handler(handlerThread.getLooper());// worker-thread Handler
    }

    private void initCamera() {

        mCameraID = "" + CameraCharacteristics.LENS_FACING_FRONT;//後攝像頭
        //獲取攝像頭管理器
        mCameraManager = (CameraManager) getContext().getSystemService(Context.CAMERA_SERVICE);
        mVideoSize=getCameraOutputSizes(mCameraManager,mCameraID,SurfaceTexture.class).get(0);

        mStateCallback = new CameraDevice.StateCallback() {
            @Override
            public void onOpened(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice = camera;
                startPreview();
            }
            @Override
            public void onDisconnected(@NonNull CameraDevice camera) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }
            @Override
            public void onError(@NonNull CameraDevice camera, int error) {
                mCameraOpenCloseLock.release();
                mCameraDevice.close();
            }
        };
    }

    /**
     * Get the output sizes the given camera supports for the given output class, sorted in descending order
     */
    public List<Size> getCameraOutputSizes(CameraManager cameraManager, String cameraId, Class clz){
        try {
            CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
            StreamConfigurationMap configs = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            List<Size> sizes = Arrays.asList(configs.getOutputSizes(clz));
            Collections.sort(sizes, (o1, o2) -> o1.getWidth() * o1.getHeight() - o2.getWidth() * o2.getHeight());
            Collections.reverse(sizes);
            return sizes;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return null;
    }
    /**
     * Start the preview
     */
    private void startPreview() {
        surfaceTexture.setDefaultBufferSize(mVideoSize.getWidth(), mVideoSize.getHeight());
        surfaceTexture.setOnFrameAvailableListener(surfaceTexture -> requestRender());
        surface = new Surface(surfaceTexture);

        try {
            // create the CaptureRequest.Builder needed for the preview
            final CaptureRequest.Builder reqBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            // use the SurfaceTexture-backed Surface as the target of the CaptureRequest.Builder
            reqBuilder.addTarget(surface);
            // create a CameraCaptureSession, which manages preview and capture requests
            CameraCaptureSession.StateCallback stateCallback = new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                    if (null == mCameraDevice) return;
                    mCameraCaptureSession = cameraCaptureSession; // the camera is ready, start showing the preview
                    try {// show the preview
                        mCameraCaptureSession.setRepeatingRequest(reqBuilder.build(), null, childHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }
                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {

                }
            };
            mCameraDevice.createCaptureSession(Collections.singletonList(surface), stateCallback, childHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
    
    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        cameraDrawer = new CameraDrawer(getContext());
        // create a texture object
        GLES30.glGenTextures(textureId.length, textureId, 0);
        // bind the texture object to a SurfaceTexture
        surfaceTexture = new SurfaceTexture(textureId[0]);

        initHandler();// initialize the thread handlers
        initCamera();// initialize the camera
        try {
            if (ActivityCompat.checkSelfPermission(getContext(), Manifest.permission.CAMERA) !=
                    PackageManager.PERMISSION_GRANTED) {
                return;
            }
            mCameraManager.openCamera(mCameraID, mStateCallback, mainHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES30.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        surfaceTexture.updateTexImage();
        GLES30.glClear(GLES30.GL_COLOR_BUFFER_BIT);
        cameraDrawer.draw(textureId[0]);
    }
}

The texture is drawn by the CameraDrawer class, which works very much like drawing the triangle: the coloring is done by shaders.

Fragment shader: camera.fsh

#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision mediump float;

in vec2 vTexCoord;
out vec4 outColor;
uniform samplerExternalOES sTexture;

void main(){
    outColor = texture(sTexture, vTexCoord);
}

Vertex shader: camera.vsh

#version 300 es
in vec4 aPosition;
in vec2 aTexCoord;

out vec2 vTexCoord;

void main(){
    gl_Position = aPosition;
    vTexCoord = aTexCoord;
}

The drawer: CameraDrawer

public class CameraDrawer {
    private static final String VERTEX_ATTRIB_POSITION = "aPosition";// must match camera.vsh
    private static final int VERTEX_ATTRIB_POSITION_SIZE = 3;
    private static final String VERTEX_ATTRIB_TEXTURE_POSITION = "aTexCoord";// must match camera.vsh
    private static final int VERTEX_ATTRIB_TEXTURE_POSITION_SIZE = 2;
    private static final String UNIFORM_TEXTURE = "sTexture";// must match camera.fsh

    private float[] vertex = {
            -1f, 1f, 0.0f,// top left
            -1f, -1f, 0.0f,// bottom left
            1f, -1f, 0.0f,// bottom right
            1f, 1f, 0.0f// top right
    };

    // texture coordinates (s, t); the t axis runs opposite to the vertex y axis
    public float[] textureCoord = {
            0.0f,1.0f,
            1.0f,1.0f,
            1.0f,0.0f,
            0.0f,0.0f
    };

    private FloatBuffer vertexBuffer;
    private FloatBuffer textureCoordBuffer;
    private int program;
    
    private Context context;


    public CameraDrawer(Context context) {
        this.context = context;

        initVertexAttrib(); // initialize the vertex data

        program = GLUtil.loadAndInitProgram( this.context,"camera.vsh","camera.fsh");
        GLES30.glClearColor(1.0f, 1.0f, 1.0f, 0.0f);
    }

    private void initVertexAttrib() {
        textureCoordBuffer = GLUtil.getFloatBuffer(textureCoord);
        vertexBuffer = GLUtil.getFloatBuffer(vertex);
    }

    public void draw(int textureId){
        GLES30.glUseProgram(program);
        // look up the attribute handles
        int vertexLoc = GLES30.glGetAttribLocation(program, VERTEX_ATTRIB_POSITION);
        int textureLoc = GLES30.glGetAttribLocation(program, VERTEX_ATTRIB_TEXTURE_POSITION);

        GLES30.glEnableVertexAttribArray(vertexLoc);
        GLES30.glEnableVertexAttribArray(textureLoc);

        GLES30.glVertexAttribPointer(vertexLoc,
                VERTEX_ATTRIB_POSITION_SIZE,
                GLES30.GL_FLOAT,
                false,
                0,
                vertexBuffer);

        GLES30.glVertexAttribPointer(textureLoc,
                VERTEX_ATTRIB_TEXTURE_POSITION_SIZE,
                GLES30.GL_FLOAT,
                false,
                0,
                textureCoordBuffer);

        // bind the texture
        GLES30.glActiveTexture( GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,  GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_NEAREST);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,  GLES30.GL_TEXTURE_MAG_FILTER,  GLES30.GL_LINEAR);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,  GLES30.GL_TEXTURE_WRAP_S,  GLES30.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,  GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);
        int uTextureLoc =  GLES30.glGetUniformLocation(program, UNIFORM_TEXTURE);
        GLES30.glUniform1i(uTextureLoc,0);
        // draw
        GLES30.glDrawArrays( GLES30.GL_TRIANGLE_FAN,0,vertex.length / 3);
        // disable the vertex arrays
        GLES30.glDisableVertexAttribArray(vertexLoc);
        GLES30.glDisableVertexAttribArray(textureLoc);
    }
}
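The GLUtil helper used above is not listed in the article. To keep the example self-contained, here is a rough sketch of what its two calls might look like; the implementation details are my assumptions based purely on how they are used (shader sources loaded from the assets folder, a native-order FloatBuffer):

import android.content.Context;
import android.opengl.GLES30;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class GLUtil {
    // Wrap a float array in a native-order FloatBuffer, positioned at 0.
    public static FloatBuffer getFloatBuffer(float[] data) {
        FloatBuffer buffer = ByteBuffer.allocateDirect(data.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(data);
        buffer.position(0);
        return buffer;
    }

    // Read the two shader sources from assets, compile them and link a program.
    public static int loadAndInitProgram(Context context, String vshAsset, String fshAsset) {
        int vertexShader = compileShader(GLES30.GL_VERTEX_SHADER, readAsset(context, vshAsset));
        int fragmentShader = compileShader(GLES30.GL_FRAGMENT_SHADER, readAsset(context, fshAsset));
        int program = GLES30.glCreateProgram();
        GLES30.glAttachShader(program, vertexShader);
        GLES30.glAttachShader(program, fragmentShader);
        GLES30.glLinkProgram(program);
        return program;
    }

    private static int compileShader(int type, String source) {
        int shader = GLES30.glCreateShader(type);
        GLES30.glShaderSource(shader, source);
        GLES30.glCompileShader(shader);
        return shader;
    }

    private static String readAsset(Context context, String name) {
        try (InputStream is = context.getAssets().open(name);
             ByteArrayOutputStream bos = new ByteArrayOutputStream()) {
            byte[] buf = new byte[1024];
            int len;
            while ((len = is.read(buf)) != -1) bos.write(buf, 0, len);
            return bos.toString("UTF-8");
        } catch (IOException e) {
            throw new RuntimeException("Failed to read asset " + name, e);
        }
    }
}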

If you know nothing about OpenGL, you may look at this and cry: all that trouble just for a preview? No thanks, I'm out. Yes, it is troublesome, and it gets even more troublesome later. But what you won't do, someone else will. You dodge the hassle while others dig in, and that is exactly where the gap between people comes from.
What I understand least are people who are afraid of hassle yet go around asking for study tips. As long as you are not afraid of hassle, and are willing to dig into problems, read source code, and debug, nothing can stop you. Is anything in the world truly hard or easy? Do it, and the hard becomes easy; don't, and the easy becomes hard.

OpenGL opens a big door: with shaders you can do a great deal, filters, texturing, coloring, transforms... you could even say, give me somewhere to use a shader and I will create a world for you.



5. Using OpenGL in video playback

If you know a little about video playback, you know that MediaPlayer works hand in glove with a Surface. So, by the same idea, video playback can be combined with OpenGL and the frames reshaped with shaders.
The approach is almost identical: in GLVideoView we bind a texture to a SurfaceTexture and create a Surface from it for the MediaPlayer.
For MediaPlayer-based video playback, see: Android多媒體之視訊播放器(基於MediaPlayer)


public class GLVideoView extends GLSurfaceView implements GLSurfaceView.Renderer, 
        SurfaceTexture.OnFrameAvailableListener, MediaPlayer.OnVideoSizeChangedListener  {
    private float[] sTMatrix = new float[16];
    private final float[] projectionMatrix=new float[16];

    private SurfaceTexture surfaceTexture;
    private MediaPlayer mediaPlayer;
    private VideoDrawer videoDrawer;
    private int textureId;
    private boolean updateSurface;
    private boolean playerPrepared;

    private int screenWidth,screenHeight;

    public GLVideoView(Context context) {
        super(context);
    }

    public GLVideoView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setEGLContextClientVersion(2);// set up an OpenGL ES 2.0 context
        setRenderer(this);// set the renderer
        initPlayer();
    }

    private void initPlayer() {
        mediaPlayer=new MediaPlayer();
        try{
            mediaPlayer.setDataSource(getContext(), Uri.parse("/sdcard/toly/sh.mp4"));
        }catch (IOException e){
            e.printStackTrace();
        }
        mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mediaPlayer.setLooping(true);

        mediaPlayer.setOnVideoSizeChangedListener(this);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        videoDrawer=new VideoDrawer(getContext());
        playerPrepared=false;
        synchronized(this) {
            updateSurface = false;
        }

        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);

        textureId = textures[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);

        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this);

        Surface surface = new Surface(surfaceTexture);
        mediaPlayer.setSurface(surface);
        surface.release();
        if (!playerPrepared){
            try {
                mediaPlayer.prepare();
                playerPrepared = true;
            } catch (IOException e) {
                e.printStackTrace();
            }
            mediaPlayer.start();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        screenWidth=width; screenHeight=height;
        GLES20.glViewport(0,0,screenWidth,screenHeight);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        synchronized (this){
            if (updateSurface){
                surfaceTexture.updateTexImage();
                surfaceTexture.getTransformMatrix(sTMatrix);
                updateSurface = false;
            }
        }
        videoDrawer.draw(textureId,projectionMatrix, sTMatrix);
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        updateSurface = true;
    }

    @Override
    public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
        // width/height here are the video's dimensions, not the screen's
        updateProjection(width, height);
    }

    private void updateProjection(int videoWidth, int videoHeight){
        float screenRatio=(float)screenWidth/screenHeight;
        float videoRatio=(float)videoWidth/videoHeight;
        if (videoRatio>screenRatio){
            Matrix.orthoM(projectionMatrix,0,
                    -1f,1f,-videoRatio/screenRatio,videoRatio/screenRatio,
                    -1f,1f);
        }else {
            Matrix.orthoM(projectionMatrix,0,
                    -screenRatio/videoRatio,screenRatio/videoRatio,-1f,1f,
                    -1f,1f);
        }
    }
}
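One thing the snippet above glosses over is lifecycle handling. Below is a minimal sketch of what the hosting Activity might do; it is my own addition, and glVideoView.release() is a hypothetical method you would add to GLVideoView to release its MediaPlayer.

@Override
protected void onResume() {
    super.onResume();
    glVideoView.onResume();// resume the GL rendering thread
}

@Override
protected void onPause() {
    glVideoView.onPause();// pause the GL rendering thread
    super.onPause();
}

@Override
protected void onDestroy() {
    glVideoView.release();// hypothetical: releases the MediaPlayer inside GLVideoView
    super.onDestroy();
}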

The shaders (the fragment shader applies a simple luminance threshold and forces the red channel to 1.0):

---->[video.fsh]----
#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 vTexCoord;
uniform samplerExternalOES sTexture;
void main() {
    vec3 color = texture2D(sTexture, vTexCoord).rgb;
    float threshold = 0.8;// threshold
    float mean = (color.r + color.g + color.b) / 3.0;
    color.r = color.g = color.b = mean >= threshold ? 1.0 : 0.0;
    gl_FragColor = vec4(1.0, color);// force the red channel to 1.0
}

---->[video.vsh]----
attribute vec4 aPosition;// vertex position
attribute vec4 aTexCoord;// texture coordinate
varying vec2 vTexCoord;
uniform mat4 uMatrix;
uniform mat4 uSTMatrix;
void main() {
    vTexCoord = (uSTMatrix * aTexCoord).xy;
    gl_Position = uMatrix*aPosition;
}

The shading and drawing are then done by VideoDrawer:

public class VideoDrawer {
    private Context context;
    private int aPositionLocation;
    private int programId;
    private FloatBuffer vertexBuffer;
    private final float[] vertexData = {
            1f, -1f, 0f,
            -1f, -1f, 0f,
            1f, 1f, 0f,
            -1f, 1f, 0f
    };

    private int uMatrixLocation;

    private final float[] textureVertexData = {
            1f, 0f,
            0f, 0f,
            1f, 1f,
            0f, 1f
    };
    private FloatBuffer textureVertexBuffer;
    private int uTextureSamplerLocation;
    private int aTextureCoordLocation;
    private int uSTMMatrixHandle;

    public VideoDrawer(Context context) {
        this.context = context;

        vertexBuffer = ByteBuffer.allocateDirect(vertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(vertexData);
        vertexBuffer.position(0);

        textureVertexBuffer = ByteBuffer.allocateDirect(textureVertexData.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(textureVertexData);
        textureVertexBuffer.position(0);

        programId = GLUtil.loadAndInitProgram(context, "video.vsh", "video.fsh");
        aPositionLocation = GLES20.glGetAttribLocation(programId, "aPosition");

        uMatrixLocation = GLES20.glGetUniformLocation(programId, "uMatrix");
        uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
        uTextureSamplerLocation = GLES20.glGetUniformLocation(programId, "sTexture");
        aTextureCoordLocation = GLES20.glGetAttribLocation(programId, "aTexCoord");
    }

    public void draw(int textureId, float[] projectionMatrix, float[] sTMatrix) {
        GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);

        GLES20.glUseProgram(programId);
        GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, projectionMatrix, 0);
        GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, sTMatrix, 0);

        vertexBuffer.position(0);
        GLES20.glEnableVertexAttribArray(aPositionLocation);
        GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false,
                12, vertexBuffer);

        textureVertexBuffer.position(0);
        GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
        GLES20.glVertexAttribPointer(aTextureCoordLocation, 2, GLES20.GL_FLOAT, false, 8, textureVertexBuffer);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        // texture parameters apply to the currently bound texture, so set them after binding
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
                GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glUniform1i(uTextureSamplerLocation, 0);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }
}
6. Flutter and SurfaceView

If you have looked into how Flutter is implemented, FlutterView should be familiar. On the Android side,
all of Flutter's UI is drawn inside a FlutterView, and FlutterView extends SurfaceView.
That alone shows how capable SurfaceView is.

public class FlutterView extends SurfaceView 
                implements BinaryMessenger, TextureRegistry {

Since it is a SurfaceView, the familiar pieces are all there: the callback, the SurfaceHolder, and so on.
Among its fields you will find mSurfaceCallback and nextTextureId; do they look familiar?
mSurfaceCallback is created right in the constructor, and its surfaceCreated, surfaceChanged and surfaceDestroyed forward to the engine.

public class FlutterView extends SurfaceView implements BinaryMessenger, TextureRegistry {
    private static final String TAG = "FlutterView";
    //...
    private final Callback mSurfaceCallback;
    //...
    private final AtomicLong nextTextureId;

    // in the constructor
    this.mSurfaceCallback = new Callback() {
                public void surfaceCreated(SurfaceHolder holder) {
                    FlutterView.this.assertAttached();
                    FlutterView.this.mNativeView.getFlutterJNI().onSurfaceCreated(holder.getSurface());
                }

                public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
                    FlutterView.this.assertAttached();
                    FlutterView.this.mNativeView.getFlutterJNI().onSurfaceChanged(width, height);
                }

                public void surfaceDestroyed(SurfaceHolder holder) {
                    FlutterView.this.assertAttached();
                    FlutterView.this.mNativeView.getFlutterJNI().onSurfaceDestroyed();
                }
            };
            this.getHolder().addCallback(this.mSurfaceCallback);

The mSurfaceCallback is removed again when the view is detached or destroyed:

    public FlutterNativeView detach() {
        if (!this.isAttached()) {
            return null;
        } else {
            this.getHolder().removeCallback(this.mSurfaceCallback);
            this.mNativeView.detachFromFlutterView();
            FlutterNativeView view = this.mNativeView;
            this.mNativeView = null;
            return view;
        }
    }
    public void destroy() {
        if (this.isAttached()) {
            this.getHolder().removeCallback(this.mSurfaceCallback);
            this.mNativeView.destroy();
            this.mNativeView = null;
        }
    }

  • About the SurfaceTexture instance

It is built through the inner class SurfaceTextureRegistryEntry, and setOnFrameAvailableListener appeared above as well.
Getting on nodding terms with these names pays off: at least when you run into them you know what they are doing.

public SurfaceTextureEntry createSurfaceTexture() {
    SurfaceTexture surfaceTexture = new SurfaceTexture(0);
    surfaceTexture.detachFromGLContext();
    FlutterView.SurfaceTextureRegistryEntry entry = new FlutterView.SurfaceTextureRegistryEntry(this.nextTextureId.getAndIncrement(), surfaceTexture);
    this.mNativeView.getFlutterJNI().registerTexture(entry.id(), surfaceTexture);
    return entry;
}

final class SurfaceTextureRegistryEntry implements SurfaceTextureEntry {
    private final long id;
    private final SurfaceTexture surfaceTexture;
    private boolean released;
    private OnFrameAvailableListener onFrameListener = new OnFrameAvailableListener() {
        public void onFrameAvailable(SurfaceTexture texture) {
            if (!SurfaceTextureRegistryEntry.this.released && FlutterView.this.mNativeView != null) {
                FlutterView.this.mNativeView.getFlutterJNI().markTextureFrameAvailable(SurfaceTextureRegistryEntry.this.id);
            }
        }
    };
    SurfaceTextureRegistryEntry(long id, SurfaceTexture surfaceTexture) {
        this.id = id;
        this.surfaceTexture = surfaceTexture;
        if (VERSION.SDK_INT >= 21) {
            this.surfaceTexture.setOnFrameAvailableListener(this.onFrameListener, new Handler());
        } else {
            this.surfaceTexture.setOnFrameAvailableListener(this.onFrameListener);
        }
    }
    public SurfaceTexture surfaceTexture() {
        return this.surfaceTexture;
    }
    public long id() {
        return this.id;
    }
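To see the consumer side of this registry, this is roughly how a plugin or the engine's texture plumbing uses it: ask the TextureRegistry for an entry, feed frames into the returned SurfaceTexture (for example through a Surface), and pass the entry's id to a Texture widget on the Dart side. A hedged sketch (variable names are mine, the frame size is arbitrary):

TextureRegistry.SurfaceTextureEntry entry = textureRegistry.createSurfaceTexture();
SurfaceTexture surfaceTexture = entry.surfaceTexture();
surfaceTexture.setDefaultBufferSize(1280, 720);// match the producer's frame size
Surface surface = new Surface(surfaceTexture);// hand this Surface to a camera or player
long textureId = entry.id();// give this id to Texture(textureId: ...) on the Dart side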

As we saw earlier, what matters most about a SurfaceTexture is binding it to a texture; in Flutter that is done natively, behind FlutterJNI's nativeRegisterTexture method.

---->[io.flutter.embedding.engine.FlutterJNI#registerTexture]----
    @UiThread
    public void registerTexture(long textureId, @NonNull SurfaceTexture surfaceTexture) {
        this.ensureRunningOnMainThread();
        this.ensureAttachedToNative();
        this.nativeRegisterTexture(this.nativePlatformViewId, textureId, surfaceTexture);
    }

---->[io.flutter.embedding.engine.FlutterJNI#nativeRegisterTexture]----
private native void nativeRegisterTexture(long var1, long var3, @NonNull SurfaceTexture var5);

In the past I would have given up here, but now we can chase it a little further. First we need to find where the C++ implementation of nativeRegisterTexture lives.
To read the C++ side of FlutterJNI you need to download the flutter engine source from GitHub.
Location: engine-master/shell/platform/android/platform_view_android_jni.h

static void RegisterTexture(JNIEnv* env,
                            jobject jcaller,
                            jlong shell_holder,
                            jlong texture_id,
                            jobject surface_texture) {
  ANDROID_SHELL_HOLDER->GetPlatformView()->RegisterExternalTexture(
      static_cast<int64_t>(texture_id),                        //
      fml::jni::JavaObjectWeakGlobalRef(env, surface_texture)  //
  );
}

bool RegisterApi(JNIEnv* env) {
  static const JNINativeMethod flutter_jni_methods[] = {
     {
          .name = "nativeRegisterTexture",
          .signature = "(JJLandroid/graphics/SurfaceTexture;)V",
          .fnPtr = reinterpret_cast<void*>(&RegisterTexture),
      },
      //...

  if (env->RegisterNatives(g_flutter_jni_class->obj(), flutter_jni_methods,
                           fml::size(flutter_jni_methods)) != 0) {
    FML_LOG(ERROR) << "Failed to RegisterNatives with FlutterJNI";
    return false;
  }

And just like that we have picked up another way to register JNI methods... not a bad deal. What is a good way to learn? Look more, think more, and knowledge will meet you when you least expect it.

bool PlatformViewAndroid::Register(JNIEnv* env) {
  if (env == nullptr) {
    FML_LOG(ERROR) << "No JNIEnv provided";
    return false;
  }

//...
// FindClass resolves the FlutterJNI Java class, the one we were just reading
  g_flutter_jni_class = new fml::jni::ScopedJavaGlobalRef<jclass>(
      env, env->FindClass("io/flutter/embedding/engine/FlutterJNI"));
  if (g_flutter_jni_class->is_null()) {
    FML_LOG(ERROR) << "Failed to find FlutterJNI Class.";
    return false;
  }

//...
  return RegisterApi(env);
}

We will stop here; the deeper dive deserves a post of its own someday. By now you should have an intuitive picture of SurfaceView. Here is the overview one last time:

[Diagram: Camera, OpenGL, Video, Flutter and SurfaceView]

With that, my part is done. The torch now passes to 何時夕: Android繪製機制以及Surface家族原始碼全解析
It is the best explanation of the Surface family I have seen so far; I recommend reading it until it truly sinks in.


That is it for this article. The road is long; until we meet again.
I am 張風捷特烈. If you would like to discuss anything, feel free to leave a comment, or add me on WeChat: zdl1994328
