ijkplayer: An In-Depth Look at Its Frame-Dropping Strategy

Ranchok, published 2019-04-28

1. A tester handed over a video and noticed it played differently in ijkplayer than in the system player (MediaPlayer): with ijk the playback looked slightly choppy, not as smooth as the system player.

Looking into it, the cause turned out to be simple: our framedrop value was set to 5, and changing it to 1 made the two players feel roughly the same. (Dropping 5 frames in a row is already perceptible to the human eye!)
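
For reference, this drop limit is exposed as the "framedrop" player option. Below is a minimal sketch of setting it, assuming ijkplayer's C API as declared in ijkplayer.h (ijkmp_set_option_int and IJKMP_OPT_CATEGORY_PLAYER; on Android the Java equivalent is IjkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "framedrop", 1)):

#include "ijkplayer/ijkplayer.h"

// A hedged sketch, not production code: cap consecutive dropped frames at 1
// instead of 5. The option name "framedrop" and the category constant are
// taken from ijkplayer's headers; verify them against your version.
static void configure_framedrop(IjkMediaPlayer *mp)
{
    ijkmp_set_option_int(mp, IJKMP_OPT_CATEGORY_PLAYER, "framedrop", 1);
}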

2. On a Snapdragon 660 device, a 4K (30 fps) video would not play properly: the decoder only produced about 20 frames per second, and only about 4 frames per second were actually displayed, causing stutter and audio/video desync.

It turned out that many frames of this video decoded slowly on the Qualcomm device, so the video clock constantly lagged the audio clock. The hardware-decode frame-drop check therefore kept concluding "video is behind audio" and dropped frames continuously, which is the stutter we saw. (At roughly 20 fps of decode throughput against a 30 fps stream, the video falls about a third of a second further behind for every second of playback, so the drop condition never stops firing.)

How Frame Dropping Works

First we need to be clear about where frames can be dropped, and which frames to drop.

Frames can be dropped either before decoding or after decoding.

Dropping before decoding has to respect the frame type: B- and P-frames can be dropped, but dropping an I-frame means discarding the entire GOP, otherwise the picture corrupts.
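
Neither ffplay nor ijk takes this pre-decode path for the framedrop option, but FFmpeg can do type-based dropping for you: the AVCodecContext.skip_frame field tells the decoder itself which frame types to skip. A minimal sketch using the standard AVDISCARD_* values:

#include <libavcodec/avcodec.h>

// Minimal sketch of pre-decode dropping: rather than parsing frame types
// ourselves, ask the decoder to skip them.
static void set_predecode_drop(AVCodecContext *avctx, int aggressive)
{
    if (!aggressive)
        avctx->skip_frame = AVDISCARD_BIDIR;  // skip B-frames only; nothing references them, so no artifacts
    else
        avctx->skip_frame = AVDISCARD_NONKEY; // keep only keyframes; everything between two I-frames is discarded
}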

Dropping after decoding does not depend on the frame type: the data is already decoded (e.g. YUV), so we can simply compare the frame's pts against the master clock to judge whether audio and video are out of sync, and drop on that basis.

1. Frame Dropping in ffplay

Let's look at ffplay's design first: ffplay drops decoded video frames.

In the video_thread decoding thread, the get_video_frame function decodes a packet into an AVFrame and then performs the drop check:

static int get_video_frame(VideoState *is, AVFrame *frame)
{
    int got_picture;
    // decode and obtain the decoded AVFrame
    if ((got_picture = decoder_decode_frame(&is->viddec, frame, NULL)) < 0)
        return -1;

    if (got_picture) {
        double dpts = NAN;
        
        if (frame->pts != AV_NOPTS_VALUE)
            dpts = av_q2d(is->video_st->time_base) * frame->pts; // convert the video pts to seconds, i.e. the current position
        // sample aspect ratio of the video
        frame->sample_aspect_ratio = av_guess_sample_aspect_ratio(is->ic, is->video_st, frame);
        // framedrop forced on (> 0), or framedrop enabled and the master clock is not the video clock
        if (framedrop>0 || (framedrop && get_master_sync_type(is) != AV_SYNC_VIDEO_MASTER)) {
        
            if (frame->pts != AV_NOPTS_VALUE) {
                // frame_last_filter_delay defaults to 0
                // diff < 0: video is behind audio -> drop the frame
                // diff > 0: video is ahead of audio -> keep it
                double diff = dpts - get_master_clock(is);
                // AV_NOSYNC_THRESHOLD: sync threshold; if the gap is too large, no correction (and no dropping) is attempted
                if (!isnan(diff) && fabs(diff) < AV_NOSYNC_THRESHOLD &&
                    diff - is->frame_last_filter_delay < 0 &&
                    is->viddec.pkt_serial == is->vidclk.serial &&
                    is->videoq.nb_packets) {
                    is->frame_drops_early++;
                    av_frame_unref(frame); // drop the frame
                    got_picture = 0;
                }
            }
        }
    }

    return got_picture;
}
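
Distilled from the function above, the early-drop decision reduces to the following predicate (a self-contained restatement for clarity, not additional ffplay code):

#include <math.h>
#include <stdbool.h>

#define AV_NOSYNC_THRESHOLD 10.0  /* same value ffplay uses, in seconds */

// dpts: frame pts in seconds; master: master clock (usually audio) in seconds;
// last_filter_delay: is->frame_last_filter_delay, 0 unless avfilter is in use.
static bool should_drop_early(double dpts, double master, double last_filter_delay)
{
    double diff = dpts - master;               // < 0 means the video is behind the master clock
    return !isnan(diff)
        && fabs(diff) < AV_NOSYNC_THRESHOLD    // give up on syncing when the gap is huge
        && diff - last_filter_delay < 0;       // the frame would be shown late, so drop it
}

For example, dpts = 10.000 s against an audio clock of 10.040 s gives diff = -0.040 < 0, so the frame is dropped. The serial and nb_packets checks in the real code additionally guard against stale frames after a seek and against an empty packet queue.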

2. Frame Dropping in ijk

ijk likewise drops decoded video frames; the logic splits into a hardware-decode path and a software-decode path.

2.1 Hardware-Decode Frame Dropping

In ffpipenode_android_mediacodec_vdec.c, the func_run_sync function implements the whole hardware-decoding loop.

/**
 * Hardware-decode processing loop
 **/
static int func_run_sync(IJKFF_Pipenode *node)
{
    JNIEnv                *env      = NULL;
    IJKFF_Pipenode_Opaque *opaque   = node->opaque;
    FFPlayer              *ffp      = opaque->ffp;
    VideoState            *is       = ffp->is;
    Decoder               *d        = &is->viddec;
    PacketQueue           *q        = d->queue;
    int                    ret      = 0;
    int                    dequeue_count = 0;
    AVFrame               *frame    = NULL;
    int                    got_frame = 0;
    AVRational             tb         = is->video_st->time_base;
    AVRational             frame_rate = av_guess_frame_rate(is->ic, is->video_st, NULL);
    double                 duration;
    double                 pts;

    if (!opaque->acodec) {
        return ffp_video_thread(ffp);
    }

    if (JNI_OK != SDL_JNI_SetupThreadEnv(&env)) {
        ALOGE("%s: SetupThreadEnv failed\n", __func__);
        return -1;
    }

    frame = av_frame_alloc();
    if (!frame)
        goto fail;
    // create the input thread enqueue_thread_func, which feeds packets into MediaCodec
    opaque->enqueue_thread = SDL_CreateThreadEx(&opaque->_enqueue_thread, enqueue_thread_func, node, "amediacodec_input_thread");
    if (!opaque->enqueue_thread) {
        ALOGE("%s: SDL_CreateThreadEx failed\n", __func__);
        ret = -1;
        goto fail;
    }
    // loop, draining decoded output from MediaCodec
    while (!q->abort_request) {
        int64_t timeUs = opaque->acodec_first_dequeue_output_request ? 0 : AMC_OUTPUT_TIMEOUT_US;
        got_frame = 0;
        // pull one hardware-decoded frame
        ret = drain_output_buffer(env, node, timeUs, &dequeue_count, frame, &got_frame);
        if (opaque->acodec_first_dequeue_output_request) {
            SDL_LockMutex(opaque->acodec_first_dequeue_output_mutex);
            opaque->acodec_first_dequeue_output_request = false;
            SDL_CondSignal(opaque->acodec_first_dequeue_output_cond);
            SDL_UnlockMutex(opaque->acodec_first_dequeue_output_mutex);
        }
        // draining failed
        if (ret != 0) {
            ret = -1;
            if (got_frame && frame->opaque) // release the buffer with render == false so MediaCodec discards this frame
            {
                SDL_VoutAndroid_releaseBufferProxyP(opaque->weak_vout, (SDL_AMediaCodecBufferProxy **)&frame->opaque, false);
            }
            goto fail;
        }
        if (got_frame) {
            duration = (frame_rate.num && frame_rate.den ? av_q2d((AVRational){frame_rate.den, frame_rate.num}) : 0);
            pts = (frame->pts == AV_NOPTS_VALUE) ? NAN : frame->pts * av_q2d(tb);
            // framedrop forced on (> 0), or framedrop enabled and the master clock is not the video clock
            if (ffp->framedrop > 0 || (ffp->framedrop && ffp_get_master_sync_type(is) != AV_SYNC_VIDEO_MASTER)) {
                ffp->stat.decode_frame_count++; // decoded-frame counter
                if (frame->pts != AV_NOPTS_VALUE) {
                    double dpts = pts; // this frame's pts, in seconds
                    double diff = dpts - ffp_get_master_clock(is); // gap between this video frame and the master clock (the audio clock, if audio is master)
                    // frame_last_filter_delay is 0 here, so diff > 0 means video is ahead of audio: no drop needed;
                    // diff < 0 means video is behind audio: drop.
                    // The gap must also stay below AV_NOSYNC_THRESHOLD; beyond that, no sync correction is attempted at all.
                    if (!isnan(diff) && fabs(diff) < AV_NOSYNC_THRESHOLD &&
                        diff - is->frame_last_filter_delay < 0 &&
                        is->viddec.pkt_serial == is->vidclk.serial &&
                        is->videoq.nb_packets) { // the decoder's packet serial matches the video clock's serial, and the packet queue still has packets
                        is->frame_drops_early++;
                        is->continuous_frame_drops_early++; // starts at 0
                        if (is->continuous_frame_drops_early > ffp->framedrop) { // once the consecutive-drop count exceeds framedrop, reset it to 0 (and this frame is displayed)
                            is->continuous_frame_drops_early = 0;
                        } else {
                            ffp->stat.drop_frame_count++; // dropped-frame counter
                            // drop rate = dropped frames / decoded frames
                            ffp->stat.drop_frame_rate = (float)(ffp->stat.drop_frame_count) / (float)(ffp->stat.decode_frame_count);
                            if (frame->opaque) { // tell MediaCodec to release the buffer without rendering it
                                SDL_VoutAndroid_releaseBufferProxyP(opaque->weak_vout, (SDL_AMediaCodecBufferProxy **)&frame->opaque, false);
                            }
                            av_frame_unref(frame); // drop the frame
                            continue;
                        }
                    }
                }
            }
            // enqueue the frame into the decoded-picture queue; it is consumed in video_refresh
            ret = ffp_queue_picture(ffp, frame, pts, duration, av_frame_get_pkt_pos(frame), is->viddec.pkt_serial);
            if (ret) { // enqueue failed: release the buffer with render == false so MediaCodec discards the frame
                if (frame->opaque)
                    SDL_VoutAndroid_releaseBufferProxyP(opaque->weak_vout, (SDL_AMediaCodecBufferProxy **)&frame->opaque, false);
            }
            av_frame_unref(frame);
        }
    }

fail:
    av_frame_free(&frame);
    opaque->abort = true;
    SDL_WaitThread(opaque->enqueue_thread, NULL);
    SDL_AMediaCodecFake_abort(opaque->acodec);
    if (opaque->n_buf_out) {
        free(opaque->amc_buf_out);
        opaque->n_buf_out = 0;
        opaque->amc_buf_out = NULL;
        opaque->off_buf_out = 0;
        opaque->last_queued_pts = AV_NOPTS_VALUE;
    }
    if (opaque->acodec) {
        SDL_VoutAndroid_invalidateAllBuffers(opaque->weak_vout);
        SDL_LockMutex(opaque->acodec_mutex);
        SDL_UnlockMutex(opaque->acodec_mutex);
    }
    SDL_AMediaCodec_stop(opaque->acodec);
    SDL_AMediaCodec_decreaseReferenceP(&opaque->acodec);
    ALOGI("MediaCodec: %s: exit: %d", __func__, ret);
    return ret;
#if 0   // if hardware decoding fails, fall back to software decoding
fallback_to_ffplay:
    ALOGW("fallback to ffplay decoder\n");
    return ffp_video_thread(opaque->ffp);
#endif
}
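
Note what "dropping" means on the hardware path: the decoded buffer is handed back to MediaCodec without ever being rendered. ijk routes this through SDL_VoutAndroid_releaseBufferProxyP(..., false); at the raw NDK layer the equivalent is AMediaCodec_releaseOutputBuffer with render = false (a standard NDK call, shown here only to illustrate the mechanism):

#include <media/NdkMediaCodec.h>

// Releasing an output buffer with render == false returns it to the codec
// without presenting it on the Surface; in other words, the frame is dropped.
static void drop_hw_frame(AMediaCodec *codec, size_t buf_index)
{
    AMediaCodec_releaseOutputBuffer(codec, buf_index, false /* render */);
}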

2.2 Software-Decode Frame Dropping

static int get_video_frame(FFPlayer *ffp, AVFrame *frame)
{
    VideoState *is = ffp->is;
    int got_picture;
    // video stream buffering statistics
    ffp_video_statistic_l(ffp);
    // software-decode timing test (left commented out)
    //int64_t starttime  = av_gettime_relative();
    // decode and obtain the decoded AVFrame
    if ((got_picture = decoder_decode_frame(ffp, &is->viddec, frame, NULL)) < 0)
        return -1;
    /*
    if (frame->key_frame) { // keyframe software-decode timing test
        int64_t endtime  = av_gettime_relative();
        int usetime = endtime - starttime;
        ALOGE("zmlruan>>>>>>usetime:%d", usetime);
    }*/
    if (got_picture) { // decode succeeded, we have a frame
        double dpts = NAN;

        if (frame->pts != AV_NOPTS_VALUE)
            dpts = av_q2d(is->video_st->time_base) * frame->pts; // convert the video pts to seconds, i.e. the current position
        // sample aspect ratio of the video
        frame->sample_aspect_ratio = av_guess_sample_aspect_ratio(is->ic, is->video_st, frame);
        // framedrop forced on (> 0), or framedrop enabled and the master clock is not the video clock
        if (ffp->framedrop>0 || (ffp->framedrop && get_master_sync_type(is) != AV_SYNC_VIDEO_MASTER)) {
            ffp->stat.decode_frame_count++; // decoded-frame counter
            if (frame->pts != AV_NOPTS_VALUE) { // diff = video pts minus the master clock (here: the audio clock)
                double diff = dpts - get_master_clock(is); // AV_NOSYNC_THRESHOLD: sync threshold; if the gap is too large, no correction is attempted
                if (!isnan(diff) && fabs(diff) < AV_NOSYNC_THRESHOLD &&
                    diff - is->frame_last_filter_delay < 0 &&
                    is->viddec.pkt_serial == is->vidclk.serial &&
                    is->videoq.nb_packets) {
                    is->frame_drops_early++;
                    is->continuous_frame_drops_early++;
                    if (is->continuous_frame_drops_early > ffp->framedrop) {
                        is->continuous_frame_drops_early = 0;
                    } else {
                        ffp->stat.drop_frame_count++; // dropped-frame counter
                        // drop rate = dropped frames / decoded frames
                        ffp->stat.drop_frame_rate = (float)(ffp->stat.drop_frame_count) / (float)(ffp->stat.decode_frame_count);
                        av_frame_unref(frame); // drop the frame
                        got_picture = 0; // report that no frame was obtained: it was dropped
                    }
                }
            }
        }
    }

    return got_picture;
}

With that, we have seen the full frame-dropping logic in ijk.
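
It also explains problem 1 at the top: why framedrop = 5 looks visibly worse than framedrop = 1. While the video stays behind the master clock, up to framedrop frames are dropped in a row before one is let through. Here is a self-contained simulation of just the continuous_frame_drops_early bookkeeping, assuming the drop condition fires on every decoded frame:

#include <stdio.h>

// Standalone simulation of ijk's consecutive-drop limiter, assuming the
// video never catches up with the master clock.
static void simulate(int framedrop, int frames)
{
    int consecutive = 0, dropped = 0;
    for (int i = 0; i < frames; i++) {
        if (++consecutive > framedrop)
            consecutive = 0;  // limiter kicks in: this frame is displayed
        else
            dropped++;        // this frame is dropped
    }
    printf("framedrop=%d: %d of %d frames dropped\n", framedrop, dropped, frames);
}

int main(void)
{
    simulate(5, 30);  // 25 of 30 dropped: only one frame in six reaches the screen
    simulate(1, 30);  // 15 of 30 dropped: every other frame is still displayed
    return 0;
}

With framedrop = 5 only one frame in six is displayed while the video lags, which the eye easily reads as stutter; with framedrop = 1 every other frame still reaches the screen, which matches the smoother playback observed after the change.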
