
android ijkplayer C-layer analysis: initialization (continued 1, connecting the Java layer and the C layer)


Abstract: The previous article did not actually finish analyzing the initialization process, so this one picks up where it left off. At the end of the Java layer's initPlayer there is still a call to native_setup, which lands in the C layer's IjkMediaPlayer_native_setup. The IjkMediaPlayer struct shown below holds the FFPlayer and the SDL-side objects, plus msg_loop; the remaining fields are player state and seek bookkeeping. The whole flow is asynchronous and does not block. Everything FFmpeg-related is created and filled in at the C layer.

The previous article did not actually finish analyzing the initialization process, so let's pick up where we left off. At the end of the Java layer's initPlayer function there is still the call to native_setup, which goes to the C layer's IjkMediaPlayer_native_setup. Let's see what it does:

static void IjkMediaPlayer_native_setup(JNIEnv *env, jobject thiz, jobject weak_this)
{
    MPTRACE("%s
", __func__);
    IjkMediaPlayer *mp = ijkmp_android_create(message_loop);
    JNI_CHECK_GOTO(mp, env, "java/lang/OutOfMemoryError", "mpjni: native_setup: ijkmp_create() failed", LABEL_RETURN);

    jni_set_media_player(env, thiz, mp);
    ijkmp_set_weak_thiz(mp, (*env)->NewGlobalRef(env, weak_this));
    ijkmp_set_inject_opaque(mp, ijkmp_get_weak_thiz(mp));
    ijkmp_android_set_mediacodec_select_callback(mp, mediacodec_select_callback, ijkmp_get_weak_thiz(mp));

LABEL_RETURN:
    ijkmp_dec_ref_p(&mp);
}
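A quick note on the weak-reference bookkeeping above: native_setup turns the Java-side WeakReference into a JNI global ref and hands it to the player, and ijkmp_set_weak_thiz / ijkmp_get_weak_thiz appear to be nothing more than accessors on mp->weak_thiz. A sketch (my reading of ijkplayer.c, so treat it as such):

void *ijkmp_set_weak_thiz(IjkMediaPlayer *mp, void *weak_thiz)
{
    void *prev_weak_thiz = mp->weak_thiz;   // hand back the old value so the caller can release it
    mp->weak_thiz = weak_thiz;
    return prev_weak_thiz;
}

void *ijkmp_get_weak_thiz(IjkMediaPlayer *mp)
{
    return mp->weak_thiz;
}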

First an IjkMediaPlayer is created, with message_loop passed in as the message callback. Let's follow ijkmp_android_create:

IjkMediaPlayer *ijkmp_android_create(int(*msg_loop)(void*))
{
    IjkMediaPlayer *mp = ijkmp_create(msg_loop);
    if (!mp)
        goto fail;

    mp->ffplayer->vout = SDL_VoutAndroid_CreateForAndroidSurface();
    if (!mp->ffplayer->vout)
        goto fail;

    mp->ffplayer->pipeline = ffpipeline_create_from_android(mp->ffplayer);
    if (!mp->ffplayer->pipeline)
        goto fail;

    ffpipeline_set_vout(mp->ffplayer->pipeline, mp->ffplayer->vout);

    return mp;

fail:
    ijkmp_dec_ref_p(&mp);
    return NULL;
}
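The two Android-specific pieces created here are the video output (an SDL_Vout that renders to an Android Surface) and the decode pipeline. Based on my reading of ffpipeline_android.c (so take the exact fields as an assumption), the pipeline factory mostly fills in a table of callbacks that later choose between MediaCodec and the ffplay software decoder, and between AudioTrack and OpenSL ES for audio output. A simplified sketch:

// Sketch only, simplified from ffpipeline_android.c; error handling and the
// rest of the opaque-state setup (surface mutex, volume, etc.) are left out.
IJKFF_Pipeline *ffpipeline_create_from_android(FFPlayer *ffp)
{
    IJKFF_Pipeline *pipeline = ffpipeline_alloc(&g_pipeline_class, sizeof(IJKFF_Pipeline_Opaque));
    if (!pipeline)
        return NULL;

    pipeline->opaque->ffp             = ffp;
    pipeline->func_destroy            = func_destroy;            // tear down the opaque state
    pipeline->func_open_video_decoder = func_open_video_decoder; // MediaCodec or ffplay decoder
    pipeline->func_open_audio_output  = func_open_audio_output;  // AudioTrack or OpenSL ES
    return pipeline;
}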

OK, and further down into ijkmp_create:

IjkMediaPlayer *ijkmp_create(int (*msg_loop)(void*))
{
    IjkMediaPlayer *mp = (IjkMediaPlayer *) mallocz(sizeof(IjkMediaPlayer));
    if (!mp)
        goto fail;

    mp->ffplayer = ffp_create();
    if (!mp->ffplayer)
        goto fail;

    mp->msg_loop = msg_loop;

    ijkmp_inc_ref(mp);
    pthread_mutex_init(&mp->mutex, NULL);

    return mp;

fail:
    ijkmp_destroy_p(&mp);
    return NULL;
}
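One small detail before moving on: mallocz is ijkplayer's zero-initializing allocator, so the struct starts out with every field cleared. Presumably (a sketch; the real helper lives in the ijkplayer utility code) it is just:

void *mallocz(size_t size)
{
    void *mem = malloc(size);
    if (!mem)
        return mem;

    memset(mem, 0, size);   // every field of IjkMediaPlayer starts at 0/NULL
    return mem;
}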

It starts by allocating zeroed space for an IjkMediaPlayer and then fills in its members, such as ffplayer and msg_loop. The struct looks like this:

struct IjkMediaPlayer {
    volatile int ref_count;
    pthread_mutex_t mutex;
    FFPlayer *ffplayer;

    int (*msg_loop)(void*);
    SDL_Thread *msg_thread;
    SDL_Thread _msg_thread;

    int mp_state;
    char *data_source;
    void *weak_thiz;

    int restart;
    int restart_from_beginning;
    int seek_req;
    long seek_msec;
};

So: FFmpeg's player state (ffplayer), the SDL-side thread handles, and msg_loop; the remaining fields are player state and seek bookkeeping.
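The ref_count field at the top is the player's lifetime counter: ijkmp_create takes the initial reference with ijkmp_inc_ref, and the ijkmp_dec_ref_p calls seen earlier drop references. A sketch of how these presumably work (assuming the usual atomic-refcount pattern in ijkplayer.c):

void ijkmp_inc_ref(IjkMediaPlayer *mp)
{
    assert(mp);
    __sync_fetch_and_add(&mp->ref_count, 1);          // atomic ++
}

void ijkmp_dec_ref(IjkMediaPlayer *mp)
{
    if (!mp)
        return;

    int ref_count = __sync_sub_and_fetch(&mp->ref_count, 1);  // atomic --
    if (ref_count == 0) {
        ijkmp_shutdown(mp);                           // stop threads before freeing
        ijkmp_destroy_p(&mp);
    }
}

void ijkmp_dec_ref_p(IjkMediaPlayer **pmp)
{
    if (!pmp)
        return;

    ijkmp_dec_ref(*pmp);
    *pmp = NULL;                                      // callers never keep a dangling pointer
}

This presumably also explains the ijkmp_dec_ref_p(&mp) at the end of native_setup: jni_set_media_player has already taken its own reference on the player, so that final call only drops the local creation reference rather than destroying the player.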
Back to ijkmp_create: the msg_loop function pointer we passed in is stored as mp->msg_loop, and ijkmp_inc_ref then bumps mp's reference count by one. Now let's see what this loop actually is; back in the original IjkMediaPlayer_native_setup, the message_loop that was passed in looks like this:

static int message_loop(void *arg)
{
    MPTRACE("%s
", __func__);

    JNIEnv *env = NULL;
    (*g_jvm)->AttachCurrentThread(g_jvm, &env, NULL );

    IjkMediaPlayer *mp = (IjkMediaPlayer*) arg;
    JNI_CHECK_GOTO(mp, env, NULL, "mpjni: native_message_loop: null mp", LABEL_RETURN);

    message_loop_n(env, mp);

LABEL_RETURN:
    ijkmp_dec_ref_p(&mp);
    (*g_jvm)->DetachCurrentThread(g_jvm);

    MPTRACE("message_loop exit");
    return 0;
}

AttachCurrentThread is there to obtain a JNIEnv for this thread; the key part is message_loop_n:

static void message_loop_n(JNIEnv *env, IjkMediaPlayer *mp)
{
    jobject weak_thiz = (jobject) ijkmp_get_weak_thiz(mp);
    JNI_CHECK_GOTO(weak_thiz, env, NULL, "mpjni: message_loop_n: null weak_thiz", LABEL_RETURN);

    while (1) {
        AVMessage msg;

        int retval = ijkmp_get_msg(mp, &msg, 1);
        if (retval < 0)
            break;

        // block-get should never return 0
        assert(retval > 0);

        switch (msg.what) {
        case FFP_MSG_FLUSH:
            MPTRACE("FFP_MSG_FLUSH:
");
            post_event(env, weak_thiz, MEDIA_NOP, 0, 0);
            break;
        case FFP_MSG_ERROR:
            MPTRACE("FFP_MSG_ERROR: %d
", msg.arg1);
            post_event(env, weak_thiz, MEDIA_ERROR, MEDIA_ERROR_IJK_PLAYER, msg.arg1);
            break;
        case FFP_MSG_PREPARED:
            MPTRACE("FFP_MSG_PREPARED:
");
            post_event(env, weak_thiz, MEDIA_PREPARED, 0, 0);
            break;
        case FFP_MSG_COMPLETED:
            MPTRACE("FFP_MSG_COMPLETED:
");
            post_event(env, weak_thiz, MEDIA_PLAYBACK_COMPLETE, 0, 0);
            break;
        case FFP_MSG_VIDEO_SIZE_CHANGED:
            MPTRACE("FFP_MSG_VIDEO_SIZE_CHANGED: %d, %d
", msg.arg1, msg.arg2);
            post_event(env, weak_thiz, MEDIA_SET_VIDEO_SIZE, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_SAR_CHANGED:
            MPTRACE("FFP_MSG_SAR_CHANGED: %d, %d
", msg.arg1, msg.arg2);
            post_event(env, weak_thiz, MEDIA_SET_VIDEO_SAR, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_VIDEO_RENDERING_START:
            MPTRACE("FFP_MSG_VIDEO_RENDERING_START:
");
            post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_VIDEO_RENDERING_START, 0);
            break;
        case FFP_MSG_AUDIO_RENDERING_START:
            MPTRACE("FFP_MSG_AUDIO_RENDERING_START:
");
            post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_AUDIO_RENDERING_START, 0);
            break;
        case FFP_MSG_VIDEO_ROTATION_CHANGED:
            MPTRACE("FFP_MSG_VIDEO_ROTATION_CHANGED: %d
", msg.arg1);
            post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_VIDEO_ROTATION_CHANGED, msg.arg1);
            break;
        case FFP_MSG_BUFFERING_START:
            MPTRACE("FFP_MSG_BUFFERING_START:
");
            post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_BUFFERING_START, 0);
            break;
        case FFP_MSG_BUFFERING_END:
            MPTRACE("FFP_MSG_BUFFERING_END:
");
            post_event(env, weak_thiz, MEDIA_INFO, MEDIA_INFO_BUFFERING_END, 0);
            break;
        case FFP_MSG_BUFFERING_UPDATE:
            // MPTRACE("FFP_MSG_BUFFERING_UPDATE: %d, %d", msg.arg1, msg.arg2);
            post_event(env, weak_thiz, MEDIA_BUFFERING_UPDATE, msg.arg1, msg.arg2);
            break;
        case FFP_MSG_BUFFERING_BYTES_UPDATE:
            break;
        case FFP_MSG_BUFFERING_TIME_UPDATE:
            break;
        case FFP_MSG_SEEK_COMPLETE:
            MPTRACE("FFP_MSG_SEEK_COMPLETE:
");
            post_event(env, weak_thiz, MEDIA_SEEK_COMPLETE, 0, 0);
            break;
        case FFP_MSG_PLAYBACK_STATE_CHANGED:
            break;
        case FFP_MSG_TIMED_TEXT:
            if (msg.obj) {
                jstring text = (*env)->NewStringUTF(env, (char *)msg.obj);
                post_event2(env, weak_thiz, MEDIA_TIMED_TEXT, 0, 0, text);
                J4A_DeleteLocalRef__p(env, &text);
            }
            else {
                post_event2(env, weak_thiz, MEDIA_TIMED_TEXT, 0, 0, NULL);
            }
            break;
        default:
            ALOGE("unknown FFP_MSG_xxx(%d)
", msg.what);
            break;
        }
        msg_free_res(&msg);
    }

LABEL_RETURN:
    ;
}
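Before following the dispatch side, it is worth asking where these AVMessages come from. They are pushed from the FFmpeg/ffplay side of the player; throughout ff_ffplay.c events are reported through small notify helpers that, as far as I can tell (a sketch based on ff_ffplay_def.h, so treat the exact names as an assumption), simply put a message on ffp->msg_queue:

// Producer side of the message queue (sketch).
inline static void ffp_notify_msg1(FFPlayer *ffp, int what) {
    msg_queue_put_simple3(&ffp->msg_queue, what, 0, 0);
}

inline static void ffp_notify_msg2(FFPlayer *ffp, int what, int arg1) {
    msg_queue_put_simple3(&ffp->msg_queue, what, arg1, 0);
}

inline static void ffp_notify_msg3(FFPlayer *ffp, int what, int arg1, int arg2) {
    msg_queue_put_simple3(&ffp->msg_queue, what, arg1, arg2);
}

// For example, once the video size is known the player would do something like
//     ffp_notify_msg3(ffp, FFP_MSG_VIDEO_SIZE_CHANGED, width, height);
// which is exactly what message_loop_n above pulls off the queue and forwards to Java.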

This is clearly an event-processing loop. The key call is post_event, which contains a single line:

    J4AC_IjkMediaPlayer__postEventFromNative(env, weak_this, what, arg1, arg2, NULL);

Tracing that down leads to J4AC_tv_danmaku_ijk_media_player_IjkMediaPlayer__postEventFromNative, whose body is:

    (*env)->CallStaticVoidMethod(env, class_J4AC_tv_danmaku_ijk_media_player_IjkMediaPlayer.id, class_J4AC_tv_danmaku_ijk_media_player_IjkMediaPlayer.method_postEventFromNative, weakThiz, what, arg1, arg2, obj);

in other words a call up into a Java-layer method. Looking one step further, J4A_loadClass__J4AC_tv_danmaku_ijk_media_player_IjkMediaPlayer contains:

    class_id = class_J4AC_tv_danmaku_ijk_media_player_IjkMediaPlayer.id;
    name     = "postEventFromNative";
    sign     = "(Ljava/lang/Object;IIILjava/lang/Object;)V";
    class_J4AC_tv_danmaku_ijk_media_player_IjkMediaPlayer.method_postEventFromNative = J4A_GetStaticMethodID__catchAll(env, class_id, name, sign);
    if (class_J4AC_tv_danmaku_ijk_media_player_IjkMediaPlayer.method_postEventFromNative == NULL)
        goto fail;

It goes, of course, to the Java-layer method postEventFromNative:

    @CalledByNative
    private static void postEventFromNative(Object weakThiz, int what,
            int arg1, int arg2, Object obj) {
        if (weakThiz == null)
            return;

        @SuppressWarnings("rawtypes")
        IjkMediaPlayer mp = (IjkMediaPlayer) ((WeakReference) weakThiz).get();
        if (mp == null) {
            return;
        }

        if (what == MEDIA_INFO && arg1 == MEDIA_INFO_STARTED_AS_NEXT) {
            // this acquires the wakelock if needed, and sets the client side
            // state
            mp.start();
        }
        if (mp.mEventHandler != null) {
            Message m = mp.mEventHandler.obtainMessage(what, arg1, arg2, obj);
            mp.mEventHandler.sendMessage(m);
        }
    }

Here the weak reference is unwrapped to get the IjkMediaPlayer back, and then mp.mEventHandler.sendMessage(m) is called. Where did this IjkMediaPlayer come from? From IjkMediaPlayer_native_setup: it is the object passed in through the Java layer's native_setup, which was called from the original initPlayer. And what is mp.mEventHandler inside postEventFromNative? It is the event handler created in initPlayer, bound to a Looper. Now the whole chain connects: the Java layer creates the IjkMediaPlayer and sets up its event handler; when the C layer hits certain events (opening a live stream, for example), it calls back into the Java layer to post a Message onto that handler's Looper, and the Java side picks it up and handles it. The whole flow is asynchronous and never blocks. Everything FFmpeg-related, on the other hand, is created and filled in at the C layer.
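One last point: at this stage message_loop has only been stored as mp->msg_loop; nothing is running it yet. Based on the upstream ijkplayer.c sources (so treat this as a sketch rather than part of this article's trace), the message thread is spawned later, around prepareAsync, roughly like this:

// Sketch: inside ijkmp_prepare_async_l(), the message thread is started and
// given its own reference on the player (released when the loop exits).
ijkmp_inc_ref(mp);
mp->msg_thread = SDL_CreateThreadEx(&mp->_msg_thread, ijkmp_msg_loop, mp, "ff_msg_loop");

// ijkmp_msg_loop() is a tiny trampoline that calls the stored callback:
static int ijkmp_msg_loop(void *arg)
{
    IjkMediaPlayer *mp = arg;
    int ret = mp->msg_loop(arg);   // i.e. the message_loop we looked at above
    return ret;
}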

文章版權(quán)歸作者所有,未經(jīng)允許請勿轉(zhuǎn)載,若此文章存在違規(guī)行為,您可以聯(lián)系管理員刪除。

轉(zhuǎn)載請注明本文地址:http://specialneedsforspecialkids.com/yun/66734.html

相關(guān)文章

  • android ijkplayer c分析-prepare過程與讀取線程(續(xù)2-讀取輸入源)

    摘要:下面是,讀取頭信息頭信息。猜測網(wǎng)絡(luò)部分至少在一開始就應當初始化好的,因此在的過程里面找,在中找到了。就先暫時分析到此吧。 這章要簡單分析下ijkplayer是如何從文件或網(wǎng)絡(luò)讀取數(shù)據(jù)源的。還是read_thread函數(shù)中的關(guān)鍵點avformat_open_input函數(shù): int avformat_open_input(AVFormatContext **ps, const char ...

    kevin 評論0 收藏0
  • android ijkplayer c分析-prepare過程與讀取線程(續(xù)1-解碼粗略分析)

    摘要:分別為音頻視頻和字母進行相關(guān)處理。向下跟蹤兩層,會發(fā)現(xiàn),核心函數(shù)是。至此解碼算完了。整個過程真是粗略分析啊,對自己也很抱歉,暫時先這樣吧。 上文中說到在read_thread線程中有個關(guān)鍵函數(shù):avformat_open_input(utils.c),應當是讀取視頻文件的,這個函數(shù)屬于ffmpeg層。這回進入到其中去看下: int avformat_open_input(AVForma...

    zhonghanwen 評論0 收藏0
  • android ijkplayer c分析-prepare過程與讀取線程

    摘要:我們下面先從讀取線程入手。無論這個循環(huán)前后干了什么,都是要走這一步,讀取數(shù)據(jù)幀。從開始,我理解的是計算出當前數(shù)據(jù)幀的時間戳后再計算出播放的起始時間到當前時間,然后看這個時間戳是否在此范圍內(nèi)。 ijkplayer現(xiàn)在比較流行,因為工作關(guān)系,接觸了他,現(xiàn)在做個簡單的分析記錄吧。我這里直接跳過java層代碼,進入c層,因為大多數(shù)的工作都是通過jni調(diào)用到c層來完成的,java層的內(nèi)容并不是主...

    MobService 評論0 收藏0
  • android ijkplayer c分析-prepare過程與讀取線程(續(xù)3-解碼核心video

    摘要:基本上就是對一個數(shù)據(jù)幀的描述。我理解的是一個未解碼的壓縮數(shù)據(jù)幀。 read_thread這個最關(guān)鍵的讀取線程中,逐步跟蹤,可以明確stream_component_open---> decoder_start---> video_thread--->ffplay_video_thread。這個調(diào)用過程,在解碼開始后的異步解碼線程中,調(diào)用的是ffplay_video_thread。具體可...

    _Suqin 評論0 收藏0

發(fā)表評論

0條評論

Olivia

|高級講師

TA的文章

閱讀更多
最新活動
閱讀需要支付1元查看
<