Abstract: ffmpeg handles protocols and codecs, sdl handles rendering and display, and ijkplayer manages the player. The bulk of the work is ffmpeg initialization (everything starting with av), plus ffmpeg's network initialization. With that, the operations backing each basic protocol are located. In summary, it is all initialization: protocols, decoders, networking, and callbacks to the upper layer.
Originally I didn't plan to write about the initialization process, since plenty of articles online already analyze it. But the earlier analysis exposed gaps in my own understanding of a few things, so it's worth doing once.
First, the Java layer:
private void initPlayer(IjkLibLoader libLoader) {
    loadLibrariesOnce(libLoader);
    initNativeOnce();

    Looper looper;
    if ((looper = Looper.myLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else if ((looper = Looper.getMainLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else {
        mEventHandler = null;
    }

    /*
     * Native setup requires a weak reference to our object. It's easier to
     * create it here than in C++.
     */
    native_setup(new WeakReference<IjkMediaPlayer>(this));
}
It really does just two things: loadLibrariesOnce and initNativeOnce. I won't paste the former; it simply calls loadLibrary for three .so libraries: ijkffmpeg, ijksdl and ijkplayer. ffmpeg handles protocols and codecs, sdl handles rendering and display, and ijkplayer manages the player itself. Each loadLibrary call triggers that library's JNI_OnLoad function, which means the earliest initialization of all three libraries happens inside JNI_OnLoad; we'll come back to that. The latter, initNativeOnce, actually calls native_init, which maps to the JNI function IjkMediaPlayer_native_init in ijkplayer_jni.c:
static void IjkMediaPlayer_native_init(JNIEnv *env)
{
    MPTRACE("%s\n", __func__);
}
It does nothing, right?
Now back to JNI_OnLoad to see what it does:
JNIEXPORT jint JNI_OnLoad(JavaVM *vm, void *reserved)
{
    JNIEnv* env = NULL;

    g_jvm = vm;
    if ((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK) {
        return -1;
    }
    assert(env != NULL);

    pthread_mutex_init(&g_clazz.mutex, NULL);

    // FindClass returns LocalReference
    IJK_FIND_JAVA_CLASS(env, g_clazz.clazz, JNI_CLASS_IJKPLAYER);
    (*env)->RegisterNatives(env, g_clazz.clazz, g_methods, NELEM(g_methods));

    ijkmp_global_init();
    ijkmp_global_set_inject_callback(inject_callback);

    FFmpegApi_global_init(env);

    return JNI_VERSION_1_4;
}
The first part is standard boilerplate; the main job is registering the native method table so the Java layer can call into the C layer. Then comes ijkmp_global_init, which eventually reaches ffp_global_init:
void ffp_global_init()
{
    if (g_ffmpeg_global_inited)
        return;

    /* register all codecs, demux and protocols */
    avcodec_register_all();
#if CONFIG_AVDEVICE
    avdevice_register_all();
#endif
#if CONFIG_AVFILTER
    avfilter_register_all();
#endif
    av_register_all();

    ijkav_register_all();

    avformat_network_init();

    av_lockmgr_register(lockmgr);
    av_log_set_callback(ffp_log_callback_brief);

    av_init_packet(&flush_pkt);
    flush_pkt.data = (uint8_t *)&flush_pkt;

    g_ffmpeg_global_inited = true;
}
This is mostly ffmpeg initialization: anything starting with av should be ffmpeg's. Codecs are registered first, then protocols. Let's look at ijkav_register_all:
void ijkav_register_all(void)
{
    static int initialized;
    if (initialized)
        return;
    initialized = 1;

    av_register_all();

    /* protocols */
    av_log(NULL, AV_LOG_INFO, "===== custom modules begin =====\n");
#ifdef __ANDROID__
    IJK_REGISTER_PROTOCOL(ijkmediadatasource);
#endif
    IJK_REGISTER_PROTOCOL(async);
    IJK_REGISTER_PROTOCOL(ijklongurl);
    IJK_REGISTER_PROTOCOL(ijktcphook);
    IJK_REGISTER_PROTOCOL(ijkhttphook);
    IJK_REGISTER_PROTOCOL(ijksegment);

    /* demuxers */
    IJK_REGISTER_DEMUXER(ijklivehook);
    av_log(NULL, AV_LOG_INFO, "===== custom modules end =====\n");
}
These are essentially all protocol registrations in support of network transport. Next comes avformat_network_init, ffmpeg's network initialization, which eventually reaches ff_network_init; on Windows builds that amounts to a WSAStartup call.
Back to all those protocol registrations. First look at the macro:
#define IJK_REGISTER_PROTOCOL(x)                                                    \
    {                                                                               \
        extern URLProtocol ijkimp_ff_##x##_protocol;                                \
        int ijkav_register_##x##_protocol(URLProtocol *protocol, int protocol_size);\
        ijkav_register_##x##_protocol(&ijkimp_ff_##x##_protocol, sizeof(URLProtocol)); \
    }
The URLProtocol structure is the key here. So what fills it in? Following the macro, take the identifier after extern and search for it: it is defined in quite a few files, for example in ijkurlhook.c:
URLProtocol ijkimp_ff_ijktcphook_protocol = {
    .name                = "ijktcphook",
    .url_open2           = ijktcphook_open,
    .url_read            = ijkurlhook_read,
    .url_write           = ijkurlhook_write,
    .url_close           = ijkurlhook_close,
    .priv_data_size      = sizeof(Context),
    .priv_data_class     = &ijktcphook_context_class,
};
This is where the open, read, write and close functions are specified. With that, we've located the operations that correspond to each underlying protocol. Now let's look at the different handling for live streams:
#define IJK_REGISTER_DEMUXER(x)                                 \
    {                                                           \
        extern AVInputFormat ijkff_##x##_demuxer;               \
        ijkav_register_input_format(&ijkff_##x##_demuxer);      \
    }
This lands in ijklivehook.c:
AVInputFormat ijkff_ijklivehook_demuxer = {
    .name           = "ijklivehook",
    .long_name      = "Live Hook Controller",
    .flags          = AVFMT_NOFILE | AVFMT_TS_DISCONT,
    .priv_data_size = sizeof(Context),
    .read_probe     = ijklivehook_probe,
    .read_header2   = ijklivehook_read_header,
    .read_packet    = ijklivehook_read_packet,
    .read_close     = ijklivehook_read_close,
    .priv_class     = &ijklivehook_class,
};
Reading further, take ijklivehook_read_header as an example: inside it inspects the url and distinguishes rtmp from rtsp. Now the picture is clear.
To summarize briefly: the URLProtocol structure standardizes all protocols; both the name and the operation functions are defined there.
Back in ffp_global_init, the next step is av_init_packet. Let me digress into a data structure here: AVPacket, the struct that stores compressed, encoded data and its related metadata.
typedef struct AVPacket {
    /**
     * A reference to the reference-counted buffer where the packet data is
     * stored.
     * May be NULL, then the packet data is not reference-counted.
     */
    AVBufferRef *buf;
    /**
     * Presentation timestamp in AVStream->time_base units; the time at which
     * the decompressed packet will be presented to the user.
     * Can be AV_NOPTS_VALUE if it is not stored in the file.
     * pts MUST be larger or equal to dts as presentation cannot happen before
     * decompression, unless one wants to view hex dumps. Some formats misuse
     * the terms dts and pts/cts to mean something different. Such timestamps
     * must be converted to true pts/dts before they are stored in AVPacket.
     */
    int64_t pts;
    /**
     * Decompression timestamp in AVStream->time_base units; the time at which
     * the packet is decompressed.
     * Can be AV_NOPTS_VALUE if it is not stored in the file.
     */
    int64_t dts;
    uint8_t *data;
    int size;
    int stream_index;
    /**
     * A combination of AV_PKT_FLAG values
     */
    int flags;
    /**
     * Additional packet data that can be provided by the container.
     * Packet can contain several types of side information.
     */
    AVPacketSideData *side_data;
    int side_data_elems;
    /**
     * Duration of this packet in AVStream->time_base units, 0 if unknown.
     * Equals next_pts - this_pts in presentation order.
     */
    int64_t duration;

    int64_t pos;    ///< byte position in stream, -1 if unknown

#if FF_API_CONVERGENCE_DURATION
    /**
     * @deprecated Same as the duration field, but as int64_t. This was required
     * for Matroska subtitles, whose duration values could overflow when the
     * duration field was still an int.
     */
    attribute_deprecated
    int64_t convergence_duration;
#endif
} AVPacket;
Notice the fields pts, dts and data: the presentation timestamp, the decoding timestamp, and the payload. av_init_packet just fills in defaults, so I won't paste it. Back in JNI_OnLoad, the next call is ijkmp_global_set_inject_callback.
It installs a callback, so let's look at the callback's contract:
static int inject_callback(void *opaque, int what, void *data, size_t data_size)
{
    JNIEnv  *env     = NULL;
    jobject  jbundle = NULL;
    int      ret     = -1;
    SDL_JNI_SetupThreadEnv(&env);

    jobject weak_thiz = (jobject) opaque;
    if (weak_thiz == NULL)
        goto fail;

    switch (what) {
        case AVAPP_CTRL_WILL_HTTP_OPEN:
        case AVAPP_CTRL_WILL_LIVE_OPEN:
        case AVAPP_CTRL_WILL_CONCAT_SEGMENT_OPEN: {
            AVAppIOControl *real_data = (AVAppIOControl *)data;
            real_data->is_handled = 0;

            jbundle = J4AC_Bundle__Bundle__catchAll(env);
            if (!jbundle) {
                ALOGE("%s: J4AC_Bundle__Bundle__catchAll failed for case %d\n", __func__, what);
                goto fail;
            }
            J4AC_Bundle__putString__withCString__catchAll(env, jbundle, "url", real_data->url);
            J4AC_Bundle__putInt__withCString__catchAll(env, jbundle, "segment_index", real_data->segment_index);
            J4AC_Bundle__putInt__withCString__catchAll(env, jbundle, "retry_counter", real_data->retry_counter);
            real_data->is_handled = J4AC_IjkMediaPlayer__onNativeInvoke(env, weak_thiz, what, jbundle);
            if (J4A_ExceptionCheck__catchAll(env))
                goto fail;

            J4AC_Bundle__getString__withCString__asCBuffer(env, jbundle, "url", real_data->url, sizeof(real_data->url));
            if (J4A_ExceptionCheck__catchAll(env))
                goto fail;
            ret = 0;
            break;
        }
        case AVAPP_EVENT_WILL_HTTP_OPEN:
        case AVAPP_EVENT_DID_HTTP_OPEN:
        case AVAPP_EVENT_WILL_HTTP_SEEK:
        case AVAPP_EVENT_DID_HTTP_SEEK: {
            AVAppHttpEvent *real_data = (AVAppHttpEvent *) data;

            jbundle = J4AC_Bundle__Bundle__catchAll(env);
            if (!jbundle) {
                ALOGE("%s: J4AC_Bundle__Bundle__catchAll failed for case %d\n", __func__, what);
                goto fail;
            }
            J4AC_Bundle__putString__withCString__catchAll(env, jbundle, "url", real_data->url);
            J4AC_Bundle__putLong__withCString__catchAll(env, jbundle, "offset", real_data->offset);
            J4AC_Bundle__putInt__withCString__catchAll(env, jbundle, "error", real_data->error);
            J4AC_Bundle__putInt__withCString__catchAll(env, jbundle, "http_code", real_data->http_code);
            J4AC_IjkMediaPlayer__onNativeInvoke(env, weak_thiz, what, jbundle);
            if (J4A_ExceptionCheck__catchAll(env))
                goto fail;
            ret = 0;
            break;
        }
        case AVAPP_CTRL_DID_TCP_OPEN:
        case AVAPP_CTRL_WILL_TCP_OPEN: {
            AVAppTcpIOControl *real_data = (AVAppTcpIOControl *)data;

            jbundle = J4AC_Bundle__Bundle__catchAll(env);
            if (!jbundle) {
                ALOGE("%s: J4AC_Bundle__Bundle__catchAll failed for case %d\n", __func__, what);
                goto fail;
            }
            J4AC_Bundle__putInt__withCString__catchAll(env, jbundle, "error", real_data->error);
            J4AC_Bundle__putInt__withCString__catchAll(env, jbundle, "family", real_data->family);
            J4AC_Bundle__putString__withCString__catchAll(env, jbundle, "ip", real_data->ip);
            J4AC_Bundle__putInt__withCString__catchAll(env, jbundle, "port", real_data->port);
            J4AC_Bundle__putInt__withCString__catchAll(env, jbundle, "fd", real_data->fd);
            J4AC_IjkMediaPlayer__onNativeInvoke(env, weak_thiz, what, jbundle);
            if (J4A_ExceptionCheck__catchAll(env))
                goto fail;
            ret = 0;
            break;
        }
        default: {
            ret = 0;
        }
    }
fail:
    SDL_JNI_DeleteLocalRefP(env, &jbundle);
    return ret;
}
Pick one function to trace: J4AC_IjkMediaPlayer__onNativeInvoke. Its counterpart is defined in the Java layer:
private OnNativeInvokeListener mOnNativeInvokeListener;

public void setOnNativeInvokeListener(OnNativeInvokeListener listener) {
    mOnNativeInvokeListener = listener;
}

public interface OnNativeInvokeListener {

    int CTRL_WILL_TCP_OPEN = 0x20001;               // NO ARGS
    int CTRL_DID_TCP_OPEN = 0x20002;                // ARG_ERROR, ARG_FAMILIY, ARG_IP, ARG_PORT, ARG_FD
    int CTRL_WILL_HTTP_OPEN = 0x20003;              // ARG_URL, ARG_SEGMENT_INDEX, ARG_RETRY_COUNTER
    int CTRL_WILL_LIVE_OPEN = 0x20005;              // ARG_URL, ARG_RETRY_COUNTER
    int CTRL_WILL_CONCAT_RESOLVE_SEGMENT = 0x20007; // ARG_URL, ARG_SEGMENT_INDEX, ARG_RETRY_COUNTER

    int EVENT_WILL_HTTP_OPEN = 0x1;                 // ARG_URL
    int EVENT_DID_HTTP_OPEN = 0x2;                  // ARG_URL, ARG_ERROR, ARG_HTTP_CODE
    int EVENT_WILL_HTTP_SEEK = 0x3;                 // ARG_URL, ARG_OFFSET
    int EVENT_DID_HTTP_SEEK = 0x4;                  // ARG_URL, ARG_OFFSET, ARG_ERROR, ARG_HTTP_CODE

    String ARG_URL = "url";
    String ARG_SEGMENT_INDEX = "segment_index";
    String ARG_RETRY_COUNTER = "retry_counter";
    String ARG_ERROR = "error";
    String ARG_FAMILIY = "family";
    String ARG_IP = "ip";
    String ARG_PORT = "port";
    String ARG_FD = "fd";
    String ARG_OFFSET = "offset";
    String ARG_HTTP_CODE = "http_code";

    /*
     * @return true if invoke is handled
     * @throws Exception on any error
     */
    boolean onNativeInvoke(int what, Bundle args);
}

@CalledByNative
private static boolean onNativeInvoke(Object weakThiz, int what, Bundle args) {
    DebugLog.ifmt(TAG, "onNativeInvoke %d", what);
    if (weakThiz == null || !(weakThiz instanceof WeakReference<?>))
        throw new IllegalStateException("<null weakThiz>.onNativeInvoke()");

    @SuppressWarnings("unchecked")
    WeakReference<IjkMediaPlayer> weakPlayer = (WeakReference<IjkMediaPlayer>) weakThiz;
    IjkMediaPlayer player = weakPlayer.get();
    if (player == null)
        throw new IllegalStateException("<null weakPlayer>.onNativeInvoke()");

    OnNativeInvokeListener listener = player.mOnNativeInvokeListener;
    if (listener != null && listener.onNativeInvoke(what, args))
        return true;

    switch (what) {
        case OnNativeInvokeListener.CTRL_WILL_CONCAT_RESOLVE_SEGMENT: {
            OnControlMessageListener onControlMessageListener = player.mOnControlMessageListener;
            if (onControlMessageListener == null)
                return false;

            int segmentIndex = args.getInt(OnNativeInvokeListener.ARG_SEGMENT_INDEX, -1);
            if (segmentIndex < 0)
                throw new InvalidParameterException("onNativeInvoke(invalid segment index)");

            String newUrl = onControlMessageListener.onControlResolveSegmentUrl(segmentIndex);
            if (newUrl == null)
                throw new RuntimeException(new IOException("onNativeInvoke() = <null newUrl>"));

            args.putString(OnNativeInvokeListener.ARG_URL, newUrl);
            return true;
        }
        default:
            return false;
    }
}
So it's confirmed: this is the callback registered to notify the Java layer. Alright, back to JNI_OnLoad; only FFmpegApi_global_init is left:
#define JNI_CLASS_FFMPEG_API "tv/danmaku/ijk/media/player/ffmpeg/FFmpegApi"

......

int FFmpegApi_global_init(JNIEnv *env)
{
    int ret = 0;

    IJK_FIND_JAVA_CLASS(env, g_clazz.clazz, JNI_CLASS_FFMPEG_API);
    (*env)->RegisterNatives(env, g_clazz.clazz, g_methods, NELEM(g_methods));

    return ret;
}
Following the class name in the define, the Java class has just one line:
public class FFmpegApi {
    public static native String av_base64_encode(byte in[]);
}
It's simply a base64 encoder (the name says encode, not decode) backed by ffmpeg's C function, and this is where it gets registered.
Finally, the analysis is done. To sum it up: it's all initialization — of protocols, of decoders, of the network, and of the callbacks to the upper layer.