1. Download the FFmpeg 3.3.1 source code and extract it.
2. Configure the environment variables. Create a `.bash_profile` file under `/Users/zhangyipeng/` (no need to create it if it already exists), open it, and add the following:
# Replace this with your own NDK directory; here I use the NDK downloaded through Android Studio
export ANDROID_HOME=/Users/zhangyipeng/Library/Android/sdk
export ANDROID_NDK_HOME=$ANDROID_HOME/ndk-bundle
export PATH=${PATH}:$ANDROID_HOME/platform-tools
export PATH=$PATH:$ANDROID_NDK_HOME
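After saving the file, reload it with `source ~/.bash_profile` so the current shell picks up the changes. A minimal sketch of checking that the variables expand as intended (the paths are the example values from above; substitute your own):

```shell
# Re-declare the exports (normally done by sourcing ~/.bash_profile)
export ANDROID_HOME=/Users/zhangyipeng/Library/Android/sdk
export ANDROID_NDK_HOME=$ANDROID_HOME/ndk-bundle
export PATH=${PATH}:$ANDROID_HOME/platform-tools
export PATH=$PATH:$ANDROID_NDK_HOME
# ANDROID_NDK_HOME should expand to the full ndk-bundle path
echo "$ANDROID_NDK_HOME"
```

Note that `ANDROID_NDK_HOME` is built from `ANDROID_HOME`, so the order of the exports matters.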
3. Before building, modify the `configure` file in the ffmpeg-3.3.1 directory as follows: comment out the first four lines and replace them with the uncommented versions below.
#SLIBNAME_WITH_MAJOR='$(SLIBNAME).$(LIBMAJOR)'
#LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
#SLIB_INSTALL_NAME='$(SLIBNAME_WITH_VERSION)'
#SLIB_INSTALL_LINKS='$(SLIBNAME_WITH_MAJOR) $(SLIBNAME)'
SLIBNAME_WITH_MAJOR='$(SLIBPREF)$(FULLNAME)-$(LIBMAJOR)$(SLIBSUF)'
LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
SLIB_INSTALL_NAME='$(SLIBNAME_WITH_MAJOR)'
SLIB_INSTALL_LINKS='$(SLIBNAME)'
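The reason for this edit: FFmpeg's default naming puts the version number after the `.so` suffix (e.g. `libavcodec.so.57`), and Android's dynamic loader only loads libraries whose names end in `.so`. The patched `SLIBNAME_WITH_MAJOR` moves the major version before the suffix. A small sketch of the difference, using avcodec's major version from FFmpeg 3.3:

```shell
# Variables as configure would expand them for libavcodec in FFmpeg 3.3
SLIBPREF=lib; SLIBSUF=.so; FULLNAME=avcodec; LIBMAJOR=57
# Default naming: version after the suffix -- rejected by the Android loader
DEFAULT="${SLIBPREF}${FULLNAME}${SLIBSUF}.${LIBMAJOR}"
# Patched naming: version before the suffix -- loadable via System.loadLibrary
PATCHED="${SLIBPREF}${FULLNAME}-${LIBMAJOR}${SLIBSUF}"
echo "$DEFAULT -> $PATCHED"
```

This is why the libraries produced later are named `libavcodec-57.so`, `libavformat-57.so`, and so on.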
4. Create a file named build_android.sh in the ffmpeg-3.3.1 directory. Open a terminal, cd into ffmpeg-3.3.1, and run the following command to make the file executable:
chmod +x build_android.sh
The file's contents are as follows:
#!/bin/bash
# Path to the NDK; set it according to your own install location
NDK=/Users/zhangyipeng/Library/Android/sdk/ndk-bundle
# Target platform for the build; adjust to your needs.
# Here we target android-14 at minimum on the arm architecture; the generated
# .so libraries go under libs/armeabi. For x86, use arch-x86 instead.
PLATFORM=$NDK/platforms/android-14/arch-arm
# Path to the toolchain; it varies with the target platform.
# arm-linux-androideabi-4.9 matches the PLATFORM set above; 4.9 is the toolchain
# version, which depends on your installed NDK; generally use the newest one.
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/darwin-x86_64
function build_one
{
./configure \
--prefix=$PREFIX \
--target-os=linux \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--arch=arm \
--sysroot=$PLATFORM \
--extra-cflags="-I$PLATFORM/usr/include $OPTIMIZE_CFLAGS" \
--cc=$TOOLCHAIN/bin/arm-linux-androideabi-gcc \
--nm=$TOOLCHAIN/bin/arm-linux-androideabi-nm \
--enable-shared \
--enable-runtime-cpudetect \
--enable-gpl \
--enable-small \
--enable-cross-compile \
--disable-debug \
--disable-static \
--disable-doc \
--disable-asm \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--disable-postproc \
--disable-avdevice \
--disable-symver \
--disable-stripping \
$ADDITIONAL_CONFIGURE_FLAG
sed -i '' 's/HAVE_LRINT 0/HAVE_LRINT 1/g' config.h
sed -i '' 's/HAVE_LRINTF 0/HAVE_LRINTF 1/g' config.h
sed -i '' 's/HAVE_ROUND 0/HAVE_ROUND 1/g' config.h
sed -i '' 's/HAVE_ROUNDF 0/HAVE_ROUNDF 1/g' config.h
sed -i '' 's/HAVE_TRUNC 0/HAVE_TRUNC 1/g' config.h
sed -i '' 's/HAVE_TRUNCF 0/HAVE_TRUNCF 1/g' config.h
sed -i '' 's/HAVE_CBRT 0/HAVE_CBRT 1/g' config.h
sed -i '' 's/HAVE_RINT 0/HAVE_RINT 1/g' config.h
make clean
make -j4
make install
}
# arm v7vfp
CPU=armv7-a
OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU "
PREFIX=./android/$CPU-vfp
ADDITIONAL_CONFIGURE_FLAG=
build_one
# CPU=armv
# PREFIX=$(pwd)/android/$CPU
# ADDI_CFLAGS="-marm"
# build_one
#arm v6
#CPU=armv6
#OPTIMIZE_CFLAGS="-marm -march=$CPU"
#PREFIX=./android/$CPU
#ADDITIONAL_CONFIGURE_FLAG=
#build_one
#arm v7vfpv3
# CPU=armv7-a
# OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=$CPU "
# PREFIX=./android/$CPU
# ADDITIONAL_CONFIGURE_FLAG=
# build_one
#arm v7n
#CPU=armv7-a
#OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=neon -marm -march=$CPU -mtune=cortex-a8"
#PREFIX=./android/$CPU
#ADDITIONAL_CONFIGURE_FLAG=--enable-neon
#build_one
#arm v6+vfp
#CPU=armv6
#OPTIMIZE_CFLAGS="-DCMP_HAVE_VFP -mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU"
#PREFIX=./android/${CPU}_vfp
#ADDITIONAL_CONFIGURE_FLAG=
#build_one
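One portability note on the `sed -i ''` lines above: the empty string is BSD/macOS syntax for in-place editing without a backup file, and GNU sed on Linux expects plain `-i`, so the script as written only runs on macOS. (The substitutions themselves flip `HAVE_*` flags that configure tends to misdetect when cross-compiling.) A portable sketch of the same `HAVE_LRINT` substitution via a temporary file:

```shell
# Stand-in config.h with the flag configure got wrong
printf '#define HAVE_LRINT 0\n' > config.h
# Portable in-place edit: write to a temp file, then move it over the original
sed 's/HAVE_LRINT 0/HAVE_LRINT 1/g' config.h > config.h.tmp && mv config.h.tmp config.h
cat config.h
```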
5. Run build_android.sh to start compiling FFmpeg into .so shared libraries. The command is:
./build_android.sh
6. The build takes roughly 10–20 minutes. When it finishes, a folder named android is created in the ffmpeg-3.3.1 directory, and the shared libraries are inside it, as shown in the figure.
7. As the figure above shows, six shared libraries are generated. If six separate libraries feel cumbersome to use, you can produce a single one instead: create another file, build_android_all.sh, under ffmpeg-3.3.1 and repeat the steps above, which yields one library named libffmpeg.so.
The contents of build_android_all.sh are as follows:
#!/bin/bash
# Path to the NDK; set it according to your own install location
NDK=/Users/zhangyipeng/Library/Android/sdk/ndk-bundle
# Target platform for the build; adjust to your needs.
# Here we target android-14 at minimum on the arm architecture; the generated
# .so libraries go under libs/armeabi. For x86, use arch-x86 instead.
PLATFORM=$NDK/platforms/android-14/arch-arm
# Path to the toolchain; it varies with the target platform.
# arm-linux-androideabi-4.9 matches the PLATFORM set above; 4.9 is the toolchain
# version, which depends on your installed NDK; generally use the newest one.
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/darwin-x86_64
function build_one
{
./configure \
--prefix=$PREFIX \
--target-os=linux \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--arch=arm \
--sysroot=$PLATFORM \
--extra-cflags="-I$PLATFORM/usr/include $OPTIMIZE_CFLAGS" \
--cc=$TOOLCHAIN/bin/arm-linux-androideabi-gcc \
--nm=$TOOLCHAIN/bin/arm-linux-androideabi-nm \
--disable-shared \
--enable-runtime-cpudetect \
--enable-gpl \
--enable-small \
--enable-cross-compile \
--disable-debug \
--enable-static \
--disable-doc \
--disable-asm \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--disable-postproc \
--disable-avdevice \
--disable-symver \
--disable-stripping \
$ADDITIONAL_CONFIGURE_FLAG
sed -i '' 's/HAVE_LRINT 0/HAVE_LRINT 1/g' config.h
sed -i '' 's/HAVE_LRINTF 0/HAVE_LRINTF 1/g' config.h
sed -i '' 's/HAVE_ROUND 0/HAVE_ROUND 1/g' config.h
sed -i '' 's/HAVE_ROUNDF 0/HAVE_ROUNDF 1/g' config.h
sed -i '' 's/HAVE_TRUNC 0/HAVE_TRUNC 1/g' config.h
sed -i '' 's/HAVE_TRUNCF 0/HAVE_TRUNCF 1/g' config.h
sed -i '' 's/HAVE_CBRT 0/HAVE_CBRT 1/g' config.h
sed -i '' 's/HAVE_RINT 0/HAVE_RINT 1/g' config.h
make clean
make -j4
make install
$TOOLCHAIN/bin/arm-linux-androideabi-ld \
-rpath-link=$PLATFORM/usr/lib \
-L$PLATFORM/usr/lib \
-L$PREFIX/lib \
-soname libffmpeg.so -shared -nostdlib -Bsymbolic --whole-archive --no-undefined -o \
$PREFIX/libffmpeg.so \
libavcodec/libavcodec.a \
libavfilter/libavfilter.a \
libswresample/libswresample.a \
libavformat/libavformat.a \
libavutil/libavutil.a \
libswscale/libswscale.a \
-lc -lm -lz -ldl -llog --dynamic-linker=/system/bin/linker \
$TOOLCHAIN/lib/gcc/arm-linux-androideabi/4.9/libgcc.a
}
# arm v7vfp
CPU=armv7-a
OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU "
PREFIX=./android/$CPU-vfp-all
ADDITIONAL_CONFIGURE_FLAG=
build_one
# CPU=armv
# PREFIX=$(pwd)/android/$CPU
# ADDI_CFLAGS="-marm"
# build_one
#arm v6
#CPU=armv6
#OPTIMIZE_CFLAGS="-marm -march=$CPU"
#PREFIX=./android/$CPU
#ADDITIONAL_CONFIGURE_FLAG=
#build_one
#arm v7vfpv3
# CPU=armv7-a
# OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=$CPU "
# PREFIX=./android/$CPU
# ADDITIONAL_CONFIGURE_FLAG=
# build_one
#arm v7n
#CPU=armv7-a
#OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=neon -marm -march=$CPU -mtune=cortex-a8"
#PREFIX=./android/$CPU
#ADDITIONAL_CONFIGURE_FLAG=--enable-neon
#build_one
#arm v6+vfp
#CPU=armv6
#OPTIMIZE_CFLAGS="-DCMP_HAVE_VFP -mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU"
#PREFIX=./android/${CPU}_vfp
#ADDITIONAL_CONFIGURE_FLAG=
#build_one
8. In the ffmpeg-3.3.1 directory, run the following commands:
chmod +x build_android_all.sh
./build_android_all.sh
9. When the script finishes, a single shared library named libffmpeg.so has been generated, as shown below:
10. This library is effectively the six libraries above merged into one, so it behaves the same while being more convenient to use.
As the figure shows, build_android_all.sh first builds six static libraries and then links those archives (the six .a files) into a single shared library, libffmpeg.so, whereas build_android.sh produces six shared libraries directly. The two scripts are compared below:
a. Differences between the scripts:
b. Differences in the generated files:
11. Next, create an app project in Android Studio and use the libraries built above to write a simple demo that calls FFmpeg through JNI to play a network video.
12. Create an Android project named FFmpegAndroidDemo, create a jni folder under the main directory, and copy the include folder plus libffmpeg.so (or the include folder plus the six .so files) from the figure above into jni. The resulting structure is shown below:
13. Add the following line to gradle.properties in the project root:
android.useDeprecatedNdk=true
14. Add the following line to local.properties in the project root:
ndk.dir=/Users/zhangyipeng/Library/Android/sdk/ndk-bundle
15. Create FFmpegNdk.java with the following code:
public class FFmpegNdk {
static {
// System.loadLibrary("avcodec-57");
// System.loadLibrary("avfilter-6");
// System.loadLibrary("avformat-57");
// System.loadLibrary("avutil-55");
// System.loadLibrary("swresample-2");
// System.loadLibrary("swscale-4");
System.loadLibrary("ffmpeg");
System.loadLibrary("myffmpeg");
}
public static native String avcodecinfo();
public static native int playVideo(String url, Object surface);
}
16. MainActivity.java is as follows:
public class MainActivity extends AppCompatActivity implements SurfaceHolder.Callback {
private SurfaceView mSurfaceView;
private SurfaceHolder mSurfaceHolder;
public static final String url2 = "http://58.135.196.138:8090/live/db3bd108e3364bf3888ccaf8377af077/index.m3u8";
public static final String url = "http://tx2.a.yximgs.com/upic/2017/06/06/12/BMjAxNzA2MDYxMjA3MDJfOTg5MDkwODRfMjMzMzY5NjI3OV8xXzM=_hd.mp4?tag=1-1496888787-h-0-2gpzxdvetp-f9da4113e6f3de74";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
final TextView tv = (TextView) findViewById(R.id.tv);
mSurfaceView = (SurfaceView) findViewById(R.id.surface_view); // assign the field; declaring a local here would shadow it
tv.setMovementMethod(ScrollingMovementMethod.getInstance());
findViewById(R.id.button).setOnClickListener(new View.OnClickListener() {
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN)
@Override
public void onClick(View v) {
tv.setText(FFmpegNdk.avcodecinfo());
startActivity(new Intent(MainActivity.this,VideoActivity.class));
}
});
mSurfaceHolder = mSurfaceView.getHolder();
mSurfaceHolder.addCallback(this);
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
new Thread(new Runnable() {
@Override
public void run() {
FFmpegNdk.playVideo(url,mSurfaceHolder.getSurface());
}
}).start();
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
}
}
17. Use the javah command to generate the .h header file. From the FFmpegAndroidDemo/app/src/main/java directory, run the command below to generate com_zyp_ffmpegandroiddemo_FFmpegNdk.h in the jni directory (on JDK 10 and later, where javah was removed, `javac -h ../jni` on the source file does the same job):
javah -d ../jni com.zyp.ffmpegandroiddemo.FFmpegNdk
18. Create ffmpeg_ndk.c in the jni directory with the following code:
/* DO NOT EDIT THIS FILE - it is machine generated */
#include <jni.h>
#include <stdio.h>
#include "include/libavcodec/avcodec.h"
#include "include/libavformat/avformat.h"
#include "include/libavfilter/avfilter.h"
#include "include/libswscale/swscale.h"
#include "include/libavutil/imgutils.h"
//#include "com_zyp_ffmpegandroiddemo_FFmpegNdk.h"
#include <android/native_window.h>
#include <android/native_window_jni.h>
#include <android/log.h>
#include "util.h"
#define TAG "ffmpeg_android_tag"
#define LOGD(...) __android_log_print(ANDROID_LOG_DEBUG,TAG ,__VA_ARGS__) // debug-level log macro
#define LOGI(...) __android_log_print(ANDROID_LOG_INFO,TAG ,__VA_ARGS__) // info-level log macro
#define LOGW(...) __android_log_print(ANDROID_LOG_WARN,TAG ,__VA_ARGS__) // warn-level log macro
#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR,TAG ,__VA_ARGS__) // error-level log macro
#define LOGF(...) __android_log_print(ANDROID_LOG_FATAL,TAG ,__VA_ARGS__) // fatal-level log macro
/* Header for class com_zyp_ffmpegandroiddemo_FFmpegNdk */
#ifndef _Included_com_zyp_ffmpegandroiddemo_FFmpegNdk
#define _Included_com_zyp_ffmpegandroiddemo_FFmpegNdk
#ifdef __cplusplus
extern "C" {
#endif
/*
* Class: com_zyp_ffmpegandroiddemo_FFmpegNdk
* Method: avcodecinfo
* Signature: ()V
*/
JNIEXPORT jstring JNICALL Java_com_zyp_ffmpegandroiddemo_FFmpegNdk_avcodecinfo(JNIEnv *env, jclass clazz) {
char info[40000] = {0};
// Append through a moving cursor; calling sprintf with the same buffer as both
// source and destination (sprintf(info, "%s...", info)) is undefined behavior.
char *p = info;
av_register_all();
AVCodec *c_temp = av_codec_next(NULL);
while (c_temp != NULL) {
if (c_temp->decode != NULL) {
p += sprintf(p, "[Dec]");
} else {
p += sprintf(p, "[Enc]");
}
switch (c_temp->type) {
case AVMEDIA_TYPE_VIDEO:
p += sprintf(p, "[Video]");
break;
case AVMEDIA_TYPE_AUDIO:
p += sprintf(p, "[Audio]");
break;
default:
p += sprintf(p, "[Other]");
break;
}
p += sprintf(p, "[%10s]\n", c_temp->name);
c_temp = c_temp->next;
}
return (*env)->NewStringUTF(env, info);
}
/*
* Class: com_zyp_ffmpegandroiddemo_FFmpegNdk
* Method: playVideo
* Signature: (Ljava/lang/String;Ljava/lang/Object;)I
*/
JNIEXPORT jint JNICALL Java_com_zyp_ffmpegandroiddemo_FFmpegNdk_playVideo(JNIEnv *env, jclass clazz, jstring url, jobject surface) {
char * url2 = jstringTostring(env, url); // convert the jstring first; passing a jstring to %s is invalid
LOGD("start playvideo... url : %s", url2);
av_register_all();
AVFormatContext * pFormatCtx = avformat_alloc_context();
// Open video file
if(avformat_open_input(&pFormatCtx, url2, NULL, NULL)!=0) {
LOGE("Couldn't open file:%s\n", url2);
return -1; // Couldn't open file
}
// Retrieve stream information
if(avformat_find_stream_info(pFormatCtx, NULL)<0) {
LOGE("Couldn't find stream information.");
return -1;
}
// Find the first video stream
int videoStream = -1, i;
for (i = 0; i < pFormatCtx->nb_streams; i++) {
if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO
&& videoStream < 0) {
videoStream = i;
}
}
if(videoStream==-1) {
LOGE("Didn't find a video stream.");
return -1; // Didn't find a video stream
}
// Get a pointer to the codec context for the video stream
AVCodecContext * pCodecCtx = pFormatCtx->streams[videoStream]->codec;
// Find the decoder for the video stream
AVCodec * pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
if(pCodec==NULL) {
LOGE("Codec not found.");
return -1; // Codec not found
}
if(avcodec_open2(pCodecCtx, pCodec, NULL) < 0) {
LOGE("Could not open codec.");
return -1; // Could not open codec
}
// Get the native window
ANativeWindow* nativeWindow = ANativeWindow_fromSurface(env, surface);
// Get the video dimensions
int videoWidth = pCodecCtx->width;
int videoHeight = pCodecCtx->height;
// Set the native window's buffer size; the content is scaled automatically
ANativeWindow_setBuffersGeometry(nativeWindow, videoWidth, videoHeight, WINDOW_FORMAT_RGBA_8888);
ANativeWindow_Buffer windowBuffer;
// Allocate video frame
AVFrame * pFrame = av_frame_alloc();
// Frame used for rendering (RGBA)
AVFrame * pFrameRGBA = av_frame_alloc();
if(pFrameRGBA == NULL || pFrame == NULL) {
LOGE("Could not allocate video frame.");
return -1;
}
// Determine required buffer size and allocate buffer
// the data in buffer is what gets rendered, and it is in RGBA format
int numBytes=av_image_get_buffer_size(AV_PIX_FMT_RGBA, pCodecCtx->width, pCodecCtx->height, 1);
uint8_t * buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
av_image_fill_arrays(pFrameRGBA->data, pFrameRGBA->linesize, buffer, AV_PIX_FMT_RGBA,
pCodecCtx->width, pCodecCtx->height, 1);
// Decoded frames are not in RGBA, so convert the format before rendering
struct SwsContext *sws_ctx = sws_getContext(pCodecCtx->width,
pCodecCtx->height,
pCodecCtx->pix_fmt,
pCodecCtx->width,
pCodecCtx->height,
AV_PIX_FMT_RGBA,
SWS_BILINEAR, // the flags argument is required; omitting it leaves the call with too few arguments
NULL,
NULL,
NULL);
int frameFinished;
AVPacket packet;
while(av_read_frame(pFormatCtx, &packet)>=0) {
// Is this a packet from the video stream?
if(packet.stream_index==videoStream) {
// Decode video frame
avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
// A single decode call does not always yield a complete frame
if (frameFinished) {
// lock native window buffer
ANativeWindow_lock(nativeWindow, &windowBuffer, 0);
// Convert the pixel format
sws_scale(sws_ctx, (uint8_t const * const *)pFrame->data,
pFrame->linesize, 0, pCodecCtx->height,
pFrameRGBA->data, pFrameRGBA->linesize);
// Compute the strides
uint8_t * dst = windowBuffer.bits;
int dstStride = windowBuffer.stride * 4;
uint8_t * src = (uint8_t*) (pFrameRGBA->data[0]);
int srcStride = pFrameRGBA->linesize[0];
// The window stride differs from the frame stride, so copy row by row
int h;
for (h = 0; h < videoHeight; h++) {
memcpy(dst + h * dstStride, src + h * srcStride, srcStride);
}
ANativeWindow_unlockAndPost(nativeWindow);
}
}
av_packet_unref(&packet);
}
av_free(buffer);
av_frame_free(&pFrameRGBA);
// Free the YUV frame
av_frame_free(&pFrame);
// Release the scaler and the native window
sws_freeContext(sws_ctx);
ANativeWindow_release(nativeWindow);
// Close the codec
avcodec_close(pCodecCtx);
// Close the video file
avformat_close_input(&pFormatCtx);
return 0;
}
#ifdef __cplusplus
}
#endif
#endif
19. Create the Android.mk file with the following contents:
# From the java directory, run: javah -d ../jni com.zyp.ffmpegandroiddemo.FFmpegNdk
LOCAL_PATH := $(call my-dir)
# FFmpeg libraries
#include $(CLEAR_VARS)
#LOCAL_MODULE := avcodec
#LOCAL_SRC_FILES := libavcodec-57.so
#include $(PREBUILT_SHARED_LIBRARY)
#
#include $(CLEAR_VARS)
#LOCAL_MODULE := avfilter
#LOCAL_SRC_FILES := libavfilter-6.so
#include $(PREBUILT_SHARED_LIBRARY)
#
#include $(CLEAR_VARS)
#LOCAL_MODULE := avformat
#LOCAL_SRC_FILES := libavformat-57.so
#include $(PREBUILT_SHARED_LIBRARY)
#
#include $(CLEAR_VARS)
#LOCAL_MODULE := avutil
#LOCAL_SRC_FILES := libavutil-55.so
#include $(PREBUILT_SHARED_LIBRARY)
#
#include $(CLEAR_VARS)
#LOCAL_MODULE := swresample
#LOCAL_SRC_FILES := libswresample-2.so
#include $(PREBUILT_SHARED_LIBRARY)
#
#include $(CLEAR_VARS)
#LOCAL_MODULE := swscale
#LOCAL_SRC_FILES := libswscale-4.so
#include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libffmpeg
LOCAL_SRC_FILES := libffmpeg.so
include $(PREBUILT_SHARED_LIBRARY)
#Program
include $(CLEAR_VARS)
LOCAL_MODULE := myffmpeg
LOCAL_SRC_FILES := ffmpeg_ndk.c
LOCAL_C_INCLUDES += $(LOCAL_PATH)/include/
LOCAL_LDLIBS := -llog -lz -landroid
#LOCAL_SHARED_LIBRARIES := avcodec avdevice avfilter avformat avutil postproc swresample swscale
LOCAL_SHARED_LIBRARIES := ffmpeg
include $(BUILD_SHARED_LIBRARY)
20. Create the Application.mk file with the following contents:
APP_MODULES := myffmpeg
APP_ABI := armeabi armeabi-v7a
APP_PLATFORM := android-10
21. Run ndk-build in the jni directory; the .so libraries are generated in the libs directory next to jni (i.e. src/main/libs).
22. Open the build.gradle file under the app directory and add the following:
android{
...
sourceSets.main {
jni.srcDirs = []
res.srcDirs = ['src/main/res']
jniLibs.srcDirs = ['src/main/libs']
}
...
}
23. Everything is now configured and the code and scripts are complete. Click Run to launch the app; the result looks like this:
(The video URL was captured from Kuaishou.)
24. This article is based mainly on the following two articles, combined with my own hands-on experience; thanks to their authors:
A step-by-step illustrated guide to compiling the FFmpeg library with Android Studio and porting it to Android
The simplest FFmpeg-based Android example (part 3): compiling FFmpeg into a single SO library
25. Demo download link