The previous article covered how to compile FFmpeg .so files usable on Android and how to drive them by passing a command string, which is really convenient. Here I'll use that .so to build a simple short-video maker.
Target features:
1. Extract the original audio track from the video.
2. Produce a silent video file.
3. Adjust the volume of the video's original audio.
4. Adjust the volume of the background music.
5. Mix the video's original audio with the music.
6. Merge the video and the audio.
1. Open activity_main.xml and build the layout
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical"
android:layout_width="match_parent"
android:layout_height="match_parent">
<SurfaceView
android:id="@+id/video_surface_view"
android:layout_width="match_parent"
android:layout_height="match_parent" />
<LinearLayout
android:id="@+id/rec_layout"
android:layout_gravity="bottom"
android:orientation="vertical"
android:padding="20dp"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<LinearLayout
android:layout_gravity="end"
android:gravity="center_vertical"
android:layout_marginBottom="10dp"
android:orientation="horizontal"
android:layout_width="wrap_content"
android:layout_height="wrap_content">
<View
android:layout_width="10dp"
android:layout_height="10dp"
android:layout_marginRight="5dp"
android:background="@drawable/ripple_circle"/>
<TextView
android:id="@+id/time"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/white"
android:textSize="16sp"
android:text="00:00"/>
</LinearLayout>
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ProgressBar
android:id="@+id/progress"
android:layout_width="match_parent"
android:layout_height="5dp"
android:layout_centerVertical="true"
android:alpha="0.8"
style="@style/progressBarHorizontal_color"
android:max="30" />
<View
android:id="@+id/marking"
android:layout_width="2dp"
android:layout_height="5dp"
android:alpha="0.7"
android:layout_marginLeft="45dp"
android:layout_centerVertical="true"
android:background="@android:color/holo_red_dark" />
</RelativeLayout>
<RelativeLayout
android:layout_marginTop="20dp"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/start_video"
android:layout_width="60dp"
android:layout_height="60dp"
android:layout_centerInParent="true"
android:src="@mipmap/bt_start"/>
<ImageView
android:id="@+id/start_video_ing"
android:visibility="gone"
android:layout_width="60dp"
android:layout_height="60dp"
android:layout_centerInParent="true"
android:src="@mipmap/icon_video_ing"/>
</RelativeLayout>
</LinearLayout>
<LinearLayout
android:id="@+id/top_layout"
android:layout_gravity="end"
android:layout_margin="10dip"
android:gravity="center"
android:layout_width="wrap_content"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/inversion"
android:layout_width="37dp"
android:layout_height="37dp"
android:padding="6dp"
android:src="@mipmap/icon_fanzhuan"/>
<ImageView
android:id="@+id/close"
android:layout_width="37dip"
android:layout_height="37dip"
android:padding="10dp"
android:layout_gravity="end"
android:src="@mipmap/live_close_icon" />
</LinearLayout>
</FrameLayout>
ripple_circle.xml
<?xml version="1.0" encoding="utf-8"?>
<shape xmlns:android="http://schemas.android.com/apk/res/android"
android:shape="oval">
<solid android:color="@android:color/holo_red_dark" />
<stroke
android:width="1dp"
android:color="@android:color/white" />
<size
android:height="14dp"
android:width="14dp" />
</shape>
@style/progressBarHorizontal_color
<style name="progressBarHorizontal_color" parent="android:Widget.ProgressBar.Horizontal">
<item name="android:indeterminateOnly">false</item>
<item name="android:progressDrawable">@drawable/progress_color_horizontal</item>
<item name="android:minHeight">5dip</item>
<item name="android:maxHeight">5dip</item>
</style>
progress_color_horizontal.xml
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">
<item android:id="@android:id/background">
<shape>
<corners android:radius="2dip" />
<gradient
android:startColor="#555555"
android:centerColor="#555555"
android:centerY="0.75"
android:endColor="#555555"
android:angle="270" />
</shape>
</item>
<item android:id="@android:id/secondaryProgress">
<clip>
<shape>
<corners android:radius="5dip" />
<gradient
android:startColor="#80C07AB8"
android:centerColor="#80C07AB8"
android:centerY="0.75"
android:endColor="#a0C07AB8"
android:angle="270" />
</shape>
</clip>
</item>
<item android:id="@android:id/progress">
<clip>
<shape>
<corners android:radius="2dip" />
<gradient
android:startColor="@android:color/holo_red_dark"
android:centerColor="@android:color/holo_red_dark"
android:centerY="0.75"
android:endColor="@android:color/holo_red_dark"
android:angle="270" />
</shape>
</clip>
</item>
</layer-list>
Image resources
2. Implement a class that drives the camera to record short videos
Here I wrote a simple helper class that records using Camera together with SurfaceView. I won't explain those APIs in detail; just read the source.
package com.tangyx.video.ffmpeg;
import android.app.Activity;
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import java.io.File;
import java.io.IOException;
import java.util.List;
/**
* Created by tangyx on 2017/8/2.
*
*/
public class MediaHelper implements SurfaceHolder.Callback {
private Activity activity;
private MediaRecorder mMediaRecorder;
private Camera mCamera;
private SurfaceView mSurfaceView;
private SurfaceHolder mSurfaceHolder;
private File targetDir;
private String targetName;
private File targetFile;
private boolean isRecording;
private GestureDetector mDetector;
private boolean isZoomIn = false;
private int or = 90;
private int position = Camera.CameraInfo.CAMERA_FACING_BACK;
public MediaHelper(Activity activity) {
this.activity = activity;
}
public void setTargetDir(File file) {
this.targetDir = file;
}
public void setTargetName(String name) {
this.targetName = name;
}
public String getTargetFilePath() {
return targetFile.getPath();
}
public boolean deleteTargetFile() {
if (targetFile.exists()) {
return targetFile.delete();
} else {
return false;
}
}
public void setSurfaceView(SurfaceView view) {
this.mSurfaceView = view;
mSurfaceHolder = mSurfaceView.getHolder();
mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); // deprecated no-op on API 11+, kept for older devices
mSurfaceHolder.addCallback(this);
mDetector = new GestureDetector(activity, new ZoomGestureListener());
mSurfaceView.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
mDetector.onTouchEvent(event);
return true;
}
});
}
public boolean isRecording() {
return isRecording;
}
public void record() {
if (isRecording) {
try {
mMediaRecorder.stop(); // stop the recording
} catch (RuntimeException e) {
e.printStackTrace();
targetFile.delete();
}
releaseMediaRecorder(); // release the MediaRecorder object
mCamera.lock(); // take camera access back from MediaRecorder
isRecording = false;
} else {
startRecordThread();
}
}
private boolean prepareRecord() {
try {
mMediaRecorder = new MediaRecorder();
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// mMediaRecorder.setProfile(profile);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setVideoSize(1280, 720);
// mMediaRecorder.setVideoSize(640, 480);
mMediaRecorder.setVideoEncodingBitRate(2 * 1024 * 1024);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
if (position == Camera.CameraInfo.CAMERA_FACING_BACK) {
mMediaRecorder.setOrientationHint(or);
} else {
mMediaRecorder.setOrientationHint(270);
}
targetFile = new File(targetDir, targetName);
mMediaRecorder.setOutputFile(targetFile.getPath());
} catch (Exception e) {
e.printStackTrace();
releaseMediaRecorder();
return false;
}
try {
mMediaRecorder.prepare();
} catch (IllegalStateException e) {
e.printStackTrace();
releaseMediaRecorder();
return false;
} catch (IOException e) {
e.printStackTrace();
releaseMediaRecorder();
return false;
}
return true;
}
public void stopRecordSave() {
if (isRecording) {
isRecording = false;
try {
mMediaRecorder.stop();
} catch (RuntimeException r) {
r.printStackTrace();
} finally {
releaseMediaRecorder();
}
}
}
public void stopRecordUnSave() {
if (isRecording) {
isRecording = false;
try {
mMediaRecorder.stop();
} catch (RuntimeException r) {
r.printStackTrace();
} finally {
releaseMediaRecorder();
}
if (targetFile.exists()) {
// discarding the clip: delete the file instead of saving it
targetFile.delete();
}
}
}
private void startPreView(SurfaceHolder holder) {
if (mCamera == null) {
mCamera = Camera.open(position);
}
if (mCamera != null) {
mCamera.setDisplayOrientation(or);
try {
mCamera.setPreviewDisplay(holder);
Camera.Parameters parameters = mCamera.getParameters();
List<Camera.Size> mSupportedPreviewSizes = parameters.getSupportedPreviewSizes();
if (mSupportedPreviewSizes != null) {
int width = mSurfaceView.getWidth();
int height = mSurfaceView.getHeight();
Camera.Size mPreviewSize = getOptimalPreviewSize(mSupportedPreviewSizes,
Math.max(width, height), Math.min(width, height));
parameters.setPreviewSize(mPreviewSize.width, mPreviewSize.height);
}
List<String> focusModes = parameters.getSupportedFocusModes();
if (focusModes != null) {
for (String mode : focusModes) {
if(mode.contains(Camera.Parameters.FOCUS_MODE_AUTO)){
parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
}
}
}
mCamera.setParameters(parameters);
mCamera.startPreview();
} catch (IOException e) {
e.printStackTrace();
}
}
}
public Camera.Size getOptimalPreviewSize(List<Camera.Size> sizes, int w, int h) {
final double ASPECT_TOLERANCE = 0.1;
double targetRatio = (double) w / h;
if (sizes == null) {
return null;
}
Camera.Size optimalSize = null;
double minDiff = Double.MAX_VALUE;
int targetHeight = h;
for (Camera.Size size : sizes) {
double ratio = (double) size.width / size.height;
if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE)
continue;
if (Math.abs(size.height - targetHeight) < minDiff) {
optimalSize = size;
minDiff = Math.abs(size.height - targetHeight);
}
}
if (optimalSize == null) {
minDiff = Double.MAX_VALUE;
for (Camera.Size size : sizes) {
if (Math.abs(size.height - targetHeight) < minDiff) {
optimalSize = size;
minDiff = Math.abs(size.height - targetHeight);
}
}
}
return optimalSize;
}
private void releaseMediaRecorder() {
if (mMediaRecorder != null) {
mMediaRecorder.reset();
mMediaRecorder.release();
mMediaRecorder = null;
}
}
public void releaseCamera() {
if (mCamera != null) {
mCamera.release();
mCamera = null;
}
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
mSurfaceHolder = holder;
startPreView(holder);
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
if (mCamera != null) {
releaseCamera();
}
if (mMediaRecorder != null) {
releaseMediaRecorder();
}
}
private void startRecordThread() {
if (prepareRecord()) {
try {
mMediaRecorder.start();
isRecording = true;
} catch (RuntimeException r) {
r.printStackTrace();
releaseMediaRecorder();
}
}
}
private class ZoomGestureListener extends GestureDetector.SimpleOnGestureListener {
// double-tap gesture handler
@Override
public boolean onDoubleTap(MotionEvent e) {
super.onDoubleTap(e);
if (!isZoomIn) {
setZoom(20);
isZoomIn = true;
} else {
setZoom(0);
isZoomIn = false;
}
return true;
}
}
private void setZoom(int zoomValue) {
if (mCamera != null) {
Camera.Parameters parameters = mCamera.getParameters();
if (parameters.isZoomSupported()) {
int maxZoom = parameters.getMaxZoom();
if (maxZoom == 0) {
return;
}
if (zoomValue > maxZoom) {
zoomValue = maxZoom;
}
parameters.setZoom(zoomValue);
mCamera.setParameters(parameters);
}
}
}
public void autoChangeCamera() {
if (position == Camera.CameraInfo.CAMERA_FACING_BACK) {
position = Camera.CameraInfo.CAMERA_FACING_FRONT;
} else {
position = Camera.CameraInfo.CAMERA_FACING_BACK;
}
releaseCamera();
stopRecordUnSave();
startPreView(mSurfaceHolder);
}
}
3. With the layout and helper class ready, MainActivity.java now calls the helper to record a video of at most 30 seconds and at least 8 seconds.
Add the camera permission and the file read/write permissions to AndroidManifest.xml:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera.autofocus"/>
<uses-feature android:name="android.hardware.camera"/>
On Android 6.0+, declaring permissions in the manifest is not enough; the user also has to grant them at runtime (a real hassle), so we need a permission helper class plus an activity that checks for and prompts about permissions.
Create PermissionHelper.java with the following content:
import android.content.Context;
import android.content.pm.PackageManager;
import android.support.v4.content.ContextCompat;

public class PermissionHelper {
private final Context mContext;
public PermissionHelper(Context context) {
mContext = context.getApplicationContext();
}
// check whether any permission in the set is missing
public boolean lacksPermissions(String... permissions) {
for (String permission : permissions) {
if (lacksPermission(permission)) {
return true;
}
}
return false;
}
// check whether a single permission is missing
private boolean lacksPermission(String permission) {
return ContextCompat.checkSelfPermission(mContext, permission) ==
PackageManager.PERMISSION_DENIED;
}
}
Create PermissionsActivity.java with the following content (remember to register it in AndroidManifest.xml):
/**
 * Permission request screen
 */
public class PermissionsActivity extends AppCompatActivity {
// required base permissions
public final static String[] PERMISSIONS = new String[]{
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.RECORD_AUDIO,
Manifest.permission.CAMERA
};
public static final int PERMISSIONS_GRANTED = 1010; // permissions granted
public static final int PERMISSIONS_DENIED = 1011; // permissions denied
public static final int REQUEST_CODE = 1012; // request code
private static final int PERMISSION_REQUEST_CODE = 0; // request code for the system permission dialog
private static final String EXTRA_PERMISSIONS =
"megawave.permission.extra_permission"; // intent extra key for the permissions
private static final String PACKAGE_URL_SCHEME = "package:"; // URI scheme
private PermissionHelper mChecker; // permission checker
private boolean isRequireCheck; // whether to run the check in onResume; avoids overlapping with the system dialog
private static boolean isShowSetting=true;
// public entry point for launching this permission screen
public static void startActivityForResult(Activity activity, int requestCode, String... permissions) {
startActivityForResult(activity,requestCode,true,permissions);
}
public static void startActivityForResult(Activity activity, int requestCode,boolean showSetting,String... permissions) {
Intent intent = new Intent(activity, PermissionsActivity.class);
intent.putExtra(EXTRA_PERMISSIONS, permissions);
ActivityCompat.startActivityForResult(activity, intent, requestCode, null);
isShowSetting = showSetting;
}
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if (getIntent() == null || !getIntent().hasExtra(EXTRA_PERMISSIONS)) {
throw new RuntimeException("PermissionsActivity must be launched via the static startActivityForResult method!");
}
setContentView(R.layout.activity_permissions);
mChecker = new PermissionHelper(this);
isRequireCheck = true;
}
@Override
protected void onResume() {
super.onResume();
if (isRequireCheck) {
String[] permissions = getPermissions();
if (mChecker.lacksPermissions(permissions)) {
requestPermissions(permissions); // request the permissions
} else {
allPermissionsGranted(); // all permissions already granted
}
} else {
isRequireCheck = true;
}
}
// return the permissions passed via the intent
private String[] getPermissions() {
return getIntent().getStringArrayExtra(EXTRA_PERMISSIONS);
}
// request permissions (compatible with older versions)
private void requestPermissions(String... permissions) {
ActivityCompat.requestPermissions(this, permissions, PERMISSION_REQUEST_CODE);
}
// all permissions have been granted
private void allPermissionsGranted() {
setResult(PERMISSIONS_GRANTED);
finish();
}
/**
 * Handle the user's response:
 * if every permission was granted, proceed;
 * if any permission is missing, show a dialog.
 *
 * @param requestCode request code
 * @param permissions permissions
 * @param grantResults grant results
 */
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
if (requestCode == PERMISSION_REQUEST_CODE && hasAllPermissionsGranted(grantResults)) {
isRequireCheck = true;
allPermissionsGranted();
} else {
isRequireCheck = false;
if(isShowSetting){
showMissingPermissionDialog();
}
}
}
// whether every permission was granted
private boolean hasAllPermissionsGranted(@NonNull int[] grantResults) {
for (int grantResult : grantResults) {
if (grantResult == PackageManager.PERMISSION_DENIED) {
return false;
}
}
return true;
}
// show the missing-permissions dialog
public void showMissingPermissionDialog() {
AlertDialog.Builder builder = new AlertDialog.Builder(PermissionsActivity.this);
builder.setTitle(R.string.label_help);
builder.setMessage(R.string.tips_permissions);
// declined: exit the app
builder.setNegativeButton(R.string.label_quit, new DialogInterface.OnClickListener() {
@Override public void onClick(DialogInterface dialog, int which) {
setResult(-100);
finish();
}
});
builder.setPositiveButton(R.string.label_setting, new DialogInterface.OnClickListener() {
@Override public void onClick(DialogInterface dialog, int which) {
startAppSettings();
}
});
builder.setCancelable(false);
builder.show();
}
// open this app's system settings page
private void startAppSettings() {
Intent intent = new Intent(Settings.ACTION_APPLICATION_DETAILS_SETTINGS);
intent.setData(Uri.parse(PACKAGE_URL_SCHEME + getPackageName()));
startActivity(intent);
}
}
Back in MainActivity, look up the SurfaceView, initialize the MediaHelper helper, and start the camera.
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
requestWindowFeature(Window.FEATURE_NO_TITLE);
WindowManager.LayoutParams p = this.getWindow().getAttributes();
p.flags |= WindowManager.LayoutParams.FLAG_FULLSCREEN; // |= OR-assigns the fullscreen flag
getWindow().setAttributes(p);
setContentView(R.layout.activity_main);
mSurfaceView = (SurfaceView) findViewById(R.id.video_surface_view);
mStartVideo = (ImageView) findViewById(R.id.start_video);
mStartVideoIng = (ImageView) findViewById(R.id.start_video_ing);
mProgress = (ProgressBar) findViewById(R.id.progress);
mTime = (TextView) findViewById(R.id.time);
findViewById(R.id.close).setOnClickListener(this);
findViewById(R.id.inversion).setOnClickListener(this);
mStartVideo.setOnClickListener(this);
mStartVideoIng.setOnClickListener(this);
// initialize the helper class
mMediaHelper = new MediaHelper(this);
// set the root directory where videos are stored
mMediaHelper.setTargetDir(new File(new FileUtils(this).getStorageDirectory()));
// set the recorded video's file name
mMediaHelper.setTargetName(UUID.randomUUID() + ".mp4");
mPermissionHelper = new PermissionHelper(this);
}
@Override
protected void onResume() {
super.onResume();
if(mPermissionHelper.lacksPermissions(PermissionsActivity.PERMISSIONS)){
PermissionsActivity.startActivityForResult(this,PermissionsActivity.REQUEST_CODE,PermissionsActivity.PERMISSIONS);
}else{
// start the camera
mMediaHelper.setSurfaceView(mSurfaceView);
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if(resultCode == PermissionsActivity.PERMISSIONS_GRANTED){
// start the camera
mMediaHelper.setSurfaceView(mSurfaceView);
}else if(resultCode == -100){
finish();
}
}
The FileUtils.java utility class:
public class FileUtils {
/**
 * root path of the SD card
 */
private static String mSdRootPath = Environment.getExternalStorageDirectory().getPath();
/**
 * root path of the app's internal cache
 */
private static String mDataRootPath = null;
/**
 * folder name for this app's files
 */
private final static String FOLDER_NAME = "/ffmpeg";
public final static String IMAGE_NAME = "/cache";
public FileUtils(Context context){
mDataRootPath = context.getCacheDir().getPath();
makeAppDir();
}
public String makeAppDir(){
String path = getStorageDirectory();
File folderFile = new File(path);
if(!folderFile.exists()){
folderFile.mkdir();
}
path = path + IMAGE_NAME;
folderFile = new File(path);
if(!folderFile.exists()){
folderFile.mkdir();
}
return path;
}
/**
 * Get the storage directory for media files.
 * @return the directory path
 */
public String getStorageDirectory(){
String localPath = Environment.getExternalStorageState().equals(Environment.MEDIA_MOUNTED) ?
mSdRootPath + FOLDER_NAME : mDataRootPath + FOLDER_NAME;
File folderFile = new File(localPath);
if(!folderFile.exists()){
folderFile.mkdir();
}
return localPath;
}
/**
 * Recursively delete files under deletePath, keeping the file at videoPath.
 */
public void deleteFile(String deletePath,String videoPath) {
File file = new File(deletePath);
if (file.exists()) {
File[] files = file.listFiles();
for (File f : files) {
if(f.isDirectory()){
if(f.listFiles().length==0){
f.delete();
}else{
deleteFile(f.getAbsolutePath(),videoPath);
}
}else if(!f.getAbsolutePath().equals(videoPath)){
f.delete();
}
}
}
}
}
4. Once the camera is up, tapping the record button starts recording.
@Override
public void onClick(View view) {
switch (view.getId()){
case R.id.close:
mMediaHelper.stopRecordUnSave();
finish();
break;
case R.id.start_video:
mProgressNumber = 0;
mProgress.setProgress(0);
mMediaHelper.record();
startView();
break;
case R.id.start_video_ing:
if(mProgressNumber == 0){
stopView(false);
break;
}
if (mProgressNumber < 8) {
// too short: discard the recording
Toast.makeText(this, "Please record at least up to the red line", Toast.LENGTH_LONG).show();
mMediaHelper.stopRecordUnSave();
stopView(false);
break;
}
// stop recording and save
mMediaHelper.stopRecordSave();
stopView(true);
break;
case R.id.inversion:
mMediaHelper.stopRecordUnSave();
stopView(false);
mMediaHelper.autoChangeCamera();
break;
}
}
private void startView(){
mStartVideo.setVisibility(View.GONE);
mStartVideoIng.setVisibility(View.VISIBLE);
mProgressNumber = 0;
mTime.setText("00:00");
handler.removeMessages(0);
handler.sendMessage(handler.obtainMessage(0));
}
private void stopView(boolean isSave){
int timer = mProgressNumber;
mProgressNumber = 0;
mProgress.setProgress(0);
handler.removeMessages(0);
mTime.setText("00:00");
if(isSave) {
String path = mMediaHelper.getTargetFilePath();
Intent intent = new Intent(this,MakeVideoActivity.class);
intent.putExtra("path",path);
intent.putExtra("time",timer);
startActivity(intent);
}
mStartVideoIng.setVisibility(View.GONE);
mStartVideo.setVisibility(View.VISIBLE);
}
Handler handler = new Handler() {
@Override
public void handleMessage(Message msg) {
switch (msg.what) {
case 0:
mProgress.setProgress(mProgressNumber);
mTime.setText("00:"+(mProgressNumber<10?"0"+mProgressNumber:mProgressNumber));
if(mProgress.getProgress() >= mProgress.getMax()){
mMediaHelper.stopRecordSave();
stopView(true);
}else if (mMediaHelper.isRecording()){
mProgressNumber = mProgressNumber + 1;
sendMessageDelayed(handler.obtainMessage(0), 1000);
}
break;
}
}
};
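The timer above drives both the progress bar and the `mTime` label. The label formatting is plain string logic; here is a self-contained sketch of it (no Android dependencies; the `format` method name is mine):

```java
// Formats the elapsed-seconds counter as the "00:SS" label used by the
// recording timer above. Plain Java, no Android dependencies.
public class TimerLabel {
    static String format(int seconds) {
        // pad single-digit seconds with a leading zero, e.g. 5 -> "00:05"
        return "00:" + (seconds < 10 ? "0" + seconds : String.valueOf(seconds));
    }

    public static void main(String[] args) {
        System.out.println(format(5));   // 00:05
        System.out.println(format(30));  // 00:30
    }
}
```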
That's the recording basically done. Tap stop, and if the clip is longer than 8 seconds we jump to the next screen to edit the video. (It's only a demo, but the details still need handling, or the demo would be riddled with little problems and I'd get flamed for it.)
5. In the stopView logic above, once recording ends we jump to a MakeVideoActivity, carrying the video's path and duration.
Create MakeVideoActivity (remember to register it in AndroidManifest.xml). This is where the main target features are implemented, the audio/video processing, the most important part of this article, done entirely with FFmpeg.
Open the FFmpegRun class and add the following code.
package com.tangyx.video.ffmpeg;
import android.os.AsyncTask;
/**
* Created by tangyx
* Date 2017/8/1
* email tangyx@live.com
*/
public class FFmpegRun {
static {
System.loadLibrary("ffmpeg");
System.loadLibrary("ffmpeginvoke");
}
public static void execute(String[] commands, final FFmpegRunListener fFmpegRunListener) {
new AsyncTask<String[], Integer, Integer>() {
@Override
protected void onPreExecute() {
if (fFmpegRunListener != null) {
fFmpegRunListener.onStart();
}
}
@Override
protected Integer doInBackground(String[]... params) {
return run(params[0]);
}
@Override
protected void onPostExecute(Integer integer) {
if (fFmpegRunListener != null) {
fFmpegRunListener.onEnd(integer);
}
}
}.execute(commands);
}
public native static int run(String[] commands);
public interface FFmpegRunListener{
void onStart();
void onEnd(int result);
}
}
There isn't much to it: it wraps the native call for convenient external use and loads the required .so files.
Since we drive FFmpeg by passing it a command, and running a command takes time, the work is done on a background thread and the outcome is delivered through a callback interface; callers check the result parameter of onEnd to know whether the command succeeded.
result value: 0 means success; anything else is failure.
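An onEnd callback usually just branches on that result code. Here is a minimal plain-Java sketch; the listener interface is duplicated inside it only so the snippet compiles standalone, whereas in the project you would implement FFmpegRun.FFmpegRunListener directly:

```java
// Sketch of interpreting the result code delivered to onEnd.
// The listener interface is copied here so this compiles on its own;
// in the app you implement FFmpegRun.FFmpegRunListener instead.
public class ResultDemo {
    interface FFmpegRunListener {
        void onStart();
        void onEnd(int result);
    }

    // 0 means the ffmpeg command succeeded; anything else is a failure
    static String describe(int result) {
        return result == 0 ? "success" : "failure (code " + result + ")";
    }

    public static void main(String[] args) {
        FFmpegRunListener listener = new FFmpegRunListener() {
            @Override public void onStart() { System.out.println("ffmpeg start..."); }
            @Override public void onEnd(int result) { System.out.println(describe(result)); }
        };
        listener.onStart();
        listener.onEnd(0); // prints "success"
    }
}
```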
6. Build the editing layout for MakeVideoActivity
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical"
android:gravity="center"
android:background="@android:color/black"
android:layout_width="match_parent"
android:layout_height="match_parent">
<VideoView
android:id="@+id/video"
android:layout_gravity="center"
android:layout_width="match_parent"
android:layout_height="match_parent" />
<RelativeLayout
android:id="@+id/title_layout"
android:background="#50000000"
android:paddingRight="15dp"
android:paddingLeft="5dp"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/back"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:padding="10dp"
android:layout_centerVertical="true"
android:src="@mipmap/icon_back_white" />
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerInParent="true"
android:textColor="@android:color/white"
android:text="Make" />
<TextView
android:id="@+id/next"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentRight="true"
android:textColor="@android:color/white"
android:padding="10dp"
android:text="Next" />
</RelativeLayout>
<LinearLayout
android:id="@+id/editor_layout"
android:orientation="vertical"
android:gravity="center"
android:paddingTop="10dp"
android:paddingBottom="10dp"
android:paddingLeft="30dp"
android:paddingRight="30dp"
android:background="#50000000"
android:layout_gravity="bottom"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<LinearLayout
android:id="@+id/video_layout"
android:gravity="center"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/white"
android:textSize="13sp"
android:text="Original" />
<android.support.v7.widget.AppCompatSeekBar
android:id="@+id/video_seek_bar"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:maxHeight="1.5dp"
style="@style/video_seek_bar"
android:progress="50"
android:max="100" />
</LinearLayout>
<LinearLayout
android:gravity="center"
android:layout_marginTop="10dp"
android:orientation="horizontal"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/white"
android:textSize="13sp"
android:text="Music" />
<android.support.v7.widget.AppCompatSeekBar
android:id="@+id/music_seek_bar"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:maxHeight="1.5dp"
style="@style/video_seek_bar"
android:progress="50"
android:max="100" />
</LinearLayout>
<RelativeLayout
android:padding="30dp"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<TextView
android:id="@+id/local_music"
android:layout_centerInParent="true"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:drawablePadding="10dp"
android:paddingLeft="10dp"
android:paddingRight="10dp"
android:paddingBottom="5dp"
android:paddingTop="5dp"
android:background="@android:color/white"
android:textColor="@android:color/black"
android:text="Choose local music" />
</RelativeLayout>
</LinearLayout>
</FrameLayout>
<style name="video_seek_bar">
<item name="android:thumb">@mipmap/kaibo_icon_huakuai</item>
<item name="android:progressDrawable">@drawable/video_seekbar</item>
</style>
video_seekbar.xml
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">
<item android:id="@android:id/background">
<shape>
<corners android:radius="5dp" />
<solid android:color="#f3f3f3" />
</shape>
</item>
<item android:id="@android:id/secondaryProgress">
<clip>
<shape>
<corners android:radius="5dp" />
<solid android:color="#f3f3f3" />
</shape>
</clip>
</item>
<item android:id="@android:id/progress">
<clip>
<shape>
<corners android:radius="5dp" />
<solid android:color="#f15a23" />
</shape>
</clip>
</item>
</layer-list>
Resource files
Here's the idea first (roughly):
Original audio: the sound recorded with the video; sliding the original-audio seekbar changes its volume in real time.
Backing music: see the "choose local music" button below? The white one with black text. Tapping it opens a list of local music; pick a song, come back to this screen, and it plays, with its volume controlled in real time by the music seekbar.
The two tracks play at the same time, and each volume can be adjusted without affecting the other. Both the original audio and the music are played with MediaPlayer.
So step one is to split the audio out of the video (separated with FFmpeg); the video plays in a VideoView, and the extracted audio is played and controlled through MediaPlayer.
7. Since calling an FFmpeg feature only requires passing a command and waiting for the result, create a new class (FFmpegCommands.java) to manage the FFmpeg commands we need.
Open FFmpegCommands.java and add the following two methods:
/**
 * Extract the audio track on its own.
 *
 * @param videoUrl source video path
 * @param outUrl output path for the audio
 * @return the ffmpeg command as an argument array
 */
public static String[] extractAudio(String videoUrl, String outUrl) {
String[] commands = new String[8];
commands[0] = "ffmpeg";
commands[1] = "-i";
commands[2] = videoUrl;
commands[3] = "-acodec";
commands[4] = "copy";
commands[5] = "-vn";
commands[6] = "-y";
commands[7] = outUrl;
return commands;
}
This method assembles an ffmpeg command: ffmpeg -i videoUrl (path of the recorded video) -acodec copy -vn -y outUrl (path for the extracted audio)
A quick rundown of the arguments:
-i input file
-acodec select the audio codec to use
copy copy the original encoded data without re-encoding
-vn drop the video stream (extract audio only)
-y overwrite the output file if one with the same name exists
For more options, see the official documentation -----> FFmpeg documentation
Here I stick to the target features of this article; dig into the rest of the syntax yourself once you've got the basics.
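To see that the argument array and the command line above are the same thing, here's a standalone sketch that rebuilds the extractAudio array and joins it with spaces (the paths are placeholder values):

```java
// Rebuilds the extractAudio argument array and prints the equivalent
// command line. The input/output paths are placeholders.
public class CommandDemo {
    static String[] extractAudio(String videoUrl, String outUrl) {
        return new String[]{"ffmpeg", "-i", videoUrl, "-acodec", "copy", "-vn", "-y", outUrl};
    }

    public static void main(String[] args) {
        String[] cmd = extractAudio("/sdcard/ffmpeg/in.mp4", "/sdcard/ffmpeg/audio.aac");
        // ffmpeg -i /sdcard/ffmpeg/in.mp4 -acodec copy -vn -y /sdcard/ffmpeg/audio.aac
        System.out.println(String.join(" ", cmd));
    }
}
```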
/**
 * Extract the video track on its own, with no audio.
 *
 * @param videoUrl source video path
 * @param outUrl output path for the silent video
 * @return the ffmpeg command as an argument array
 */
public static String[] extractVideo(String videoUrl, String outUrl) {
String[] commands = new String[8];
commands[0] = "ffmpeg";
commands[1] = "-i";
commands[2] = videoUrl;
commands[3] = "-vcodec";
commands[4] = "copy";
commands[5] = "-an";
commands[6] = "-y";
commands[7] = outUrl;
return commands;
}
These two methods produce, respectively, the video-only file (no sound) and the audio file (the video's original sound).
Back in MakeVideoActivity, start calling FFmpeg.
Initialize the layout:
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_make_video);
mVideoView = (VideoView) findViewById(R.id.video);
mAudioSeekBar = (AppCompatSeekBar) findViewById(R.id.video_seek_bar);
mMusicSeekBar = (AppCompatSeekBar) findViewById(R.id.music_seek_bar);
mAudioSeekBar.setOnSeekBarChangeListener(this);
mMusicSeekBar.setOnSeekBarChangeListener(this);
findViewById(R.id.next).setOnClickListener(this);
findViewById(R.id.back).setOnClickListener(this);
findViewById(R.id.local_music).setOnClickListener(this);
isPlayer = getIntent().getBooleanExtra(this.getClass().getSimpleName(), false);
if (isPlayer) {
findViewById(R.id.title_layout).setVisibility(View.GONE);
findViewById(R.id.editor_layout).setVisibility(View.GONE);
mVideoView.setVideoPath(getIntent().getStringExtra("path"));
mVideoView.start();
}else{
mFileUtils = new FileUtils(this);
mTargetPath = mFileUtils.getStorageDirectory();
extractVideo();
}
}
The other buttons need no explanation. The isPlayer flag deserves one: after tapping Next and generating the final file, I Intent back into this same MakeVideoActivity with isPlayer=true, and it simply plays the finished video. (A shortcut that saves creating one more activity.)
Add the following methods:
/**
 * Extract the silent video track.
 */
private void extractVideo() {
final String outVideo = mTargetPath + "/video.mp4";
String[] commands = FFmpegCommands.extractVideo(getIntent().getStringExtra("path"), outVideo);
FFmpegRun.execute(commands, new FFmpegRun.FFmpegRunListener() {
@Override
public void onStart() {
mMediaPath = new ArrayList<>();
Log.e(TAG,"extractVideo ffmpeg start...");
}
@Override
public void onEnd(int result) {
Log.e(TAG,"extractVideo ffmpeg end...");
mMediaPath.add(outVideo);
extractAudio();
}
});
}
/**
 * Extract the audio track.
 */
private void extractAudio() {
final String outVideo = mTargetPath + "/audio.aac";
String[] commands = FFmpegCommands.extractAudio(getIntent().getStringExtra("path"), outVideo);
FFmpegRun.execute(commands, new FFmpegRun.FFmpegRunListener() {
@Override
public void onStart() {
mAudioPlayer = new MediaPlayer();
}
@Override
public void onEnd(int result) {
Log.e(TAG,"extractAudio ffmpeg end...");
mMediaPath.add(outVideo);
String path = mMediaPath.get(0);
mVideoView.setVideoPath(path);
try {
mAudioPlayer.setDataSource(mMediaPath.get(1));
mAudioPlayer.setLooping(true);
mAudioPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mAudioPlayer.setVolume(0.5f, 0.5f);
mAudioPlayer.start();
}
});
mAudioPlayer.prepare();
} catch (IOException e) {
e.printStackTrace();
}
}
});
}
Call extractVideo from onCreate to kick off the FFmpeg commands. If nothing goes wrong, once they finish the audio plays through MediaPlayer and the video loads into the VideoView, with the audio defaulting to 50% volume.
Implement the seekbar change listener:
@Override
public void onProgressChanged(SeekBar seekBar, int i, boolean b) {
float volume = i / 100f;
if (mAudioSeekBar == seekBar) {
mAudioPlayer.setVolume(volume, volume);
} else if(mMusicPlayer!=null){
mMusicPlayer.setVolume(volume, volume);
}
}
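The listener maps the seekbar's 0-100 progress linearly onto MediaPlayer's 0.0-1.0 volume scale; the conversion on its own:

```java
// Maps a seekbar progress value (0..100) to the 0.0f..1.0f range
// expected by MediaPlayer.setVolume. Plain Java sketch.
public class VolumeDemo {
    static float toVolume(int progress) {
        return progress / 100f; // int divided by float yields a fraction
    }

    public static void main(String[] args) {
        System.out.println(toVolume(50)); // 0.5
    }
}
```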
这里的视频和音频已经被单独分开,大概是这样(gif质量较差,勉强看哈这个意思就行,声音也无法听见,很尴尬。)
其实能够实现这一步说明我们编译的ffmpeg so文件是没有任何问题的,可以传递更多的命令来完成更多的功能,所以继续下一步,选择一首音乐合成到视频中,最后视频有原声也有音乐,并且两种声音控制在最后分别选择的音量进度上。
8. Pick a local music track and cut the audio file to the same length as the video.
Create a MusicActivity to show the list of local music:
public class MusicActivity extends AppCompatActivity {
    private ListView mListView;
    private MusicAdapter mAdapter;

    @Override
    protected void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_music);
        mListView = (ListView) findViewById(R.id.list);
        findViewById(R.id.back).setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                finish();
            }
        });
        new SongTask().execute();
    }

    private class SongTask extends AsyncTask<Void, Void, List<Music>> implements AdapterView.OnItemClickListener {
        @Override
        protected void onPreExecute() {
            super.onPreExecute();
        }

        @Override
        protected List<Music> doInBackground(Void... voids) {
            List<Music> musics = new ArrayList<>();
            Cursor cursor = getApplicationContext().getContentResolver().query(
                    MediaStore.Audio.Media.EXTERNAL_CONTENT_URI, null,
                    MediaStore.Audio.Media.DATA + " like ?",
                    new String[]{Environment.getExternalStorageDirectory() + File.separator + "%"},
                    MediaStore.Audio.Media.DEFAULT_SORT_ORDER);
            if (cursor != null) {
                for (cursor.moveToFirst(); !cursor.isAfterLast(); cursor.moveToNext()) {
                    Music music = new Music();
                    String isMusic = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.IS_MUSIC));
                    // IS_MUSIC is "0"/"1"; skip ringtones, notifications etc.
                    if ("0".equals(isMusic)) continue;
                    // int duration = cursor.getInt(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.DURATION));
                    String path = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.DATA));
                    Log.e("SLog", "music:" + path);
                    if (!path.endsWith(".mp3")) {
                        continue;
                    }
                    String title = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.TITLE));
                    String artist = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media.ARTIST));
                    music.setId(cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Audio.Media._ID)));
                    music.setName(title);
                    music.setSingerName(artist);
                    music.setSongUrl(path);
                    musics.add(music);
                }
                cursor.close();
            }
            return musics;
        }

        @Override
        protected void onPostExecute(List<Music> musics) {
            super.onPostExecute(musics);
            mAdapter = new MusicAdapter(MusicActivity.this, musics);
            mListView.setAdapter(mAdapter);
            mListView.setOnItemClickListener(this);
        }

        @Override
        public void onItemClick(AdapterView<?> adapterView, View view, int i, long l) {
            Music music = mAdapter.getItem(i);
            Intent intent = new Intent();
            intent.putExtra("music", music.getSongUrl());
            setResult(10000, intent);
            finish();
        }
    }
}
The inner SongTask class scans the local music on a background thread so the main UI isn't blocked. Many audio formats could be used for mixing, but each may need different handling, so for now I only collect MP3 files.
The adapter is straightforward, so I won't paste it here; it is included in the source code uploaded at the end.
9. The video is under 30 seconds while a typical mp3 runs several minutes, so after picking a track and returning to MakeVideoActivity, the music has to be cut with an ffmpeg command before it can be played.
Open FFmpegCommands and add the cutting command:
/**
 * Cut the music track
 */
public static String[] cutIntoMusic(String musicUrl, long second, String outUrl) {
    String[] commands = new String[10];
    commands[0] = "ffmpeg";
    commands[1] = "-i";
    commands[2] = musicUrl;
    commands[3] = "-ss";
    commands[4] = "00:00:10";
    commands[5] = "-t";
    commands[6] = String.valueOf(second);
    commands[7] = "-acodec";
    commands[8] = "copy";
    commands[9] = outUrl;
    return commands;
}
Parameter notes:
-ss is the start position in the source audio; here I start from the 10th second.
-t is the duration to keep; I pass in the video's length for this value.
The resulting audio is the slice of the mp3 from second 10 to second 10 + second.
That way the cut audio ends up exactly as long as the video.
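ffmpeg accepts either a plain second count (as used for -t here) or an HH:MM:SS timestamp (as used for -ss). If you wanted to build the -ss value from a number instead of hard-coding "00:00:10", a small helper like this would do (hypothetical, not part of the article's code):

```java
import java.util.Locale;

class TimeUtil {
    /** Format a second count as the HH:MM:SS timestamp ffmpeg accepts for -ss/-t. */
    static String toTimestamp(long totalSeconds) {
        long h = totalSeconds / 3600;
        long m = (totalSeconds % 3600) / 60;
        long s = totalSeconds % 60;
        return String.format(Locale.US, "%02d:%02d:%02d", h, m, s);
    }
}
```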
Now add a method that runs this cut:
private void cutSelectMusic(String musicUrl) {
    final String musicPath = mTargetPath + "/bgMusic.aac";
    long time = getIntent().getIntExtra("time", 0);
    String[] commands = FFmpegCommands.cutIntoMusic(musicUrl, time, musicPath);
    FFmpegRun.execute(commands, new FFmpegRun.FFmpegRunListener() {
        @Override
        public void onStart() {
            Log.e(TAG, "cutSelectMusic ffmpeg start...");
        }

        @Override
        public void onEnd(int result) {
            Log.e(TAG, "cutSelectMusic ffmpeg end...");
            if (mMusicPlayer != null) { // drop the previously selected background music
                mMediaPath.remove(mMediaPath.size() - 1);
            }
            mMediaPath.add(musicPath);
            stopMediaPlayer();
            mMusicPlayer = new MediaPlayer();
            try {
                mMusicPlayer.setDataSource(musicPath);
                mMusicPlayer.setLooping(true);
                mMusicPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                    @Override
                    public void onPrepared(MediaPlayer mediaPlayer) {
                        mediaPlayer.setVolume(0.5f, 0.5f);
                        mediaPlayer.start();
                        mMusicSeekBar.setProgress(50);
                    }
                });
                mMusicPlayer.prepareAsync();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    });
}
Call it from onActivityResult:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (resultCode == 10000 && data != null) {
        String music = data.getStringExtra("music");
        cutSelectMusic(music);
    }
}
The editing page now works with three separate files: the silent video being played, the video's original audio, and the background music just selected. At this point everything we need is in place; set the volumes you want and tap next to start composing.
10. As a newcomer this part was genuinely painful, the sort of "from getting started to giving up" feeling. Even now I haven't fully worked out muxing multiple audio tracks into one video; I went down plenty of dead ends that either failed or fell short of the requirements. I'll skip the detours here so I don't drag you into the same ditch, and go straight to the approach that finally worked for combining the video, its original audio, and the background music:
1. From the two SeekBar positions, use ffmpeg commands to adjust the volume of the original audio and of the background music.
2. Use ffmpeg to mix the original audio and the background music into one audio file.
3. Mux the resulting audio file into the video with another ffmpeg command.
Open FFmpegCommands and add three methods.
/**
 * Change the volume of an audio file
 * @param audioOrMusicUrl
 * @param vol
 * @param outUrl
 * @return
 */
public static String[] changeAudioOrMusicVol(String audioOrMusicUrl, int vol, String outUrl) {
    if (SLog.debug)
        SLog.w("audioOrMusicUrl:" + audioOrMusicUrl + "\nvol:" + vol + "\noutUrl:" + outUrl);
    String[] commands = new String[8];
    commands[0] = "ffmpeg";
    commands[1] = "-i";
    commands[2] = audioOrMusicUrl;
    commands[3] = "-vol";
    commands[4] = String.valueOf(vol);
    commands[5] = "-acodec";
    commands[6] = "copy";
    commands[7] = outUrl;
    return commands;
}
-vol is the value that actually changes the volume.
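To put numbers on that (this is my reading of the ffmpeg docs, not something the article states): -vol is a deprecated option measured in units where 256 means the original volume. The article later passes progress * 10, progress being the SeekBar's 0-100 value, so roughly 26 is the neutral position and 100 amplifies to about 4x:

```java
class VolMapping {
    // 256 is "unchanged" in the units of ffmpeg's deprecated -vol option
    // (assumption from the ffmpeg docs, not from the article).
    static final int FFMPEG_VOL_NORMAL = 256;

    /** progress * 10, exactly as composeVideoAudio() passes it. */
    static int progressToVol(int progress) {
        return progress * 10;
    }

    /** Resulting gain relative to the original volume. */
    static double gain(int progress) {
        return progressToVol(progress) / (double) FFMPEG_VOL_NORMAL;
    }
}
```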
/**
 * Mix two audio files into one
 * @param audio1
 * @param audio2
 * @param outputUrl
 * @return
 */
public static String[] composeAudio(String audio1, String audio2, String outputUrl) {
    Log.w("SLog", "audio1:" + audio1 + "\naudio2:" + audio2 + "\noutputUrl:" + outputUrl);
    String[] commands = new String[10];
    commands[0] = "ffmpeg";
    // first input
    commands[1] = "-i";
    commands[2] = audio1;
    // second input (the music)
    commands[3] = "-i";
    commands[4] = audio2;
    // mix the two inputs with the amix filter
    commands[5] = "-filter_complex";
    commands[6] = "amix=inputs=2:duration=first:dropout_transition=2";
    // allow the experimental AAC encoder
    commands[7] = "-strict";
    commands[8] = "-2";
    // output file
    commands[9] = outputUrl;
    return commands;
}
This mixes the two audio files into one.
-filter_complex is very powerful and offers a large set of filters; the official documentation covers many more effects.
Here I use it to mix the 2 audio inputs; duration=first makes the mixed output as long as the first input.
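As an aside, the separate per-file -vol passes could in principle be folded into this same filter graph by giving each input its own volume filter before amix. This is not what the article does; a hypothetical one-pass builder might look like:

```java
import java.util.Locale;

class AudioMixCommands {
    /**
     * Hypothetical alternative: scale each input's volume inside the
     * filter graph (1.0 = unchanged), then mix with amix in one pass.
     */
    static String[] composeAudioWithVolumes(String audio1, String audio2,
                                            double vol1, double vol2,
                                            String outputUrl) {
        String filter = String.format(Locale.US,
                "[0:a]volume=%.2f[a0];[1:a]volume=%.2f[a1];"
                        + "[a0][a1]amix=inputs=2:duration=first:dropout_transition=2",
                vol1, vol2);
        return new String[]{
                "ffmpeg",
                "-i", audio1,
                "-i", audio2,
                "-filter_complex", filter,
                "-strict", "-2",
                outputUrl
        };
    }
}
```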
/**
 * Mux the audio and the video together
 * @param videoUrl
 * @param musicOrAudio
 * @param outputUrl
 * @param second
 * @return
 */
public static String[] composeVideo(String videoUrl, String musicOrAudio, String outputUrl, long second) {
    Log.w("SLog", "videoUrl:" + videoUrl + "\nmusicOrAudio:" + musicOrAudio + "\noutputUrl:" + outputUrl + "\nsecond:" + second);
    String[] commands = new String[14];
    commands[0] = "ffmpeg";
    // video input
    commands[1] = "-i";
    commands[2] = videoUrl;
    // audio input
    commands[3] = "-i";
    commands[4] = musicOrAudio;
    commands[5] = "-ss";
    commands[6] = "00:00:00";
    commands[7] = "-t";
    commands[8] = String.valueOf(second);
    // copy both streams without re-encoding
    commands[9] = "-vcodec";
    commands[10] = "copy";
    commands[11] = "-acodec";
    commands[12] = "copy";
    // output file
    commands[13] = outputUrl;
    return commands;
}
This muxes an audio file into the video.
Back in MakeVideoActivity, tapping next kicks off the final composition.
Extend onClick to handle the next button:
@Override
public void onClick(View view) {
    switch (view.getId()) {
        case R.id.back:
            finish();
            mFileUtils.deleteFile(mTargetPath, null);
            break;
        case R.id.local_music:
            Intent intent = new Intent(this, MusicActivity.class);
            startActivityForResult(intent, 0);
            break;
        case R.id.next:
            composeVideoAudio();
            mNext.setTextColor(Color.parseColor("#999999"));
            mNext.setEnabled(false);
            break;
    }
}
/**
 * Process the video's original audio
 */
private void composeVideoAudio() {
    int mAudioVol = mAudioSeekBar.getProgress();
    String audioUrl = mMediaPath.get(1);
    final String audioOutUrl = mTargetPath + "/tempAudio.aac";
    String[] common = FFmpegCommands.changeAudioOrMusicVol(audioUrl, mAudioVol * 10, audioOutUrl);
    FFmpegRun.execute(common, new FFmpegRun.FFmpegRunListener() {
        @Override
        public void onStart() {
            Log.e(TAG, "changeAudioVol ffmpeg start...");
            handler.sendEmptyMessage(0);
        }

        @Override
        public void onEnd(int result) {
            Log.e(TAG, "changeAudioVol ffmpeg end...");
            if (mMediaPath.size() == 3) {
                composeVideoMusic(audioOutUrl);
            } else {
                composeMusicAndAudio(audioOutUrl);
            }
        }
    });
}
There is a branch in onEnd: if no local music was selected there is no background music to mix in, so the adjusted original audio is muxed straight into the video to finish the job; otherwise the selected background music is processed next.
/**
 * Process the background music
 */
private void composeVideoMusic(final String audioUrl) {
    final int mMusicVol = mMusicSeekBar.getProgress();
    String musicUrl;
    if (audioUrl == null) {
        musicUrl = mMediaPath.get(1);
    } else {
        musicUrl = mMediaPath.get(2);
    }
    final String musicOutUrl = mTargetPath + "/tempMusic.aac";
    final String[] common = FFmpegCommands.changeAudioOrMusicVol(musicUrl, mMusicVol * 10, musicOutUrl);
    FFmpegRun.execute(common, new FFmpegRun.FFmpegRunListener() {
        @Override
        public void onStart() {
            Log.e(TAG, "changeMusicVol ffmpeg start...");
            handler.sendEmptyMessage(0);
        }

        @Override
        public void onEnd(int result) {
            Log.e(TAG, "changeMusicVol ffmpeg end...");
            composeAudioAndMusic(audioUrl, musicOutUrl);
        }
    });
}
Once the original audio and the background music have both been adjusted, mix the two into a single audio file:
/**
 * Mix the original audio and the background music
 */
public void composeAudioAndMusic(String audioUrl, String musicUrl) {
    if (audioUrl == null) {
        composeMusicAndAudio(musicUrl);
    } else {
        final String musicAudioPath = mTargetPath + "/audioMusic.aac";
        String[] common = FFmpegCommands.composeAudio(audioUrl, musicUrl, musicAudioPath);
        FFmpegRun.execute(common, new FFmpegRun.FFmpegRunListener() {
            @Override
            public void onStart() {
                Log.e(TAG, "composeAudioAndMusic ffmpeg start...");
                handler.sendEmptyMessage(0);
            }

            @Override
            public void onEnd(int result) {
                Log.e(TAG, "composeAudioAndMusic ffmpeg end...");
                composeMusicAndAudio(musicAudioPath);
            }
        });
    }
}
Then mux the final audio file into the silent video:
/**
 * Mux the final audio into the video
 *
 * @param bgMusicAndAudio
 */
private void composeMusicAndAudio(String bgMusicAndAudio) {
    final String videoAudioPath = mTargetPath + "/videoMusicAudio.mp4";
    final String videoUrl = mMediaPath.get(0);
    final int time = getIntent().getIntExtra("time", 0) - 1;
    String[] common = FFmpegCommands.composeVideo(videoUrl, bgMusicAndAudio, videoAudioPath, time);
    FFmpegRun.execute(common, new FFmpegRun.FFmpegRunListener() {
        @Override
        public void onStart() {
            Log.e(TAG, "videoAndAudio ffmpeg start...");
            handler.sendEmptyMessage(0);
        }

        @Override
        public void onEnd(int result) {
            Log.e(TAG, "videoAndAudio ffmpeg end...");
            handleVideoNext(videoAudioPath);
        }
    });
}
If nothing goes wrong you now have the finished video file, ready for whatever logic comes next.
/**
 * Composition finished; move on to the next step
 */
private void handleVideoNext(String videoUrl) {
    Message message = new Message();
    message.what = 1;
    message.obj = videoUrl;
    handler.sendMessage(message);
}

Handler handler = new Handler() {
    @Override
    public void handleMessage(Message msg) {
        super.handleMessage(msg);
        switch (msg.what) {
            case 0:
                showProgressLoading();
                break;
            case 1:
                dismissProgress();
                String videoPath = (String) msg.obj;
                Intent intent = new Intent(MakeVideoActivity.this, MakeVideoActivity.class);
                intent.putExtra("path", videoPath);
                // inside handleMessage, `this` is the anonymous Handler,
                // so name the extra's key via the Activity class instead
                intent.putExtra(MakeVideoActivity.class.getSimpleName(), true);
                startActivity(intent);
                finish();
                break;
            case 2:
                dismissProgress();
                break;
        }
    }
};

private void showProgressLoading() {
}

private void dismissProgress() {
}
Here it simply jumps back into the same page and plays the finished video.
That covers all of the target features. The ffmpeg functionality used here is only the tip of the iceberg; there is far more to dig into, and I hope these two articles can serve as your introduction to ffmpeg on Android.
When I find the time I'll also put together an Android OpenCV demo.
Source code:
Compiling the FFmpeg so files in Android Studio
Composing short videos on Android with FFmpeg
Reading doesn't necessarily change your fate, but learning always moves you forward.