In fact, besides combining with MediaCodec to apply filters while recording video, OpenGL ES can also be combined with MediaPlayer to play video and apply filters during playback.
Core principle
The basic idea is to use a SurfaceTexture as MediaPlayer's output: create a Surface from the SurfaceTexture and hand that Surface to MediaPlayer as its display target.
Just like a camera preview, each new frame is pulled in by calling SurfaceTexture's updateTexImage(), which updates the texture (textureId) that the SurfaceTexture was created from; that texture is then handed to OpenGL for rendering, where the filter is applied.
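A minimal sketch of how the pieces might be wired together on the Activity side (illustrative only, not code from the linked project; the class name and wiring are assumptions, while MediaGLRenderer is shown in full below):

import android.graphics.SurfaceTexture
import android.opengl.GLSurfaceView
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

class MediaPlayActivity : AppCompatActivity(), SurfaceTexture.OnFrameAvailableListener {
    private lateinit var glSurfaceView: GLSurfaceView
    private lateinit var renderer: MediaGLRenderer

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        glSurfaceView = GLSurfaceView(this)
        glSurfaceView.setEGLContextClientVersion(3)   // GLES30 calls need an ES 3.x context
        renderer = MediaGLRenderer(this, this)
        glSurfaceView.setRenderer(renderer)
        // Redraw only when a new video frame arrives instead of rendering continuously.
        glSurfaceView.renderMode = GLSurfaceView.RENDERMODE_WHEN_DIRTY
        // Try to keep the EGL context (and the OES texture) alive across pause/resume.
        glSurfaceView.preserveEGLContextOnPause = true
        setContentView(glSurfaceView)
    }

    // SurfaceTexture reports every decoded frame here; request a redraw so that
    // onDrawFrame() can call updateTexImage() on the GL thread.
    override fun onFrameAvailable(surfaceTexture: SurfaceTexture?) {
        glSurfaceView.requestRender()
    }
}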
SurfaceTexture
SurfaceTexture was introduced in Android 3.0 (API 11). It is similar to SurfaceView in that it can receive an image stream, for example from a camera preview or a video decoder. Unlike SurfaceView, however, a SurfaceTexture does not display the frames it receives; it turns them into a GL external (OES) texture, so the stream can be post-processed (camera filters, desktop effects, and so on).
SurfaceTexture obtains frames from an image stream (a camera preview, video decoding, a GL-rendered scene, etc.). Each time updateTexImage() is called, the GL texture associated with the SurfaceTexture is updated to the most recent frame in the stream.
Main implementation code
import android.content.Context
import android.graphics.SurfaceTexture
import android.media.MediaPlayer
import android.opengl.GLES11Ext
import android.opengl.GLES30
import android.opengl.GLSurfaceView
import android.opengl.Matrix
import android.util.Log
import android.view.Surface
import java.io.IOException
import java.nio.FloatBuffer
import javax.microedition.khronos.egl.EGLConfig
import javax.microedition.khronos.opengles.GL10

class MediaGLRenderer(ctx: Context?, listener: SurfaceTexture.OnFrameAvailableListener?) : GLSurfaceView.Renderer {
    private var mContext: Context? = null
    // The projection and camera matrices are kept in the base class in the original
    // project so they can be shared with other drawables.
    private val mMVPMatrix = FloatArray(16)
    private val mTempMatrix = FloatArray(16)
    protected var mProjectMatrix = FloatArray(16)
    protected var mCameraMatrix = FloatArray(16)
    private var mProgram = 0
    // Full-screen quad (x, y, z per vertex); the order was adjusted because the
    // original orientation was wrong.
    private val mPosCoordinate = floatArrayOf(
        -1f, -1f, 1f,
        -1f, 1f, 1f,
        1f, -1f, 1f,
        1f, 1f, 1f)
    private val mTexCoordinate = floatArrayOf(0f, 0f, 0f, 1f, 1f, 0f, 1f, 1f)
    private var mPosBuffer: FloatBuffer? = null
    private var mTexBuffer: FloatBuffer? = null
    lateinit var mPlayer: MediaPlayer
    //!!! Change this to a video path that exists on your own device.
    private val videoUrl: String = "/storage/emulated/0/Android/data/aom.example.dj.appgl/files/big_buck_bunny.mp4"
    // private val videoUrl: String = "http://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4"
    private var textureId = 0
    private lateinit var surfaceTexture: SurfaceTexture
    private var listener: SurfaceTexture.OnFrameAvailableListener? = null
    private var uPosHandle = 0
    private var aTexHandle = 0
    private var mMVPMatrixHandle = 0
    private var mTexRotateMatrixHandle = 0
    // Transform matrix reported by SurfaceTexture.getTransformMatrix().
    private val rotateOriMatrix = FloatArray(16)

    init {
        this.mContext = ctx
        this.listener = listener   // keep the frame-available listener for initMediaPlayer()
        Matrix.setIdentityM(mProjectMatrix, 0)
        Matrix.setIdentityM(mCameraMatrix, 0)
        Matrix.setIdentityM(mMVPMatrix, 0)
        Matrix.setIdentityM(mTempMatrix, 0)
        mPlayer = MediaPlayer()
        mPosBuffer = GLDataUtil.createFloatBuffer(mPosCoordinate)
        mTexBuffer = GLDataUtil.createFloatBuffer(mTexCoordinate)
    }
    fun initMediaPlayer() {
        mPlayer.reset()
        mPlayer.setDataSource(videoUrl)
        val texture = IntArray(1)
        GLES30.glGenTextures(1, texture, 0) // generate one OpenGL texture
        textureId = texture[0]
        // Bind it as an external OES texture and set its filtering/wrapping parameters.
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture[0])
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR.toFloat())
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR.toFloat())
        GLES30.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE)
        GLES30.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE)
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0)
        // Wrap the texture in a SurfaceTexture and hand a Surface built from it to MediaPlayer.
        surfaceTexture = SurfaceTexture(textureId)
        surfaceTexture.setOnFrameAvailableListener(listener)
        val surface = Surface(surfaceTexture)
        mPlayer.setSurface(surface)
        surface.release()
        try {
            mPlayer.prepare()
            mPlayer.start()
        } catch (e: IOException) {
            Log.e(TAG, "initMediaPlayer: $e")
        }
    }
    override fun onSurfaceCreated(gl: GL10?, config: EGLConfig?) {
        // Compile the vertex shader.
        val vertexShaderStr = ResReadUtils.readResource(R.raw.vertex_media_player_shade)
        val vertexShaderId = ShaderUtils.compileVertexShader(vertexShaderStr)
        // Compile the fragment shader.
        // fragment_media_player_normal_shade    -- no filter
        // fragment_media_player_nostalgia_shade -- nostalgia (sepia) filter
        // fragment_media_player_negative_shade  -- negative (inverted colors) filter
        val fragmentShaderStr = ResReadUtils.readResource(R.raw.fragment_media_player_normal_shade)
        val fragmentShaderId = ShaderUtils.compileFragmentShader(fragmentShaderStr)
        // Link the program.
        mProgram = ShaderUtils.linkProgram(vertexShaderId, fragmentShaderId)
        initMediaPlayer()
    }
    override fun onSurfaceChanged(gl: GL10?, width: Int, height: Int) {
        GLES30.glViewport(0, 0, width, height)
    }
    override fun onDrawFrame(gl: GL10?) {
        GLES30.glEnable(GLES30.GL_DEPTH_TEST)
        GLES30.glClear(GLES30.GL_COLOR_BUFFER_BIT or GLES30.GL_DEPTH_BUFFER_BIT)
        GLES30.glUseProgram(mProgram)
        uPosHandle = GLES30.glGetAttribLocation(mProgram, "position")
        aTexHandle = GLES30.glGetAttribLocation(mProgram, "inputTextureCoordinate")
        mMVPMatrixHandle = GLES30.glGetUniformLocation(mProgram, "textureTransform")
        mTexRotateMatrixHandle = GLES30.glGetUniformLocation(mProgram, "uTextRotateMatrix")
        // Pass mMVPMatrix to the shader's textureTransform uniform. Here it is still the
        // identity matrix (normally it would be the product of frustumM/setLookAtM via
        // multiplyMM), and the vertex shader below does not currently use it.
        GLES30.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mMVPMatrix, 0)
        // updateTexImage() pulls the latest decoded frame into the OES texture and also
        // implicitly binds it to GL_TEXTURE_EXTERNAL_OES, so no explicit bind is needed here.
        surfaceTexture.updateTexImage()
        // Why fetch the transform matrix? See "Setting the matrix" below.
        surfaceTexture.getTransformMatrix(rotateOriMatrix)
        GLES30.glVertexAttribPointer(uPosHandle, 3, GLES30.GL_FLOAT, false, 0, mPosBuffer)
        GLES30.glVertexAttribPointer(aTexHandle, 2, GLES30.GL_FLOAT, false, 0, mTexBuffer)
        GLES30.glUniformMatrix4fv(mTexRotateMatrixHandle, 1, false, rotateOriMatrix, 0)
        GLES30.glEnableVertexAttribArray(uPosHandle)
        GLES30.glEnableVertexAttribArray(aTexHandle)
        // 4 vertices: mPosCoordinate.size / 3, because each vertex has x, y and z.
        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, mPosCoordinate.size / 3)
        GLES30.glDisableVertexAttribArray(uPosHandle)
        GLES30.glDisableVertexAttribArray(aTexHandle)
        GLES30.glUseProgram(0)
    }
    companion object {
        private const val TAG = "MediaGLRenderer"
    }
}
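GLDataUtil, ResReadUtils and ShaderUtils are small helpers from the linked project; ResReadUtils.readResource() simply reads a raw resource file into a String. Their exact implementations are not shown in this article, so the following is only a rough sketch of what they need to do (real code should also check compile/link status):

import android.opengl.GLES30
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.FloatBuffer

object GLDataUtil {
    // Wrap a float array in a direct, native-order FloatBuffer for glVertexAttribPointer.
    fun createFloatBuffer(data: FloatArray): FloatBuffer =
        ByteBuffer.allocateDirect(data.size * 4)
            .order(ByteOrder.nativeOrder())
            .asFloatBuffer()
            .apply { put(data); position(0) }
}

object ShaderUtils {
    fun compileVertexShader(source: String) = compileShader(GLES30.GL_VERTEX_SHADER, source)
    fun compileFragmentShader(source: String) = compileShader(GLES30.GL_FRAGMENT_SHADER, source)

    private fun compileShader(type: Int, source: String): Int {
        val shader = GLES30.glCreateShader(type)
        GLES30.glShaderSource(shader, source)
        GLES30.glCompileShader(shader)
        return shader
    }

    fun linkProgram(vertexShaderId: Int, fragmentShaderId: Int): Int {
        val program = GLES30.glCreateProgram()
        GLES30.glAttachShader(program, vertexShaderId)
        GLES30.glAttachShader(program, fragmentShaderId)
        GLES30.glLinkProgram(program)
        return program
    }
}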
This article focuses on how OpenGL ES + MediaPlayer playback works, so the code has been simplified and no playback controls are implemented.
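In a real player you would at least want to pause and release the player along with the GLSurfaceView lifecycle. Continuing the Activity sketch above (again an assumption, not code from the project):

// In the Activity that hosts glSurfaceView and renderer:
private var wasPlaying = false

override fun onPause() {
    super.onPause()
    wasPlaying = renderer.mPlayer.isPlaying
    if (wasPlaying) renderer.mPlayer.pause()
    glSurfaceView.onPause()
}

override fun onResume() {
    super.onResume()
    glSurfaceView.onResume()
    if (wasPlaying) renderer.mPlayer.start()   // resume only if it was playing before
}

override fun onDestroy() {
    super.onDestroy()
    renderer.mPlayer.release()   // frees the decoder and its output Surface
}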
Note that MediaPlayer's output is usually not RGB (it is typically YUV), while GLSurfaceView needs RGB data to display correctly. The texture is therefore created as follows:
val texture = IntArray(1)
GLES30.glGenTextures(1, texture, 0) // generate one OpenGL texture
textureId = texture[0]
// Bind it as an external OES texture and set its filtering/wrapping parameters.
GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture[0])
GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR.toFloat())
GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR.toFloat())
GLES30.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE)
GLES30.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE)
The point of GLES11Ext.GL_TEXTURE_EXTERNAL_OES: the video decoder outputs YUV frames, and this external-texture extension performs the YUV-to-RGB conversion automatically when the texture is sampled.
Setting the matrix
surfaceTexture.getTransformMatrix(rotateOriMatrix)
When stepping through the code, the matrix returned by this call was simply the 4x4 identity matrix. Several articles online pass this matrix to the shader as well, so is it actually redundant? According to the SurfaceTexture documentation the transform matrix should always be applied to the texture coordinates, because it is not guaranteed to be the identity for every device, source, or frame (it may, for example, flip or crop the image), so keeping it is the safer choice.
Vertex shader code
uniform mat4 textureTransform;
attribute vec4 inputTextureCoordinate;
attribute vec3 position;          // vertex position in NDC
varying vec2 textureCoordinate;   // texture coordinate passed to the fragment shader
uniform mat4 uTextRotateMatrix;

void main() {
    vec4 pos = vec4(position, 1.0);
    gl_Position = pos.xyww;
    textureCoordinate = (uTextRotateMatrix * inputTextureCoordinate).xy;
}
Fragment shader code
#extension GL_OES_EGL_image_external : require
precision mediump float;
uniform samplerExternalOES videoTex;
varying vec2 textureCoordinate;

void main() {
    vec4 tc = texture2D(videoTex, textureCoordinate);
    gl_FragColor = tc;
}
samplerExternalOES: on Android this sampler type is used for drawing video frames or the camera preview. To use samplerExternalOES, the following declaration must be added at the top of the shader:
#extension GL_OES_EGL_image_external : require
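This article only shows the pass-through ("normal") shader; the nostalgia and negative filters referenced in onSurfaceCreated() live in the linked project. As an illustration only (not the project's actual fragment_media_player_nostalgia_shade), a sepia-style filter just changes the color computation, for example:

#extension GL_OES_EGL_image_external : require
precision mediump float;
uniform samplerExternalOES videoTex;
varying vec2 textureCoordinate;

void main() {
    vec4 tc = texture2D(videoTex, textureCoordinate);
    // Classic sepia weights applied to the sampled RGB color.
    float r = dot(tc.rgb, vec3(0.393, 0.769, 0.189));
    float g = dot(tc.rgb, vec3(0.349, 0.686, 0.168));
    float b = dot(tc.rgb, vec3(0.272, 0.534, 0.131));
    gl_FragColor = vec4(min(r, 1.0), min(g, 1.0), min(b, 1.0), tc.a);
}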
Code:
https://github.com/godtrace12/DOpenglTest
References:
https://zhuanlan.zhihu.com/p/164722231
https://blog.csdn.net/King1425/article/details/72773331?spm=1001.2014.3001.5501
https://juejin.cn/post/6903718814467883021#heading-9
https://www.cnblogs.com/hrlnw/p/3277300.html