Notes
Before writing this up I tried a lot of solutions found online, and none of them worked.
Recording video with the local recording feature works fine.
After transcoding and pushing the stream, however, frames appear to be dropped: by the look of it, some frames are sent out before the beauty filter has been applied to them, which is what causes the flickering.
Suspected causes of the problem:
1. Local recording obtains the processed frame data through a different path than live streaming does.
@property (nonatomic, strong) GPUImageOutput<GPUImageInput> *filter;
@property (nonatomic, strong) GPUImageOutput<GPUImageInput> *output;
@property (nonatomic, strong) GPUImageAlphaBlendFilter *blendFilter;
@property (nonatomic, strong) GPUImageUIElement *uiElementInput;
-------------Added as a global property-------------------
@property (nonatomic, strong) GPUImagePixelBufferOutput *gpuOutput;
---------------------------------------
/// The video capture / processing chain is wired up as follows:
/*
 To understand how this works, you need to understand GPUImageUIElement and GPUImageAlphaBlendFilter.
 GPUImageUIElement turns a view's layer into an image via CALayer's renderInContext: method,
 then passes it to GPUImageAlphaBlendFilter as an OpenGL texture.
 GPUImageAlphaBlendFilter is a blend filter with two inputs:
 the first input is the camera data,
 the second input is the output of the GPUImageUIElement just mentioned.
 GPUImageAlphaBlendFilter alpha-blends the two, which you can simply think of as overlaying the second input on top of the first.
*/
/*
 Two filter chains composited in parallel:
 filter         -> blendFilter -> gpuImageView (display)
 uiElementInput -> blendFilter -> gpuImageView (display)
 uiElementInput: only the watermark image set at initialization; if you comment out [self.filter addTarget:self.blendFilter], the screen shows only the watermark, with no live image.
 filter: the live camera image.
*/
[self.filter addTarget:self.blendFilter];
[self.uiElementInput addTarget:self.blendFilter];
[self.blendFilter addTarget:self.gpuImageView];
-------------------------------------------------
| [self.blendFilter addTarget:self.gpuOutput]; |
-------------------------------------------------
if(self.saveLocalVideo) [self.blendFilter addTarget:self.movieWriter];
[self.filter addTarget:self.output];
-------------------------------------------------
| [self.filter addTarget:self.gpuOutput]; |
-------------------------------------------------
[self.uiElementInput update];
The boxed calls above are what needs to be added to fix the flickering and the lost watermark: route the processed output into our own gpuOutput (from blendFilter if you want the watermark baked in, or from filter for the beauty-only stream). A minimal sketch of the full wiring follows.
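For the record, here is a minimal sketch of how these pieces might be wired together end to end. watermarkView and the 720x1280 size are assumptions for illustration; everything else uses the same properties as the snippet above.
// A hypothetical watermark view; in practice it holds your image/label subviews.
UIView *watermarkView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 720, 1280)];
self.blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
self.blendFilter.mix = 1.0; // show the overlay at full strength where it is opaque
self.uiElementInput = [[GPUImageUIElement alloc] initWithView:watermarkView];
[self.filter addTarget:self.blendFilter];          // input 1: beauty-filtered camera
[self.uiElementInput addTarget:self.blendFilter];  // input 2: watermark layer
[self.blendFilter addTarget:self.gpuImageView];    // on-screen preview
[self.blendFilter addTarget:self.gpuOutput];       // the boxed call: blended frames -> our output
__weak typeof(self) weakSelf = self;
// Re-render the UI element for every camera frame so the overlay stays in sync;
// a single update only renders the watermark once.
[self.filter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
    [weakSelf.uiElementInput update];
}];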
/// The original way the beauty-filtered data was obtained
__weak typeof(self) weakSelf = self;
[self.output setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
    __strong __typeof(weakSelf) strongSelf = weakSelf;
    @autoreleasepool {
        GPUImageFramebuffer *imageFramebuffer = output.framebufferForOutput;
        CVPixelBufferRef pixelBuffer = [imageFramebuffer pixelBuffer];
        if (pixelBuffer && strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(captureOutput:pixelBuffer:isBeauty:frameTime:)]) {
            [strongSelf.delegate captureOutput:strongSelf pixelBuffer:pixelBuffer isBeauty:strongSelf.beautyFace frameTime:time];
        }
    }
}];
It was this original data tap that produced the flickering after the stream was pushed.
With the custom output added, the processed frames are obtained as follows:
__weak typeof(self) weakSelf = self;
_gpuOutput.pixelBufferCallback = ^(CVPixelBufferRef _Nullable pixelBufferRef, CMTime frameTime) {
    __strong __typeof(weakSelf) strongSelf = weakSelf;
    if (pixelBufferRef && strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(captureOutput:pixelBuffer:isBeauty:frameTime:)]) {
        [strongSelf.delegate captureOutput:strongSelf pixelBuffer:pixelBufferRef isBeauty:strongSelf.beautyFace frameTime:frameTime];
    }
};
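The delegate protocol itself never appears in this note. For completeness, a declaration matching the selector used above might look like the following; the protocol and class names are placeholders, not the real ones from the project.
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

@class MyVideoCapture; // placeholder for the actual capture class

@protocol MyVideoCaptureDelegate <NSObject>
@optional
/// Called once per fully processed (beauty + watermark) frame, ready to encode and push.
- (void)captureOutput:(MyVideoCapture *)capture
          pixelBuffer:(CVPixelBufferRef)pixelBuffer
             isBeauty:(BOOL)isBeauty
            frameTime:(CMTime)frameTime;
@end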
2. To see how the recording path obtains its frames, download the GPUImage source and study it yourself; a fragment is pasted here for the record (internally GPUImageMovieWriter hands the rendered frames to an AVAssetWriter).
/// The GPUImageMovieWriter recording method (fragment from the GPUImage source)
@interface GPUImageMovieWriter : NSObject <GPUImageInput>
#pragma mark GPUImageInput protocol
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
{
......
/// method body implementation
}
3. We define our own GPUImagePixelBufferOutput to manage the beauty-filtered data, mimicking how the GPUImageMovieWriter recording method processes and outputs frames.
It is declared as @interface GPUImagePixelBufferOutput : GPUImageRawDataOutput <GPUImageInput>; inheriting from GPUImageRawDataOutput <GPUImageInput> makes the frame data much easier to work with.
Source files:
GPUImagePixelBufferOutput.h
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#if __has_include(<GPUImage/GPUImageFramework.h>)
#import <GPUImage/GPUImageRawDataOutput.h>
#else
#import "GPUImageRawDataOutput.h"
#endif

typedef void (^GPUImageBufferOutputBlock)(CVPixelBufferRef _Nullable pixelBufferRef, CMTime frameTime);

NS_ASSUME_NONNULL_BEGIN

@interface GPUImagePixelBufferOutput : GPUImageRawDataOutput <GPUImageInput>

/// Called once per processed frame with a pixel buffer wrapping the raw BGRA bytes.
@property (nonatomic, copy) GPUImageBufferOutputBlock pixelBufferCallback;

- (instancetype)initwithImageSize:(CGSize)newImageSize;

@end

NS_ASSUME_NONNULL_END
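Before the implementation, a quick usage sketch: creating the output and attaching it as shown in the boxed lines earlier. The 720x1280 size is an assumption; match your actual capture resolution.
self.gpuOutput = [[GPUImagePixelBufferOutput alloc] initwithImageSize:CGSizeMake(720, 1280)];
[self.blendFilter addTarget:self.gpuOutput]; // or [self.filter addTarget:self.gpuOutput] for the stream without the watermark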
GPUImagePixelBufferOutput.m
There are some custom requirements in our own project, so the camera-capture timestamp is kept and passed through here.
We override the GPUImageInput protocol method
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
to do our own processing of the data. Because the class inherits from GPUImageRawDataOutput, it acts as the data-output manager at the end of the processing chain and receives every frame of the fully processed video stream; overriding this method is where you hook in your own business logic.
#import "GPUImagePixelBufferOutput.h"
@implementation GPUImagePixelBufferOutput
- (instancetype)initwithImageSize:(CGSize)newImageSize {
    // BGRA output matches the kCVPixelFormatType_32BGRA buffers created below.
    if (self = [super initWithImageSize:newImageSize resultsInBGRAFormat:YES]) {
    }
    return self;
}
#pragma mark - GPUImageInput protocol
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex
{
    [super newFrameReadyAtTime:frameTime atIndex:textureIndex];
    // Float64 time = CMTimeGetSeconds(frameTime);
    // NSLog(@"capture time %f", time); // e.g. capture time 1917.957380
    [self lockFramebufferForReading];
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    // NSDictionary *options = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
    // Wrap the raw BGRA bytes in a CVPixelBuffer without copying. The bytes belong to
    // GPUImage's framebuffer and are only valid until unlockFramebufferAfterReading,
    // so the callback below must consume (or copy) the buffer synchronously.
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                 imageSize.width,
                                 imageSize.height,
                                 kCVPixelFormatType_32BGRA,
                                 self.rawBytesForImage,
                                 self.bytesPerRowInOutput,
                                 NULL,
                                 NULL,
                                 (__bridge CFDictionaryRef)options,
                                 &pixelBuffer);
    if (self.pixelBufferCallback) {
        self.pixelBufferCallback(pixelBuffer, frameTime);
    }
    CVPixelBufferRelease(pixelBuffer);
    [self unlockFramebufferAfterReading];
}
// GPUImageMovieWriter-style hooks so callers can treat this output like the movie
// writer; audio is not actually processed by this class.
- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {}
- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer {}
- (BOOL)hasAudioTrack { return YES; }
@end
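One caveat worth recording: because CVPixelBufferCreateWithBytes wraps GPUImage's raw bytes without copying, the buffer handed to pixelBufferCallback is only valid inside the callback. If a frame has to outlive it (for example, queued to an encoder on another thread), deep-copy it first. A minimal sketch; the helper name is hypothetical:
// Deep-copies a BGRA pixel buffer so it no longer references GPUImage's memory.
static CVPixelBufferRef HWDeepCopyPixelBuffer(CVPixelBufferRef src) {
    CVPixelBufferRef dst = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(src),
                        CVPixelBufferGetHeight(src),
                        CVPixelBufferGetPixelFormatType(src),
                        NULL,
                        &dst);
    if (!dst) return NULL;
    CVPixelBufferLockBaseAddress(src, kCVPixelBufferLock_ReadOnly);
    CVPixelBufferLockBaseAddress(dst, 0);
    // Copy row by row: the two buffers may use different bytes-per-row padding.
    uint8_t *srcBase = (uint8_t *)CVPixelBufferGetBaseAddress(src);
    uint8_t *dstBase = (uint8_t *)CVPixelBufferGetBaseAddress(dst);
    size_t srcBPR = CVPixelBufferGetBytesPerRow(src);
    size_t dstBPR = CVPixelBufferGetBytesPerRow(dst);
    size_t rows = CVPixelBufferGetHeight(src);
    for (size_t r = 0; r < rows; r++) {
        memcpy(dstBase + r * dstBPR, srcBase + r * srcBPR, MIN(srcBPR, dstBPR));
    }
    CVPixelBufferUnlockBaseAddress(dst, 0);
    CVPixelBufferUnlockBaseAddress(src, kCVPixelBufferLock_ReadOnly);
    return dst; // caller is responsible for CVPixelBufferRelease
}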