ios – GPUImageMovie: use multiple images as textures and process

I am trying to use a couple of GPUImagePictures as texture sources, together with a fragment shader, to filter a playing video.

I am able to process still images this way, but I can't seem to figure out what I am missing to get the same thing working on a GPUImageMovie. I would appreciate any help offered.

@property (nonatomic, strong) GPUImageView *gpuPlayerView;
@property (nonatomic, strong) GPUImageMovie *gpuMovie;

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.video];
self.player = [AVPlayer playerWithPlayerItem:playerItem];
self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
[self.player play];

self.gpuMovie = [[GPUImageMovie alloc] initWithPlayerItem:playerItem];
self.gpuMovie.playAtActualSpeed = YES;

GPUImagePicture *sourcePicture1 = [[GPUImagePicture alloc] initWithImage:
    [UIImage imageNamed:@"FilterBG"]];
GPUImagePicture *sourcePicture2 = [[GPUImagePicture alloc] initWithImage:
    [UIImage imageNamed:@"FilterOverlay"]];
GPUImagePicture *sourcePicture3 = [[GPUImagePicture alloc] initWithImage:
    [UIImage imageNamed:@"Filter1Map"]];

GPUImageFilter *filter = [[GPUImageFourInputFilter alloc] initWithFragmentShaderFromString:
    kFilter1ShaderString];

[self.gpuMovie addTarget:filter atTextureLocation:0];

if (sourcePicture1)
{
    [sourcePicture1 addTarget:filter atTextureLocation:1];
}

if (sourcePicture2)
{
    [sourcePicture2 addTarget:filter atTextureLocation:2];
}

if (sourcePicture3)
{
    [sourcePicture3 addTarget:filter atTextureLocation:3];
}

[filter addTarget:self.gpuPlayerView];

[self.gpuMovie startProcessing];
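
One detail worth checking (an assumption about the GPUImage API, not something stated in the question or confirmed by an answer): a GPUImagePicture only pushes its frame to its targets when processImage is called, and the picture objects need to stay alive (for example, in strong properties) while the movie is being processed. A minimal sketch of that idea:

// Hypothetical sketch: hold the pictures in properties and push their frames
// with processImage once the filter chain is wired up.
@property (nonatomic, strong) GPUImagePicture *sourcePicture1;
@property (nonatomic, strong) GPUImagePicture *sourcePicture2;
@property (nonatomic, strong) GPUImagePicture *sourcePicture3;

// ... after adding all targets:
[self.sourcePicture1 processImage];
[self.sourcePicture2 processImage];
[self.sourcePicture3 processImage];
[self.gpuMovie startProcessing];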

There is one approach that can be used to achieve the same goal:

CVOpenGLESTextureCacheCreateTextureFromImage

which allows a shared buffer between a GL texture and a movie.

Here, a Core Video OpenGL ES texture cache is used to cache and manage CVOpenGLESTextureRef textures. These texture caches give you a way to read from and write to buffers in various pixel formats (such as 420v or BGRA) directly from GLES.

//Mapping a BGRA buffer as a source texture:
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0, &outTexture);
//Mapping a BGRA buffer as a renderbuffer:
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_RENDERBUFFER, GL_RGBA8_OES, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0, &outTexture);
//Mapping the luma plane of a 420v buffer as a source texture:
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE, width, height, GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &outTexture);
//Mapping the chroma plane of a 420v buffer as a source texture:
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE_ALPHA, width/2, height/2, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &outTexture);
//Mapping a yuvs buffer as a source texture (note: yuvs/f and 2vuy are unpacked and resampled -- not colorspace converted)
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGB_422_APPLE, width, height, GL_RGB_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, 1, &outTexture);
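
All of these mapping calls assume you already have a CVOpenGLESTextureCacheRef. A minimal sketch of creating one (assuming an existing EAGLContext named context; the variable names are illustrative):

// Create a texture cache bound to the GL ES context that will use the textures.
CVOpenGLESTextureCacheRef textureCache = NULL;
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault,
                                            NULL,           // cache attributes
                                            context,        // EAGLContext *
                                            NULL,           // texture attributes
                                            &textureCache);
if (err != kCVReturnSuccess) {
    NSLog(@"Error creating texture cache: %d", err);
}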

    CVReturn CVOpenGLESTextureCacheCreateTextureFromImage (
        CFAllocatorRef __nullable allocator,
        CVOpenGLESTextureCacheRef __nonnull textureCache,
        CVImageBufferRef __nonnull sourceImage,
        CFDictionaryRef __nullable textureAttributes,
        GLenum target,
        GLint internalFormat,
        GLsizei width,
        GLsizei height,
        GLenum format,
        GLenum type,
        size_t planeIndex,
        CVOpenGLESTextureRef __nullable * __nonnull textureOut );

This function either creates a new, or returns a cached, CVOpenGLESTextureRef texture object mapped to the CVImageBufferRef and the associated parameters. This operation creates a live binding between the image buffer and the underlying texture object.
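
To illustrate how that binding is typically used per video frame, here is a sketch under the assumption that the pixel buffers come from an AVPlayerItemVideoOutput attached to the same AVPlayerItem (videoOutput, itemTime, and textureCache are illustrative names):

CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:itemTime
                                                    itemTimeForDisplay:NULL];
if (pixelBuffer) {
    // Map the movie's BGRA pixel buffer to a GL texture via the cache.
    CVOpenGLESTextureRef texture = NULL;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
        pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA,
        (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
        (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
        GL_RGBA, GL_UNSIGNED_BYTE, 0, &texture);

    // Bind the mapped texture so the fragment shader can read it as one input.
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));

    // ... draw with the shader, then release the frame's resources ...

    CFRelease(texture);
    CVPixelBufferRelease(pixelBuffer);
    // Flush periodically so the cache does not grow without bound.
    CVOpenGLESTextureCacheFlush(textureCache, 0);
}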

I hope this helps you create what you need :)

Translated from: https://stackoverflow.com/questions/31149684/gpuimagemovie-use-multiple-images-as-textures-and-process