Wrapping CIImage for Real-Time Rendering

Overview:

Wrapping CIImage to implement real-time rendering.

CIImage is part of Core Image and is used to render images. Why wrap it? Simply because a wrapper makes it much more convenient to use.

If you use CIImage directly to render an image, the flow looks like this:
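A minimal sketch of that direct flow (the `"demo"` asset name is just a placeholder):

```objective-c
// Direct Core Image flow: UIImage -> CIImage -> CIFilter -> CIContext -> CGImage -> UIImage
CIImage *inputImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"demo"]];

// 1. Configure a filter with the input image and its parameters
CIFilter *filter = [CIFilter filterWithName:@"CIPixellate"
                              keysAndValues:kCIInputImageKey, inputImage, nil];
[filter setValue:@8 forKey:@"inputScale"];

// 2. Render with a context (this is where the actual work happens)
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:filter.outputImage
                                   fromRect:[filter.outputImage extent]];
UIImage *result = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
```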

If you study Core Image carefully, you will notice that the filter stage and the context stage are independent of each other: the context accepts a single CIImage and cares about nothing else. So we split the wrapper into two parts, one for the filter and one for the context.

Note: the context part performs the actual rendering; CIFilter's outputImage merely encapsulates the steps to be rendered.

@property(readonly, nonatomic) CIImage *outputImage
Description    
Returns a CIImage object that encapsulates the operations configured in the filter. (read-only)

ImageFilter.h + ImageFilter.m

//
//  ImageFilter.h
//  CoreImageWapper
//
//  Copyright (c) 2014 Y.X. All rights reserved.
//

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

#pragma mark - Helper inline function
NS_INLINE CIImage* CIImageFromImage(UIImage *image)
{
    return [[CIImage alloc] initWithImage:image];
}


#pragma mark - Block type definition
typedef void(^ImageFilterConfigBlock_t)(CIFilter *filter);

@interface ImageFilter : NSObject


#pragma mark - Read-write properties
@property (nonatomic, strong, readwrite) CIImage   *inputCIImage;    // input CIImage
@property (nonatomic, strong, readwrite) NSString  *filterName;      // filter name


#pragma mark - Read-only properties (note: the getters are overridden)
@property (nonatomic, strong, readonly)  CIImage   *outputCIImage;   // output CIImage
@property (nonatomic, assign, readonly)  BOOL       filterNameValid; // whether the filter name is valid


#pragma mark - Initializers
- (instancetype)init;
- (instancetype)initWithFilterName:(NSString *)filterName;
- (instancetype)initWithFilterName:(NSString *)filterName inputCIImage:(CIImage *)inputCIImage;


#pragma mark - Configuration
- (void)configFilter:(ImageFilterConfigBlock_t)block;

@end


//
//  ImageFilter.m
//  CoreImageWapper
//
//  Copyright (c) 2014 Y.X. All rights reserved.
//

#import "ImageFilter.h"

static NSArray *filterNameArray = nil;

@interface ImageFilter ()

@property (nonatomic, strong) CIFilter   *filter;

@end

@implementation ImageFilter

@synthesize outputCIImage   = _outputCIImage;
@synthesize filterNameValid = _filterNameValid;

+ (void)initialize
{
    if (self == [ImageFilter class])
    {
        filterNameArray = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
    }
}

- (instancetype)init
{
    return [self initWithFilterName:nil inputCIImage:nil];
}

- (instancetype)initWithFilterName:(NSString *)filterName
{
    return [self initWithFilterName:filterName inputCIImage:nil];
}

- (instancetype)initWithFilterName:(NSString *)filterName inputCIImage:(CIImage *)inputCIImage
{
    self = [super init];
    if (self)
    {
        self.filterName   = filterName;
        self.inputCIImage = inputCIImage;
    }
    return self;
}

-(CIImage *)outputCIImage
{
    if (_filter)
    {
        return [_filter outputImage];
    }
    else
    {
        return nil;
    }
}

- (void)configFilter:(ImageFilterConfigBlock_t)block
{
    if (_filterName && _inputCIImage)
    {
        // Create the filter with the input image attached
        _filter = [CIFilter filterWithName:_filterName
                             keysAndValues:kCIInputImageKey, _inputCIImage, nil];
        
        // Let the caller configure the filter's parameters
        block(_filter);
    }
    else
    {
        // Invalid state: tell the caller there is nothing to configure
        block(nil);
    }
}

- (BOOL)filterNameValid
{
    return _filterName && [filterNameArray containsObject:_filterName];
}

@end

ImageRender.h + ImageRender.m
//
//  ImageRender.h
//  CoreImageWapper
//
//  Copyright (c) 2014 Y.X. All rights reserved.
//

#import <Foundation/Foundation.h>
#import "ImageFilter.h"

@interface ImageRender : NSObject


@property (nonatomic, strong) ImageFilter  *imageFilter;
@property (nonatomic, strong) UIImage      *outputImage;


- (instancetype)init;
- (instancetype)initWithImageFilter:(ImageFilter *)imageFilter;
- (UIImage *)render;

@end


//
//  ImageRender.m
//  CoreImageWapper
//
//  Copyright (c) 2014 Y.X. All rights reserved.
//

#import "ImageRender.h"

@interface ImageRender ()

@property (nonatomic, strong) CIContext    *context;

@end

@implementation ImageRender

- (instancetype)init
{
    return [self initWithImageFilter:nil];
}

- (instancetype)initWithImageFilter:(ImageFilter *)imageFilter
{
    self = [super init];
    if (self)
    {
        // Renders on the GPU by default
        self.context     = [CIContext contextWithOptions:nil];
        self.imageFilter = imageFilter;
    }
    return self;
}

- (UIImage *)render
{
    if (_imageFilter)
    {
        CIImage *outputCIImage = [_imageFilter outputCIImage];
        
        CGImageRef cgImage = [_context createCGImage:outputCIImage
                                            fromRect:[outputCIImage extent]];
        
        self.outputImage = [UIImage imageWithCGImage:cgImage];
        
        CGImageRelease(cgImage);
    }
    
    return self.outputImage;
}

@end

Now you can use it like this :)
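For example, a one-shot render with the wrapper (again, `"demo"` is a placeholder image name; CISepiaTone is just one built-in filter you could pick):

```objective-c
// Build the filter part
ImageFilter *filter = [[ImageFilter alloc] initWithFilterName:@"CISepiaTone"
                                                 inputCIImage:CIImageFromImage([UIImage imageNamed:@"demo"])];
[filter configFilter:^(CIFilter *filter) {
    [filter setValue:@0.8 forKey:@"inputIntensity"];
}];

// Hand it to the context part and render
ImageRender *render = [[ImageRender alloc] initWithImageFilter:filter];
UIImage *result = [render render];
```

Note how the two stages stay separate: ImageFilter only describes what to render, and ImageRender owns the CIContext that actually does the work.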

OK, now for something more involved: real-time rendering.

- (void)viewDidLoad
{
    [super viewDidLoad];
    
    ImageFilter *filter = [ImageFilter new];
    ImageRender *render = [ImageRender new];
    
    filter.filterName = @"CIPixellate";
    filter.inputCIImage = CIImageFromImage([UIImage imageNamed:@"demo"]);
    [filter configFilter:^(CIFilter *filter) {
        [filter setValue:@1
                  forKey:@"inputScale"];
    }];
    render.imageFilter = filter;
    
    UIImageView *imageView = [[UIImageView alloc] initWithImage:[render render]];
    [self.view addSubview:imageView];
    imageView.center = self.view.center;
    
    _timer = [[GCDTimer alloc] initInQueue:[GCDQueue globalQueue]];
    [_timer event:^{
        
        static int i = 1;
        
        [filter configFilter:^(CIFilter *filter) {
            [filter setValue:[NSNumber numberWithInt:(i++)%10 + 1]
                      forKey:@"inputScale"];
        }];
        render.imageFilter = filter;
        
        [[GCDQueue mainQueue] execute:^{
            imageView.image = [render render];
        }];
        
    } timeInterval:NSEC_PER_SEC/10.f];
    [_timer start];
}


And a second example, animating CIHueAdjust's inputAngle:

- (void)viewDidLoad
{
    [super viewDidLoad];
    
    ImageFilter *filter = [ImageFilter new];
    ImageRender *render = [ImageRender new];
    
    filter.filterName = @"CIHueAdjust";
    filter.inputCIImage = CIImageFromImage([UIImage imageNamed:@"demo"]);
    [filter configFilter:^(CIFilter *filter) {
        [filter setValue:@(3.14f)
                  forKey:@"inputAngle"];
    }];
    render.imageFilter = filter;

    UIImageView *imageView = [[UIImageView alloc] initWithImage:[render render]];
    [self.view addSubview:imageView];
    imageView.center = self.view.center;
    
    _timer = [[GCDTimer alloc] initInQueue:[GCDQueue globalQueue]];
    [_timer event:^{
        static int i = 0;
        [filter configFilter:^(CIFilter *filter) {
            [filter setValue:[NSNumber numberWithFloat:(i+=1)%314/100.f]
                      forKey:@"inputAngle"];
        }];
        render.imageFilter = filter;
        [[GCDQueue mainQueue] execute:^{
            imageView.image = [render render];
        }];
    } timeInterval:NSEC_PER_SEC/60.f];
    [_timer start];
}

Appendix:

https://developer.apple.com/library/ios/documentation/GraphicsImaging/Reference/CoreImageFilterReference/Reference/reference.html#//apple_ref/doc/uid/TP40004346

https://developer.apple.com/library/mac/documentation/graphicsimaging/reference/CoreImageFilterReference/Reference/reference.html

https://developer.apple.com/library/ios/documentation/GraphicsImaging/Conceptual/CoreImaging/CoreImage.pdf

http://www.rapidsnail.com/Tutorial/t/2012/112/30/22766/the-coreimage-ios.aspx

http://www.raywenderlich.com/zh-hans/24773/%E5%88%9D%E5%AD%A6ios6-%E4%B8%AD%E7%9A%84core-image%E6%8A%80%E6%9C%AF

http://my.safaribooksonline.com/book/video/9780321637031/chapter-15dot-secret-patches-core-image-filters-and-glsl-pushing-the-boundaries/ch15sec1lev5

http://stackoverflow.com/questions/17041669/creating-a-blurring-overlay-view/17041983#17041983
