Using GPUImage to Recreate iOS 7 Glass Effect

This article looks at how to recreate the iOS 7 glass blur effect with the GPUImage library and how to deal with the performance problems involved. By tuning the parameters of several filters, a result close to Apple's native effect is achieved.

from: http://www.unknownerror.org/opensource/BradLarson/GPUImage/q/stackoverflow/18404907/using-gpuimage-to-recreate-ios-7-glass-effect

I am trying to use the iOS 7 style glass effect in my app by applying image effects to a screenshot of an MKMapView. This UIImage category, provided by Apple, is what I am using as a baseline. This method desaturates the source image, applies a tint color, and blurs heavily using the input values:

[image applyBlurWithRadius:10.0
                 tintColor:[UIColor colorWithRed:229/255.0f green:246/255.0f blue:255/255.0f alpha:0.33] 
     saturationDeltaFactor:0.66
                 maskImage:nil];

This produces the effect I am looking for, but takes way too long: between 0.3 and 0.5 seconds to render on an iPhone 4.

[Image: the effect produced by Apple's UIImage category]

I would like to use the excellent GPUImage as my preliminary attempts have been about 5-10 times faster, but I just can't seem to get it right.

GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image];

GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
saturationFilter.saturation = 0.33; // 1.0 - 0.66;
[stillImageSource addTarget:saturationFilter];

GPUImageMonochromeFilter *monochromeFilter = [[GPUImageMonochromeFilter alloc] init];
[monochromeFilter setColor:(GPUVector4){229/255.0f, 246/255.0f, 1.0f, 0.33f}];
[monochromeFilter setIntensity:0.2];
[saturationFilter addTarget:monochromeFilter];

GPUImageFastBlurFilter *blurFilter = [[GPUImageFastBlurFilter alloc] init];
blurFilter.blurSize = 2;
blurFilter.blurPasses = 3;
[monochromeFilter addTarget:blurFilter];

[saturationFilter prepareForImageCapture];
[monochromeFilter prepareForImageCapture];

[stillImageSource processImage];
image = [blurFilter imageFromCurrentlyProcessedOutput];

This produces an image which is close, but not quite there:

[Image: the result of the GPUImage filter chain above]

The blur doesn't seem to be deep enough, but when I try to increase the blurSize above, it becomes grid-like, almost like a kaleidoscope; you can actually see the grid by zooming in on the second image. The tint color I am trying to mimic seems to just wash out the image instead of overlaying and blending, which is what I think the Apple sample is doing.

I have tried to set up the filters according to comments made by @BradLarson in another SO question. Am I using the wrong GPUImage filters to reproduce this effect, or am I just setting them up wrong?

1 Answer

Brad Larson

OK, I've been working on something here for a little while, and I finally have it functional. I just rolled a number of changes to GPUImage's blur filters into the framework, and as a result I believe I have a reasonable replica of Apple's blur effect that they use for things like the control center view.

Previously, the blurs that I had in the framework used a single precalculated radius, and the only way to affect their intensity was to tweak the spacing at which they sampled pixels from the input image. With a limited number of samples per pixel, changing the multiple for the spacing between sampled pixels much above 1.5 started introducing serious blocking artifacts as pixels were skipped.

The new Gaussian blur implementation that I've built combines the performance benefits of precalculated Gaussian weights with the ability to use an arbitrary radius (sigma) for the Gaussian blur. It does this by generating shaders on the fly as they are needed for various radii. It also reduces the number of texture samples required for a given blur radius by using hardware interpolation to read two texels at a time for each sample point.
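
As a rough sketch of that two-texels-per-sample idea (illustrative only, not the framework's actual shader-generation code), the combined weights and offsets for a linearly sampled Gaussian can be precalculated along these lines: each pair of adjacent discrete taps is merged into a single read, and the sample offset is placed so that the hardware's linear interpolation splits the fetch between the two texels in proportion to their weights.

// Illustrative sketch: compute merged weights/offsets for a linearly
// sampled Gaussian blur of a given sigma. Not GPUImage's internal code.
#import <Foundation/Foundation.h>
#include <math.h>

static void OptimizedGaussianSamples(double sigma, int radius)
{
    // Discrete Gaussian weights for taps 0..radius (one side of the kernel).
    double weights[radius + 1];
    double sum = 0.0;
    for (int i = 0; i <= radius; i++) {
        weights[i] = exp(-(i * i) / (2.0 * sigma * sigma));
        sum += (i == 0) ? weights[i] : 2.0 * weights[i];
    }
    for (int i = 0; i <= radius; i++) {
        weights[i] /= sum;
    }

    // Merge adjacent taps: one hardware-interpolated read covers two texels.
    NSLog(@"center tap: weight %f at offset 0.0", weights[0]);
    for (int i = 1; i <= radius; i += 2) {
        double w1 = weights[i];
        double w2 = (i + 1 <= radius) ? weights[i + 1] : 0.0;
        double combinedWeight = w1 + w2;
        double combinedOffset = (i * w1 + (i + 1) * w2) / combinedWeight;
        NSLog(@"tap: weight %f at offset %f texels", combinedWeight, combinedOffset);
    }
}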

The new GPUImageiOSBlurFilter combines this tuned arbitrary-radius Gaussian blur filter with a color-correction filter that appears to replicate the adjustment Apple performs to the colors after they've been blurred. I added the below comparison to my answer here, but it shows Apple's built-in blurring from the control center view on the left, and my new GPUImage blur filter on the right:

[Comparison image: Apple's built-in blur (left) vs. GPUImage's blur (right)]
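
For reference, a minimal usage sketch that mirrors the capture pattern from the question above; the blurRadiusInPixels and saturation property names are assumptions based on the filter headers at the time and may differ in later versions of the framework:

GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image];

GPUImageiOSBlurFilter *blurFilter = [[GPUImageiOSBlurFilter alloc] init];
// Assumed properties: a pixel radius for the Gaussian stage and a
// saturation adjustment matching Apple's color treatment.
blurFilter.blurRadiusInPixels = 12.0;
blurFilter.saturation = 0.8;
[stillImageSource addTarget:blurFilter];

[stillImageSource processImage];
UIImage *blurredImage = [blurFilter imageFromCurrentlyProcessedOutput];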

As a way of improving performance (Apple's blur appears to occur with a sigma of 48, which requires quite a large area to be sampled for each pixel), I use a 4X downsampling before the Gaussian blur, then a 4X upsampling afterward. This reduces the number of pixels that need to be blurred by 16X, and also reduces the blur sigma from 48 to 12. An iPhone 4S can blur the entire screen in roughly 30 ms using this filter.
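
If you want to experiment with that trade-off using a plain Gaussian blur rather than the packaged filter, the downsample-then-blur step can be approximated by forcing the blur to run at a quarter of the screenshot's size. This is only a sketch of the idea (screenshot stands in for the map-view snapshot from the question), and the quarter-size output is left for UIKit to stretch back up when it is drawn:

GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:screenshot];

GPUImageGaussianBlurFilter *gaussianBlur = [[GPUImageGaussianBlurFilter alloc] init];
// Blur at quarter resolution: 16X fewer pixels to process, and the
// sigma needed to match the full-resolution look drops from 48 to 12.
[gaussianBlur forceProcessingAtSize:CGSizeMake(screenshot.size.width / 4.0,
                                               screenshot.size.height / 4.0)];
gaussianBlur.blurRadiusInPixels = 12.0;
[source addTarget:gaussianBlur];

[source processImage];
// The captured image is quarter-sized; drawing it into a full-screen
// UIImageView performs the 4X upsample.
UIImage *blurred = [gaussianBlur imageFromCurrentlyProcessedOutput];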

Getting the blur right is one thing. Apple still does not provide a fast way of getting the image content behind your views, so that most likely will be your bottleneck here for rapidly changing content.

