Notes:
The teardown series studies third-party demos. This article reorganizes the key APIs used in the demo and presents them as individual topics to make them easier to study.
The article avoids wrappers wherever possible, so that it is clear which core system APIs are needed to implement the feature (where a wrapper is used, a comment says so).
Source of the demo analyzed here: https://github.com/Neojoke/flippingPage
Thanks to the demo's author, Neojoke.
- Written by @Scott and reviewed by @春雨. If you repost this article, please credit the source and author.
<Feature>
A 3D page-flip effect on iOS
Core APIs
CALayer's anchorPoint property, CGImage, CAShapeLayer, UIRectCorner, CIFilter, CAGradientLayer, CATransform3D
Implementation
Code:
1: CALayer's anchorPoint property
/**
 * Every UIView is backed by a CALayer. UIView has frame, bounds and center; CALayer has the analogous frame, bounds, position and anchorPoint. frame and bounds are straightforward (bounds is essentially the frame with its origin at (0, 0)), but what are position and anchorPoint? Looking at their declarations, both are CGPoints:
 * @property CGPoint position
 * @property CGPoint anchorPoint
 *
 * About anchorPoint:
 * Start with an analogy. Picture an A4 sheet pinned to a desk with a thumbtack. If the pin is loose, the sheet can rotate clockwise or counterclockwise around it; the pin acts as the pivot. anchorPoint is that pin: it is the pivot for transforms such as rotation (and likewise translation and scaling).
 * The rotation obviously depends on where the pin sits: pinning the sheet at its center produces a different motion than pinning it at the top-left corner. How is the pin's position expressed? In iOS, anchorPoint is given as a fraction of bounds, in the unit coordinate space (a left-handed coordinate system): (0, 0) at the top-left corner and (1, 1) at the bottom-right. Accordingly, the center is (0.5, 0.5), the bottom-left is (0, 1), and the top-right is (1, 0).
 */
// Set the anchor point of the right-hand image view so it pivots around its left edge.
self.imageViewForRight.layer.anchorPoint = CGPointMake(0, 0.5);
2: CGImage
- (UIImage *)clipImageWithImage:(UIImage *)image isLeftImage:(BOOL)isLeft {
    // CGImageCreateWithImageInRect works in pixels, while UIImage.size is in
    // points, so scale the rect for @2x/@3x assets.
    CGFloat scale = image.scale;
    CGRect imgRect = CGRectMake(0, 0, image.size.width / 2 * scale, image.size.height * scale);
    if (!isLeft) {
        imgRect.origin.x = image.size.width / 2 * scale;
    }
    CGImageRef imgRef = CGImageCreateWithImageInRect(image.CGImage, imgRect);
    UIImage *clipImage = [UIImage imageWithCGImage:imgRef scale:scale orientation:image.imageOrientation];
    // CGImageCreateWithImageInRect follows the Create rule, so release the ref.
    CGImageRelease(imgRef);
    return clipImage;
}
3: CAShapeLayer, UIRectCorner
- (CAShapeLayer *)getCornerRidusMashWithIsLeft:(BOOL)isLeft rect:(CGRect)rect {
    // Create the CAShapeLayer that will serve as the mask.
    CAShapeLayer *layer = [CAShapeLayer layer];
    // Pick which corners to round: the two outer corners of each half-page.
    UIRectCorner corner = isLeft ? UIRectCornerTopLeft | UIRectCornerBottomLeft : UIRectCornerTopRight | UIRectCornerBottomRight;
    // Build the rounded-rect path with a Bezier path.
    layer.path = [UIBezierPath bezierPathWithRoundedRect:rect byRoundingCorners:corner cornerRadii:CGSizeMake(10, 10)].CGPath;
    return layer;
}
4: CIFilter (Core Image filters)
- (UIImage *)getBlurAndReversalImage:(UIImage *)image {
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
    // Gaussian blur filter.
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:@10.0 forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    // Mirror horizontally: translate by -width, then scale by (-1, 1), which
    // maps x to (width - x) and keeps the content inside the original extent.
    CGFloat width = inputImage.extent.size.width;
    result = [result imageByApplyingTransform:CGAffineTransformTranslate(CGAffineTransformMakeScale(-1, 1), -width, 0)];
    CGImageRef ref = [context createCGImage:result fromRect:[inputImage extent]];
    UIImage *returnImage = [UIImage imageWithCGImage:ref];
    CGImageRelease(ref);
    return returnImage;
}
5: Gradient layers (CAGradientLayer)
- (void)gradientLayer {
    // Left page: fade from clear at the right edge to black at the left edge.
    self.gradientForLeft = [CAGradientLayer layer];
    self.gradientForLeft.opacity = 0; /**< Opacity; starts fully transparent. */
    self.gradientForLeft.colors = @[(id)[UIColor clearColor].CGColor, (id)[UIColor blackColor].CGColor];
    self.gradientForLeft.frame = self.imageViewForLeft.bounds;
    self.gradientForLeft.startPoint = CGPointMake(1, 1);
    self.gradientForLeft.endPoint = CGPointMake(0, 1);
    [self.imageViewForLeft.layer addSublayer:self.gradientForLeft];
    // Right page: the same fade, mirrored.
    self.gradientForRight = [CAGradientLayer layer];
    self.gradientForRight.opacity = 0;
    self.gradientForRight.colors = @[(id)[UIColor clearColor].CGColor, (id)[UIColor blackColor].CGColor];
    self.gradientForRight.frame = self.imageViewForRight.bounds;
    self.gradientForRight.startPoint = CGPointMake(0, 1);
    self.gradientForRight.endPoint = CGPointMake(1, 1);
    [self.imageViewForRight.layer addSublayer:self.gradientForRight];
}
6: CATransform3D
- (CATransform3D)getTransForm3DWithAngle:(CGFloat)angle {
    CATransform3D transform = CATransform3DIdentity;
    // m34 adds perspective; the common convention is -1 / viewerDistance
    // (the demo's positive value flips which side appears to recede).
    transform.m34 = 4.5 / 2000;
    // Rotate about the y axis so the page turns like a book.
    transform = CATransform3DRotate(transform, angle, 0, 1, 0);
    return transform;
}