Core Animation in Practice, Part 4 (Visual Effects)

Published by ZY_FlyWay on 2017-10-31

UIView already wraps a good deal of animation and display behaviour, so why would we still want to work at the CALayer level?

Because CALayer exposes a number of features that UIView does not:

  • shadows, rounded corners, and coloured borders
  • 3D transforms
  • non-rectangular bounds
  • alpha masks
  • multistage, non-linear animations


This article focuses on shadows, rounded corners, and borders. First, take a look at the demo.





Demo code:

//
//  VisualEffectViewController.m
//  LayerStudyDemo
//
//  Created by apple on 2017/9/26.
//  Copyright © 2017 ZY. All rights reserved.
//

#import "VisualEffectViewController.h"

@interface VisualEffectViewController ()
@property (weak, nonatomic) IBOutlet UIView *View1;
@property (strong, nonatomic) IBOutlet UIView *View2;
@property (weak, nonatomic) IBOutlet UIView *shadowView;
@property (weak, nonatomic) IBOutlet UIImageView *timeImage;
@end

@implementation VisualEffectViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self layerCornerRadiusAndWidth];
    [self shadowEffect];
    [self shadowPath];
}

//layer corner radius, clipping, and border
-(void)layerCornerRadiusAndWidth{
    //corner radius
    self.View1.layer.cornerRadius = 30;
    //clip contents and sublayers to the layer's (rounded) bounds
    self.View1.layer.masksToBounds = YES;
    //border color
    self.View1.layer.borderColor = [UIColor blackColor].CGColor;
    //border width
    self.View1.layer.borderWidth = 3;
}

//shadow effect
-(void)shadowEffect{
    //shadow opacity
    self.shadowView.layer.shadowOpacity = 0.8;
    //shadow color
    self.shadowView.layer.shadowColor = [self getColorFromRed:0 Green:1 Blue:0 Alpha:1];
    //shadowOffset controls the direction and distance of the shadow. It is a CGSize: the width controls the horizontal offset and the height controls the vertical offset.
    self.shadowView.layer.shadowOffset = CGSizeMake(1, 1);
    //shadowRadius controls how blurred the shadow is. At 0 the shadow has a hard, precise edge matching the view; larger values give a softer, more natural edge. Apple's own designs favour natural-looking shadows, so a non-zero value is usually appropriate.
    self.shadowView.layer.shadowRadius = 100;
}

//build a CGColorRef from RGBA components (each in the 0–1 range)
-(CGColorRef)getColorFromRed:(CGFloat)red Green:(CGFloat)green Blue:(CGFloat)blue Alpha:(CGFloat)alpha
{
    CGColorSpaceRef rgbSpaceRef = CGColorSpaceCreateDeviceRGB();        // RGB colour space
    CGFloat rgbComponents[] = {red, green, blue, alpha};                // RGBA components
    CGColorRef rgbColorRef = CGColorCreate(rgbSpaceRef, rgbComponents); // create the CGColor
    CGColorSpaceRelease(rgbSpaceRef);                                   // balance the Create call

    //return an autoreleased reference so the caller is not responsible for releasing it
    return (CGColorRef)CFAutorelease(rgbColorRef);
}

//shadow path: shadowPath explicitly defines the outline used to render the shadow
-(void)shadowPath{
    self.timeImage.layer.shadowOpacity = 0.5;
    
    //create a square shadow
//    CGMutablePathRef squarePath = CGPathCreateMutable();
//    CGPathAddRect(squarePath, NULL, self.timeImage.bounds);
//    self.timeImage.layer.shadowPath = squarePath;
//    CGPathRelease(squarePath);
    
    //create a circular shadow
    CGMutablePathRef circlePath = CGPathCreateMutable();
    CGPathAddEllipseInRect(circlePath, NULL, self.timeImage.bounds);
    self.timeImage.layer.shadowPath = circlePath;
    CGPathRelease(circlePath);
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}


@end
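
As a side note (not part of the original demo): supplying an explicit shadowPath also spares Core Animation from deriving the shadow shape from the layer's contents, and the same circular path can be built with UIKit's UIBezierPath instead of the Core Graphics C API, which avoids the manual CGPathRelease. A minimal sketch:

//equivalent circular shadow path built with UIBezierPath (no manual release needed)
self.timeImage.layer.shadowPath =
    [UIBezierPath bezierPathWithOvalInRect:self.timeImage.bounds].CGPath;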


Layer Masks

A layer's mask property takes another CALayer; only the parts of the host layer that line up with opaque regions of the mask remain visible. In the demo below, the alpha channel of time.png determines which parts of the maskImage image view show through.

//
//  MaskLayerViewController.m
//  LayerStudyDemo
//
//  Created by apple on 2017/9/27.
//  Copyright © 2017 ZY. All rights reserved.
//

#import "MaskLayerViewController.h"

@interface MaskLayerViewController ()
@property (weak, nonatomic) IBOutlet UIImageView *maskImage;

@end

@implementation MaskLayerViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self maskView];
}

//maskLayer
-(void)maskView{
    CALayer  * maskLayer = [CALayer layer];
    maskLayer.frame  =  self.maskImage.bounds;
    UIImage * image = [UIImage imageNamed:@"time.png"];
    maskLayer.contents =  (__bridge id _Nullable)(image.CGImage);
    
    self.maskImage.layer.mask = maskLayer;
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}
@end



Scaling Filters

CALayer exposes two properties that select the filtering algorithm used when its contents are magnified or minified (from CALayer.h):


/* The filter types to use when rendering the `contents' property of
 * the layer. The minification filter is used when to reduce the size
 * of image data, the magnification filter to increase the size of
 * image data. Currently the allowed values are `nearest' and `linear'.
 * Both properties default to `linear'. */

@property(copy) NSString *minificationFilter;
@property(copy) NSString *magnificationFilter;

/* The bias factor added when determining which levels of detail to use
 * when minifying using trilinear filtering. The default value is 0.
 * Animatable. */

@property float minificationFilterBias;



The available filter constants are:


/** Contents filter names. **/

CA_EXTERN NSString * const kCAFilterNearest
    CA_AVAILABLE_STARTING (10.5, 2.0, 9.0, 2.0);
CA_EXTERN NSString * const kCAFilterLinear
    CA_AVAILABLE_STARTING (10.5, 2.0, 9.0, 2.0);

/* Trilinear minification filter. Enables mipmap generation. Some
 * renderers may ignore this, or impose additional restrictions, such
 * as source images requiring power-of-two dimensions. */

CA_EXTERN NSString * const kCAFilterTrilinear
    CA_AVAILABLE_STARTING (10.6, 3.0, 9.0, 2.0);



Both properties default to kCAFilterLinear, which uses a bilinear filtering algorithm and performs well in most cases. Bilinear filtering samples several neighbouring pixels to produce each output value, giving a smooth, good-looking result when an image is scaled, but at large magnification factors the image becomes blurry.

kCAFilterTrilinear is very similar to kCAFilterLinear, and in most situations the two are indistinguishable. Unlike bilinear filtering, however, trilinear filtering stores the image at multiple sizes (known as mipmaps) and samples in three dimensions, blending the nearest larger and smaller stored versions to produce the final result.

kCAFilterNearest is the crudest approach. As the name suggests, this algorithm (also called nearest-neighbor filtering) simply samples the nearest single pixel and ignores everything around it. It is very fast and never blurs the image, but it makes minified images look worse, and magnified images end up blocky and pixelated.
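
To make the choice concrete, here is a minimal sketch (not taken from the original demo; imageView and thumbnailView are assumed placeholder image views) showing how the filters are typically applied:

//crisp, pixelated enlargement of a small, sharp-edged image (e.g. pixel art)
imageView.layer.magnificationFilter = kCAFilterNearest;

//smoother downscaling of a large photo using mipmaps
thumbnailView.layer.minificationFilter = kCAFilterTrilinear;
thumbnailView.layer.minificationFilterBias = 0.0; //default is 0; the bias adjusts which mipmap level of detail is selected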



The demo:


//
//  TensileFilterViewController.m
//  LayerStudyDemo
//
//  Created by apple on 2017/9/28.
//  Copyright © 2017 ZY. All rights reserved.
//

#import "TensileFilterViewController.h"

@interface TensileFilterViewController ()
@property (strong, nonatomic) IBOutletCollection(UIView) NSArray *LedViews1;
@property (strong, nonatomic) IBOutletCollection(UIView) NSArray *LedView2;
@property (nonatomic, weak) NSTimer *timer;
@end

@implementation TensileFilterViewController
{
    NSArray * array;
}
- (void)viewDidLoad {
    [super viewDidLoad];
    
    array = @[self.LedViews1,self.LedView2];
    UIImage *digits = [UIImage imageNamed:@"led.png"];
    for (int i=0; i<array.count; i++) {
        for (UIView *view in array[i]) {
            //set contents
            view.layer.contents = (__bridge id)digits.CGImage;
            view.layer.contentsRect = CGRectMake(0, 0, 0.1, 1.0);
            view.layer.contentsGravity = kCAGravityResizeAspect;
            if (i==1) {
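                //the second group of views switches to nearest-neighbor minification so it can be compared with the default linear filter used by the first group; with a small, sharp-edged image like the LED digits this tends to stay crisper when scaled down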
                view.layer.minificationFilter = kCAFilterNearest;
            }
        }
    }
    
    //start timer
    self.timer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(tick) userInfo:nil repeats:YES];
    
}

- (void)tick
{
    //convert time to hours, minutes and seconds
    NSCalendar *calendar = [[NSCalendar alloc] initWithCalendarIdentifier:NSCalendarIdentifierRepublicOfChina];
    NSUInteger units = NSCalendarUnitHour | NSCalendarUnitMinute | NSCalendarUnitSecond;
    
    NSDateComponents *components = [calendar components:units fromDate:[NSDate date]];
    
    for (int i=0; i<array.count; i++) {
        //set hours
        [self setDigit:components.hour / 10 forView:array[i][0]];
        [self setDigit:components.hour % 10 forView:array[i][1]];
        
        //set minutes
        [self setDigit:components.minute / 10 forView:array[i][2]];
        [self setDigit:components.minute % 10 forView:array[i][3]];
        
        //set seconds
        [self setDigit:components.second / 10 forView:array[i][4]];
        [self setDigit:components.second % 10 forView:array[i][5]];
    }
}


- (void)setDigit:(NSInteger)digit forView:(UIView *)view
{
    //adjust contentsRect to select correct digit
    view.layer.contentsRect = CGRectMake(digit * 0.1, 0, 0.1, 1.0);
}


- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}



@end



Demo address

