Crop and Resize of Images in iOS

Published by 齊滇大聖 on 2018-08-07

Introduction

Generally there are two ways to adjust an image's size: Crop and Resize, i.e. cropping and scaling. Cropping cuts away part of the original image, so some regions are lost, while resizing only scales the image up or down without cutting anything away.

Key Concepts

imageOrientation

imageOrientation describes an image's orientation. Most of the time you do not need to care about it, because images saved in the photo library or imported from local files have already had their orientation corrected, i.e. they are in the normal UIImageOrientationUp orientation.

Only when a photo has just been taken on an iPhone does the Exif metadata record the orientation in which it was shot. The iPhone's default orientation is landscape; when the home button is on the right, the image's orientation is UIImageOrientationUp. So in many cases, when processing a freshly captured photo in an iOS app, we first need to fix its orientation. The code is as follows:

- (UIImage *)fixOrientation {
    
    // No-op if the orientation is already correct
    if (self.imageOrientation == UIImageOrientationUp) return self;
    
    // We need to calculate the proper transformation to make the image upright.
    // We do it in 2 steps: Rotate if Left/Right/Down, and then flip if Mirrored.
    CGAffineTransform transform = CGAffineTransformIdentity;
    
    switch (self.imageOrientation) {
        case UIImageOrientationDown:
        case UIImageOrientationDownMirrored:
            transform = CGAffineTransformTranslate(transform, self.size.width, self.size.height);
            transform = CGAffineTransformRotate(transform, M_PI);
            break;
            
        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
            transform = CGAffineTransformTranslate(transform, self.size.width, 0);
            transform = CGAffineTransformRotate(transform, M_PI_2);
            break;
            
        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            transform = CGAffineTransformTranslate(transform, 0, self.size.height);
            transform = CGAffineTransformRotate(transform, -M_PI_2);
            break;
        case UIImageOrientationUp:
        case UIImageOrientationUpMirrored:
            break;
    }
    
    switch (self.imageOrientation) {
        case UIImageOrientationUpMirrored:
        case UIImageOrientationDownMirrored:
            transform = CGAffineTransformTranslate(transform, self.size.width, 0);
            transform = CGAffineTransformScale(transform, -1, 1);
            break;
            
        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRightMirrored:
            transform = CGAffineTransformTranslate(transform, self.size.height, 0);
            transform = CGAffineTransformScale(transform, -1, 1);
            break;
        case UIImageOrientationUp:
        case UIImageOrientationDown:
        case UIImageOrientationLeft:
        case UIImageOrientationRight:
            break;
    }
    
    // Now we draw the underlying CGImage into a new context, applying the transform
    // calculated above.
    CGContextRef ctx = CGBitmapContextCreate(NULL, self.size.width, self.size.height,
                                             CGImageGetBitsPerComponent(self.CGImage), 0,
                                             CGImageGetColorSpace(self.CGImage),
                                             CGImageGetBitmapInfo(self.CGImage));
    CGContextConcatCTM(ctx, transform);
    switch (self.imageOrientation) {
        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            // For left/right orientations the drawn rect swaps width and height.
            CGContextDrawImage(ctx, CGRectMake(0,0,self.size.height,self.size.width), self.CGImage);
            break;
            
        default:
            CGContextDrawImage(ctx, CGRectMake(0,0,self.size.width,self.size.height), self.CGImage);
            break;
    }
    
    // And now we just create a new UIImage from the drawing context
    CGImageRef cgimg = CGBitmapContextCreateImage(ctx);
    UIImage *img = [UIImage imageWithCGImage:cgimg];
    CGContextRelease(ctx);
    CGImageRelease(cgimg);
    return img;
}

For more details on image orientation, see this article: 如何處理iOS中照片的方向

Crop and Resize

Crop

When cropping or resizing an image, pay attention to whether the target size is specified in image size (image.size, i.e. points) or in pixels. For example, the cropping code is as follows:

// Crop using image.size (point) coordinates
- (UIImage *)croppedImageWithRect:(CGRect)bounds {
    if (self.scale > 1.0f) {
        bounds = CGRectMake(bounds.origin.x * self.scale,
                          bounds.origin.y * self.scale,
                          bounds.size.width * self.scale,
                          bounds.size.height * self.scale);
    }
    
    CGImageRef imageRef = CGImageCreateWithImageInRect(self.CGImage, bounds);
    UIImage *result = [UIImage imageWithCGImage:imageRef scale:self.scale orientation:self.imageOrientation];
    CGImageRelease(imageRef);
    return result;
}

// Crop using pixel coordinates
- (UIImage *)croppedImageWithPixelRect:(CGRect)bounds {
    CGImageRef imageRef = CGImageCreateWithImageInRect([self CGImage], bounds);
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return croppedImage;
}

Most of the time we use the first method and crop by image.size. For example, to crop the top half of an image we use image.size:

UIImage *image2 = [image croppedImageWithRect:CGRectMake(0, 0, image.size.width, image.size.height/2)];

Recently I had a requirement to crop an image to a specific pixel size and then feed it into an image recognition model. I was using the first method: to get a {100, 100}-pixel image, I called [image croppedImageWithRect:CGRectMake(0, 0, 100, 100)];. But the resulting image is actually {200, 200} pixels on a 2x screen and {300, 300} pixels on a 3x screen.

So when cropping or resizing an image, be clear about whether the required size is an image (point) size or a pixel size. In iOS the relationship between pixels and image.size is:

test.png (20*20 pixels)    test@2x.png (40*40 pixels)    test@3x.png (60*60 pixels)

UIImage *image = [UIImage imageNamed:@"test.png"];

image.size is (20, 20);


UIImage *image = [UIImage imageNamed:@"test@2x.png"];

image.size is (20, 20);


UIImage *image = [UIImage imageNamed:@"test@3x.png"];

image.size is (20, 20);


image.size automatically accounts for the image's scale factor: for a 3x image the reported size is the pixel size divided by 3, and for a 2x image the pixel size divided by 2.

Resize

Resizing an image works similarly. The key is the line UIGraphicsBeginImageContextWithOptions(dstSize, NO, isPixelSize ? 1 : self.scale);: when creating the drawing context, simply pass a scale of 1 instead of self.scale if the destination size is given in pixels.

-(UIImage*)resizedImageToSize:(CGSize)dstSize isPixelSize:(BOOL)isPixelSize
{
    CGImageRef imgRef = self.CGImage;
    // the below values are regardless of orientation : for UIImages from Camera, width>height (landscape)
    CGSize  srcSize = CGSizeMake(CGImageGetWidth(imgRef), CGImageGetHeight(imgRef)); // not equivalent to self.size (which is dependant on the imageOrientation)!
    
    /* Don't resize if we already meet the required destination size. */
    if (CGSizeEqualToSize(srcSize, dstSize)) {
        return self;
    }
    
    CGFloat scaleRatio = dstSize.width / srcSize.width;
    UIImageOrientation orient = self.imageOrientation;
    CGAffineTransform transform = CGAffineTransformIdentity;
    switch(orient) {
            
        case UIImageOrientationUp: //EXIF = 1
            transform = CGAffineTransformIdentity;
            break;
            
        case UIImageOrientationUpMirrored: //EXIF = 2
            transform = CGAffineTransformMakeTranslation(srcSize.width, 0.0);
            transform = CGAffineTransformScale(transform, -1.0, 1.0);
            break;
            
        case UIImageOrientationDown: //EXIF = 3
            transform = CGAffineTransformMakeTranslation(srcSize.width, srcSize.height);
            transform = CGAffineTransformRotate(transform, M_PI);
            break;
            
        case UIImageOrientationDownMirrored: //EXIF = 4
            transform = CGAffineTransformMakeTranslation(0.0, srcSize.height);
            transform = CGAffineTransformScale(transform, 1.0, -1.0);
            break;
            
        case UIImageOrientationLeftMirrored: //EXIF = 5
            dstSize = CGSizeMake(dstSize.height, dstSize.width);
            transform = CGAffineTransformMakeTranslation(srcSize.height, srcSize.width);
            transform = CGAffineTransformScale(transform, -1.0, 1.0);
            transform = CGAffineTransformRotate(transform, 3.0 * M_PI_2);
            break;
            
        case UIImageOrientationLeft: //EXIF = 6
            dstSize = CGSizeMake(dstSize.height, dstSize.width);
            transform = CGAffineTransformMakeTranslation(0.0, srcSize.width);
            transform = CGAffineTransformRotate(transform, 3.0 * M_PI_2);
            break;
            
        case UIImageOrientationRightMirrored: //EXIF = 7
            dstSize = CGSizeMake(dstSize.height, dstSize.width);
            transform = CGAffineTransformMakeScale(-1.0, 1.0);
            transform = CGAffineTransformRotate(transform, M_PI_2);
            break;
            
        case UIImageOrientationRight: //EXIF = 8
            dstSize = CGSizeMake(dstSize.height, dstSize.width);
            transform = CGAffineTransformMakeTranslation(srcSize.height, 0.0);
            transform = CGAffineTransformRotate(transform, M_PI_2);
            break;
            
        default:
            [NSException raise:NSInternalInconsistencyException format:@"Invalid image orientation"];
            break;
            
    }
    
    /////////////////////////////////////////////////////////////////////////////
    // The actual resize: draw the image on a new context, applying a transform matrix
    UIGraphicsBeginImageContextWithOptions(dstSize, NO, isPixelSize?1:self.scale);
    
    CGContextRef context = UIGraphicsGetCurrentContext();
    
    if (!context) {
        return nil;
    }
    
    if (orient == UIImageOrientationRight || orient == UIImageOrientationLeft) {
        CGContextScaleCTM(context, -scaleRatio, scaleRatio);
        CGContextTranslateCTM(context, -srcSize.height, 0);
    } else {
        CGContextScaleCTM(context, scaleRatio, -scaleRatio);
        CGContextTranslateCTM(context, 0, -srcSize.height);
    }
    
    CGContextConcatCTM(context, transform);
    
    // we use srcSize (and not dstSize) as the size to specify is in user space (and we use the CTM to apply a scaleRatio)
    CGContextDrawImage(UIGraphicsGetCurrentContext(), CGRectMake(0, 0, srcSize.width, srcSize.height), imgRef);
    UIImage* resizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    
    return resizedImage;
}


References

如何處理iOS中照片的方向

Source code
