I have a scenario in my app where I take a screenshot of the video using
[myMovieController requestThumbnailImagesAtTimes:@[@(myMovieController.currentPlaybackTime)] timeOption:MPMovieTimeOptionExact];
which works just fine. Then I have to crop the image at the location touched on the video. I have added a gesture recognizer to myMovieController and get the touch location from it.
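Roughly, the recognizer setup and handler look like this (a simplified sketch of my code; handleTap: is just my selector name):
// wherever the movie controller is set up
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                      action:@selector(handleTap:)];
[myMovieController.view addGestureRecognizer:tap];

- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    // touchPoint is in the coordinate space of the movie controller's view
    touchPoint = [recognizer locationInView:myMovieController.view];
}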
Then I use the following code to crop the screenshot:
CGRect cropRect = tapCircleView.frame;
cropRect = CGRectMake(touchPoint.x * image.scale,
touchPoint.y * image.scale,
cropRect.size.width * image.scale,
cropRect.size.height * image.scale);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect) ;
UIImage* cropped = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
[self showImage:cropped];
where cropRect's width and height are 150.
But the crop result does not have the correct x and y, and the resulting image is also very pixelated.
I have tried every solution I could find, but it's not working.
What am I missing?
Thanks.
The image captured in the screenshot is not the same size as the screen of the device the application is running on.
So instead of using that image directly, resize it with the following code:
// Resize the screenshot to the screen's point size
CGRect rect = CGRectMake(0, 0, [UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height);
UIGraphicsBeginImageContext(rect.size);
[image drawInRect:rect];
UIImage *picture1 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Round-trip through PNG data to get a plain bitmap-backed UIImage
NSData *imageData = UIImagePNGRepresentation(picture1);
UIImage *img = [UIImage imageWithData:imageData];
And now just use this image wherever you want !!
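For example, the crop from the question would then use this resized image instead of the raw thumbnail (a sketch; touchPoint and the 150-point crop size come from the question):
CGRect cropRect = CGRectMake(touchPoint.x * img.scale,
                             touchPoint.y * img.scale,
                             150.0f * img.scale,
                             150.0f * img.scale);
CGImageRef imageRef = CGImageCreateWithImageInRect(img.CGImage, cropRect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef scale:img.scale orientation:img.imageOrientation];
CGImageRelease(imageRef);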
Cheers.
When an image is cropped from the center, the cropped image keeps the aspect ratio of the source image. In my case, however, the aspect ratio should change to match the new crop size.
I want to get the exact center part of the image with the new aspect ratio. For example, if a large image is 320*480 and I crop a 100*100 piece from its center, the result should have a 100*100 aspect ratio, with no white or black borders, and the image quality should remain high.
Cropping function :
- (UIImage *)cropImage:(UIImage*)image andFrame:(CGRect)rect {
// Note: rect is simply the frame of the region you want to crop.
rect = CGRectMake(rect.origin.x*image.scale,
rect.origin.y*image.scale,
rect.size.width*image.scale,
rect.size.height*image.scale);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], rect);
UIImage *result = [UIImage imageWithCGImage:imageRef
scale:image.scale
orientation:image.imageOrientation];
CGImageRelease(imageRef);
return result;
}
Please help me.
This might work:
- (UIImage *)imageByCroppingImage:(UIImage *)image toSize:(CGSize)size
{
// not equivalent to image.size (which depends on the imageOrientation)!
double refWidth = CGImageGetWidth(image.CGImage);
double refHeight = CGImageGetHeight(image.CGImage);
double x = (refWidth - size.width) / 2.0;
double y = (refHeight - size.height) / 2.0;
CGRect cropRect = CGRectMake(x, y, size.width, size.height);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef scale:0.0 orientation:image.imageOrientation];
CGImageRelease(imageRef);
return cropped;
}
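For example, to get the 100x100 center crop from the question (a sketch; largeImage stands in for your 320x480 source image and self.imageView is just a hypothetical destination):
// largeImage is your 320x480 source image
UIImage *centerCrop = [self imageByCroppingImage:largeImage toSize:CGSizeMake(100, 100)];
self.imageView.image = centerCrop;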
var imageView = UIImageView()
// width and height correspond to the desired rectangle's width and height
imageView = UIImageView(frame: CGRectMake(0, 0, width, height))
imageView.image = UIImage(named: "Your Image Name")
// setting contentMode to .ScaleAspectFill makes the image fill the imageView from its center; parts of the image may extend beyond the frame
imageView.contentMode = .ScaleAspectFill
// setting clipsToBounds to true clips the image to the imageView's frame
imageView.clipsToBounds = true
view.addSubview(imageView)
Try this code:
-(UIImage *)croppedImage
{
    [self.bezierPath closePath];
    _b_image = self.bg_imageview.image;
    CGSize imageSize = _b_image.size;
    CGRect imageRect = CGRectMake(0, 0, imageSize.width, imageSize.height);
    UIGraphicsBeginImageContextWithOptions(imageSize, NO, [[UIScreen mainScreen] scale]);
    // Clip the context to the bezier path, then draw the image so only the clipped region remains
    [self.bezierPath addClip];
    [_b_image drawInRect:imageRect];
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return croppedImage;
}
Calculate crop rect from image
float imgHeight = 100.0f; //Any according to requirement
float imgWidth = 100.0f; //Any according to requirement
CGRect cropRect = CGRectMake((largeImage.size.width/2)-(imgWidth/2), (largeImage.size.height/2)-(imgHeight/2), imgWidth, imgHeight);
Now crop it
CGImageRef imageRef = CGImageCreateWithImageInRect([largeImage CGImage], cropRect);
// or use the UIImage wherever you like
UIImage *croppedImg = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
I have a UIImageView and I want to blur only the bottom portion of the image.
Can someone help me with this?
I have used the UIImage+ImageEffects category to blur the image completely. How can I do it for a specific portion only?
Split your UIImage into two UIImages. Blur the one you want, leave the other one unaffected. The following splits the image exactly at the center, adjust the rects in the CGImageCreateWithImageInRect calls if you want to move the blurred portion.
UIImage *image = [UIImage imageNamed:@"yourImage.png"];
CGFloat halfImageHeight = image.size.height / 2.f;
CGImageRef topImgRef = CGImageCreateWithImageInRect(image.CGImage, CGRectMake(0, 0, image.size.width, halfImageHeight));
UIImage *topImage = [UIImage imageWithCGImage:topImgRef];
CGImageRelease(topImgRef);
CGImageRef bottomImgRef = CGImageCreateWithImageInRect(image.CGImage, CGRectMake(0, halfImageHeight, image.size.width, halfImageHeight));
UIImage *bottomImage = [UIImage imageWithCGImage:bottomImgRef];
CGImageRelease(bottomImgRef);
// Add blur effects to bottomImage
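For the blur itself, one option is to use the UIImage+ImageEffects category you mention and then stitch the two halves back together (a sketch; applyBlurWithRadius:tintColor:saturationDeltaFactor:maskImage: is the method from Apple's UIImage+ImageEffects sample, so double-check it matches the version of the category you use):
UIImage *blurredBottom = [bottomImage applyBlurWithRadius:10.0
                                                tintColor:nil
                                    saturationDeltaFactor:1.0
                                                maskImage:nil];

// Draw the untouched top half and the blurred bottom half into one image
UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
[topImage drawInRect:CGRectMake(0, 0, image.size.width, halfImageHeight)];
[blurredBottom drawInRect:CGRectMake(0, halfImageHeight, image.size.width, halfImageHeight)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();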
I encountered a problem: when I pinch, pan, or rotate a UIImageView, its transform is not preserved when I draw it with drawInRect.
How can I preserve the transform in drawInRect?
I tried this, but no luck :(
- (UIImage*) combineImage:(UIImageView *)selectedImage withOverlay:(UIImageView *)overlayImage
{
/* Identify the region that needs to be cropped */
CGRect viewForImgFrame = self.viewForImg.frame;
NSLog(#"view %#", NSStringFromCGRect(viewForImgFrame));
NSLog(#"selectedImage Img value %#",selectedImage);
NSLog(#"overlayImage Img value %#",overlayImage);
CGSize newImageSize =self.viewForImg.frame.size;
NSLog(#"CGSize %#",NSStringFromCGSize(newImageSize));
UIGraphicsBeginImageContextWithOptions(newImageSize, NO, 0.0); //retina res
//[self.viewForImg.layer renderInContext:UIGraphicsGetCurrentContext()];
[selectedImage.image drawInRect:CGRectMake(0, 0, selectedImage.frame.size.width, selectedImage.frame.size.height)];
CGContextConcatCTM(UIGraphicsGetCurrentContext(), overlayImage.transform);
[overlayImage.image drawInRect:CGRectMake(overlayImage.frame.origin.x, overlayImage.frame.origin.y, overlayImage.frame.size.width, overlayImage.frame.size.height)];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
NSData *imgData = UIImageJPEGRepresentation(image, 0.9); // get a JPEG representation
UIImage *finalImage = [UIImage imageWithData:imgData]; // wrap a UIImage around the JPEG data
UIGraphicsEndImageContext();
return finalImage;
}
Any comments are greatly appreciated.
You need to try
CGRectApplyAffineTransform(<#CGRect rect#>, <#CGAffineTransform t#>)
Your code should be like this:
CGContextConcatCTM(UIGraphicsGetCurrentContext(), overlayImage.transform);
CGRect rect = CGRectMake(overlayImage.frame.origin.x, overlayImage.frame.origin.y, overlayImage.frame.size.width, overlayImage.frame.size.height);
CGRect transformedRect = CGRectApplyAffineTransform(rect, overlayImage.transform);
[overlayImage.image drawInRect:transformedRect];
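Alternatively, instead of transforming the rect, a common approach is to apply the view's transform to the context about the overlay's center before drawing (a sketch; it assumes the overlay's anchorPoint is the default (0.5, 0.5) and that overlayImage.center is expressed in the drawing context's coordinate space):
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSaveGState(ctx);
// Move the origin to the overlay's center, apply its transform there,
// then draw the overlay's image centered on that point
CGContextTranslateCTM(ctx, overlayImage.center.x, overlayImage.center.y);
CGContextConcatCTM(ctx, overlayImage.transform);
[overlayImage.image drawInRect:CGRectMake(-overlayImage.bounds.size.width / 2,
                                          -overlayImage.bounds.size.height / 2,
                                          overlayImage.bounds.size.width,
                                          overlayImage.bounds.size.height)];
CGContextRestoreGState(ctx);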
It turned out that the autoresizingMask was changing the rect size:
[slider setAutoresizingMask:UIViewAutoresizingNone];
So I have a UIImage which I want to crop. I looked around and found the imageByCroppingToRect method on CIImage, so I converted the data to a CIImage instead of a UIImage, cropped it with that method, converted the resulting CIImage back to a UIImage, and then displayed it in a UIImageView.
My code is:
NSData *data = [[NSData alloc] initWithData:[def objectForKey:@"imageData"]];
//UIImage *normalImage = [[UIImage alloc]initWithData:data];
CIImage *originalImage = [CIImage imageWithData:data];
[originalImage imageByCroppingToRect:CGRectMake(10, 72, 300, 300)];
self.imageView.image = [UIImage imageWithCIImage:originalImage];
The problem is the image gets rotated by 90 degrees and I am not sure if it is being cropped. This image is captured using the device's camera. I use AVFoundation to access the camera. My session preset is AVCaptureSessionPresetPhoto. I think this is why I get the zooming.
CGRect rect = CGRectMake(10, 72, 300, 300);
CGImageRef imref = CGImageCreateWithImageInRect([yourOriginalImage CGImage], rect);
UIImage *newSubImage = [UIImage imageWithCGImage:imref];
Try this; it may help you.
EDIT:
First, fix your image orientation.
Ref: https://github.com/j3r3miah/mapmatic-ios/blob/master/Mapmatic/UIImage+FixOrientation.m
Then use the above code to crop the image to the specified rect.
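If you prefer not to pull in the whole category, here is a minimal sketch that normalizes the orientation by simply redrawing the image (this bakes the orientation into the pixel data):
- (UIImage *)normalizedImage:(UIImage *)image
{
    if (image.imageOrientation == UIImageOrientationUp) {
        return image; // already upright, nothing to do
    }
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}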
Not really an answer to your question, but an answer to your problem
https://github.com/mbcharbonneau/UIImage-Categories
especially this file : https://github.com/mbcharbonneau/UIImage-Categories/blob/master/UIImage%2BResize.m
- (UIImage *)croppedImage:(CGRect)bounds {
CGFloat scale = MAX(self.scale, 1.0f);
CGRect scaledBounds = CGRectMake(bounds.origin.x * scale, bounds.origin.y * scale, bounds.size.width * scale, bounds.size.height * scale);
CGImageRef imageRef = CGImageCreateWithImageInRect([self CGImage], scaledBounds);
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef scale:self.scale orientation:UIImageOrientationUp];
CGImageRelease(imageRef);
return croppedImage;
}
You will find everything you need to crop your image there.
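Usage with the rect from the question would then be roughly (a sketch; UIImage+Resize.h is assumed to be the header for the linked file):
#import "UIImage+Resize.h"

UIImage *original = [UIImage imageWithData:data];
UIImage *cropped = [original croppedImage:CGRectMake(10, 72, 300, 300)];
self.imageView.image = cropped;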
I'm trying to crop an image into a square shape and then scale it to 200x200. Here's the code I use, but it's not really working for me: the image I get is sometimes in a different orientation, or isn't cropped from the center, or is wider than it should be.
float scale = _avatar.image.scale;
UIImageOrientation orientation = _avatar.image.imageOrientation;
if(_avatar.image.size.width < _avatar.image.size.height){ // avatar is a UIImageView
float startingY = (_avatar.image.size.height-_avatar.image.size.width)/2; // image is taller, determine the origin of the width-sized square, which will be the new image
CGImageRef imageRef = CGImageCreateWithImageInRect([_avatar.image CGImage], CGRectMake(0, startingY, _avatar.image.size.width, _avatar.image.size.width));
_avatar.image = [UIImage imageWithCGImage:imageRef scale:scale orientation:orientation];
CGImageRelease(imageRef);
} else {
float startingX = (_avatar.image.size.width-_avatar.image.size.height)/2; // image is wider, determine the origin of the height-sized square, which will be the new image
CGImageRef imageRef = CGImageCreateWithImageInRect([_avatar.image CGImage], CGRectMake(startingX, 0, _avatar.image.size.height, _avatar.image.size.height));
_avatar.image = [UIImage imageWithCGImage:imageRef scale:scale orientation:orientation];
CGImageRelease(imageRef);
}
UIGraphicsBeginImageContextWithOptions(CGSizeMake(200, 200), YES, 0.0);
[_avatar.image drawInRect:CGRectMake(0, 0, 200, 200)];
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
How should this be done to achieve the correct result?
I would suggest using this UIImage category posted on GitHub. The classes are really simple, and you can also read them to learn what is under the hood.
UIImage has already done all the coordinate and orientation handling for you, so you should not take the width and height from a UIImage and use them to crop the underlying CGImage. You can take advantage of UIKit to crop an image instead.
CGFloat startingX = 0;
CGFloat startingY = 0;
CGFloat squareWidth;
if(_avatar.image.size.width < _avatar.image.size.height){ // avatar is a UIImageView
startingY = (_avatar.image.size.height-_avatar.image.size.width)/2; // image is taller, determine the origin of the width-sized square, which will be the new image
squareWidth = _avatar.image.size.width;
} else {
startingX = (_avatar.image.size.width-_avatar.image.size.height)/2; // image is wider, determine the origin of the height-sized square, which will be the new image
squareWidth = _avatar.image.size.height;
}
UIGraphicsBeginImageContextWithOptions(CGSizeMake(squareWidth, squareWidth), YES, 0.0);
[_avatar.image drawAtPoint:CGPointMake(-startingX, -startingY)]; // Make an offset to draw part of the image
UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIGraphicsBeginImageContextWithOptions(CGSizeMake(200, 200), YES, 0.0);
[croppedImage drawInRect:CGRectMake(0, 0, 200, 200)];
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
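If you want to avoid the intermediate image, the crop and the scale can also be done in a single pass by drawing the source image scaled and offset into the 200x200 context (a sketch, reusing startingX, startingY, and squareWidth from above):
UIGraphicsBeginImageContextWithOptions(CGSizeMake(200, 200), YES, 0.0);
CGFloat ratio = 200.0f / squareWidth;
// Offset and scale the drawing so the square region lands exactly on the 200x200 canvas
[_avatar.image drawInRect:CGRectMake(-startingX * ratio,
                                     -startingY * ratio,
                                     _avatar.image.size.width * ratio,
                                     _avatar.image.size.height * ratio)];
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();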