I have a feature where I want to take a screenshot of a UIView (called "arrangeView" in my code) and send the picture to my server. I will then print it, so I need a high-quality picture.
My code:
UIGraphicsBeginImageContext(arrangeView.frame.size);
[arrangeView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// save and send method
[self saveImage:newImage WithName:imgTitle];
With this code I have two issues:
The picture quality is too low. After I upload it to the server, I will print it, but the picture I get with this code is only 20~40 KB, which is too small to print.
A friend told me that the App Store doesn't allow an app to take a screenshot and send it to the server, and that I should use a bitmap instead.
I searched on Google but didn't find a good solution. Could someone help?
Thank you very much.
The screenshot you get will be the same resolution as your main screen, i.e. on an iPhone 5s it will be 640 x 1136. The best way to get a screenshot is:
UIView *screenShotView = [arrangeView snapshotViewAfterScreenUpdates:YES];
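Note that snapshotViewAfterScreenUpdates: returns a UIView, not a UIImage, so if you need actual pixel data to upload, you still have to render into a bitmap context. A minimal sketch that renders arrangeView itself at the device's native scale (the 0.0 scale argument means "use the main screen's scale"):
// Scale 0.0 = use the device's native screen scale, so the output is
// full Retina resolution rather than 1x.
UIGraphicsBeginImageContextWithOptions(arrangeView.bounds.size, NO, 0.0);
[arrangeView drawViewHierarchyInRect:arrangeView.bounds afterScreenUpdates:YES];
UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();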
Please try the code below. You can save the image to the photo album first and then send it to the server:
UIView* captureView = self.view;
/* Capture the screenshot at native resolution */
UIGraphicsBeginImageContextWithOptions(captureView.bounds.size, captureView.opaque, 0.0);
[captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
/* Render the screenshot at a custom resolution */
CGRect cropRect = CGRectMake(0, 0, 1435, 1435);
UIGraphicsBeginImageContextWithOptions(cropRect.size, captureView.opaque, 1.0f);
[screenshot drawInRect:cropRect];
UIImage * customScreenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
/* Save to the photo album */
UIImageWriteToSavedPhotosAlbum(customScreenShot, nil, nil, nil);
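Since the original goal is to upload the picture, you can also turn it into NSData directly; a minimal sketch (the uploadImageData: call is a hypothetical placeholder for your own networking code):
/* PNG keeps full quality; JPEG with a high quality factor is smaller. */
NSData *pngData = UIImagePNGRepresentation(customScreenShot);
NSData *jpegData = UIImageJPEGRepresentation(customScreenShot, 0.9);
[self uploadImageData:pngData]; /* hypothetical helper, not a UIKit API */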
I am new to iOS development and I really need some help with a task: I need to know how cropping and resizing an image is done in Objective-C. For registration I need to select a profile image and then crop and resize it, but I don't know how to do it. I have seen many source codes on GitHub and on SO, but none of them seems to help. Can anyone explain how cropping an image works in iOS?
Here is a screenshot of the page I'm working on:
Why are you not using the standard UIImagePickerController? If you set imagePicker.allowsEditing = YES; and then, in the imagePickerController:didFinishPickingMediaWithInfo: delegate method, get
UIImage *newImage = info[UIImagePickerControllerEditedImage];
you'll get the image cropped by the user.
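A minimal sketch of that flow, assuming the presenting view controller adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate:
- (void)pickProfileImage {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.allowsEditing = YES; // shows the built-in crop/resize UI
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // The edited (cropped) image, falling back to the original if
    // editing was unavailable.
    UIImage *newImage = info[UIImagePickerControllerEditedImage]
                     ?: info[UIImagePickerControllerOriginalImage];
    // use newImage as the profile picture...
    [picker dismissViewControllerAnimated:YES completion:nil];
}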
You can use this function: pass it an image and a size, and it returns the scaled image.
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
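Hypothetical usage (ImageUtils and pickedImage are assumed names, not from the original post):
// Scale a picked photo down to a 200x200 thumbnail.
UIImage *thumbnail = [ImageUtils imageWithImage:pickedImage
                                   scaledToSize:CGSizeMake(200, 200)];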
My app lets the user take photos, and every photo gets a small watermark. The problem: the watermark appears bigger when the photo is taken with the front camera. I want the watermark to be the same size no matter which camera is used.
Any ideas?
My code:
UIImage *backgroundImage = image;
UIImage *watermarkImage = [UIImage imageNamed:@"Watermark.png"];
UIGraphicsBeginImageContext(backgroundImage.size);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
[watermarkImage drawInRect:CGRectMake(backgroundImage.size.width - watermarkImage.size.width, backgroundImage.size.height - watermarkImage.size.height, watermarkImage.size.width, watermarkImage.size.height)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.imageView.image = result;
The watermark is the same size; the image is not, since the two cameras have different resolutions. You need to resize the watermark in proportion to the image size. I believe you can use scaleImage:toSize: for this.
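A minimal sketch of that idea: scale the watermark to a fixed fraction of the background's width before drawing, so its relative size no longer depends on which camera produced the photo (the 0.25 fraction is an arbitrary example value):
UIImage *backgroundImage = image;
UIImage *watermarkImage = [UIImage imageNamed:@"Watermark.png"];

// Scale the watermark to 25% of the background width, preserving its
// aspect ratio, so its relative size is camera-independent.
CGFloat targetWidth = backgroundImage.size.width * 0.25;
CGFloat targetHeight = targetWidth * (watermarkImage.size.height / watermarkImage.size.width);

UIGraphicsBeginImageContext(backgroundImage.size);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
[watermarkImage drawInRect:CGRectMake(backgroundImage.size.width - targetWidth,
                                      backgroundImage.size.height - targetHeight,
                                      targetWidth, targetHeight)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();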
I figured out a weird solution: I turned on edit mode with [picker setAllowsEditing:YES]; and now the watermark is the same size no matter which camera you use.
I followed the traditional way of drawing the view into a UIImage:
UIGraphicsBeginImageContextWithOptions(controller.skView.bounds.size, NO, 1.0);
[controller.skView drawViewHierarchyInRect:controller.skView.bounds afterScreenUpdates:YES];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
I have tried setting the third parameter of UIGraphicsBeginImageContextWithOptions to 2.0 or larger, but the quality of the image doesn't get better.
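For reference, the variant with an explicit scale factor looks like this; the output has more pixels but doesn't look any sharper:
// Same capture with an explicit 2.0 scale factor.
UIGraphicsBeginImageContextWithOptions(controller.skView.bounds.size, NO, 2.0);
[controller.skView drawViewHierarchyInRect:controller.skView.bounds afterScreenUpdates:YES];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();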
Here I post two images (the first is captured by the code, the second by pressing Home + Power, which is supported by the system).
I want to know if there is a way to make it better. Any help is appreciated!
In order to make a slide show, I installed the KASlideShow pod, which is perfect. On the iPhone 5s my slideshow took the full width (what I wanted), but when testing the app on the iPhone 6 it takes only 3/4 of the screen.
I tried to scale the UIImage using this method:
CGRect rect = CGRectMake(0, 0, width, height);
UIGraphicsBeginImageContext(rect.size);
[originialImage drawInRect:rect];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(finalImage);
UIImage *img=[UIImage imageWithData:imageData];
But it only scales to 3/4, even if I put in enormous values; it's as if there were a limit.
I tried the UIViewContentModeScaleAspectFill content mode, but it doesn't do anything.
I'm out of ideas to test.
Thanks for your help.
I am taking a screenshot of a particular view in my XIB file with the following code...
UIView* captureView = self.view;
UIGraphicsBeginImageContextWithOptions(captureView.bounds.size, NO, 0.0f);
[captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil);
It works fine and saves a JPEG image to the camera roll.
The problem is that there is another UIImageView on top of my view, and that UIImageView contains a semi-transparent image.
My screenshot doesn't preserve that transparency.
I want to keep the transparency as it is on the actual screen.
How can I preserve the transparency in the screenshot?
If you specify "No" for the opaque property, your image must include an alpha channel for this to work. Check that your image has an alpha channel.
JPGs don't have transparency so as soon as you convert it to JPG alpha is gone.
This is a known limitation of UIImageWriteToSavedPhotosAlbum
it doesn't keep png.
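If keeping the alpha channel matters, one option is to bypass the photo album and store the PNG data yourself; a minimal sketch that writes it to the app's Documents directory:
// Keep the alpha channel by storing PNG data directly instead of
// round-tripping through the photo album.
NSData *pngData = UIImagePNGRepresentation(screenshot);
NSString *docs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                     NSUserDomainMask, YES).firstObject;
NSString *path = [docs stringByAppendingPathComponent:@"screenshot.png"];
[pngData writeToFile:path atomically:YES];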
Try this; this code works for me:
UIGraphicsBeginImageContext(baseViewOne.frame.size);
[[baseViewOne layer] renderInContext:UIGraphicsGetCurrentContext()];
UIImage * screenshota = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Also check Cocoa Coder's screenshot examples:
NSData *imdata = UIImagePNGRepresentation(_snapshotImgView.image);
UIImage *snapshotPNG = [UIImage imageWithData:imdata];
UIImageWriteToSavedPhotosAlbum(snapshotPNG, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
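Since a completion selector is passed in, the target object also needs to implement it; a minimal sketch:
// Completion callback matching the @selector passed above.
- (void)image:(UIImage *)image
didFinishSavingWithError:(NSError *)error
  contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Saving failed: %@", error.localizedDescription);
    } else {
        NSLog(@"Screenshot saved to the photo album.");
    }
}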