Keep transparency in screenshot on iOS

I am taking a screenshot of a particular view in my XIB file with the following code:
UIView* captureView = self.view;
UIGraphicsBeginImageContextWithOptions(captureView.bounds.size, NO , 0.0f);
[captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil);
It works fine and saves a JPG image to the camera roll.
The problem is that there is another UIImageView on top of my view, and that UIImageView contains a semi-transparent image.
My screenshot doesn't preserve that transparency.
I want to keep the transparency exactly as it appears on the actual screen.
How can I preserve the transparency in the screenshot?

If you specify NO for the opaque parameter, your image must include an alpha channel for this to work. Check that your image actually has an alpha channel.

JPEGs don't support transparency, so as soon as you convert the image to JPEG the alpha channel is gone.
This is a known limitation of UIImageWriteToSavedPhotosAlbum: it doesn't keep the PNG format.
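If keeping the alpha channel in the saved file matters, one option is to write the PNG data yourself instead of handing a UIImage to UIImageWriteToSavedPhotosAlbum. A minimal sketch using the Photos framework (iOS 9+, photo library access required; this is an alternative approach, not the original poster's code):
#import <Photos/Photos.h>
NSData *pngData = UIImagePNGRepresentation(screenshot);
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // Create the asset from raw PNG bytes so the alpha channel survives.
    PHAssetCreationRequest *request = [PHAssetCreationRequest creationRequestForAsset];
    [request addResourceWithType:PHAssetResourceTypePhoto data:pngData options:nil];
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) NSLog(@"Saving PNG failed: %@", error);
}];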

Try this; this code is working for me:
UIGraphicsBeginImageContext(baseViewOne.frame.size);
[[baseViewOne layer] renderInContext:UIGraphicsGetCurrentContext()];
UIImage * screenshota = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Also check Cocoa Coder's screenshot examples.

NSData *imdata = UIImagePNGRepresentation(_snapshotImgView.image); // re-encode the image as PNG data
UIImage *snapshotPNG = [UIImage imageWithData:imdata];
UIImageWriteToSavedPhotosAlbum(snapshotPNG, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
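For completeness, the selector passed above has a fixed UIKit signature; a minimal implementation might look like this (the logging is illustrative):
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"Could not save screenshot: %@", error);
    }
}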

Related

Watermark appears bigger on photos taken with the front camera

My app lets the user take photos, and in every photo there is a small watermark. The problem is: The watermark appears bigger when the photo has been taken with the front camera. I want the watermark to have the same size no matter which camera has been used.
Any ideas?
My code:
UIImage *backgroundImage = image;
UIImage *watermarkImage = [UIImage imageNamed:@"Watermark.png"];
UIGraphicsBeginImageContext(backgroundImage.size);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
[watermarkImage drawInRect:CGRectMake(backgroundImage.size.width - watermarkImage.size.width, backgroundImage.size.height - watermarkImage.size.height, watermarkImage.size.width, watermarkImage.size.height)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.imageView.image = result;
The watermark is the same size. The image is not, since the two cameras have different resolutions. You need to resize the watermark in proportion to the image size. I believe you can use scaleImage:toSize: for this.
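A minimal sketch of that idea (the 20% width factor and the use of UIGraphicsBeginImageContextWithOptions are illustrative choices, not part of the original answer):
UIImage *backgroundImage = image;
UIImage *watermarkImage = [UIImage imageNamed:@"Watermark.png"];
// Size the watermark relative to the photo so it looks the same regardless of
// which camera (and therefore which resolution) produced the photo.
CGFloat watermarkWidth = backgroundImage.size.width * 0.2f;
CGFloat watermarkHeight = watermarkWidth * (watermarkImage.size.height / watermarkImage.size.width);
UIGraphicsBeginImageContextWithOptions(backgroundImage.size, NO, backgroundImage.scale);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
[watermarkImage drawInRect:CGRectMake(backgroundImage.size.width - watermarkWidth,
                                      backgroundImage.size.height - watermarkHeight,
                                      watermarkWidth, watermarkHeight)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();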
Figured out a weird solution. I turned on edit mode: [picker setAllowsEditing:YES]; and now the watermark is the same size no matter what camera you use.

How to take a screenshot as a bitmap on iOS

I have a feature where I want to take a screenshot of a UIView (called "arrangeView" in my code), send the picture to my server, and then print it, so I need a high-quality picture.
My code:
UIGraphicsBeginImageContext(arrangeView.frame.size);
[arrangeView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
//save and send method
[self saveImage:newImage WithName:imgTitle];
With this code I have two issues:
1. The picture quality is too low. After uploading it to the server I will print it, but the picture I get with this code is only 20~40 KB, which is too small to print.
2. A friend told me that the App Store doesn't allow an app to take a screenshot and send it to the server; he told me to use a bitmap instead.
I searched Google and couldn't find a good solution. Would someone help?
Thank you very much.
The screenshot you get will be of the same resolution as your main screen, i.e. if it is an iPhone 5s it will be 640 x 1136. The best way to take a snapshot is:
UIView *screenShotView = [arrangeView snapshotViewAfterScreenUpdates:YES];
(Note that snapshotViewAfterScreenUpdates: returns a UIView, not a UIImage, so it is better suited to on-screen effects than to producing image data you can upload.)
Please try the code below. You can save the image to the photo album first and then send it to the server.
UIView* captureView = self.view;
/* Capture the screen shot at native resolution */
UIGraphicsBeginImageContextWithOptions(captureView.bounds.size, captureView.opaque, 0.0);
[captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
/* Render the screen shot at custom resolution */
CGRect cropRect = CGRectMake(0, 0, 1435, 1435);
UIGraphicsBeginImageContextWithOptions(cropRect.size, captureView.opaque, 1.0f);
[screenshot drawInRect:cropRect];
UIImage * customScreenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
/* Save to the photo album */
UIImageWriteToSavedPhotosAlbum(customScreenShot , nil, nil, nil);

UIView to UIImage with layer borders

I have a UIView whose layer has two sublayers, each of which has a 1.5 pixel border around the outside. I am trying to create a UIImage from this view with the following code
UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0f);
[self drawViewHierarchyInRect:self.bounds afterScreenUpdates:NO];
UIImage * image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
The code does return a UIImage, but the image is clipped: it doesn't include all of the borders on the sublayers. I've tried tweaking the sizes/bounds, but to no effect. Any suggestions for what else I might try?
Thanks!
What happens if you send the parent layer a drawInContext: message instead of telling the view to draw itself?
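For illustration, a sketch of rendering the layer tree directly; note this uses renderInContext: (which walks the sublayers) rather than the drawInContext: message named above, so treat it as an assumed variant of that suggestion. Whether it picks up borders drawn outside the view's bounds still depends on the layer geometry:
UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0f);
// Render the layer and its sublayers (including their borders) instead of
// asking the view hierarchy to draw itself.
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;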

How to save 2 UIImageView in one image to cameraroll

I have two UIImageViews, one over the other, and I'd like to save them as one single image to the camera roll.
UIImageWriteToSavedPhotosAlbum(self.myimage.image,nil,nil,nil);
The images are lying on top of each other and are the same size. The top one has some alpha (partial transparency) so the one underneath shows through.
You can do it this way:
Add both of your image views to one UIView, then take a screenshot of that UIView and store the generated image in your desired destination.
Here is the code to take a screenshot programmatically:
// code to take Screen shot
-(void) takeScreenshot
{
// Replace self.view with the name of the view containing your image views
UIGraphicsBeginImageContext(self.view.bounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
[self.view.layer renderInContext:context];
// get a UIImage from the image context
UIImage *screenImage = UIGraphicsGetImageFromCurrentImageContext();
// clean up drawing environment
UIGraphicsEndImageContext();
// Then save your Image in your desired destination
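// For the camera roll specifically (as the question asks), this is the same
// call used in the question; it assumes photo library access is granted.
UIImageWriteToSavedPhotosAlbum(screenImage, nil, nil, nil);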
}
Hope it will help you. Happy coding!

UIImagePickerController image Scaling and maintain its quality

An image captured using the camera is returned at a size of 720 x 960.
The captured image is displayed in a UIImageView of 320 x 436, like this:
UIImageView *imgView = [[UIImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, 320.0, 436.0)];
imgView.image = img; // Image received from camera.
[self.view addSubview:imgView];
This works fine: the 720 x 960 image is scaled to 320 x 436 and displayed.
Now the actual problem starts. I have another image of size 72 x 72. This image is overlaid on the image received from the camera at some arbitrary coordinates:
CGRectMake(0.0, 0.0, 72.0, 72.0);
I am not able to find a good way to handle scaling and applying an overlay of another image while at the same time maintaining its quality.
The image needs to be sent to a server.
Use the following code to scale images:
-(UIImage*) imageWithImage:(UIImage*)image scaledToSize:(CGSize)newSize
{
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
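For the overlay half of the question, a minimal sketch that composites the 72 x 72 image onto the camera image at its full 720 x 960 resolution before sending it to the server, so no quality is lost to the on-screen 320 x 436 scaling (the method name and the overlayPoint parameter are illustrative, not from the original answer):
- (UIImage *)imageByCompositing:(UIImage *)overlay over:(UIImage *)base atPoint:(CGPoint)overlayPoint
{
    // Draw at the base image's native size so nothing is downsampled.
    UIGraphicsBeginImageContextWithOptions(base.size, NO, base.scale);
    [base drawInRect:CGRectMake(0, 0, base.size.width, base.size.height)];
    [overlay drawInRect:CGRectMake(overlayPoint.x, overlayPoint.y,
                                   overlay.size.width, overlay.size.height)];
    UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return combined;
}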
