Can't Capture Full Screen of iCarousel - ios

I have a problem capturing the full screen of an iCarousel: it only captures the currently visible index of the carousel.
UIGraphicsBeginImageContext(captureView.bounds.size);
[captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Try something like this:
- (void)getFullScreenScreenShot
{
    AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
    UIView *superView = appDelegate.viewController.view;
    CGRect fullScreenFrame = superView.frame;

    UIGraphicsBeginImageContextWithOptions(fullScreenFrame.size, YES, 0.0f);
    CGContextTranslateCTM(UIGraphicsGetCurrentContext(), 0.0f, 0.0f);
    [superView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImageView *screenShot = [[UIImageView alloc] initWithImage:UIGraphicsGetImageFromCurrentImageContext()];
    UIGraphicsEndImageContext();

    NSData *imageData = UIImageJPEGRepresentation(screenShot.image, 1.0);
    NSString *previewFileNamePath = [[CPFileManager documentsPath] stringByAppendingString:@"image.jpg"];
    if ([imageData writeToFile:previewFileNamePath atomically:NO])
    {
        NSLog(@"See filename:%@", previewFileNamePath);
    }
    else
    {
        NSLog(@"Error: %@", previewFileNamePath);
    }
}
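If renderInContext: still only captures the carousel's current item, a possible alternative on iOS 7 and later is drawViewHierarchyInRect:afterScreenUpdates:, which snapshots the on-screen view hierarchy rather than a single layer. This is only a minimal sketch; the carouselContainerView name is an assumption standing in for whatever view contains the whole carousel (its superview or the key window):
// Minimal sketch (iOS 7+); carouselContainerView is an assumed name for the
// view that contains the whole carousel, e.g. its superview or the key window.
UIGraphicsBeginImageContextWithOptions(carouselContainerView.bounds.size, NO, 0.0f);
[carouselContainerView drawViewHierarchyInRect:carouselContainerView.bounds afterScreenUpdates:YES];
UIImage *fullScreenImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();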

Related

Clipping images using BezierPath and combining the images after clipping

I want to develop a small jigsaw puzzle game, but I am having a problem combining the image pieces. I can split the image but cannot combine the pieces as I need. Here is what I am doing.
For cropping:
[customImageView setImage:[self cropImage:self.mainImage withRect:mCropFrame]];
- (UIImage *)cropImage:(UIImage *)originalImage withRect:(CGRect)rect
{
    return [UIImage imageWithCGImage:CGImageCreateWithImageInRect([originalImage CGImage], rect)];
}
For Clipping:
[self setClippingPath:[pieceBezierPathsMutArray_ objectAtIndex:i]:view];
- (UIImageView *)setClippingPath:(UIBezierPath *)clippingPath :(UIImageView *)imgView
{
    if (![[imgView layer] mask])
    {
        [[imgView layer] setMask:[CAShapeLayer layer]];
    }
    [(CAShapeLayer *)[[imgView layer] mask] setPath:[clippingPath CGPath]];
    return imgView;
}
For Combining:
-(id)initByCombining:(id)oneView andOther:(id)twoView withRegularSize:(CGSize)pieceSize
{
    CustomImageView *one = oneView; //[oneView copy];
    CustomImageView *two = twoView;
    CGPoint onepoint, twopoint;
    if (one.frame.origin.x < two.frame.origin.x)
    {
        onepoint.x = 0;
        twopoint.x = onepoint.x + one.frame.size.width;
    }
    else
    {
        onepoint.x = onepoint.x + one.frame.size.width;
        twopoint.x = 0;
    }
    if (one.frame.origin.y < two.frame.origin.y)
    {
        onepoint.y = 0;
        twopoint.y = 0;
    }
    else
    {
        onepoint.y = 0;
        twopoint.y = 0;
    }
    CGRect frame;
    frame.origin = CGPointZero;
    frame.size.width = onepoint.x + one.frame.size.width + two.frame.size.width;
    frame.size.height = MAX(one.frame.size.height, two.frame.size.height);
    if (self = [self initWithFrame:frame])
    {
        UIGraphicsPushContext(UIGraphicsGetCurrentContext());
        UIGraphicsBeginImageContext(frame.size);
        [one.image drawAtPoint:onepoint];
        [two.image drawAtPoint:twopoint];
        [self.layer renderInContext:UIGraphicsGetCurrentContext()];
        self.image = UIGraphicsGetImageFromCurrentImageContext();
        self.backgroundColor = [UIColor redColor];
        UIGraphicsEndImageContext();
        UIGraphicsPopContext();
        self.center = one.center;
        self.transform = CGAffineTransformScale(incomingTransform, 0.5, 0.5);
        self.previousRotation = self.transform;
    }
    return self;
}
My initial image is this:
After cropping and clipping it becomes like this:
It should look like this after combining.
But it is becoming like this.
When you want to combine the images, I would suggest placing the clipped pieces inside another UIView, so that this UIView becomes the superview of all the placed clippings. After the images have been placed, you can do something like this:
UIGraphicsBeginImageContextWithOptions(superView.bounds.size, superView.opaque, 0.0);
[superView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * CombinedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return CombinedImage;
and then just save it as follows:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[CombinedImage CGImage]
                          orientation:(ALAssetOrientation)[SAVEIMAGE imageOrientation]
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        // TODO: error handling
        UIAlertView *al = [[UIAlertView alloc] initWithTitle:@""
                                                     message:@"Error saving image, Please try again"
                                                    delegate:nil
                                           cancelButtonTitle:@"OK"
                                           otherButtonTitles:nil, nil];
        [al show];
    } else {
        NSData *imageData = UIImagePNGRepresentation(CombinedImage);
        UIImage *finalImage = [UIImage imageWithData:imageData];
    }
}];
Hope this helps.

MKMapSnapshotter with MKPolylineRenderer problems

I tried implementing this answer (https://stackoverflow.com/a/22716610) to the problem of adding overlays to an MKMapSnapshotter in iOS 7 (you can't use the renderInContext: method). I did this as shown below, but the returned image contains only the map, with no overlays. Forgive me, I am quite new to this. Thanks.
-(void)mapViewDidFinishRenderingMap:(MKMapView *)mapView fullyRendered:(BOOL)fullyRendered
{
    if (mapView.tag == 100) {
        MKMapSnapshotOptions *options = [[MKMapSnapshotOptions alloc] init];
        options.region = mapView.region;
        options.size = mapView.frame.size;
        options.scale = [[UIScreen mainScreen] scale];
        MKMapSnapshotter *snapshotter = [[MKMapSnapshotter alloc] initWithOptions:options];
        [snapshotter startWithCompletionHandler:^(MKMapSnapshot *snapshot, NSError *error) {
            if (error) {
                NSLog(@"[Error] %@", error);
                return;
            }
            UIImage *image = snapshot.image;
            UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
            {
                [image drawAtPoint:CGPointMake(0, 0)];
                CGContextRef context = UIGraphicsGetCurrentContext();
                CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
                CGContextSetLineWidth(context, 5.0f);
                CGContextBeginPath(context);
                bool first = YES;
                NSArray *overlays = mapView.overlays;
                for (id <MKOverlay> overlay in overlays) {
                    CGPoint point = [snapshot pointForCoordinate:overlay.coordinate];
                    if (first)
                    {
                        first = NO;
                        CGContextMoveToPoint(context, point.x, point.y);
                    }
                    else {
                        CGContextAddLineToPoint(context, point.x, point.y);
                    }
                }
                UIImage *compositeImage = UIGraphicsGetImageFromCurrentImageContext();
                NSData *data = UIImagePNGRepresentation(compositeImage);
                placeToSave = data;
                NSLog(@"MapView Snapshot Saved.");
                // show image for debugging
                UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 200, 320, 320)];
                imageView.image = compositeImage;
                [self.view addSubview:imageView];
            }
            UIGraphicsEndImageContext();
        }];
        [mapView setHidden:YES];
    }
}
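Note that the snapshot handler above only adds one point per overlay and never strokes the path, so nothing visible ends up on the image. As a hedged sketch (assuming the overlays are MKPolyline instances), the drawing pass inside the completion handler could instead walk each polyline's points and finish with an explicit stroke:
// Hedged sketch of the overlay drawing pass, assuming MKPolyline overlays;
// `snapshot` and `context` are the variables from the completion handler above.
for (id <MKOverlay> overlay in mapView.overlays) {
    if (![overlay isKindOfClass:[MKPolyline class]]) continue;
    MKPolyline *polyline = (MKPolyline *)overlay;
    CGContextBeginPath(context);
    for (NSUInteger i = 0; i < polyline.pointCount; i++) {
        // Project each map point of the polyline into the snapshot's image space.
        CLLocationCoordinate2D coordinate = MKCoordinateForMapPoint(polyline.points[i]);
        CGPoint point = [snapshot pointForCoordinate:coordinate];
        if (i == 0) {
            CGContextMoveToPoint(context, point.x, point.y);
        } else {
            CGContextAddLineToPoint(context, point.x, point.y);
        }
    }
    // Without this call the path is built but never drawn.
    CGContextStrokePath(context);
}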

UIImagePickerController forces images in other views to full size

In one spot of my app, I need to use the camera, so I call up the UIImagePickerController. Unfortunately, once I return from the controller, most of the pictures in the app are full size, no matter what their UIImageView attributes say. The exception appears to be UIImageViews in UITableViewCells. This applies to all views in the app, not just ones that have direct connection to the viewcontroller that called the UIImagePickerController. Once while I was messing around, trying to troubleshoot, the problem seemed to disappear on its own, though I have not been able to replicate that.
The code is as follows.
-(void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        UIAlertView *errorAlertView = [[UIAlertView alloc] initWithTitle:@"Error"
                                                                 message:@"Device has no camera"
                                                                delegate:nil
                                                       cancelButtonTitle:@"OK"
                                                       otherButtonTitles:nil];
        [errorAlertView show];
        [_app.navController popPage];
    }
    else if (firstTime) {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.delegate = self;
        picker.allowsEditing = YES;
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        firstTime = false;
        [self presentViewController:picker animated:YES completion:NULL];
    }
}

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *cameraImage = info[UIImagePickerControllerEditedImage];
    [picker dismissViewControllerAnimated:YES completion:NULL];
    NSString *folderName = @"redApp";
    if ([_page hasChild:[RWPAGE FOLDER]]) {
        folderName = [_page getStringFromNode:[RWPAGE FOLDER]];
    }
    NSDate *datetimeNow = [[NSDate alloc] init];
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss-SSS"];
    NSString *filename = [NSString stringWithFormat:@"%@.png", [dateFormatter stringFromDate:datetimeNow]];
    NSString *applicationDocumentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *folderPath = [applicationDocumentsDir stringByAppendingPathComponent:folderName];
    NSString *filePath = [folderPath stringByAppendingPathComponent:filename];
    NSError *error = nil;
    if (![[NSFileManager defaultManager] fileExistsAtPath:folderPath isDirectory:nil]) {
        [[NSFileManager defaultManager] createDirectoryAtPath:folderPath withIntermediateDirectories:NO
                                                   attributes:nil error:&error];
    }
    if (error != nil) {
        NSLog(@"Create directory error: %@", error);
    }
    [UIImagePNGRepresentation(cameraImage) writeToFile:filePath options:NSDataWritingAtomic error:&error];
    if (error != nil) {
        NSLog(@"Error in saving image to disk. Error : %@", error);
    }
    RWXmlNode *nextPage = [_xml getPage:[_page getStringFromNode:[RWPAGE CHILD]]];
    nextPage = [nextPage deepClone];
    [nextPage addNodeWithName:[RWPAGE FILEPATH] value:filePath];
    [_app.navController pushViewWithPage:nextPage];
}

-(void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [picker dismissViewControllerAnimated:YES completion:NULL];
    [_app.navController popPage];
}
Edit:
To expand upon the above.
The base of the app is a Custom Container View Controller, acting mostly like a Navigation Controller. When a user navigates to a page (what I call the combination of a view and view controller) it is displayed on the Custom Container view, and the previous page is stored in a stack.
One of my pages calls upon a UIImagePicker. Once the image picker has been closed again, and I return to the app, problems appear across the app when I open new pages. I don't see problems on every page, but they are on several independent pages. Most pages look completely unaffected, while the problem pages appear to not obey their constraints.
If you want to compress an image, use:
+(UIImage *)resizeImage:(UIImage *)image newSize:(CGSize)newSize
{
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGImageRef imageRef = image.CGImage;

    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, newSize.height);
    CGContextConcatCTM(context, flipVertical);

    // Draw into the context; this scales the image
    CGContextDrawImage(context, newRect, imageRef);

    // Get the resized image from the context as a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(context);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];

    CGImageRelease(newImageRef);
    UIGraphicsEndImageContext();

    return newImage;
}
or
+(UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)targetSize {
    CGFloat scaleFactor = 1.0;
    if (image.size.width > targetSize.width || image.size.height > targetSize.height) {
        if (!((scaleFactor = (targetSize.width / image.size.width)) > (targetSize.height / image.size.height))) { // scale to fit width, or
            scaleFactor = targetSize.height / image.size.height; // scale to fit height
        }
    }
    UIGraphicsBeginImageContext(targetSize);
    CGRect rect = CGRectMake((targetSize.width - image.size.width * scaleFactor) / 2,
                             (targetSize.height - image.size.height * scaleFactor) / 2,
                             image.size.width * scaleFactor, image.size.height * scaleFactor);
    [image drawInRect:rect];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
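As a hypothetical usage sketch (the ImageUtils class name, the 100x100 target size, and the previewImageView outlet are assumptions, not part of the original answer), either helper can be applied to the picked photo before handing it to a UIImageView:
// Hypothetical usage after the picker returns; ImageUtils is an assumed class
// name for wherever the resize helpers above are defined.
UIImage *cameraImage = info[UIImagePickerControllerEditedImage];
UIImage *thumbnail = [ImageUtils resizeImage:cameraImage newSize:CGSizeMake(100, 100)];
self.previewImageView.image = thumbnail; // previewImageView is an assumed outlet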
Try it like this:
UIImage *thumbnail = [UIImage imageNamed:@"yourimage.png"];
CGSize itemSize = CGSizeMake(35, 35);
UIGraphicsBeginImageContext(itemSize);
CGRect imageRect = CGRectMake(0.0, 0.0, itemSize.width, itemSize.height);
[thumbnail drawInRect:imageRect];
// The image below now contains the compressed (resized) image
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

cocos2d Using camera to take picture and save that to file

I can use the camera to take a picture, but the saved image is only part of the photo (it has a white border). How do I get the full-size picture? I want a clear image. Thank you!
-(void)takePhoto {
    AppController *appdel = (AppController *)[[UIApplication sharedApplication] delegate];
    @try {
        uip = [[UIImagePickerController alloc] init];
        uip.sourceType = UIImagePickerControllerSourceTypeCamera;
        uip.allowsEditing = YES;
        uip.delegate = self;
    }
    @catch (NSException *e) {
        [uip release];
        uip = nil;
    }
    @finally {
        if (uip) {
            [appdel.navController presentModalViewController:uip animated:YES];
        }
    }
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    profileImage = [info objectForKey:UIImagePickerControllerCropRect];
    AppController *appdel = (AppController *)[[UIApplication sharedApplication] delegate];
    [appdel.navController dismissModalViewControllerAnimated:YES];
    [uip release];
    [NSThread detachNewThreadSelector:@selector(writeImgToPath:) toTarget:self withObject:profileImage];
}

-(void)writeImgToPath:(id)sender
{
    NSAutoreleasePool *pool = [NSAutoreleasePool new];
    UIImage *image = sender;
    NSArray *pathArr = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                           NSUserDomainMask,
                                                           YES);
    CGSize size;
    int currentProfileIndex = 1;
    NSString *path = [[pathArr objectAtIndex:0]
                      stringByAppendingPathComponent:[NSString stringWithFormat:@"Img_%d.png", currentProfileIndex]];
    size = CGSizeMake(1320, 480);
    UIGraphicsBeginImageContext(size);
    [image drawInRect:CGRectMake(0, 0, 1320, 480)];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *data = UIImagePNGRepresentation(image);
    [data writeToFile:path atomically:YES];
    NSLog(@"Saved.....");
    CGRect r = CGRectMake(0, 0, 1320, 480);
    UIGraphicsBeginImageContext(r.size);
    UIImage *img1;
    [image drawInRect:r];
    img1 = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(img1, nil, nil, nil);
    [pool release];
}
-(id)init
{
    if ((self = [super init])) {
        [self takePhoto];
    }
    return self;
}
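One hedged observation about the delegate method above: UIImagePickerControllerCropRect returns an NSValue wrapping a CGRect, not a UIImage, so writeImgToPath: never receives an actual photo. A minimal sketch of reading the picked photo itself from the info dictionary (reusing the profileImage variable already used above):
// Sketch: pull the edited photo if present, otherwise fall back to the original.
profileImage = [info objectForKey:UIImagePickerControllerEditedImage];
if (profileImage == nil) {
    profileImage = [info objectForKey:UIImagePickerControllerOriginalImage];
}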

How can I convert FBProfilePictureView to an UIImage?

Everything is working fine with FBProfilePictureView, but I need to get that picture from the FBProfilePictureView and turn it into a UIImage.
How should I do it?
I tried using this:
UIGraphicsBeginImageContext(self.profilePictureView.frame.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.TestPictureOutlet.image = viewImage;
But this doesn't work for my solution.
FBProfilePictureView is a UIView. This UIView contains a UIImageView, which holds your image, and you can get the UIImage from that UIImageView (here profilePictureView is an FBProfilePictureView):
UIImage *image = nil;
for (NSObject *obj in [profilePictureView subviews]) {
    if ([obj isMemberOfClass:[UIImageView class]]) {
        UIImageView *objImg = (UIImageView *)obj;
        image = objImg.image;
        break;
    }
}
EDIT: Here is another, quicker way to do the same thing:
__block UIImage *image = nil;
[self.view.subviews enumerateObjectsUsingBlock:^(NSObject *obj, NSUInteger idx, BOOL *stop) {
    if ([obj isMemberOfClass:[UIImageView class]]) {
        UIImageView *objImg = (UIImageView *)obj;
        image = objImg.image;
        *stop = YES;
    }
}];
Both of the solutions mentioned above work fine for getting the UIImage object out of an FBProfilePictureView. The only thing is, you need to add a small delay before grabbing the image from the FBProfilePictureView. Like this:
[[FBRequest requestForMe] startWithCompletionHandler:
 ^(FBRequestConnection *connection, NSDictionary<FBGraphUser> *user, NSError *error) {
     if (!error) {
         myNameLbl.text = user.name;
         profileDP.profileID = user.id;
         // NOTE THIS LINE WHICH DOES THE MAGIC
         [self performSelector:@selector(getUserImageFromFBView) withObject:nil afterDelay:1.0];
     }
 }];
- (void)getUserImageFromFBView {
    UIImage *img = nil;
    // 1 - Solution to get UIImage obj
    for (NSObject *obj in [profileDP subviews]) {
        if ([obj isMemberOfClass:[UIImageView class]]) {
            UIImageView *objImg = (UIImageView *)obj;
            img = objImg.image;
            break;
        }
    }
    // 2 - Solution to get UIImage obj
    // UIGraphicsBeginImageContext(profileDP.frame.size);
    // [profileDP.layer renderInContext:UIGraphicsGetCurrentContext()];
    // img = UIGraphicsGetImageFromCurrentImageContext();
    // UIGraphicsEndImageContext();

    // Here I'm setting the image and it works 100% for me.
    testImgv.image = img;
}
Here is the solution. Steps: make sure that you have assigned the FB user ID to an object of class FBProfilePictureView; in my case this object is userPictureImageView.
-(void)saveFBUserImage
{
    CGSize imageSize = self.userPictureImageView.frame.size;
    UIGraphicsBeginImageContext(imageSize);
    CGContextRef imageContext = UIGraphicsGetCurrentContext();
    [self.userPictureImageView.layer renderInContext:imageContext];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *imageData = UIImageJPEGRepresentation(viewImage, 1);
    UIImage *img = [UIImage imageWithData:imageData];
    NSString *filePath = <specify your path here>;

    CGSize size = img.size;
    if (size.height > 50)
        size.height = 50;
    if (size.width > 50)
        size.width = 50;
    CGRect rect = CGRectMake(0.0f, 0.0f, size.width, size.height);
    CGSize size2 = rect.size;

    UIGraphicsBeginImageContext(size2);
    [img drawInRect:rect];
    img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *newImageData = UIImageJPEGRepresentation(img, 1.0);
    [newImageData writeToFile:filePath atomically:YES];
}
That's it. :-)
I have found the above solutions to work, but here is updated code that I found easier to use.
@property FBSDKProfilePictureView *pictureView;

if ([FBSDKAccessToken currentAccessToken]) {
    self.pictureView = [[FBSDKProfilePictureView alloc] init];
    [self.pictureView setProfileID:@"me"];
    [self.pictureView setPreservesSuperviewLayoutMargins:YES];
    [self.pictureView setPictureMode:FBSDKProfilePictureModeNormal];
    [self.pictureView setNeedsImageUpdate];
    [self performSelector:@selector(getUserImageFromFBView) withObject:nil afterDelay:1.0];
}
-(void)getUserImageFromFBView
{
    UIImage *image = nil;
    for (NSObject *obj in [self.pictureView subviews]) {
        if ([obj isMemberOfClass:[UIImageView class]]) {
            UIImageView *objImg = (UIImageView *)obj;
            image = objImg.image;
            break;
        }
    }
    [self.profilePic setImage:image forState:UIControlStateNormal];
}
Hope this helps. Here I have added a 1-second delay to wait for the profile picture to load.
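As a hedged alternative to the performSelector: delay, the same wait can be expressed with dispatch_after; the one-second interval is still just a guess at how long the profile picture takes to load:
// Sketch: GCD-based delay instead of performSelector:withObject:afterDelay:.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    [self getUserImageFromFBView];
});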
