Everything is working fine with FBProfilePictureView, but I need to get the picture out of the FBProfilePictureView and turn it into a UIImage.
How should I do that?
I tried using this:
UIGraphicsBeginImageContext(self.profilePictureView.frame.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.TestPictureOutlet.image = viewImage;
But this doesn't work for me.
FBProfilePictureView is a UIView that contains a UIImageView; that inner UIImageView holds your image, so you can read the UIImage from it. In the snippet below, profilePictureView is an FBProfilePictureView:
UIImage *image = nil;
for (NSObject *obj in [profilePictureView subviews]) {
    if ([obj isMemberOfClass:[UIImageView class]]) {
        UIImageView *objImg = (UIImageView *)obj;
        image = objImg.image;
        break;
    }
}
EDIT: here is a shorter way that does the same thing:
__block UIImage *image = nil;
[profilePictureView.subviews enumerateObjectsUsingBlock:^(NSObject *obj, NSUInteger idx, BOOL *stop) {
    if ([obj isMemberOfClass:[UIImageView class]]) {
        UIImageView *objImg = (UIImageView *)obj;
        image = objImg.image;
        *stop = YES;
    }
}];
Both of the solutions mentioned above work fine for getting a UIImage out of an FBProfilePictureView.
The only thing is that you need to wait a short while before reading the image from the FBProfilePictureView.
Like this:
[[FBRequest requestForMe] startWithCompletionHandler:
 ^(FBRequestConnection *connection, NSDictionary<FBGraphUser> *user, NSError *error) {
     if (!error) {
         myNameLbl.text = user.name;
         profileDP.profileID = user.id;
         // NOTE THIS LINE WHICH DOES THE MAGIC
         [self performSelector:@selector(getUserImageFromFBView) withObject:nil afterDelay:1.0];
     }
 }];
- (void)getUserImageFromFBView {
    UIImage *img = nil;

    // 1 - Solution to get UIImage obj
    for (NSObject *obj in [profileDP subviews]) {
        if ([obj isMemberOfClass:[UIImageView class]]) {
            UIImageView *objImg = (UIImageView *)obj;
            img = objImg.image;
            break;
        }
    }

    // 2 - Solution to get UIImage obj
    // UIGraphicsBeginImageContext(profileDP.frame.size);
    // [profileDP.layer renderInContext:UIGraphicsGetCurrentContext()];
    // img = UIGraphicsGetImageFromCurrentImageContext();
    // UIGraphicsEndImageContext();

    // Here I'm setting the image and it works 100% for me.
    testImgv.image = img;
}
Here is the solution.
Steps: Make sure that you have assigned the FB user ID to your FBProfilePictureView object; in my case this object is called "userPictureImageView". Then add the following method:
-(void)saveFBUserImage
{
    CGSize imageSize = self.userPictureImageView.frame.size;
    UIGraphicsBeginImageContext(imageSize);
    CGContextRef imageContext = UIGraphicsGetCurrentContext();
    [self.userPictureImageView.layer renderInContext:imageContext];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *imageData = UIImageJPEGRepresentation(viewImage, 1);
    UIImage *img = [UIImage imageWithData:imageData];

    NSString *filePath = <specify your path here>;

    // Clamp the image to at most 50 x 50 points before saving
    CGSize size = img.size;
    if (size.height > 50)
        size.height = 50;
    if (size.width > 50)
        size.width = 50;

    CGRect rect = CGRectMake(0.0f, 0.0f, size.width, size.height);
    CGSize size2 = rect.size;
    UIGraphicsBeginImageContext(size2);
    [img drawInRect:rect];
    img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *newImageData = UIImageJPEGRepresentation(img, 1.0);
    [newImageData writeToFile:filePath atomically:YES];
}
That's it. :-)
I found the above solutions to work, but here is updated code (for the newer FBSDK) that I find simpler.
@property FBSDKProfilePictureView *pictureView;

if ([FBSDKAccessToken currentAccessToken]) {
    self.pictureView = [[FBSDKProfilePictureView alloc] init];
    [self.pictureView setProfileID:@"me"];
    [self.pictureView setPreservesSuperviewLayoutMargins:YES];
    [self.pictureView setPictureMode:FBSDKProfilePictureModeNormal];
    [self.pictureView setNeedsImageUpdate];
    [self performSelector:@selector(getUserImageFromFBView) withObject:nil afterDelay:1.0];
}
-(void)getUserImageFromFBView
{
    UIImage *image = nil;
    for (NSObject *obj in [self.pictureView subviews]) {
        if ([obj isMemberOfClass:[UIImageView class]]) {
            UIImageView *objImg = (UIImageView *)obj;
            image = objImg.image;
            break;
        }
    }
    [self.profilePic setImage:image forState:UIControlStateNormal];
}
Hope this helps. I have put in a 1-second delay to wait for the profile picture to load; if you would rather not rely on one fixed delay, see the sketch below.
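For reference, here is a minimal sketch of a retry-based variant (the helper name and the 0.2-second interval are arbitrary choices, not part of the SDK) that keeps polling the same subview lookup until the inner UIImageView has an image:
// Hypothetical helper: retries the subview lookup until the picture has loaded.
- (void)tryToGetUserImageFromFBView
{
    UIImage *image = nil;
    for (NSObject *obj in [self.pictureView subviews]) {
        if ([obj isMemberOfClass:[UIImageView class]]) {
            image = ((UIImageView *)obj).image;
            break;
        }
    }

    if (image) {
        [self.profilePic setImage:image forState:UIControlStateNormal];
    } else {
        // Not loaded yet; check again shortly instead of assuming a single fixed delay is enough.
        [self performSelector:@selector(tryToGetUserImageFromFBView) withObject:nil afterDelay:0.2];
    }
}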
Earlier I used the SDWebImage third-party library to display images, but I have now removed it for other reasons. Is there any other way to display a GIF image from a URL in a UIImageView without using a third-party library?
You can do it simply by following these steps. First, add the following functions:
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>
UIImage *GIFFromData(NSData *data) {
    if (!data) {
        return nil;
    }

    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)(data), NULL);
    UIImage *image = nil;
    if (GIFSourceContainsAnimatedGIF(source)) {
        image = GIFFromImageSource(source);
    } else {
        image = [UIImage imageWithData:data];
    }
    if (source) {
        CFRelease(source);
    }
    return image;
}

BOOL GIFSourceContainsAnimatedGIF(CGImageSourceRef source) {
    return (source && UTTypeConformsTo(CGImageSourceGetType(source), kUTTypeGIF) && CGImageSourceGetCount(source) > 1);
}

UIImage *GIFFromImageSource(CGImageSourceRef source) {
    CFRetain(source);
    NSUInteger numberOfFrames = CGImageSourceGetCount(source);
    NSMutableArray<UIImage *> *images = [NSMutableArray arrayWithCapacity:numberOfFrames];
    NSTimeInterval duration = 0.0;

    for (NSUInteger i = 0; i < numberOfFrames; ++i) {
        CGImageRef image = CGImageSourceCreateImageAtIndex(source, i, NULL);
        if (image) {
            UIImage *frameImage = [UIImage imageWithCGImage:image scale:1.0 orientation:UIImageOrientationUp];
            [images addObject:frameImage];
            CFRelease(image);
        } else {
            continue;
        }
        duration += GIFSourceGetFrameDelay(source, i);
    }

    CFRelease(source);
    return [UIImage animatedImageWithImages:images duration:duration];
}

NSTimeInterval GIFSourceGetFrameDelay(CGImageSourceRef source, NSUInteger index) {
    NSTimeInterval frameDelay = 0;
    CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(source, index, NULL);
    if (!imageProperties) {
        return frameDelay;
    }

    CFDictionaryRef gifProperties = nil;
    if (CFDictionaryGetValueIfPresent(imageProperties, kCGImagePropertyGIFDictionary, (const void **)&gifProperties)) {
        const void *durationValue = nil;
        if (CFDictionaryGetValueIfPresent(gifProperties, kCGImagePropertyGIFUnclampedDelayTime, &durationValue)) {
            frameDelay = [(__bridge NSNumber *)durationValue doubleValue];
            if (frameDelay <= 0) {
                if (CFDictionaryGetValueIfPresent(gifProperties, kCGImagePropertyGIFDelayTime, &durationValue)) {
                    frameDelay = [(__bridge NSNumber *)durationValue doubleValue];
                }
            }
        }
    }

    CFRelease(imageProperties);
    return frameDelay;
}
Next, download your GIF with NSURLSession (a minimal sketch of this step follows the display snippet below).
Then display it:
UIImage *image = GIFFromData(data);
UIImageView *view = [[UIImageView alloc] initWithImage:image];
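And here is a sketch of the download step; the URL is a placeholder, and the UI work is dispatched back to the main queue because NSURLSession completion handlers run on a background queue:
NSURL *gifURL = [NSURL URLWithString:@"https://example.com/animation.gif"]; // placeholder URL
NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithURL:gifURL
                                                          completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (error || !data) {
        NSLog(@"GIF download failed: %@", error);
        return;
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        UIImage *image = GIFFromData(data);
        UIImageView *view = [[UIImageView alloc] initWithImage:image];
        // add `view` to your view hierarchy here
    });
}];
[task resume];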
Hope this helps :)
UIImageView *animatedImageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
animatedImageView.animationImages = [NSArray arrayWithObjects:
                                        [UIImage imageNamed:@"image1.gif"],
                                        [UIImage imageNamed:@"image2.gif"],
                                        [UIImage imageNamed:@"image3.gif"],
                                        [UIImage imageNamed:@"image4.gif"], nil];
animatedImageView.animationDuration = 1.0f;
animatedImageView.animationRepeatCount = 0;
[animatedImageView startAnimating];
[self.view addSubview:animatedImageView];
See also: Add animated GIF image in iPhone UIImageView.
I want to develop a small jigsaw-puzzle game but am having a problem when combining the image pieces. I can split an image, but I cannot combine the pieces as required. Here is what I am doing.
For cropping:
[customImageView setImage:[self cropImage:self.mainImage withRect:mCropFrame]];
- (UIImage *)cropImage:(UIImage *)originalImage withRect:(CGRect)rect
{
    CGImageRef croppedRef = CGImageCreateWithImageInRect([originalImage CGImage], rect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
    CGImageRelease(croppedRef); // CGImageCreateWithImageInRect follows the Create rule, so release it
    return cropped;
}
For Clipping:
[self setClippingPath:[pieceBezierPathsMutArray_ objectAtIndex:i]:view];
- (UIImageView *)setClippingPath:(UIBezierPath *)clippingPath :(UIImageView *)imgView
{
    if (![[imgView layer] mask])
    {
        [[imgView layer] setMask:[CAShapeLayer layer]];
    }
    [(CAShapeLayer *)[[imgView layer] mask] setPath:[clippingPath CGPath]];
    return imgView;
}
For Combining:
- (id)initByCombining:(id)oneView andOther:(id)twoView withRegularSize:(CGSize)pieceSize
{
    CustomImageView *one = oneView; //[oneView copy];
    CustomImageView *two = twoView;

    CGPoint onepoint, twopoint;
    if (one.frame.origin.x < two.frame.origin.x)
    {
        onepoint.x = 0;
        twopoint.x = onepoint.x + one.frame.size.width;
    }
    else
    {
        onepoint.x = onepoint.x + one.frame.size.width;
        twopoint.x = 0;
    }
    if (one.frame.origin.y < two.frame.origin.y)
    {
        onepoint.y = 0;
        twopoint.y = 0;
    }
    else
    {
        onepoint.y = 0;
        twopoint.y = 0;
    }

    CGRect frame;
    frame.origin = CGPointZero;
    frame.size.width = onepoint.x + one.frame.size.width + two.frame.size.width;
    frame.size.height = MAX(one.frame.size.height, two.frame.size.height);

    if (self = [self initWithFrame:frame])
    {
        UIGraphicsPushContext(UIGraphicsGetCurrentContext());
        UIGraphicsBeginImageContext(frame.size);
        [one.image drawAtPoint:onepoint];
        [two.image drawAtPoint:twopoint];
        [self.layer renderInContext:UIGraphicsGetCurrentContext()];
        self.image = UIGraphicsGetImageFromCurrentImageContext();
        self.backgroundColor = [UIColor redColor];
        UIGraphicsEndImageContext();
        UIGraphicsPopContext();

        self.center = one.center;
        self.transform = CGAffineTransformScale(incomingTransform, 0.5, 0.5);
        self.previousRotation = self.transform;
    }
    return self;
}
My initial image is this:
After cropping and clipping it becomes like this:
It should look like this after combining.
But it is becoming like this.
When you want to combine the images, I would suggest placing the clippings inside another UIView, so that the UIView becomes the superview of all the placed clippings. After the images have been placed, you can do something like this:
UIGraphicsBeginImageContextWithOptions(superView.bounds.size, superView.opaque, 0.0);
[superView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * CombinedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return CombinedImage;
Then just save it as follows:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[CombinedImage CGImage]
                          orientation:(ALAssetOrientation)[CombinedImage imageOrientation]
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        // TODO: error handling
        UIAlertView *al = [[UIAlertView alloc] initWithTitle:@"" message:@"Error saving image, please try again" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil, nil];
        [al show];
    } else {
        NSData *imageData = UIImagePNGRepresentation(CombinedImage);
        UIImage *finalImage = [UIImage imageWithData:imageData];
    }
}];
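If you do not need the asset URL back, a simpler option (shown here as a sketch; the nil/NULL arguments skip the completion callback) is UIKit's one-line save:
// Saves the composite image to the Saved Photos album without ALAssetsLibrary.
UIImageWriteToSavedPhotosAlbum(CombinedImage, nil, NULL, NULL);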
Hope this helps.
I tried implementing this answer: https://stackoverflow.com/a/22716610 for the problem of adding overlays to an MKMapSnapshotter in iOS 7 (where the renderInContext method cannot be used). I did this as shown below, but the image returned contains only the map, with no overlays. Forgive me, I am quite new to this. Thanks.
- (void)mapViewDidFinishRenderingMap:(MKMapView *)mapView fullyRendered:(BOOL)fullyRendered
{
    if (mapView.tag == 100) {
        MKMapSnapshotOptions *options = [[MKMapSnapshotOptions alloc] init];
        options.region = mapView.region;
        options.size = mapView.frame.size;
        options.scale = [[UIScreen mainScreen] scale];

        MKMapSnapshotter *snapshotter = [[MKMapSnapshotter alloc] initWithOptions:options];
        [snapshotter startWithCompletionHandler:^(MKMapSnapshot *snapshot, NSError *error) {
            if (error) {
                NSLog(@"[Error] %@", error);
                return;
            }

            UIImage *image = snapshot.image;
            UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
            {
                [image drawAtPoint:CGPointMake(0, 0)];

                CGContextRef context = UIGraphicsGetCurrentContext();
                CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
                CGContextSetLineWidth(context, 5.0f);
                CGContextBeginPath(context);

                bool first = YES;
                NSArray *overlays = mapView.overlays;
                for (id <MKOverlay> overlay in overlays) {
                    CGPoint point = [snapshot pointForCoordinate:overlay.coordinate];
                    if (first)
                    {
                        first = NO;
                        CGContextMoveToPoint(context, point.x, point.y);
                    }
                    else {
                        CGContextAddLineToPoint(context, point.x, point.y);
                    }
                }

                UIImage *compositeImage = UIGraphicsGetImageFromCurrentImageContext();
                NSData *data = UIImagePNGRepresentation(compositeImage);
                placeToSave = data;
                NSLog(@"MapView Snapshot Saved.");

                // show image for debugging
                UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 200, 320, 320)];
                imageView.image = compositeImage;
                [self.view addSubview:imageView];
            }
            UIGraphicsEndImageContext();
        }];

        [mapView setHidden:YES];
    }
}
I have a problem capturing the full screen of an iCarousel: it captures only the currently visible index of the carousel.
UIGraphicsBeginImageContext(caputureView.bounds.size);
[caputureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Try something like this:
- (void)getFullScreenScreenShot
{
    AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
    UIView *superView = appDelegate.viewController.view;
    CGRect fullScreenFrame = superView.frame;

    UIGraphicsBeginImageContextWithOptions(fullScreenFrame.size, YES, 0.0f);
    CGContextTranslateCTM(UIGraphicsGetCurrentContext(), 0.0f, 0.0f);
    [superView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImageView *screenShot = [[UIImageView alloc] initWithImage:UIGraphicsGetImageFromCurrentImageContext()];
    UIGraphicsEndImageContext();

    NSData *imageData = UIImageJPEGRepresentation(screenShot.image, 1.0);
    // stringByAppendingPathComponent adds the path separator correctly
    NSString *previewFileNamePath = [[CPFileManager documentsPath] stringByAppendingPathComponent:@"image.jpg"];
    if ([imageData writeToFile:previewFileNamePath atomically:NO])
    {
        NSLog(@"See filename: %@", previewFileNamePath);
    }
    else
    {
        NSLog(@"Error: %@", previewFileNamePath);
    }
}
I have a class variable imageView of class UIImageView and a button.
The array of images is loaded from the gallery using the Assets Library.
When the loading of all images is done, I show the first image:
- (void)showImage
{
    if ([images count] > 0)
        [self.imageView setImage:[images objectAtIndex:0]];
    currentPage = 0;
}
Everything is good so far, but when I tap the button with the following action:
- (IBAction)buttonNext:(id)sender {
    NSLog(@"images array %@", images);
    if (currentPage < [images count] - 1)
    {
        NSLog(@"next currentPage %d %lu %@", currentPage, (unsigned long)[images count], [images objectAtIndex:currentPage]);
        currentPage++;
        [self.imageView setImage:[images objectAtIndex:currentPage]];
    }
}
no image replacement takes place. The imageView is stuck on the previous image.
I debugged the array of images:
images array (
"<UIImage: 0xb38cec0>",
"<UIImage: 0xb0f8280>",
"<UIImage: 0xb063360>",
"<UIImage: 0xb0660a0>"
)
EDIT
The code which loads the images:
- (void)loadImage:(NSString *)url
{
    ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        if (iref) {
            UIImage *largeImage = [UIImage imageWithCGImage:iref];
            CGSize size = CGSizeMake(400, 500); // set the width and height
            UIGraphicsBeginImageContext(size);
            [largeImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
            UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
            // here is the scaled image which has been changed to the size specified
            UIGraphicsEndImageContext();

            [images addObject:newImage];
            if ([images count] == [pages count])
                [self showImage];
        }
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Can't get image - %@", [myerror localizedDescription]);
    };

    NSURL *asseturl = [NSURL URLWithString:url];
    [assetslibrary assetForURL:asseturl
                   resultBlock:resultblock
                  failureBlock:failureblock];
}
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    for (NSDictionary *a in pages)
    {
        if ([a objectForKey:@"iospath"] != nil)
            [self loadImage:[document objectForKey:@"iospath"]];
    }
}
pages is an array loaded in viewDidLoad. The images are there. Why can I not show them?