Display User Profile Picture from Salesforce in iOS App - ios

I am trying to display the logged-in user's profile picture using the following:
NSLog(@"url is : %@", [SFAccountManager sharedInstance].idData.pictureUrl);
profilePicData = [NSData dataWithContentsOfURL:[SFAccountManager sharedInstance].idData.pictureUrl];
if (profilePicData)
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *filePath = [NSString stringWithFormat:@"%@/%@", documentsDirectory, @"filename.jpg"];
    NSLog(@"pic path: %@", filePath);
    [profilePicData writeToFile:filePath atomically:YES];
}
NSLog(@"pic data: %@", profilePicData);
NSLog(@"%@", [NSData dataWithContentsOfURL:[SFAccountManager sharedInstance].idData.pictureUrl]); shows some data, but the picture is not displayed in the UIImageView.
Any help would be appreciated.

I was attempting to do exactly the same thing with the Chatter API - I wanted to load the image from feed data under "smallPhotoUrl". What I discovered is that an active token has to be appended to the end of the URL.
This source introduced me to the concept: Accessing Chatter user pics
And I finally found some clear information on how to access that token here: salesforce_platform_mobile_services.pdf, page 262
So eventually I did this:
NSString *builtImageUrlWithToken = [NSString stringWithFormat:@"%@?oauth_token=%@",
    phototoUrl, [SFAccountManager sharedInstance].credentials.accessToken];
NSURL *imageURL = [NSURL URLWithString:builtImageUrlWithToken];
NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
I think it's relevant to this topic, as I came across it while looking for this particular solution. I am sure it could be useful for others as well.
Thanks!

First, Apple's docs say not to use dataWithContentsOfURL: for network calls, because it is synchronous.
So you can do something like this instead to get the image:
SFIdentityData *idData = [SFAccountManager sharedInstance].idData;
if (idData != nil) {
    SFRestRequest *request = [[SFRestRequest alloc] init];
    request.endpoint = [idData.pictureUrl absoluteString];
    request.method = SFRestMethodGET;
    request.path = @"";
    [[SFRestAPI sharedInstance] send:request delegate:_delegate];
}
(This is async. You could use blocks instead of delegation.)
In your delegate you can (for example) store the image in Core Data, or turn the NSData into a UIImage like this:
[[UIImage alloc] initWithData:picData];
You do also need to make sure that your UIImage is properly wired up, e.g. one way would be through a UIImageView that's an IBOutlet.
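For completeness, here is a minimal sketch of what the delegate side might look like, assuming the request:didLoadResponse: callback shown elsewhere in this thread; profileImageView is a hypothetical IBOutlet, so adapt the name to your own UI:

```objective-c
// Sketch only: SFRestDelegate callback receiving the picture bytes.
// `profileImageView` is a hypothetical UIImageView outlet.
- (void)request:(SFRestRequest *)request didLoadResponse:(id)dataResponse {
    NSData *picData = (NSData *)dataResponse;
    UIImage *picture = [[UIImage alloc] initWithData:picData];
    // UIKit must only be touched on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.profileImageView.image = picture;
    });
}
```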

Call the Salesforce REST API to get the user information:
NSString *path = @"/services/data/v29.0/chatter/users/me";
SFRestMethod method = SFRestMethodGET;
SFRestRequest *request = [SFRestRequest requestWithMethod:method path:path queryParams:queryParams];
[[SFRestAPI sharedInstance] send:request delegate:self];
This will return a JSON response in:
- (void)request:(SFRestRequest *)request didLoadResponse:(id)dataResponse { … }
Parse the photo dictionary out of it and use fullEmailPhotoUrl to load/save images.
Using largePhotoUrl or smallPhotoUrl did not load the picture for me.
E.g. of the photo dictionary parsed:
photo = {
fullEmailPhotoUrl = "https://na14.salesforce.com/ncsphoto/<SOMEIDENTIFIERS>";
largePhotoUrl = "https://c.na14.content.force.com/profilephoto/<SOMEIDENTIFIERS>/F";
photoVersionId = <PHOTOID>;
smallPhotoUrl = "https://c.na14.content.force.com/profilephoto/<SOMEIDENTIFIERS>/T";
standardEmailPhotoUrl = "https://na14.salesforce.com/ncsphoto/<SOMEIDENTIFIERS>";
url = "/services/data/v29.0/chatter/users/<SOMEIDENTIFIERS>/photo";
};
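Putting the pieces together, a hedged sketch of parsing that dictionary in the delegate callback (key names taken from the sample output above; profileImg is a hypothetical UIImageView outlet, and the download is moved off the main thread to avoid blocking it):

```objective-c
// Sketch: extract fullEmailPhotoUrl from the /chatter/users/me response.
- (void)request:(SFRestRequest *)request didLoadResponse:(id)dataResponse {
    NSDictionary *user = (NSDictionary *)dataResponse;
    NSString *photoUrlString = user[@"photo"][@"fullEmailPhotoUrl"];
    NSURL *photoUrl = [NSURL URLWithString:photoUrlString];
    // Download off the main thread, then hop back for the UI update.
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
        NSData *imageData = [NSData dataWithContentsOfURL:photoUrl];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.profileImg.image = [UIImage imageWithData:imageData]; // hypothetical outlet
        });
    });
}
```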

In Swift 3, using Alamofire:
if let url = URL(string: baseURl + "/sobjects/Attachment/\(imageID)/Body") {
    let header = ["Authorization": "Your Auth Token"]
    Alamofire.request(url, method: .get, parameters: nil, encoding: JSONEncoding.default, headers: header).responseData { (response) in
        if response.error == nil {
            // Show the downloaded image:
            if let data = response.data {
                self.profileImg.image = UIImage(data: data)
            }
        }
    }
}

Related

How to use iOS security scoped bookmarks across devices?

On iOS (14/15) I'm trying to pass security-scoped bookmarks to user-picked files on iCloud Drive between devices, but whatever I try, I cannot get the URLs to be restored on another device running the same app.
The app is a UIDocument-based app; the code below is in a UIViewController that displays the document. The document is created like so:
Document* document = [[Document alloc] initWithFileURL:documentURL];
and then passed on to the view controller.
The URL that's going to be bookmarked is picked using a plain UIDocumentPickerViewController.
This is how I create a security-scoped bookmark:
// Toggle this to create either a document-scoped url or an app-scoped url
static BOOL DOC_SCOPE = NO;

- (void)documentPicker:(UIDocumentPickerViewController *)controller didPickDocumentsAtURLs:(NSArray<NSURL *> *)urls {
    if (urls.count > 0) {
        NSURL *url = urls.firstObject;
        NSURL *docURL = self.document.fileURL;
        BOOL closeSource = [docURL startAccessingSecurityScopedResource];
        BOOL doClose = [url startAccessingSecurityScopedResource];
        NSData *bookmark = [url bookmarkDataWithOptions:0 includingResourceValuesForKeys:nil relativeToURL:DOC_SCOPE ? docURL : nil error:nil];
        if (doClose)
            [url stopAccessingSecurityScopedResource];
        if (closeSource)
            [docURL stopAccessingSecurityScopedResource];
        NSString *encoded = [NSString stringWithFormat:@"\n[%@]\n", [bookmark base64EncodedStringWithOptions:0]];
At this point I insert the encoded bookmark data in the document data and save the document.
When opening the linked document the bookmark and url are restored like so:
- (void)openLink:(NSString *)encoded {
    NSData *bookmark = [[NSData alloc] initWithBase64EncodedString:encoded options:0];
    NSURL *docURL = self.document.fileURL;
    BOOL closeSource = [docURL startAccessingSecurityScopedResource];
    NSError *error = nil;
    NSURL *url = [NSURL URLByResolvingBookmarkData:bookmark options:0 relativeToURL:DOC_SCOPE ? docURL : nil bookmarkDataIsStale:nil error:&error];
    if (error != nil)
        NSLog(@"%@", error.localizedFailureReason);
    if (url != nil) {
        BOOL doClose = [url startAccessingSecurityScopedResource];
        // here use the url to access the linked file
        if (doClose)
            [url stopAccessingSecurityScopedResource];
    }
    if (closeSource)
        [docURL stopAccessingSecurityScopedResource];
}
To the project's entitlements I have added
com.apple.security.files.bookmarks.app-scope = 1
com.apple.security.files.bookmarks.document-scope = 1
On the same device I can restore the bookmark data and get access to the restored URL OK, but when opening the same file on another device, [NSURL URLByResolvingBookmarkData:...] always sets an error:
Error Domain=NSCocoaErrorDomain Code=257 "The file couldn’t be opened because you don’t have permission to view it."
This is the case for both app-scoped bookmarks and document-scoped bookmarks.
Any idea what's missing / how to get this working?

Firebase observeEventType JSQMessagesCollectionView avatar

I want to show users' avatar images within a conversation. I used JSQMessagesViewController, so the function below is where this should happen. However, observeEventType: runs asynchronously and its block fires after this function has already returned, so MyuserImage (or OtheruserImage) is nil at the point of return and the app crashes. How can I get the photo URL of different users from Firebase and then return the expected avatar image? Thank you!
- (id<JSQMessageAvatarImageDataSource>)collectionView:(JSQMessagesCollectionView *)collectionView avatarImageDataForItemAtIndexPath:(NSIndexPath *)indexPath
{
    JSQMessage *message = [self.msgArray objectAtIndex:indexPath.item];
    if ([message.senderId isEqualToString:self.senderId]) {
        NSString *MyuserId = [FIRAuth auth].currentUser.uid;
        __block NSString *MyuserImage;
        NSLog(@"uid is : %@", MyuserId);
        [[_photoRef child:@"users"] observeEventType:FIRDataEventTypeValue withBlock:^(FIRDataSnapshot *snapshot) {
            NSLog(@"My key is : %@", snapshot.key);
            if ([snapshot.key isEqualToString:MyuserId]) {
                NSLog(@"snapshot value is : %@", snapshot.value);
                MyuserImage = snapshot.value[@"photo"];
            }
        }];
        NSURL *url = [NSURL URLWithString:MyuserImage];
        NSData *data = [NSData dataWithContentsOfURL:url];
        self.myuserImage = [[UIImage alloc] initWithData:data];
        return [JSQMessagesAvatarImageFactory avatarImageWithImage:self.myuserImage diameter:15];
    }
    else {
        NSString *OtheruserId = message.senderId;
        __block NSString *OtheruserImage;
        NSLog(@"other userId is: %@", OtheruserId);
        [[_photoRef child:@"users"] observeEventType:FIRDataEventTypeValue withBlock:^(FIRDataSnapshot *snapshot) {
            NSLog(@"other user's key is: %@", snapshot.key);
            if ([snapshot.key isEqualToString:OtheruserId]) {
                NSLog(@"snapshot value is: %@", snapshot.value);
                OtheruserImage = snapshot.value[@"photo"];
            }
        }];
        NSURL *url = [NSURL URLWithString:OtheruserImage];
        NSData *data = [NSData dataWithContentsOfURL:url];
        self.otheruserImage = [[UIImage alloc] initWithData:data];
        return [JSQMessagesAvatarImageFactory avatarImageWithImage:self.otheruserImage diameter:15];
    }
}
From what I can tell, you need to pull this data from Firebase before this method gets called - for example, create a struct that matches your data structure and then read the avatar URL from your data source, which would presumably be an array of those structs.
An approach I'd also recommend exploring is a library like SDWebImage, which will manage the async call for you and let you show a default avatar while the network request and image rendering happen. But you will still need the URL fetched from your database before this method gets called.
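As a rough illustration of the SDWebImage suggestion (generic UIImageView usage, not JSQMessages-specific; photoUrlString and avatarImageView are hypothetical names, while the category header and sd_setImageWithURL:placeholderImage: are SDWebImage's standard API):

```objective-c
#import <SDWebImage/UIImageView+WebCache.h>

// Once the avatar URL has been fetched from Firebase and stored on the
// message model, the image view can be populated asynchronously, with a
// placeholder shown while the download runs:
[avatarImageView sd_setImageWithURL:[NSURL URLWithString:photoUrlString]
                   placeholderImage:[UIImage imageNamed:@"default_avatar"]];
```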

Second API call is slowing down app - ios

I am using a places API to search for nearby places. I make one call to get the places, and another call to get the phone number of each place. The second call is slowing down the app. Is there any way around this? Sample code would be great as well.
s1 = [NSString stringWithFormat:@"https://api.foursquare.com/v2/venues/explore?client_id=%@&client_secret=%@&query=%@&v=20201212&m=swarm&sortByDistance=%i&radius=%f&limit=%@&ll=%f,%f", kClientID, kClientSecret, Name, sortByDistance, meterRadius, recorddisplay, lat, lng];
NSLog(@"This is the foursquare query: %@", s1);
NSURL *jsonURL = [NSURL URLWithString:[self urlEncodeValue:s1]];
NSString *jsonDataString = [[NSString alloc] initWithContentsOfURL:jsonURL];
NSData *jsonData = [jsonDataString dataUsingEncoding:NSUTF8StringEncoding];
//NSLog(@"This is JSON data: %@", jsonDataString);
if (jsonData == nil)
{
    NSLog(@"SEARCH RESULT IS NIL.....");
    //[pool release];
    return FALSE;
}
else
{
    //retrieve the objects in the JSON and then make another http request...
}
This line is wrong:
NSString *jsonDataString = [[NSString alloc] initWithContentsOfURL:jsonURL];
You are networking synchronously on the main thread. Never, never, never do that. That's the cause of the delay.
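A minimal sketch of the asynchronous alternative using NSURLSession, assuming the same s1 URL string and urlEncodeValue: helper as in the question:

```objective-c
// Sketch: fetch the JSON off the main thread with NSURLSession.
NSURL *jsonURL = [NSURL URLWithString:[self urlEncodeValue:s1]];
NSURLSessionDataTask *task = [[NSURLSession sharedSession]
    dataTaskWithURL:jsonURL
  completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
      if (data == nil) {
          NSLog(@"SEARCH RESULT IS NIL: %@", error);
          return;
      }
      // Parse the JSON here, then kick off the second (phone-number)
      // request the same way; neither call blocks the main thread.
      dispatch_async(dispatch_get_main_queue(), ^{
          // Update the UI on the main thread.
      });
  }];
[task resume];
```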

Graph API with App Token for Page not working

I'm developing an app with both Login with Facebook and login with username and password.
Now I want to find all the events of a public Facebook page for both types of user (Facebook and normal).
The problem is that the Facebook user can retrieve the data, but the "normal" user cannot, because the data is nil.
The steps are :
1 - Build this URL with the correct credentials of my Facebook app:
https://graph.facebook.com/oauth/access_token?client_id=APP_ID&client_secret=APP_SECRET&grant_type=client_credentials
2 - Put the URL in my browser and retrieve the app token, in the format:
53682XXXXXXXXXX|w6F3Ic6L48XXXXXXXXXXXXXXXXX
3 - Use this piece of code :
NSString *token = [[[FBSession activeSession] accessTokenData] accessToken];
NSString *urlString;
if (!userWithFb) {
    NSString *token = @"53682XXXXXXXXXX|w6F3Ic6L48XXXXXXXXXXXXXXXXX";
    urlString = [NSString stringWithFormat:@"https://graph.facebook.com/%@/events?access_token=%@", pageId, token];
} else {
    urlString = [NSString stringWithFormat:@"https://graph.facebook.com/%@/events?access_token=%@", pageId, token];
}
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:urlString]];
if (data != nil)
    [self performSelectorOnMainThread:@selector(fetchedData:)
                           withObject:data waitUntilDone:YES];
BUT the data for the normal user is nil. When I put urlString in my browser I see all the data, so I don't know where the problem is. Thanks in advance.
I solved my question by adding this piece of code after the if statement:
NSString *encodedURLString = [urlString stringByAddingPercentEscapesUsingEncoding:NSASCIIStringEncoding];
url = [NSURL URLWithString:encodedURLString];
NSData *data = [NSData dataWithContentsOfURL:url];
The problem is the "|" in the app token: [NSURL URLWithString:] rejects it, so it must be percent-encoded (as %7C).
Hope this helps!
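As a hedged modern variant: stringByAddingPercentEscapesUsingEncoding: has since been deprecated, and stringByAddingPercentEncodingWithAllowedCharacters: does the same job here, encoding the "|" as %7C while leaving the query delimiters intact:

```objective-c
// Sketch: percent-encode the token's "|" so NSURL accepts the string.
NSString *encodedURLString = [urlString stringByAddingPercentEncodingWithAllowedCharacters:
                                 [NSCharacterSet URLQueryAllowedCharacterSet]];
NSURL *url = [NSURL URLWithString:encodedURLString];
NSData *data = [NSData dataWithContentsOfURL:url];
```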

How to upload an image (.png or .jpg) to an FTP server in iOS?

I want to upload an image to an FTP server from my iOS app, but every time I get an error that the FTP is not connected.
I am using the SCRFTPRequest library.
Here is my code:
UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
NSData *imageData = UIImagePNGRepresentation(image);
NSFileManager *fileManager = [NSFileManager defaultManager];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *fullPath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.png", image]];
[fileManager createFileAtPath:fullPath contents:imageData attributes:nil];
NSLog(@"image saved");
[picker dismissViewControllerAnimated:YES completion:nil];
ftpRequest = [SCRFTPRequest requestWithURL:[NSURL URLWithString:@"ftp://myURL"] toUploadFile:fullPath];
ftpRequest.username = @"DemoUser";
ftpRequest.password = @"DemoUser";
ftpRequest.customUploadFileName = @"inapp";
ftpRequest.delegate = self;
[ftpRequest startAsynchronous];
From White Raccoon:
Just drag and drop the WhiteRaccoon.h and WhiteRaccoon.m files into your project and import the CFNetwork framework.
- (void)upload
{
    //the upload request needs the input data to be NSData
    //so we first convert the image to NSData
    UIImage *ourImage = [UIImage imageNamed:@"space.jpg"];
    NSData *ourImageData = UIImageJPEGRepresentation(ourImage, 1.0); //quality is 0.0-1.0
    //we create the upload request
    //we don't autorelease the object so that it will be around when the callback gets called
    //this is not a good practice; in real development you should use a retain property to store a reference to the request
    WRRequestUpload *uploadImage = [[WRRequestUpload alloc] init];
    uploadImage.delegate = self;
    //for anonymous login just leave the username and password nil
    uploadImage.hostname = @"xxx.xxx.xxx.xxx";
    uploadImage.username = @"myuser";
    uploadImage.password = @"mypass";
    //we set our data
    uploadImage.sentData = ourImageData;
    //the path needs to be absolute to the FTP root folder.
    //full URL would be ftp://xxx.xxx.xxx.xxx/space.jpg
    uploadImage.path = @"/space.jpg";
    //we start the request
    [uploadImage start];
}

- (void)requestCompleted:(WRRequest *)request {
    //called if 'request' completed successfully
    NSLog(@"%@ completed!", request);
}

- (void)requestFailed:(WRRequest *)request {
    //called after 'request' ends in error
    //we can print the error message
    NSLog(@"%@", request.error.message);
}

- (BOOL)shouldOverwriteFileWithRequest:(WRRequest *)request {
    //if the file (ftp://xxx.xxx.xxx.xxx/space.jpg) is already on the FTP server, the delegate is asked whether the file should be overwritten
    //'request' is the request that intended to create the file
    return YES;
}
Finally I succeeded in uploading an image file to the FTP server.
I used the GoldRaccoon external library; with it you can easily upload an image to an FTP server.
https://github.com/albertodebortoli/GoldRaccoon