I want to show each user's avatar image within a conversation. I'm using JSQMessagesViewController, so the method below is where the avatar should be returned. However, the observeEventType: block seems to run after this method returns rather than being called inline, so MyuserImage (or OtheruserImage) is still nil when I use it, and the app crashes. How can I fetch each user's photo URL from Firebase and then return the expected avatar image? Thank you!
- (id<JSQMessageAvatarImageDataSource>)collectionView:(JSQMessagesCollectionView *)collectionView avatarImageDataForItemAtIndexPath:(NSIndexPath *)indexPath
{
    JSQMessage *message = [self.msgArray objectAtIndex:indexPath.item];
    if ([message.senderId isEqualToString:self.senderId]) {
        NSString *MyuserId = [FIRAuth auth].currentUser.uid;
        __block NSString *MyuserImage;
        NSLog(@"uid is : %@", MyuserId);
        [[_photoRef child:@"users"] observeEventType:FIRDataEventTypeValue withBlock:^(FIRDataSnapshot *snapshot) {
            // This block runs asynchronously, after the method has already returned.
            NSLog(@"My key is : %@", snapshot.key);
            if ([snapshot.key isEqualToString:MyuserId]) {
                NSLog(@"snapshot value is : %@", snapshot.value);
                MyuserImage = snapshot.value[@"photo"];
            }
        }];
        // MyuserImage is still nil here, because the Firebase block has not fired yet.
        NSURL *url = [NSURL URLWithString:MyuserImage];
        NSData *data = [NSData dataWithContentsOfURL:url];
        self.myuserImage = [[UIImage alloc] initWithData:data];
        return [JSQMessagesAvatarImageFactory avatarImageWithImage:self.myuserImage diameter:15];
    }
    else {
        NSString *OtheruserId = message.senderId;
        __block NSString *OtheruserImage;
        NSLog(@"other userId is: %@", OtheruserId);
        [[_photoRef child:@"users"] observeEventType:FIRDataEventTypeValue withBlock:^(FIRDataSnapshot *snapshot) {
            NSLog(@"other user's key is: %@", snapshot.key);
            if ([snapshot.key isEqualToString:OtheruserId]) {
                NSLog(@"snapshot value is: %@", snapshot.value);
                OtheruserImage = snapshot.value[@"photo"];
            }
        }];
        // Same problem: OtheruserImage is nil at this point.
        NSURL *url = [NSURL URLWithString:OtheruserImage];
        NSData *data = [NSData dataWithContentsOfURL:url];
        self.otheruserImage = [[UIImage alloc] initWithData:data];
        return [JSQMessagesAvatarImageFactory avatarImageWithImage:self.otheruserImage diameter:15];
    }
}
From the best I can tell, you need to pull this data from Firebase before this method gets called. Something like creating a model object that matches your data structure, and then pulling this information from your data source, which would presumably be an array of those objects.
An approach I'd also recommend exploring is using a library like SDWebImage, which will manage the async call for you and let you set a placeholder avatar while the network request and image rendering happen. But you will still need that URL fetched from your database before this method gets called.
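As a minimal sketch of that idea, assuming the users/<uid>/photo structure from the question and a hypothetical avatarURLs dictionary property (an NSMutableDictionary mapping user ids to photo URL strings), you could load every photo URL once, up front, and only build avatars after the data has arrived:
// Call once, e.g. in viewDidLoad, before any avatar is requested.
- (void)fetchAvatarURLs
{
    [[_photoRef child:@"users"] observeSingleEventOfType:FIRDataEventTypeValue
                                               withBlock:^(FIRDataSnapshot *snapshot) {
        for (FIRDataSnapshot *child in snapshot.children) {
            NSString *photoURL = child.value[@"photo"];
            if (photoURL) {
                self.avatarURLs[child.key] = photoURL; // hypothetical cache property
            }
        }
        [self.collectionView reloadData]; // avatars can be built now
    }];
}
Inside avatarImageDataForItemAtIndexPath: you would then look up self.avatarURLs[message.senderId] and return a placeholder avatar until the corresponding image has been downloaded (for example via SDWebImage).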
I am writing an app with a share extension that saves a photo selected in the iPhone photo gallery to my app's local storage.
NSData's writeToFile: returns YES, but I can't find the stored file in the directory whose path I passed in while writing.
So, in short, NSData's writeToFile: fails to save the photo at the given path.
Below is my code.
- (IBAction)acceptButtonTapped:(id)sender
{
    __block UIImage *photo;
    for (NSExtensionItem *item in self.extensionContext.inputItems)
    {
        for (NSItemProvider *itemProvider in item.attachments)
        {
            if ([itemProvider hasItemConformingToTypeIdentifier:(NSString *)kUTTypeImage])
            {
                [itemProvider loadItemForTypeIdentifier:(NSString *)kUTTypeImage options:nil completionHandler:^(UIImage *image, NSError *error) {
                    if (image)
                    {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            photo = image;
                            NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
                            [formatter setDateFormat:@"yyyy_MM_dd_hh_mm_ss"];
                            NSString *fileName = [NSString stringWithFormat:@"%@.jpeg", [formatter stringFromDate:[NSDate date]]];
                            // dataPath is presumably an ivar holding the target directory path
                            dataPath = [dataPath stringByAppendingPathComponent:fileName];
                            NSData *imageData = [NSData dataWithData:UIImageJPEGRepresentation(image, 1.0)];
                            BOOL isdone = [imageData writeToFile:dataPath atomically:NO];
                            NSLog(@"%u", isdone);
                        });
                    }
                }];
                break;
            }
        }
    }
    [self.extensionContext completeRequestReturningItems:@[] completionHandler:nil];
}
Any help would be much appreciated.
Thank you.
If you're trying to access the Documents directory from the share extension: no, you can't do that. A share extension (like other widgets) is a separate application from its containing app and therefore has its own sandbox. So you will need to use App Groups to share files.
App groups are primarily targeted at extensions, more specifically, at widgets.
NSFileManager has a method, containerURLForSecurityApplicationGroupIdentifier:, where you can pass in the identifier you created when turning on App Groups for your apps:
NSURL *containerURL = [[NSFileManager defaultManager]
    containerURLForSecurityApplicationGroupIdentifier:@"group.com.company.app"];
You can save files to this location, because the shared app group container is accessible from both the extension and the host app.
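A minimal sketch of saving into the shared container from the extension, assuming the hypothetical group identifier group.com.company.app above and the fileName built as in the question:
NSURL *containerURL = [[NSFileManager defaultManager]
    containerURLForSecurityApplicationGroupIdentifier:@"group.com.company.app"];
// Keep shared photos in their own subdirectory so the host app knows where to look.
NSURL *photosDirURL = [containerURL URLByAppendingPathComponent:@"SharedPhotos" isDirectory:YES];
[[NSFileManager defaultManager] createDirectoryAtURL:photosDirURL
                         withIntermediateDirectories:YES
                                          attributes:nil
                                               error:NULL];
NSURL *fileURL = [photosDirURL URLByAppendingPathComponent:fileName];
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
BOOL written = [imageData writeToURL:fileURL atomically:YES];
NSLog(@"wrote %@: %d", fileURL.path, written);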
You're modifying dataPath on each pass through the loop, appending another filename to it. That creates an ever-growing series of badly formed paths that contain all the filenames.
Don't do that. Create a new local variable, filePath, and construct the path into it using
filePath = [docsPath stringByAppendingPathComponent:fileName];
Log your path and LOOK AT IT. When your program doesn't behave as expected, don't trust any of your assumptions, because one or more of them may be wrong.
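A minimal sketch of the corrected write, assuming dataPath starts out as the directory path from the question's code:
// Build the full path in a local variable instead of mutating dataPath.
NSString *filePath = [dataPath stringByAppendingPathComponent:fileName];
NSLog(@"writing to: %@", filePath); // log the path and look at it
NSError *writeError = nil;
BOOL isDone = [imageData writeToFile:filePath
                             options:NSDataWritingAtomic
                               error:&writeError];
if (!isDone) {
    NSLog(@"write failed: %@", writeError);
}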
I'm trying to upload images to Firebase like this:
Firebase *ref = [[Firebase alloc] initWithUrl:@"https://<app-name>.firebaseio.com/posts"];
Firebase *newPost = [ref childByAutoId];
NSDictionary *newPostData = @{
    @"image" : [self encodeToBase64String:image]
};
[newPost updateChildValues:newPostData];
I'm using this code to encode the image:
- (NSString *)encodeToBase64String:(UIImage *)image {
    return [UIImagePNGRepresentation(image) base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
}
But this does not work, as the string exceeds the maximum size:
Terminating app due to uncaught exception 'InvalidFirebaseData', reason: '(updateChildValues:) String exceeds max size of 10485760 utf8 bytes:
What can I do to resolve this problem? I haven't found anything online regarding iOS development and images when using Firebase.
If the image is too big, you should store a smaller image. Let me quote myself: How do you save a file to a Firebase Hosting folder location for images through Android?
The Firebase Database allows you to store JSON data. While binary data is not a type that is supported in JSON, it is possible to encode the binary data in say base64 and thus store the image in a (large) string. [But] while this is possible, it is not recommended for anything but small images or as a curiosity to see that it can be done.
Your best option is typically to store the images on a 3rd party image storage service.
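If small thumbnails are all you need in the database itself, a minimal sketch of the "store a smaller image" advice could look like this (the 400-point maximum dimension and 0.7 JPEG quality are illustrative values, not requirements):
- (NSString *)encodeThumbnailToBase64String:(UIImage *)image
{
    // Downscale so the longest side is at most 400 points.
    CGFloat maxDimension = 400.0;
    CGFloat scale = MIN(1.0, maxDimension / MAX(image.size.width, image.size.height));
    CGSize newSize = CGSizeMake(image.size.width * scale, image.size.height * scale);

    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // JPEG at reduced quality is far smaller than the full-size PNG from the question.
    NSData *jpegData = UIImageJPEGRepresentation(thumbnail, 0.7);
    return [jpegData base64EncodedStringWithOptions:0];
}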
As Frank van Puffelen suggested, my solution was to use Amazon S3 for image storage, and use Firebase to store a reference to the image location.
I created a method called uploadImage: and it looks like this:
- (void)uploadImage:(UIImage *)image
{
    // Create reference to Firebase
    Firebase *ref = [[Firebase alloc] initWithUrl:@"https://<MY-APP>.firebaseio.com"];
    Firebase *photosRef = [ref childByAppendingPath:@"photos"];
    Firebase *newPhotoRef = [photosRef childByAutoId];

    // Image information
    NSString *imageId = [[NSUUID UUID] UUIDString];

    // Create dictionary containing information
    NSDictionary *photoInformation = @{
        @"photo_id" : imageId
        // Here you can add more information about the photo
    };

    // Write the image to a temporary file so the transfer manager can upload it
    NSString *imagePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.png", imageId]];
    NSData *imageData = UIImagePNGRepresentation(image);
    [imageData writeToFile:imagePath atomically:YES];
    NSURL *imageUrl = [[NSURL alloc] initFileURLWithPath:imagePath];

    AWSS3TransferManagerUploadRequest *uploadRequest = [AWSS3TransferManagerUploadRequest new];
    uploadRequest.bucket = @"<AMAZON S3 STORAGE NAME>"; // create your own by setting up an account on Amazon AWS.
    uploadRequest.key = imageId;
    uploadRequest.contentType = @"image/png";
    uploadRequest.body = imageUrl;

    AWSS3TransferManager *transferManager = [AWSS3TransferManager defaultS3TransferManager];
    [[transferManager upload:uploadRequest] continueWithExecutor:[AWSExecutor mainThreadExecutor] withBlock:^id(AWSTask *task) {
        if (!task.error) {
            // Update Firebase with a reference to the uploaded image
            [newPhotoRef updateChildValues:photoInformation withCompletionBlock:^(NSError *error, Firebase *ref) {
                if (!error) {
                    // Uploaded image to Amazon S3 and reference to Firebase
                }
            }];
        } else {
            // Error uploading
        }
        return nil;
    }];
}
Edit
The method should really take a completion block, something like this:
- (void)uploadImage:(UIImage *)image withBlock:(void (^)(Firebase *ref, NSError *error, AWSTask *task))handler
{
    // upload
}
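A hypothetical call site for that block-based version (parameter names taken from the signature above, assuming a photo UIImage in scope) might look like:
[self uploadImage:photo withBlock:^(Firebase *ref, NSError *error, AWSTask *task) {
    if (error) {
        NSLog(@"upload failed: %@", error);
        return;
    }
    // ref points at the new Firebase node that references the uploaded S3 object.
    NSLog(@"uploaded photo, Firebase reference: %@", ref);
}];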
I have a UITableView with a UIWebView in each cell.
Before I load the UITableView, I fetch some JSON on another thread:
- (void)fetchJson
{
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_async(queue, ^{
        NSString *url = ...
        NSData *theData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:url]];
        @try {
            NSError *error;
            NSMutableArray *json = [NSJSONSerialization
                                    JSONObjectWithData:theData
                                    options:NSJSONReadingMutableContainers | NSJSONReadingMutableLeaves
                                    error:&error];
            if (error) {
                NSLog(@"%@", [error localizedDescription]);
            }
            else {
                ...
                // here I fetch a string which holds HTML and MathML data in it.
                NSString *eq_text = [question objectForKey:@"eq_text"];
                stq.question = eq_text;
                ...
            }
        }
        @catch (NSException *exception) {
            NSLog(@"Exception");
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            _counter = _questionsArray.count;
            [_tableView reloadData];
        });
    });
}
This is how I load the request into the UIWebView:
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    ...
    [cell.webView loadRequest:[self readMathMlString:question.question]];
    ...
    return cell;
}
With the method [self readMathMlString:eq_text] I build an NSURLRequest. I have some MathJax that I need to show in the web view, and I have to show it this way:
- (NSURLRequest *)readMathMlString:(NSString *)content
{
    // Note: every call writes to, and loads from, the same temp file name.
    NSString *tmpFileName = @"test1.html";
    // temp dir
    NSString *tempDir = NSTemporaryDirectory();
    // create NSURL
    NSString *path4 = [tempDir stringByAppendingPathComponent:tmpFileName];
    NSURL *url = [NSURL fileURLWithPath:path4];
    // setup HTML file contents
    NSString *filePath = [[NSBundle mainBundle] pathForResource:@"MathJax" ofType:@"js" inDirectory:@"MathJax-2.2-latest"];
    // write to temp file "tempDir/tmpFileName", set MathJax JavaScript to use "filePath" as directory, add "content" as content of the HTML file
    [self writeStringToFile:tempDir fileName:tmpFileName pathName:filePath content:content];
    NSURLRequest *req = [[NSURLRequest alloc] initWithURL:url];
    // original request to show MathJax stuffs
    //[myWebView loadRequest:req];
    return req;
}
So, the problem is that, for some reason, the data is shown in a strange order.
When the view is opened for the first time, the app shows the first 4 cells as absolutely identical, but they shouldn't be.
[Screenshot of what is shown, alongside an NSLog of the data that should be shown.]
From these, we can see that for some reason all 4 cells display the data from what should be the 3rd cell.
Also, when I scroll my table view down and then up again, the data in the WebView changes again.
What could be the cause of all this, and how can I solve it?
I call an API which gives me a bunch of posts from a website. Each post contains a title, a description, and a thumbnail. I save the title and description and start the download of the image, but I'm confused about how I then link the image, once downloaded, with the Core Data object it corresponds to.
I then have a UITableView with an NSFetchedResultsController that populates the table view with the Core Data objects.
for (WebServicePost *post in posts) {
    Post *newPost = [NSEntityDescription insertNewObjectForEntityForName:@"Post" inManagedObjectContext:self.managedObjectContext];
    newPost.title = post.title;
    newPost.info = post.info;
    NSURLSessionDataTask *imageDataTask = [self.downloadSession dataTaskWithURL:[post.URL imgurThumbnailURLWithSize:CSImgurThumbnailSizeSmall] completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        UIImage *downloadedImage = [UIImage imageWithData:data];
        // --> What do I do here?
    }];
    [imageDataTask resume];
}
NSError *coreDataError;
if (![self.managedObjectContext save:&coreDataError]) {
    NSLog(@"%@", [coreDataError localizedDescription]);
}
I'm confused about how to link the downloaded image to a cell in the table view.
My plan was to have an NSCache that mapped the indexPath of each cell to the corresponding UIImage, but at this point I don't know the indexPath of the cell. I can't call indexPathForObject: either, as the object hasn't been saved to Core Data yet.
I then considered mapping the Post object itself to the UIImage, instead of the indexPath, but this made me fearful, because if the Post object changes, the reference I hold will no longer work, as it points to the initial Post object.
Then, in cellForRowAtIndexPath:, I'd check whether there was an image; if so I'd use it, and if not I'd just continue waiting for it.
I'm confused about how to match this up properly, though. Is my setup wrong?
Before you start your for loop, make sure you add this line:
NSMutableArray *cellData = [NSMutableArray array];
Then, in the "What do I do here?" part, add this code:
NSMutableDictionary *postData = [[NSMutableDictionary alloc] init];
[postData setObject:post forKey:@"post"];
[postData setObject:downloadedImage forKey:@"image"];
[cellData addObject:[postData copy]];
Now you can cycle through cellData, and every image will be matched up with its post data, because they are paired through a dictionary!
So to get a post and its image you could do this:
NSDictionary *data = [cellData objectAtIndex:1];
[data objectForKey:@"post"]; // will return your post data for the post at index 1
[data objectForKey:@"image"]; // will return your image for the post at index 1
With the code below, I can extract metadata from an image (pre-added to my project) and render the info as text. This is exactly what I want to do. The SYMetadata is created by pointing to an image via URL, with initWithAbsolutePathURL:. I want to do the same thing with a UIImage, or maybe with the image that is being loaded into the UIImage. How do I get the URL of the image that the picker selects? Or how do I create an "asset" from this incoming image?
The documentation describes initWithAsset:, but I haven't figured out how to use it yet, or whether it's the right way to go for my purpose. Any help greatly appreciated.
NSURL *imageURL = [[NSBundle mainBundle] URLForResource:@"someImage" withExtension:@"jpg"];
SYMetadata *metadata = [[SYMetadata alloc] initWithAbsolutePathURL:imageURL];
[textView setText:[[metadata allMetadatas] description]];
Note: I tried adding an NSURL like this, imageURL = [info valueForKey:@"UIImagePickerControllerReferenceURL"];, in the pickerDidFinish method, but the metadata is null after I pass this URL to the code above.
If you are using UIImagePickerController, the delegate method will give you what you need:
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    if ([[info allKeys] containsObject:UIImagePickerControllerReferenceURL]) {
        // you will get this key if your image comes from a library
        [self setMetaDataFromAssetLibrary:info];
    } else if ([[info allKeys] containsObject:UIImagePickerControllerMediaMetadata]) {
        // if the image comes from the camera you get the metadata in its own key
        self.rawMetaData = [self metaDataFromCamera:info];
    }
}
From the Asset Library. Bear in mind that it takes time to complete and has an asynchronous completion block, so you might want to add a completion flag to ensure you don't access the property before it has been updated.
- (void)setMetaDataFromAssetLibrary:(NSDictionary *)info
{
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 self.rawMetaData = asset.defaultRepresentation.metadata;
             }
            failureBlock:^(NSError *error) {
                 NSLog(@"error %@", error);
             }];
}
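To sidestep that timing issue, a sketch of a completion-block variant (hypothetical method name, same ALAssetsLibrary calls as above) could be:
- (void)metaDataFromAssetLibrary:(NSDictionary *)info
                      completion:(void (^)(NSDictionary *metadata, NSError *error))completion
{
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 // Deliver the metadata only once it is actually available.
                 completion(asset.defaultRepresentation.metadata, nil);
             }
            failureBlock:^(NSError *error) {
                 completion(nil, error);
             }];
}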
From Camera:
- (NSDictionary *)metaDataFromCamera:(NSDictionary *)info
{
    NSMutableDictionary *imageMetadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
    return imageMetadata;
}
Here is how to get metadata from a UIImage
- (NSDictionary *)metaDataFromImage:(UIImage *)image
{
    NSData *jpegData = [NSData dataWithData:UIImageJPEGRepresentation(image, 1.0)];
    return [self metaDataFromData:jpegData];
}
But take care: a UIImage can already be stripped of much of the metadata from the original. You will be better off getting the metadata from the NSData that was used to create the UIImage:
- (NSDictionary *)metaDataFromData:(NSData *)data
{
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    CFDictionaryRef imageMetaData = CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    CFRelease(source); // the image source is a Core Foundation object and must be released manually
    return (__bridge_transfer NSDictionary *)imageMetaData; // transfer ownership of the copied dictionary to ARC
}
If you have an ALAsset (in my sample, _detailItem), you can get its metadata this way:
NSDictionary *myMetadata = [[_detailItem defaultRepresentation] metadata];