I'm trying to upload images to Firebase like this:
Firebase *ref = [[Firebase alloc] initWithUrl:@"https://<app-name>.firebaseio.com/posts"];
Firebase *newPost = [ref childByAutoId];
NSDictionary *newPostData = @{
    @"image" : [self encodeToBase64String:image]
};
[newPost updateChildValues:newPostData];
I'm using this code to encode the image:
- (NSString *)encodeToBase64String:(UIImage *)image {
    return [UIImagePNGRepresentation(image) base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
}
But this does not work as the string exceeds the maximum size:
Terminating app due to uncaught exception 'InvalidFirebaseData', reason: '(updateChildValues:) String exceeds max size of 10485760 utf8 bytes:
What can I do to resolve this problem? I haven't found anything online about handling images with Firebase in iOS development.
If the image is too big, you should store a smaller image. Let me quote myself: How do you save a file to a Firebase Hosting folder location for images through android?
The Firebase Database allows you to store JSON data. While binary data is not a type that is supported in JSON, it is possible to encode the binary data in say base64 and thus store the image in a (large) string. [But] while this is possible, it is not recommended for anything but small images or as a curiosity to see that it can be done.
Your best option is typically to store the images on a 3rd party image storage service.
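If you do decide to keep a small image in the database anyway, a minimal sketch of shrinking and compressing it before base64-encoding it could look like this (the 200x200 target size and 0.5 JPEG quality are arbitrary values chosen for illustration, not part of the original code):

// Downscale and JPEG-compress before encoding so the resulting base64
// string stays far below Firebase's 10 MB string limit.
- (NSString *)encodeSmallImage:(UIImage *)image
{
    CGSize targetSize = CGSizeMake(200, 200); // arbitrary thumbnail size
    UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *jpegData = UIImageJPEGRepresentation(thumbnail, 0.5); // 50% quality
    return [jpegData base64EncodedStringWithOptions:0];
}

Even so, the advice above stands: for anything beyond thumbnails, store the file elsewhere and keep only its URL in Firebase.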
As Frank van Puffelen suggested, my solution was to use Amazon S3 for image storage, and use Firebase to store a reference to the image location.
I created a method called uploadImage: and it looks like this:
- (void)uploadImage:(UIImage *)image
{
    // Create references to Firebase
    Firebase *ref = [[Firebase alloc] initWithUrl:@"https://<MY-APP>.firebaseio.com"];
    Firebase *photosRef = [ref childByAppendingPath:@"photos"];
    Firebase *newPhotoRef = [photosRef childByAutoId];

    // Image information
    NSString *imageId = [[NSUUID UUID] UUIDString];

    // Create dictionary containing information about the photo
    NSDictionary *photoInformation = @{
        @"photo_id" : imageId
        // Here you can add more information about the photo
    };

    // Write the image to a temporary file so the transfer manager can read it
    NSString *imagePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.png", imageId]];
    NSData *imageData = UIImagePNGRepresentation(image);
    [imageData writeToFile:imagePath atomically:YES];
    NSURL *imageUrl = [[NSURL alloc] initFileURLWithPath:imagePath];

    // Configure the S3 upload request
    AWSS3TransferManagerUploadRequest *uploadRequest = [AWSS3TransferManagerUploadRequest new];
    uploadRequest.bucket = @"<AMAZON S3 BUCKET NAME>"; // create your own by setting up an account on Amazon AWS
    uploadRequest.key = imageId;
    uploadRequest.contentType = @"image/png";
    uploadRequest.body = imageUrl;

    AWSS3TransferManager *transferManager = [AWSS3TransferManager defaultS3TransferManager];
    [[transferManager upload:uploadRequest] continueWithExecutor:[AWSExecutor mainThreadExecutor] withBlock:^id(AWSTask *task) {
        if (!task.error) {
            // Upload succeeded; store the reference to the image in Firebase
            [newPhotoRef updateChildValues:photoInformation withCompletionBlock:^(NSError *error, Firebase *ref) {
                if (!error) {
                    // Uploaded image to Amazon S3 and stored the reference in Firebase
                }
            }];
        } else {
            // Error uploading
        }
        return nil;
    }];
}
Edit
The method should be a block method, something like this:
- (void)uploadImage:(UIImage *)image withBlock:(void (^)(Firebase *ref, NSError *error, AWSTask *task))handler
{
    // upload
}
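A sketch of how the body of that block-based version might invoke the handler, reusing the same setup and upload code shown above (newPhotoRef, photoInformation, transferManager, and uploadRequest are the names from the method above; only the completion plumbing differs):

- (void)uploadImage:(UIImage *)image withBlock:(void (^)(Firebase *ref, NSError *error, AWSTask *task))handler
{
    // ... same setup and upload code as in uploadImage: above ...
    [[transferManager upload:uploadRequest] continueWithExecutor:[AWSExecutor mainThreadExecutor] withBlock:^id(AWSTask *task) {
        if (task.error) {
            if (handler) handler(nil, task.error, task);   // report the S3 failure
            return nil;
        }
        [newPhotoRef updateChildValues:photoInformation withCompletionBlock:^(NSError *error, Firebase *ref) {
            if (handler) handler(ref, error, task);        // report the Firebase result
        }];
        return nil;
    }];
}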
Related
I want to show each user's avatar image within a conversation. I'm using JSQMessagesViewController, so the method below should be used to achieve this. However, the observeEventType: block seems to run after this method has already returned, so MyuserImage (or OtheruserImage) is still nil when it is used, and the app crashes. How can I get the photo URL of each user from Firebase and then return the expected avatar image? Thank you!
- (id<JSQMessageAvatarImageDataSource>)collectionView:(JSQMessagesCollectionView *)collectionView avatarImageDataForItemAtIndexPath:(NSIndexPath *)indexPath
{
    JSQMessage *message = [self.msgArray objectAtIndex:indexPath.item];
    if ([message.senderId isEqualToString:self.senderId]) {
        NSString *MyuserId = [FIRAuth auth].currentUser.uid;
        __block NSString *MyuserImage;
        NSLog(@"uid is : %@", MyuserId);
        [[_photoRef child:@"users"] observeEventType:FIRDataEventTypeValue withBlock:^(FIRDataSnapshot *snapshot) {
            NSLog(@"My key is : %@", snapshot.key);
            if ([snapshot.key isEqualToString:MyuserId]) {
                NSLog(@"snapshot value is : %@", snapshot.value);
                MyuserImage = snapshot.value[@"photo"];
            }
        }];
        NSURL *url = [NSURL URLWithString:MyuserImage];
        NSData *data = [NSData dataWithContentsOfURL:url];
        self.myuserImage = [[UIImage alloc] initWithData:data];
        return [JSQMessagesAvatarImageFactory avatarImageWithImage:self.myuserImage diameter:15];
    }
    else {
        NSString *OtheruserId = message.senderId;
        __block NSString *OtheruserImage;
        NSLog(@"other userId is: %@", OtheruserId);
        [[_photoRef child:@"users"] observeEventType:FIRDataEventTypeValue withBlock:^(FIRDataSnapshot *snapshot) {
            NSLog(@"other user's key is: %@", snapshot.key);
            if ([snapshot.key isEqualToString:OtheruserId]) {
                NSLog(@"snapshot value is: %@", snapshot.value);
                OtheruserImage = snapshot.value[@"photo"];
            }
        }];
        NSURL *url = [NSURL URLWithString:OtheruserImage];
        NSData *data = [NSData dataWithContentsOfURL:url];
        self.otheruserImage = [[UIImage alloc] initWithData:data];
        return [JSQMessagesAvatarImageFactory avatarImageWithImage:self.otheruserImage diameter:15];
    }
}
From what I can tell, you need to pull this data from Firebase before this method gets called. Something like creating a struct that matches your data structure and then pulling this information from your data source, which would presumably be an array of those structs.
An implementation I'd also recommend exploring is a library like SDWebImage, which will manage the async call for you and let you set a default avatar while the network request and image rendering happen. But you will still need that URL fetched from your database before this method gets called, as in the sketch below.
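A minimal sketch of pre-fetching the photo URLs once (for example from viewDidLoad) and caching them by user id, so the avatar method can read them synchronously (the avatarUrls property is an assumption, not part of the original code):

// Observe the users node, cache each user's photo URL keyed by uid,
// then reload so the avatar data source method can look URLs up synchronously.
- (void)observeUserPhotos
{
    self.avatarUrls = [NSMutableDictionary dictionary]; // assumed NSMutableDictionary<NSString *, NSString *> property
    [[_photoRef child:@"users"] observeEventType:FIRDataEventTypeValue withBlock:^(FIRDataSnapshot *snapshot) {
        for (FIRDataSnapshot *child in snapshot.children) {
            NSString *photoUrl = child.value[@"photo"];
            if (photoUrl) {
                self.avatarUrls[child.key] = photoUrl;
            }
        }
        [self.collectionView reloadData];
    }];
}

In avatarImageDataForItemAtIndexPath: you would then read self.avatarUrls[message.senderId] instead of starting a new observer, hand that URL to SDWebImage, and return a placeholder avatar if the URL is not cached yet.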
I'm trying to get all photos from the photos library along with each image's metadata. It works fine for 10-20 images, but when there are 50+ images it occupies too much memory, which causes the app to crash.
Why do I need all the images in an array?
Answer: to send the images to a server app. I'm using GCDAsyncSocket to send data to the receiver socket/port, and I don't have enough waiting time to request images from PHAsset while sending them over the socket/port.
My code:
+ (void)getPhotosDataFromCamera:(void (^)(NSMutableArray *arrImageData))completionHandler
{
    [PhotosManager checkPhotosPermission:^(bool granted)
    {
        if (granted)
        {
            NSMutableArray *arrImageData = [NSMutableArray new];
            NSArray *arrImages = [[NSArray alloc] init];
            PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
            NSLog(@"%d", (int)result.count);
            arrImages = [result copy];

            //--- If no images.
            if (arrImages.count <= 0)
            {
                completionHandler(nil);
                return;
            }

            __block int index = 1;
            __block BOOL isDone = false;

            for (PHAsset *asset in arrImages)
            {
                [PhotosManager requestMetadata:asset withCompletionBlock:^(UIImage *image, NSDictionary *metadata)
                {
                    @autoreleasepool
                    {
                        NSData *imageData = metadata ? [PhotosManager addExif:image metaData:metadata] : UIImageJPEGRepresentation(image, 1.0f);
                        if (imageData != nil)
                        {
                            [arrImageData addObject:imageData];
                            NSLog(@"Adding images :%i", index);

                            //--- Done adding all images.
                            if (index == arrImages.count)
                            {
                                isDone = true;
                                NSLog(@"Done adding all images with info!!");
                                completionHandler(arrImageData);
                            }
                            index++;
                        }
                    }
                }];
            }
        }
        else
        {
            completionHandler(nil);
        }
    }];
}
typedef void (^PHAssetMetadataBlock)(UIImage *image, NSDictionary *metadata);

+ (void)requestMetadata:(PHAsset *)asset withCompletionBlock:(PHAssetMetadataBlock)completionBlock
{
    PHContentEditingInputRequestOptions *editOptions = [[PHContentEditingInputRequestOptions alloc] init];
    editOptions.networkAccessAllowed = YES;

    [asset requestContentEditingInputWithOptions:editOptions completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info)
    {
        CIImage *CGimage = [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL];
        UIImage *image = contentEditingInput.displaySizeImage;
        dispatch_async(dispatch_get_main_queue(), ^{
            completionBlock(image, CGimage.properties);
        });
        CGimage = nil;
        image = nil;
    }];
    editOptions = nil;
    asset = nil;
}
+ (NSData *)addExif:(UIImage *)toImage metaData:(NSDictionary *)container
{
    NSData *imageData = UIImageJPEGRepresentation(toImage, 1.0f);

    // Create an image source ref
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);

    // This is the type of image (e.g., public.jpeg)
    CFStringRef UTI = CGImageSourceGetType(source);

    // Create a new data object and write the new image into it
    NSMutableData *dest_data = [[NSMutableData alloc] initWithLength:imageData.length + 2000];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination) {
        NSLog(@"Error: Could not create image destination");
    }

    // Add the image contained in the image source to the destination,
    // overriding the old metadata with our modified metadata
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)container);

    BOOL success = NO;
    success = CGImageDestinationFinalize(destination);
    if (!success) {
        NSLog(@"Error: Could not create data from image destination");
    }

    CFRelease(destination);
    CFRelease(source);
    imageData = nil;
    source = nil;
    destination = nil;

    return dest_data;
}
Well, it's not a surprise that you end up in this situation, since each of your images consumes memory and you instantiate and keep all of them in memory at once. This is not really a correct design approach.
In the end it depends on what you want to do with those images.
What I would suggest is that you keep just the array of your PHAsset objects and request the image only on demand.
For example, if you want to display those images in a tableView/collectionView, perform the call to
[PhotosManager requestMetadata:asset withCompletionBlock:^(UIImage *image, NSDictionary *metadata)
directly in the relevant cell-configuration method (a sketch follows below). This way you won't drain the device's memory.
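If you only need the image (not the metadata) for display, a minimal sketch of the on-demand approach using PHImageManager directly could look like this (PhotoCell, the self.assets array, and the 200x200 target size are assumptions, not part of the original code):

// Keep only the PHAsset objects around and request an appropriately
// sized image for a cell only when that cell is about to be displayed.
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    PhotoCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"PhotoCell" forIndexPath:indexPath];
    PHAsset *asset = self.assets[indexPath.item];

    PHImageRequestOptions *options = [PHImageRequestOptions new];
    options.networkAccessAllowed = YES;

    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:CGSizeMake(200, 200)
                                               contentMode:PHImageContentModeAspectFill
                                                   options:options
                                             resultHandler:^(UIImage *result, NSDictionary *info) {
        cell.imageView.image = result;
    }];
    return cell;
}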
There simply is not enough memory on the phone to load all of the images in the photo library into memory at the same time.
If you want to display the images, then only fetch the images that you need for immediate display. For the rest, keep just the PHAsset. Make sure to discard the images when you don't need them any more.
If you need thumbnails, then fetch only the thumbnails that you need.
If you want to do something with all of the images - like add a watermark to them or process them in some way - then process each image one at a time in a queue, as in the sketch below.
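A rough sketch of that one-at-a-time approach on a background queue (processImageData: is a hypothetical stand-in for whatever per-image work you need, e.g. sending the data over the socket):

// Process assets sequentially, wrapping each iteration in an autorelease
// pool so image data is released before the next asset is loaded.
- (void)processAssets:(PHFetchResult<PHAsset *> *)assets
{
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
        for (PHAsset *asset in assets) {
            @autoreleasepool {
                PHImageRequestOptions *options = [PHImageRequestOptions new];
                options.synchronous = YES; // deliver the result inside this loop iteration
                options.networkAccessAllowed = YES;

                [[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                                  options:options
                                                            resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
                    [self processImageData:imageData]; // hypothetical per-image work
                }];
            }
        }
    });
}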
I cannot advise further as your question doesn't state why you need all of the images.
I am trying to download only the image and the text (probably an HTML string) of an Evernote note in my iOS app. I have successfully downloaded the image from a note, but I have not found any method or process that gets me the text written on the note. I have used
ENSDK.framework
- (void)findAllNotes
{
    NSLog(@"finding all notes..");
    [self.session findNotesWithSearch:nil
                           inNotebook:nil
                              orScope:ENSessionSearchScopeAll
                            sortOrder:ENSessionSortOrderNormal
                           maxResults:255
                           completion:^(NSArray *findNotesResults,
                                        NSError *findNotesError) {
                               if (findNotesError) {
                                   [self.session unauthenticate];
                                   NSAssert(NO, @"Could not find notes with error %@", findNotesError);
                               } else {
                                   [self processFindNotesResults:findNotesResults];
                               }
                           }];
}
- (void)processFindNotesResults:(NSArray *)results
{
    NSParameterAssert(results);
    NSLog(@"processing find notes results..");
    for (ENSessionFindNotesResult *result in results) {
        [self.session downloadNote:result.noteRef
                          progress:NULL
                        completion:^(ENNote *note,
                                     NSError *downloadNoteError) {
                            NSAssert(!downloadNoteError, @"Could not download note with error %@",
                                     downloadNoteError);
                            [self getDataFromNote:note];
                        }];
    }
}
- (void)getDataFromNote:(ENNote *)note
{
    for (ENResource *resource in note.resources) {
        if ([resource.mimeType hasPrefix:@"image"]) {
            UIImage *image = [[UIImage alloc] initWithData:resource.data];
            NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
            NSString *docs = [paths objectAtIndex:0];
            NSString *path = [docs stringByAppendingFormat:@"/image1.jpg"];
            NSData *imageData = [NSData dataWithData:UIImageJPEGRepresentation(image, .8)];
            NSError *writeError = nil;
            if (![imageData writeToFile:path options:NSDataWritingAtomic error:&writeError]) {
                NSLog(@"%@: Error saving image: %@", [self class], [writeError localizedDescription]);
            }
        }
    }
}
The content of the note is available to you in the content property of your variable note; i.e. it's in the content property of an ENNote object.
Also note that in addition to accessing the content directly, the Evernote iOS SDK also includes a special method that makes it easy to display a note's content in a UIWebView:
We've made this easy-- rather than serializing it to HTML and fussing with attached image resources, we've provided a method to generate a single Safari "web archive" from the note; this is a bundled data type which UIWebView natively knows how to load directly.
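For example, a sketch of loading a downloaded note into a UIWebView with that helper might look roughly like this; I am recalling the generateWebArchiveData: method and the ENWebArchiveDataMIMEType constant from memory of ENSDK, so verify both against the ENNote.h header in the framework version you're using:

// Ask the SDK to bundle the note (its ENML content plus attached resources)
// into a web archive and load it straight into a UIWebView.
- (void)displayNote:(ENNote *)note inWebView:(UIWebView *)webView
{
    [note generateWebArchiveData:^(NSData *data) {
        [webView loadData:data
                 MIMEType:ENWebArchiveDataMIMEType
         textEncodingName:@"UTF-8"
                  baseURL:[NSURL URLWithString:@"about:blank"]];
    }];
}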
I'm having an issue with uploading a photo to Google Drive via the SDK made available for Objective-C.
The summary of the situation is the following: I create a folder with a defined name, and after the folder is created I upload a finite number of photos stored in my app. I wait until I receive confirmation that a photo was uploaded successfully before trying the next one on the list.
The issue I'm having is this: I know the photo file is ~9 MB, and it reaches Google Drive successfully. The problem is that although I upload it with the MIME type image/jpeg, the file that actually appears in Google Drive is a PNG image file, and it's 22 MB in size! I can't understand why it's being interpreted as a PNG, or why the size grows so much.
This is my relevant code:
- (void)uploadPhotoToFolder:(NSString *)identifier withIndex:(int)index
{
    UIImage *content = [[photoArray objectAtIndex:index] objectAtIndex:0];
    NSString *mimeType = @"image/jpeg";

    GTLDriveFile *metadata = [GTLDriveFile object];
    NSString *name = @"FileName";
    metadata.name = name;
    metadata.parents = @[identifier];

    NSData *data = UIImagePNGRepresentation(content);
    GTLUploadParameters *uploadParameters = [GTLUploadParameters uploadParametersWithData:data
                                                                                 MIMEType:mimeType];
    GTLQueryDrive *query = [GTLQueryDrive queryForFilesCreateWithObject:metadata
                                                       uploadParameters:uploadParameters];
    [self.service executeQuery:query completionHandler:^(GTLServiceTicket *ticket,
                                                          GTLDriveFile *updatedFile,
                                                          NSError *error) {
        if (error == nil) {
            // Notify that upload was successful
        }
        else {
            // Notify that upload failed
        }
    }];
}
Thank you in advance for any help.
You're using NSData *data = UIImagePNGRepresentation(content); which makes your image a PNG despite the MIME type you send.
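A minimal fix, assuming you want to keep the image/jpeg MIME type (the 0.8 compression quality and the .jpg file name are arbitrary choices for illustration):

// Encode the UIImage as JPEG so the uploaded bytes match the declared MIME type;
// the JPEG representation is also typically much smaller than the PNG one.
NSString *mimeType = @"image/jpeg";
metadata.name = @"FileName.jpg";
NSData *data = UIImageJPEGRepresentation(content, 0.8);
GTLUploadParameters *uploadParameters = [GTLUploadParameters uploadParametersWithData:data
                                                                             MIMEType:mimeType];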
I am writing an app with a share extension that saves the selected photo from the iPhone photo gallery to my app's local storage.
NSData writeToFile: returns YES, but I can't find the stored file in the directory whose path I passed when writing.
So, in short, NSData writeToFile: fails to save the photo at the given path.
Below is my code.
- (IBAction)acceptButtonTapped:(id)sender
{
    __block UIImage *photo;
    for (NSExtensionItem *item in self.extensionContext.inputItems)
    {
        for (NSItemProvider *itemProvider in item.attachments)
        {
            if ([itemProvider hasItemConformingToTypeIdentifier:(NSString *)kUTTypeImage])
            {
                [itemProvider loadItemForTypeIdentifier:(NSString *)kUTTypeImage options:nil completionHandler:^(UIImage *image, NSError *error) {
                    if (image)
                    {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            photo = image;
                            NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
                            [formatter setDateFormat:@"yyyy_MM_dd_hh_mm_ss"];
                            NSString *fileName;
                            fileName = [NSString stringWithFormat:@"%@.jpeg", [formatter stringFromDate:[NSDate date]]];
                            dataPath = [dataPath stringByAppendingPathComponent:fileName];
                            NSData *imageData = [NSData dataWithData:UIImageJPEGRepresentation(image, 1.0)];
                            BOOL isdone = [imageData writeToFile:dataPath atomically:NO];
                            NSLog(@"%u", isdone);
                        });
                    }
                }];
                break;
            }
        }
    }
    [self.extensionContext completeRequestReturningItems:@[] completionHandler:nil];
}
Any help would be much appreciated.
Thank you.
If you're trying to access the containing app's Documents directory from the share extension: no, you can't do that. Share extensions and other widgets are separate applications from their containing app and therefore have their own sandbox, so you will need to use App Groups to share files.
Application groups are primarily targeted at extensions, and more specifically at widgets.
NSFileManager has a method, containerURLForSecurityApplicationGroupIdentifier:, where you can pass in the identifier you created when turning on App Groups for your apps:
NSURL *containerURL = [[NSFileManager defaultManager]
    containerURLForSecurityApplicationGroupIdentifier:@"group.com.company.app"];
You can save the files to this location, because the shared application group is accessible from both the extension and the host app, for example:
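A short sketch of writing the image into the shared container (the group identifier, file name, and method name here are placeholders of my own, not part of the original code):

// Build a path inside the shared app-group container and write the JPEG there.
- (void)saveImageToSharedContainer:(UIImage *)image
{
    NSURL *containerURL = [[NSFileManager defaultManager]
        containerURLForSecurityApplicationGroupIdentifier:@"group.com.company.app"];
    NSURL *fileURL = [containerURL URLByAppendingPathComponent:@"shared-photo.jpeg"];

    NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
    NSError *writeError = nil;
    if (![imageData writeToURL:fileURL options:NSDataWritingAtomic error:&writeError]) {
        NSLog(@"Failed to write image: %@", writeError);
    }
}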
You're modifying dataPath on each pass through the loop, appending another filename to it. That will create an ever-growing series of badly formed paths that contain all the filenames.
Don't do that. Create a new local variable filePath, and construct a filename into filePath using
filePath = [docsPath stringByAppendingPathComponent: filename];
Log your path and LOOK AT IT. When your program doesn't behave as expected, don't trust any of your assumptions, because one or more of them may be wrong.
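Putting those two points together, a minimal illustration of the fix inside the completion handler above (docsPath stands for whatever base directory you actually write to, for example the app-group container from the previous answer):

// Build a fresh path for each image instead of repeatedly mutating dataPath.
NSString *fileName = [NSString stringWithFormat:@"%@.jpeg", [formatter stringFromDate:[NSDate date]]];
NSString *filePath = [docsPath stringByAppendingPathComponent:fileName];

NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
NSError *writeError = nil;
BOOL isDone = [imageData writeToFile:filePath options:NSDataWritingAtomic error:&writeError];
NSLog(@"Wrote %@: %d (%@)", filePath, isDone, writeError);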