Why does my app show a box with a question mark on the MMS sheet? (iOS)

I am creating an app that makes use of iMessage and MMS. The problem is that after I take a picture and tap the send button, the image randomly becomes a question mark.

This is the first piece of code we use:

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        [self sendSMSmessage:myMessage image:imageData];
    }
}];

Next, in the sendSMSmessage: method, the following code is called:
MFMessageComposeViewController *myText = [[MFMessageComposeViewController alloc] init];
[myText setMessageComposeDelegate:self];
[myText setBody:myMessage];
[myText addAttachmentData:image typeIdentifier:@"image/jpeg" filename:@"image.jpeg"];
Then I present the MFMessageComposeViewController myText:
[self presentViewController:myText animated:YES completion:nil];
Then I tap Send. The message sends successfully from the app, and I can see the picture in the MFMessageComposeViewController. But when I look in iMessage, some of the pictures I sent are fine while others show a question mark. Did the images get corrupted somewhere in the process? I tried compressing the image with the following:
CGFloat compressionQuality = 0.3;
NSData *imageData = [NSData dataWithData:UIImageJPEGRepresentation([UIImage imageWithData:[AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer]], compressionQuality)];
but it still behaves the same. Out of 10 tries, around 4 images fail. Could it be a problem with the iPhone or with my app itself? Thanks!

I don't know if other people experience this, but here is what I did to fix it:
I declared imageData as a strong reference:
__strong NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
Then I changed the typeIdentifier passed to the myText composer to @"public.jpeg" (a UTI) instead of @"image/jpeg" (a MIME type):
[myText addAttachmentData:image typeIdentifier:@"public.jpeg" filename:@"image.jpeg"];

Related

I am using UIActivityViewController, but it rotates images when I share them on Facebook, Email, etc.

I am using UIActivityViewController to share images. When an image is shared to Facebook or email, it gets rotated. Why is that?
My code is as follows (shareArr contains the image objects):
ActivityViewCustomActivity *aVCA = [[ActivityViewCustomActivity alloc]init];
UIActivityViewController *controller = [[UIActivityViewController alloc] initWithActivityItems:shareArr applicationActivities:[NSArray arrayWithObject:aVCA]];
[controller setCompletionHandler:^(NSString *activityType, BOOL completed) { }];
[self presentViewController:controller animated:YES completion:nil];
Any ideas?
I know this was asked a while ago, but I ran into the same problem and here is my solution.
I found this page: bug in UIActivityViewController
Here is the paragraph from that page that I found useful:
Notes: It seems when the system converts the UIImage into an NSData for the MFMailComposeViewController (and other UIActivities) it does NOT preserve the orientation information. This is why when an NSData is passed in, it is rotated correctly as the system doesn't have to convert it.
So here is part of the code I was using in my case (taking a photo),
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
Then the Facebook activity from the UIActivityViewController was taking the UIImage and converting it back into NSData. So what I did was take the NSData *imageData and pass it directly into my UIActivityViewController.
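A minimal sketch of that change, assuming imageData is the JPEG NSData from the capture code above:

// Hand the original JPEG NSData to the activity controller so the system
// never re-encodes the UIImage and loses its orientation metadata.
NSArray *activityItems = @[imageData];
UIActivityViewController *controller = [[UIActivityViewController alloc] initWithActivityItems:activityItems applicationActivities:nil];
[self presentViewController:controller animated:YES completion:nil];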
Hope this helps someone.

Send NSData via Airdrop

I've got a big problem (big for me, anyway).
I want to send a .vcf file via AirDrop from my own app to another iOS device. I have an NSData object, which I need to turn into a .vcf file and then send via AirDrop to the other device.
The NSData object works fine, and I can send the .vcf file by email, but AirDrop is where I hit my limit.
I've tried everything I found here in the forum and on developer.apple.com, but nothing works. I think the reason is that I have no idea where to even start fixing the problem.
Does anybody have an idea how I can make this work?
THANKS
I believe this is roughly what you are looking for:
NSString *contactName = nil; // name of person in vCard
NSData *vcfData = nil;       // vCard data
NSURL *fileURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.vcf", contactName]]];
NSError *writeError;
if ([vcfData writeToURL:fileURL options:NSDataWritingAtomic error:&writeError]) {
    NSArray *activityItems = @[fileURL];
    UIActivityViewController *avc = [[UIActivityViewController alloc] initWithActivityItems:activityItems applicationActivities:nil];
    [self presentViewController:avc animated:YES completion:nil];
} else {
    // failed, handle writeError
}
If you still want to provide NSData to some of the activities, you will have to create objects that conform to the UIActivityItemSource protocol and have some of them return nil where appropriate (see this SO answer for more details on that). You might also find Apple's AirDrop sample code project helpful.
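A rough sketch of such an item source (the class name, properties, and the AirDrop check are illustrative assumptions, not from the answer):

@interface VCardItemSource : NSObject <UIActivityItemSource>
@property (nonatomic, strong) NSURL *fileURL;   // temporary .vcf file on disk
@property (nonatomic, strong) NSData *vcfData;  // raw vCard bytes
@end

@implementation VCardItemSource

- (id)activityViewControllerPlaceholderItem:(UIActivityViewController *)activityViewController {
    return self.vcfData;
}

- (id)activityViewController:(UIActivityViewController *)activityViewController itemForActivityType:(NSString *)activityType {
    if ([activityType isEqualToString:UIActivityTypeAirDrop]) {
        return self.fileURL;  // AirDrop wants something it can transfer as a file
    }
    return self.vcfData;      // mail and other activities can take the raw data
}

@end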

iOS awkward image loading (URL to NSData to UIImage to UIImageView)

I need you to humor me during this question. This is somewhat pseudo code because the actual situation is quite complex. I wouldn't load an image this way unless I needed to. Assume that I need to.
NSURL *bgImageURL = [NSURL URLWithString:@"https://www.google.com/images/srpr/logo3w.png"];
UIImage *img = [UIImage imageWithData:[NSData dataWithContentsOfURL:bgImageURL]];
[self.anIBOutletOfUIImageView setImage:img];
but I crash out with
-[__NSCFData _isResizable]: unrecognized selector sent to instance 0x9508c70
How can I load an image from a URL into NSData and then load that NSData into a UIImage and set that UIImage as the image for my UIImageView?
Again, I realize this sounds like nonsense, but due to an image caching system I'm using I have to do things this way :(
How I usually handle this situation (not compiled, not tested):
NSURL *url = [NSURL URLWithString:@"https://www.google.com/images/srpr/logo3w.png"];
NSURLRequest *request = [NSURLRequest requestWithURL:url];
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue currentQueue]
                       completionHandler:^(NSURLResponse *resp, NSData *data, NSError *error) {
    // No error handling - check `error` if you want to
    UIImage *img = [UIImage imageWithData:data];
    [self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:img waitUntilDone:YES];
}];
This avoids the long-running network request implied by the call to dataWithContentsOfURL:, so your app can continue doing things on the main thread while the data downloads for the image.
As a side note, you seem to be running into the same error as this question; you might want to check that you're not running into object allocation problems.
I plugged this code into a new iPad "Single View Application" template:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    NSURL *bgImageURL = [NSURL URLWithString:@"https://www.google.com/images/srpr/logo3w.png"];
    NSData *bgImageData = [NSData dataWithContentsOfURL:bgImageURL];
    UIImage *img = [UIImage imageWithData:bgImageData];
    [[self Imageview] setImage:img];
}
I got the image to load up properly. I even tried different content modes for the UIImageView, and they all worked as expected.
If you start from scratch, do you get the same problem?
The error message indicates that an "_isResizable" message is being sent to an NSData object. Maybe you're inadvertently setting the UIImageView's image property to the NSData instead of the UIImage.
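For illustration only (the faulty line is my guess at the kind of mix-up that produces that error, not code from the question):

// Wrong: assigns the NSData itself, which later gets -_isResizable called on it.
// self.anIBOutletOfUIImageView.image = (UIImage *)[NSData dataWithContentsOfURL:bgImageURL];

// Right: wrap the data in a UIImage first.
self.anIBOutletOfUIImageView.image = [UIImage imageWithData:[NSData dataWithContentsOfURL:bgImageURL]];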
Edit/Update
I even went back and tried different versions of "one-lining" the code to see if any of that mattered, since my "working" copy was slightly altered from your non-working copy. I couldn't get the code to break. My guess would be that the code in question isn't broken, but something nearby is...

MessageUI framework - image doesn't show in compose view on iPhone

I am using the MessageUI framework to send an image via e-mail after taking the photo with UIImagePickerController. When I take the photo and then invoke the mail message interface, I get the compose window. When testing on an iPod touch (iOS 4.3.5) and an iPad (iOS 5.0.1), I see the image attachment in the body of the compose window. When testing on an iPhone 4S (iOS 5.0.1), the image does not appear in the compose window; instead I see a box the size of the image attachment with a small blue box containing a "?" embedded in it.
In both cases, once the mail message is sent, the image appears correctly in the received message in the Mail app, on both iOS devices and the Mac.
What can I do to fix this?
UPDATE: I've worked around this by changing:
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(imageToSend)]
to:
NSData *imageDataJPG = [NSData dataWithData:UIImageJPEGRepresentation(imageToSend, 1.0)];
I can't see in the UIKit docs that there is anything in UIImagePNGRepresentation that would not work on an iPhone ...
(Xcode 4.2.1, ARC, 5.0 SDK, Deploy target 4.3)
Here is the code for composing the message:
-(void)displayComposerSheet
{
    MFMailComposeViewController *mailPicker = [[MFMailComposeViewController alloc] init];
    mailPicker.mailComposeDelegate = self;
    [mailPicker setSubject:@"Photo"];
    NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(imageToSend)];
    [mailPicker addAttachmentData:imageData mimeType:@"image/png" fileName:@"Photo"];
    // Fill out the email body text.
    NSString *line1 = [NSString stringWithFormat:@"I took this photo on my %@.\n", [[UIDevice currentDevice] model]];
    NSString *body = [NSString stringWithFormat:@"%@", line1];
    [mailPicker setMessageBody:body isHTML:NO];
    // Present the mail composition interface.
    [self presentModalViewController:mailPicker animated:YES];
}
The image size is restricted, so if the image you are sending is larger than a certain dimension you'll get the effect you describe above.
I had a look at my own code and saw I had
#define MAX_MAIL_DIM 1536
Which seems to be 1024 * 1.5. Sorry I can't remember how I arrived at that number but I suspect it was trial and error.
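For what it's worth, here is a rough sketch of the kind of downscaling helper that limit implies (the helper itself is mine, not from the answer):

#define MAX_MAIL_DIM 1536

// Scale the longest side down to MAX_MAIL_DIM before attaching to the mail.
static UIImage *ScaledImageForMail(UIImage *image) {
    CGFloat longest = MAX(image.size.width, image.size.height);
    if (longest <= MAX_MAIL_DIM) {
        return image;
    }
    CGFloat scale = MAX_MAIL_DIM / longest;
    CGSize newSize = CGSizeMake(image.size.width * scale, image.size.height * scale);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}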
larik, your suggestion about using JPEG for the data type worked great. PNG files at this size are way too big anyway -- around 10MB. Here's the code with the JPEG NSData:
if ([MFMailComposeViewController canSendMail]) {
    picker = [[MFMailComposeViewController alloc] init];
    [picker setMailComposeDelegate:self];
    [picker setSubject:@"My Picture"];
    NSString *emailBody = @"";
    [picker setMessageBody:emailBody isHTML:YES];
    NSData *data = UIImageJPEGRepresentation(tempImage, 0);
    [picker addAttachmentData:data mimeType:@"image/jpg" fileName:@"CameraImage"];
    // Present the composer as in the question above.
    [self presentModalViewController:picker animated:YES];
}

AVCaptureOutput takes dark picture even with flash on

I have put together an implementation using AVFoundation and ImageIO to handle photo taking in my application. However, I have an issue with it: the images I take are always dark, even if the flash goes off. Here's the code I use:
[[self currentCaptureOutput] captureStillImageAsynchronouslyFromConnection:[[self currentCaptureOutput].connections lastObject]
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    [[[blockSelf currentPreviewLayer] session] stopRunning];
    if (!error) {
        NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)data, NULL);
        if (source) {
            UIImage *image = [blockSelf imageWithSource:source];
            [blockSelf updateWithCapturedImage:image];
            CFRelease(source);
        }
    }
}];
Is there anything there that could cause the image taken to not include the flash?
I found I sometimes got dark images if the AVCaptureSession was set up immediately before this call. Perhaps it takes a while for the auto-exposure & white balance settings to adjust themselves.
The solution was to set up the AVCaptureSession, then wait until the AVCaptureDevice's adjustingExposure and adjustingWhiteBalance properties are both NO (observe these with KVO) before calling -[AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection: completionHandler:].
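A rough sketch of that KVO approach (the self.captureDevice property and the -captureStillImage and -sessionDidStart methods are assumptions for illustration, not from the answer):

- (void)sessionDidStart {
    // Register for changes after setting up the AVCaptureSession.
    [self.captureDevice addObserver:self forKeyPath:@"adjustingExposure" options:0 context:NULL];
    [self.captureDevice addObserver:self forKeyPath:@"adjustingWhiteBalance" options:0 context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (!self.captureDevice.adjustingExposure && !self.captureDevice.adjustingWhiteBalance) {
        // Exposure and white balance have settled; capture now so the
        // image is correctly exposed even when the flash fires.
        [self captureStillImage];
    }
}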
