Taking photos and video with CameraUI: MediaPromise.file is null on iOS

I'm developing an AIR app that asks the user to take a photo or video and upload it to a server.
Everything was OK with Android, but I found out that MediaPromise.file was null on iOS.
I don't need/want to manipulate the image/video at all, I just need to get this File/FileReference and upload it to the server.
My problem is that, as iOS doesn't give me a File from the MediaPromise, I don't have anything to upload.
Googling around I found out how to asynchronously load the data from the promise (with Loader.loadFilePromise()), but I don't know how to encode it as a video when I call camera.launch(MediaType.VIDEO); (where camera is a CameraUI instance).
I'm handling the camera.launch(MediaType.IMAGE); call just fine already, but I really want to get the file directly from the OS, if it is possible, so I don't need to encode it myself.
To sum it up, I need to upload the photo/video the user just took with the CameraUI API on iOS.
How can I do this?
note: I can provide some code if needed...

This seems to be what you're looking for: AIR for iOS - Taking & Uploading Photos

Related

PHObject localIdentifier reliability

Are there ever circumstances where the localIdentifier could change or not be accurate? I'm working on an application that backs up photos and have been told by my colleagues that the localIdentifier can't be trusted. However, after doing some research I have been unable to find anyone talking about this.
LocalIdentifiers sometimes change after updating the iOS version.
I've seen PHAsset.localIdentifier values (Photos API) change after iOS updates twice in my own apps. The last time was after updating to iOS 11. The app in question kept track of localIdentifiers so specific images could be found/sorted according to various predicates.
As soon as the update from iOS 10 to 11 finished, all the localIdentifiers saved locally by the app became useless: they no longer matched any device images, although those images still existed in the camera roll.
Like you I've searched for info on this subject without success. Sorry that my reply is bad news.
Use PHCloudIdentifier everywhere. You can use myCloudIdentifier.stringValue if you need it for data storage purposes.
Use the cloudIdentifiers(forLocalIdentifiers:) method to convert from local to cloud, and the sister method, localIdentifiers(for: [PHCloudIdentifier]), to convert back again; a sketch follows.
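For illustration only, a minimal sketch of the round trip, assuming the iOS 15+ mapping variants on PHPhotoLibrary (cloudIdentifierMappings(forLocalIdentifiers:) and localIdentifierMappings(for:)); exact method names and availability depend on your deployment target, and photo library authorization is assumed to be granted already:

```swift
import Photos

// Convert local identifiers to cloud identifier strings for storage.
func cloudIdentifierStrings(for localIdentifiers: [String]) -> [String] {
    let library = PHPhotoLibrary.shared()
    let mappings = library.cloudIdentifierMappings(forLocalIdentifiers: localIdentifiers)
    return mappings.values.compactMap { result in
        // Each entry is a Result; a failure means no cloud identifier is available.
        try? result.get().stringValue
    }
}

// Convert stored cloud identifier strings back to current local identifiers.
func localIdentifiers(forStored cloudStrings: [String]) -> [String] {
    let library = PHPhotoLibrary.shared()
    let cloudIdentifiers = cloudStrings.map { PHCloudIdentifier(stringValue: $0) }
    let mappings = library.localIdentifierMappings(for: cloudIdentifiers)
    return mappings.values.compactMap { try? $0.get() }
}
```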

Record video of the screen using an App on iOS

I want to build an app that allows me to record what I am doing on my iPhone, like a mirrored broadcast, but without using external apps or connecting it to a Mac.
Is it possible?
If you are talking about on-screen recording, then
check this link; you will get a working program.
What this does is it saves the contents of a UIView to a UIImage. The author suggests you can save a video of the app in use by passing the frames through AVCaptureSession.
I believe it hasn't been tested with an OpenGL subview, but assuming that it works you might be able to modify it slightly to include audio and then you'd be set.
Refer to the comments section in the link to see how to mix the audio and video and save the result as a QuickTime movie.
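As a rough illustration of the view-to-image capture step that approach relies on (this is not the linked author's code; captureFrame is a name I made up, and feeding the frames into AVCaptureSession/AVAssetWriter is a separate step):

```swift
import UIKit

// Render the current contents of a UIView into a UIImage.
// This is the per-frame capture step; writing the frames out as a
// movie (e.g. with AVAssetWriter) is a separate concern.
func captureFrame(of view: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    return renderer.image { context in
        // Draws the view's layer hierarchy into the current context.
        view.layer.render(in: context.cgContext)
    }
}
```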
Also,
Another useful project for this is RecordMyScreen, found here
On a Mac, search for QuickTime Player and open it. Right-click its Dock icon (or use the File menu) and you will find the New Screen Recording option.
Follow the steps it presents, and when you stop the recording you have your video.

Capture a image and know the location of that image from where it was captured in iOS

I am going to design an app for iOS with the following functionality: whenever I capture an image from an iOS device, that image should not be saved to the camera roll, and I need to know the exact location from where the image was captured, that is, street address, country, etc. Any help would be appreciated.
Thanks in advance.
Exif Data is the partially hidden part of a photo file that most people don't think about when they upload a photo to a service like Dropbox or send a photo to a friend. This EXIF data includes the camera model you are using, basic settings of the camera when the picture was taken, the photo resolution and, if your camera has GPS, the location where the photo was taken.
You can extract the EXIF data from the image file (see: Get Exif data from UIImage - UIImagePickerController).
You can also view that data with the Preview app on a Mac: open the image in Preview and show its info to see the GPS data for the file.
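If the image bytes still contain their EXIF block (the answer below notes that the iPhone pipeline often strips it), a hedged Swift sketch of reading the GPS entries with ImageIO might look like this; gpsCoordinate(from:) is a hypothetical helper name:

```swift
import ImageIO
import CoreLocation

// Read the GPS dictionary from raw image data (e.g. a JPEG's bytes).
// Returns nil if the data has no embedded GPS metadata.
func gpsCoordinate(from imageData: Data) -> CLLocationCoordinate2D? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let gps = properties[kCGImagePropertyGPSDictionary] as? [CFString: Any],
          let latitude = gps[kCGImagePropertyGPSLatitude] as? Double,
          let longitude = gps[kCGImagePropertyGPSLongitude] as? Double
    else { return nil }

    // The latitude/longitude refs indicate the hemisphere; flip the sign accordingly.
    let latRef = gps[kCGImagePropertyGPSLatitudeRef] as? String ?? "N"
    let lonRef = gps[kCGImagePropertyGPSLongitudeRef] as? String ?? "E"
    return CLLocationCoordinate2D(
        latitude: latRef == "S" ? -latitude : latitude,
        longitude: lonRef == "W" ? -longitude : longitude
    )
}
```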
It is correct that the iPhone (iPad, etc.; I'll just call it iPhone from now on) strips EXIF data. This is not a bug on the iPhone but actually a feature.
One of the main reasons Android users don't like the iPhone, and iPhone users don't like Android, is that the iPhone is very limited (in terms of freedom to change, alter, etc.). You cannot just run downloaded apps, you have limited access to settings, etc.
This is because Apple's strategy is to create a fail-safe product: "If you cannot do strange things, strange things will not happen." It tries to protect the user in every way imaginable. It also protects the user when uploading images. The EXIF may contain data that can hurt the user's privacy, things like GPS coordinates, but even a timestamp can hurt a user (imagine uploading a beach picture with a timestamp from a moment you reported in sick to your boss).
So basically it is a safety measure to strip all EXIF data. Myself and a lot of other people do not agree with this strategy, but unfortunately there is nothing we can do about it.
The solution
Update: This does not work.
Luckily you can get around this problem. JavaScript comes to the rescue: with JavaScript you can read the EXIF data and send it along with your photo by adding some extra POST data.
Please note: this solution was presented to me by another developer and has not yet been tested.

How can I load images received from iMessage? [iOS7]

I am trying to load the images received from a contact in iMessage, but I have not found anything in the iOS SDK to do this.
Load the contents of incoming iMessages.
Load the images received by iMessage.
Does someone know how to help me?
This cannot be done on iOS with the iOS SDK. Due to privacy concerns, Apple does not allow access to any messages received via the Messages app.
If you're looking to pull images directly from the Messages app, then no, you can't do that. If, however, you're okay with only viewing the images that the user has chosen to save to the camera roll, then yes, you have a few options.
You can use the UIImagePickerController class to allow the user to select photos from their camera roll, or the AssetsLibrary framework to get references to these images without the user selecting them. Both solutions require the user to grant you access to the camera roll, though; a minimal picker sketch follows.
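The original question targets iOS 7, where AssetsLibrary was current; as a modern-Swift sketch of the picker route (PhotoPickerViewController is a hypothetical name, not an API class):

```swift
import UIKit

// A hypothetical view controller that lets the user pick a photo
// they previously saved to the camera roll (e.g. from iMessage).
class PhotoPickerViewController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func presentPhotoLibrary() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary   // the user chooses the image themselves
        picker.delegate = self
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // The selected image; nil if the user picked a non-image asset.
        let image = info[.originalImage] as? UIImage
        picker.dismiss(animated: true)
        // ... use `image` as needed
        _ = image
    }
}
```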

What is the equivilent to a ContentObserver in iOS?

I use a ContentObserver in my android application to receive a notification whenever a photo is taken. Obviously iOS doesn't use the intent system, so is there an equivalent or alternative way to do this? I would prefer not to write a full camera application if possible.
This is not quite possible. Even the latest Google+ application on iOS, with its Instant Upload feature, does not get notified when photos are taken; it simply checks the asset library while the application is running, and then for as long as possible while running in the background before being timed out by the OS.
The following is from the Google+ help on the matter:
Note: Photos and videos will upload while the Google+ application is
open and for a brief period of time afterwards.
HTH
You cannot run in the background and get notified when a picture is taken and saved to the Camera Roll by another app on iOS.
That said, you don't really need to write a whole camera app to be able to let the user take pictures from within your app, or to access Camera Roll from within your running app.
Take a look at UIImagePickerController. It is really easy to let the user select or take a photo on iOS; see the sketch below.
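A minimal, hedged sketch of that approach (presentCamera is a hypothetical helper; the delegate callbacks are the same as for the library picker shown above):

```swift
import UIKit

// Hypothetical helper: let the user take a photo from inside your app,
// without writing a full camera application.
func presentCamera(from viewController: UIViewController & UIImagePickerControllerDelegate & UINavigationControllerDelegate) {
    // The camera is unavailable in the simulator and on devices without one.
    guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
    let picker = UIImagePickerController()
    picker.sourceType = .camera
    picker.delegate = viewController
    viewController.present(picker, animated: true)
}
```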
There is no equivalent to ContentObserver in iOS.
