UIWebView: How to resize a taken photo (iOS)

My application is a webView displaying my website. The website has an input tag for uploading a picture. On a tablet this input tag is great because you can:
upload an existing photo
take an instant photo and upload it
The second option is the one that interests me. The problem is that the photo is never resized, so the picture sent to the server is 1 MB (its actual size).
Before the picture is sent to my server, I need my application to do one of the following:
option A: after the photo is taken, offer options to resize it
option B: automatically take a smaller photo (already resized)
option C: do you have any other ideas?
Is it possible to do something to resolve this problem? I am developing in Swift.
Thank you
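For what it's worth, if you can obtain the UIImage on the native side (for example by presenting your own picker instead of relying on the web view's input), a minimal downscaling sketch looks like this; the 1024-point maximum dimension is an arbitrary assumption:

import UIKit

// Downscale a UIImage before uploading it. maxDimension is an assumed
// target; tune it to whatever size your server expects.
func resized(_ image: UIImage, maxDimension: CGFloat = 1024) -> UIImage {
    let scale = min(1, maxDimension / max(image.size.width, image.size.height))
    let newSize = CGSize(width: image.size.width * scale,
                         height: image.size.height * scale)
    return UIGraphicsImageRenderer(size: newSize).image { _ in
        image.draw(in: CGRect(origin: .zero, size: newSize))
    }
}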

Related

Is there a possibility to save a video to the gallery when the camera is called with a <input type="file">?

I'm trying to build a hybrid iPhone app that makes use of the HTML <input type="file"> tag for video uploads.
The camera opens when the input is clicked; users can record video, and the upload works fine.
I'm quite new to Swift and iOS development, hence my question:
How can I access the video inside the native part of the app in order to additionally store it in the phone's gallery? Is this possible?
You should use accept="video/mp4,video/x-m4v,video/*"
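If the native side can get a local file URL for the recorded video (how to extract it from the web view's input is a separate problem), a minimal sketch for saving it to the photo library with the Photos framework might look like this; saveVideoToGallery and its url parameter are illustrative names:

import Photos

// Save a video file to the user's photo library.
// Requires the NSPhotoLibraryUsageDescription key in Info.plist.
func saveVideoToGallery(at url: URL) {
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        PHPhotoLibrary.shared().performChanges({
            _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
        }) { success, error in
            print(success ? "Saved to gallery" : "Save failed: \(String(describing: error))")
        }
    }
}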

Apple Live Photo file format

Apple will introduce Live Photo in iOS 9/iPhone 6s. Where is the file format documented?
A live photo has two resources. They are tied together with an asset identifier (a UUID as a string).
A JPEG; this must have a metadata entry for kCGImagePropertyMakerAppleDictionary with [17 : assetIdentifier] (17 is the Apple Maker Note Asset Identifier key).
A Quicktime MOV encoded with H.264 at the appropriate framerate (12-15fps) and size (1080p). This MOV must have:
Top-level Quicktime Metadata entry for ["com.apple.quicktime.content.identifier" : assetIdentifier]. If using AVAsset you can get this from asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata)
Timed Metadata track with ["com.apple.quicktime.still-image-time" : 0xFF]; The actual still image time matches up to the presentation timestamp for this metadata item. The payload seems to just be a single 0xFF byte (aka -1) and can be ignored. If using an AVAssetReader you can use CMSampleBufferGetOutputPresentationTimeStamp to get this time.
The assetIdentifier is what ties the two items together and the timed metadata track is what tells the system where the still image sits in the movie timeline.
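As a small illustration of the identifier plumbing, a sketch that reads the content identifier back out of the MOV with AVFoundation (modern Swift syntax rather than the Swift 2 call quoted above; movURL is a hypothetical local URL for the video resource):

import AVFoundation

// Read com.apple.quicktime.content.identifier from a Live Photo's MOV.
func contentIdentifier(of movURL: URL) -> String? {
    let asset = AVURLAsset(url: movURL)
    let items = asset.metadata(forFormat: .quickTimeMetadata)
    return items.first { $0.identifier == .quickTimeMetadataContentIdentifier }?.stringValue
}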
Here's the link. Otherwise, here's the text:
Live Photos
Live Photos is a new feature of iOS 9 that allows users to capture and
relive their favorite moments with richer context than traditional
photos. When the user presses the shutter button, the Camera app
captures much more content along with the regular photo, including
audio and additional frames before and after the photo. When browsing
through these photos, users can interact with them and play back all
the captured content, making the photos come to life.
iOS 9.1 introduces APIs that allow apps to incorporate playback of
Live Photos, as well as export the data for sharing. There is new
support in the Photos framework to fetch a PHLivePhoto object from the
PHImageManager object, which is used to represent all the data that
comprises a Live Photo. You can use a PHLivePhotoView object (defined
in the PhotosUI framework) to display the contents of a Live Photo.
The PHLivePhotoView view takes care of displaying the image, handling
all user interaction, and applying the visual treatments to play back
the content.
You can also use PHAssetResource to access the data of a PHLivePhoto
object for sharing purposes. You can request a PHLivePhoto object for
an asset in the user’s photo library by using PHImageManager or
UIImagePickerController. If you have a sharing extension, you can also
get PHLivePhoto objects by using NSItemProvider. On the receiving side
of a share, you can recreate a PHLivePhoto object from the set of
files originally exported by the sender.
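Putting the fetch-and-display part of that into code, a minimal sketch (asset is assumed to be a PHAsset you fetched beforehand, and the target size is arbitrary):

import Photos
import PhotosUI

// Request a PHLivePhoto for a library asset and hand it to a PHLivePhotoView.
func showLivePhoto(for asset: PHAsset, in livePhotoView: PHLivePhotoView) {
    let options = PHLivePhotoRequestOptions()
    options.deliveryMode = .highQualityFormat
    PHImageManager.default().requestLivePhoto(for: asset,
                                              targetSize: CGSize(width: 1080, height: 1080),
                                              contentMode: .aspectFit,
                                              options: options) { livePhoto, _ in
        livePhotoView.livePhoto = livePhoto
    }
}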
Guidelines for Displaying Live Photos
It’s important to remember that a Live Photo is still a photo. If you have to display a Live Photo in
an environment that doesn’t support PHLivePhotoView, it’s recommended
that you present it as a regular photo.
Don’t display the extra frames and audio of a Live Photo separately.
It's important that the content of the Live Photo be presented in a
consistent way that uses the same visual treatment and interaction
model in all apps.
It’s recommended that you identify a photo as a Live Photo by placing
the badge provided by the PHLivePhotoView class method
livePhotoBadgeImageWithOptions:PHLivePhotoBadgeOptionsOverContent in
the top-left corner of the photo.
Note that there is no support for providing the visual effect that
users experience as they swipe through photos in the Photos app.
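In Swift, that badge recommendation comes down to a few lines (stillImageView is a hypothetical image view showing the photo):

import PhotosUI
import UIKit

// Overlay the standard Live Photo badge in the top-left corner of a photo view.
func addLivePhotoBadge(to stillImageView: UIImageView) {
    let badge = PHLivePhotoView.livePhotoBadgeImage(options: .overContent)
    let badgeView = UIImageView(image: badge)
    badgeView.frame.origin = .zero // top-left corner
    stillImageView.addSubview(badgeView)
}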
Guidelines for Sharing Live Photos
The data of a Live Photo is
exported as a set of files in a PHAssetResource object. The set of
files must be preserved as a unit when you upload them to a server.
When you rebuild a PHLivePhoto with these files on the receiver side,
the files are validated; loading fails if the files don’t come from
the same asset.
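A sketch of both halves of that round trip, using PHAssetResource on the sending side and PHLivePhoto on the receiving side (photoURL and videoURL are hypothetical local copies of the exported pair):

import Photos

// Sender: enumerate the resources (paired photo + video) that must be
// uploaded together as a unit.
func listResources(of asset: PHAsset) {
    for resource in PHAssetResource.assetResources(for: asset) {
        print(resource.type.rawValue, resource.originalFilename)
    }
}

// Receiver: rebuild a PHLivePhoto from the exported files. Validation
// fails if the files don't come from the same asset.
func rebuildLivePhoto(photoURL: URL, videoURL: URL,
                      completion: @escaping (PHLivePhoto?) -> Void) {
    PHLivePhoto.request(withResourceFileURLs: [photoURL, videoURL],
                        placeholderImage: nil,
                        targetSize: .zero,
                        contentMode: .aspectFit) { livePhoto, _ in
        completion(livePhoto)
    }
}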
If your app lets users apply effects or adjustments to a photo before
sharing it, be sure to apply the same adjustments to all frames of the
Live Photo. Alternatively, if you don’t support adjusting the entire
contents of a Live Photo, share it as a regular photo and show an
appropriate indication to the user.
If your app has UI for picking photos to share, you should let users
play back the entire contents so they know exactly what they are
sharing. When selecting photos to share in your app, users should also
be able to turn a Live Photo off, so they can post it as a traditional
photo.
Outside of the documentation, Live Photos are made up of two resources: an image and a MOV (a QuickTime movie file). So every Live Photo has two 'actual' files connected by the wrapper of the Live Photo type.
A Live Photo is actually two files: the original JPEG image and a full-HD video.
Uniform Type Identifier (UTI) for the format is kUTTypeLivePhoto / com.apple.live-photo
@available(OSX 10.12, *)
public let kUTTypeLivePhoto: CFString

/*
 *  kUTTypeLivePhoto
 *
 *  Live Photo
 *
 *  UTI: com.apple.live-photo
 */
Some additional info about Live Photos:
agreed, the video part has a .mov file extension
it is saved in the directory /var/mobile/Media/DCIM/100APPLE/ alongside the JPG version of the photo
Live Photos can be played even on a device without 3D Touch (I can play them on my iPad 2017 by long-pressing the photo)
they can even be played on old phones (such as the iPhone 5) on iOS 8 if you install the PhotosLive tweak

How to open album list like photos native application?

I saw this in the "Viber" application: there is a media sharing button which opens all the albums, just like in the native Photos application. Can we do this? I found that UIImagePickerController cannot show both simultaneously. How can I show both the photo gallery and the video gallery at the same time?
You're right about UIImagePickerController; there is a great guide and a GitHub project for picking media.
If you want to share your media, here are some cool examples:
Check this tutorial on how to share pictures via email.
If you want to share via mail:
First of all you have to create an NSData object from your PNG/JPEG image data, and then send addAttachmentData:mimeType:filename: to your MFMailComposeViewController instance.
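A brief sketch of that attachment step in Swift (the file name and JPEG quality are arbitrary; remember to set mailComposeDelegate so the controller can be dismissed):

import MessageUI
import UIKit

// Attach a JPEG to a mail composer and present it.
func presentMailComposer(with image: UIImage, from viewController: UIViewController) {
    guard MFMailComposeViewController.canSendMail(),
          let data = image.jpegData(compressionQuality: 0.8) else { return }
    let composer = MFMailComposeViewController()
    // composer.mailComposeDelegate = ... (an MFMailComposeViewControllerDelegate)
    composer.addAttachmentData(data, mimeType: "image/jpeg", fileName: "photo.jpg")
    viewController.present(composer, animated: true)
}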
Also, I think the best thing to use in your case for sharing stuff like text, pictures, etc. is ShareKit.

Can I attach an image to an FBDialog?

I'm writing an iOS app that produces an image, and we'd like the user to be able to post that image to their Facebook wall using the standard facebook dialog.
The Facebook iOS SDK provides a dialog for posting to a user's wall, using the dialog:withParams:andDelegate: method, and you can provide the URL of a picture in the params argument. However, that's no good to me - I want to send off an image that exists only on the phone.
The SDK also provides a way to post an image to a user's wall, using the requestWithGraphPath:andParams:andHttpMethod:andDelegate: method. But that's no good to me either - I want to use the standard dialog.
Is there a way for me to send the image data with an FBDialog?
You need to do the photo upload first, with a Graph API call to '/me/photos' - the photo will go into an album that your app has to create; it cannot go into an existing album unless your app created it.
That call returns the photo id, which you can then use to hit the Graph API again for that specific photo: 'https://graph.facebook.com/RETURNED_PHOTO_ID'
This returns name, picture, source, link (and a bunch of other stuff) - you can then use picture (which is a thumbnail) as the picture in your wall post and add the URL to source (the big image), or use the link to the photo in the album (link) inside your message text.
Hope this helps,
-J

iPhone taking a picture and posting it to a database

I'm currently adding some photo-taking functionality to my application.
I'm curious about something, though. After the photo is taken, this code holds it as a piece of data:
NSData *imageData = UIImagePNGRepresentation(saveImage);
What I need to know is how the picture, after being taken by the user, would get sent to the web via a PHP script. The script has been written, but on the iPhone end, what would I have to do programmatically to ensure it gets sent to the database in the right format for viewing? I have the "POST" NSURL request set up, which I've used many times for posting strings/numbers, but I don't know if this is different for photo data.
Open Google, search for 'cocoa upload photo iphone', see first result:
How can I upload a photo to a server with the iPhone?
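For reference, a minimal multipart/form-data upload in Swift; the endpoint URL and the "photo" field name are assumptions that have to match your PHP script:

import UIKit

// POST a JPEG to a server endpoint as multipart/form-data.
func upload(image: UIImage, to url: URL) {
    guard let imageData = image.jpegData(compressionQuality: 0.8) else { return }
    let boundary = UUID().uuidString
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("multipart/form-data; boundary=\(boundary)",
                     forHTTPHeaderField: "Content-Type")

    var body = Data()
    body.append("--\(boundary)\r\n".data(using: .utf8)!)
    body.append("Content-Disposition: form-data; name=\"photo\"; filename=\"photo.jpg\"\r\n".data(using: .utf8)!)
    body.append("Content-Type: image/jpeg\r\n\r\n".data(using: .utf8)!)
    body.append(imageData)
    body.append("\r\n--\(boundary)--\r\n".data(using: .utf8)!)
    request.httpBody = body

    URLSession.shared.dataTask(with: request) { _, response, error in
        // inspect response / error here
    }.resume()
}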
