How to access the iPad camera using Flex? - ios

My application needs to provide a button that turns on the iPad camera; once the picture is taken, the app should be able to retrieve and display it. I know I can use the AS3 CameraRoll class, but it only gives access to the camera roll, not the camera itself.

You should find your question answered by Jason Sturges here: take photo using Adobe Builder (flex) for iOS.
He points you to the XpenseIt tutorial (a Flex mobile project FXP download). Download the final .fxp file and add it to your workspace, and you'll find good resources to adapt to your needs.
Another resource is provided by Adobe blog contributor Christian Cantrell:
http://blogs.adobe.com/cantrell/archives/2011/02/how-to-use-cameraui-in-a-cross-platform-way.html

Related

How can I upload video to Twitter using iOS

I have a local video link, and when the user presses a button I want to show the Twitter app with my video and a title there (without using UIActivityViewController), just by pressing the button.
You have to choose which way you want to go to upload a video, i.e. to post a tweet with just the title you've mentioned.
Option 1
Use the HTTP API and create a URLRequest that contains the video as an 'attachment'. You have to handle authentication first, or the request will most likely fail.
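A minimal sketch of Option 1 in Swift, assuming Twitter's v1.1 media upload endpoint; the oauthHeader (a pre-signed OAuth 1.0a header) and videoData parameters are placeholders built elsewhere:

import Foundation

// Sketch: a single multipart POST to Twitter's media endpoint. Note that for
// video Twitter generally requires the chunked flow (INIT/APPEND/FINALIZE);
// this shows only the simplest request shape.
func uploadVideoToTwitter(videoData: Data, oauthHeader: String) {
    var request = URLRequest(url: URL(string: "https://upload.twitter.com/1.1/media/upload.json")!)
    request.httpMethod = "POST"
    request.setValue(oauthHeader, forHTTPHeaderField: "Authorization") // pre-signed elsewhere
    let boundary = UUID().uuidString
    request.setValue("multipart/form-data; boundary=\(boundary)",
                     forHTTPHeaderField: "Content-Type")
    var body = Data()
    body.append("--\(boundary)\r\nContent-Disposition: form-data; name=\"media\"; filename=\"video.mp4\"\r\nContent-Type: video/mp4\r\n\r\n".data(using: .utf8)!)
    body.append(videoData)
    body.append("\r\n--\(boundary)--\r\n".data(using: .utf8)!)
    request.httpBody = body
    URLSession.shared.dataTask(with: request) { data, _, error in
        // On success the response JSON contains a media_id that can then be
        // attached to a tweet via the statuses/update endpoint.
    }.resume()
}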
Option 2 (Not really an option anymore)
Another way would be the iOS SDK provided, as given in the other example here. Problem is: there is no longer an official Twitter SDK for iOS. You could try your luck with the archived open-source SDK, but, being archived, it is no longer maintained.

Apple Live Photo file format

Apple will introduce Live Photos in iOS 9 / the iPhone 6s. Where is the file format documented?
A Live Photo has two resources. They are tied together with an asset identifier (a UUID as a string).
A JPEG; this must have a metadata entry for kCGImagePropertyMakerAppleDictionary with [17 : assetIdentifier] (17 is the Apple Maker Note Asset Identifier key).
A QuickTime MOV encoded with H.264 at the appropriate frame rate (12-15 fps) and size (1080p). This MOV must have:
A top-level QuickTime metadata entry for ["com.apple.quicktime.content.identifier" : assetIdentifier]. If using AVAsset, you can get this from asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata).
A timed metadata track with ["com.apple.quicktime.still-image-time" : 0xFF]. The actual still-image time matches the presentation timestamp of this metadata item; the payload appears to be a single 0xFF byte (i.e. -1) and can be ignored. If using an AVAssetReader, you can use CMSampleBufferGetOutputPresentationTimeStamp to get this time.
The assetIdentifier is what ties the two items together, and the timed metadata track is what tells the system where the still image sits in the movie timeline.
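As a quick check in Swift, you can read that content identifier off the MOV with AVFoundation (a sketch; the URL is whatever local file you have):

import AVFoundation

// Read the content identifier that pairs a Live Photo's MOV with its JPEG.
func contentIdentifier(ofMovieAt url: URL) -> String? {
    let asset = AVURLAsset(url: url)
    let items = asset.metadata(forFormat: .quickTimeMetadata)
    let matches = AVMetadataItem.metadataItems(
        from: items,
        filteredByIdentifier: .quickTimeMetadataContentIdentifier)
    return matches.first?.stringValue
}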
Here's the link. Otherwise, here's the text:
Live Photos
Live Photos is a new feature of iOS 9 that allows users to capture and
relive their favorite moments with richer context than traditional
photos. When the user presses the shutter button, the Camera app
captures much more content along with the regular photo, including
audio and additional frames before and after the photo. When browsing
through these photos, users can interact with them and play back all
the captured content, making the photos come to life.
iOS 9.1 introduces APIs that allow apps to incorporate playback of
Live Photos, as well as export the data for sharing. There is new
support in the Photos framework to fetch a PHLivePhoto object from the
PHImageManager object, which is used to represent all the data that
comprises a Live Photo. You can use a PHLivePhotoView object (defined
in the PhotosUI framework) to display the contents of a Live Photo.
The PHLivePhotoView view takes care of displaying the image, handling
all user interaction, and applying the visual treatments to play back
the content.
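A minimal sketch of that flow, assuming asset is a PHAsset already fetched from the library and view is a host UIView:

import Photos
import PhotosUI

// Ask PHImageManager for the PHLivePhoto, then hand it to a PHLivePhotoView,
// which handles playback and user interaction itself.
func showLivePhoto(for asset: PHAsset, in view: UIView) {
    let options = PHLivePhotoRequestOptions()
    options.deliveryMode = .highQualityFormat
    PHImageManager.default().requestLivePhoto(
        for: asset,
        targetSize: CGSize(width: 1080, height: 1080),
        contentMode: .aspectFit,
        options: options
    ) { livePhoto, _ in
        guard let livePhoto = livePhoto else { return }
        let photoView = PHLivePhotoView(frame: view.bounds)
        photoView.livePhoto = livePhoto
        view.addSubview(photoView)
    }
}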
You can also use PHAssetResource to access the data of a PHLivePhoto
object for sharing purposes. You can request a PHLivePhoto object for
an asset in the user’s photo library by using PHImageManager or
UIImagePickerController. If you have a sharing extension, you can also
get PHLivePhoto objects by using NSItemProvider. On the receiving side
of a share, you can recreate a PHLivePhoto object from the set of
files originally exported by the sender.
Guidelines for Displaying Live Photos
It’s important to remember that a Live Photo is still a photo. If you have to display a Live Photo in
an environment that doesn’t support PHLivePhotoView, it’s recommended
that you present it as a regular photo.
Don’t display the extra frames and audio of a Live Photo separately.
It's important that the content of the Live Photo be presented in a
consistent way that uses the same visual treatment and interaction
model in all apps.
It’s recommended that you identify a photo as a Live Photo by placing
the badge provided by the PHLivePhotoView class method
livePhotoBadgeImageWithOptions:PHLivePhotoBadgeOptionsOverContent in
the top-left corner of the photo.
Note that there is no support for providing the visual effect that
users experience as they swipe through photos in the Photos app.
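A sketch of the badge placement described above, assuming a PHLivePhotoView named livePhotoView is already on screen:

import PhotosUI
import UIKit

// Overlay the standard Live Photo badge in the photo's top-left corner.
func addBadge(to livePhotoView: PHLivePhotoView) {
    let badge = PHLivePhotoView.livePhotoBadgeImage(options: .overContent)
    let badgeView = UIImageView(image: badge)
    badgeView.frame.origin = CGPoint(x: 8, y: 8) // top-left, per the guideline
    livePhotoView.addSubview(badgeView)
}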
Guidelines for Sharing Live Photos
The data of a Live Photo is
exported as a set of files in a PHAssetResource object. The set of
files must be preserved as a unit when you upload them to a server.
When you rebuild a PHLivePhoto with these files on the receiver side,
the files are validated; loading fails if the files don’t come from
the same asset.
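A sketch of that export, assuming livePhotoAsset is a PHAsset whose mediaSubtypes contains .photoLive:

import Photos

// Export the resources (typically one .photo JPEG and one .pairedVideo MOV)
// so they can be uploaded together as a unit.
func exportResources(of livePhotoAsset: PHAsset) {
    let resources = PHAssetResource.assetResources(for: livePhotoAsset)
    for resource in resources {
        let target = FileManager.default.temporaryDirectory
            .appendingPathComponent(resource.originalFilename)
        PHAssetResourceManager.default().writeData(for: resource, toFile: target,
                                                   options: nil) { error in
            if let error = error { print("Export failed:", error) }
        }
    }
}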
If your app lets users apply effects or adjustments to a photo before
sharing it, be sure to apply the same adjustments to all frames of the
Live Photo. Alternatively, if you don’t support adjusting the entire
contents of a Live Photo, share it as a regular photo and show an
appropriate indication to the user.
If your app has UI for picking photos to share, you should let users
play back the entire contents so they know exactly what they are
sharing. When selecting photos to share in your app, users should also
be able to turn a Live Photo off, so they can post it as a traditional
photo.
Outside of the documentation: Live Photos are made up of two resources, an image and a MOV (a QuickTime movie file). So every Live Photo has two 'actual' files connected by the wrapper of the Live Photo type.
A Live Photo is actually two files: the original JPEG image and a Full HD video.
The Uniform Type Identifier (UTI) for the format is kUTTypeLivePhoto / com.apple.live-photo:
@available(OSX 10.12, *)
public let kUTTypeLivePhoto: CFString
/*
 *  kUTTypeLivePhoto
 *
 *  Live Photo
 *
 *  UTI:  com.apple.live-photo
 */
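As a usage sketch (iOS 9.1+ assumed), this UTI can be included in a UIImagePickerController's media types so the picker returns Live Photos where available:

import UIKit
import MobileCoreServices

// Including the Live Photo UTI alongside the image UTI lets the picker
// deliver a PHLivePhoto via the .livePhoto info key when one is selected.
let picker = UIImagePickerController()
picker.sourceType = .photoLibrary
picker.mediaTypes = [kUTTypeImage as String, kUTTypeLivePhoto as String]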
Some additional info about Live Photos:
As noted above, the video has a .mov file extension.
It is saved in /var/mobile/Media/DCIM/100APPLE/ alongside the JPG version of the photo.
Live Photos can be played even on devices without 3D Touch (I can play them on my iPad 2017 via a long press on the photo).
They can be played even on old phones (such as the iPhone 5), even on iOS 8, if you install the PhotosLive tweak.

UIWebView: How to resize a taken photo (iOS)

Consider my application a web view displaying my website. On my website I have an input tag for uploading a picture. On a tablet this input tag is great because you can:
upload an existing photo
take an instant photo and upload it
The second option is the one that interests me. The problem is that no resizing happens, and the picture sent to the server is 1 MB (its actual size).
Before sending the picture to my server, I need the application to do one of the following:
option A: after the photo is taken, offer resizing options
option B: have the application automatically take a small (already resized) photo
option C: do you have any other ideas?
Is it possible to do something to resolve this problem? I am developing using the Swift language.
Thank you
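A sketch of the kind of native-side resizing being asked about, assuming the photo has already been captured as a UIImage (the size and quality values are illustrative):

import UIKit

// Downscale a captured UIImage and re-encode it as JPEG before upload.
func resizedJPEGData(from image: UIImage,
                     maxDimension: CGFloat = 1024,
                     jpegQuality: CGFloat = 0.7) -> Data? {
    let scale = min(1, maxDimension / max(image.size.width, image.size.height))
    let newSize = CGSize(width: image.size.width * scale,
                         height: image.size.height * scale)
    let renderer = UIGraphicsImageRenderer(size: newSize)
    let resized = renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: newSize))
    }
    return resized.jpegData(compressionQuality: jpegQuality)
}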

Navigating from HTML pages to a native BlackBerry page/screen [duplicate]

This question already has an answer here:
IBM Worklight - Using the camera in BlackBerry
(1 answer)
Closed 8 years ago.
Platform: BlackBerry 6 and 7
Framework: IBM Worklight
Description:
I am developing an image-scanning application for BlackBerry. The application takes the user to a success page if the correct image is scanned, and shows a failure page if an incorrect image is scanned.
Scenario:
I have developed HTML pages for this app; now I need to add a transition from the HTML page to the BlackBerry camera page. That is, I have a button on my HTML page labelled 'Scan Image', and on clicking/tapping this button the camera should open and start scanning images. So my question is:
How can that transition be done? I mean, what is the JavaScript syntax to navigate from the HTML page to the native camera page?
This is the exact same question you asked TODAY, here: IBM Worklight - Using the camera in BlackBerry
Why do you ask again?
There is no "transition" involved. You use the JavaScript API provided by either Apache Cordova or BlackBerry themselves to access the camera, take a picture and handle the success or failure.
Please stop writing your app for a moment. Create a demo app and implement ONLY the camera support; once it works, start implementing it the way you want in your actual app.
Use the Camera API provided by BlackBerry
Use the Camera API provided by Apache Cordova
Please take the time to read the pages and the examples.
There are clear instructions on how to add permission for using the Camera and clear code examples on how to implement it.
http://docs.phonegap.com/en/2.6.0/cordova_camera_camera.md.html#Camera

Disabling iPad from creating placemarks on Google Maps

I have a Google map that I created with KML.
Here is a link to the map:
http://goo.gl/maps/dkfjU
Here is the complete KML file:
https://dl.dropbox.com/u/94943007/02bb39645a3c9d95afeed5cb9bd5d07c040d8ca8a4ee56b9fb367d38.kml
When browsing this map on my iPad using Safari, for some reason tapping anywhere on the map creates a point there.
How do I disable point creation on the map so that iPad users are not able to do this?
I checked the link as well, and Marcelo seems to be correct. To have a map with the ability to turn layers on and off, you will need to build your own site, perhaps using JavaScript and the Google Maps API. Maybe start here: https://developers.google.com/maps/documentation/javascript/tutorial
You cannot disable adding a placemark if you are browsing the map in Safari.
You can control it if you use the MapKit API of the iOS SDK.
For importing KML data into that SDK, you can use sample code from Apple:
http://developer.apple.com/library/ios/#samplecode/KMLViewer/Introduction/Intro.html
But there is no built-in KML import in the iOS SDK. You can use other frameworks, such as mapbox/Simple-KML:
https://github.com/mapbox/Simple-KML
If you want to do this in Safari, then you're out of luck, unless you can manipulate the map using JavaScript.
