I can get the default camera this way:
Camera.getCamera();
But how do I get all of the available cameras connected to my computer with ActionScript?
You can use the Camera.names array to get the list of cameras supported on the system. Camera.getCamera() returns a reference to the default camera.
Using the Camera.names array, you can call Camera.getCamera(name:String = null) and pass it the camera you want. Despite the parameter name, you pass the string representation of the camera's zero-based index within the Camera.names array, not its display name. For example, to specify the third camera in the array, use Camera.getCamera("2").
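For example, a minimal sketch (camera names and indices vary per machine; the index "1" and the addChild call below are just illustrative):
// List every detected camera with its index, then attach one by index.
import flash.media.Camera;
import flash.media.Video;

for (var i:int = 0; i < Camera.names.length; i++)
{
    trace(i + ": " + Camera.names[i]);
}

// Pass the zero-based index as a string, e.g. "1" for the second camera.
var cam:Camera = Camera.getCamera("1");
if (cam != null)
{
    var video:Video = new Video(cam.width, cam.height);
    video.attachCamera(cam);
    addChild(video); // assumes this runs inside a display object container
}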
More info: flash.media.Camera (Adobe LiveDocs)
Unfortunately, only the camera selected in the Adobe Flash Player Settings is available to ActionScript.
trace(Camera.names);
Don't you have code hinting?
I'm trying to capture an ID scan using the MicroBlink BlinkID library.
I need to get both:
the processed, cropped, and deskewed image of the front and back of the ID
the unprocessed, raw UIImage from which the front picture was extracted.
I have set returnFullDocumentImage and encodeFullDocumentImage, but I'm always getting cropped images when accessing these properties:
fullDocumentFrontImage?.image
fullDocumentBackImage?.image
How do I get the uncropped front image of the ID?
Whatever I do, I get nil when trying to access frontCameraFrame?.image.
Found it
I needed to set
blinkIdCombinedRecognizer.saveCameraFrames = true
Yes, that’s correct! By setting saveCameraFrames to true, you can obtain frontCameraFrame, backCameraFrame, and barcodeCameraFrame.
Also, if you want to extract the face image or the signature from the document, you can do that:
self.blinkIdRecognizer?.result.faceImage
self.blinkIdRecognizer?.result.signatureImage
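Putting it together, a minimal sketch (class and property names follow the BlinkID iOS SDK as used in this question; they may differ slightly between SDK versions):
let recognizer = MBBlinkIdCombinedRecognizer()
recognizer.returnFullDocumentImage = true
recognizer.saveCameraFrames = true  // without this, the camera frames stay nil

// ... run the scanning flow; then, in the result handler:
let result = recognizer.result
let croppedFront = result.fullDocumentFrontImage?.image  // processed, cropped front
let rawFront = result.frontCameraFrame?.image            // unprocessed camera frame
let face = result.faceImage                              // face crop, if present
let signature = result.signatureImage                    // signature crop, if present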
I am working on an application that connects to a TV using a Chromecast device; to achieve this, I have used the Google Cast framework in my project.
I am facing a problem when I try to access the approximate stream position of the video using the statements below:
GCKMediaControlChannel *mediaControlChannel = [[GCKMediaControlChannel alloc] init];
NSLog(@"Approximate stream position is %f", mediaControlChannel.approximateStreamPosition);
But this is resulting in a time difference of 20 seconds.
I tried the statements below to get the exact stream position:
GCKMediaStatus *deviceStatus = [[GCKMediaStatus alloc] initWithSessionID:sessionId mediaInformation:self.mediaInformation];
NSLog(@"Stream Position %f", self.deviceStatus.streamPosition);
The above initializer takes two parameters, and we need to pass the session as an integer, but the session ID we get is an alphanumeric string, and converting it to an integer results in 0.
Can anyone help me get the session ID as an integer, or suggest a different way to get the current stream position?
I use the following code to retrieve the stream position of the video that is currently being cast, and it's pretty accurate! The Google Cast SDK version that we use in our project is 3.5.3.
let position = Double(GCKCastContext.sharedInstance().sessionManager.currentSession!.remoteMediaClient!.mediaStatus!.streamPosition)
Hope it helps!
approximateStreamPosition should give you a pretty accurate time, certainly not off by 20 seconds. You can take a look at our iOS reference app for an example.
You can use approximateStreamPosition for this.
The code below will print a more accurate time:
if let position = GCKCastContext.sharedInstance().sessionManager.currentSession?.remoteMediaClient?.approximateStreamPosition() {
    print("current time ", position)
}
My current understanding is that:
CameraUI
I can use the CameraUI to access the built-in camera for MediaType.VIDEO; it delegates to the built-in video camera app and lets me record a video. My app does that now.
When I stop recording and click the "Use" button, I am returned to my app and theoretically I have a valid MediaPromise.
iOS does -not- provide a valid/usable URL/filename for the recorded video (or for photos), so I would have to use a Loader to bring in/use/access the 'recorded' video... AND... iOS does not actually create a file anywhere on the device, most importantly in the Camera Roll, where one would expect it based on the normal behavior when one uses the system's native camera/video app.
The documentation says that the Loader can load various image types and SWFs but says nothing about video data, so I conclude that I cannot actually use the CameraUI to generate a valid MediaPromise that I can then pass to a Loader class (or similar) to read in the information created by the system camera and then manipulate it (upload it, save it to applicationStorageDirectory, and/or display it in one of the two video player components available in the API).
CameraRoll
I can have video entities in the iOS Camera Roll, but the AS3/AIR 3.5 CameraRoll class won't let me view/access/reference them in any way.
Normal File I/O
All my attempts to use the AIR 3.5 File classes to browse to the storage location of the iOS Camera Roll have been rebuffed.
------- Questions -------
Am I correct in believing that there is a way to take video but no way to use the video that's been captured? (That is, no way to use the resulting MediaPromise successfully.)
I believe you can take video and access it on Android, but there's nothing in the documentation that says you cannot on iOS.
Am I correct in believing that iOS sandboxes apps so that they cannot browse to video/photo storage using standard File I/O, but only through the apparently non-workable means I've tried (CameraUI & CameraRoll)?
Am I wrong to think that these are rather obvious NEEDS that one can meet via the Xcode/Objective-C route, but that the AIR mobile framework does not allow, either because Apple blocks the functionality or because Adobe has failed to meet reasonable expectations?
One item of ironic note to convey: if I use the iOS system camera app to record a video, a thumbnail of that video then appears in the Gallery/Camera Roll, and of course I can share it, view it, or whatever... If I use AIR's CameraRoll.browseForImage(), provided I haven't used the camera to take another image, when it shows me the folder where the pictures are stored, the folder icon uses the thumbnail of the last object added (in this case, the video I took), but if I then enter the folder, the video cannot be found. It's teasing us. It knows it's there, but it is apparently forbidden fruit.
I can't answer all your questions, so this entry may not be acceptable, but I found this page while searching for a solution to some of the problems you described and thought that someone else may find this answer (partially) useful.
To save the movie you just took, you need to open the promise and read the data from it.
iOS won't save the file anywhere, so MediaPromise.file is always null.
This is my solution to the problem:
private var camera:CameraUI;
private var dataInput:IDataInput;

public function recordVideo():void
{
    // Start the camera and ask for a video
    camera = new CameraUI();
    camera.addEventListener(MediaEvent.COMPLETE, onCameraComplete);
    camera.launch(MediaType.VIDEO);
}

private function onCameraComplete(event:MediaEvent):void
{
    // event.data is a MediaPromise, and MediaPromise.open() returns an IDataInput
    // Let's cast it to a dispatcher and check when it's complete
    dataInput = event.data.open();
    var dispatcher:IEventDispatcher = IEventDispatcher(dataInput);
    dispatcher.addEventListener(Event.COMPLETE, onDataInputComplete);
}

private function onDataInputComplete(event:Event):void
{
    // We can do whatever we want with the data, so we'll store it in a File
    // (the target path below is just an example)
    var file:File = File.applicationStorageDirectory.resolvePath("recordedVideo.mov");
    var bytes:ByteArray = new ByteArray();
    var stream:FileStream = new FileStream();
    // Reading the data from the opened MediaPromise
    dataInput.readBytes(bytes);
    stream.open(file, FileMode.WRITE);
    stream.writeBytes(bytes, 0, bytes.bytesAvailable);
    stream.close();
}
Also, I'm still looking for a way to put the movie in the CameraRoll.
I am using AVFoundation to capture still images from the camera (still images, not video frames) using captureStillImageAsynchronouslyFromConnection. This gives me a buffer of type CMSampleBuffer, which I am calling imageDataSampleBuffer.
As far as I have understood, this buffer can contain any type of data related to media, and the type of data is determined when I am configuring the output settings.
For the output settings, I make a dictionary with the value AVVideoCodecJPEG for the key AVVideoCodecKey.
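Roughly like this (stillImageOutput being the AVCaptureStillImageOutput instance; the variable names are just illustrative):
NSDictionary *outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG
                                                            forKey:AVVideoCodecKey];
[stillImageOutput setOutputSettings:outputSettings];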
There is no other codec option. But when I read the AVFoundation Programming Guide > Media Capture, I can see that 420f, 420v, BGRA, and jpeg are the available encoded formats supported on the iPhone 3GS (which I am using).
I want to get the yuv420 (i.e. 420v) formatted image data into the imageSampleBuffer. Is that possible?
If I print availableImageDataCodecTypes, I get only JPEG.
If I print availableImageDataCVPixelFormatTypes, I get three numbers: 875704422, 875704438, 1111970369.
Is it possible that these three numbers map to 420f, 420v, BGRA?
If yes, which key should I modify in my output settings?
I tried putting the value [NSNumber numberWithInt:875704438] for the key (id)kCVPixelBufferPixelFormatTypeKey.
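That is, something like this (stillImageOutput again being the AVCaptureStillImageOutput instance; 875704438 is one of the numbers reported by availableImageDataCVPixelFormatTypes):
NSDictionary *pixelFormatSettings =
    [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:875704438]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[stillImageOutput setOutputSettings:pixelFormatSettings];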
Would it work?
If yes, how do I extract this data from the imageSampleBuffer?
Also, in which format is a UIImage stored? Can it be any format? Is it just NSData with some extra info that makes it interpreted as an image?
I have been trying to use this method:
Raw image data from camera like "645 PRO"
I am saving the data using writeToFile and have been trying to open it using IrfanView.
But I am unable to verify whether the saved file is in YUV format, because IrfanView gives an error saying it is unable to read the headers.
My following code returns null:
byte[] image1 = _videoControl.getSnapshot(null);
Any suggestions, please?
A few important points about the VideoControl.getSnapshot method:
some manufacturers may not implement the getSnapshot() method;
the viewfinder must actually be visible on the screen prior to calling getSnapshot();
if you attempt to take pictures too quickly, getSnapshot() may return null; the camera requires time to clear out its buffer and prepare for the next shot;
you may check the MMAPI system property "video.snapshot.encodings" before capturing:
if (System.getProperty("video.snapshot.encodings") == null) {
// getSnapshot() is not supported
}
You may read this chapter from the book "Advanced BlackBerry Development":
http://books.google.com/books?id=F4Qu-lpoVncC&pg=PA53&lpg=PA53#v=onepage&q&f=false
Since the VideoControl.getSnapshot method is not supported by all devices, I'd recommend using another approach. You can start the native BB Camera app with this line of code:
Invoke.invokeApplication(Invoke.APP_TYPE_CAMERA, new CameraArguments());
and then catch the captured image using a FileSystemJournalListener.
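A minimal sketch of such a listener (the class name and the .jpg filter are just illustrative assumptions; adjust the path check for your device):
import net.rim.device.api.io.file.FileSystemJournal;
import net.rim.device.api.io.file.FileSystemJournalEntry;
import net.rim.device.api.io.file.FileSystemJournalListener;

public class CameraFileListener implements FileSystemJournalListener {
    private long lastUSN = FileSystemJournal.getNextUSN();

    public void fileJournalChanged() {
        long nextUSN = FileSystemJournal.getNextUSN();
        // Walk the journal entries added since we last looked.
        for (long usn = nextUSN - 1; usn >= lastUSN; usn--) {
            FileSystemJournalEntry entry = FileSystemJournal.getEntry(usn);
            if (entry == null) {
                break;
            }
            if (entry.getEvent() == FileSystemJournalEntry.FILE_ADDED
                    && entry.getPath().endsWith(".jpg")) {
                // entry.getPath() points to the photo the Camera app just saved.
            }
        }
        lastUSN = nextUSN;
    }
}
Register the listener with Application.getApplication().addFileSystemJournalListener(...) before invoking the Camera app, and remove it once you have the image.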
The BB SDK on your PC contains samples; search for the 'fileexplorerdemo' sample for the rest of the details.