Can you use the camera of an iOS device in an AS3 app? If so, how?
Yes, it's really easy; there are several ways to access the camera in AS3.
First, the same way you would in a normal AS3 application:
var camera:Camera = Camera.getCamera();
var video:Video = new Video();
video.attachCamera(camera);
this.addChild(video);
This will display the camera feed inside the current display object.
You can also ask for images from the Camera roll using the CameraRoll class:
import flash.media.CameraRoll;
import flash.events.MediaEvent;
import flash.events.ErrorEvent;
import flash.events.Event;

var cameraRoll:CameraRoll = new CameraRoll();
if (CameraRoll.supportsBrowseForImage)
{
    cameraRoll.addEventListener(MediaEvent.SELECT, imageSelected);
    cameraRoll.addEventListener(Event.CANCEL, browseCancelled);
    cameraRoll.addEventListener(ErrorEvent.ERROR, mediaError);

    // Ask the user to select an image
    cameraRoll.browseForImage();
}
You can also launch the native camera application to take a photo, using the CameraUI class:
import flash.media.CameraUI;
import flash.media.MediaType;

var cameraUI:CameraUI = new CameraUI();
if (CameraUI.isSupported)
{
    cameraUI.addEventListener(MediaEvent.COMPLETE, imageSelected);
    cameraUI.addEventListener(Event.CANCEL, browseCancelled);
    cameraUI.addEventListener(ErrorEvent.ERROR, mediaError);

    cameraUI.launch(MediaType.IMAGE);
}
Hope that points you in the right direction.
First, here is a YouTube link showing the problem: Video stretched. The video is edited to remove unnecessary parts; I am only showing the important ones. As you can see, after some time the video gets stretched.
The original video was uploaded to Azure Media Services and encoded with the built-in "AdaptiveStreaming" preset.
I am using HLS dynamic packaging with this url:
https://amswrdev-usso.streaming.media.azure.net/80a2651c-462f-487f-b1a3-87cb72366255/1zIHQ.ism/manifest(format=m3u8-cmaf)
I am testing on an iPhone 12 Pro Max, iOS 15.0.1, Swift 5.0.
I am using AVPlayerViewController; this is the code:
import Foundation
import SwiftUI
import AVKit

struct VideoPlayerView: UIViewControllerRepresentable {
    var player: AVPlayer
    @Binding var gravity: AVLayerVideoGravity

    func makeUIViewController(context: UIViewControllerRepresentableContext<VideoPlayerView>) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.showsPlaybackControls = false
        controller.videoGravity = gravity
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: UIViewControllerRepresentableContext<VideoPlayerView>) {
        uiViewController.videoGravity = gravity
    }

    static func dismantleUIViewController(_ uiViewController: AVPlayerViewController, coordinator: Self.Coordinator) {
        //print("dismantleUIViewController \(uiViewController)")
    }
}
My hypotheses:
AVPlayer is not switching to the correct bandwidth variant.
Azure Media is not sending the correct variants in the initial playlist.
Maybe I don't have the correct values for preferredMaximumResolution and preferredForwardBufferDuration? I don't know what values would be correct (see the sketch below).
Dynamic packaging in Azure Media is now on version 7; maybe that is not supported by iOS?
I have tried giving my view fixed width and height values, but that does not work. I have spent two weeks trying to figure this out and nothing works. Do you have any idea?
Like I said, the video gets stretched after some time; it is not consistent. Sometimes it happens immediately and sometimes it takes longer, but it always happens.
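For reference, here is a minimal sketch of where those AVPlayerItem properties would be set; the URL and the numeric values below are placeholders for experimentation, not known-good settings:
import AVFoundation
import CoreGraphics

// Placeholder URL; substitute the real HLS manifest URL.
let url = URL(string: "https://example.com/manifest.m3u8")!
let item = AVPlayerItem(url: url)

// Cap the rendered resolution and how much media is buffered ahead of the playhead.
// These numbers are placeholders, not recommendations.
item.preferredMaximumResolution = CGSize(width: 1920, height: 1080)
item.preferredForwardBufferDuration = 10 // seconds
// Optionally cap the bitrate AVPlayer may pick from the HLS variants.
item.preferredPeakBitRate = 4_000_000 // bits per second

let player = AVPlayer(playerItem: item)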
I'm still on iOS 14 on an iPhone 8 - and it plays just fine for me... so this may be an iOS 15.0.1 issue right now.
In a new project we plan to create the following AR showcase:
We want to have a wall with pipes and cables on it. These will have sensors mounted on them to control and monitor the pipe/cable system. Since each sensor will have the same dimensions and appearance, we plan to put an individual QR code on each sensor. Reading the documentation of ARWorldTrackingConfiguration and ARImageTrackingConfiguration shows that ARKit is capable of recognizing known images. But the requirements on those images make me wonder whether the application would work as intended when using several QR codes:
From detectionImages:
[...], identifying art in a museum or adding animated elements to a movie poster.
From Apple's keynote:
Good images to track: high texture, high local contrast, well-distributed histogram, no repetitive structures.
Since QR codes don't fully match these requirements, I'm wondering whether it's possible to use about 10 QR codes and have ARKit recognize each of them individually and reliably, especially when, say, 3 codes are in view at once. Does anyone have experience tracking several QR codes, or with a similar showcase?
Recognizing (several) QR codes has nothing to do with ARKit and can be done in 3 different ways (AVFoundation, CIDetector, Vision), of which the latter is preferable in my opinion because you may also want to use its object-tracking capabilities (VNTrackObjectRequest). It is also more robust in my experience.
If you need to place objects in the ARKit scene using the locations of the QR codes, you will need to execute a hitTest on the ARFrame to find the code's 3D location (transform) in the world. At that location you place a custom ARAnchor, and using the anchor you can add a custom SceneKit node to the scene.
UPDATE: So the suggested strategy would be: 1. find QR codes and their 2D locations with Vision; 2. find their 3D locations (worldTransform) with ARFrame.hitTest(); 3. create a custom (subclassed) ARAnchor and add it to the session; 4. in renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor), add a custom node (such as an SCNText with a billboard constraint) for your custom ARAnchor.
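Without RxSwift, steps 1-3 can be sketched roughly as follows (sceneView is assumed to be your ARSCNView, and device-orientation handling is omitted):
import ARKit
import Vision

// Call this with the current frame, e.g. from session(_:didUpdate:).
func detectQRCodes(in frame: ARFrame) {
    let request = VNDetectBarcodesRequest { request, _ in
        guard let observations = request.results as? [VNBarcodeObservation] else { return }
        for observation in observations where observation.symbology == .QR {
            // Vision's boundingBox is normalized with the origin at the bottom-left,
            // while ARFrame.hitTest expects the origin at the top-left, hence the y flip.
            let center = CGPoint(x: observation.boundingBox.midX,
                                 y: 1 - observation.boundingBox.midY)
            if let result = frame.hitTest(center, types: [.featurePoint]).first {
                // Step 3: place a custom anchor at the code's world transform.
                let anchor = ARAnchor(name: observation.payloadStringValue ?? "QR",
                                      transform: result.worldTransform)
                sceneView.session.add(anchor: anchor)
            }
        }
    }
    request.symbologies = [.QR]
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
    try? handler.perform([request])
}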
If by any chance you are using RxSwift, the easiest way is the RxVision framework, because it lets you pass the relevant ARFrame along into the handler:
var requests = [RxVNRequest<ARFrame>]()

let barcodesRequest: RxVNDetectBarcodesRequest<ARFrame> = VNDetectBarcodesRequest.rx.request(symbologies: [.QR])

barcodesRequest
    .observable
    .observeOn(Scheduler.main)
    .subscribe { [unowned self] (event) in
        switch event {
        case .next(let completion):
            self.detectCodeHandler(value: completion.value, request: completion.request, error: completion.error) // define the method first
        default:
            break
        }
    }
    .disposed(by: disposeBag)
// Inside an ARSCNViewDelegate callback (e.g. renderer(_:didAdd:for:)), where `anchor` is the
// added anchor and `detector` is a CIDetector configured for CIDetectorTypeQRCode.
if let imageAnchor = anchor as? ARImageAnchor {
    guard let buffer: CVPixelBuffer = sceneView.session.currentFrame?.capturedImage else {
        print("could not get a pixel buffer")
        return
    }
    let ciImage = CIImage(cvPixelBuffer: buffer)
    var message = ""
    let features = detector.features(in: ciImage)
    for feature in features as! [CIQRCodeFeature] {
        message = feature.messageString ?? ""
        break
    }
    if imageAnchor.referenceImage.name == "QR1" {
        if message == "QR1" {
            // add node 1
        } else {
            sceneView.session.remove(anchor: anchor)
        }
    } else if imageAnchor.referenceImage.name == "QR2" {
        if message == "QR2" {
            // add node 2
        } else {
            sceneView.session.remove(anchor: anchor)
        }
    }
}
detector here is a CIDetector. Also, you need to check renderer(_:didUpdate:for:). I worked with 4 QR codes.
It works assuming no two QR codes are visible in the same frame at the same time.
Is there an easy way to take a photo with EXIF information, like the standard iOS Camera app does, and save it to the photo album?
try? PHPhotoLibrary.shared().performChangesAndWait {
    let creationRequest = PHAssetChangeRequest.creationRequestForAsset(from: image)

    // Set metadata location
    if let location = self.locationManager?.latestLocation {
        creationRequest.location = location
    }
}
is not saving location information to the EXIF data.
I tried the following solution:
add GPS metadata dictionary to image taken with AVFoundation in swift
I could save basic EXIF information using AVCapture, but inserting location information did not work.
I cannot believe there is no easier way.
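For what it's worth, one approach that should keep the capture metadata intact is to save the photo's original file data instead of a re-encoded UIImage. A minimal sketch, assuming an AVCapturePhoto delivered by AVCapturePhotoOutput and a CLLocation from Core Location (the names here are illustrative):
import AVFoundation
import Photos
import CoreLocation

// Call from photoOutput(_:didFinishProcessingPhoto:error:).
func save(_ photo: AVCapturePhoto, location: CLLocation?) {
    // fileDataRepresentation() preserves the EXIF/TIFF metadata produced by the capture pipeline.
    guard let data = photo.fileDataRepresentation() else { return }
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .photo, data: data, options: nil)
        // The Photos library stores the location with the asset.
        request.location = location
    }, completionHandler: nil)
}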
Hey, so I'm trying to have a button that, when pressed, lets the user choose 2-5 pictures from their photo library and then sets whichever photos were chosen onto a UIImageView. I looked online and couldn't find anything on how to do this in Swift.
Thanks
I worked it out using this: https://github.com/zhangao0086/DKImagePickerController
Getting the selected images' thumbnails:
let pickerController = DKImagePickerController()
pickerController.sourceType = .Photo
pickerController.didCancelled = { () in
    println("didCancelled")
}
pickerController.didSelectedAssets = { [unowned self] (assets: [DKAsset]) in
    println("didSelectedAssets")
    println(assets)
    for asset in assets {
        println(asset.url)
        self.PickedArray.addObject(asset.thumbnailImage!)
    }
    self.performSegueWithIdentifier("SegueToPhotoLibraryView", sender: self)
}
self.presentViewController(pickerController, animated: true, completion: nil)
Getting the selected images' URLs: use asset.url instead of asset.thumbnailImage.
Hope it helps!
Currently iOS does not provide an image picker out of the box that lets you pick multiple images from the photo library. UIImagePickerController only lets you select one image.
But there are several image picker implementations available that let you pick multiple images. You can find a whole bunch at cocoacontrols.com, as @ytbryan already mentioned.
I am currently not aware of any multiple image picker implemented in Swift. If someone finds one, please edit and post the link.
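For what it's worth, on newer iOS versions (14 and later) the system PHPickerViewController does support multiple selection. A minimal sketch (the selection limit and delegate wiring here are illustrative):
import PhotosUI
import UIKit

// Present a system picker that allows choosing up to 5 images (iOS 14+).
func presentPicker(from viewController: UIViewController & PHPickerViewControllerDelegate) {
    var configuration = PHPickerConfiguration()
    configuration.filter = .images
    configuration.selectionLimit = 5   // 0 would mean "unlimited"
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = viewController
    viewController.present(picker, animated: true)
}

// In the delegate:
// func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
//     picker.dismiss(animated: true)
//     for result in results where result.itemProvider.canLoadObject(ofClass: UIImage.self) {
//         result.itemProvider.loadObject(ofClass: UIImage.self) { image, _ in
//             // Assign `image as? UIImage` to your UIImageView on the main queue.
//         }
//     }
// }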
I am working on an iOS app and I need to determine whether a song has album art. I am using MPMusicPlayerController to access the native iOS music library and MPMediaItemArtwork to hold the artwork returned by the music library. This is the code I use to get the artwork:
MPMediaItemArtwork *mpArt = [mpSong valueForProperty:MPMediaItemPropertyArtwork];
To test if artwork is present I use this:
if (mpArt)
{
    imgArt = [mpArt imageWithSize:CGSizeMake(250, 250)];
}
else
{
    imgArt = [UIImage imageNamed:@"Alternative_Artwork_Image.jpg"];
}
No matter whether the song actually has artwork, the check always evaluates to true.
Any help would be appreciated. Thank you in advance.
I think it will always return true for an iCloud item, because the artwork will eventually be downloaded. Try looking for a correlation with MPMediaItemPropertyIsCloudItem.
You can also try getting information from the bounds; perhaps the bounds are 0x0 when the image is not found.
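In Swift terms (the question's code is Objective-C, but the same checks apply), a minimal sketch of that suggestion:
import MediaPlayer
import UIKit

// Treat artwork as present only if the item is not a cloud-only item,
// the artwork's bounds are non-zero, and it actually yields an image.
func hasLocalArtwork(_ item: MPMediaItem) -> Bool {
    let isCloudItem = item.value(forProperty: MPMediaItemPropertyIsCloudItem) as? Bool ?? false
    guard !isCloudItem,
          let artwork = item.value(forProperty: MPMediaItemPropertyArtwork) as? MPMediaItemArtwork,
          artwork.bounds.size != .zero,
          artwork.image(at: CGSize(width: 250, height: 250)) != nil
    else { return false }
    return true
}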