iOS AVFoundation: Get AVCaptureDevice.Format's video dimensions

I'm trying to display the dimensions in which photos/video recordings can be taken to the user.
While photo dimensions are easily accessible through AVCaptureDevice.Format.highResolutionStillImageDimensions, I have no idea how to achieve the same for videos.
I've tried the AVCaptureDevice.Format.formatDescription.dimensions property (and the .presentationDimensions(..) func), but those just gave me compile errors.
Since those have been available since iOS 13.0 (Swift 5.1, I believe), I should definitely have the API for them (especially since I can Cmd+Click them in my code), and my project should compile. Am I missing something here? Is there a better way to get the resolutions a capture device can record video in?

You just need the line below. That's it.
let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)

There is no direct API to get the video resolution.
You can set the AVCaptureSession.sessionPreset to one of the AVCaptureSession.Preset, then use canSetSessionPreset(_:) to check if the preset is supported by the current device.
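A minimal sketch of the preset approach, assuming a configured AVCaptureSession: each preset name maps to a well-known capture resolution, so checking which presets the session accepts gives you a coarse list of available video sizes.

```swift
import AVFoundation

// Probe a few common presets and report which ones the session supports.
// canSetSessionPreset(_:) depends on the currently attached inputs.
let session = AVCaptureSession()
let presets: [AVCaptureSession.Preset] = [
    .hd4K3840x2160, .hd1920x1080, .hd1280x720, .vga640x480
]

for preset in presets where session.canSetSessionPreset(preset) {
    print("Supported preset: \(preset.rawValue)")
}
```

Note that this only covers the named presets; enumerating device.formats (as in the next answer) gives the full list of exact dimensions.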

I just did the following for testing and got all available video dimensions printed:
let descriptions = device.formats.map(\.formatDescription)
let sizes = descriptions.map(\.dimensions)
print(sizes)
I also checked if it compiled for iOS, Simulator, and macOS, and it does.
Are you still supporting iOS 12? Did you check your SDK setting?

Related

Azure Media Services - Captions on iOS - Native not AMP

I'm trying to get captioning working on iOS and Android. Android seems relatively straightforward, but I can't seem to figure out how (nor do the docs really tell you) to set up WebVTT captions like you can easily do with Windows 10 UWP and AMP.
On Android I am just doing videoView.AddSubtitleSource and it appears to work. On iOS I'm at a loss. From what I can tell it appears that it's possible to load captions not embedded in the file, but I can't find any example of how it works with MediaSelectionOptions and specifying a stream or similar.
I've done this in my own custom renderer in Xamarin.Forms, but I'm fine with converting it from Swift or Objective-C if needed.
Anyone get this working or know how?
If your interest is iOS native player, check this tutorial video from Apple: https://developer.apple.com/videos/play/wwdc2012/512/ (about 20 min into it)
If your interest is Safari based player, check the examples here: https://developer.apple.com/streaming/examples/ (3 of the examples include webVTT)
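For the native-player route, a hedged sketch of the media-selection approach (assuming an HLS stream whose master playlist already references WebVTT subtitle renditions; the manifest URL is a placeholder): the caption tracks surface as legible media selection options that you can list and select on the player item.

```swift
import AVFoundation

// Load an HLS asset and enumerate its legible (caption/subtitle) options.
let asset = AVURLAsset(url: URL(string: "https://example.com/stream/manifest.m3u8")!)
let item = AVPlayerItem(asset: asset)

if let group = asset.mediaSelectionGroup(forMediaCharacteristic: .legible) {
    for option in group.options {
        print(option.displayName)   // e.g. "English", "Spanish"
    }
    // Select the first English caption track, if one exists.
    let english = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options, with: Locale(identifier: "en"))
    if let option = english.first {
        item.select(option, in: group)
    }
}
let player = AVPlayer(playerItem: item)
```

A true sidecar VTT file (one not referenced from the HLS manifest) generally has to be stitched into the manifest server-side for AVPlayer to pick it up.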
The issue with iOS and Safari not playing back VTT sidecar files appears to be fixed now in the latest iOS release. I'm not sure exactly when it was fixed, but I'm on 14.7.1 and the following sample is working now:
https://ampdemo.azureedge.net/azuremediaplayer.html?url=%2F%2Famssamples.streaming.mediaservices.windows.net%2Fbc57e088-27ec-44e0-ac20-a85ccbcd50da%2FTearsOfSteel.ism%2Fmanifest&subtitles=English,en,%2F%2Famssamples.streaming.mediaservices.windows.net%2Fbc57e088-27ec-44e0-ac20-a85ccbcd50da%2FTOS-en.vtt;Spanish,es,%2F%2Famssamples.streaming.mediaservices.windows.net%2Fbc57e088-27ec-44e0-ac20-a85ccbcd50da%2FTOS-es.vtt;French,fr,%2F%2Famssamples.streaming.mediaservices.windows.net%2Fbc57e088-27ec-44e0-ac20-a85ccbcd50da%2FTOS-fr.vtt;Italian,it,%2F%2Famssamples.streaming.mediaservices.windows.net%2Fbc57e088-27ec-44e0-ac20-a85ccbcd50da%2FTOS-it.vtt

How to programmatically change video codec in Xamarin iOS

I've recently run into an issue with taking video on newer iPhones (8 and up) in an app written in Xamarin. When capturing video on older devices the codec is H.264, but on newer devices Apple has switched to H.265 (HEVC). These videos are played back in a browser, and pretty much everything I've checked doesn't support H.265.
Since you can switch between the two in the device settings (High Efficiency, H.265, and Most Compatible, H.264), I figured this could be done programmatically. I haven't been able to find any information on how to do this, if it's possible at all. Any help would be appreciated.
You can set the codec on an AVCaptureVideoDataOutput, which you add to your session, through WeakVideoSettings, which is just a dictionary of settings.
You can find the keys in the official apple docs: https://developer.apple.com/documentation/avfoundation/avassetwriterinput/video_settings_dictionaries
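A sketch of the same idea in Swift (the Xamarin bindings mirror these AVFoundation APIs): force H.264 on a movie file output so recordings stay browser-compatible, checking first that the codec is offered on this device.

```swift
import AVFoundation

// Assumes movieOutput has already been added to a configured AVCaptureSession;
// connections only exist after the output is attached.
let movieOutput = AVCaptureMovieFileOutput()

if let connection = movieOutput.connection(with: .video),
   movieOutput.availableVideoCodecTypes.contains(.h264) {
    // Override the default (HEVC on newer hardware) with H.264.
    movieOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.h264],
                                  for: connection)
}
```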

How can I see whether the Scan Credit Card feature is working in safari in iOS Simulator

I am trying to change a form that includes credit card input details so that when viewed on iOS it will give the user the option of scanning their credit card details (see below):
However, the solution I think should work (see here) isn't showing the scan card option when I open the page in Safari in the Xcode iOS Simulator, so I want to know:
Should I expect the Simulator to work exactly like an actual device (I know some camera-based things do not work)?
If so, what steps do I need to take to get it working?
If not, what alternative routes might there be to test whether the scan card feature works for my form?
The Simulator does not currently provide a virtual camera.
If this is something you'd like to see, please file a bug report at https://bugreport.apple.com and request camera support be added to the Simulator.

GCKDiscoveryManager not finding Chromecast devices (iOS SDK)

I'm trying to find Chromecast devices with my iOS App.
I've downloaded cast SDK via CocoaPods, and then I try the following:
let gckCastOptions = GCKCastOptions(receiverApplicationID: kGoogleCastAppReceiverId)
GCKCastContext.setSharedInstanceWithOptions(gckCastOptions)
GCKLogger.sharedInstance().delegate = self
self.discoveryManager = GCKCastContext.sharedInstance().discoveryManager
self.discoveryManager!.addListener(self)
self.discoveryManager!.passiveScan = true
self.discoveryManager!.startDiscovery()
And then, in the listener method:
func didStartDiscoveryForDeviceCategory(deviceCategory: String) {
print("GCKDiscoveryManagerListener: \(deviceCategory)")
print("FOUND: \(self.discoveryManager!.hasDiscoveredDevices)")
}
The result is always false :(
On my Mac, when I open YouTube, I can stream video to the Chromecast device, so the device is definitely set up.
I am testing on the Simulator. Should I try it on a real device? I assumed that wasn't necessary since I'm only trying to discover the available devices.
I've tried adding a GCKDeviceScanner too, with no luck. I suppose that this is the starting point.
I also tried to add a GCKUICastButton via storyboard, without being able to do it :(
Have a look at this setting. As soon as I enabled it, it started working.
Also, please make sure you set a kReceiverAppID other than the one from the given example (AABBCCDD). We are using the one listed in the example app, static NSString *const kReceiverAppID = @"4F8B3483" (I guess you need to register your own in the Google Cast Developer Console).
And finally, you can be pretty sure it will never work on a simulator; you need a REAL device for this.
Have a look at this setting: go to the project's Capabilities.
Go to the project settings and select Capabilities, then enable the option shown in the image.
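One more thing worth noting, sketched below under the assumption of the current Google Cast iOS SDK: discovery is asynchronous, so hasDiscoveredDevices is expected to be false at the moment discovery starts. Reacting to the insertion callbacks (and leaving passiveScan off, since passive scans only resolve limited device information) is the more reliable pattern.

```swift
import GoogleCast

// Listener that reacts to devices as they are discovered, instead of
// polling hasDiscoveredDevices immediately after startDiscovery().
class DiscoveryListener: NSObject, GCKDiscoveryManagerListener {
    func didInsert(_ device: GCKDevice, at index: UInt) {
        print("Found device: \(device.friendlyName ?? device.deviceID)")
    }
    func didUpdateDeviceList() {
        let manager = GCKCastContext.sharedInstance().discoveryManager
        print("Devices so far: \(manager.deviceCount)")
    }
}
```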

Default volume/sound of the device

I want to get the current volume/sound level of the device, as set with the hardware volume up/down buttons. Below is the code I'm using to access the volume.
To resolve the resulting error I did some research and found that this code requires the CoreAudio framework:
#import <CoreAudio/CoreAudio.h>
As suggested in a link here. But when I try "<CoreAudio/CoreAudio.h>", it gives the error below on iOS 7.0+.
I need to get the current system volume, and this code isn't giving me the desired result. Is there an alternate way to find the current volume/sound of the device?
Any help is highly appreciated in advance.
You would want to include AudioToolbox/AudioToolbox.h.
You also need to link the framework in your build settings.
Also, the sample link you provided is for OS X, while your question states that you are using iOS 7. The OS X and iOS Core Audio APIs are not exactly the same; some of your code above is OS X-only.
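As an alternate route on iOS 7+, a minimal sketch using AVAudioSession, which exposes the system output volume (the value set with the hardware buttons) directly:

```swift
import AVFoundation

// outputVolume is a Float in 0.0...1.0 reflecting the system volume.
// The session should be active before reading it.
let session = AVAudioSession.sharedInstance()
try? session.setActive(true)
print("Current output volume: \(session.outputVolume)")
```

outputVolume is also key-value observable, if you need to react to volume-button presses while the app is running.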
