Since the introduction of the iPhone 6, there are two types of focus detection:
1. Contrast detection
2. Phase detection
From the iPhone 6/6 Plus onwards, the camera uses phase detection.
I am trying to get the current focus system
self.format = [[AVCaptureDeviceFormat alloc] init];
[self.currentDevice setActiveFormat:self.format];
AVCaptureAutoFocusSystem currentSystem = [self.format autoFocusSystem];
if (currentSystem == AVCaptureAutoFocusSystemPhaseDetection)
{
    [self.currentDevice addObserver:self forKeyPath:@"lensPosition" options:NSKeyValueObservingOptionNew context:nil];
}
else if (currentSystem == AVCaptureAutoFocusSystemContrastDetection)
{
    [self.currentDevice addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:nil];
}
else {
    NSLog(@"No observers added");
}
But now it's crashing on the following line:
AVCaptureAutoFocusSystem currentSystem = [self.format autoFocusSystem];
I am unable to find a proper description of the crash.
You're creating an AVCaptureDeviceFormat with alloc/init, but nothing is actually set up in it by default. It's just unusable garbage, and that's why you are getting a crash.
Each capture device has a set of formats it can work with.
You can see them by doing something like:
for (AVCaptureDeviceFormat *format in [self.currentDevice formats]) {
    CFStringRef formatName = CMFormatDescriptionGetExtension([format formatDescription], kCMFormatDescriptionExtension_FormatName);
    NSLog(@"format name is %@", (NSString *)formatName);
}
What you should be doing is deciding which AVCaptureDevice you want to use (e.g. the front camera or the back camera), getting one of its supported formats, and then setting it with [self.currentDevice setActiveFormat:format].
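As a rough sketch (not the exact code from the session, and assuming self.currentDevice has already been obtained, e.g. the back camera), you would pick one of the device's own formats instead of alloc/init-ing a new one, lock the device for configuration, and only then query its autoFocusSystem:
// Sketch only: use an existing format from the device rather than a freshly allocated one.
// self.currentDevice is assumed to be a valid AVCaptureDevice (e.g. the back camera).
AVCaptureDeviceFormat *format = self.currentDevice.activeFormat; // or one chosen from [self.currentDevice formats]

NSError *error = nil;
if ([self.currentDevice lockForConfiguration:&error]) {
    [self.currentDevice setActiveFormat:format];
    [self.currentDevice unlockForConfiguration];

    AVCaptureAutoFocusSystem currentSystem = format.autoFocusSystem;
    if (currentSystem == AVCaptureAutoFocusSystemPhaseDetection) {
        NSLog(@"Phase detection autofocus");
    } else if (currentSystem == AVCaptureAutoFocusSystemContrastDetection) {
        NSLog(@"Contrast detection autofocus");
    }
} else {
    NSLog(@"Could not lock device for configuration: %@", error);
}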
The WWDC 2013 session "What's New in Camera Capture" video has more information on how to do this.
I'm using AVCaptureMetadataOutput to detect faces on iOS, and I'm trying to set the orientation of the video after the user rotates their device. However, it appears that I can't do this: whenever I call isVideoOrientationSupported on the only AVCaptureConnection that my AVCaptureMetadataOutput has, it returns false. I've tried the code below in every place imaginable, yet it always returns NO. Is there any way to set the orientation for my metadata?
AVCaptureConnection *conn = [self.metadataOutput connectionWithMediaType:AVMediaTypeMetadataObject];
NSLog(@"%@", self.metadataOutput.connections);
if (!conn) {
    NSLog(@"NULL CONNECTION OBJ");
}
if ([conn isVideoOrientationSupported]) {
    NSLog(@"Supported!");
}
else {
    NSLog(@"Not supported");
}
An Apple Engineer solved this for me over on the Apple Developer Forums. Here's a link. This was their response:
If you want to translate your metadata objects' coordinate space to that of another AVCaptureOutput (such as the AVCaptureVideoDataOutput), use
- (AVMetadataObject *)transformedMetadataObjectForMetadataObject:(AVMetadataObject *)metadataObject connection:(AVCaptureConnection *)connection NS_AVAILABLE_IOS(6_0);
It's in AVCaptureOutput.h. If you want to translate the coordinates to the coordinate space of your video preview layer, use AVCaptureVideoPreviewLayer.h's
- (AVMetadataObject *)transformedMetadataObjectForMetadataObject:(AVMetadataObject *)metadataObject NS_AVAILABLE_IOS(6_0);
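As a rough sketch of how the preview-layer variant might be used from the metadata delegate (self.previewLayer is an assumed AVCaptureVideoPreviewLayer instance variable, and the face handling is only illustrative):
// Sketch: convert detected face metadata into the preview layer's coordinate space.
// Assumes self.previewLayer is the AVCaptureVideoPreviewLayer attached to the session.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataObject *object in metadataObjects) {
        if ([object.type isEqualToString:AVMetadataObjectTypeFace]) {
            AVMetadataFaceObject *face = (AVMetadataFaceObject *)
                [self.previewLayer transformedMetadataObjectForMetadataObject:object];
            NSLog(@"Face in preview coordinates: %@", NSStringFromCGRect(face.bounds));
        }
    }
}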
So I have a device (uniMag II) attached using their API, and I redid some of their methods to be compatible with the way I want my app to flow and run. However, I am now running into issues such as
[UM Info] SDK: initialized
[UM Warning] StartSwipe: UMRET_NO_READER
as well as the swipe-start notification not being hit when other notifications are. Clearly I am missing something, but I have spent so much time on this that I have no clue what I could possibly be missing, since I made sure I stayed as close to their sample app as possible. Obviously there is more code, but this is where the reader should be getting set up and opened, I would think:
-(void)enableSwipe:(CDVInvokedUrlCommand *)command {
    [self umsdk_activate];

    UmRet ret = [uniReader requestSwipe];
    [self displayUmRet:@"Starting swipe task" returnValue:ret];
}
-(void) umsdk_registerObservers:(BOOL)reg {
    NSNotificationCenter *nc = [NSNotificationCenter defaultCenter];

    //list of notifications and their corresponding selector
    const struct {NSString __unsafe_unretained *n; SEL s;} noteAndSel[] = {
        //
        {uniMagAttachmentNotification       , @selector(umDevice_attachment:)},
        {uniMagDetachmentNotification       , @selector(umDevice_detachment:)},
        //
        {uniMagInsufficientPowerNotification, @selector(umConnection_lowVolume:)},
        {uniMagMonoAudioErrorNotification   , @selector(umConnection_monoAudioError:)},
        {uniMagPoweringNotification         , @selector(umConnection_starting:)},
        {uniMagTimeoutNotification          , @selector(umConnection_timeout:)},
        {uniMagDidConnectNotification       , @selector(umConnection_connected:)},
        {uniMagDidDisconnectNotification    , @selector(umConnection_disconnected:)},
        //
        {uniMagSwipeNotification            , @selector(umSwipe_starting:)},
        {uniMagTimeoutSwipeNotification     , @selector(umSwipe_timeout:)},
        {uniMagDataProcessingNotification   , @selector(umDataProcessing:)},
        {uniMagInvalidSwipeNotification     , @selector(umSwipe_invalid:)},
        {uniMagDidReceiveDataNotification   , @selector(umSwipe_receivedSwipe:)},
        //
        {uniMagCmdSendingNotification       , @selector(umCommand_starting:)},
        {uniMagCommandTimeoutNotification   , @selector(umCommand_timeout:)},
        {uniMagDidReceiveCmdNotification    , @selector(umCommand_receivedResponse:)},
        //
        {uniMagSystemMessageNotification    , @selector(umSystemMessage:)},
        {nil, nil},
    };

    //register or unregister
    for (int i = 0; noteAndSel[i].s != nil; i++) {
        if (reg)
            [nc addObserver:self selector:noteAndSel[i].s name:noteAndSel[i].n object:nil];
        else
            [nc removeObserver:self name:noteAndSel[i].n object:nil];
    }
}
-(void) umsdk_activate {
    //register observers for all uniMag notifications
    [self umsdk_registerObservers:TRUE];

    NSLog(@"activating");

    //enable info level NSLogs inside SDK
    // Here we turn it on before initializing the SDK object so the act of initializing is logged
    [uniMag enableLogging:TRUE];

    //initialize the SDK by creating a uniMag class object
    uniReader = [[uniMag alloc] init];

    /*
    //set SDK to perform the connect task automatically when headset is attached
    [uniReader setAutoConnect:TRUE];
    */

    //set swipe timeout to infinite. By default, the swipe task will time out after 20 seconds
    [uniReader setSwipeTimeoutDuration:0];

    //make SDK maximize the volume automatically during connection
    [uniReader setAutoAdjustVolume:TRUE];

    //By default, the diagnostic wave file logged by the SDK is stored under the temp directory
    // Here it is set to be under the Documents folder in the app sandbox so the log can be accessed
    // through iTunes file sharing. See UIFileSharingEnabled in iOS doc.
    [uniReader setWavePath:[NSHomeDirectory() stringByAppendingPathComponent:@"/Documents/audio.caf"]];
}
I've been searching around for a while to find a way to determine whether an iOS external screen is connected by cable or over the air, and I can't find any obvious way.
I've seen the unofficial AirPlay specs HERE, but can't see any obvious way of detecting it.
Does anybody know if this can be done using a legitimate / public API?
Yes, there actually is a way.
Somewhere in your app, create an instance of MPVolumeView. Hold on to it in some instance variable. You don't have to add it as a subview of anything; it simply has to exist.
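A minimal sketch of that part (assuming a _volumeView instance variable, as used in the methods further down, and that MediaPlayer/MediaPlayer.h is imported):
// Sketch: create the MPVolumeView once (e.g. in viewDidLoad or other setup code)
// and keep a strong reference; it does not need to be in the view hierarchy.
_volumeView = [[MPVolumeView alloc] initWithFrame:CGRectZero];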
Then subscribe to the MPVolumeViewWirelessRouteActiveDidChangeNotification like so:
[[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(handleWirelessRouteActiveDidChangeNotification:)
                                              name:MPVolumeViewWirelessRouteActiveDidChangeNotification
                                            object:nil];
Add these methods to find out about the state of external displays:
- (BOOL)isAirPlayConnected
{
    return _volumeView.isWirelessRouteActive;
}

- (BOOL)isAirPlayMirroringActive
{
    if ([self isAirPlayConnected]) {
        NSArray *screens = [UIScreen screens];
        if ([screens count] > 1) {
            return [screens[1] mirroredScreen] == [UIScreen mainScreen];
        }
    }
    return NO;
}

- (BOOL)isAirPlayPlaybackActive
{
    return [self isAirPlayConnected] && ![self isAirPlayMirroringActive];
}

- (BOOL)isExternalPlaybackActive
{
    if ([self isAirPlayPlaybackActive]) {
        return YES;
    } else {
        NSArray *screens = [UIScreen screens];
        if ([screens count] > 1) {
            return [screens[1] mirroredScreen] != [UIScreen mainScreen];
        }
    }
    return NO;
}
Additionally, you can check for the UIScreenDidConnectNotification and UIScreenDidDisconnectNotification notifications. Armed with all of this, you can tell if you are connected to AirPlay, if AirPlay Mirroring is active, if AirPlay playback (not mirroring) is active, or if you are using any external screen without mirroring.
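A minimal sketch of that part (the handler selector names here are just placeholders):
// Sketch: observe external screen connect/disconnect in addition to the
// AirPlay route notification above. The selector names are hypothetical.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleScreenDidConnectNotification:)
                                             name:UIScreenDidConnectNotification
                                           object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleScreenDidDisconnectNotification:)
                                             name:UIScreenDidDisconnectNotification
                                           object:nil];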
I don't believe there is any public API for this. I would guess that, in Apple's view, this is not your app's concern. It's up to the user what they do with your app's screen: they can screenshot it and email it to everyone, or just plug a wire into a projector and show it on the side of a building. Trying to prevent these from within an app isn't likely to be possible.
You can achieve some of this, however, with Apple's Configurator tool. It allows you to configure, say, a company-owned iOS device to allow AirPlay only to certain hosts. It can also prevent screenshots and other things that might be helpful. I don't know if you can get exactly what you're looking for, but it might be something to look in to if you have some level of control over the devices this app is going to be installed on.
I have an application in which the user can select from local video files. When one of those thumbnails gets tapped, the user is presented with a new view which has a custom video player I've made that presents the video.
This works flawlessly, but only sometimes. The funny thing is that if the user selects a new video (thus getting presented a new view, initializing a new custom video player object) exactly 5 times, the underlying AVPlayerLayer that is used to present the visuals from the player renders black, even though it seems like the underlying asset still loads correctly (the player interface still holds the correct duration for the video and so forth).
When a new custom media player object gets initialized (which happens when the view controller for the media player's containing view gets loaded), this is the part of the initializer method that sets up the AVPlayer and its associated item:
// Start to load the specified asset
mediaAsset = [[AVURLAsset alloc] initWithURL:contentURL options:nil];

if (mediaAsset == nil)
    NSLog(@"The media asset is zero!!!");

// Now we need to asynchronously load in the tracks of the specified asset (like audio and video tracks). We load them asynchronously to avoid having the entire app UI freeze while loading occurs
NSString *keyValueToLoad = @"tracks";

// When loading the tracks asynchronously we also specify a completionHandler, which is the block of code that should be executed once the loading has either finished or for some reason failed
[mediaAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:keyValueToLoad] completionHandler:^
{
    // When this block gets executed we check for potential errors or see if the asset loaded successfully
    NSError *error = nil;
    AVKeyValueStatus trackStatus = [mediaAsset statusOfValueForKey:keyValueToLoad error:&error];
    if (error != nil)
    {
        NSLog(@"Error: %@", error.description);
    }

    //switch (trackStatus) {
    //case AVKeyValueStatusLoaded:
    if (trackStatus == AVKeyValueStatusLoaded)
    {
        NSLog(@"Did load properly!");

        mediaItem = [AVPlayerItem playerItemWithAsset:mediaAsset];

        if (mediaItem.error == nil)
            NSLog(@"Everything went fine!");

        if (mediaItem == nil)
            NSLog(@"THE MEDIA ITEM WAS NIL");

        [mediaItem addObserver:self forKeyPath:@"status" options:0 context:&itemStatusContext];

        mediaContentPlayer = [[AVPlayer alloc] initWithPlayerItem:mediaItem];

        [mediaContentView setPlayer:mediaContentPlayer];
        //mediaContentView = [AVPlayerLayer playerLayerWithPlayer:mediaContentPlayer];

        [activeModeViewBlocked configurePlaybackSliderWithDuration:mediaItem.duration];

        originalDuration = mediaItem.duration.value / mediaItem.duration.timescale;

        // We will subscribe to a timeObserver on the player to check for the current playback time of the movie within a specified interval. Doing so will allow us to frequently update the user interface with correct information
        playbackTimeObserver = [mediaContentPlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 50) queue:dispatch_get_main_queue() usingBlock:^(CMTime time)
        {
            NSLog(@"TIME UPDATED!");
            [activeModeViewBlocked updatePlaybackSlider:time];
        }];

        [self syncUI];
    }

    if (trackStatus == AVKeyValueStatusFailed)
    {
        NSLog(@"Something failed!");
    }

    if (trackStatus == AVKeyValueStatusCancelled)
    {
        NSLog(@"Something was cancelled!");
    }
}];
Now if I initialize this custom media player object 5 times exactly, it always starts to render black screens.
Does anyone have any idea of why this could be happening?
This bit me too. There is a limit on the number of concurrent video players that AVFoundation will allow. That number is four for iOS 4.x; more recently the number seems to have increased (for example, on iOS 7 I've had up to eight on one screen with no issue). That is why it is going black on the fifth one. You can't even assume you'll get four, as other apps may need a 'render pipeline'.
This API causes a render pipeline:
+[AVPlayer playerWithPlayerItem:]
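A common workaround is to tear the player down when its view controller goes away so the render pipeline is released before the next player is created. A sketch only, reusing the ivar names from the question (mediaContentPlayer, mediaItem, playbackTimeObserver, mediaContentView), which are assumed to exist on the class:
// Sketch: release the render pipeline when the player view disappears.
- (void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];

    [mediaContentPlayer pause];
    if (playbackTimeObserver) {
        [mediaContentPlayer removeTimeObserver:playbackTimeObserver];
        playbackTimeObserver = nil;
    }
    [mediaItem removeObserver:self forKeyPath:@"status"];

    [mediaContentView setPlayer:nil];
    [mediaContentPlayer replaceCurrentItemWithPlayerItem:nil];
    mediaContentPlayer = nil;
    mediaItem = nil;
}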
This is not the "no video, audio only" problem; it's just the opposite.
The problem arises when using iOS 5.0. iPads running 4.3 or lower play the same video files flawlessly.
Since iOS 5 changed the way MPMoviePlayerControllers are initialized, I had to do some SDK-version-based programming in order to get the video to display. Before implementing the snippet shown next, the video and its controls wouldn't even show up on the screen. The controller would only show a black square with the size and origin of the given CGRect frame.
The way I handle it is the following:
The video files are located in the Documents folder, so the NSURL has to be initialized with fileURLWithPath. Once that's done, I proceed to initialize the controller with a given frame. Since it wouldn't work otherwise, the view only adds the player once it has changed its loadState; that's achieved by subscribing to a notification. The subscriber selector adds the controller's view to the parent view on the main thread, since the notification could be delivered from other threads.
Initializing and adding video to the view:
-(void)addVideo:(NSString *)videoName onRect:(CGRect)rect {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    iPadMagazineAppDelegate *appDelegate = GET_APP_DELEGATE;
    NSArray *dirArray = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *dirName = [dirArray objectAtIndex:0];

    // get directory name for this issue
    NSURL *baseURL;

    /*
     BUGFIX: Video does not work on iOS 5.0
     */
    if (SYSTEM_VERSION_GREATER_THAN_OR_EQUAL_TO(@"5.0")) {
        baseURL = [[NSURL fileURLWithPath:dirName] URLByAppendingPathComponent:[appDelegate.currentIssue getIssueDirectoryName]];
    } else {
        baseURL = [[NSURL URLWithString:dirName] URLByAppendingPathComponent:[appDelegate.currentIssue getIssueDirectoryName]];
    }
    /* end BUGFIX: Video does not work on iOS 5.0 */

    NSURL *videoURL = [baseURL URLByAppendingPathComponent:videoName];

    MPMoviePlayerController *movieController = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];

    // set frame for player
    movieController.view.frame = rect;

    // set auto resizing masks
    [movieController.view setAutoresizingMask:UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight];

    // don't auto play.
    [movieController setShouldAutoplay:NO];
    [movieController setUseApplicationAudioSession:YES];

    /*
     BUGFIX: Video does not work on iOS 5.0
     */
    if (SYSTEM_VERSION_LESS_THAN_OR_EQUAL_TO(@"5.0")) {
        [movieController prepareToPlay];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(loadVideo:) name:MPMoviePlayerLoadStateDidChangeNotification object:movieController];
    } else {
        [pdfView addSubview:movieController.view];
        [pdfView bringSubviewToFront:movieController.view];
    }
    /* end BUGFIX: Video does not work on iOS 5.0 */

    [_moviePlayerViewControllerArray addObject:movieController];
    [movieController release];

    [pool release];
}
notification handler:
-(void)loadVideo:(NSNotification *)notification {
    for (MPMoviePlayerController *movieController in _moviePlayerViewControllerArray) {
        if (movieController.loadState != MPMovieLoadStateUnknown) {
            [pdfView performSelectorOnMainThread:@selector(addSubview:) withObject:movieController.view waitUntilDone:YES];
            [pdfView performSelectorOnMainThread:@selector(bringSubviewToFront:) withObject:movieController.view waitUntilDone:YES];
            [[NSNotificationCenter defaultCenter] removeObserver:self name:MPMoviePlayerLoadStateDidChangeNotification object:movieController];
        }
    }
}
Thank you for reading this huge question. I appreciate your answers.
cheers.
Try this:
Set MPMoviePlayerController's property "useApplicationAudioSession" to "NO".
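In the addVideo: method above, that would mean changing the line that currently sets it to YES, along these lines (a sketch only):
// Sketch: per the suggestion above, use the system audio session instead of the
// application audio session for the movie player.
[movieController setUseApplicationAudioSession:NO];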
Apparently there's a bug, but it's not related to MPMoviePlayerController; it's in iOS 5 itself.
My iPad was muted via the side switch but still played audio from the iPod app anyway, so I didn't realize it was muted. MPMoviePlayerController was fine, but part of the OS did not notice that the iPad was muted.
I've filed the corresponding bug on Apple's bug tracker. Bug ID# 10368531.
I apologize if I've wasted your time.
UPDATE: Got feedback from Apple for the bug. It's expected behavior. :\