How can I take a picture with the camera when an iOS app is minimized?
(i.e. after applicationDidEnterBackground: / applicationWillResignActive: )
AppDelegate.m: (thank you link)
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
//To make the code block asynchronous
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
//### background task starts
NSLog(#"Running in the background\n");
while(TRUE)
{
printf("Called"); //////Work fine
[self.window.rootViewController captureNow]; /////Capture picture!
[NSThread sleepForTimeInterval: 10.0]; //wait for 10 sec
}
});
return YES;
}
OurViewController.m: (thank you link)
-(IBAction)captureNow {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in _stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port mediaType] isEqual:AVMediaTypeVideo] )
{
videoConnection = connection;
break;
}
}
if (videoConnection)
{
break;
}
}
NSLog(#"about to request a capture from: %#", _stillImageOutput);
[_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (error)
{
NSLog(#"ERROR = %#", error); ///// Error!
}
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer]; ////SIGABRT, cause imageSampleBuffer is nil
UIImage *image = [[UIImage alloc] initWithData:imageData];
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
[image release];
}];
}
This code works fine when the application is active, but it throws an error (SIGABRT) when the app is minimized.
Are there perhaps other libraries that make this possible?
For privacy reasons, you're not allowed to access the camera when your app is in the background.
Why?
Well, I'm glad you asked that. Story time!
Bob is a person who works at the NSA, developing super-secret monkey controlling sharks. Why? He can't say.
Bob one day downloaded an app onto his iPhone, called John's Secret Stealer. Bob doesn't read app titles.
Since Bob is a very forgetful person, he one day forgot to leave his phone in the lockers outside of work. While standing over the super-secret shark recipe, he felt his phone in his pocket, and pulled it out. It had buzzed because he just got a text.
At that moment, John's Secret Stealer took a picture using Bob's phone's rear camera, sent it off to John's servers, and Bob never knew.
The next day, the entire world knew about the secret project to control sharks.
That's an extreme example, but it's the principle behind the rule. Apple's policy is that the user is always in control, to avoid situations like Bob's.
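If your goal is simply to avoid the SIGABRT when the app leaves the foreground (actually capturing from the background is not permitted), one option is to guard the capture call by application state and stop the session while the app is inactive. A minimal sketch, assuming an AVCaptureSession ivar named _session alongside the captureNow method from the question:

// Register (e.g. in viewDidLoad) for the foreground/background notifications:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(appWillResignActive:)
                                             name:UIApplicationWillResignActiveNotification
                                           object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(appDidBecomeActive:)
                                             name:UIApplicationDidBecomeActiveNotification
                                           object:nil];

- (void)appWillResignActive:(NSNotification *)note {
    [_session stopRunning];   // release the camera when leaving the foreground
}

- (void)appDidBecomeActive:(NSNotification *)note {
    [_session startRunning];  // resume the preview/capture session on return
}

- (void)captureIfActive {
    // The camera is unavailable in the background, so imageSampleBuffer would come back nil;
    // only attempt a capture while the app is active.
    if ([UIApplication sharedApplication].applicationState == UIApplicationStateActive) {
        [self captureNow];
    }
}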
Related
I am capturing video in preview mode and would like to display a still image captured by the camera.
I currently save the image and capture output to ivars defined in the interface as:
UIImage *snapshot
AVCaptureStillImageOutput* stillImageOutput;
The video displays fine. However, when I try to capture and display a still image, nothing is appearing and, in fact, the debugger shows the stillImageOutput and image are nil. I think this may be a timing issue with the asynchronous capture and that I need to use a completion handler, but I am weak on completion handlers.
What is the proper way to display a still image immediately after capturing it, without tying up the UI?
Code to capture still:
- (void)takeSnapshot {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
videoConnection = connection;
break;
}
}
if (videoConnection) {
break;
}
}
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer != NULL) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
snapshot = [UIImage imageWithData:imageData];
}
}];
}
Code to display the still. Note the absence of a completion handler, which may be the issue; however, I'm not sure how to write that...
[self takeSnapshot];
self.imageView.image = snapshot;
I would change the takeSnapshot method to take in a completion block and then call that completion block within the completion block of your other async method:
captureStillImageAsynchronouslyFromConnection:completionHandler
Here's an example of a method taking a completion block and then calling back to it in the completion block of a method called internally:
// this correlates to your takeSnapshot method
// you want to add a completion portion to this method
- (void)doSomethingAsynchronouslyWithCompletion:(void (^)(NSData *completionData))completion {
// call your other async method
[self anotherAsyncMethodWithItsOwnCompletion:^(NSData *completionDataFromSecondMethod) {
if (completionDataFromSecondMethod.length > 0) {
// this is where you would receive the CMSampleBufferRef from the completion handler of captureStillImageAsynchronouslyFromConnection:completionHandler
// and convert it over to to data
// make sure the completion block isn't nil if it's nullable
if (completion) {
// you would want to pass back the NSData imageData in the completion block here
completion(completionDataFromSecondMethod);
}
}
}];
}
// this method would simulate the captureStillImageAsynchronouslyFromConnection:completionHandler: method
- (void)anotherAsyncMethodWithItsOwnCompletion:(void (^)(NSData * completionDataFromSecondMethod))anotherCompletion {
// this is just to simulate some time waiting for the asnyc task to complete
// never call sleep in your own code
sleep(3);
if (anotherCompletion) {
// this simulates the CMSampleBufferRef passed back by the captureStillImage...
NSData *fakeCompletionData = [@"FakeCompletionString" dataUsingEncoding:NSUTF8StringEncoding];
anotherCompletion(fakeCompletionData);
}
}
And an example of how you would call it:
[self doSomethingAsynchronouslyWithCompletion:^(NSData *completionData) {
if (completionData.length > 0) {
// come back on the main queue to modify any UI Elements
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
// this is where you want want to set your self.imageView.image
// self.imageView.image = [UIImage imageWithData:{{dataFromCompletion}}]
NSLog(#"The completionString result = %#", [[NSString alloc] initWithData:completionData encoding:NSUTF8StringEncoding]);
}];
}
}];
This link may be helpful for getting you started with block syntax: http://goshdarnblocksyntax.com
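Applied to the snapshot code from the question, a sketch of takeSnapshot rewritten with a completion block might look like this (same stillImageOutput ivar as above; the completion is hopped back to the main queue so the image view can be set directly inside it):

- (void)takeSnapshotWithCompletion:(void (^)(UIImage *snapshot))completion {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        UIImage *image = nil;
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            image = [UIImage imageWithData:imageData];
        }
        // come back to the main queue before the caller touches any UI
        dispatch_async(dispatch_get_main_queue(), ^{
            if (completion) {
                completion(image);
            }
        });
    }];
}

And the calling code then becomes:

[self takeSnapshotWithCompletion:^(UIImage *snapshot) {
    self.imageView.image = snapshot; // already on the main queue here
}];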
We're currently trying to get HealthKit to work in the background, in order to deliver step data to our server when the app is closed.
For experimental purposes we've created a brand new iOS project in Xcode, enabled HealthKit and all background modes under Capabilities. After that, we pretty much run the code (see further down).
So what happens first is that the app of course asks for the permissions, which we grant. What we're expecting is that the app should keep delivering the step data to the server every hour. But it doesn't do that; it seems like the app can't do anything when it's not active.
The app only delivers data when it gets resumed or started, but not at all from the background (soft-closed / hard-closed).
appdelegate.m:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
[self setTypes];
return YES;
}
-(void) setTypes
{
self.healthStore = [[HKHealthStore alloc] init];
NSMutableSet* types = [[NSMutableSet alloc]init];
[types addObject:[HKObjectType quantityTypeForIdentifier:HKQuantityTypeIdentifierStepCount]];
[self.healthStore requestAuthorizationToShareTypes: types
readTypes: types
completion:^(BOOL success, NSError *error) {
dispatch_async(dispatch_get_main_queue(), ^{
[self observeQuantityType];
[self enableBackgroundDeliveryForQuantityType];
});
}];
}
-(void)enableBackgroundDeliveryForQuantityType{
[self.healthStore enableBackgroundDeliveryForType: [HKQuantityType quantityTypeForIdentifier: HKQuantityTypeIdentifierStepCount] frequency:HKUpdateFrequencyImmediate withCompletion:^(BOOL success, NSError *error) {
}];
}
-(void) observeQuantityType{
HKSampleType *quantityType = [HKSampleType quantityTypeForIdentifier:HKQuantityTypeIdentifierStepCount];
HKObserverQuery *query =
[[HKObserverQuery alloc]
initWithSampleType:quantityType
predicate:nil
updateHandler:^(HKObserverQuery *query,
HKObserverQueryCompletionHandler completionHandler,
NSError *error) {
dispatch_async(dispatch_get_main_queue(), ^{
if (completionHandler) completionHandler();
[self getQuantityResult];
});
}];
[self.healthStore executeQuery:query];
}
-(void) getQuantityResult{
NSInteger limit = 0;
NSPredicate* predicate = nil;
NSString *endKey = HKSampleSortIdentifierEndDate;
NSSortDescriptor *endDate = [NSSortDescriptor sortDescriptorWithKey: endKey ascending: NO];
HKSampleQuery *query = [[HKSampleQuery alloc] initWithSampleType: [HKQuantityType quantityTypeForIdentifier:HKQuantityTypeIdentifierStepCount]
predicate: predicate
limit: limit
sortDescriptors: @[endDate]
resultsHandler:^(HKSampleQuery *query, NSArray* results, NSError *error){
dispatch_async(dispatch_get_main_queue(), ^{
// sends the data using HTTP
[self sendData: [self resultAsNumber:results]];
});
}];
[self.healthStore executeQuery:query];
}
I found this out a little while ago when talking to someone from Apple. Apparently you can't access HK data in the background if the device is locked:
NOTE
Because the HealthKit store is encrypted, your app cannot read data
from the store when the phone is locked. This means your app may not
be able to access the store when it is launched in the background.
However, apps can still write data to the store, even when the phone
is locked. The store temporarily caches the data and saves it to the
encrypted store as soon as the phone is unlocked.
from:
https://developer.apple.com/library/ios/documentation/HealthKit/Reference/HealthKit_Framework/
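One way to cope with this limitation (a sketch, not from the original answer) is to note when a background read failed because the device was locked, and retry it once protected data becomes readable again via the app delegate's protected-data callback. pendingHealthKitRead is a hypothetical BOOL property added purely for illustration:

// In getQuantityResult's resultsHandler (or the observer's updateHandler):
// if the read fails while the device is locked, park a flag instead of dropping the update.
if (error) {
    self.pendingHealthKitRead = YES;   // hypothetical flag on the app delegate
}

// UIApplicationDelegate callback, invoked once the device is unlocked and the
// encrypted HealthKit store can be read again.
- (void)applicationProtectedDataDidBecomeAvailable:(UIApplication *)application {
    if (self.pendingHealthKitRead) {
        self.pendingHealthKitRead = NO;
        [self getQuantityResult];      // re-run the step query from the question
    }
}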
I see something that might be causing an issue in your AppDelegate, particularly this line:
[[NSURLConnection alloc] initWithRequest:request delegate:self];
This is creating an NSURLConnection, but not starting it. Try changing it to this:
NSURLConnection *connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
[connection start];
Edit: After taking a second look at the docs
They recommend setting up your observer queries in your application:didFinishLaunchingWithOptions: method. In your code above, you set the HKObserverQuery up in the authorization handler, which is called on an arbitrary background queue. Try making this change to set it up on the main thread:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
[self setTypes];
[self observeQuantityType];
return YES;
}
HKObserverQuery Reference
I added iOS 8's new Touch ID API to my app.
It usually works as expected, BUT when entering the app while my finger is already on the home button, the API's success callback is called but the pop-up still appears on screen. After pressing CANCEL, the UI becomes non-responsive.
I also encountered the same issue, and the solution was to invoke the call to the Touch ID API using a high priority queue, as well as a delay:
// Touch ID must be called with a high priority queue, otherwise it might fail.
// Also, a dispatch_after is required, otherwise we might receive "Pending UI mechanism already set."
dispatch_queue_t highPriorityQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 0.75 * NSEC_PER_SEC), highPriorityQueue, ^{
LAContext *context = [[LAContext alloc] init];
NSError *error = nil;
// Check if device supports TouchID
if ([context canEvaluatePolicy:LAPolicyDeviceOwnerAuthenticationWithBiometrics error:&error]) {
// TouchID supported, show it to user
[context evaluatePolicy:LAPolicyDeviceOwnerAuthenticationWithBiometrics
localizedReason:#"Unlock Using Touch ID"
reply:^(BOOL success, NSError *error) {
if (success) {
// This action has to be on main thread and must be synchronous
dispatch_async(dispatch_get_main_queue(), ^{
...
});
}
else if (error) {
...
}
}];
}
});
When testing our app, we found a delay of 750ms to be optimal, but your mileage may vary.
Update (03/10/2015): Several iOS developers, 1Password for example, are reporting that iOS 8.2 has finally fixed this issue.
Whilst using a delay can potentially address the issue, it masks the root cause. You need to ensure you only show the Touch ID dialog when the application state is Active. If you display it immediately during the launch process (meaning the application is still technically in an inactive state), then these sorts of display issues can occur. This isn't documented, and I found this out the hard way. Providing a delay seems to fix it because your application is in an active state by then, but this isn't guaranteed.
To ensure it runs when the application is active, you can check the current application state, and either run it immediately, or when we receive the applicationDidBecomeActive notification. See below for an example:
- (void)setup
{
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(applicationDidBecomeActive:)
name:UIApplicationDidBecomeActiveNotification
object:nil];
}
- (void)dealloc
{
[[NSNotificationCenter defaultCenter] removeObserver:self];
}
- (void)viewWillAppear:(BOOL)animated
{
[super viewWillAppear:animated];
// We need to be in an active state for Touch ID to play nice
// If we're not, defer the presentation until we are
if([UIApplication sharedApplication].applicationState == UIApplicationStateActive)
{
[self presentTouchID];
}
else
{
__weak __typeof(self) wSelf = self;
_onActiveBlock = ^{
[wSelf presentTouchID];
};
}
}
-(void)applicationDidBecomeActive:(NSNotification *)notif
{
if(_onActiveBlock)
{
_onActiveBlock();
_onActiveBlock = nil;
}
}
- (void)presentTouchID
{
_context = [[LAContext alloc] init];
_context.localizedFallbackTitle = _fallbackTitle;
[_context evaluatePolicy:LAPolicyDeviceOwnerAuthenticationWithBiometrics
localizedReason:_reason
reply: ^(BOOL success, NSError *authenticationError)
{
// Handle response here
}];
}
This accepted answer does not address the underlying cause of the problem: invoking evaluatePolicy() twice, the second time while the first invocation is in progress. So the current solution only works sometimes by luck, as everything is timing dependent.
The brute-force, straightforward way to work around the problem is a simple boolean flag to prevent subsequent calls from happening until the first completes.
AppDelegate *delegate = [[UIApplication sharedApplication] delegate];
if ( NSClassFromString(#"LAContext") && ! delegate.touchIDInProgress ) {
delegate.touchIDInProgress = YES;
LAContext *localAuthenticationContext = [[LAContext alloc] init];
__autoreleasing NSError *authenticationError;
if ([localAuthenticationContext canEvaluatePolicy:LAPolicyDeviceOwnerAuthenticationWithBiometrics error:&authenticationError]) {
[localAuthenticationContext evaluatePolicy:LAPolicyDeviceOwnerAuthenticationWithBiometrics localizedReason:kTouchIDReason reply:^(BOOL success, NSError *error) {
delegate.touchIDInProgress = NO;
if (success) {
...
} else {
...
}
}];
    } // end canEvaluatePolicy: check
} // end touchIDInProgress guard
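For completeness, a sketch of how that flag might be declared on the app delegate (the property name is only an assumption matching the snippet above):

// AppDelegate.h
@interface AppDelegate : UIResponder <UIApplicationDelegate>
@property (strong, nonatomic) UIWindow *window;
// Set while an evaluatePolicy: call is in flight, to prevent re-entrant Touch ID prompts.
@property (nonatomic) BOOL touchIDInProgress;
@end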
I started getting the "Pending UI mechanism already set." error mentioned as well, so I decided to see if other apps were affected. I have both Dropbox and Mint set up for Touch ID. Sure enough Touch ID wasn't working for them either and they were falling back to passcodes.
I rebooted my phone and it started working again, so it would seem the Touch ID can bug out and stop working. I'm on iOS 8.2 btw.
I guess the proper way to handle this condition is to do what those apps do and fall back to a password / passcode.
Concurrency, GCD, HUD, iOS
Can some GCD expert tell me how to alter the following method, specifically the "HUD AREA" ?
The HUD only flashes for a few seconds, when it needs to stay up for about 45 seconds while all of the
"HUD AREA" code finishes. I only need the corrected use of GCD (async) here.
NSFetchedResultsControllers keep the tableView usable during the deep-copy run, where new (unique) data in the default DB is moved into the user's existing DB. This code works, but the NSLog messages continue to scroll by long after the HUD disappears. I am stuck. I am sorry I am so lame in this area.
Many Thanks for reading this, Mark
- (void)loadStore {
if (_store) {return;} // Don’t load store if it’s already loaded
iHungry_MeAppDelegate *appDel = (iHungry_MeAppDelegate*)[[UIApplication sharedApplication] delegate];
BOOL isMigrationNecessary = [self isMigrationNecessaryForStore:[appDel storeURL]];
if (isMigrationNecessary) { // DM Ver upgrade
[self performMigrationForStore:[appDel storeURL]]; // quick
}
BOOL newDataNeedsImporting =
[self isNewDefaultDataAlreadyImportedForStoreWithURL:appDel.storeURL
ofType:NSSQLiteStoreType]; // Data Ver upgrade // quick
if (newDataNeedsImporting) {
/* BEGIN HUD AREA */
[MBProgressHUD showHUDAddedTo:appDel.rootTableViewController.view animated:YES];
dispatch_async(dispatch_get_main_queue(), ^{
[self loadSourceStore]; // quick
[self deepCopyFromPersistentStore:nil]; // LONG
dispatch_async(dispatch_get_main_queue(), ^{
NSDictionary *options =
@{
    NSMigratePersistentStoresAutomaticallyOption: @YES,
    NSInferMappingModelAutomaticallyOption: @YES
};
NSError *error = nil;
DLog(#"Adding Main Store After DeepCopy");
_store = [_coordinator addPersistentStoreWithType:NSSQLiteStoreType
configuration:nil URL:[appDel storeURL]
options:options error:&error];
if (!_store) {NSLog(#"Failed to add store. Error: %#", error);
abort();
}
else{NSLog(#"Successfully added store: %#", _store);
}
[self setNewDefaultDataAsImportedForStore:_store];// in Store's MetaData
[MBProgressHUD hideHUDForView:appDel.rootTableViewController.view animated:YES];
});
});
/* END HUD AREA */
}else{
DLog(#"Normal Non-Upgrade Load.");
...
}
}
Your very first call to dispatch_async is using the main queue instead of a background queue.
Change it to:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
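Put together, the "HUD AREA" then reads roughly as below (a sketch reusing the calls from the question: show the HUD on the main thread, run the long deep copy on a global queue, then hop back to the main queue to add the store and hide the HUD):

/* BEGIN HUD AREA */
[MBProgressHUD showHUDAddedTo:appDel.rootTableViewController.view animated:YES];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // long-running work stays off the main thread so the HUD can animate
    [self loadSourceStore];                  // quick
    [self deepCopyFromPersistentStore:nil];  // LONG
    dispatch_async(dispatch_get_main_queue(), ^{
        // back on the main thread to finish the Core Data setup and update the UI
        NSDictionary *options = @{ NSMigratePersistentStoresAutomaticallyOption: @YES,
                                   NSInferMappingModelAutomaticallyOption: @YES };
        NSError *error = nil;
        _store = [_coordinator addPersistentStoreWithType:NSSQLiteStoreType
                                            configuration:nil
                                                      URL:[appDel storeURL]
                                                  options:options
                                                    error:&error];
        if (!_store) { NSLog(@"Failed to add store. Error: %@", error); abort(); }
        [self setNewDefaultDataAsImportedForStore:_store]; // in Store's MetaData
        [MBProgressHUD hideHUDForView:appDel.rootTableViewController.view animated:YES];
    });
});
/* END HUD AREA */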
I'm having a problem when installing my app from the App Store: the camera doesn't work on the iPhone 5S, and ONLY on the iPhone 5S. The weird thing is that it does work on the other iPhones, and IT WORKS on the iPhone 5S when installed from Xcode; it's the same version, I checked.
The bug appears in the camera. I'm using AVCaptureSession to show the camera preview, and this is the code that I'm using to capture the picture.
// Capture the image
- (void) capImage {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection) {
break;
}
}
NSLog(#"about to request a capture from: %#", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
if (imageSampleBuffer != NULL) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:(imageSampleBuffer)];
self.imagenCamara.image = [UIImage imageWithData:imageData];
imageSampleBuffer = nil;
imageData = nil;
}
}];
}
Oh, the bug is only in the camera; everything else in the app works. It just doesn't capture the pic, and the image stays gray when the pic is taken.
I can't seem to find the problem; it makes no sense to me that it only stops working on the iPhone 5S when installed from the App Store but works when installed from Xcode.