Getting an audio device with OpenAL - iPad

I'm trying to use OpenAL for an iOS game I'm working on, but I'm having an issue opening the audio device. Specifically, when I call alcOpenDevice(NULL), I get NULL in return. This causes problems further down, of course, but I can't tell what I'm doing wrong.
I'm new to OpenAL, so I've been looking at a couple of guides here and here to see what I need to do. If I download their sample projects and test them, they both work fine. If I copy their files into my project and ignore the files I made, they still work fine. I'm assuming something got lost in translation when I started rebuilding the code for use in my project. Asking around and searching online hasn't given me any leads, so I'm hoping someone here can put me on the right track.
Here's the actual setup code I'm using in my AudioPlayer.m
- (void)setup {
    audioSampleBuffers = [NSMutableDictionary new];
    audioSampleSources = [NSMutableArray new];

    [self setupAudioSession];
    [self setupAudioDevice];
    [self setupNotifications];
}
- (BOOL)setupAudioSession {
//    // This has been deprecated.
//
//    /* Setup the Audio Session and monitor interruptions */
//    AudioSessionInitialize(NULL, NULL, AudioInterruptionListenerCallback, NULL);
//
//    /* Set the category for the Audio Session */
//    UInt32 session_category = kAudioSessionCategory_MediaPlayback;
//    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(session_category), &session_category);
//
//    /* Make the Audio Session active */
//    AudioSessionSetActive(true);

    BOOL success = NO;
    NSError *error = nil;

    AVAudioSession *session = [AVAudioSession sharedInstance];

    success = [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (!success) {
        NSLog(@"%@ Error setting category: %@", NSStringFromSelector(_cmd), [error localizedDescription]);
        return success;
    }

    success = [session setActive:YES error:&error];
    if (!success) {
        NSLog(@"Error activating session: %@", [error localizedDescription]);
    }

    return success;
}
- (BOOL)setupAudioDevice {
    // 'NULL' uses the default device.
    openALDevice = alcOpenDevice(NULL); // Returns 'NULL'

    // Note: ALC-level errors are reported via alcGetError(device);
    // alGetError() only reports AL errors for the current context.
    ALenum error = alGetError(); // Returns '0'
    NSLog(@"%i", error);

    if (!openALDevice) {
        NSLog(@"Something went wrong setting up the audio device.");
        return NO;
    }

    // Create a context to use with the device, and make it the current context.
    openALContext = alcCreateContext(openALDevice, NULL);
    alcMakeContextCurrent(openALContext);

    [self createAudioSources];

    // Setup was successful
    return YES;
}
- (void)createAudioSources {
    ALuint sourceID;
    for (int i = 0; i < kMaxConcurrentSources; i++) {
        // Create a single source.
        alGenSources(1, &sourceID);
        // Add it to the array.
        [audioSampleSources addObject:[NSNumber numberWithUnsignedInt:sourceID]];
    }
}
Note: I'm running iOS 7.1.1 on a new iPad Air and using Xcode 5.1.1. This issue has been confirmed on the iPad, the simulator, and an iPod touch.

The Short Answer:
Apple's implementation of alcOpenDevice() only returns the device the first time it is called; every subsequent call returns NULL. A lot of Apple audio code calls this function under the hood, so take out EVERY TRACE of other audio code before using OpenAL and calling that function yourself.
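One way to guard against an accidental second call is to open the default device exactly once and cache it. A minimal sketch, assuming the openALDevice/openALContext globals mirror the ones in the question:
#import <Foundation/Foundation.h>
#import <OpenAL/al.h>
#import <OpenAL/alc.h>

static ALCdevice  *openALDevice  = NULL;
static ALCcontext *openALContext = NULL;

// Open the default device once for the life of the app and reuse it afterwards.
static BOOL EnsureOpenALDevice(void)
{
    if (openALDevice != NULL) {
        return YES; // already open; never call alcOpenDevice() a second time
    }

    openALDevice = alcOpenDevice(NULL);
    if (openALDevice == NULL) {
        NSLog(@"alcOpenDevice(NULL) returned NULL - was the device already opened elsewhere?");
        return NO;
    }

    openALContext = alcCreateContext(openALDevice, NULL);
    alcMakeContextCurrent(openALContext);
    return YES;
}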
The Long Answer:
I spent half a day dealing with this problem while using ObjectAL, and ended up doing exactly what you did: re-making the entire project. It worked, until out of curiosity I copied the entire project over and hit the same problem again; alcOpenDevice(NULL) returned NULL. By chance I stumbled upon the answer. It was this bit of code in my Swift game scene:
let jumpSound = SKAction.playSoundFileNamed("WhistleJump.mp3", waitForCompletion: false)
And then I remembered I had hit this problem before, without SKAction involved. That time it turned out I was using ObjectAL in two different ways: I used OALSimpleAudio in one place and OpenAL objects in another, and that initialized my audio session twice.
The common thread between these two incidents is that both times alcOpenDevice() was called more than once during the life of the application. The first time, ObjectAL called it twice because of my misuse of the library. The second time, SKAction.playSoundFileNamed() must have called alcOpenDevice() before my ObjectAL code did. Upon further research I found this bit in the OpenAL 1.1 Specification:
6.1.1. Connecting to a Device
The alcOpenDevice function allows the application (i.e. the client program) to connect to a device (i.e. the server).
ALCdevice * alcOpenDevice (const ALCchar *deviceSpecifier);
If the function returns NULL, then no sound driver/device has been found. The argument is a null terminated string that requests a certain device or device configuration. If NULL is specified, the implementation will provide an implementation specific default.
My hunch is that Apple's implementation of this function only returns the device ONCE for the life of the application; every call to alcOpenDevice after that returns NULL. So, bottom line: take out every trace of audio code before switching to OpenAL. Even code that seems safe, like SKAction.playSoundFileNamed(), might still contain a call to alcOpenDevice() buried deep in its implementation.
For those using ObjectAL, here is the console output of this problem, to help them find their way here from Google, since I couldn't find a good answer myself:
OAL Error: +[ALWrapper openDevice:]: Could not open device (null)
OAL Error: -[ALDevice initWithDeviceSpecifier:]: <ALDevice: 0x17679b20>: Failed to create OpenAL device (null)
OAL Error: +[ALWrapper closeDevice:]: Invalid Enum (error code 0x0000a003)
OAL Warning: -[OALAudioSession onAudioError:]: Received audio error notification, but last reset was 0.012216 seconds ago. Doing nothing.
fatal error: unexpectedly found nil while unwrapping an Optional value

This SO answer seems to validate my comment about AVAudioSession conflicting with OpenAL. Try removing AVAudioSession, or initializing OpenAL first (though I imagine this would cause the inverse problem).

Alright, so I ended up starting over in a fresh project with a copy-pasted version of AudioSamplePlayer from the first sample project I linked. It worked.
I then edited it step by step back to the format I had set up in my project. It still works!
I still don't know what I did wrong the first time, and I'm not even sure it was in my audio player anymore, but it's running now. I blame gremlins.
...maybe alien surveillance.

Related

Can't read or create UIManagedDocuments anymore

I have a big issue in my app which prevents creating new documents and reading them, whereas it worked well until now.
I didn't change anything, and it started misbehaving from one build to the next.
This is the code I'm using:
CLProject *project = [[CLProject alloc] initWithFileURL:projectURL];
NSLog (#"Will save project at URL: %#", projectURL);
[project saveToURL:projectURL forSaveOperation:UIDocumentSaveForCreating completionHandler:^(BOOL success) {
NSLog (#"Project saved: %d", success);
[...]
}];
CLProject is a subclass of UIManagedDocument.
The first NSLog is called, but not the second one. Instead I get an error :
2018-02-14 19:21:03.597495+0100 CamList[2247:750786] Will save project
at URL:
file:///var/mobile/Containers/Data/Application/151E38F5-2214-4876-A188-2AB8B5E8CF6A/Documents/Projects/715A0087-F2EF-439B-A2DD-8E878EF8A973.camlist
2018-02-14 19:21:03.783397+0100 CamList[2247:750886] [default] [ERROR]
Could not get attribute values for item
/var/mobile/Containers/Data/Application/151E38F5-2214-4876-A188-2AB8B5E8CF6A/Documents/Projects/715A0087-F2EF-439B-A2DD-8E878EF8A973.camlist
(n). Error: Error Domain=NSFileProviderInternalErrorDomain Code=1 "The
reader is not permitted to access the URL."
UserInfo={NSLocalizedDescription=The reader is not permitted to access
the URL.}
But it doesn't crash; the app keeps running (though nothing happens, because the completion block never gets called).
What I don't understand is that everything was working fine and I haven't changed anything...
Can you help me??
Thanks
Well, it seems to work fine again this morning... Nothing to understand. My iPhone must have been tired...

AVAudioPlayer will not play audio in one project with all correct parameters, but will work fine in another project

I have a huge Xcode project that I have been working on an update to. After installing the iOS 7.1 SDK, playing audio with AVAudioPlayer in this project no longer works. I created a new, blank project to test out the exact same code, and it worked perfectly.
I know for certain that the file is copied into the bundle resources, the file is added to the target, and the URL is correct because I was able to get the NSData from the NSURL of the file and it matched. The AVAudioPlayer is a property with both the strong and nonatomic attributes, but it will not play in this one project. I also made sure to set the AVAudioSession to playback mode.
I even created some blank view controller classes to test out the AVAudioPlayer in the project, and it would not work in any of them, but in the new, blank iOS project I made, the sounds plays fine.
In the .h
NSURL *soundUrl;
@property (nonatomic, strong) AVAudioPlayer *soundPlayer;
In the .m
@synthesize soundPlayer = _soundPlayer;
- (void)playSomeSound {
    NSError *audioError;
    soundUrl = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:soundFile ofType:@"m4a"]];
    _soundPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundUrl error:&audioError];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:NULL];
    _soundPlayer.delegate = self;
    [_soundPlayer play];
}
soundFile is just a string of the name of the sound file that I am trying to play.
I try to log for errors in all the delegate methods of the audio player and in the initialization of the audio player. All errors return (null).
Unlike on iOS 7, where the audio simply does not play, on iOS 6.1, initializing the audio player causes a crash with an EXC_BAD_ACCESS code 2.
Also, the AVAudioPlayer in CocosDenshion seems to work. (I use cocos2d in parts of my project, but it is not a game.)
Another (possibly important) note is that I use AVAudioRecorder in my project as well. That works perfectly without any issues, and I make sure to switch the AVAudioSessionCategory to playback when I am not recording.
It's due to the change that Apple made to AVAudioSession properties in iOS 7.
As per Apple's documentation:
The audio category has changed (e.g. AVAudioSessionCategoryPlayback has been changed to AVAudioSessionCategoryPlayAndRecord).
So change the line [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:NULL]; to
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:NULL];
Considering that exactly the same code works in your other project, it's hard to tell what the problem is in this one. But I can give you a couple of suggestions for debugging it further:
Call your -playSomeSound as the first thing you do in application:didFinishLaunchingWithOptions:. This way you will see whether it's not working because of something else you do before calling it.
Add another sound to your project, and try playing that one. Perhaps still the problem is with the original sound.
Also, a couple of questions:
Where is audioError declared?
What's the result of [_soundPlayer play] (it should return YES on success)?
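For example, a quick check of both might look like this (a sketch only, meant to go inside -playSomeSound; soundUrl and _soundPlayer are the names from the question's code):
// Inside -playSomeSound, after setting up the URL:
NSError *audioError = nil;
_soundPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundUrl error:&audioError];
if (audioError) {
    NSLog(@"AVAudioPlayer init failed: %@", audioError);
}
BOOL started = [_soundPlayer play];
NSLog(@"-play returned %@", started ? @"YES" : @"NO");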
Try this:
[_soundPlayer prepareToPlay];
[_soundPlayer play];
Thanks everyone, but it somehow ended up being a conflict between some frameworks. There was a bug in the Kamcord SDK for iOS, but I am in contact with their development team, and they are working to fix it.
This issue is still apparent in the latest Kamcord SDK, so instead of everyone who runs into this issue having to contact them, here's the smallest amount of code/changes you need to fix AVAudioPlayer and to have Kamcord also include audio from AVAudioPlayer in its recordings.
First, add OpenAL to your frameworks.
Import the OpenAL headers in AppDelegate.m:
#import <OpenAL/al.h>
#import <OpenAL/alc.h>
Set up static variables for OpenAL device and context:
static ALCdevice *openALDevice;
static ALCcontext *openALContext;
Create two functions to start and to stop OpenAL
- (void)startOpenAL
{
    NSLog(@"OpenAL: Starting");
    openALDevice = alcOpenDevice(NULL);
    openALContext = alcCreateContext(openALDevice, NULL);
    alcMakeContextCurrent(openALContext);
}

- (void)stopOpenAL
{
    NSLog(@"OpenAL: Stopping");
    alcMakeContextCurrent(NULL);
    alcDestroyContext(openALContext);
    alcCloseDevice(openALDevice);
}
In AppDelegate.m's application:didFinishLaunchingWithOptions:, start OpenAL (I do it before initializing Kamcord):
[self startOpenAL];
Finally, in applicationWillTerminate:, stop OpenAL:
[self stopOpenAL];
You're done. AVAudioPlayer should start working again, and all of your audio will be correctly recorded by Kamcord.

Play audio through upper (phone call) speaker

I'm trying to get audio in my app to play through the upper speaker on the iPhone, the one you press to your ear during a phone call. I know it's possible, because I've played a game from the App Store ("The Heist" by "tap tap tap") that simulates phone calls and does exactly that.
I've done a lot of research online, but I'm having a surprisingly hard time finding ANYONE who has even discussed the possibility. The overwhelming majority of posts seem to be about the handsfree speaker vs plugged-in earphones, (like this and this and this), rather than the upper "phone call" speaker vs the handsfree speaker. (Part of that problem might be not having a good name for it: "phone speaker" often means the handsfree speaker at the bottom of the device, etc, so it's hard to do a really well-targeted search). I've looked into Apple's Audio Session Category Route Overrides, but those again seem to (correct me if I'm wrong) deal only with the handsfree speaker at the bottom, not the speaker at the top of the phone.
I have found ONE post that seems to be about this: link. It even provides a bunch of code, so I thought I was home free, but now I can't seem to get the code to work. For simplicity I just copied the DisableSpeakerPhone method (which if I understand it correctly should be the one to re-route audio to the upper speaker) into my viewDidLoad to see if it would work, but the first "assert" line fails, and the audio continues to play out the bottom. (I also imported the AudioToolbox Framework, as suggested in the comment, so that isn't the problem.)
Here is the main block of code I'm working with (this is what I copied into my viewDidLoad to test), although there are a few more methods in the article I linked to:
void DisableSpeakerPhone () {
    UInt32 dataSize = sizeof(CFStringRef);
    CFStringRef currentRoute = NULL;
    OSStatus result = noErr;

    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &dataSize, &currentRoute);

    // Set the category to use the speakers and microphone.
    UInt32 sessionCategory = kAudioSessionCategory_PlayAndRecord;
    result = AudioSessionSetProperty (
        kAudioSessionProperty_AudioCategory,
        sizeof (sessionCategory),
        &sessionCategory
    );
    assert(result == kAudioSessionNoError);

    Float64 sampleRate = 44100.0;
    dataSize = sizeof(sampleRate);
    result = AudioSessionSetProperty (
        kAudioSessionProperty_PreferredHardwareSampleRate,
        dataSize,
        &sampleRate
    );
    assert(result == kAudioSessionNoError);

    // Default to speakerphone if a headset isn't plugged in.
    // Overriding the output audio route
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_None;
    dataSize = sizeof(audioRouteOverride);
    AudioSessionSetProperty(
        kAudioSessionProperty_OverrideAudioRoute,
        dataSize,
        &audioRouteOverride);
    assert(result == kAudioSessionNoError);

    AudioSessionSetActive(YES);
}
So my question is this: can anyone either A) help me figure out why that code doesn't work, or B) offer a better suggestion for being able to press a button and route the audio up to the upper speaker?
PS I am getting more and more familiar with iOS programming, but this is my first foray into the world of AudioSessions and such, so details and code samples are much appreciated! Thank you for your help!
UPDATE:
From the suggestion of "He Was" (below) I've removed the code quoted above and replaced it with:
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayAndRecord error:nil];
[[AVAudioSession sharedInstance] setActive: YES error:nil];
at the beginning of viewDidLoad. It still isn't working, though, (by which I mean the audio is still coming out of the speaker at the bottom of the phone instead of the receiver at the top). Apparently the default behavior should be for AVAudioSessionCategoryPlayAndRecord to send audio out of the receiver on its own, so something is still wrong.
More specifically what I'm doing with this code is playing audio through the iPod Music Player (initialized right after the AVAudioSession lines above in viewDidLoad, for what it's worth):
_musicPlayer = [MPMusicPlayerController iPodMusicPlayer];
and the media for that iPod Music Player is chosen through an MPMediaPickerController:
- (void) mediaPicker: (MPMediaPickerController *) mediaPicker didPickMediaItems: (MPMediaItemCollection *) mediaItemCollection {
    if (mediaItemCollection) {
        [_musicPlayer setQueueWithItemCollection: mediaItemCollection];
        [_musicPlayer play];
    }
    [self dismissViewControllerAnimated:YES completion:nil];
}
This all seems fairly straightforward to me: I've got no errors or warnings, and I know that the Media Picker and Music Player are working correctly because the correct songs start playing; it's just out of the wrong speaker. Could there be a "play media using this AudioSession" method or something? Or is there a way to check which audio session category is currently active, to confirm that nothing could have switched it back? Is there a way to emphatically tell the code to USE the receiver, rather than relying on the default to do so? I feel like I'm on the one-yard line; I just need to cross that final bit...
EDIT: I just thought of a theory, wherein it's something about the iPod Music Player that doesn't want to play out of the receiver. My reasoning: it is possible to set a song to start playing through the official iPod app and then seamlessly adjust it (pause, skip, etc.) through the app I'm developing. The continuous playback from one app to the next made me think that maybe the iPod Music Player has its own audio route settings, or maybe it doesn't stop to check the settings in the new app? Does anyone who knows what they're talking about think it could be something like that?
I was struggling with this for a while too; maybe this will help someone later. You can also use the newer methods for overriding ports. Many of the methods in your sample code are actually deprecated.
So, get your AVAudioSession shared instance:
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[session setActive: YES error:nil];
The session category has to be AVAudioSessionCategoryPlayAndRecord
You can get the current output by checking this value.
AVAudioSessionPortDescription *routePort = session.currentRoute.outputs.firstObject;
NSString *portType = routePort.portType;
And now depending on the port you want to send it to, simply toggle the output using
if ([portType isEqualToString:@"Receiver"]) {
    [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&error];
} else {
    [session overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&error];
}
This should be a quick way to toggle the output between the speakerphone and the receiver.
You have to initialise your audio session first.
Using the C API
AudioSessionInitialize (NULL, NULL, NULL, NULL);
In iOS 6 you can use AVAudioSession methods instead (you will need to import the AVFoundation framework to use AVAudioSession):
Initialization using AVAudioSession
self.audioSession = [AVAudioSession sharedInstance];
Setting the audioSession category using AVAudioSession
[self.audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
error:nil];
For further research, if you want better search terms, here are the full names of the constants for the speakers:
const CFStringRef kAudioSessionOutputRoute_BuiltInReceiver;
const CFStringRef kAudioSessionOutputRoute_BuiltInSpeaker;
see Apple's docs here
But the real mystery is why you are having any trouble routing to the receiver. It's the default behaviour for the playAndRecord category. Apple's documentation of kAudioSessionOverrideAudioRoute_None:
"Specifies, for the kAudioSessionCategory_PlayAndRecord category, that output audio should go to the receiver. This is the default output audio route for this category."
update
In your updated question you reveal that you are using the MPMusicPlayerController class. This class invokes the global music player (the same player used in the Music app). This music player is separate from your app, and so doesn't share the same audio session as your app's audioSession. Any properties you set on your app's audioSession will be ignored by the MPMusicPlayerController.
If you want control over your app's audio behaviour, you need to use an audio framework internal to your app. This would be AVAudioRecorder / AVAudioPlayer or Core Audio (Audio Queues, Audio Units or OpenAL). Whichever method you use, the audio session can be controlled either via AVAudioSession properties or via the Core Audio API. Core Audio gives you more fine-grained control, but with each new release of iOS more of it is ported over to AVFoundation, so start with that.
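For instance, here is a minimal sketch of in-app playback under PlayAndRecord (the "ring.caf" file name and the receiverPlayer property are placeholders of mine, not anything from the question); because PlayAndRecord routes to the receiver by default, this should come out of the earpiece:
#import <AVFoundation/AVFoundation.h>

// Hypothetical helper on a view controller; "ring.caf" is a placeholder bundle resource.
- (void)playThroughReceiver
{
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [session setActive:YES error:&error];

    NSURL *url = [[NSBundle mainBundle] URLForResource:@"ring" withExtension:@"caf"];

    // Keep a strong reference (e.g. a property) so the player isn't deallocated mid-playback.
    self.receiverPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    [self.receiverPlayer play]; // plays through the receiver by default under PlayAndRecord
}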
Also remember that the audio session provides a way for you to describe the intended behaviour of your app's audio in relation to the total iOS environment, but it will not hand you total control. Apple takes care to ensure that the user's expectations of their device's audio behaviour remain consistent between apps, and when one app needs to interrupt another's audio stream.
update 2
In your edit you allude to the possibility of audio sessions checking other apps' audio session settings. That does not happen¹. The idea is that each app sets its preferences for its own audio behaviour using its self-contained audio session. The operating system arbitrates between conflicting audio requirements when more than one app competes for an unshareable resource, such as the internal microphone or one of the speakers, and will usually decide in favour of the behaviour that is most likely to meet the user's expectations of the device as a whole.
The MPMusicPlayerController class is slightly unusual in that it gives one app some degree of control over another. In this case, your app is not playing the audio; it is sending a request to the Music player to play audio on your behalf. Your control is limited by the extent of the MPMusicPlayerController API. For more control, your app will have to provide its own implementation of audio playback.
In your comment you wonder:
Could there be a way to pull an MPMediaItem from the MPMusicPlayerController and then play them through the app-specific audio session, or anything like that?
That's a (big) subject for a new question. Here is a good starting read (from Chris Adamson's blog): From iPod Library to PCM Samples in Far Fewer Steps Than Were Previously Necessary - it's the sequel to From iphone media library to pcm samples in dozens of confounding and potentially lossy steps - and it should give you a sense of the complexity you will face. This may have become easier since iOS 6, but I wouldn't be so sure!
¹ There is an otherAudioPlaying read-only BOOL property in iOS 6, but that's about it.
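To illustrate the "provide your own implementation" route with library content: if you only need to play the items the user picked (and they are not DRM-protected), a hedged sketch is to read each MPMediaItem's asset URL and hand it to AVPlayer inside your own audio session, so the PlayAndRecord receiver routing applies. The playPickedItem: method and the player property are assumptions of mine, not part of the answer above:
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>

// Hypothetical helper: play a picked media item through the app's own audio session.
- (void)playPickedItem:(MPMediaItem *)item
{
    NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
    if (!assetURL) {
        NSLog(@"No asset URL - probably a DRM-protected or cloud-only item.");
        return;
    }

    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [session setActive:YES error:&error]; // PlayAndRecord defaults to the receiver

    // Keep a strong reference (e.g. a property) so playback isn't cut short.
    self.player = [AVPlayer playerWithURL:assetURL];
    [self.player play];
}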
Swift 3.0 Code
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
    let routePort: AVAudioSessionPortDescription? = audioSession.currentRoute.outputs.first
    let portType: String? = routePort?.portType
    if portType == "Receiver" {
        try? audioSession.overrideOutputAudioPort(.speaker)
    } else {
        try? audioSession.overrideOutputAudioPort(.none)
    }
}
Swift 5.0
func activateProximitySensor(isOn: Bool) {
    let device = UIDevice.current
    device.isProximityMonitoringEnabled = isOn
    if isOn {
        NotificationCenter.default.addObserver(self, selector: #selector(proximityStateDidChange), name: UIDevice.proximityStateDidChangeNotification, object: device)
        let session = AVAudioSession.sharedInstance()
        do {
            try session.setCategory(.playAndRecord)
            try session.setActive(true)
            try session.overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
        } catch {
            print("\(#file) - \(#function) error: \(error.localizedDescription)")
        }
    } else {
        NotificationCenter.default.removeObserver(self, name: UIDevice.proximityStateDidChangeNotification, object: device)
    }
}

@objc func proximityStateDidChange(notification: NSNotification) {
    if let device = notification.object as? UIDevice {
        print(device)
        let session = AVAudioSession.sharedInstance()
        do {
            let routePort: AVAudioSessionPortDescription? = session.currentRoute.outputs.first
            let portType = routePort?.portType
            if let type = portType, type.rawValue == "Receiver" {
                try session.overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
            } else {
                try session.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
            }
        } catch {
            print("\(#file) - \(#function) error: \(error.localizedDescription)")
        }
    }
}

Detect if headset is plugged in - iOS 5

I'm aware that this is a question already asked, I've found possible duplicates:
Detecting if headphones are plugged into iPhone
headphone plug-in plug-out event when audio route doesn't change - iOS
Detect if headphones (not microphone) are plugged in to an iOS device
...and more info on the WWW. But I've tried out every solution given, and every time I had a problem, probably because they are old threads referring to iOS 4.
How can I detect it on iOS 5.0?
Thanks
If you're okay with an iOS 6-only solution, Apple added several new AVAudioSession properties that let you detect audio routes in just a few lines (and without the use of C).
Use this method to check for headphones (or adjust it to check for other outputs - "Speaker", "Headset", etc.):
- (BOOL)isHeadsetPluggedIn
{
    // Get array of current audio outputs (there should only be one)
    NSArray *outputs = [[AVAudioSession sharedInstance] currentRoute].outputs;
    NSString *portName = [[outputs objectAtIndex:0] portName];
    if ([portName isEqualToString:@"Headphones"]) {
        return YES;
    }
    return NO;
}
If you want to respond to audio route changes passively, you can do this with the new NSNotification, AVAudioSessionRouteChangeNotification. Unfortunately, this notification doesn't tell you what the new route is, just the previous route that it switched from. But, you can just call some variation of the method above to get the current route.
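A minimal sketch of that pattern, assuming the isHeadsetPluggedIn method above lives on the same class (the observeRouteChanges and audioRouteChanged: names are mine, not Apple API):
// Register once, e.g. in viewDidLoad.
- (void)observeRouteChanges
{
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(audioRouteChanged:)
                                                 name:AVAudioSessionRouteChangeNotification
                                               object:nil];
}

// The notification only describes the previous route, so re-query the current one.
// Note: this notification may be posted on a background thread.
- (void)audioRouteChanged:(NSNotification *)notification
{
    BOOL headsetIn = [self isHeadsetPluggedIn];
    NSLog(@"Route changed; headphones plugged in: %@", headsetIn ? @"YES" : @"NO");
}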
Wes seems to have a great solution. Alas, it is not international-proof: this code only works when the device language is English. In Dutch, for instance, the headset is called 'Koptelefoon', and portName indeed contains 'Koptelefoon', which makes the test fail.
This will do the job correctly in any language:
if ([portDescription.portType isEqualToString:AVAudioSessionPortHeadphones]) {
    // headphones are plugged in
}
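Putting the two answers together, a locale-proof variant of the method above might look like this (a sketch; checking every output rather than only the first is my own addition):
- (BOOL)isHeadsetPluggedIn
{
    // Compare against the portType constant instead of the localized portName.
    NSArray *outputs = [[AVAudioSession sharedInstance] currentRoute].outputs;
    for (AVAudioSessionPortDescription *portDescription in outputs) {
        if ([portDescription.portType isEqualToString:AVAudioSessionPortHeadphones]) {
            return YES;
        }
    }
    return NO;
}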

NSFileCoordinator error when using UIManagedDocument in iOS 5.0 simulator

I am using a UIManagedDocument in iOS 5.0, running the app on the simulator, using Xcode 4.2 under OS X 10.6. The code in question looks as follows:
if (![[NSFileManager defaultManager] fileExistsAtPath:[self.photoDatabase.fileURL path]]) {
    // does not exist on disk, so create it
    [self.photoDatabase saveToURL:self.photoDatabase.fileURL forSaveOperation:UIDocumentSaveForCreating completionHandler:^(BOOL success) {
        [self setupFetchedResultsController];
        [self fetchFlickrDataIntoDocument:self.photoDatabase];
    }];
} else if (self.photoDatabase.documentState == UIDocumentStateClosed) {
    // exists on disk, but we need to open it
    // *** the following line generates the message ***
    [self.photoDatabase openWithCompletionHandler:^(BOOL success) {
        //[self setupFetchedResultsController];
    }];
} else if (self.photoDatabase.documentState == UIDocumentStateNormal) {
    // already open and ready to use
    [self setupFetchedResultsController];
}
Running the marked line creates the following message on the log:
2012-01-10 22:33:17.109 Photomania[5149:4803] NSFileCoordinator: A surprising server error was signaled. Details: Connection invalid
After the message is sent, the UIManagedDocument may or may not work; I have not yet found the circumstances that determine this.
I am pretty sure that the code is correct, as it's actually one of the code examples in the CS193p course from Stanford. The whole example can be downloaded at their website under
http://www.stanford.edu/class/cs193p/cgi-bin/drupal/
Direct link to the code:
http://www.stanford.edu/class/cs193p/cgi-bin/drupal/system/files/sample_code/Photomania_0.zip
Additionally, the code runs fine on the device itself, without generating the "surprising" message, and running all the code that comes afterwards just fine.
I have not found anything on Google, nor on the Apple Developer pages. Restarting the simulator or Xcode, or reinstalling both of them, does not change the behaviour.
Any ideas?
I can only say that I've had this happen to me several times. I'm lazy after I update my data model, and so far, every time I've gotten this error it was because I had changed my data model. Usually all I need to do is delete my app from the simulator and re-run it, and it has always turned out fine. Hope this helps someone out there.
I think I have found the answer. It looks like the automatic saving for UIManagedDocument kicks in only after a few seconds on the simulator.
So I minimized the app on the simulator, by pressing the home button, and then clicked on the icon to maximize it again. And then I terminated the app in simulator.
When I re-launched the app, the database was loaded. The error still shows up - it comes because the document is in the "closed" state (that's normal; that's why CS193P asked us to call openWithCompletionHandler) - but my data is preserved across launches. Unfortunately I have to do the minimize/maximize routine before terminating the app, or the changes are discarded at the next launch.
Can you verify that this is the behavior you are able to recreate? At least for testing purposes this should be a good enough trick to use.
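An alternative to the minimize/maximize trick, if you just need the changes flushed before terminating during testing, is to force a save explicitly instead of waiting for autosave. A hedged sketch, reusing the photoDatabase document from the question's code:
// Explicitly push pending changes to disk instead of waiting for autosave.
[self.photoDatabase saveToURL:self.photoDatabase.fileURL
             forSaveOperation:UIDocumentSaveForOverwriting
            completionHandler:^(BOOL success) {
    NSLog(@"Explicit save %@", success ? @"succeeded" : @"failed");
}];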
Try upgrading to the latest iOS 5.1. I don't think UIManagedDocument with iCloud works reliably in 5.0. This has been my experience.
I love the Stanford iTunes class. However, I think the sample code for using UIManagedDocument is wrong. In fact, he notes in the demo that he is only doing it that way because he wants to fetch the information right then. In the code comments, he says not to use the auto-save features because the data will not be saved if the app quits. However, UIManagedDocument will save anything that's necessary before quitting; it has all the pertinent handlers for quitting, multitasking, etc. to make sure the data is saved.
So, if you are using that code as your example, here's a version that should work and does not use saveToURL (I don't have a Flickr account, so I didn't actually run it, but this is how the class is designed to work). Please let me know if it does not work.
- (void)fetchFlickrDataIntoDocument:(UIManagedDocument *)document
{
    NSManagedObjectContext *ctx = [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
    ctx.parentContext = document.managedObjectContext;
    [ctx performBlock:^{
        NSArray *photos = [FlickrFetcher recentGeoreferencedPhotos];
        for (NSDictionary *flickrInfo in photos) {
            [Photo photoWithFlickrInfo:flickrInfo inManagedObjectContext:ctx];
            // Push changes to the document MOC
            [ctx save:NULL]; // propagates changes to the parent MOC
            // and tell the document it is dirty and needs to be saved.
            // It will be saved when the document decides it's time to save,
            // but it *will* be saved.
            [document updateChangeCount:UIDocumentChangeDone];
        }
    }];
}
I still had errors when the last path component of the document file URL was @"Database". Adding an extension, @"Database.db", seems to have fixed it; everything is running fine now. I have also upgraded to Lion, though.
NSURL *url = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] lastObject];
url = [url URLByAppendingPathComponent:@"Database.db"];
