I have a player controller view. In this view I play a selected poem from an array that holds my collection of poems.
Now, while any poem is playing, I want to be able to control play, pause, next, previous and volume from the Apple Watch.
Here is my code for the iPhone which plays a poem:
if (btnClick == 1) {
    self.lblTitle.text = [titles objectAtIndex:0];
    [self.imgpoem setImage:[poemImages objectAtIndex:0]];
    NSString *songurl = [poemcollection objectAtIndex:0];
    [self playselectdpoem:songurl];
}
How can I add this functionality to the WatchKit Extension to control all the things?
To control the iPhone app from the watch, you have to use the openParentApplication:reply: method. You can take a look HERE at how to implement it.
Sending a signal to the iPhone to do something (in your case to play, pause, or stop) works the same way as sending any other data.
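A minimal sketch of that flow, assuming the WatchKit 1.0 APIs (the @"command" dictionary keys and the notification name are placeholders, not a fixed API):

// WatchKit extension (e.g. InterfaceController.m): send a command to the iPhone app.
- (IBAction)playPressed {
    [WKInterfaceController openParentApplication:@{@"command": @"play"}
                                           reply:^(NSDictionary *replyInfo, NSError *error) {
        if (error) {
            NSLog(@"Could not reach the iPhone app: %@", error);
        }
    }];
}

// iPhone app (AppDelegate.m): receive the command and hand it to the player.
- (void)application:(UIApplication *)application
    handleWatchKitExtensionRequest:(NSDictionary *)userInfo
                             reply:(void (^)(NSDictionary *replyInfo))reply {
    // Forward play/pause/next/previous/volume to the player controller,
    // e.g. via a notification that the player view observes.
    [[NSNotificationCenter defaultCenter] postNotificationName:@"WatchPlayerCommand"
                                                        object:nil
                                                      userInfo:userInfo];
    reply(@{@"status": @"ok"});
}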
Related
I'm making an app that will play audiobooks synced with iTunes. Is there a way my player can remember the playback position? Or do I need to implement this myself with some sort of database?
I'm testing on iOS 8.4
The key is to set the current playback time to the bookmark time of the MPMediaItem before playing.
Here's an example:
[self.player setQueueWithItemCollection:self.collection];
self.player.currentPlaybackTime = [self.collection.items[0] bookmarkTime];
[self.player play];
An audiobook file automatically remembers its playback position and, when asked to play again later, resumes from that position; that is a built-in feature of the Music app (now iBooks) and therefore of MPMusicPlayerController.
However, you will notice that the Music app can lose track of the currently playing item (for instance, if the user restarts the device). And of course the user might change the currently playing item manually.
Thus, if you want your app to return to what it was playing previously, you will have to save the currently playing item information yourself. And you might as well save the current playback position too, making your app even more reliable than the Music app.
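A rough sketch of that bookkeeping, assuming self.player is the MPMusicPlayerController from the snippet above (the defaults keys and method names are made up):

// Call this when the app resigns active, to remember what was playing.
- (void)savePlaybackState {
    MPMediaItem *nowPlaying = self.player.nowPlayingItem;
    if (nowPlaying) {
        NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
        [defaults setObject:[nowPlaying valueForProperty:MPMediaItemPropertyPersistentID]
                     forKey:@"lastItemID"];
        [defaults setDouble:self.player.currentPlaybackTime forKey:@"lastPlaybackTime"];
    }
}

// Call this on the next launch to pick up where the user left off.
- (void)restorePlaybackState {
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    NSNumber *itemID = [defaults objectForKey:@"lastItemID"];
    if (!itemID) return;
    // Look the saved item up again by its persistent ID.
    MPMediaQuery *query = [MPMediaQuery audiobooksQuery];
    [query addFilterPredicate:[MPMediaPropertyPredicate
        predicateWithValue:itemID
               forProperty:MPMediaItemPropertyPersistentID]];
    if (query.items.count > 0) {
        [self.player setQueueWithQuery:query];
        self.player.currentPlaybackTime = [defaults doubleForKey:@"lastPlaybackTime"];
        [self.player play];
    }
}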
Recently I found out that MediaPlayer remote control event (MPRemoteCommand) handlers are not called in an iOS application with an embedded Unity player. I'm using Xcode 6.3.1 and Unity 5.0.1f1; however, it looks like it can be reproduced with any combination of the currently available versions.
This is the official Apple documentation for handling remote control events on iOS: https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/Remote-ControlEvents/Remote-ControlEvents.html
The issue is very easy to reproduce - it's just 24 lines of code added to an empty project.
Create a new Single View Application in Xcode (to make the working example without the Unity player), or create an empty project in Unity and immediately build it targeting the iOS platform (File -> Build Settings..., iOS, Build) with all settings left at their defaults to get the Xcode project generated by Unity (to make the broken example with the embedded Unity player).
Import the MediaPlayer framework header in AppDelegate.h (for the example without Unity) or UnityAppController.h (for the example with Unity):
#import <MediaPlayer/MediaPlayer.h>
Declare a property for the MPMoviePlayerController object in the class interface in AppDelegate.h/UnityAppController.h:
@property (strong, nonatomic) MPMoviePlayerController *player;
Insert the following code at the beginning of the application:didFinishLaunchingWithOptions: method in AppDelegate.m/UnityAppController.mm (you can also add it just before the return statement of that method; it doesn't matter). It creates a player with a URL (a live radio station; any other HTTP live audio source you like will do), registers handlers for the play, pause and togglePlayPause remote control events, and starts playing the audio, thus making the application the "Now Playing" app:
self.player = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:@"http://81.9.96.34/eldoradio-64k"]];
[[MPRemoteCommandCenter sharedCommandCenter].playCommand addTarget:self action:@selector(playCommand)];
[[MPRemoteCommandCenter sharedCommandCenter].pauseCommand addTarget:self action:@selector(pauseCommand)];
[[MPRemoteCommandCenter sharedCommandCenter].togglePlayPauseCommand addTarget:self action:@selector(togglePlayPauseCommand)];
[self.player play];
Add the handler implementations to AppDelegate.m/UnityAppController.mm:
- (void)playCommand {
    [self.player play];
}

- (void)pauseCommand {
    [self.player pause];
}

- (void)togglePlayPauseCommand {
    if (self.player.playbackState == MPMoviePlaybackStatePlaying) {
        [self.player pause];
    } else {
        [self.player play];
    }
}
Build and run the application. You should hear music playing once it is started.
Trigger some remote control events. The easiest way is to plug in standard Apple EarPods and press the play/pause button on them, which generates a togglePlayPause event; opening the iOS Control Center (the UI you get by swiping up from the bottom of the screen) and pressing the Play/Pause button there is also fine (those generate separate play and pause events respectively).
In the application without Unity, audio stops and starts playing according to the remote control events being sent.
In the application with Unity, the handlers are never called (you can set breakpoints in them and check under the debugger) and nothing happens at all. You can also notice that the iOS Control Center looks completely different: in the application with the embedded Unity player it shows a play button and two rewind buttons regardless of the audio state, while in the application without Unity it shows just a pause button when audio is playing, or just a play button when it is not.
You can also replace MPMoviePlayerController with AVPlayer from the AVFoundation framework; the result is exactly the same.
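For reference, the AVPlayer variant of that setup looks roughly like this (avPlayer is a hypothetical property name; the handlers stay the same, except that you test the player's rate instead of playbackState):

// Requires #import <AVFoundation/AVFoundation.h> and
// @property (strong, nonatomic) AVPlayer *avPlayer;
self.avPlayer = [AVPlayer playerWithURL:[NSURL URLWithString:@"http://81.9.96.34/eldoradio-64k"]];
[self.avPlayer play];
// In togglePlayPauseCommand: if (self.avPlayer.rate > 0) pause, else play.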
Does anybody know any reason for this behavior and/or any workaround for it? This affects our application very badly.
P.S.: I've already asked similar question before, but didn't get any answers. The example in this one is as simple as possible.
My app has pretty much the same setup as yours - and the same problem. As long as I used a mixable audio session
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: [.DuckOthers, .InterruptSpokenAudioAndMixWithOthers])
my app wouldn't trigger any of the MPRemoteCommand handlers.
When I turned my audio session into a non-mixable one (by removing the options)
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
my MPRemoteCommand handlers would get called whenever I pressed the play/pause button on my headset.
So, the solution is to use a non-mixable audio session, which is what PlayAndRecord is without the explicit mixing options, instead of a mixable one.
From what I understand it is currently not possible to play a sound from your WatchKit app on the watch. If this is the case, what is the best way to play a sound? Currently I am using openParentApplication:reply: to run code on the phone that plays the sound. Is this the best way?
Update: I also found that I can play audio directly from the watch extension using the same methods as in the parent app. No idea if this will actually play when not on the simulator.
There currently isn't any way to play audio on the Watch. Your assumptions are 100% correct. Apple doesn't want to have to transfer the audio clips from the Watch Extension to the Watch at runtime. Currently, the only supported transfer caching system is for images. For now, you will have to play the audio on the iOS device. You can use one of the following approaches to trigger playing the sound:
openParentApplication:reply: - opens the app in the background if it is not currently running
Darwin notifications through something like MMWormhole (see the sketch below)
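A rough sketch of the MMWormhole route (requires a shared App Group; the group identifier and message identifier below are placeholders):

// Both sides: #import "MMWormhole.h" and create a wormhole over the shared app group.
MMWormhole *wormhole = [[MMWormhole alloc] initWithApplicationGroupIdentifier:@"group.com.example.app"
                                                            optionalDirectory:@"wormhole"];

// Watch extension: ask the phone to play the sound.
[wormhole passMessageObject:@{@"sound": @"tap"} identifier:@"playSound"];

// iPhone app: listen for the request and play the audio there.
[wormhole listenForMessageWithIdentifier:@"playSound" listener:^(id messageObject) {
    // Start AVAudioPlayer (or similar) playback here.
}];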
Playing the sound from the Watch extension is certainly a gray area. My guess is that it will either play on the iOS device or not play at all when testing with devices. I would certainly advise against that approach, though. Your Watch extension will typically only be open for a VERY short period of time. It would be a much better idea to play the audio from the iOS app, since that is guaranteed to continue running.
As of watchOS 2, you can play audio files on the watch using WKAudioFilePlayer. It appears that mp3 files are not supported, but caf and wav are. Copy the sound file into the extension target.
Setup:
NSURL *assetURL = [[NSBundle mainBundle] URLForResource:@"file" withExtension:@"wav"];
WKAudioFileAsset *asset = [WKAudioFileAsset assetWithURL:assetURL];
WKAudioFilePlayerItem *playerItem = [WKAudioFilePlayerItem playerItemWithAsset:asset];
WKAudioFilePlayer *audioFilePlayer = [WKAudioFilePlayer playerWithPlayerItem:playerItem];
Play:
if (audioFilePlayer.status == WKAudioFilePlayerStatusReadyToPlay) {
    [audioFilePlayer play];
}
Alternatively, you can use the WKInterfaceMovie class:
[self presentMediaPlayerControllerWithURL:assetURL options:nil completion:nil];
Here is the case: I made an app using the Audio Streamer library that plays an audio file from a remote server, but I hit a problem only if I do the following:
Launch the app
Start a Podcast (audio stream)
Pause it
Put the app on the background (home button)
Lock the phone
Unlock it
Reactivate the app
And only then is my stream stopped; I try to get back to the paused state, but cannot.
This happens only if I put the app in the background: if I just lock/unlock the iPhone without backgrounding it, everything is fine again. Likewise, if I pause the stream, send the app to the background (Home button) and then bring it back without locking the phone, everything is OK.
So this problem only occurs when these two things both happen: the app is sent to the background and the iPhone is locked/unlocked.
Does it use AVAudioPlayer under the hood? If so, you need to implement the AVAudioPlayerDelegate protocol's
- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player
and
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player
methods. Essentially, use the first to store the fact that the AVAudioPlayer was stopped due to an interruption, and the second to start it off again. Fiddly, but unfortunately necessary.
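A sketch of what those two methods might look like (wasInterrupted is a hypothetical flag on whatever object owns the player):

- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player {
    // Playback was stopped by an interruption (phone call, alarm, ...).
    self.wasInterrupted = YES;
}

- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player {
    // Resume only if an interruption is what stopped us.
    if (self.wasInterrupted) {
        [player play];
        self.wasInterrupted = NO;
    }
}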
Here's a link https://developer.apple.com/documentation/avfoundation/avaudioplayerdelegate
In my iPhone app I have designed a custom video player. Currently it is very basic, with just play, pause and stop buttons,
but I would like the user to be able to scrub (I think that's the right word) through the video like you can with Apple's original media player.
So, for instance, I would like to take a UISlider and have it control the current position of the video's playback, if you get what I mean. And in case you're curious, the way I pause/play/stop the video is with this simple piece of code: [self.theMovie play]; [self.theMovie stop]; [self.theMovie pause]; The trouble is I don't know how to scrub the video.
Any help appreciated.
I've asked the same question before: customize quicktime iphone, and here: MPVideoPlayer add/remove buttons
It seems that you have two possibilities:
You can add your view over the main window. A sample can be found here: MoviePlayer Sample
You can iterate through the views, find the one you need, and add/remove views. I don't know yet how much Apple likes/dislikes this method.
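For the scrubbing itself, a minimal sketch, assuming self.theMovie is an MPMoviePlayerController and the UISlider's range is 0.0 to 1.0 (the action name is made up):

- (IBAction)scrubberMoved:(UISlider *)slider {
    // Map the slider position onto the movie's duration.
    self.theMovie.currentPlaybackTime = slider.value * self.theMovie.duration;
}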