I would like to implement video recording in Android TV, or at least a button that fires the recording event. I added the Gradle dependency, but I could not find any RecordAction or RecordButton, only MultiAction, FastForwardAction, etc.
What I need is a button the user can click/press so that the current time of the currently streamed media is saved.
You may follow this documentation. To tell the system that your TV input service supports recording, set the android:canRecord attribute in your service metadata XML file to true:
<tv-input xmlns:android="http://schemas.android.com/apk/res/android"
android:canRecord="true"
android:setupActivity="com.example.sampletvinput.SampleTvInputSetupActivity" />
Alternatively, you can indicate recording support in your code using these steps:
In your TV input service's onCreate() method, create a new TvInputInfo object using the TvInputInfo.Builder class.
When creating the new TvInputInfo object, call setCanRecord(true) before calling build() to indicate your service supports recording.
Register your TvInputInfo object with the system by calling TvInputManager.updateTvInputInfo().
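For illustration, a rough sketch of those steps (assuming a service class named SampleTvInputService; TvInputInfo.Builder.setCanRecord() and TvInputManager.updateTvInputInfo() are part of the TV Input Framework on API 24+):
@Override
public void onCreate() {
    super.onCreate();
    // Build a TvInputInfo that advertises recording support.
    TvInputInfo info = new TvInputInfo.Builder(this,
            new ComponentName(this, SampleTvInputService.class))
            .setCanRecord(true)
            .build();
    // Register the updated info with the system.
    TvInputManager manager = (TvInputManager) getSystemService(Context.TV_INPUT_SERVICE);
    manager.updateTvInputInfo(info);
}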
Regarding a button where the user clicks/presses, unfortunately I can't find any samples for it. According to the same reference above, after the system tunes the recording session via RecordingSession.onTune(), it invokes the RecordingSession.onStartRecording() callback, and your app must start recording immediately.
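Purely as an untested illustration, a recording session skeleton might look like the sketch below; the callbacks come from TvInputService.RecordingSession, while SampleRecordingSession and savePlaybackPosition() are made-up placeholders for your own logic (e.g. persisting the current stream time when recording starts):
private class SampleRecordingSession extends TvInputService.RecordingSession {
    private Uri recordedProgramUri;   // set by your own recording pipeline

    SampleRecordingSession(Context context) {
        super(context);
    }

    @Override
    public void onTune(Uri channelUri) {
        notifyTuned(channelUri);        // tell the system the channel is ready
    }

    @Override
    public void onStartRecording(Uri programUri) {
        savePlaybackPosition();         // hypothetical: remember the current stream time
    }

    @Override
    public void onStopRecording() {
        notifyRecordingStopped(recordedProgramUri);   // hand the recorded program back
    }

    @Override
    public void onRelease() {
        // release any recording resources here
    }
}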
Issue: When the iOS device is on silent, the sound being played from the app is muted.
Upon some digging into the iOS code, I found that out of the 7 "audio session categories" the right one to use for a music app is playback.
Question: How do I set the category in the audio_service package?
Package version: 0.18.0-beta.1
audio_service only manages remote control of your app via notifications, lock screens, etc. The audio session is typically managed by the audio player plugin that you use.
If you use just_audio, it will by default set the required category, but if not, you can manually override the category via the audio_session package. e.g. The code below will configure reasonable defaults for a podcast app, including setting the category to playback:
(await AudioSession.instance).configure(const AudioSessionConfiguration.speech());
This library helped me to solve this problem: https://pub.dev/packages/audio_session
import 'package:audio_session/audio_session.dart';
...
final session = await AudioSession.instance;
await session.configure(AudioSessionConfiguration.music());
...
For my app, one of the important functions is to mute the iPhone, but I can't find any available iOS API that I can use to mute the phone (or change the ringer volume to the minimum). Is there a specific API for developers to mute (or change the ringer volume of) the phone, and if there is not, is there an indirect way to do this?
I believe you can only mute other applications' sounds. You need to configure the AVAudioSession category:
AVAudioSession Class Reference: http://goo.gl/rh7CX7
Look for the one that fits your application best. You only need to set it once in your code (make sure it actually gets called).
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategorySoloAmbient, error: nil) // AVAudioSessionCategorySoloAmbient is the default
AVAudioSession.sharedInstance().setActive(true, error: nil)
No.
Applications developed using the official SDK cannot change (and in most cases cannot even access) system-wide settings.
It is possible, but only using private APIs. I only went as far as muting the ringer, but you should be able to control the master level as well.
See How to disable iOS System Sounds
I have two questions regarding Remote Control Events on iOS:
I know that music applications are registered to remote control events and then can receive such events from the iPhone's player widget.
Let's say I want my app to fire such events; is that possible?
How do headphones, for example, generate those events?
Without private APIs, you cannot send remote-control events to your application.
The reason is that we cannot create such an event (UIEvent) to send out using:
[[UIApplication sharedApplication] sendEvent:anEvent];
You can, however, save an event that has already been delivered and replay it later by calling the function above.
I don't know if it is possible for headphone events, but with private APIs you can send some events such as home button presses, power button presses, or mouse events (not tested).
You should read this book:
http://www.amazon.com/gp/product/1118057651/ref=pd_lpo_sbs_dp_ss_1?pf_rd_p=1535523702&pf_rd_s=lpo-top-stripe-1&pf_rd_t=201&pf_rd_i=0321278542&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=0T2AMHJCEEKJN41YJHD5
It will be hard work to make this work.
Take a look at GSEvent to learn how to send an event to the OS (iOS).
Edit: I've found 2 event types in GSEvent.GSEventType:
kGSEventHeadsetButtonDown = 1018,
kGSEventHeadsetButtonUp = 1019,
PS: this uses private APIs, so the app will be rejected if you submit it to the App Store.
I have an app that speaks to the user and listens to the user's speech response. I've noticed that when I plug my phone into my car audio system and use the app, when my app is done speaking, it receives an interruption notification and the Music app starts playing music instead of allowing my app to continue.
This doesn't happen if the phone is not attached to an external device, and this doesn't happen the moment I plug the phone in, only when the speech stops and the phone is playing through the car. I have done some testing and determined that this behavior appears when I call the beginReceivingRemoteControlEvents method on my application. If I don't sign up for remote control events when my application loads, the problem does not occur, but I cannot display 'now playing' information for my audio or use the car's controls for controlling playback.
Has anyone found a way to listen for remote control events without forfeiting control of the device's audio playback?
This is often caused by the car stereo rather than your iOS device. Check the stereo's manual and switch it from Audio Mode to iPod Mode (or whatever your manual names these options). Basically, your car stereo is listening for the track ended notification and using that to trigger a 'play next track' notification. This calls the MPMusicPlayer which usually picks the first track alphabetically in your device's library. It could be that there is a workaround in software but I've found that the easiest thing is to change the setting on the car stereo.
Use the following to handle the remote control events yourself, which effectively disables the default behavior (you may have to replace togglePlayPauseCommand with playCommand, or do both):
MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
[commandCenter.togglePlayPauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent * _Nonnull event) {
NSLog(@"toggle button pressed");
return MPRemoteCommandHandlerStatusSuccess;
}];
or, if you prefer to use a method instead of a block:
[commandCenter.togglePlayPauseCommand addTarget:self action:@selector(toggleButtonAction)];
To stop:
[commandCenter.togglePlayPauseCommand removeTarget:self];
or:
[commandCenter.togglePlayPauseCommand removeTarget:self action:@selector(toggleButtonAction)];
You'll need to add this to the includes area of your file:
@import MediaPlayer;
I'm setting up an AVAudioSession when the app launches and setting the delegate to the appDelegate. Everything seems to be working (playback, etc) except that beginInterruption on the delegate is not being called when the phone receives a call. When the call ends endInterruption is being called though.
The only thought I have is that the audio player code I'm using used to be based on AVAudioPlayer, but is now using AVPlayer. The callbacks for the AVAudioPlayer delegate for handling interrupts are still in there, but it seems odd that they would conflict in any way.
Looking at the header, it looks like AVAudioSessionDelegate is now deprecated in iOS 6.
Use AVAudioSessionInterruptionNotification instead in iOS 6.
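For example, a minimal sketch of the notification-based approach (audioSessionInterrupted: is an arbitrary selector name; register the observer once, e.g. at app launch):
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(audioSessionInterrupted:)
                                             name:AVAudioSessionInterruptionNotification
                                           object:[AVAudioSession sharedInstance]];

- (void)audioSessionInterrupted:(NSNotification *)notification {
    NSUInteger type =
        [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // pause playback here
    } else if (type == AVAudioSessionInterruptionTypeEnded) {
        // check AVAudioSessionInterruptionOptionKey for
        // AVAudioSessionInterruptionOptionShouldResume, then resume playback
    }
}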
Update: That didn't work. I think there's a bug in the framework.
Yes, in my experience neither beginInterruption nor the newly documented AVAudioSessionInterruptionNotification works properly. What I had to do was track the status of the player with a local flag, then handle the endInterruption:withFlags: method in order to recover from interruptions.
With iOS 6, resuming from an interruption at least keeps your audio player in the right place, so there was no need for me to store the last known play time of my AVAudioPlayer; I simply had to hit play.
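As a rough sketch of that approach (using the deprecated AVAudioSessionDelegate API; wasPlayingBeforeInterruption is an assumed flag you maintain yourself):
- (void)endInterruptionWithFlags:(NSUInteger)flags {
    // wasPlayingBeforeInterruption is updated wherever the app starts/stops playback.
    if ((flags & AVAudioSessionInterruptionFlags_ShouldResume) &&
        self.wasPlayingBeforeInterruption) {
        [self.player play];   // iOS 6 keeps the playback position, so just hit play
    }
}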
Here's the solution that I came up with. It seems that iOS 6 kills your audio with a Media Reset if an AVPlayer stays resident too long. What ends up happening is that the AVPlayer plays, but no sound comes out. The rate on the AVPlayer is 1, but there's absolutely no sound. To add pain to the situation, there's no error from either AVAudioSession setActive: or the AVPlayer itself to indicate that there's a problem.
Add to that the fact that you can't depend on applicationWillResignActive:, because your app may already be in the background if you're depending on remote control gestures at all.
The final solution I implemented was to add a periodic observer on the AVPlayer, and record the last known time. When I receive the event that I've been given back control, I create a new AVPlayer, load it with the AVPlayerItem, and seekToTime to the proper time.
It's quite an annoying workaround, but at least it works, and avoids the periodic crashes that were happening.
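For reference, a rough sketch of that workaround (self.player, self.streamURL, self.lastKnownTime, and self.timeObserver are assumed properties on the playback controller):
// Record the last known position roughly once a second.
self.timeObserver = [self.player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1)
                                                               queue:dispatch_get_main_queue()
                                                          usingBlock:^(CMTime time) {
    self.lastKnownTime = time;
}];

// When control is handed back after the media reset, rebuild the player and seek.
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:self.streamURL];
self.player = [AVPlayer playerWithPlayerItem:item];
[self.player seekToTime:self.lastKnownTime];
[self.player play];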
I can confirm that, using the C API, the interruption callback is also not called when the interruption begins, only when it ends:
AudioSessionInitialize(nil, nil, interruptionListenerCallback, (__bridge void *)(self));
I've also filed a bug report with apple for the issue.
Edit: This is fixed in iOS 6.1 (but not iOS 6.0.1)
Just call:
[[AVAudioSession sharedInstance] setDelegate: self];
I just checked on my iPhone 5 (running iOS 6.0) by setting a breakpoint in the AudioSessionInterruptionListener callback function that was declared in AudioSessionInitialize(), and this interrupt callback does, in fact, get called when the app has an active audio session and audio unit and is interrupted with an incoming phone call (Xcode shows the app stopped at the breakpoint at the beginning of the interruption, which I then continue from).
I have the app then stop its audio unit and de-activate its audio session. Then, on the end interruption callback, the app re-activates the audio session and restarts the audio unit without problems (the app is recording audio properly afterwards).
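A minimal sketch of that flow with the (now-deprecated) C audio session API; stopAudioUnit() and startAudioUnit() are placeholders for the app's own code:
static void interruptionListenerCallback(void *inClientData, UInt32 inInterruptionState) {
    if (inInterruptionState == kAudioSessionBeginInterruption) {
        stopAudioUnit();                 // stop the audio unit
        AudioSessionSetActive(false);    // give up the audio session
    } else if (inInterruptionState == kAudioSessionEndInterruption) {
        AudioSessionSetActive(true);     // take the session back
        startAudioUnit();                // resume recording/playback
    }
}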
I built a brand new audio streaming (AVPlayer) application atop iOS 6.0.x and found the same problem.
Delegates are now deprecated and we have to use notifications; that's great, but here are my findings:
During an incoming phone call I get only AVAudioSessionInterruptionTypeEnded in my handler, along with AVAudioSessionInterruptionOptionShouldResume. Audio session gets suspended automatically (audio fades) and I just need to resume playback of AVPlayer.
However, when launching a game such as CSR Racing, I oddly get the dreaded AVAudioSessionInterruptionTypeBegan but no indication of when my application can resume playback, not even after killing the game.
Now, this may depend on other factors, such as my audio category (in my case AVAudioSessionCategoryPlayback) and the mixing settings of both applications (kAudioSessionProperty_OverrideCategoryMixWithOthers); I'm not sure, but something definitely seems out of place.
Hopefully this is fixed on the 6.1 beta, as others have reported; I have yet to upgrade, so we'll see.