I'm developing a WebRTC iOS application that receives video/audio streams from a webcam. The device only receives audio and video; it does not capture either, so it should not need microphone access.
How can I prevent the app from requesting microphone permission?
@kemmitorz: I deleted the following method, but that did not solve the problem.
- (RTCRtpSender *)createAudioSender
{
    RTCMediaConstraints *constraints = [self defaultMediaAudioConstraints];
    RTCAudioSource *source = [_factory audioSourceWithConstraints:constraints];
    RTCAudioTrack *track = [_factory audioTrackWithSource:source
                                                  trackId:kARDAudioTrackId];
    RTCRtpSender *sender = [_peerConnection
        senderWithKind:kRTCMediaStreamTrackKindAudio
              streamId:kARDMediaStreamId];
    sender.track = track;
    return sender;
}
If I set OfferToReceiveAudio to false, the device does not request microphone permission, but then the received video has no sound.
- (RTCMediaConstraints *)defaultOfferConstraints
{
    NSDictionary *mandatoryConstraints = @{
        @"OfferToReceiveAudio" : @"true",
        @"OfferToReceiveVideo" : @"true"
    };
    RTCMediaConstraints *constraints = [[RTCMediaConstraints alloc]
        initWithMandatoryConstraints:mandatoryConstraints
                 optionalConstraints:nil];
    return constraints;
}
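If your build of the WebRTC framework exposes transceivers (an assumption; this requires unified-plan SDP semantics and a reasonably recent GoogleWebRTC), a receive-only audio transceiver may receive remote audio without ever creating a local capture track, so no microphone prompt should be triggered. A minimal sketch:
// Sketch: receive-only audio via the transceiver API (assumes unified plan).
RTCRtpTransceiverInit *audioInit = [[RTCRtpTransceiverInit alloc] init];
audioInit.direction = RTCRtpTransceiverDirectionRecvOnly; // receive, never capture
[_peerConnection addTransceiverOfType:RTCRtpMediaTypeAudio init:audioInit];
// Create the offer as usual; the audio m-line is marked a=recvonly.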
Recording audio requires explicit permission from the user. The first time your app's audio session attempts to use an audio input route while using a category that enables recording (see "Audio Session Categories"), the system automatically prompts the user for permission; alternatively, you can call requestRecordPermission: to prompt the user at a time of your choosing.
switch ([[AVAudioSession sharedInstance] recordPermission]) {
    case AVAudioSessionRecordPermissionGranted:
        // Call your record method here; put this check in your view controller.
        break;
    case AVAudioSessionRecordPermissionDenied:
        break;
    case AVAudioSessionRecordPermissionUndetermined:
        // This is the initial state before the user has made any choice.
        // You can request permission here if you want.
        break;
    default:
        break;
}
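To prompt at a time of your choosing, as the quoted documentation mentions, you can call requestRecordPermission: yourself; a minimal sketch:
[[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
    // The callback may arrive on a background thread; hop to main before touching UI.
    dispatch_async(dispatch_get_main_queue(), ^{
        if (granted) {
            // Safe to start recording here.
        } else {
            // Degrade gracefully, e.g. playback-only.
        }
    });
}];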
Happy coding! If you have any questions, let me know.
I am working on iOS WebRTC and want to make an audio-only call with an audio-only SDP.
I am creating the offer as below, but the SDP still has both audio and video in it.
What is the correct way to create an audio-only offer?
NSDictionary *mandatoryConstraints = @{
    @"OfferToReceiveAudio" : @"true",
    @"OfferToReceiveVideo" : @"false"
};
RTCMediaConstraints *constraints =
    [[RTCMediaConstraints alloc] initWithMandatoryConstraints:mandatoryConstraints
                                          optionalConstraints:nil];
I was able to do this (a sketch of these steps follows the list):
1. Create your peerConnection with those constraints.
2. Create an AudioTrack.
3. Create a MediaStream and add the audio track to it.
4. Add the MediaStream to your peerConnection.
5. Create the offer and send it to the receiver.
6. On the receiving side, do the reverse.
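A minimal sketch of those steps, assuming the older stream-based ObjC API (_factory, _peerConnection, and the @"audio0"/@"stream0" IDs are placeholders):
// Sketch: audio-only offer using the stream-based API; IDs are placeholders.
RTCAudioSource *source = [_factory audioSourceWithConstraints:nil];
RTCAudioTrack *audioTrack = [_factory audioTrackWithSource:source trackId:@"audio0"];
RTCMediaStream *stream = [_factory mediaStreamWithStreamId:@"stream0"];
[stream addAudioTrack:audioTrack];
[_peerConnection addStream:stream];
[_peerConnection offerForConstraints:constraints
                   completionHandler:^(RTCSessionDescription *sdp, NSError *error) {
    // Set sdp as the local description, then send it to the receiver.
}];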
I was able to find a solution by creating the video sender only when video is enabled, as below.
if (_videoAllowed) {
    [self createVideoSender];
}
I have written code that captures the device's video input, and so far it is working fine. Here is what I have set up:
// add preview layer
_previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
_previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.videoView.layer addSublayer:_previewLayer];
// add movie output
_movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
[_session addOutput:_movieFileOutput];
AVCaptureConnection *movieFileOutputConnection = [_movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
movieFileOutputConnection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];
// start session
[_session startRunning];
where:
- (AVCaptureVideoOrientation)videoOrientationFromCurrentDeviceOrientation {
    switch ([[UIApplication sharedApplication] statusBarOrientation]) {
        case UIInterfaceOrientationPortrait:
            return AVCaptureVideoOrientationPortrait;
        case UIInterfaceOrientationLandscapeLeft:
            return AVCaptureVideoOrientationLandscapeLeft;
        case UIInterfaceOrientationLandscapeRight:
            return AVCaptureVideoOrientationLandscapeRight;
        case UIInterfaceOrientationPortraitUpsideDown:
            return AVCaptureVideoOrientationPortraitUpsideDown;
        case UIInterfaceOrientationUnknown:
            // 0 is not a valid AVCaptureVideoOrientation; fall back to portrait.
            return AVCaptureVideoOrientationPortrait;
    }
}
Now, when the interface orientation changes, I want my output to change as well, so I have this:
- (void)updatePreviewLayer {
    _previewLayer.frame = CGRectMake(0, 0, self.videoView.frame.size.width, self.videoView.frame.size.height);
    _previewLayer.connection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];
    [_session beginConfiguration];
    AVCaptureConnection *movieFileOutputConnection = [_movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    movieFileOutputConnection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];
    [_session commitConfiguration];
}
But alas, it is not working. It seems that once I first set the video orientation on the movie output, it stays like that and cannot be changed later. So if I start filming in landscape mode and then change to portrait, the video will be correct for the landscape portion, but the portrait portion will be rotated. The same happens if I start in portrait mode: the landscape portion will be rotated.
Is there any way to do this right?
Try adding this before you start your session:
[_movieFileOutput setRecordsVideoOrientationAndMirroringChanges:YES asMetadataTrackForConnection:movieFileOutputConnection];
The header file documentation for this method makes it sound very much like what you're looking for:
Controls whether or not the movie file output will create a timed metadata track that records samples which
reflect changes made to the given connection's videoOrientation and videoMirrored properties during
recording.
There's more interesting information there; I'd read it all.
However, this method doesn't actually rotate your frames; it uses timed metadata to instruct players to rotate them at playback time, so it's possible that not all players will support the feature. If that's a deal breaker, you can abandon AVCaptureMovieFileOutput in favour of the lower-level AVCaptureVideoDataOutput + AVAssetWriter combination, where your videoOrientation changes actually rotate the frames, resulting in files that play back correctly in any player:
If an AVCaptureVideoDataOutput instance's connection's videoOrientation or videoMirrored properties are set to
non-default values, the output applies the desired mirroring and orientation by physically rotating and or flipping
sample buffers as they pass through it.
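A minimal sketch of that lower-level route (the AVAssetWriter plumbing is omitted; the queue name is arbitrary):
// Sketch: AVCaptureVideoDataOutput rotates the buffers themselves when the
// connection's videoOrientation is set to a non-default value.
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoDataOutput setSampleBufferDelegate:self
                                   queue:dispatch_queue_create("camera.frames", DISPATCH_QUEUE_SERIAL)];
[_session addOutput:videoDataOutput];
AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
connection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];
// In captureOutput:didOutputSampleBuffer:fromConnection:, append each buffer
// to your AVAssetWriterInput; the frames arrive already rotated.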
p.s. I don't think you need the beginConfiguration/commitConfiguration pair if you're only changing one property, as that's for batching multiple modifications into one atomic update.
Have you tried pausing the session before changing configuration?
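If you want to try that, a rough sketch (just an assumption that a restart helps; stopping the session briefly blanks the preview and would interrupt an in-progress recording):
// Sketch: restart the session around the orientation change (untested assumption).
[_session stopRunning];
AVCaptureConnection *movieConnection = [_movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
movieConnection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];
[_session startRunning];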
I have a camera view using AVFoundation, and if a phone call or Skype call is active, the camera can't be used.
How can I check whether AVFoundation will fail to open, so that I can show another view that doesn't use the camera?
If I check this:
BOOL isPlayingWithOthers = [[AVAudioSession sharedInstance] isOtherAudioPlaying];
then the camera view will not open whenever any other app is playing audio.
Any suggestions?
The CTCallCenter object has a currentCalls property, which is an NSSet of the current calls. If there is a call, currentCalls will be non-nil.
If you want to know whether any of the calls is actually connected, you'll have to iterate through the current calls and check each callState for CTCallStateConnected.
#import <CoreTelephony/CTCallCenter.h>
#import <CoreTelephony/CTCall.h>
- (BOOL)isOnPhoneCall {
    // Returns YES if the user is currently on a phone call.
    CTCallCenter *callCenter = [[CTCallCenter alloc] init]; // no autorelease under ARC
    for (CTCall *call in callCenter.currentCalls) {
        if (call.callState == CTCallStateConnected) {
            return YES;
        }
    }
    return NO;
}
I am using this method in Swift; Tarun's answer helped me.
import CallKit
func isOnPhoneCall() -> Bool {
    // Returns true if the user is currently on a phone call.
    for call in CXCallObserver().calls {
        if call.hasEnded == false {
            return true
        }
    }
    return false
}
Your app delegate will receive the -applicationWillResignActive: message, and your app can listen for UIApplicationWillResignActiveNotification. These are delivered when your app is interrupted by a call, as well as in other cases where the app is interrupted, such as when the screen locks or the user presses the lock button.
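A minimal sketch of observing that notification (handleResignActive: is a placeholder selector):
// e.g. in viewDidLoad:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleResignActive:)
                                             name:UIApplicationWillResignActiveNotification
                                           object:nil];

- (void)handleResignActive:(NSNotification *)note {
    // Pause the camera session or switch to the no-camera view here.
}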
For more details on how to handle interruptions, see Responding to Interruptions.
Also see the Stack Overflow post "How can we detect call interruption in our iPhone application?"
I have an application which requires the microphone for recording the user's voice; I'm trying to do speech-to-text.
I'm working with SpeechKit.framework, and below is the code I use:
- (void)startRecording {
    self.voiceSearch = [[SKRecognizer alloc] initWithType:SKSearchRecognizerType
                                                detection:SKShortEndOfSpeechDetection
                                                 language:[[USER_DEFAULT valueForKey:LANGUAGE_SPEECH_DIC] valueForKey:@"record"]
                                                 delegate:self];
}

- (void)recognizer:(SKRecognizer *)recognizer didFinishWithResults:(SKRecognition *)results {
    long numOfResults = [results.results count];
    if (numOfResults > 0) {
        // Update the text field with the best result from SpeechKit.
        self.recordString = [results firstResult];
        [self sendChatWithMediaType:@"messageCall" MediaUrl:@"" ContactDetail:@"{}" LocationDetail:@"{}"];
        [self.voiceSearch stopRecording];
    }
    if (self.voiceSearch) {
        [self.voiceSearch cancel];
    }
    [self startRecording];
}
That keeps the SKRecognizer always open, which reduces the application's performance.
I want to start the SKRecognizer only when the microphone detects input audio.
Is there a method for that? Either one that is called when the microphone picks up sound, or one that continuously reports the detected audio level?
Thank you!
You need to use the SpeechKit class to set up the audio.
Look here for details:
http://www.raywenderlich.com/60870/building-ios-app-like-siri
This project shows how to detect an audio threshold:
github.com/picciano/iOS-Audio-Recoginzer
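If what you need is simply the input level, one common approach outside SpeechKit is AVAudioRecorder metering; a minimal sketch (the -30 dB threshold is an arbitrary placeholder, and it assumes the AVAudioSession category already permits recording):
// Sketch: meter the mic and start recognition once the level crosses a threshold.
NSURL *url = [NSURL fileURLWithPath:@"/dev/null"]; // we only meter; audio is discarded
NSDictionary *settings = @{ AVSampleRateKey : @44100.0,
                            AVNumberOfChannelsKey : @1,
                            AVFormatIDKey : @(kAudioFormatAppleLossless) };
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:nil];
recorder.meteringEnabled = YES;
[recorder record];
// Poll from a timer:
[recorder updateMeters];
float levelDb = [recorder averagePowerForChannel:0]; // 0 dB is max; silence is far negative
if (levelDb > -30.0f) {
    // The mic is picking up sound: start the SKRecognizer here.
}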
I'm trying to create an app that transfers data between two or more phones using GKSession. The thing is, there are two options.
First: using the GKPeerPicker. However, here I get stuck at the point where I have to implement my own Wi-Fi interface; Apple provides no instructions on how to do that:
- (void)peerPickerController:(GKPeerPickerController *)picker didSelectConnectionType:(GKPeerPickerConnectionType)type {
    if (type == GKPeerPickerConnectionTypeOnline) {
        picker.delegate = nil;
        [picker dismiss];
        [picker autorelease];
        // Implement your own internet user interface here.
    }
}
Second: skipping GKPeerPicker and doing the whole thing myself, like in this example. However, the documentation doesn't provide any instructions on how to send/receive data without using GKPeerPicker (nor could I find any example of that on the web).
I just figured out how to connect devices without the peer picker. It was a bit of a guessing game, because the documentation is pretty unclear and I looked for a long time on the internet for any info about this. I'll try to explain everything here to clear up any questions anyone might have in the future.
From the documentation:
A GKSession object provides the ability to discover and connect to
nearby iOS devices using Bluetooth or Wi-fi.
This was the first step in understanding it for me: I thought the GKPeerPickerController was responsible for the advertising and connecting, but GKSession actually does all that.
The second thing to understand is that what are referred to as peers are not necessarily connected to you; they can just be nearby, waiting to be discovered and connected to. All peers have a state:
GKPeerStateAvailable (this is what's useful!)
GKPeerStateUnavailable
GKPeerStateConnected
GKPeerStateDisconnected
GKPeerStateConnecting
So how do we actually connect? Well first we have to create a GKSession object to be able to find peers around us and see when they become available:
// nil will become the device name
GKSession *gkSession = [[GKSession alloc] initWithSessionID:@"something.unique.i.use.my.bundle.name" displayName:nil sessionMode:GKSessionModePeer];
[gkSession setDataReceiveHandler:self withContext:nil];
gkSession.delegate = self;
gkSession.available = YES; // I'm not sure whether this is the default value; it might not be needed
Now we have some delegate calls to respond to: session:didReceiveConnectionRequestFromPeer: and session:peer:didChangeState:. (You should also handle the other GKSessionDelegate calls for disconnection and failure appropriately.)
- (void)session:(GKSession *)session peer:(NSString *)peerID didChangeState:(GKPeerConnectionState)state
{
    if (state == GKPeerStateDisconnected)
    {
        // A peer disconnected
    }
    else if (state == GKPeerStateConnected)
    {
        // You can now send messages to the connected peer(s)
        int number = 1337;
        [session sendDataToAllPeers:[NSData dataWithBytes:&number length:sizeof(number)] withDataMode:GKSendDataReliable error:nil];
    }
    else if (state == GKPeerStateAvailable)
    {
        // A device became available, meaning we can connect to it. Let's do it!
        // (Or at least try and make a request.)
        /*
         Notice: this will connect directly to every iPhone that's nearby.
         You might instead want to build an interface similar to the peer picker.
         In that case, save this peer in an availablePeers array and call this
         method later on. For your UI, the name of the peer can be retrieved
         with [session displayNameForPeer:peerID].
         */
        [session connectToPeer:peerID withTimeout:10];
    }
}
The other peer now receives a request that it should respond to.
- (void)session:(GKSession *)session didReceiveConnectionRequestFromPeer:(NSString *)peerID
{
    // We can now decide to accept or deny
    BOOL shouldAccept = YES;
    if (shouldAccept)
    {
        [session acceptConnectionFromPeer:peerID error:nil];
    }
    else
    {
        [session denyConnectionFromPeer:peerID];
    }
}
Finally, to receive our little 1337 message:
- (void)receiveData:(NSData *)data fromPeer:(NSString *)peer inSession:(GKSession *)session context:(void *)context
{
    int number = 1337;
    if ([data isEqualToData:[NSData dataWithBytes:&number length:sizeof(number)]])
    {
        NSLog(@"Yey!");
    }
}