Is it possible to route the audio being played by an AVAudioPlayer (for example) to the microphone as input?
The aim is to be able to play a sound, while on the phone to someone, that the other person can hear.
Thanks
Chris
Short answer: no.
Long answer:
There are lots of ways to route audio through the system, and several methods of accomplishing what you are describing. But Apple restricts access to the audio system while a phone call is active; if you attempt to circumvent this, your app will be removed from the App Store.
You ask for the ability to use the microphone as input. A microphone is a sensor that detects movement in the air, so the only way to use it as input is to make vibrations in the air: one could play music through the phone's speaker, have the microphone pick it up, and let that audio be shipped to the receiver via its normal audio route.
A better way would likely be to mix your audio into the audio recorded from the microphone and send that to the receiver, then play the unmixed audio via the earpiece so the sender hears it cleanly and the receiver hears it mixed with the voice. If you are developing a VOIP app, you could make this happen.
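As a minimal sketch of just the mixing step, here is what additive mixing of two sample streams looks like in plain Swift. In a real VOIP app these arrays would be PCM buffers delivered by an Audio Unit render callback; plain Float arrays stand in for them here, and the function name is illustrative:

```swift
// Additively mix app audio ("music") into the microphone signal ("voice").
// Samples are assumed to be Float PCM in the range [-1, 1].
func mix(_ voice: [Float], _ music: [Float], musicGain: Float = 0.5) -> [Float] {
    let n = min(voice.count, music.count)
    return (0..<n).map { i in
        // Sum the two samples, attenuating the music, and clip the
        // result back into the valid [-1, 1] range to avoid distortion.
        max(-1.0, min(1.0, voice[i] + music[i] * musicGain))
    }
}
```

The receiver would get the mixed stream, while the unmixed voice signal (plus the original app audio) is what you would route to the sender's earpiece.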
RemoteIO and Audio Units are likely the best place to start for building a low-latency audio routing system that lets you mix microphone input with an existing audio stream. You can read about them here:
http://atastypixel.com/blog/using-remoteio-audio-unit/
I want to play .ogg/.oga audio files from a remote URL in my iOS app. I also want the audio to keep playing while the app is in the background.
I have tried this player, https://github.com/iosdevzone/IDZAQAudioPlayer, but it cannot play from remote URLs and only plays local audio in the foreground.
Can anyone help me out?
I don't know of any player that does that directly, but there are a couple of OGG decoding libraries that you can use in iOS:
The Xiph libraries which you can find precompiled for iOS.
A public domain OGG Vorbis decoder library which is contained in a standalone C file and therefore is easy to integrate into any project.
You would have to pass the streaming data to the decoder and play the decoded samples. For this you could save the decoded samples into a buffer and play them with an AVAudioPlayerNode.
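The buffering part of that can be sketched in plain Swift. The type name `PCMChunkBuffer` and the chunk-based design here are illustrative assumptions, not part of AVFoundation; each chunk popped from it would then be copied into an `AVAudioPCMBuffer` and scheduled on the `AVAudioPlayerNode`:

```swift
// Minimal FIFO that accumulates decoded PCM samples (e.g. Float frames
// from an Ogg Vorbis decoder) and hands out fixed-size chunks suitable
// for scheduling on a player node.
struct PCMChunkBuffer {
    private var samples: [Float] = []
    let chunkSize: Int   // frames per buffer scheduled on the player node

    init(chunkSize: Int) { self.chunkSize = chunkSize }

    // Append freshly decoded samples as they arrive from the decoder.
    mutating func append(_ decoded: [Float]) {
        samples.append(contentsOf: decoded)
    }

    // Pop one chunk if enough samples have accumulated; otherwise nil,
    // meaning playback should wait for more streamed data.
    mutating func nextChunk() -> [Float]? {
        guard samples.count >= chunkSize else { return nil }
        let chunk = Array(samples.prefix(chunkSize))
        samples.removeFirst(chunkSize)
        return chunk
    }
}
```

The player keeps scheduling chunks as long as `nextChunk()` returns one, and waits for more decoded data when it returns nil.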
I also want to play audio even app is in background state.
There is an app capability in iOS that enables you to play background audio.
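Concretely, that is the "audio" entry of the `UIBackgroundModes` key, enabled via Xcode's Capabilities pane (Background Modes > Audio) or set directly in Info.plist; note that at runtime your `AVAudioSession` category typically also needs to be `.playback` for audio to continue in the background:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```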
This question already has an answer here: Audio Information of Current Track iOS Swift (1 answer)
How can I access another app's currently playing audio (the actual audio item, though metadata is welcome too)? I can see that this question has been asked a lot, with few solutions offered over the years. I understand Apple's philosophy for probably not wanting an app to be able to do this, and I also understand that such a request is probably outside the iOS API. That said, I would really like some kind of solution.
Logically, I feel that
MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo
should return the info for whatever is currently playing; however, as others have mentioned, this value is always nil for audio being played outside of your app. Notably, popular audio apps often fail to use the MPNowPlayingInfoCenter class at all, so the audio they play never appears there.
If using the default music app, one can use
MPMusicPlayerController.systemMusicPlayer().nowPlayingItem
However, what is a more consistent way to access audio playing through the Podcasts app, Spotify, Pandora, Safari, etc.?
Has anyone found a solution to this? Are there any old Objective-C frameworks that support this functionality?
One approach might be viable if there is some way to access the audio path of the item currently being played. For example, if I could get the path of the currently playing item, I could create an AV object from it:
AVAudioPlayer(contentsOfURL: audioUrl)
So is there a way I can get the audio url of the currently playing item and use it that way?
Is another approach better?
If a native solution does not exist, is it possible to bodge something together for it to work? Any advice or ideas welcome.
Edit: I don't believe anybody has been able to achieve this; however, I think a lot of people would like to. If this has been addressed, please link it! :)
This isn't currently possible in iOS. Even changing your AVAudioSession category options to .mixWithOthers, which would be a precondition for getting song info from other apps, causes your own nowPlayingInfo to be ignored.
iOS only considers non-mixing apps for inclusion in MPNowPlayingInfoCenter, because it would otherwise be ambiguous which app should show up in (e.g.) Control Center when multiple mixing apps are playing at the same time.
Proposal: one option would be to use an audio fingerprinting algorithm to recognize what is being played by recording it from your app.
Some interesting projects in this direction:
Gracenote (https://developer.gracenote.com/): owned by Sony, it has opened up its SDKs and APIs and has a proper dev portal.
EchoNest & Spotify API (http://developer.echonest.com/): merged with Spotify since March 2016.
ACRCloud (https://www.acrcloud.com/): offers ACR solutions for custom files such as TV commercials, music, etc.
I have to build a two-way walkie-talkie iOS app. I searched a lot and found information about 'push to talk' services, but I can't get a clear idea of how it works on an iPhone for a particular channel. I found an iTunes link:
https://itunes.apple.com/us/app/two-way-walkie-talkie/id595560554?mt=8
Can anyone tell me how this app works? Do they record voice clips and send them to the other user, or are they sending live audio? They also don't seem to ask the user for any information.
Most walkie-talkie apps are built by sending recorded audio files instead of a live audio stream.
If you want to implement a record-and-forward walkie-talkie, you need a backend file server that stores temporary audio files for the receiver side to download.
If you choose the hard way, i.e. sending a live audio stream, that's another level of complexity: you're looking at implementing what is literally a VOIP app. You could use PJSIP for the core VOIP functionality, but you might end up spending months on the project.
Personally, I strongly recommend going with the first approach.
I'm just beginning my research into this question, and simply because of the need for a fast turnaround I thought I would post it here while I continue researching on my own.
The first question I have is: is it possible to detect the device's outgoing audio signal(s), from any source, using Core Audio or any of the audio frameworks (or a combination of them) listed in the dev library?
My instinct tells me this is locked down.
Has anyone any experience with this?
I guess you can only detect whether audio is playing or not, using
[[AVAudioSession sharedInstance] secondaryAudioShouldBeSilencedHint]
which, per the documentation, will be true when another application with a non-mixable audio session is playing audio.
EDIT: I have tried using several different audio codecs via HandBrake, each produces the same result.
EDIT: We've experimented with the audio and discovered that wearing earbuds or headphones on the mobile device seems to make the problem disappear. Evidently something is wrong with the recorded audio (stereo) where the phones, via their speakers, expect mono. Is forcing stereo to mono during the conversion process causing the problem? More research ensues.
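For what it's worth, the stereo-to-mono suspicion is plausible: a naive downmix averages the left and right channels, so stereo material with out-of-phase content can partially or fully cancel when a phone speaker plays it as mono. A minimal illustration in Swift (plain arrays standing in for interleaved PCM; not tied to any particular tool's actual downmix):

```swift
// Downmix interleaved stereo PCM (L R L R ...) to mono by averaging
// each left/right pair. An out-of-phase pair cancels to silence,
// which is one way a stereo-to-mono conversion can mangle audio.
func downmixToMono(_ interleaved: [Float]) -> [Float] {
    return stride(from: 0, to: interleaved.count - 1, by: 2).map { i in
        (interleaved[i] + interleaved[i + 1]) / 2
    }
}
```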
I am stumped and have banged my head against the wall for two days trying to figure this out. We posted a video on our company's website that works well on all devices except mobile phones, including iPhone and Samsung. On those devices the video plays, but the audio is garbled beyond use. Here is the link to the video -
http://www.youtube.com/watch?v=6IVHv6nG20M&list=UUKTqrRWGA_rcxOYVVocB9kg
I have also used the .mp4, .ogv, and .webm versions of the file on our website and the result is the same on phones. Here is the URL where the video is being used -
http://www.apexinnovations.com/miRULE.html
I have tried converting to different video types using Miro Video Converter, uploading and testing all the while. The results are the same - works well on all devices save for phones. I have also exported the video to YouTube using iMovie and still I have the same results.
The kicker? The other two videos on our YouTube channel and on our website play just fine on every device. There was no difference in the way that I output or converted or uploaded those files.
Has anyone else ever had this experience? What could I be doing wrong to make this happen? Is there a fix for it?
Thank you very much for any insight that you may have!
It doesn't matter how you encode your audio or video; YouTube will re-encode it anyway. Have you ever seen a video on YouTube with no sound or picture? That would not be possible if everyone in the world could upload videos straight from their camcorders without re-encoding.
On top of that, YouTube adapts the bitrate to suit your connection speed, so on a phone with a bad GPRS connection it might sound garbled while the same phone on Wi-Fi sounds fine. This is partly for user experience, but mostly to conserve bandwidth.
You can see their guidelines here: https://support.google.com/youtube/bin/answer.py?hl=en&answer=1722171&topic=1728573&parent=1728585&rd=1
But they will ultimately re-encode it anyway.