EDIT: I have tried using several different audio codecs via HandBrake; each produces the same result.
EDIT: We've experimented with the audio and discovered that wearing earbuds or headphones on the mobile device makes the problem disappear. Evidently something is wrong with the recorded stereo audio when the phones' built-in speakers downmix it to mono. Is forcing stereo to mono during the conversion process causing the problem? More research ensues.
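A quick way to test the mono theory is to downmix the audio yourself and listen to the result; if the two stereo channels are out of phase, they will largely cancel each other in the mono mix. A minimal sketch using ffmpeg, assuming it is installed (file names are placeholders):

```
ffmpeg -i input.mp4 -c:v copy -ac 1 mono-test.mp4
```

Here -c:v copy passes the video through untouched, and -ac 1 downmixes the audio to a single channel, which approximates what a phone's built-in speaker does.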
I am stumped and have banged my head against the wall for 2 days now trying to figure this out. We posted a video on our company's website that works well on all devices except mobile phones, including iPhone and Samsung handsets. On those devices the video will play, but the audio is garbled beyond use. Here is the link to the video -
http://www.youtube.com/watch?v=6IVHv6nG20M&list=UUKTqrRWGA_rcxOYVVocB9kg
I have also used the .mp4, .ogv, and .webm versions of the file on our website and the result is the same on phones. Here is the URL where the video is being used -
http://www.apexinnovations.com/miRULE.html
I have tried converting to different video types using Miro Video Converter, uploading and testing all the while. The results are the same: it works well on all devices save for phones. I have also exported the video to YouTube using iMovie, and still I get the same results.
The kicker? The other two videos on our YouTube channel and on our website play just fine on every device. There was no difference in the way that I output or converted or uploaded those files.
Has anyone else ever had this experience? What could I be doing wrong to make this happen? Is there a fix for it?
Thank you very much for any insight that you may have!
It doesn't matter how you encode your audio or video; YouTube will re-encode it anyway. Have you ever seen a video on YouTube with no sound or picture? With 7 billion people in the world uploading videos directly from their camcorders, that would not be possible unless YouTube re-encoded everything.
On top of that, YouTube will adapt the bitrate to suit your connection speed. So on a phone with a bad GPRS connection it might sound garbled, while on the same phone connected to WiFi it might sound just fine. This is partly for user experience, but mostly to conserve bandwidth.
You can see their guidelines here: https://support.google.com/youtube/bin/answer.py?hl=en&answer=1722171&topic=1728573&parent=1728585&rd=1
But they will ultimately re-encode it anyway.
A Little Background On Why I Have To Do This
I am currently optimising an app in order to improve the transfer of media files to the WiFi speakers that our team developed. Our previous solution used the iPhone as an HTTP server and allowed the speakers to connect and download music from it. Unfortunately, a lot of problems occurred, such as frequently slow transfer speeds, file read failures, and, whenever the user issued a "seek" command, the speakers having to download the whole file before they could jump to that particular time and start playing. This is a very bad experience for our users.
What I Need
In order to solve the problem I mentioned above, we thought of changing the HTTP server to an RTP server that will run on the iPhone and let the WiFi speakers stream music from it. However, from what I read on other Q&A platforms, the iPhone does not support transferring data using RTP. I also tried searching here on Stack Overflow but was not able to find an answer that solves my problem.
My Question
Is it possible to run an RTP server on iPhone and is there any demo about this that I can refer to?
Any suggestions would be highly appreciated.
Please take a look at Darwin Streaming Server, Apple's official streaming server: http://dss.macosforge.org/
However, I'm not sure it can run on iOS.
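That said, nothing in RTP itself is beyond an iPhone: it is a 12-byte header (RFC 3550) on top of UDP, which iOS sockets handle fine. As a rough illustration only, not a working server, a sketch of building that header in Swift:

```swift
import Foundation

// Rough sketch of an RTP packet header (RFC 3550). RTP is a 12-byte
// header over UDP; iOS can send this with BSD sockets or Network.framework.
func rtpHeader(sequence: UInt16, timestamp: UInt32,
               ssrc: UInt32, payloadType: UInt8) -> Data {
    var header = Data()
    header.append(0x80)                // version 2, no padding/extension/CSRC
    header.append(payloadType & 0x7F)  // marker bit clear + 7-bit payload type
    header.append(contentsOf: withUnsafeBytes(of: sequence.bigEndian) { Array($0) })
    header.append(contentsOf: withUnsafeBytes(of: timestamp.bigEndian) { Array($0) })
    header.append(contentsOf: withUnsafeBytes(of: ssrc.bigEndian) { Array($0) })
    return header                      // prepend this to each media payload
}
```

Each outgoing packet is this header followed by a chunk of encoded audio; the hard part is payload packetisation and timing, not the socket work.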
Best regards,
I'm essentially working as a junior software engineer and I've been tasked with creating an iPad application capable of receiving a real-time stream from a UAV. The camera hardware hasn't been determined yet, so I need to put a spec together so that it can be used in my iOS app.
However, I feel massively inexperienced for the task, as I don't have much understanding of media streaming, nor much experience writing iOS apps, let alone building a decoder to play the video data. I started with Obj-C about 4 months ago when I joined the company.
I apologise if this isn't the best outlet for such a request, but could anyone shed some light on the process of receiving low-latency video streams? Or is it far beyond my current ability?
Thanks for any advice!
If you need to play the video you are receiving or do any playback operations, this should be what you are looking for: Writing an app to stream video to iPhone
You can also record the video; see Apple's documentation.
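If the camera spec you put together ends up exposing the feed over HTTP Live Streaming, the playback side on iOS is nearly free. A minimal sketch, assuming a hypothetical stream URL (note that HLS adds several seconds of latency, which may matter for a UAV feed):

```swift
import AVKit
import AVFoundation

// Minimal HLS playback sketch; the URL below is a placeholder.
let streamURL = URL(string: "http://192.168.1.10/stream/index.m3u8")!
let playerController = AVPlayerViewController()
playerController.player = AVPlayer(url: streamURL)
playerController.player?.play()
// Present playerController from your view controller:
// present(playerController, animated: true)
```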
Is it possible to route the audio being played by an AVAudioPlayer (for example) to the microphone as input?
The aim is to be able to play a sound, whilst on the phone to someone, that the other person can hear.
Thanks
Chris
Short answer: No.
Long answer:
There are lots of ways to route audio through the system, and several methods of accomplishing what you are describing. But Apple restricts access to the audio system while a phone call is active. If you attempt to circumvent this, your app will be removed from the App Store.
You asked for the ability to use the microphone as input. A microphone is a sensor that detects movement in the air, so the only way to use it as input would be to make vibrations in the air: one could play music through the phone's speaker, let the microphone pick it up, and then ship that audio to the receiver via its normal audio route.
A better way would likely be to mix your audio into the audio recorded from the microphone and send that to the receiver, then play the unmixed audio via the earpiece so the sender hears it cleanly and the receiver hears it mixed with the voice. If you are developing a VoIP app, you could make this happen.
RemoteIO and Audio Units are likely the best place to start for creating a low-latency audio routing system that lets you mix microphone input with an existing audio stream. You can read about it here:
http://atastypixel.com/blog/using-remoteio-audio-unit/
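As a rough illustration (not the tutorial's code), here's a minimal mixing sketch using AVAudioEngine, a higher-level wrapper around the RemoteIO unit; the file name and the send step are placeholders:

```swift
import AVFoundation

// Keep these alive for as long as audio should run.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let mixer = AVAudioMixerNode()

func startMixing() throws {
    // Microphone access requires the playAndRecord category (and user permission).
    let audioSession = AVAudioSession.sharedInstance()
    try audioSession.setCategory(.playAndRecord, mode: .default, options: [])
    try audioSession.setActive(true)

    engine.attach(player)
    engine.attach(mixer)

    // Route the microphone and a file player into one mixer node.
    engine.connect(engine.inputNode, to: mixer,
                   format: engine.inputNode.outputFormat(forBus: 0))
    engine.connect(player, to: mixer, format: nil)
    engine.connect(mixer, to: engine.mainMixerNode, format: nil)
    engine.mainMixerNode.outputVolume = 0  // avoid speaker feedback into the mic

    // Tap the mixed signal; a VoIP app would hand these buffers to its network layer.
    mixer.installTap(onBus: 0, bufferSize: 1024, format: nil) { buffer, _ in
        // send(buffer)  // hypothetical network send
    }

    // "backingTrack.caf" is a placeholder bundled audio file.
    if let url = Bundle.main.url(forResource: "backingTrack", withExtension: "caf") {
        player.scheduleFile(try AVAudioFile(forReading: url), at: nil)
    }
    try engine.start()
    player.play()
}
```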
I'd like to stream video from the camera on an iOS device to a receiver via WiFi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input in an iOS app and sends it via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples (API link). Make sure you check out the example at the end, as you will get a CMSampleBufferRef data object back.
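As a rough sketch of that first step (not Apple's sample itself), assuming the default camera:

```swift
import AVFoundation

// Each captured frame arrives in the delegate callback as a CMSampleBuffer.
final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Encode and hand off each frame here (steps 2 and 3).
    }
}
```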
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming over FTP.
If you are only building this for yourself, you can always turn off the Auto-Lock feature in the settings. If, on the other hand, you would like to distribute it to other users, you could use a trick: play a muted sound every 10 seconds. This is more or less how all the alarm clock apps in the App Store work. Here's a tutorial. =)
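For completeness, a foreground app can also opt out of auto-lock programmatically; a one-line sketch (it only applies while the app is frontmost, and you should reset it when done):

```swift
import UIKit

UIApplication.shared.isIdleTimerDisabled = true  // stop the auto-lock countdown
```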
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into segment files for use with HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available (a sample playlist is sketched below).
Connect to the IP address of the phone and voilà! You've got live streaming video.
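For reference, a minimal playlist of the kind that web server publishes (segment names and durations are placeholders; a live stream keeps a sliding window of segments rather than a fixed list):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
```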
Last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend, if you'd like to take a look.
In my app I use MPMoviePlayerController to play an mp3 file from a web server. This plays while downloading the whole file, which is fine over WiFi. But now I want it to work over 3G (and get it into the App Store). How do I get it to buffer just the next 10 seconds or so (as per Apple's rules)? I'm digging through the documentation on AVPlayer, HTTP Live Streaming, etc., but I'm still confused about the best way to do this. With so many podcast apps out there, I'm surprised there aren't more tutorials/libraries about it.
Thanks for your time.
I investigated this as well, and I was not able to find a way to limit the look-ahead buffer using MPMoviePlayerController. I believe you would have to load chunks at the network layer and feed them in at the AVFoundation layer, but I have not attempted this myself.
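For what it's worth, later AVFoundation releases added a direct look-ahead hint on AVPlayerItem (it never existed on MPMoviePlayerController); a minimal sketch, assuming a direct mp3 URL (the URL is a placeholder):

```swift
import AVFoundation

let item = AVPlayerItem(url: URL(string: "https://example.com/episode.mp3")!)
item.preferredForwardBufferDuration = 10   // ask for roughly 10 s of look-ahead
let player = AVPlayer(playerItem: item)
player.automaticallyWaitsToMinimizeStalling = true  // let AVPlayer pace the download
player.play()
```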
That said, I can confirm that you can get an app approved that plays mp3 files using MPMoviePlayerController over both WiFi and 3G connections. In my app I added a setting so the user can decide whether to enable mp3 downloads over 3G or not, although I don't know if that was needed to get approved. I provided it so users didn't inadvertently incur bandwidth costs.