Background Audio Player: not working on device - windows-phone-7.1

I'm trying to use the Background Audio Player without success. The application works perfectly in the emulator but not on a Nokia Lumia 800. I've read the whole thread at: http://social.msdn.microsoft.com/Forums/en-US/wpdevelop/thread/394de7c4-4334-46f8-a01a-30b49c6ec242/ but this is not a codec issue.
What am I doing? I create an AudioTrack with a remote server URI as its source, assign it to the BackgroundAudioPlayer instance, and call Play(). The player's state changes to Playing, but there is no sound on the device.
What did I do to rule out a codec issue? I changed the code described above to first download the whole file from the remote server URI and save it to isolated storage. When I then created an AudioTrack with the local file URI, I got sound.
So I suspect this is a buffering issue, since the player never changes its state to BufferingStarted or BufferingStopped. Unfortunately, I don't know what else I can do to track this down.
Any ideas?
Thanks a lot in advance,
Fabian

There is a great article about that kind of problem: http://www.johanpaul.com/blog/2012/09/wp75-backgroundaudioplayer-crashes/

Related

How to let AVPlayer play an MP3 from a URL without waiting for the whole file to load, in Swift

I am playing a file using AVPlayer and it works fine. What I am looking for is for playback to start as soon as enough data has buffered. Currently I have noticed that if the file is big, it waits a long time before starting to play, which suggests it waits until the whole file (or most of it) is loaded into the buffer.
Any thoughts?
First, thank you Paul, your hint helped.
I have resolved the issue as follows:
Instead of giving AVPlayer the URL of the MP3 file directly, I created a text file called audio.m3u and wrote the URL of my MP3 file into it. Then I played that file (audio.m3u) in AVPlayer.
That's all :)
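For reference, the playlist itself is trivial: audio.m3u is just a plain text file whose only content is the URL of the MP3. A made-up example (the URL is a placeholder):

# audio.m3u -- placeholder URL, replace with the real location of your MP3
http://example.com/audio/my-track.mp3

Pointing AVPlayer at this playlist reportedly makes it treat the source as a stream, so playback starts once enough data has buffered instead of waiting for the whole file to download.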

libffmpeg: writing an RTSP stream to an output file

I'm working with libffmpeg in an iOS app. My goal is to connect to an RTSP source and write the media out to a file that can later be used with the iOS media player. Ideally I'd like to do this without transcoding the incoming data. I also want to be able to later re-encode the media with AVAssetExportSession if the user chooses to do so.
Because I want to create a file that is compatible with iOS, I'm limited (I believe) to mpeg, mp4 or quicktime (mov) formats.
Whenever I try to use one of these formats, I see the following warnings during my call to avformat_write_header:
[mov @ 0x16401c00] Codec for stream 0 does not use global headers but container format requires global headers
[mov @ 0x16401c00] Codec for stream 1 does not use global headers but container format requires global headers
My understanding is that the header wants to know the ultimate file size, which I do not know (the RTSP server is live streaming a camera, and the user stops the recording whenever they want). I guess that makes sense, but I know that others have successfully done this using the ffmpeg command line, so I'm confused as to what else I need to do here.
If I ignore the warning, I can still proceed with writing the file. If I choose mpeg or mp4 formats, my app crashes when I call av_write_trailer. If I use mov, I can successfully close the file, and the file does play back, but usually fails when I try to hand it to the AVAssetExportSession.
I would appreciate any insight into this. Thanks.
Frank
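For reference, the no-transcode path described in the question is a straight remux: read packets from the RTSP demuxer and write them into a mov muxer with stream copy. A minimal sketch against the current libavformat API follows (the original post predates the codecpar API; record_rtsp_to_mov is a hypothetical helper and rtsp_url/out_path are placeholders):

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

/* Hypothetical sketch: remux an RTSP input into a .mov file without
 * re-encoding (stream copy). rtsp_url and out_path are placeholders. */
int record_rtsp_to_mov(const char *rtsp_url, const char *out_path)
{
    AVFormatContext *in_ctx = NULL, *out_ctx = NULL;
    AVPacket *pkt = NULL;
    unsigned int i;
    int ret;

    avformat_network_init();

    if ((ret = avformat_open_input(&in_ctx, rtsp_url, NULL, NULL)) < 0)
        goto end;
    if ((ret = avformat_find_stream_info(in_ctx, NULL)) < 0)
        goto end;

    if ((ret = avformat_alloc_output_context2(&out_ctx, NULL, "mov", out_path)) < 0)
        goto end;

    for (i = 0; i < in_ctx->nb_streams; i++) {
        AVStream *in_st  = in_ctx->streams[i];
        AVStream *out_st = avformat_new_stream(out_ctx, NULL);
        if (!out_st) { ret = AVERROR(ENOMEM); goto end; }

        /* Copy codec parameters; the extradata copied here (e.g. H.264
         * SPS/PPS) is what the mov muxer stores as its "global headers". */
        if ((ret = avcodec_parameters_copy(out_st->codecpar, in_st->codecpar)) < 0)
            goto end;
        out_st->codecpar->codec_tag = 0;
    }

    if (!(out_ctx->oformat->flags & AVFMT_NOFILE))
        if ((ret = avio_open(&out_ctx->pb, out_path, AVIO_FLAG_WRITE)) < 0)
            goto end;

    if ((ret = avformat_write_header(out_ctx, NULL)) < 0)
        goto end;

    pkt = av_packet_alloc();
    if (!pkt) { ret = AVERROR(ENOMEM); goto end; }

    /* Pull packets from the RTSP demuxer until the input ends or errors out
     * (a real app would stop when the user ends the recording). */
    while (av_read_frame(in_ctx, pkt) >= 0) {
        AVStream *in_st  = in_ctx->streams[pkt->stream_index];
        AVStream *out_st = out_ctx->streams[pkt->stream_index];

        /* Rescale timestamps from the input to the output time base. */
        av_packet_rescale_ts(pkt, in_st->time_base, out_st->time_base);
        pkt->pos = -1;

        /* av_interleaved_write_frame() takes ownership of the packet data. */
        if ((ret = av_interleaved_write_frame(out_ctx, pkt)) < 0)
            break;
    }

    /* Without the trailer the mov index is never written and playback fails. */
    av_write_trailer(out_ctx);

end:
    av_packet_free(&pkt);
    avformat_close_input(&in_ctx);
    if (out_ctx && !(out_ctx->oformat->flags & AVFMT_NOFILE))
        avio_closep(&out_ctx->pb);
    avformat_free_context(out_ctx);
    return ret;
}

With stream copy there is no encoder to configure; the codec parameters copied from the input normally carry the extradata the mov muxer wants, and calling av_write_trailer when recording stops is what finalizes the file.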
I found what appears to be a solution -- at least, it eliminates the warning. I had to set the CODEC_FLAG_GLOBAL_HEADER flag on both the audio and video codec contexts before calling avcodec_open2.
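A minimal sketch of that fix, assuming you open your own encoder contexts before muxing (open_output_codec and its parameters are hypothetical names):

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

/* Hypothetical helper: open one output codec context, asking the codec to
 * produce global headers when the container requires them. */
static int open_output_codec(AVFormatContext *ofmt_ctx,
                             AVCodecContext *enc_ctx,
                             const AVCodec *codec)
{
    /* mov/mp4 set AVFMT_GLOBALHEADER: extradata (SPS/PPS,
     * AudioSpecificConfig, ...) must go into the container header
     * instead of being repeated inside the bitstream. */
    if (ofmt_ctx->oformat->flags & AVFMT_GLOBALHEADER)
        enc_ctx->flags |= CODEC_FLAG_GLOBAL_HEADER; /* AV_CODEC_FLAG_GLOBAL_HEADER in current ffmpeg */

    /* The flag has to be set before avcodec_open2(); afterwards the codec
     * has already decided where its headers live. */
    return avcodec_open2(enc_ctx, codec, NULL);
}

mov, mp4 and similar containers advertise this requirement through AVFMT_GLOBALHEADER, so checking the output format's flags keeps the helper harmless for containers that do not need global headers.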

Ruby on Rails capture webcam video and audio

Is there any gem out there that can interact with the user's webcam to capture video and audio and upload it to the server?
Have you heard about navigator.getUserMedia()? That could do the trick. More info: http://www.html5rocks.com/en/tutorials/getusermedia/intro/
They talk about it in the Mobile Web Development course on Udacity (lesson 10).
https://www.youtube.com/watch?v=j6mzYt5fJpg
Have you tried headshot?
I just tried it yesterday, but ran into some trouble with the Flash player, not being able to set permissions for camera use.
Try it out; it's easy to add to your application, and maybe you'll get it to work.
But better back up your files first.

Playing a Microsoft Smooth Streaming file from a local machine

I want to play a Smooth Streaming (.ism) file with the Smooth Streaming sample player on my local machine, which doesn't have a server installed.
I read that you should put the URL of the file in the InitParams param section of the player's HTML. I tried a URL like file://localhost/D:/SmoothStreamin/A_MSS_1280_720p_24fps_200kbps/A_MSS_1280_720p_24fps_200kbps.ism but it didn't work.
file://localhost/D:/SmoothStreamin/A_MSS_1280_720p_24fps_200kbps/A_MSS_1280_720p_24fps_200kbps.ism/Manifest
Don't forget the Manifest part.

Buffering in BlackBerry

I am currently working on an application in which I have to use audio buffering. Please tell me how I can achieve this; I have been trying to work it out myself.
Read this: Buffer and play streamed media
Also, there's a great example in the BlackBerry SDK which shows buffered playback of an MP3 file.
Take a look in the 'samples' folder of the JDE/Eclipse plugin.
