Okay, so I'm developing an application that supports casting local media to your Chromecast. So far I have connected to my device and streamed a sample video, but I'm now struggling with streaming local files such as .MP4/.MP3 files that are located in the Documents directory of my application. I have tried to use the URL of my file instead of the sample video, but this does not work. I believe this is because the file path is not in an http:// format, but I'm not sure. If I am correct in thinking this, how can I get around it?
Here is the code I'm using to stream the Google sample video to the Chromecast:
GCKMediaInformation *mediaInformation =
    [[GCKMediaInformation alloc]
        initWithContentID:@"http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4"
               streamType:GCKMediaStreamTypeNone
              contentType:type
                 metadata:metadata
           streamDuration:0
               customData:nil];
Thanks in advance...
The answer is reasonably simple in the end: you just need to serve your files from an HTTP server and play them from there. I used CocoaHTTPServer.
Chromecast accesses media over the network, so you should run an HTTP server inside your app to provide streaming access to the files. Local file URLs won't work because they are not accessible from other devices on your network.
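For reference, here is a minimal sketch of that approach using CocoaHTTPServer, assuming you have added the library to your project; the port and file name are illustrative:

#import "HTTPServer.h"

// Serve the app's Documents directory over HTTP so the Chromecast can
// fetch files from it. Keep a strong reference to the server (e.g. a
// property) so it isn't deallocated while streaming.
HTTPServer *httpServer = [[HTTPServer alloc] init];
[httpServer setType:@"_http._tcp."];
[httpServer setPort:8080]; // illustrative port
NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
[httpServer setDocumentRoot:documentsPath];

NSError *error = nil;
if (![httpServer start:&error]) {
    NSLog(@"Error starting HTTP server: %@", error);
}

The contentID you pass to GCKMediaInformation then becomes something like http://<your-device-ip>:8080/video.mp4. Note that you need the device's Wi-Fi IP address (e.g. looked up via getifaddrs), not localhost, because the Chromecast fetches the file over the network.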
Related
I'm working with libffmpeg in an iOS app. My goal is to connect to an RTSP source and write the media out to a file that can later be used with the iOS media player. Ideally I'd like to do this without transcoding the incoming data. I also want to be able to later re-encode the media with AVAssetExportSession if the user chooses to do so.
Because I want to create a file that is compatible with iOS, I'm limited (I believe) to mpeg, mp4 or quicktime (mov) formats.
Whenever I try to use one of these formats, I see the following warnings during my call to avformat_write_header:
[mov @ 0x16401c00] Codec for stream 0 does not use global headers but container format requires global headers
[mov @ 0x16401c00] Codec for stream 1 does not use global headers but container format requires global headers
My understanding is that the header wants to know the ultimate file size, which I do not know (the RTSP server is live streaming a camera, and the user stops the recording whenever they want). I guess that makes sense, but I know that others have successfully done this using the ffmpeg command line, so I'm confused as to what else I need to do here.
If I ignore the warning, I can still proceed with writing the file. If I choose mpeg or mp4 formats, my app crashes when I call av_write_trailer. If I use mov, I can successfully close the file, and the file does play back, but usually fails when I try to hand it to the AVAssetExportSession.
I would appreciate any insight into this. Thanks.
Frank
I found what appears to be a solution -- at least, it eliminates the warning. I had to set the CODEC_FLAG_GLOBAL_HEADER flag on both the audio and video codec contexts before calling avcodec_open2.
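In case it helps anyone else, here is a minimal sketch of that fix; the variable names are illustrative, and on newer FFmpeg versions the flag is spelled AV_CODEC_FLAG_GLOBAL_HEADER:

#include <libavformat/avformat.h>

// Tell the encoder to emit global headers (e.g. H.264 SPS/PPS as
// extradata) when the output container requires them, as MP4/MOV do.
// Call this for both the audio and video codec contexts, before
// avcodec_open2().
static void set_global_header_flag(AVFormatContext *output_ctx, AVCodecContext *codec_ctx)
{
    if (output_ctx->oformat->flags & AVFMT_GLOBALHEADER)
        codec_ctx->flags |= CODEC_FLAG_GLOBAL_HEADER;
}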
I have a Cordova application in which I used to be able to link to mp3 and mp4 files relatively, using a src like so:
../../Documents/videos/video.mp4
I just now updated my Cordova application to the latest version and these relative URLs no longer work, so I've been experimenting with other solutions.
It looks like I can link to them using the cordova.file.documentsDirectory (iOS only) variable, but when I save references to these files in the database, the URLs are no longer valid once the app is rebuilt and relaunched, because the application's GUID changes.
I tried using cdvfile://localhost/persistent/, but this seems to work only for images, not for video or audio files played back through HTML5 audio and video tags.
Ultimately I could save the files with a variable that gets replaced at run time, but this is obviously not the preferred "solution." Something like [documentsfolder]/videos/video.mp4
How can I link to a persistent file location and have it work with images, audio files, and video files?
I would love to use the cdvfile url but have it work with mp3 and mp4 files.
Thank you.
You should be able to access any bundled resource via [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"www/[your path]"]
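As a rough sketch of that idea (the paths here are illustrative): resolve the absolute URL at run time from a stable anchor instead of persisting it, so the container GUID changing between builds doesn't matter:

// Resolve a bundled www resource at run time.
NSString *bundlePath = [[[NSBundle mainBundle] resourcePath]
                        stringByAppendingPathComponent:@"www/videos/video.mp4"];
NSURL *bundleURL = [NSURL fileURLWithPath:bundlePath];

// For files saved under Documents, resolve against the current
// Documents directory the same way and store only the relative part.
NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSURL *documentsURL = [NSURL fileURLWithPath:[documents stringByAppendingPathComponent:@"videos/video.mp4"]];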
I want to play a Smooth Streaming (.ism) file with the Smooth Streaming sample player on my local machine, on which I don't have a server.
I read that you should give the URL of the file in the InitParams section of the player's HTML. I tried a URL like file://localhost/D:/SmoothStreamin/A_MSS_1280_720p_24fps_200kbps/A_MSS_1280_720p_24fps_200kbps.ism but it didn't work.
file://localhost/D:/SmoothStreamin/A_MSS_1280_720p_24fps_200kbps/A_MSS_1280_720p_24fps_200kbps.ism/Manifest
Don't forget the /Manifest part.
I am sending a live RTMP stream to a Wowza server with a live application config, but every time I connect to the stream to watch it, playback starts from the beginning. I can see Wowza creating a bigger and bigger file in the /content directory, and this file is always played from the beginning.
How can I tell Wowza to send it live, e.g. send only the last 10 seconds of the file?
Best regards,
Chris
You'll probably have more luck asking this question on the Wowza forums, where their support team regularly addresses these questions. You'll need to provide more information:
What is your input (camera, flash media encoder, file)?
Can you stream VOD?
A large file building in the content directory sounds like you may be recording the stream. Do you have any add-ons such as the live stream recorder installed?
What URL are you using to connect?
Well, the answer is simple: you set the wrong stream type in the config.
I guess you set rtp-live-record instead of rtp-live as the StreamType in Application.xml; changing it to rtp-live should fix it.
Kind Regards, Sui
Follow these instructions to create a live stream and broadcast it:
Go to the Wowza Media Server directory, probably /usr/local/WowzaMediaServer
cd applications;mkdir live
Here, live is your application name
cd ../conf;mkdir live;cp Application.xml live/
Now edit the Application.xml file
cd live;vim Application.xml
Change the StreamType from default to live:
<StreamType>live</StreamType>
Set HTTPStreamers:
<HTTPStreamers>cupertinostreaming,smoothstreaming,sanjosestreaming</HTTPStreamers>
Set LiveStreamPacketizers:
<LiveStreamPacketizers>cupertinostreamingpacketizer,smoothstreamingpacketizer,sanjosestreamingpacketizer</LiveStreamPacketizers>
Set PlayMethod:
<PlayMethod>none</PlayMethod>
Now go to your browser and open the player:
file:///usr/local/WowzaMediaServer/examples/LiveVideoStreaming/client/live.html
Server: rtmp://<your-server-ip>:1935/live
Stream: <the stream name you publish from your encoder>
Note: for live streaming you have to use an encoder (Adobe Flash Media Live Encoder, etc.)
I'm trying to use the YouTube API and I always get the error "ytplayer is not defined".
I copied the code from http://code.google.com/apis/youtube/chromeless_example_1.html into my .html file, hosted on my computer, and got "ytplayer is not defined". I tried different code, but again it said "ytplayer is not defined".
What is the problem?
From the YouTube JavaScript Player API Reference:
Note: To test any of these calls, you must have your file running on a webserver, as the Flash player restricts calls between local files and the internet.
How are you running the application? If it's in the AIR framework, check that you haven't got transparency=true set in the configuration; it won't load SWFs with transparency enabled.